WO2012167301A1 - Positioning, tracking and trajectory estimation of a mobile object - Google Patents

Positioning, tracking and trajectory estimation of a mobile object

Info

Publication number
WO2012167301A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
computer
implemented method
trajectory
sensors
Prior art date
Application number
PCT/AU2012/000481
Other languages
French (fr)
Inventor
Ashod DONIKIAN
Original Assignee
Navisens Pty Ltd
Priority date
Filing date
Publication date
Priority claimed from AU2011902307A external-priority patent/AU2011902307A0/en
Application filed by Navisens Pty Ltd filed Critical Navisens Pty Ltd
Publication of WO2012167301A1 publication Critical patent/WO2012167301A1/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 5/00: Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S 5/0009: Transmission of position information to remote stations
    • G01S 5/0072: Transmission between mobile stations, e.g. anti-collision systems
    • G01S 5/02: Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S 5/0284: Relative positioning
    • G01S 5/0289: Relative positioning of multiple transceivers, e.g. in ad hoc networks

Definitions

  • the present invention generally relates to determining positioning, tracking and/or trajectory estimates of a mobile object. More particularly, the present invention relates to determining positioning, tracking and/or trajectory estimates of a mobile object in relation to a determined positioning, tracking and/or trajectory of another mobile object. Furthermore, the present invention relates to determining a position or location estimate of an external event that is at a distance from or remote to a mobile object.
  • GPS is a useful tracking system for outdoor tracking applications.
  • the present invention relates to tracking of a mobile object in relation to another mobile object for which a more accurate localisation or position is able to be obtained, such as being measured or determined.
  • positioning, tracking and/or trajectory estimates of a first mobile object are determined in relation to a second mobile object, due to a better determination of positioning, tracking and/or trajectory of the second mobile object being available.
  • the second mobile object, or sensors associated with the second mobile object may have access to infrastructure-based information, or may better exploit sensor information and motion models due to mounting of sensors at a specific position or at several positions on the second mobile object.
  • the second mobile object may be better suited to holding or containing sensors, or more accurate sensors, for example the second mobile object may be larger than the first mobile object.
  • the second mobile object could be a person and body-worn sensors could be mounted on the torso or footwear of the person, or the second mobile object could be a vehicle having wheel encoders and/or laser scanners mounted to the vehicle.
  • a variety of other characteristics of the second mobile object could also make the second mobile object more suitable for accurate tracking, such as well-defined motion models and characteristics (e.g. human motion models, zero velocity updates, vehicular motion with nonholonomic constraints, planar motion, motion prediction and planning, etc.). Tracking of a first mobile object in such a manner, that is in relation to a second mobile object, enables the possibility to perform a more detailed analysis and visualisation of motions and trajectories of the first mobile object which otherwise would be difficult to achieve.
  • Reference to an "object" should be interpreted broadly and could include, for example, an asset, an article, a robot, a vehicle, a person, an animal, any type of target, a mobile or cellular telephone, a passive object, a weapon, a device or apparatus that can fire, emit or project a projectile or substance, a projectile itself that is fired, emitted or projected, etc.
  • Reference to "mobile" also should be interpreted broadly, such as being movable, portable, transportable or the like.
  • a computer-implemented method of tracking a first mobile object, provided with or associated with a first sensor, in relation to a second mobile object, provided with or associated with a second sensor, comprising the steps of: determining a position, orientation or trajectory of the second sensor; and determining a position, orientation or trajectory of the first sensor at least partially based on or otherwise in relation to the position, orientation or trajectory of the second sensor.
  • determining the position, orientation or trajectory means one or more of these features can be determined; a particular feature need not be determined to the exclusion of the other features.
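  • As an illustration only (not the claimed method itself), a minimal Python sketch of this two-step determination might look like the following; the Pose type, the relative offset, and all values are hypothetical assumptions introduced for the example.

```python
# Illustrative sketch only: hypothetical Pose type and values, not part of the claims.
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float        # position in metres, world frame
    y: float
    heading: float  # orientation in radians

def track_first_relative_to_second(second_pose: Pose, relative_offset: Pose) -> Pose:
    """Determine the first sensor's pose at least partially based on the
    second sensor's pose plus a measured or modelled relative offset."""
    c, s = math.cos(second_pose.heading), math.sin(second_pose.heading)
    return Pose(
        x=second_pose.x + c * relative_offset.x - s * relative_offset.y,
        y=second_pose.y + s * relative_offset.x + c * relative_offset.y,
        heading=second_pose.heading + relative_offset.heading,
    )

# Example: a hand-held object held 0.5 m ahead of a tracked person.
person = Pose(10.0, 4.0, 0.0)
print(track_first_relative_to_second(person, Pose(0.5, 0.0, 0.0)))
```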
  • a system for tracking a plurality of mobile objects wherein at least two mobile objects of the plurality of mobile objects each include at least one sensor, wherein one or more sensor-to-sensor relationships are defined depending on the types of the at least two mobile objects, thereby enabling a first mobile object of the at least two mobile objects to be tracked based at least partially on tracking of a second mobile object of the at least two mobile objects.
  • the one or more sensor-to-sensor relationships are defined depending on one or more of: whether the at least one sensor is associated or disassociated; the hierarchy level of the at least one sensor; parameters of the at least one sensor; dynamic selection of a relationship based on a type of motion of the at least one sensor; and/or, one or more independent motion models.
  • Reference to a hierarchy should be read as a general reference to any type of hierarchy, relational arrangement, connectivity, chain or structured organisation as applied to sensor levels, associations, configurations or relationships.
  • Reference to sensor association or disassociation should be read as a general reference to any means of linking or delinking, pairing or unpairing, coupling or uncoupling, or the like, two or more sensors. For example, this could involve changing the hierarchy, relational arrangement, chain, structure, level, association, configuration or relationship information or data between two or more sensors.
  • Reference to a threshold of association/disassociation, or the like should be read as a general reference to any type of threshold, level, value, measure or degree of association or disassociation.
  • a system for tracking a first mobile object comprising: a first sensor associated with or including the first mobile object; a second sensor associated with or including a second mobile object; at least one processor to determine or estimate a position, orientation or trajectory of the second sensor; and at least one processor to determine or estimate a position, orientation or trajectory of the first sensor; wherein the determined or estimated position, orientation or trajectory of the first sensor is at least partially based on the determined or estimated position, orientation or trajectory of the second sensor.
  • the at least one processor may be the same or different processors in relation to the first sensor and the second sensor determinations.
  • the first mobile object is provided with or includes a plurality of first sensors; the second mobile object is provided with or includes a plurality of second sensors; the first sensor and the second sensor are related by a fixed hierarchy configuration; and/or the first sensor and the second sensor are related by a variable hierarchy configuration.
  • a check can be made if the first sensor is associated with the second sensor, and if the first sensor is associated with the second sensor, then determining the position, orientation or trajectory of the first sensor at least partially based on the position, orientation or trajectory of the second sensor.
  • the first sensor is a child sensor and the second sensor is a parent sensor; and/or if motion of the second mobile object is detected by the second sensor, then information or data is transmitted to the first sensor.
  • a measure of association is determined between the first sensor and the second sensor.
  • the measure of association is compared to a threshold of association to determine if the first sensor and the second sensor are to be associated or remain associated.
  • a measure of disassociation is determined between the first sensor and the second sensor.
  • the measure of disassociation is compared to a threshold of disassociation to determine if the first sensor and the second sensor are to be disassociated or remain disassociated.
  • sensor-to-sensor relationships are assigned by associating or disassociating sensors based on sensor measurements.
  • determining a location of an external event at least partially based on using the determined position, orientation or trajectory of the first sensor to determine an alignment or aim of the first mobile object.
  • in determining a trajectory of a projectile or substance fired or ejected by the first mobile object, there is included estimating a coverage area of the projectile or substance based on parameters of the projectile or substance.
  • in determining a point of impact of the projectile or substance, there is included determining clustering of a plurality of trajectories, and determining an area of interest based on the determined clustering.
  • either or both the first sensor and the second sensor are an inertial measurement unit.
  • a method of and/or system for tracking of one or more mobile objects each of the one or more mobile objects including at least one sensor. Sensor-to-sensor relationships are established for state estimation, trajectory estimation, projectile trajectory estimation, enabling measurements/observations of the environment, and/or the distribution of location nodes/beacons which may act as location references and as signal repeaters.
  • a method of and/or system for assigning sensor-to-sensor relationships by associating and disassociating sensors based on sensor measurements and the interaction between sensors (such as motion timing, proximity, similarities in motion and/or trajectory estimates, matching complementary motions, etc.), where the sensor association/disassociation may be static or dynamic.
  • Reference to an external event should be read broadly to cover any type of event, occurrence or location determination that is external, removed from, at a distance to, or remote to an object associated with a sensor.
  • an event may occur at the object which is detected by the sensor, such as object motion, and an external event may be determined or estimated, based on various information such as trajectory clustering, which is remote to the object, such as an external event that may have influenced the event at the object.
  • a method of and/or system for determining the location of an external event by determining the trajectory of a projectile or substance which may be fired, ejected and/or acted upon by an object, based on detection of motion events.
  • a method of and/or system for determining or estimating the trajectory to an external event such as the trajectory for a projectile or substance which may be fired, ejected and/or acted upon by an object, and estimating the coverage area of the projectile or substance.
  • coverage area may relate to a damage, spread or distribution area, such as near or in the vicinity of the external event, and be based on parameters of the projectile or substance (e.g. ballistic projectile information, projectile/object size, weight, aerodynamic properties), the force of a motion event, sensor measurements/characteristics (e.g. measured acceleration, angular rotation, etc.), and/or meteorological or environmental measurements, etc.
  • a method of and/or system for determining or estimating the trajectory to an external event which does not result in the firing of a projectile or substance, for example by measuring the aim of an object such as a tool, or determining the gaze of a user (for example, one or more people browsing in an art gallery, whereby external events may be paintings or sculptures of interest, or one or more people traversing an environment, whereby external events are related to user intention/interaction with the environment, such as for human-machine interaction (HMI), augmented reality using a head mounted display (HMD), providing directions and/or guidance with respect to navigating the environment, etc.).
  • a method of and/or system for determining or visualising the location and/or orientation of mobile objects including the trajectory of a projectile or substance to an external event (such as a point of impact or coverage/distribution area), and/or storing several trajectories and locations of motion events and external events for concurrent visualisation, management, and analysis of such events.
  • a method of and/or system for clustering several events and/or trajectories, determining an area of interest based on the number, type, or location of events and/or trajectories to external events, determining a probability of occurrence of events, and performing an assessment and determining the probability of occurrence of one or more external events (such as threats) and their location based on such information. For example, determining a location and a threat level of a target based on the number and direction of weapon discharges, or determining the performance of an athlete based on sensed motion and trajectory events, such as a level of marksmanship, and/or athletic performance for sporting events, etc.
  • a computer-implemented method of determining a location and a probability of an external event based on an occurrence of one or more events and trajectories comprising the steps of: detecting the occurrence of the one or more events; determining one or more trajectories for the one or more events; determining the location of the external event based on a trajectory end-point for at least one trajectory; and determining the probability of the external event at least partially based on the one or more events and the one or more trajectories.
  • clustering a plurality of the one or more trajectories, and determining an area of interest based on the clustering.
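  • The clustering of trajectory end-points into an area of interest could, for example, be approached along the lines of the following sketch; the radius-based grouping, the use of cluster weight as a rough probability, and all names and values are assumptions for illustration, not the method defined by the patent.

```python
# Illustrative sketch only: a naive end-point clustering, not the patent's algorithm.
import numpy as np

def cluster_endpoints(endpoints, radius=5.0):
    """Group trajectory end-points that lie within `radius` metres of an existing
    cluster centroid; return (centroid, covariance, weight) per cluster."""
    clusters = []  # each entry is a list of 2D points
    for p in endpoints:
        p = np.asarray(p, dtype=float)
        for c in clusters:
            if np.linalg.norm(np.mean(c, axis=0) - p) < radius:
                c.append(p)
                break
        else:
            clusters.append([p])
    total = len(endpoints)
    results = []
    for c in clusters:
        pts = np.array(c)
        centroid = pts.mean(axis=0)
        # Covariance describes the spread (an error ellipse) of the area of interest.
        cov = np.cov(pts.T) if len(pts) > 1 else np.eye(2) * radius ** 2
        results.append((centroid, cov, len(pts) / total))  # weight as rough probability
    return results

# Example: end-points of several weapon-discharge trajectories.
for centroid, cov, prob in cluster_endpoints([(100, 50), (102, 51), (98, 49), (300, 120)]):
    print(centroid, prob)
```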
  • the occurrence of the one or more events and the determining of the one or more trajectories for a first mobile object uses at least one first sensor associated with the first mobile object.
  • an event of the one or more events is the measurement of a motion; an event of the one or more events is release or firing of a projectile or substance; an event of the one or more events is a form of physical contact; an event of the one or more events is detected by motion estimation; an event of the one or more events is detected by a classification algorithm; a trajectory of the one or more trajectories is estimated at least partially based on one or more parameters of the projectile or substance; a coverage area of the projectile or substance is estimated at least partially based on one or more parameters of the projectile or substance; the external event is a threat; the external event is a target or goal; the external event is located by providing a grid reference; and/or the external event is located by providing an azimuth and elevation.
  • a system for determining a location and a probability of an external event based on an occurrence of one or more events and trajectories comprising: a sensor associated with a mobile object, the sensor able to detect the occurrence of the one or more events; and, at least one processor configured to: determine the one or more trajectories for the one or more events; determine the location of the external event based on a trajectory end-point for at least one trajectory; and, determine the probability of the external event at least partially based on the one or more events and the one or more trajectories.
  • FIG. 1 illustrates a flowchart of an example hierarchy configuration where only a single hierarchy level is utilised
  • FIG. 2 illustrates a flowchart of an example hierarchy configuration where a lower level hierarchy sensor influences more than one upper level hierarchy sensor
  • FIG. 3 illustrates a flowchart of an example hierarchy configuration where sub-levels of hierarchy are utilised
  • Fig. 4 illustrates a flowchart of an example hierarchy configuration where sensors in a primary hierarchy ranking may influence each other and how various stages of sensors in a secondary hierarchy ranking may in turn influence the primary sensor(s);
  • FIG. 5 illustrates a flowchart of an example method for transmitting and receiving sensor information at the occurrence of a motion
  • FIG. 6 illustrates a flowchart of an example method for a child sensor in a disassociated state to determine if the child sensor may be associated with a parent sensor;
  • Fig. 7 illustrates a flowchart of an example method for a child sensor, in an associated state, to determine if the child sensor may become disassociated, or if the child sensor may maintain association with the parent sensor with which the child sensor is currently associated;
  • Fig. 8 illustrates a flowchart of an example method for management of motion events and external events;
  • Fig. 9 illustrates example objects and associated sensors
  • Fig. 10 illustrates an example system that can be utilised to embody or give effect to a particular embodiment
  • FIG. 11 illustrates an example processing system that can be utilised to embody or give effect to a particular embodiment
  • Fig. 12 illustrates an example defence implementation for visualising a plurality of motion events (weapon discharge) to determine external events (enemy force locations);
  • Fig. 13 illustrates an example defence implementation of detecting a plurality of motion events (weapon discharge) and external events (enemy force locations) including clustering of a plurality of trajectories and probability estimation of external events (in this example locations) represented as error ellipses;
  • Fig. 14 illustrates an example emergency services/mining implementation, which also applies to other industries involving an interaction between personnel, vehicles, and other objects.
  • An embodiment of the present invention relates to tracking of one or more mobile objects to determine the position and/or orientation of one or more of the objects, which could be each object, (and where applicable, articulated segments of an object), thereby allowing for position determination, motion capture, detection and analysis of motion events, estimation of motion trajectory, calculation of performance parameters, and/or deriving of statistical information in relation to the use of the mobile object(s).
  • tracking of the object also may be used for detection of motion events related to use of that object and determination, visualisation or estimation of external events which initiated use of that object via the estimation of the trajectory of the object, and/or statistical analysis of several such trajectories.
  • tracking of the mobile object may also provide a means of data acquisition and analysis, whilst relaxing or eliminating the requirement for static or calibrated/surveyed sensor positioning, thus enabling rapid data acquisition under dynamic conditions.
  • the occurrence of an external event may be determined when the external event cannot reliably be directly measured or observed by the first or second sensor.
  • occurrence of an external event may be detected by events of the first or second sensor through use or manipulation of the first or second object, such as measuring motions by detecting physical contact, the firing of a weapon or object, abrupt changes in motion, or lack of motion (e.g. prolonged static or stationary periods).
  • Reference to an external event should be read broadly to cover any type of event, occurrence or location determination that is external, removed from, at a distance to, or remote to an object associated with a sensor.
  • a mobile object may include at least one sensor attached to, associated with, or integrated within the mobile object, or where applicable, multiple sensors can be provided for a mobile object where a plurality of sensors may be attached to or integrated within a mobile object or articulated regions of a mobile object if constructed of one or more segments and joints, such as an articulated vehicle or robot with permanent or semipermanent joints, or wearable sensors attached to the torso, limbs and/or extremities of a human body.
  • a mobile object may be represented by one or more sensors which collectively represent that mobile object (e.g. sensors which are mounted at segments of the limbs, head, and torso of a human), or may be represented by a single sensor (e.g. a single body-worn sensor mounted at the waist of a human, a sensor attached to a hand-held object such as a weapon or tool, a sensor attached to a vehicle, etc.).
  • each sensor may operate independently and may have a defined relationship with one or more other sensors attached to one or more mobile objects, and such relationships may be arbitrarily defined based on the type of mobile object being tracked by a sensor, the type of sensor employed, and the hierarchy level of the relationship. Relationships between sensors may be dynamically assigned and configured as sensors become associated or disassociated based on sensor measurements.
  • sensors may be consecutively linked by a sensor-to-sensor relationship chain (such as a kinematic chain), and/or several sensors may be concurrently linked by a relationship (for example, several mobile objects such as personnel on-board a mobile platform or vehicle, or several mobile objects carried by personnel).
  • the tracking system includes one or more mobile objects which are to be tracked, each of which includes at least one sensor, or if required or appropriate, more than one sensor, for example, to track articulated limb movement and/or to track the extremities of limbs.
  • each sensor can maintain a record of information including several fields which, depending on implementation, may include the following features.
  • a hierarchy level of a sensor and a level of hierarchy with which the sensor is able to be associated can be provided, which determines whether the sensor is influenced by other sensors and/or influences other sensors.
  • the hierarchy level may be defined in several ways, for example, as a ranking of increasing/decreasing levels of hierarchy which may also include sub-levels of ranking.
  • Fig. 1 illustrates an example hierarchy configuration 10 where only a single hierarchy pathway is utilised, such that sensors of lower level hierarchy (e.g. Hierarchy level 1) influence sensors of higher level hierarchy (e.g. Hierarchy level 2).
  • For example, a vehicle-mounted sensor (Hierarchy level 1) may influence a body-worn sensor (Hierarchy level 2), which in turn may influence a sensor attached to a hand-held object (Hierarchy level 3).
  • Fig. 2 illustrates an example hierarchy configuration 20 where a lower level hierarchy sensor (e.g. Hierarchy level 1) influences more than one higher level hierarchy sensor (e.g. Hierarchy level 2).
  • For example, a vehicle-mounted sensor (Hierarchy level 1) may influence more than one body-worn sensor (Hierarchy level 2).
  • Fig. 3 illustrates an example hierarchy configuration 30 where hierarchy sub-levels are utilised (e.g. Hierarchy sub-levels 2a, 2b, 2c, 2d), where the primary hierarchy level (e.g. Hierarchy level 2) indicates the group/class to which the sensor belongs, and where sensors belonging to a group/class of lower level hierarchy (e.g. Hierarchy level 2) influence sensors which belong to a group/class of higher level hierarchy (e.g. Hierarchy level 3), for example a vehicle-mounted sensor (Hierarchy level 1) and a body-worn sensor (Hierarchy level 2).
  • a secondary level of hierarchy ranking which indicates the sensor hierarchy within its own group/class hierarchy can be provided.
  • multiple sensors which are within the group/class of body-worn sensors which influence each other, such as a foot-mounted sensor (e.g. Hierarchy sub-level 2a) influencing a waist-mounted sensor (e.g. Hierarchy sub-level 2b) which in turn influences a sensor integrated into a glove, helmet, head mounted display (HMD), etc. (e.g. Hierarchy sub-level 2c).
  • Fig. 4 illustrates an example hierarchy configuration 40 showing how sensors in primary hierarchy levels (e.g. Hierarchy levels 1, 2, 3) may influence each other and how various stages of sensors in secondary hierarchy levels (e.g. Hierarchy sub-levels 2b, 2c, 2d, 3b) may in turn influence primary sensor(s), such that sensor association is dependent on both the primary and secondary hierarchy rankings and the number of sensors in the relationship chain.
  • the configuration may be varied, and the hierarchy level with which a sensor may be associated may take many configurations, as displayed by the dashed lines in Fig. 4, each of which indicate a possible order of hierarchy.
  • a sensor may be influenced by another sensor of lower than or equal to its own level of hierarchy. For example:
  • a sensor might be only influenced by a sensor of immediately lower hierarchy level (e.g. where a chain of sensor-to-sensor relationships are defined, such as sensors tracking articulated limb movement, each sensor is only influenced by a preceding sensor);
  • a sensor might be influenced by a sensor of lower hierarchy level which may not necessarily be of immediately lower hierarchy level (e.g. a sensor for a hand-held object which is freely manipulated by a human and thus is typically associated with a sensor of immediately lower hierarchy such as a body-worn sensor, may be associated with a sensor of even lower hierarchy level such as when it is placed into a vehicle having a sensor or storage unit/docking station/base-station having a sensor); and/or,
  • a sensor may be influenced by another sensor of the same hierarchy level (e.g. a person with a body-worn sensor physically interacting with another person with a body- worn sensor, a hand-held object with a sensor physically interacting with another hand- held object with a sensor, etc.).
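  • A minimal sketch of how such influence rules might be encoded is given below; the policy names and the integer encoding of hierarchy levels are assumptions made for illustration only.

```python
# Illustrative sketch only: hypothetical policy names, not defined by the patent.
def may_be_influenced_by(child_level: int, parent_level: int,
                         policy: str = "any_lower_or_equal") -> bool:
    """Return True if a sensor at `child_level` may be influenced by a
    sensor at `parent_level` under the chosen hierarchy policy."""
    if policy == "immediately_lower":      # e.g. articulated-limb sensor chains
        return parent_level == child_level - 1
    if policy == "any_lower":              # e.g. hand-held object placed into a vehicle
        return parent_level < child_level
    if policy == "same_level":             # e.g. two physically interacting body-worn sensors
        return parent_level == child_level
    if policy == "any_lower_or_equal":
        return parent_level <= child_level
    raise ValueError(f"unknown policy: {policy}")

print(may_be_influenced_by(3, 2, "immediately_lower"))  # True
print(may_be_influenced_by(3, 1, "any_lower"))          # True
print(may_be_influenced_by(2, 2, "same_level"))         # True
```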
  • hierarchy levels may be configured such that they represent levels of compatibility between sensors/objects so that only sensors which are compatible may be associated.
  • hierarchy levels may be configured such that they represent "ownership", in which case sensors/objects which, for example, are allocated to an individual user (e.g. a weapon-mounted sensor and a body-worn sensor both allocated to an individual soldier) are only associated with each other.
  • sensor association may be limited or restricted to only those sensors/objects which may be "paired" or "grouped".
  • the hierarchy configuration may be initialised and/or dynamically updated or adjusted. For example, consider the scenario where a person with a body-worn sensor is determined to be injured and/or immobile (for example determined by way of motion sensor measurements and/or by way of other sensor measurements such as health monitoring, etc.), and thus the hierarchy level may dynamically adjust such that the person requiring assistance may be associated with another body-worn sensor of the same level of hierarchy, such that a person who comes to their aid may physically interact with the injured and/or immobile person to move them to another location (e.g. drag/carry the person away).
  • a sensor-to-sensor relationship can be provided which defines the relationship a sensor may have with another sensor with which it may be associated.
  • Such relationships can be arbitrarily defined and a sensor may have more than one relationship which is selected based on a number of factors, such as:
  • if the sensor is associated, the hierarchy level of the sensor with which it is associated;
  • various parameters or statistics which characterise the sensor and the performance of the sensor with which it is associated, such as sensor accuracy, drift, temperature, operational conditions, etc.;
  • multiple relationships may be integrated with a multi-model estimator, for example, the most appropriate relationship may be dynamically selected based on the type of motion the sensor is undergoing, and/or more than one relationship may be applied using a multi- hypothesis estimator if the motion is uncertain or unknown; and/or,
  • a measure of association can be provided, of which several implementations are possible. For example, one method of achieving this is by defining a threshold of association which indicates a level which agreeing measurements must exceed for a sensor to associate itself with another sensor.
  • the threshold may be dynamically adjusted and/or updated based on sensor measurements and operational conditions.
  • the threshold is compared to a Measurement of Association (MoA) which is calculated for parent/child sensors with compatible hierarchy ranking, based on one or more factors such as measurement similarities, motion timing, motion classification, and/or proximity between the sensors.
  • the child sensor becomes associated with the parent sensor.
  • the threshold of association and MoA may be equivalent to a single sensor measurement or a combination of several sensor measurements, for example, if only a single measurement is available (e.g. proximity between sensors), then the MoA and threshold of association provide a method of implementing proximity measurements for association.
  • a measure of disassociation can be provided, of which several implementations are possible. For example, one method of achieving this is by defining a threshold of disassociation which indicates a level which disagreeing measurements must exceed for a sensor to disassociate from another sensor with which it is currently associated.
  • the threshold may be dynamically adjusted and/or updated based on sensor measurements and operational conditions.
  • the threshold is compared to a Measurement of Disassociation (MoD) which is calculated for parent/child sensors based on one or more factors such as measurement similarities, motion timing, motion classification, and proximity between the sensors. If the calculated MoD is above the threshold of disassociation maintained by the child sensor, the child sensor becomes disassociated from the parent sensor.
  • the threshold of disassociation and MoD may be equivalent to a single sensor measurement or a combination of several sensor measurements, for example, if only a single measurement is available (e.g. proximity between sensors), then the MoD and threshold of disassociation provide a method of implementing proximity measurements for disassociation.
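  • By way of illustration only, the MoA/MoD comparisons described above might be sketched as follows; the choice of cues, their weighting, and the threshold values are assumptions and not prescribed by the patent.

```python
# Illustrative sketch only: the cue weighting and threshold values are assumptions.
def measure_of_association(proximity_m, motion_time_diff_s, motion_similarity):
    """Combine cues into a single score: closer proximity, more simultaneous
    motion, and more similar motion all raise the score (each cue ~0..1)."""
    proximity_score = max(0.0, 1.0 - proximity_m / 10.0)          # within ~10 m
    timing_score = max(0.0, 1.0 - abs(motion_time_diff_s) / 2.0)  # within ~2 s
    return (proximity_score + timing_score + motion_similarity) / 3.0

def update_association(associated, moa, mod,
                       assoc_threshold=0.7, disassoc_threshold=0.6):
    """Associate when the MoA exceeds the threshold of association; disassociate
    when the MoD exceeds the threshold of disassociation; else keep current state."""
    if not associated and moa > assoc_threshold:
        return True
    if associated and mod > disassoc_threshold:
        return False
    return associated

moa = measure_of_association(proximity_m=1.5, motion_time_diff_s=0.2, motion_similarity=0.9)
print(update_association(False, moa=moa, mod=0.1))  # child becomes associated
```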
  • Sensors may be optionally assigned with other parameters relating to a sensor or an object associated with the sensor, such as the following: the position and/or orientation of a sensor, if it is known or is able to be determined, and the time of calculation of the position and/or orientation estimate. If the sensor is unable to calculate its position and/or orientation, then the position and/or orientation and time of calculation reported may be based on the last known position and/or orientation of the sensor when it was last associated with a parent sensor for which a position and/or orientation was known. If the position and/or orientation are unknown, then an unknown status is reported. Depending on implementation, the position and/or orientation (and/or other state estimates) may form a core aspect or requirement of the sensor-to-sensor relationship and/or the calculation of the MoA/MoD.
  • a sensor identifier which identifies the type of sensor employed (e.g. an inertial sensor, wireless measurement sensor, GPS sensor/receiver, wheel encoder(s), etc.).
  • An object identifier which defines or categorises the type of object which the sensor represents (e.g. a mobile vehicle/platform, a human, a freely manipulated hand held object such as a weapon, tool, mobile phone, etc.).
  • Sensor mounting information, such as the coordinates indicating the position and/or orientation of where a sensor is fixed/attached to an object. Such sensor mounting information may involve calibration procedures to calculate the sensor position, orientation, misalignment, etc.
  • An associated/disassociated status and hierarchy indicator which indicates whether the sensor is currently associated or disassociated.
  • a hierarchy configuration or indication thereof which indicates where a sensor and/or other sensors lie in a sensor relationship (which may include a chain of sensor-to-sensor relationships), in a similar manner to that illustrated in Figs. 1 to 4.
  • Such a configuration or indication may be static or dynamic and discovery may be supported.
  • a configuration hierarchy may be defined during initialisation and/or dynamically updated during association/disassociation of objects, and may be dynamically discoverable by a sensor/object discovery command via a communications interface.
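  • A hypothetical sketch of the record of information maintained by each sensor is shown below; the field names, types, and defaults are assumptions that merely mirror the items listed above.

```python
# Illustrative sketch only: field names are assumptions based on the record described above.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class SensorRecord:
    sensor_id: str                      # sensor identifier (e.g. "imu", "gps", "wheel_encoder")
    object_type: str                    # object identifier (e.g. "vehicle", "human", "hand_held")
    hierarchy_level: int                # primary hierarchy level (rank)
    hierarchy_sublevel: int = 0         # optional secondary ranking within a group/class
    mounting_offset: Tuple[float, float, float] = (0.0, 0.0, 0.0)  # sensor-to-object mounting
    associated: bool = False            # current associated/disassociated status
    parent_id: Optional[str] = None     # sensor currently acting as parent, if any
    last_pose: Optional[Tuple[float, float, float]] = None  # last known x, y, heading
    last_pose_time: Optional[float] = None                   # time of that estimate
    assoc_threshold: float = 0.7        # threshold of association
    disassoc_threshold: float = 0.6     # threshold of disassociation
    children: List[str] = field(default_factory=list)  # sensors this one may influence

record = SensorRecord(sensor_id="glove-01", object_type="hand_held",
                      hierarchy_level=3, parent_id="torso-07")
print(record.associated)  # False until a MoA check succeeds
```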
  • an object may include a single sensor (in which case a sensor-to-sensor relationship represents an object- to-object relationship) or an object may include a plurality of sensors (in which case a sensor-to-sensor relationship represents a relationship between multiple sensors of a single object).
  • sensor-to-sensor relationships may represent an interaction between sensors which represent multiple objects and/or sensors which represent a single object.
  • Sensors which are at a level of hierarchy which dictates that they may influence other sensors may act upon sensors which are at a level of hierarchy which dictates that they may be influenced (for convenience, defined hereafter as child sensors) in such a manner that a relationship (e.g. any arbitrary relationship such as a kinematic relationship) is defined between the sensors.
  • a lower level hierarchy sensor may represent an object such as a mobile vehicle/platform which acts upon the next higher level sensor which may represent an object such as a human with a body-worn sensor, which in turn acts upon the next higher level sensor which may represent an object such as a hand-held object. In such a manner, a chain of sensor-to-sensor relationships may be defined and dynamically updated.
  • such a scenario may occur when a group of fire-fighters 510, each equipped with body-worn sensors, enter an instrumented vehicle 515 such as a fire truck or helicopter and exit the vehicle once transported to the location of a site 525 (such as a building or other type of area of operation), and commence fire-fighting operations whilst equipped with hand-held extinguishers and/or fire hoses which are instrumented with a sensor.
  • Another example of such a scenario is when a group of miners 510 within an underground mine, each equipped with body-worn sensors, enter an instrumented transportation vehicle 515 which may pick up and drop off miners at work sites 525 within the underground mine, where each miner may be operating heavy machinery 515 and/or equipment which may also be instrumented with sensors.
  • dynamic association and disassociation of sensors and subsequent state estimation allows for tracking of objects, notably higher level (i.e. child) objects, using relationships and hierarchy levels, which otherwise would not be feasible in an underground or dense urban area or in a GPS-denied environment without significant installation of infrastructure.
  • a mobile object such as a human may be represented by multiple sensors.
  • a lower level hierarchy sensor could be assigned as a foot-mounted sensor (e.g. worn or integrated into protective footwear) which acts upon the next higher level sensor which could be attached to the torso (e.g. worn or integrated into a protective vest, or equipment mounted on a belt), which in turn acts upon a next higher level sensor such as a sensor associated with a protective helmet, augmented reality goggles and/or hand-held objects such as tools, weapons or sporting equipment.
  • Such a scenario may occur with teams of personnel equipped for particular tasks, such as security/defence personnel, first responders such as fire-fighters, workers in industries such as mining, personnel utilising tools which may be freely manipulated, and athletes/sportspersons using freely manipulated equipment (e.g. a tennis racquet, a baseball bat, etc.).
  • Each sensor may have an arbitrary motion model which operates independently or in conjunction with a sensor-to-sensor relationship, and/or one or more additional arbitrary motion models which may be assigned as sensor-to-sensor relationship(s).
  • a parent sensor may represent an object such as a human equipped with a waist-mounted body-worn sensor with a human motion model
  • a child sensor may represent an object which is being influenced by the parent object once both sensors are associated, and may make use of a motion model which supports a freely-manipulated hand-held object such as a weapon, sporting equipment or a mobile phone, or a wearable object such as a glove, or an object with a vehicular motion model such as a cart or trolley.
  • the child sensor/object may make use of kinematic techniques for position and/or orientation estimation in relation to the parent sensor/object, and may also utilise a constraint condition to restrict the kinematic degrees of freedom (DOF).
  • Such a kinematic relationship may also define one or more intermediate joints and segments within the kinematic chain which are treated as virtual joints and segments, for example, intermediate joints and segments which represent the torso, limbs, and extremities of a human body between a foot-mounted sensor and/or a waist-mounted sensor and a hand-held object with attached sensor.
  • the child sensor/object may be considered to estimate its motion, position, and/or orientation in a sensor coordinate frame which, according to the nature of the relationship, becomes aligned with and/or constrained by the coordinate frame of the parent sensor/object.
  • a child sensor/object may make use of the location, orientation and/or other information sources obtained from a parent sensor/object (such as the occurrence of a motion, motion detection, motion timing information, motion classification, etc.) as an observation within its own estimation platform (e.g. a probabilistic estimator which treats such sources of information as observations to perform an "update" using an appropriate observation model).
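  • As one possible illustration of treating parent information as an observation, a generic Kalman-style position update is sketched below; the state, observation model, and covariance values are assumptions and not the patent's estimator.

```python
# Illustrative sketch only: a generic Kalman-style position update with assumed covariances.
import numpy as np

def kalman_position_update(x, P, z, R):
    """Update the child's position estimate `x` (2-vector) and covariance `P`
    with an observation `z` derived from the parent's pose, with covariance `R`."""
    H = np.eye(2)                    # the observation directly measures position
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(2) - K @ H) @ P
    return x_new, P_new

x = np.array([12.0, 3.0])            # child's own (drifting) estimate
P = np.diag([4.0, 4.0])              # large uncertainty after dead reckoning
z = np.array([10.2, 4.1])            # parent position plus relationship offset
R = np.diag([0.5, 0.5])              # uncertainty of parent pose and relationship
print(kalman_position_update(x, P, z, R))
```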
  • such measurements may also be used as an observation within the location estimation framework so that range measurements to fixed objects such as buildings, fixtures, terrain, and other references may then be incorporated within the estimation framework without the typical constraints of requiring a sensor which is fixed or rigidly mounted to a platform, or a sensor which is calibrated with respect to a platform such as a vehicle or a surveyed location point or a fixture. Additionally, the benefits of rapid measurement acquisition are apparent as the time required to set-up a stationary surveyed position is reduced or eliminated.
  • the estimation framework may also account for sensor characteristics and measurement errors (e.g. position, range, azimuth, and elevation errors) and utilise information from several weapons each equipped with a laser rangefinder, providing an overall framework catering for ranging for localisation purposes and ranging for targeting purposes from one or more sensors which may be networked (e.g. a body area network (BAN), to improve situational awareness as part of network centric warfare, etc.).
  • range measurements may be strategically acquired for localisation purposes, for example, by measuring the range to a fixed reference and allowing the localisation framework to handle such information accordingly, including generating a map of the environment and/or correlating such measurements with a map of the environment (e.g. range measurements to a wall within an indoor building).
  • Such range measurements may be manually or automatically acquired at certain intervals, for example, at regular time or distance intervals, with respect to a growing uncertainty in the localisation estimate, or if processing resources, power consumption, and sensor radiation/emissions allow, such measurements may be continuously acquired and handled by the location estimation platform (e.g. measurements may be rejected or accepted, several measurements to a single target may be acquired to reduce target uncertainty and eliminate spurious readings and clutter, etc.).
  • range measurements for the purpose of localisation and/or map building may be automatically acquired without user intervention or may be strategically acquired with user intervention, for example, with respect to the location and orientation of the weapon (e.g. while the user points and scans the weapon whilst securing an environment or building or whilst traversing an indoor or outdoor urban environment such as when walking down a lane or corridor).
  • the weapon orientation and motion is determined with a weapon-mounted sensor, and the weapon location may be determined by exploiting the location of a parent body- worn sensor, and the corresponding sensor-to-sensor relationship in conjunction with measurements from the weapon-mounted sensor.
  • if a map of the environment is available, then such measurements may be appropriately handled by a location estimation framework (e.g. as range bearing observations or as features or landmarks which are correlated to the map), or if a map is not available or in addition to availability of a map, such range bearing measurements may be stored, processed, and/or visualised (e.g. as 3D point clouds, or as a feature-based or landmark-based representation utilising techniques such as line extraction or curve fitting, etc.) to acquire information about the surrounding environment, to share such information with other sensors/objects, and to build a map of the environment which may be used for subsequent localisation (e.g. a laser scan map or a feature/landmark map which also may be used in several estimation platforms and techniques known in the prior art, such as scan matching and feature/landmark extraction for Simultaneous Localisation and Mapping (SLAM), etc.).
  • a sensor/beacon 505 may be configured as a child sensor and associated with another mobile object configured as a parent sensor, and once placed within the environment, the sensor/beacon 505 may then disassociate and dynamically adjust its hierarchy level such that it may then act as a parent sensor to other mobile objects (including the same mobile object which it was previously associated with as a child sensor). Configured in such a manner, the sensor may then act as a mobile beacon, inheriting the location of the parent sensor/object at the time of disassociation, and providing localisation information to other sensors (e.g. other mobile objects traversing within the environment, such as personnel, vehicles, etc.).
  • the sensor/beacon may be strategically placed within the environment so that when the location of the beacon is re-visited or newly visited by mobile sensors/objects traversing the environment, the location information configured and transmitted by the beacon aids the location estimate of these sensors/objects.
  • various strategies may be used by the parent sensor/object, for example, placing beacons after travelling a certain distance or after a certain period of time has elapsed, placing beacons with respect to the environment structure such as when travelling around corners, doorways or open spaces, placing beacons with respect to an increasing uncertainty of the localisation estimate, etc.
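  • A small sketch of such a beacon placement strategy is given below; the trigger conditions and threshold values are assumptions chosen purely for illustration.

```python
# Illustrative sketch only: trigger names and threshold values are assumptions.
def should_place_beacon(distance_since_last_m, time_since_last_s,
                        position_uncertainty_m, at_corner_or_doorway):
    """Decide whether the parent sensor/object should drop a mobile beacon,
    based on the placement strategies described above."""
    return (distance_since_last_m > 50.0     # travelled a certain distance
            or time_since_last_s > 120.0     # a certain period has elapsed
            or position_uncertainty_m > 5.0  # growing localisation uncertainty
            or at_corner_or_doorway)         # structural feature of the environment

print(should_place_beacon(12.0, 30.0, 6.2, False))  # True: uncertainty has grown too large
```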
  • measurement information from the beacon may be handled by each sensor as an observation within its own estimation platform (e.g. a probabilistic estimator which treats such sources of information as observations to perform an "update" using an appropriate observation model).
  • the dispersed beacons also may be configured to act as signal repeaters and routing nodes which enable data transmission/reception and communication in difficult/harsh environments where the transmission range and/or power is insufficient for direct transmission between units (e.g. in an underground mine or dense urban environment), or as an alternate means of transmission to fixed infrastructure or where direct transmission is unreliable, affording a degree of redundancy by routing messages via one or more nodes.
  • the estimation framework may also be considered to be an implementation of Simultaneous Localisation and Mapping (SLAM).
  • a similar scenario to the distributed mobile beacons can be envisaged in the case of an instrumented vehicle which transports personnel to the location of an incident or work environment (e.g. a vehicle carrying first responders such as fire-fighters or an incident response team or other safety/defence/security team or industrial workers such as miners, etc.).
  • personnel 510 equipped with body-worn sensors may become dynamically associated with and disassociated from the vehicle 515 whilst performing their duties, so that once stationary one or more sensors mounted or integrated within the vehicle 515 may then provide localisation information to other sensors based on the position and/or orientation of the vehicle (e.g. the vehicle 515 may aid the location estimate of other sensors/objects in a similar manner to the mobile beacons 505, acting as a fixed base-station or command-and-control centre 520).
  • the vehicle sensor(s) may then dynamically adjust/update hierarchy level(s) and/or relationship(s) and measurement(s) accordingly and the process may continue.
  • sensors acting as mobile beacons may also transmit sensor corrections, acting as a ground-based reference station in a similar manner to differential GPS (DGPS), e.g. to act as a reference station for RF signal correction or synchronisation, or to compensate for drifting temperature and variations in environmental temperature or changes in atmospheric pressure due to weather conditions, etc.
  • the nature of the relationship between a parent sensor/object and a child sensor/object ensures that unreliable, error-prone, or drifting measurements, and uncertain and/or approximate motion models do not pose a significant deviation in the motion, position and/or orientation estimates of the child sensor/object.
  • System information such as measurement uncertainties, mapping information, corrections in sensor drift/bias, misalignment and related parameters may be shared between the parent sensor(s)/object(s) and child sensor(s)/object(s) to improve position and/or orientation estimates and other estimation parameters of each individual sensor/object. Such information may flow either from the parent sensor(s)/object(s) to child sensor(s)/object(s) or vice-versa.
  • a sensor may maintain a state estimate (e.g. position and/or orientation) and may provide such information to other sensors.
  • sensors may make use of several frameworks/techniques based on the sensor type and information available, such as integration of inertial sensors, the application of one or more motion models, the application of an optimisation algorithm (which may include constraints), and may utilise one of several available estimation techniques (e.g. a probabilistic estimator such as a Kalman filter, Sequential Monte-Carlo (SMC) approaches such as a Particle Filter, Simultaneous Localisation and Mapping (SLAM), etc.) and various sources of information.
  • Each sensor may utilise a different estimation technique/methodology and measurement uncertainty based on factors such as sensor type and sensor characteristics, environmental conditions, the type of object being tracked, the type of motion and motion dynamics expected, the type of sensor-to-sensor relationship, and processor/memory availability and power consumption. Sensors may also make use of multi-model estimation and dynamically switch between models based on sensor measurements and/or sensor-to-sensor relationships.
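  • A minimal sketch of dynamically selecting a motion model from a classified motion type and the current sensor-to-sensor relationship is shown below; the model names and the classification interface are assumptions for illustration.

```python
# Illustrative sketch only: model names and the classification interface are assumptions.
def select_motion_model(classified_motion: str, associated_parent_type: str = None):
    """Pick a motion model based on the classified motion type and, if the sensor
    is associated, on the type of parent object currently acting upon it."""
    if associated_parent_type == "vehicle":
        return "vehicular_nonholonomic"     # constrained planar vehicle motion
    if classified_motion == "walking":
        return "pedestrian_dead_reckoning"  # step detection plus heading
    if classified_motion == "stationary":
        return "zero_velocity_update"       # ZUPT-style drift correction
    return "free_inertial"                  # fall back to unconstrained integration

print(select_motion_model("walking"))
print(select_motion_model("unknown", associated_parent_type="vehicle"))
```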
  • the sensor association/disassociation comprising sensor hierarchy and MoA/MoD may be implemented by one of several frameworks/techniques known in the literature which deal with information from multiple sensors and/or multiple targets, such as implementations and variations of Multi Target Tracking (MTT), Multiple Hypothesis Tracking (MHT), Interacting Multiple Model (IMM) estimation, sensor registration/association using Probabilistic Data Association (PDA) or Joint Probabilistic Data Association (JPDA), fuzzy logic, and/or Bayesian techniques and estimation methods such as a Kalman Filter or Sequential Monte-Carlo (SMC) approaches such as a Particle Filter, etc., and various strategies for event association such as maximum a posteriori probability (MAP), maximum likelihood (ML) estimation or Mutual Information (MI), etc.
  • a parent sensor/object may be concurrently associated with one or more child sensors/objects (e.g. a vehicle acting as a parent object is associated with several body-worn sensors acting as child objects and representing occupants within a vehicle, or a human with a waist-mounted body-worn sensor acting as a parent object is associated with several child objects such as body-worn items such as clothing or protective gear or equipment carried by personnel, mobile nodes/beacons which may be deployed or dispersed throughout the environment and used for reference measurements and/or relaying communications).
  • An object may be tracked using a variety of types of sensors through either self-contained infrastructure-free techniques (e.g. inertial navigation, visual odometry by an onboard camera, laser measurements, node-to-node measurements, etc.), infrastructure-reliant techniques (e.g. GPS, RFID, Wi-Fi™, active or passive beacons which may be pre-existing or dispersed within the environment during localisation, cell phone network towers, etc.), or any combination of the above.
  • Each object may contain multiple sensors of varying types, however, for the purpose of sensor association/disassociation, each object preferably employs a sensor which enables calculation of a MoA/MoD which may include motion and/or position/trajectory estimation, motion timing, motion estimation/classification, and/or distance/proximity measurements, etc.
  • this may be in the form of one or more sensors measuring across one or more axes of motion, such as inertial sensors (e.g. one or more accelerometers and/or gyroscopes available as discrete components, a complete inertial measurement unit, etc.), magnetic sensors, etc.
  • Each object may also make use of proximity sensors (e.g. RFID, sensor-to-sensor/node-to-node RF measurements, etc.) to determine a relative distance to one or more other mobile sensors/objects or base-stations.
  • a parent sensor may publish, broadcast or otherwise transmit information in relation to the motion it is experiencing. Each child sensor can then listen for such information.
  • a sensor may simultaneously function as a parent and as a child (for example, a human represented by body-worn sensor acts as a parent to a sensor attached to a hand-held object, and simultaneously also acts as a child to a parent sensor which represents a vehicle or platform such as a personnel carrier, a truck or other heavy equipment, a ship or water vessel, etc.).
  • Such information may include continuous sensor data and/or discrete features extracted from sensor data (e.g. a sensor feature vector which contains discrete information about the motion), which may also include information such as the estimated position and/or orientation, measurement uncertainty, motion timing information, and/or an estimation or classification of the motion type, with the aim of allowing a child sensor/object to adequately associate itself with a parent sensor/object and perform state estimation.
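By way of illustration only, the following Python sketch shows one possible layout for such a broadcast record; the dataclass, its field names and the JSON encoding are assumptions made for this example and are not prescribed by the present disclosure.

```python
# Hypothetical sketch of a parent sensor's broadcast record (field names are
# illustrative assumptions, not part of this disclosure).
from dataclasses import dataclass, field, asdict
import json
import time
from typing import List, Optional

@dataclass
class SensorFeatureVector:
    sensor_id: str                      # unique identifier of the broadcasting sensor
    hierarchy_level: int                # parent/child hierarchy level of the sensor
    timestamp: float                    # time of the motion observation (seconds)
    motion_class: Optional[str] = None  # e.g. "entering_vehicle", "picked_up"
    position: Optional[List[float]] = None      # estimated position [x, y, z]
    orientation: Optional[List[float]] = None   # estimated orientation (e.g. roll, pitch, yaw)
    uncertainty: Optional[List[float]] = None   # measurement uncertainty, e.g. covariance diagonal
    features: List[float] = field(default_factory=list)  # discrete features extracted from sensor data

    def to_message(self) -> bytes:
        """Serialise the record for publishing/broadcast over the chosen transport."""
        return json.dumps(asdict(self)).encode("utf-8")

# Example: a parent (vehicle) sensor publishing that it has started moving.
msg = SensorFeatureVector(
    sensor_id="vehicle-01",
    hierarchy_level=0,
    timestamp=time.time(),
    motion_class="vehicle_moving",
    position=[12.3, -4.5, 0.0],
    uncertainty=[0.5, 0.5, 1.0],
    features=[0.12, 0.80, 0.03],
).to_message()
```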
  • Such information may already form part of the data included with the record of information maintained by each sensor.
  • Such information may include wireless characteristics and/or measurements which indicate a level of proximity between objects, for example, Radio Frequency (RF) measurements such as the Received Signal Strength Indicator (RSSI), Time Of Flight (TOF) measurements, Time Difference Of Arrival (TDOA) measurements, Radio Frequency Identification (RFID) signals, and/or other related measurements and technologies such as generation of magnetic fields, acoustic transmissions, electromagnetic transmissions, optical transmissions, and Ultra Wideband (UWB) transmissions, etc.
  • Such communication may take many forms, for example, over a wireless transmission means, a wired network, and/or on a common bus which communicates with multiple sensors (e.g. Wireless Local Area Network (WLAN Wi-FiTM), Local Area Network (LAN), ZigbeeTM, Universal Serial Bus (USBTM), Controller Area Network (CAN) bus, or a custom interface/protocol, etc.).
  • Such communication may be over a particular transmission medium, channel, frequency, modulation, or coding, etc., which is common to a particular group of sensors such that sensors may be arranged in groups which only communicate amongst each other. For example, two teams of personnel, each with body-worn sensors and each with weapon-mounted sensors, may operate as two independent groups, such that each team utilises sensors which communicate only with sensors within that team/group.
  • Such communication may be performed only with sensors/objects which are indicated as being "paired", e.g. a weapon-mounted sensor paired with a body-worn sensor or an instrumented vehicle paired with a body-worn sensor, both of which are allocated to an individual user and thus may only communicate association/disassociation information with each other.
  • the communication protocol may also support a mesh network and dynamic configuration of communication nodes and be "self-healing" to support the addition of new nodes and the loss of existing nodes, and may be based on an accepted communication standard such as IEEE 802.15.4 (ZigbeeTM), etc. Sensors/objects may also act as communication nodes within a sensor network which route messages via one or more sensors and afford a degree of redundancy and/or an extended communication range.
  • the communication protocol may also support or require encryption and/or the exchange of security credentials or keys prior to information exchange and may utilise an external server for key management and verification.
  • the communication protocol may also support or require authentication and/or identification, for example, in conjunction with a biometric scanner/identifier or a pin/password for enabling a device such as a weapon or vehicle.
  • a sensor may choose to transmit/receive such information only when the sensor has detected other sensor(s) within communication range and/or only when the sensor has detected that the sensor is undergoing motion.
  • a sensor may also choose to switch to power saving mode and/or power down various electronic integrated circuits (ICs) and sub-systems such as RF communication ICs and individual sensors.
  • Sensors may also choose not to transmit, publish or receive information in certain situations, such as those which require "stealth" in terms of electronic emissions and/or radio frequency radiation, in situations which indicate that portable power usage is excessive and/or needs to be conserved, or in situations where communication is not possible due to lack of transmission power/range (e.g. in an underground mine or dense urban environment or over large distances).
  • sensor data and appropriate information may be internally buffered and/or stored until such time when such information may be shared, and such situations may be dynamically detected during operation.
  • the sharing of such information may not be required in real-time and thus may occur during post-processing or mission evaluation, negating the requirement for communication in real-time or during active operation.
  • a database of sensor and map information including the sensor position and/or orientation and map(s) may be stored on a remote server which is able to be accessed, updated, and shared with other sensors, such that sensor and map information is stored at a central repository and available for remote access for processing and observation from one or more remote computers or terminals.
  • the server may be hosted locally, nationally, or internationally, and communicates with the host-system using either wired and/or wireless communication (LAN/WLAN, etc.).
  • Such information may be routed from sensor-to-sensor via a mesh network or similar node-to-node communication methodology (e.g. ZigbeeTM).
  • A central unit (such as a processing/data acquisition/communication unit) may represent all data to and from an object and the sensors which represent that object, acting as a gateway for information and communication.
  • Referring to FIG. 5, there is illustrated an example method 50 for transmitting and receiving sensor information at the occurrence of a motion.
  • A check is made to determine if a motion has occurred. If a motion has occurred, then the remaining steps are completed.
  • steps 54 and 58 may be performed independently and concurrently (e.g. as separate processes and/or as event-driven/interrupt-driven procedures).
  • example method 50 may be implemented on a continuous basis, for example, executing at constant intervals during sensor data acquisition, independent of the occurrence of a motion. This may be particularly useful for applications such as motion capture.
  • example method 50 may be implemented by utilising a common processing unit which may receive and process parent and child sensor information accordingly without parent and child sensors communicating directly.
  • a child sensor may consider two scenarios: when the child sensor is in a disassociated state; and when the child sensor is in an associated state.
  • Referring to FIG. 6, there is illustrated an example method 70 for a child sensor in a disassociated state to determine if the child sensor may be associated with a parent sensor.
  • a check is made to determine if the child sensor is undergoing motion. If the child sensor is undergoing motion, then at step 74, a check is made to determine if one or more valid parent sensors are available.
  • a valid parent sensor is considered to be a sensor which is within communication and has a compatible hierarchy level. If one or more valid parent sensors are not available, then at step 78, the child sensor may perform state estimation as a disassociated sensor using any of the various frameworks/techniques discussed herein.
  • communication is generically used and may include wireless or wired communication in real-time or near real-time, or may include post-processing.
  • communication is also used to include access to data stored on a medium such as a disk or memory card, so that all sensor data which is able to be accessed (e.g. during post-processing) is regarded as being within communication.
  • At step 76, a check is made to determine if the valid parent sensor(s) are undergoing motion. If true, all valid parent sensors undergoing motion are selected for possible association. If false, at step 78, the sensor may perform state estimation as a disassociated sensor using any of the various frameworks/techniques discussed herein.
  • At steps 80-86, valid parent sensors are analysed for possible association.
  • a Measurement of Association (MoA) is calculated for each valid parent sensor based on factors such as measurement similarities, motion timing, motion classification, and proximity between the sensors.
  • At step 82, a check is made to determine if the MoA is above a required threshold.
  • If the MoA is not above the required threshold, then at step 84, the sensor is rejected as a potential candidate for association.
  • At step 86, a check is made to determine if more sensors are available. If so, steps 80-86 are repeated until all remaining parent sensors have been exhausted and all candidate parent sensors have been allocated a MoA.
  • At step 88, once all sensors have been inspected, if there are no parent sensors which meet the required MoA threshold, then at step 78, the child sensor may perform state estimation as a disassociated sensor using any of the various frameworks/techniques discussed herein.
  • At step 90, the most-likely sensor is selected based on the MoA, where a higher MoA indicates a closer match between sensors, indicating that there is a high likelihood that they are associated. In most cases, the sensor selected will be the sensor with the highest MoA. If only a single sensor is available for association, then that sensor is selected.
  • the sensor selection process may include any of the various frameworks/techniques discussed herein.
  • the MoA may be considered equivalent to assigning a likelihood measure, and thus typically a sensor is selected after likelihood weighting, with the most-likely sensors having been assigned a higher probability of selection. With reference to Particle Filtering, this can be seen as the re-sampling procedure. With reference to an IMM tracker, the MoA may be used for determining switching probabilities and calculating model likelihoods.
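As a hedged illustration of the selection step described above, the following sketch combines assumed timing, proximity and similarity factors into a single MoA and selects the most-likely parent above a threshold; the weightings, threshold value and helper names are illustrative assumptions only, not values taken from this disclosure.

```python
# Illustrative sketch only: one way a child sensor might score candidate parent
# sensors and select the most likely one (weights and threshold are assumptions).
import math

def measurement_of_association(timing_diff_s, proximity_m, signal_similarity,
                               w_timing=0.4, w_proximity=0.3, w_similarity=0.3):
    """Combine timing, proximity and measurement-similarity factors into a single MoA in [0, 1]."""
    timing_score = math.exp(-timing_diff_s)        # smaller time difference -> higher score
    proximity_score = 1.0 / (1.0 + proximity_m)    # closer sensors -> higher score
    similarity_score = max(0.0, min(1.0, signal_similarity))  # e.g. normalised correlation
    return w_timing * timing_score + w_proximity * proximity_score + w_similarity * similarity_score

def select_parent(candidates, moa_threshold=0.6):
    """Return (sensor_id, MoA) for the candidate with the highest MoA above threshold, or None."""
    scored = [(measurement_of_association(**c["factors"]), c["sensor_id"]) for c in candidates]
    scored = [(moa, sid) for moa, sid in scored if moa >= moa_threshold]
    if not scored:
        return None  # remain disassociated and continue independent state estimation
    best_moa, best_id = max(scored)
    return best_id, best_moa

candidates = [
    {"sensor_id": "vehicle-01", "factors": {"timing_diff_s": 0.2, "proximity_m": 1.5, "signal_similarity": 0.9}},
    {"sensor_id": "vehicle-02", "factors": {"timing_diff_s": 2.5, "proximity_m": 20.0, "signal_similarity": 0.1}},
]
print(select_parent(candidates))   # -> ('vehicle-01', ...)
```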
  • At step 92, the sensors are associated in accordance with their hierarchy, and at step 94, state estimation is performed using the appropriate sensor-to-sensor relationship and by applying the required estimation technique(s).
  • the child sensor and parent sensor(s) are now in an associated state.
  • An example method is also provided for a child sensor in an associated state, to determine if the child sensor may become disassociated, or if the child sensor may maintain association with the parent sensor with which the child sensor is currently associated.
  • At step 102, a check is made to determine if the parent sensor which the child is associated with is undergoing motion. If false, then at step 104, a check is made to determine if the child sensor is undergoing motion. If false, then in terms of sensor association, no further action is taken as neither the child sensor nor the parent sensor is undergoing motion.
  • each sensor may continue to perform internal state estimation and motion analysis which may be especially important with respect to certain sensor technologies and estimation methodologies such as inertial sensing, motion estimation and classification, signal processing and filtering, etc.
  • At step 102, if it was determined that the parent sensor is undergoing motion, or at step 104, if it was determined that the child sensor is undergoing motion, then at step 106, a check is made to determine if the parent sensor is valid.
  • a valid parent sensor is considered to be a sensor which is within communication and with compatible hierarchy level (e.g. the sensor may have dynamically changed hierarchy levels). If the parent sensor is no longer valid, then at step 118 the parent and child sensors are disassociated and at step 120, the child sensor may perform state estimation as a disassociated sensor using any of the various frameworks/techniques discussed herein. The parent sensor and the child sensor are now in a disassociated state.
  • a Measurement of Association (MoA) is calculated for the parent sensor with which the child is currently associated, based on factors such as measurement similarities, motion timing, motion classification, and proximity between the sensors.
  • a Measurement of Disassociation (MoD) is calculated for the parent sensor with which the child is currently associated, based on factors such as measurement similarities, motion timing, motion classification, and proximity between the sensors.
  • one or more measurements may be used to calculate a Measurement of Association (MoA) or a Measurement of Disassociation (MoD) which is compared to the threshold of association and threshold of disassociation maintained by each sensor.
  • measurements may include sensor measurements (e.g. raw measurements, post-processed or filtered measurements, and/or extracted features), trajectory estimates, motion estimation/classification, and timing and proximity information.
  • Sensor measurements may also pass a number of stages of processing such as filtering, and/or may be converted or calculated between various relationships such as acceleration, velocity, displacement, force, energy, work, power, etc., and/or may utilise related principles (e.g. power transfer).
  • such measurements also may be required to satisfy a validation region.
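A minimal sketch, assuming illustrative threshold values and names, of how separate thresholds of association and disassociation can be maintained by a sensor so that it does not rapidly toggle between associated and disassociated states:

```python
# Minimal sketch (assumed threshold values and names) of hysteresis between the
# threshold of association and the threshold of disassociation.
class AssociationState:
    def __init__(self, moa_threshold=0.6, mod_threshold=0.7):
        self.moa_threshold = moa_threshold   # threshold of association
        self.mod_threshold = mod_threshold   # threshold of disassociation
        self.associated_parent = None

    def update(self, parent_id, moa, mod):
        """Update the association state from the latest MoA/MoD for a parent sensor."""
        if self.associated_parent is None:
            if moa >= self.moa_threshold:
                self.associated_parent = parent_id       # associate
        elif self.associated_parent == parent_id:
            if mod >= self.mod_threshold:
                self.associated_parent = None            # disassociate
        return self.associated_parent

state = AssociationState()
state.update("vehicle-01", moa=0.8, mod=0.1)   # becomes associated
state.update("vehicle-01", moa=0.4, mod=0.9)   # becomes disassociated again
```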
  • a child sensor may associate, disassociate, and re-associate itself with a parent sensor by examining the correspondence between its own measurements and those of available parent sensor(s) by analysing one or more of the following measurements (depending on availability and the type of sensors employed):
  • (1) Timing similarities of the occurrence of motions which allow for parent and child sensors/objects to synchronise motions, for example, by a child sensor/object detecting that the child sensor/object is undergoing motion and detecting that a parent sensor/object is also undergoing motion within a measurement window or time frame, where a lower time difference between motions of the sensors results in a higher MoA, and a higher time difference between motions of the sensors results in a higher MoD.
  • the timing measurements may take into account message transmission/reception time (which may include message routing across a number of nodes) and latency involved with data acquisition, measurement, communication, and processing.
  • For performing timing measurements, generally a sensor/object only needs to keep track of internal timing and may make use of an internal clock (e.g. an oscillator source such as a crystal) and perform a relative time measurement between the occurrence of its own motion and the detection/notification of the motion of another sensor.
  • the object may also make use of a common time which has been initialised and/or is synchronised via communication messages (such as wireless transmissions, time synchronisation protocols, etc.) and/or a common time reference such as a GPS clock and/or a pulse output, e.g. GPS 1 pulse per second (1PPS) output.
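The following sketch illustrates, under assumed window and latency values, how the time difference between a child motion and a reported parent motion might be mapped to timing-based MoA/MoD contributions; the exponential mapping and constants are assumptions made for this example.

```python
# Hedged example: time difference between child and parent motions mapped to
# timing-based MoA/MoD contributions (window, latency allowance and mapping are
# illustrative assumptions).
import math

def timing_association(child_motion_time, parent_motion_time,
                       window_s=1.0, latency_allowance_s=0.05):
    """Return (MoA, MoD) contributions in [0, 1] based on motion timing alone."""
    dt = abs(child_motion_time - parent_motion_time)
    dt = max(0.0, dt - latency_allowance_s)   # allow for transmission/processing latency
    if dt > window_s:
        return 0.0, 1.0                       # outside the measurement window: favour disassociation
    moa = math.exp(-3.0 * dt / window_s)      # lower time difference -> higher MoA
    return moa, 1.0 - moa

print(timing_association(child_motion_time=10.02, parent_motion_time=10.00))
```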
  • Such proximity measurements may be supported by RF/magnetic/acoustic signals and other transmissions and/or other forms such as optical/visual detection of features/beacons/tags on an object. For example, by detecting a matching pattern or colour, or by detecting an emitting light source such as a Light Emitting Diode (LED) either in the visual or non-visual spectrum (e.g. an Infra Red (IR) LED) which may transmit a coded pattern for synchronisation and/or identification purposes.
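As an illustrative sketch only, a received signal strength reading might be mapped to an approximate sensor-to-sensor distance using a log-distance path-loss model; the model constants below are assumptions and would need calibration for any real deployment.

```python
# Illustrative sketch: RSSI reading mapped to an approximate distance using a
# log-distance path-loss model (tx_power_dbm and path_loss_exponent are assumed
# calibration constants, not values specified by this disclosure).
def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exponent=2.5):
    """Rough distance estimate (metres) from a received signal strength reading."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

print(rssi_to_distance(-55.0))   # approximate proximity in metres, contributing to the MoA/MoD
```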
  • the child sensor/object may inherit the position estimate of the parent sensor/object and maintain that position estimate after disassociation, thus allowing the child sensor/object to become associated with a parent sensor/object which is within close proximity.
  • the child sensor/object may also inherit from the parent sensor/object other parameters or statistics which define the position estimate, such as the statistical accuracy or uncertainty of the position estimate, etc.
  • (3) Similarities in sensor measurements could be used.
  • Child and parent sensors/objects may exhibit either similarities or differences in sensor measurements and/or trajectories which indicate both are attached by way of a physical relationship resulting in a higher MoA, or which indicate that they are undergoing independent manoeuvre(s) resulting in a higher MoD.
  • the comparison of motions and/or trajectories may take into account only the most recent/short-term portion of the motion trajectory or only motions and/or trajectories which occurred when a motion disturbance is detected.
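A minimal sketch, assuming the sensors can exchange short windows of acceleration-magnitude samples, of a similarity score based on normalised correlation which could contribute to the MoA; the sample windows and the mapping to [0, 1] are illustrative assumptions.

```python
# Hedged example: similarity of two short measurement windows via normalised
# correlation; a score near 1 would contribute to a higher MoA.
def normalised_correlation(a, b):
    """Pearson-style correlation of two equal-length sample windows, mapped to [0, 1]."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    num = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    den_a = sum((x - mean_a) ** 2 for x in a) ** 0.5
    den_b = sum((y - mean_b) ** 2 for y in b) ** 0.5
    if den_a == 0.0 or den_b == 0.0:
        return 0.0
    return 0.5 * (num / (den_a * den_b) + 1.0)   # map [-1, 1] -> [0, 1]

parent_window = [0.1, 0.4, 1.2, 0.9, 0.3]   # acceleration magnitudes from the parent sensor
child_window  = [0.0, 0.5, 1.1, 1.0, 0.2]   # acceleration magnitudes from the child sensor
print(normalised_correlation(parent_window, child_window))  # close to 1 -> similar motion
```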
  • Sensor measurements can indicate specific motions which indicate that sensors/objects are interacting, resulting in a higher MoA (e.g. a body-worn sensor indicating that a person is transitioning to an upright position, and a hand-held object placed on the floor detecting that it is being picked up, or a body-worn sensor indicating that a person is entering a vehicle and sitting down).
  • Sensor measurements can indicate specific motions which indicate that sensors/objects have ceased interacting, resulting in a higher MoD (e.g. a body-worn sensor indicating that a person is transitioning to a kneeling or bent over position, and a hand-held object detecting that it has been placed down on the floor, or a body-worn sensor indicating that a person is exiting a vehicle and standing up).
  • Such measurements may be supported by motion detection and/or classification algorithms which identify the motion being performed, and/or by motion estimation algorithms which are able to estimate the motion being performed.
  • Processing may be performed either on-board the child sensor/object, on-board the parent sensor/object, or at one or more independent local or remote host system(s) and processing platform(s). Processing may be performed in a decentralised and/or distributed manner as a combination of the above with either wired and/or wireless connectivity.
  • the position of the parent and child sensor(s)/object(s) may be represented in 2D/3D and performance aspects and events also may be displayed. Where required, data from sensors is synchronised for processing, analysis, and/or visualisation.
  • Sensor/object synchronisation and/or state estimation may be performed "on demand" at the occurrence of a motion. For example, data may be buffered and/or stored until such time that a significant motion event occurs such as the firing of a weapon, at which time, a child sensor attached to or incorporated within the weapon may choose to perform state estimation (such as position and/or orientation and/or projectile trajectory estimation) in accordance with information from the parent object with which it is associated, or the child sensor may choose to provide such information to the parent object which it is associated with so that the parent sensor/object may perform such state estimation.
  • Sensor/object synchronisation and state estimation may be performed/executed on an event-driven basis (e.g. at the occurrence of a motion and/or with reference to one or more sensor measurements), on a periodic basis (e.g. at regular intervals), on a query-basis (e.g. when requested), or any combination of the above depending on the sensor(s) utilised, the type of object being tracked, the type of motion and motion dynamics expected, the processing resources and battery power available, and the nature of the algorithm(s) implemented.
  • the location of occurrence of such events and trajectories and the location of external events may be stored, including, if available, the probability of occurrence and/or error/uncertainty estimate, for example, representing the uncertainty in position and/or orientation of the external event and/or point of contact of the projectile trajectory as an error ellipsoid.
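As a hedged illustration, a stored 2D position covariance can be converted into an error ellipse for storage or visualisation; the 95% scaling factor and the closed-form eigen decomposition below are assumptions made for this example.

```python
# Hedged example: converting a 2D position covariance into an error ellipse
# (semi-axes and orientation); the scale factor approximates a 95% region.
import math

def error_ellipse(cov_xx, cov_xy, cov_yy, scale=2.4477):
    """Return (major_axis, minor_axis, angle_rad) for a 2x2 covariance matrix."""
    trace = cov_xx + cov_yy
    det = cov_xx * cov_yy - cov_xy * cov_xy
    term = math.sqrt(max(0.0, (trace / 2.0) ** 2 - det))
    eig1 = trace / 2.0 + term       # larger eigenvalue
    eig2 = trace / 2.0 - term       # smaller eigenvalue
    angle = 0.5 * math.atan2(2.0 * cov_xy, cov_xx - cov_yy)  # orientation of the major axis
    return scale * math.sqrt(max(eig1, 0.0)), scale * math.sqrt(max(eig2, 0.0)), angle

print(error_ellipse(0.9, 0.2, 0.3))   # ellipse describing the position uncertainty of an event
```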
  • the resulting motion event(s) and/or motion trajectories (including projectile/object motion trajectories) and/or external event(s) then may be processed for analysis, for example, to generate performance metrics and/or to generate a statistical representation of events.
  • Such an analysis is especially useful for management of such events in real-time, near real-time and/or during an after action review (AAR).
  • Such an analysis may also include visualisation and/or generation of performance metrics and statistics of such event(s) and/or trajectories (including projectile/object motion trajectories) and/or external event(s).
  • the location and/or probability of such threats may also be displayed, resulting in improved situational awareness and management of such threats.
  • For example, if a weapon is discharged, this can represent the location and probability of an external event such as engaging a threat such as an enemy (by projectile trajectory estimation which reveals the location of the target/enemy) and/or as a reactive or defensive event which indicates that a soldier may require support (by knowledge of the location of the sensor/object representing that soldier).
  • Fire-fighters aiming fire extinguishing equipment (e.g. a fire hose or fire extinguisher) at particular locations and regions within a building indicate locations of external events such as threats, in this case, a fire, and the severity of the external event (threat), in this case, the distribution/extent and intensity of the fire, by estimating the trajectory of water spray/scatter and by monitoring the duration of use.
  • Where motion event(s) and trajectories represent performance-oriented events (such as sports), such events may help to improve performance and analysis of a player's skills (e.g. by representing the location of a player and motion/direction/force/etc. of swings and hits of a tennis racquet, golf club or baseball bat, etc.).
  • Such motion events and/or trajectories (including projectile/object trajectories) and/or external events may be clustered and their location may be displayed with a level and probability of an external event which caused or influenced the occurrence of motion events and trajectories.
  • fire-fighters pointing fire extinguishing equipment (such as several fire hoses) for long periods of time at particular regions within a building indicate the location(s) of fires (as external events) with high probability.
  • In sporting applications, such information may be used to analyse a player's performance, strengths, and weaknesses.
  • In industrial environments (e.g. operating heavy machinery such as mining or building equipment, operating tools in a manufacturing plant, etc.), such information may be used for analysing work skills, training staff, and ensuring quality of manufacture, or analysing aspects such as yield efficiency or accuracy or worker fatigue, etc.
  • Such an implementation may, for example, take into account the probability of each motion event (such as the firing of a weapon) and/or the estimated uncertainty in terms of the error ellipsoid of the position and/or orientation of the resultant projectile trajectories (which indicate external events), their points of intersection, and/or, if a building/environment floor plan or map is available as a 2D or 3D representation, their estimated point of impact/collision with obstacles represented within that floor plan/map (e.g. as determined by a technique such as ray tracing).
  • Such representations of motion events and external events can be used for better planning in real-time, near real-time, after action review, and for estimating the probability of future events.
  • an automated path planner which directs vehicles and/or troops may take into account such threats (external events) for avoidance of further conflict or for directing backup and support.
  • an automated path planner may direct fire-fighters manoeuvring within a building to take a path which avoids threats (external events) such as sources of fire or locations of a building which are estimated to be structurally unsafe/unstable due to the intensity/duration of fire as estimated by trajectory estimation. Storing and processing such information also allows for estimating the probability of occurrence of future events. For example, given a history of events, the probability of occurrence and location of threats in a region of conflict, or of a fire of similar duration and/or intensity in a building, may be estimated.
  • Motion events may be detected in several ways, for example, by utilising signal processing and feature extraction techniques operating on continuous sensor data such as detecting peak motions along one or more axes of measurement using inertial sensors (e.g. detecting weapon discharge by peak acceleration measurements along the longitudinal axis of the weapon), by a motion classification algorithm, by physical interfacing (e.g. mechanical, electrical) to a switch, trigger, or other mechanism which is actuated during the motion event, or a combination of the above.
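A simplified sketch, with assumed threshold and refractory values, of detecting a motion event such as a weapon discharge as a peak in acceleration along the longitudinal axis of the object:

```python
# Illustrative sketch only (threshold and refractory window are assumptions):
# declare a motion event when longitudinal acceleration exceeds a peak threshold.
def detect_motion_event(longitudinal_accel, threshold_g=8.0, refractory_samples=50):
    """Return sample indices at which a motion event is declared."""
    events, last_event = [], -refractory_samples
    for i, a in enumerate(longitudinal_accel):
        if abs(a) >= threshold_g and (i - last_event) >= refractory_samples:
            events.append(i)          # declare an event; downstream logic estimates pose/trajectory
            last_event = i
    return events

samples = [0.1, 0.2, 9.5, 3.0, 0.4, 0.1, 0.0, 8.7, 0.2]
print(detect_motion_event(samples, threshold_g=8.0, refractory_samples=3))  # -> [2, 7]
```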
  • a check is made to determine if a motion event (such as the firing of a weapon) has occurred. If a motion event has occurred, then at step 134, the position, orientation and/or trajectory at the occurrence of the motion event is estimated. At step 136, a check is made to determine if the motion event results in the firing or ejection of a projectile/object/substance.
  • If the motion event results in the firing or ejection of a projectile/object/substance, the position, orientation and/or trajectory of the external event is estimated based on the trajectory estimate of the projectile/object/substance, taking into account various parameters and properties regarding the projectile/object/substance, such as aerodynamic properties and/or the force of the motion event and/or meteorological or environmental measurements, etc. If false, then at step 140, the position, orientation and/or trajectory of the external event is estimated.
  • At step 142, a check is then made to determine if a building floor plan or layout of a building and/or a map of the surrounding environment is available.
  • At step 144, the location and/or orientation of the external event and/or the point of impact or collision of the projectile/object/substance (if step 136 was true) with the building, terrain, and/or object(s) represented within the floor plan and/or map of the environment is estimated. If a projectile/object/substance was fired or ejected, this may also include an estimate of the coverage area, taking into account various parameters and properties regarding the projectile/object/substance, such as aerodynamic properties and/or the force of the motion event and/or meteorological or environmental measurements, building/construction materials at the point of impact, etc.
  • the coverage area may represent the extent of damage in the case of a projectile fired by a weapon, or the coverage area due to the spread/distribution of liquid/chemical spray from a fire hose or other equipment, etc.
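For illustration only, the following sketch estimates a projectile trajectory and its point of impact with flat ground from the position and orientation at the motion event; drag, wind and building geometry are ignored here, and the muzzle velocity is an assumed parameter rather than a value from this disclosure.

```python
# Simplified, hedged sketch of a projectile point-of-impact estimate (no drag,
# flat ground); real implementations would account for the parameters listed above.
import math

G = 9.81  # gravitational acceleration (m/s^2)

def point_of_impact(position, heading_rad, elevation_rad, speed_mps):
    """Return (x, y) ground impact point for a projectile launched from `position`."""
    x0, y0, z0 = position
    vz = speed_mps * math.sin(elevation_rad)
    vh = speed_mps * math.cos(elevation_rad)
    # time of flight until the projectile returns to ground level (z = 0)
    t = (vz + math.sqrt(vz * vz + 2.0 * G * z0)) / G
    return (x0 + vh * math.cos(heading_rad) * t,
            y0 + vh * math.sin(heading_rad) * t)

# Weapon discharged 1.5 m above the ground, heading 30 degrees, slight elevation.
print(point_of_impact((0.0, 0.0, 1.5), math.radians(30), math.radians(2), 100.0))
```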
  • a check is made to determine if multiple trajectories exist and if they coincide by trajectory and/or location of external events. If true, then at step 148, similar trajectories and/or external events are clustered and the location (2D/3D) and/or orientation including probability/error uncertainty estimate and/or probability of occurrence of the trajectory and/or external event and/or motion event are estimated.
  • the motion event position and/or orientation and/or probability/error uncertainty and/or probability of motion event occurrence, the resultant trajectory, and the external event location and/or orientation and/or probability/error uncertainty and/or the probability of external event occurrence are stored for future reference and analysis.
  • such information may then be published/transmitted and/or visualised or made available to one or more mobile units or command and control stations, etc.
  • the sensor(s)/object(s) being tracked may be displayed within a 2D/3D visualisation platform which may be displayed at one or more command and control stations, within one or more mobile vehicles (e.g. for collision/proximity avoidance in mining, or at a mobile command and control centre), at one or more mobile units carried by personnel either as separate processing/computing systems or integrated into existing computing/processing systems carried by personnel, and/or displayed on a Head-Up Display (HUD)/wearable visualisation platform.
  • a 2D/3D visualisation may be displayed, indicating the pathway/trajectory followed by the object, including the current location (2D/3D), orientation, and altitude of tracked objects, the location and/or orientation/heading of detected motion events and/or external events, and estimated trajectories at each of those detected motion events (including projectile/object trajectories and/or points of impact). If required, uncertainty estimates (e.g. as an error ellipse) may also be displayed.
  • estimates of location, orientation and/or trajectories of motion events and external events may coincide with physical aspects and representations of the environment, for example walls, floors, ceilings, fixtures and fittings, and mobile objects and/or objects traversing the environment (such as personnel and vehicles, including objects which are being independently tracked by an external system), and thus, may be represented and visualised accordingly, for example, by calculating the location of a collision of a projectile with the environment and objects within the environment (for example, by utilising techniques such as ray tracing).
  • the user viewpoint may be displayed as a 3rd person viewpoint (i.e. as an observer) and may be freely moved around and tilted within the virtual 3D environment during such visualisation.
  • the user viewpoint may be displayed as a 1st person viewpoint of the object being tracked and may automatically move and orient itself with the object, useful for viewing the point-of-view of the object, e.g. a weapon sight as the weapon is positioned and oriented within the environment, and a trajectory which emanates from the weapon and indicates the trajectory a projectile may follow and the collision point within the environment, and if a map is available, objects within the environment.
  • the visualisation of the external event(s) and estimated projectile/object/substance trajectories may take into account various aforementioned parameters such as object size, weight, aerodynamic properties and/or the force of the motion event and/or meteorological or environmental measurements, building/construction materials at the point of impact, etc., such that the projectile trajectory and/or coverage area (in terms of damage or spray of liquid/chemicals etc.) may be analysed and visualised either prior to the firing or ejection of a projectile/object/substance (e.g. estimated and visualised as a trajectory and coverage area in real-time in accordance with dynamic sensor and parameter measurements), and/or after the firing or ejection of a projectile/object/substance (e.g. for improved situational awareness in real-time or near real-time, and/or for post-processing such as when performing an after action review, etc.).
  • the 2D/3D visualisation may include a data overlay for each sensor/object being tracked, such as the object ID/name/type, location coordinates (e.g. in a Cartesian or geographic coordinate system), health status and vital signs, number of motion event(s) detected (e.g. number of shots fired), battery/power levels, etc.
  • a visualisation may be performed in real-time, near real-time, or played back during post-processing and after action review (AAR).
  • Such a visualisation may be included with a notification of an event and occur on-demand due to the occurrence of events, such as weapon discharge.
  • Referring to FIG. 9, there is illustrated a mobile parent object 162 where one or more parent sensors 164 are attached to, associated with or integrated within the mobile parent object 162.
  • a mobile child object 166 likewise has one or more child sensors 168 attached to, associated with or integrated within the mobile child object 166.
  • Information 170 may be transmitted or received between at least one of the parent sensors 164 and at least one of the child sensors 168.
  • At least one of the parent sensors 164 may communicate information 172 with an external system or network, and at least one of the child sensors 168 may communicate information 174 with the same or another external system or network.
  • the sensors 164, 168 could be devices, or provided as sensor systems, that are rigidly attached to or embedded within an object and provide an interface for information exchange to other sensors/objects, and/or to an independent host system or network, either in real-time, near real-time, on-demand, or for storage or post-processing at a later time/date.
  • Information could be processed in a centralised or decentralised manner on-board the sensor/object, offloaded to other sensors/objects, and/or sent to one or more host-systems.
  • Where a number of weapon discharges have occurred, multiple weapon discharges and/or trajectories within close vicinity may result in an escalation of the threat probability based on clustering of nearby trajectories, the number of weapon discharges, the number of personnel involved, and the vicinity of trajectory estimates.
  • This may be performed by employing a form of unsupervised learning or a clustering algorithm, several of which are known in the literature, such as k-means clustering, nearest neighbour methods, and other forms of clustering, classification, and data mining techniques. Based on such information, further statistical information is able to be extracted to analyse the reaction time and accuracy of marksmanship of personnel during performance analysis and training exercises.
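As a hedged example of the clustering mentioned above, a very small k-means implementation can group nearby estimated impact points (external events) so that multiple discharges escalate into a single threat location; the value of k, the iteration count and the sample coordinates are illustrative assumptions.

```python
# Illustrative sketch of clustering estimated impact points with a tiny k-means
# (k, iterations and seed are assumptions; any clustering method could be used).
import math, random

def kmeans(points, k=2, iterations=20, seed=0):
    random.seed(seed)
    centroids = random.sample(points, k)
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[idx].append(p)
        for i, cluster in enumerate(clusters):
            if cluster:
                centroids[i] = tuple(sum(c) / len(cluster) for c in zip(*cluster))
    return centroids, clusters

impact_points = [(10.1, 4.9), (10.3, 5.2), (9.8, 5.0),   # likely one threat location
                 (42.0, 17.5), (41.6, 18.1)]              # a second cluster of discharges
centroids, clusters = kmeans(impact_points, k=2)
print(centroids)   # cluster centres approximate the estimated threat locations
```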
  • A visualisation for the tracking of personnel 305, for example military, police or special services personnel having hand-held weapons 310, is provided.
  • the method/system can track personnel motions or pathways 320, and can determine the number of weapon 310 uses/discharges, and/or projectile trajectories 330 where weapon discharges result in the firing or ejection of a projectile or object.
  • Tracking of personnel 305 can utilise a first sensor, or multiple sensors placed about a body.
  • Tracking of hand-held weapons 310 for trajectory determination or estimation during weapon discharge can utilise one or more sensors mounted on or integrated within weapon 310 carried by one or more personnel 305.
  • FIG. 13 illustrates a visualisation of such an example, whereby body-worn sensors are used to locate friendly-force personnel 405 (displayed as small circles) and motion events such as weapon-discharge result in ballistic trajectory estimation for determination of external events, in this case, the location of an enemy force.
  • Information from a plurality of motion events and trajectories is clustered and processed accordingly to determine a probability of threat location 410 (i.e. external event 410) and heading with error distribution as displayed by the ellipses and threat grid reference displayed by the crosshairs.
  • Friendly force locations 405 and enemy force locations 410 are displayed with respect to an available building layout 420.
  • This provides an example defence implementation for detecting a plurality of motion events (e.g. weapon discharge) and external events (e.g. enemy force locations) including clustering of a plurality of trajectories and probability estimation of the external events (in this example locations) represented as the error ellipses.
  • a fire hose with an attached sensor can indicate the location of threats by estimating the nozzle orientation and water trajectory during application, presenting a set of locations of threats allowing for analysis of the scale and distribution of a building fire, which may be further segregated or distributed across floors of a building, and estimating the extent of damage.
  • tracking a tennis racquet for recording the number of shots, the position and/or orientation of the racquet, and related statistics such as number of forehand or backhand shots, the force and/or velocity of shots, and estimating the trajectory of the ball.
  • tracking of gloves (e.g. boxing gloves) and boots (e.g. soccer/football boots), including tracking of events (e.g. hits/kicks/shots) and related statistics such as force and velocity.
  • sensors integrated into the protective equipment and clothing worn by first responders (e.g. fire-fighters, police, security and defence personnel, etc.) and by workers operating within industrial and hazardous environments (e.g. miners).
  • sensors integrated into the protective equipment and clothing of a motorcycle rider where sensors may be integrated into boots and/or a leather jacket/suit/back/torso protector to monitor the upper torso and/or gloves to monitor hand position and/or helmet to monitor head position, along with one or more sensors mounted onto the motorcycle, with the purpose of estimating the position of the rider with respect to the motorcycle and/or activating active safety devices if the rider and motorcycle become disassociated whilst in motion (i.e. the rider has fallen off and/or crashed).
  • the implementation allows for activation of active safety devices and aids (such as air-bags) at the earliest possible time with a high level of confidence of the occurrence of such a threat, and/or automatically calling for help or emergency services.
  • Such a methodology may be applied to other sports and hobbies on land, on water, and in the air (e.g. motor-sports, speed boats, jet skis, skiing, etc.).
  • Tracking personnel within challenging environments (e.g. underground locations, mines, urban environments, GPS-denied environments, etc.). Interaction between personnel and equipment, for example, fire-fighters, miners, security and defence personnel, and other individuals and/or teams of personnel which are transported via a vehicle, carry and interact with equipment, or operate heavy machinery.
  • Objects being tracked may be shared between users, for example, a firearm, a tennis racquet, or other object/weapon/equipment/tool/device/etc. which is used in the course of performing a service, during team sport, training exercises, electronic entertainment, and/or day-to-day living. Items such as weapons, tools, and equipment. Sporting equipment such as a bat or tennis racquet. Controllers used for electronic or entertainment devices. Team sports which handle and pass on an object such as a ball or a baton. Training facilities and athletic performance analysis and manipulating athletic equipment. Rehabilitation and medical analysis and monitoring progress. Smart homes and environments.
  • The sensors (e.g. sensing systems or hardware) may be physically mounted or attached to objects, either as a motion estimation/localisation-specific device or integrated with existing technology such as communication equipment including mobile phones and radios, weapons such as small-arms and purpose-specific "smart" weapons, safety equipment such as fire extinguishers, wearable safety equipment and clothing such as vests, boots, helmets, and tools such as drills (either hand-held or as an articulated unit attached/mounted to heavy equipment).
  • the sensing systems or hardware may make use of parasitic power from such platforms or equipment, such as utilising the battery or power supply which is either internal to the equipment (e.g. battery in a mobile phone, radio transceiver, mp3 player, vehicle, etc.) or, if needed, external to the equipment (e.g. an external battery pack such as those carried by mining personnel and other industrial workers, which may be mounted/attached to the belt or carried in a backpack or integrated into clothing).
  • Self-contained sensors can be utilised such as the inertial measurement units (IMUs) available from several manufacturers such as HoneywellTM, CrossbowTM, xSensTM, MEMSenseTM, etc.
  • Inertial measurement units or inertial-based navigation sensors which are assembled with discrete components based on Micro Electro-Mechanical Systems (MEMS) sensors can be utilised, for example those manufactured by companies such as Analog DevicesTM, and may include magnetic and barometric sensors such as those manufactured by HoneywellTM and FreescaleTM Semiconductor.
  • Such sensors also commonly implement a data fusion algorithm to optimally combine the outputs of all sensors in a complementary fashion to measure attitude and heading, commonly referred to as an Attitude and Heading Reference System (AHRS).
  • Such sensors may also be pre-existing within certain electronic devices such as mobile phones (e.g. the AppleTM iPhoneTM or HTCTM IncredibleTM), mp3 players (e.g. the Apple iPod), and other computing hardware (e.g. the Apple iPad), typically for the purposes of usability, gaming, and supporting the user interface (UI) in terms of tilt/rotation and input detection using accelerometer(s) and/or gyroscope(s) and/or a magnetic compass for heading determination for navigation applications which typically combine self-contained sensors with GPS or assisted GPS (A-GPS), or for example, fitness/exercise and step/calorie counting pedometer applications such as the NokiaTM SportsTracker.
  • Such electronic devices may allow for external access to the sensor data (e.g. via an Application Programming Interface (API)).
  • Self-contained sensors can be utilised which make direct observations of the environment, such as low-cost and high-end cameras manufactured by PointGreyTM, and laser range-finders such as devices manufactured by Sick and Hokuyo Automatic.
  • Beacon-based sensing and localisation technologies can be utilised which employ measurements from fixed base-stations and/or mobile nodes and are based on various technologies such as Wireless Local Area Network (WLAN/Wi-FiTM) and/or mobile phone towers and/or other communication network towers/infrastructure.
  • For example, such technologies are available from EkahauTM or AeroScoutTM and/or accessible via an Application Programmable Interface (API) from companies such as SkyHook WirelessTM, GoogleTM or AppleTM.
  • Technologies which utilise Ultra Wideband (UWB) measurements are available from companies such as TimeDomain, and other localisation technologies such as those based on ZigBee are available from companies such as AwarePointTM.
  • Sensing platforms/objects/equipment: The aforementioned sensors may be used in conjunction with different types of manned or unmanned robotic platforms and vehicles either with existing facilities for on-board dead reckoning or with retrofitted dead reckoning sensors such as wheel encoders and inertial measurement units and sensors such as cameras and lasers.
  • the aforementioned sensors may also be used in conjunction with any freely-manipulated objects and equipment.
  • Example hardware that could be utilised includes:
  • Vehicles and platforms for operation in land/sea/air environments which have been instrumented and are either in a manned or unmanned configuration and are either custom-designed or are standard production vehicles;
  • Body-worn sensors mounted on a human;
  • Hand-held objects such as weapons, sporting/athletic equipment, tools, safety equipment, and machinery;
  • Processing systems: The methods or processes described herein may be implemented to execute on a processing system 184, which could be part of or separate to sensor system 182, in a variety of different ways.
  • Processing system 184 could store and/or retrieve information or data from local database 190.
  • This may include calculating a state estimate (such as a position, direction, trajectory, motion estimation/classification, etc.) and/or extracting features from sensor data, and providing such information to one or more algorithms responsible for sensor association/disassociation and/or management of motion events and/or sensor-to-sensor estimation using inter-process communication.
  • the host-system might also maintain a map of the environment as a supported map representation (e.g. occupancy grid map, feature-based map, landmark-based map), and provide such information to motion event management, visualisation, and estimation algorithms.
  • The methods or processes described herein may also be implemented on one or more server(s) 186 which communicate with a host-system using either wired and/or wireless communication (LAN/WLAN, etc.), such as via network 188.
  • Server(s) 186 can store and/or retrieve information or data from one or more databases 192.
  • methods or processes described herein may be implemented as an independent process on a remote server which may be hosted locally, nationally, or internationally, and communicates with the host-system using either wired and/or wireless communication (LAN/WLAN, etc.).
  • The remote server may maintain maps of the environment and/or sensor or object parameters (e.g. position and/or orientation, sensor/object identifier, status, etc.).
  • Such functionality may be accessible via an Application Programmable Interface (API) from a map provider (e.g. a digital mapping company, crowd-sourced from users, GoogleTM Maps, etc).
  • processor and/or memory utilisation of the host-system is reduced and a degree of redundancy may be afforded.
  • one or more host-systems operating independently as part of one or more sensor systems may share a common map of the environment and/or sensor or object parameters (e.g. position and/or orientation, sensor/object identifier, status, etc.) with the remote server, and thus the remote server can provide and share such information with several host-systems and act as a common repository of information operating as a cloud computing service.
  • methods or processes described herein may also be implemented as a single process in conjunction with the tasks of the host-system, resulting in a tightly integrated set of tasks such as maintaining and managing the sensor interface, calculating a state estimate and/or extracting features from sensor data, association/disassociation of sensors and management of motion events, sensor-to-sensor estimation, and maintaining a map of the environment and/or sensor or object parameters.
  • the processor executing the motion estimation framework/algorithm(s) may consist of a standard Central Processing Unit (CPU) such as those manufactured by IntelTM or AMDTM and/or processors based on technology available from Advanced RISC Machines (ARMTM), etc., commonly found in desktop and portable handheld computers.
  • the operating system may be a standard operating system such as Microsoft WindowsTM, a UNIX-based operating system such as Linux, or Apple MacOS, a mobile operating system such as iOS, Android or Windows Phone 7, or an industrial operating system such as QNX NeutrinoTM.
  • the operating system may also be an application-specific OS or an embedded OS or Real-Time OS (RTOS).
  • the motion estimation framework/algorithm(s) may be executed either online (in real-time or near real-time) or offline for post-processing or batch-processing.
  • the processor executing the motion estimation framework/algorithm(s) may include an embedded platform such as a microcontroller, microprocessor, Programmable Logic Device (PLD), Digital Signal Processor (DSP), such as those manufactured by AtmelTM, MicrochipTM, Texas Instruments (TI)TM, and/or based on technology available from Advanced RISC Machines (ARMTM), etc.
  • the motion estimation framework/algorithm(s) may execute on one or more cores whilst the remaining cores may be utilised for other tasks such as sensor communication and/or position estimation and/or orientation estimation and/or other state estimation, depending on the number of cores and processor availability and utilisation.
  • processing system 200 generally includes at least one processor 202, or processing unit or plurality of processors, memory 204, at least one input device 206 and at least one output device 208, coupled together via a bus or group of buses 210.
  • input device 206 and output device 208 could be the same device.
  • An interface 212 can also be provided for coupling the processing system 200 to one or more peripheral devices, for example interface 212 could be a PCI card or PC card.
  • At least one storage device 214 which houses at least one database 216 can also be provided.
  • the memory 204 can be any form of memory device, for example, volatile or non-volatile memory, solid state storage devices, magnetic devices, etc.
  • the processor 202 could include more than one distinct processing device, for example to handle different functions within the processing system 200.
  • Input device 206 receives input data 218 and can include, for example, a data receiver or antenna or wireless data adaptor, etc. Input data 218 could come from different sources, for example sensed data in conjunction with data received via a network.
  • Output device 208 produces or generates output data 220 and can include, for example, a display device, a port for example a USB port, a peripheral component adaptor, a data transmitter or antenna such as a wireless network adaptor, etc.
  • Output data 220 could be distinct and derived from different output devices.
  • a user could view data output, or an interpretation of the data output, on, for example, a monitor or using a printer.
  • the storage device 214 can be any form of data or information storage means, for example, volatile or non-volatile memory, solid state storage devices, magnetic devices, etc.
  • a user has access to one or more terminals which are capable of requesting and/or receiving information or data from the sensing system.
  • a terminal may be a type of processing system, computer or computerised device, mobile or cellular telephone, mobile data terminal, portable computer, Personal Digital Assistant (PDA), etc.
  • An information source can be provided to be in communication with the sensor system and can include a server, or any type of terminal, that may be associated with one or more storage devices that are able to store information or data, for example in one or more databases residing on a storage device.
  • the processing system 200 is adapted to allow data or information to be stored in and/or retrieved from, via wired or wireless communication means, the memory or the at least one database 216.
  • the interface 212 may allow wired and/or wireless communication between the processing unit 202 and peripheral components that may serve a specialised purpose.
  • the processor 202 receives instructions as input data 218 via input device 206 and can display processed results or other output to a user by utilising output device 208. More than one input device 206 and/or output device 208 can be provided. It should be appreciated that the processing system 200 may be any form of terminal, specialised hardware, or the like.
  • the processing system 200 may be a part of a networked communications system.
  • Processing system 200 could connect to a network, for example the Internet or a LAN or WAN.
  • Input data 218 and output data 220 could be communicated to other devices via the network.
  • Optional embodiments of the present invention may also be said to broadly consist in the parts, elements and features referred to or indicated herein, individually or collectively, in any or all combinations of two or more of the parts, elements or features, and wherein specific integers are mentioned herein which have known equivalents in the art to which the invention relates, such known equivalents are deemed to be incorporated herein as if individually set forth.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

In one form, positioning, tracking and/or trajectory estimates of a first mobile object are determined in relation to a second mobile object, due to a better determination of positioning, tracking and/or trajectory of the second mobile object being available. Tracking of a first mobile object in relation to a second mobile object enables improved analysis or visualisation of motions and trajectories of the first mobile object. An object can be an asset, an article, a robot, a vehicle, a person, an animal, any type of target, a mobile or cellular telephone, a passive object, a weapon, etc. A first/child mobile object is provided with a first sensor and a second/parent mobile object is provided with a second sensor. In another form, determining a location and a probability of an external event is based on an occurrence of one or more events and trajectories.

Description

POSITIONING, TRACKING AND TRAJECTORY ESTIMATION
OF A MOBILE OBJECT
Technical Field
[001] The present invention generally relates to determining positioning, tracking and/or trajectory estimates of a mobile object. More particularly, the present invention relates to determining positioning, tracking and/or trajectory estimates of a mobile object in relation to a determined positioning, tracking and/or trajectory of another mobile object. Furthermore, the present invention relates to determining a position or location estimate of an external event that is at a distance from or remote to a mobile object.
Background
[002] Many localisation systems which are able to directly observe the environment or make measurements from sensors distributed within the environment are error prone due to the nature of the measurement characteristics of the sensors employed (e.g. RF measurements and building materials), interference or obstructions within the environment of operation (e.g. building layout, obstruction of laser and camera field of view due to crowding, changes in lighting levels, poor Global Positioning Systems (GPS) satellite coverage in urban environments), and/or poor data association with observed landmarks or features within the environment of operation (e.g. for feature-based/landmark-based localisation and mapping). These problems are compounded by the use of low-cost sensors which are noisy and suffer from inherent problems and measurement inaccuracies.
[003] A variety of systems are known that can be used to track mobile objects (e.g. mobile assets) to obtain a positioning, tracking and/or trajectory estimate of a mobile object, such as GPS, inertial and non-inertial sensor devices. Although GPS is a useful tracking system for outdoor tracking applications, a number of limitations exist when applying GPS to indoor and urban navigation or tracking. For example, in indoor environments and in close proximity to buildings the line of sight of GPS satellites may be substantially obscured and GPS signals can be highly attenuated. With weakened signals, GPS receivers have difficulty receiving GPS signals and calculating accurate position information.
[004] Tracking of a mobile object which may be freely positioned, orientated, and manipulated poses several difficulties if infrastructure-dependent sensor measurements (such as GPS, active or passive beacons, fixed cameras, range/bearing/proximity measurements to base stations, and/or other known references) are not available to determine positioning, tracking and/or trajectory estimates of the mobile object. Furthermore, such measurements may be occluded or error prone due to the manipulation or actual use of the mobile object and/or due to the environment of operation, such as near obstructive building materials, remote locations or crowded environments.
[005] There is a need for a method, system and/or computer readable medium of instructions which addresses or at least ameliorates one or more problems inherent in the prior art.
[006] The reference in this specification to any prior publication (or information derived from the prior publication), or to any matter which is known, is not, and should not be taken as an acknowledgment or admission or any form of suggestion that the prior publication (or information derived from the prior publication) or known matter forms part of the common general knowledge in the field of endeavour to which this specification relates.
Brief Summary
[007] In one broad aspect, the present invention relates to tracking of a mobile object in relation to another mobile object for which a more accurate localisation or position is able to be obtained, such as being measured or determined.
[008] In one form, positioning, tracking and/or trajectory estimates of a first mobile object are determined in relation to a second mobile object, due to a better determination of positioning, tracking and/or trajectory of the second mobile object being available. The second mobile object, or sensors associated with the second mobile object, may have access to infrastructure-based information, or may better exploit sensor information and motion models due to mounting of sensors at a specific position or at several positions on the second mobile object.
[009] In another example form, the second mobile object may be better suited to holding or containing sensors, or more accurate sensors, for example the second mobile object may be larger than the first mobile object. In non-limiting examples, the second mobile object could be a person and body-worn sensors could be mounted on the torso or footwear of the person, or the second mobile object could be a vehicle having wheel encoders and/or laser scanners mounted to the vehicle. A variety of other characteristics of the second mobile object could also make the second mobile object more suitable for accurate tracking, such as well-defined motion models and characteristics (e.g. human motion models, zero velocity updates, vehicular motion with nonholonomic constraints, planar motion, motion prediction and planning, etc.). Tracking of a first mobile object in such a manner, that is in relation to a second mobile object, enables the possibility to perform a more detailed analysis and visualisation of motions and trajectories of the first mobile object which otherwise would be difficult to achieve.
[010] Reference to an "object" should be interpreted broadly and could include, for example, an asset, an article, a robot, a vehicle, a person, an animal, any type of target, a mobile or cellular telephone, a passive object, a weapon, a device or apparatus that can fire, emit or project a projectile or substance, a projectile itself that is fired, emitted or projected, etc. Reference to "mobile" also should be interpreted broadly, such as being movable, portable, transportable or the like.
[011] In one aspect, there is provided a computer-implemented method of tracking a first mobile object, provided with or associated with a first sensor, in relation to a second mobile object, provided with or associated with a second sensor, comprising the steps of: determining a position, orientation or trajectory of the second sensor; and determining a position, orientation or trajectory of the first sensor at least partially based on or otherwise in relation to the position, orientation or trajectory of the second sensor. Reference to determining the position, orientation or trajectory means one or more of these features can be determined; a particular feature need not be determined to the exclusion of the other features.
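By way of a non-limiting illustration of this method (a minimal sketch only; the data structures, frame conventions and numeric values below are assumptions made for the example and are not prescribed by this specification), the pose of the first (child) sensor may be obtained by composing the determined pose of the second (parent) sensor with an estimate of the child pose expressed relative to the parent:

    import numpy as np

    def rotation_z(yaw):
        # Rotation about the vertical axis; a planar example is used for brevity.
        c, s = np.cos(yaw), np.sin(yaw)
        return np.array([[c, -s, 0.0],
                         [s,  c, 0.0],
                         [0.0, 0.0, 1.0]])

    def child_pose_from_parent(parent_pos, parent_yaw, rel_pos, rel_yaw):
        # Compose the parent pose with the child pose expressed in the parent
        # frame, yielding the child pose in the world frame.
        world_pos = parent_pos + rotation_z(parent_yaw) @ rel_pos
        world_yaw = parent_yaw + rel_yaw
        return world_pos, world_yaw

    # Hypothetical example: a vehicle-mounted (parent) sensor heading north and
    # a body-worn (child) sensor estimated 1.5 m to its right.
    child_pos, child_yaw = child_pose_from_parent(
        np.array([10.0, 4.0, 0.0]), np.pi / 2, np.array([0.0, -1.5, 0.0]), 0.1)

In practice the relative estimate would itself be maintained by the child sensor's own estimator, as described in the embodiments below.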
[012] In another aspect, there is provided a system for tracking a plurality of mobile objects, wherein at least two mobile objects of the plurality of mobile objects each include at least one sensor, wherein one or more sensor-to-sensor relationships are defined depending on the types of the at least two mobile objects, thereby enabling a first mobile object of the at least two mobile objects to be tracked based at least partially on tracking of a second mobile object of the at least two mobile objects.
[013] Optionally, the one or more sensor-to-sensor relationships are defined depending on one or more of: whether the at least one sensor is associated or disassociated; the hierarchy level of the at least one sensor; parameters of the at least one sensor; dynamic selection of a relationship based on a type of motion of the at least one sensor; and/or, one or more independent motion models.
[014] Reference to a hierarchy should be read as a general reference to any type of hierarchy, relational arrangement, connectivity, chain or structured organisation as applied to sensor levels, associations, configurations or relationships.
[015] Reference to sensor association or disassociation should be read as a general reference to any means of linking or delinking, pairing or unpairing, coupling or uncoupling, or the like, two or more sensors. For example, this could involve changing the hierarchy, relational arrangement, chain, structure, level, association, configuration or relationship information or data between two or more sensors. Reference to a threshold of association/disassociation, or the like, should be read as a general reference to any type of threshold, level, value, measure or degree of association or disassociation.
[016] In another aspect, there is provided a system for tracking a first mobile object, comprising: a first sensor associated with or including the first mobile object; a second sensor associated with or including a second mobile object; at least one processor to determine or estimate a position, orientation or trajectory of the second sensor; and at least one processor to determine or estimate a position, orientation or trajectory of the first sensor; wherein the determined or estimated position, orientation or trajectory of the first sensor is at least partially based on the determined or estimated position, orientation or trajectory of the second sensor. The at least one processor may be the same or different processors in relation to the first sensor and the second sensor determinations.
[017] According to further various example aspects: the first mobile object is provided with or includes a plurality of first sensors; the second mobile object is provided with or includes a plurality of second sensors; the first sensor and the second sensor are related by a fixed hierarchy configuration; and/or the first sensor and the second sensor are related by a variable hierarchy configuration. Optionally, a check can be made if the first sensor is associated with the second sensor, and if the first sensor is associated with the second sensor, then determining the position, orientation or trajectory of the first sensor at least partially based on the position, orientation or trajectory of the second sensor.
[018] According to further various example aspects: the first sensor is a child sensor and the second sensor is a parent sensor; and/or if motion of the second mobile object is detected by the second sensor, then information or data is transmitted to the first sensor.
[019] Optionally, a measure of association is determined between the first sensor and the second sensor. Optionally, the measure of association is compared to a threshold of association to determine if the first sensor and the second sensor are to be associated or remain associated. Optionally, a measure of disassociation is determined between the first sensor and the second sensor. Optionally, the measure of disassociation is compared to a threshold of disassociation to determine if the first sensor and the second sensor are to be disassociated or remain disassociated. Optionally, sensor-to-sensor relationships are assigned by associating or disassociating sensors based on sensor measurements.
[020] In another example aspect, there is included determining a location of an external event at least partially based on using the determined position, orientation or trajectory of the first sensor to determine an alignment or aim of the first mobile object. In another example aspect, there is included determining a trajectory of a projectile or substance fired or ejected by the first mobile object. In another example aspect, there is included estimating a coverage area of the projectile or substance based on parameters of the projectile or substance. In another example aspect, there is included determining a point of impact of the projectile or substance. In another example aspect, there is included determining clustering of a plurality of trajectories, and determining an area of interest based on the determined clustering. In another example aspect, either or both the first sensor and the second sensor are an inertial measurement unit.
[021] In another example embodiment, there is provided a method of and/or system for tracking of one or more mobile objects, each of the one or more mobile objects including at least one sensor. Sensor-to-sensor relationships are established for state estimation, trajectory estimation, projectile trajectory estimation, enabling measurements/observations of the environment, and/or the distribution of location nodes/beacons which may act as location references and as signal repeaters.
[022] In another example embodiment, there is provided a method of and/or system for assigning sensor-to-sensor relationships by associating and disassociating sensors based on sensor measurements and the interaction between sensors (such as motion timing, proximity, similarities in motion and/or trajectory estimates, matching complementary motions, etc.), where the sensor association/disassociation may be static or dynamic.
[023] In another example embodiment, there is provided a method of and/or system for determining the occurrence of external events which cannot reliably be directly measured or observed by the first or second sensor, by detecting events of the first or second sensor through use or manipulation of the first or second object, such as measuring motions by detecting physical contact, the firing of a weapon or object, abrupt changes in motion, or lack of motion (e.g. prolonged static or stationary periods).
[024] Reference to an external event should be read broadly to cover any type of event, occurrence or location determination that is external, removed from, at a distance to, or remote to an object associated with a sensor. For example, an event may occur at the object which is detected by the sensor, such as object motion, and an external event may be determined or estimated, based on various information such as trajectory clustering, which is remote to the object, such as an external event that may have influenced the event at the object.
[025] In another example embodiment, there is provided a method of and/or system for determining the location of an external event by determining the trajectory of a projectile or substance which may be fired, ejected and/or acted upon by an object, based on detection of motion events.
[026] In another example embodiment, there is provided a method of and/or system for determining or estimating the trajectory to an external event, such as the trajectory for a projectile or substance which may be fired, ejected and/or acted upon by an object, and estimating the coverage area of the projectile or substance. For example coverage area may relate to a damage, spread or distribution area, such as near or in the vicinity of the external event, and be based on parameters of the projectile or substance (e.g. ballistic projectile information, projectile/object size, weight, aerodynamic properties), the force of a motion event, sensor measurements/characteristics (e.g. measured acceleration, angular rotation, etc.), and/or meteorological or environmental measurements, etc.
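As a non-limiting sketch of one way such a trajectory end-point and coverage area could be estimated (a point-mass ballistic model with hypothetical launch parameters; drag, wind and the richer ballistic and meteorological inputs mentioned above are deliberately omitted):

    import math

    def trajectory_endpoint(x0, y0, z0, speed, azimuth, elevation, g=9.81):
        # Point-mass flight from (x0, y0, z0); returns the ground impact point,
        # ignoring drag and wind (simplifying assumptions for this sketch).
        vx = speed * math.cos(elevation) * math.cos(azimuth)
        vy = speed * math.cos(elevation) * math.sin(azimuth)
        vz = speed * math.sin(elevation)
        time_of_flight = (vz + math.sqrt(vz * vz + 2.0 * g * z0)) / g
        return x0 + vx * time_of_flight, y0 + vy * time_of_flight

    def coverage_radius(nominal_radius, speed, reference_speed=100.0):
        # Purely illustrative spread estimate: scale a nominal radius for the
        # projectile or substance by the launch speed.
        return nominal_radius * speed / reference_speed

    impact = trajectory_endpoint(0.0, 0.0, 1.8, 75.0,
                                 math.radians(30.0), math.radians(10.0))
    spread = coverage_radius(nominal_radius=4.0, speed=75.0)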
[027] In another example embodiment, there is provided a method of and/or system for determining or estimating the trajectory to an external event which does not result in the firing of a trajectory or projectile or substance, for example, by measuring the aim of an object such as a tool, or determining the gaze of a user (for example, one or more people browsing in an art gallery, whereby external events may be paintings or sculptures of interest, or one or more people traversing an environment, whereby external events are related to user intention/interaction with the environment, such as for human-machine interaction (HMI), augmented reality using a head mounted display (HMD), providing directions and/or guidance with respect to navigating the environment, etc.).
[028] In another example embodiment, there is provided a method of and/or system for determining or visualising the location and/or orientation of mobile objects, including the trajectory of a projectile or substance to an external event (such as a point of impact or coverage/distribution area), and/or storing several trajectories and locations of motion events and external events for concurrent visualisation, management, and analysis of such events.
[029] In another example embodiment, there is provided a method of and/or system for clustering several events and/or trajectories, determining an area of interest based on the number, type, or location of events and/or trajectories to external events, determining a probability of occurrence of events, and performing an assessment and determining the probability of occurrence of one or more external events (such as threats) and their location based on such information. For example, by determining a location and a threat level of a target based on the number and direction of weapon discharges, or determining the performance of an athlete based on sensed motion and trajectory events, such as a level of marksmanship, and/or athletic performance for sporting events, etc.
[030] In another aspect, there is provided a computer-implemented method of determining a location and a probability of an external event based on an occurrence of one or more events and trajectories, comprising the steps of: detecting the occurrence of the one or more events; determining one or more trajectories for the one or more events; determining the location of the external event based on a trajectory end-point for at least one trajectory; and determining the probability of the external event at least partially based on the one or more events and the one or more trajectories.
[031] Optionally, there is also included clustering a plurality of the one or more trajectories, and determining an area of interest based on the clustering. Optionally, the occurrence of the one or more events and the determining of the one or more trajectories for a first mobile object uses at least one first sensor associated with the first mobile object.
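The following sketch illustrates one possible (and deliberately simple) realisation of the steps described in the two preceding paragraphs: trajectory end-points are grouped by proximity, each cluster centre is reported as a candidate external event location, and a probability is assigned from the amount of supporting evidence. The clustering rule, the grouping radius and the probability function are assumptions made for the example only.

    import math

    def cluster_endpoints(endpoints, radius=10.0):
        # Greedy distance-based grouping of trajectory end-points; a stand-in
        # for any clustering technique.
        clusters = []
        for x, y in endpoints:
            for cluster in clusters:
                cx = sum(p[0] for p in cluster) / len(cluster)
                cy = sum(p[1] for p in cluster) / len(cluster)
                if math.hypot(x - cx, y - cy) <= radius:
                    cluster.append((x, y))
                    break
            else:
                clusters.append([(x, y)])
        return clusters

    def external_event_estimates(endpoints, radius=10.0):
        # Return a (location, probability) pair per cluster; the probability is
        # a crude saturating function of the number of supporting trajectories.
        estimates = []
        for cluster in cluster_endpoints(endpoints, radius):
            cx = sum(p[0] for p in cluster) / len(cluster)
            cy = sum(p[1] for p in cluster) / len(cluster)
            probability = 1.0 - math.exp(-0.5 * len(cluster))
            estimates.append(((cx, cy), probability))
        return estimates

    # Hypothetical end-points of several weapon-discharge trajectories.
    events = external_event_estimates([(101.0, 52.0), (99.0, 48.0), (250.0, 10.0)])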
[032] According to various example aspects: an event of the one or more events is the measurement of a motion; an event of the one or more events is release or firing of a projectile or substance; an event of the one or more events is a form of physical contact; an event of the one or more events is detected by motion estimation; an event of the one or more events is detected by a classification algorithm; a trajectory of the one or more trajectories is estimated at least partially based on one or more parameters of the projectile or substance; a coverage area of the projectile or substance is estimated at least partially based on one or more parameters of the projectile or substance; the external event is a threat; the external event is a target or goal; the external event is located by providing a grid reference; and/or the external event is located by providing an azimuth and elevation.
[033] In another example aspect, there is provided a system for determining a location and a probability of an external event based on an occurrence of one or more events and trajectories, comprising: a sensor associated with a mobile object, the sensor able to detect the occurrence of the one or more events; and, at least one processor configured to: determine the one or more trajectories for the one or more events; determine the location of the external event based on a trajectory end-point for at least one trajectory; and, determine the probability of the external event at least partially based on the one or more events and the one or more trajectories.
Brief Description Of Figures
[034] Example embodiments should become apparent from the following description, which is given by way of example only, of at least one preferred but non-limiting embodiment, described in connection with the accompanying figures.
[035] Fig. 1 illustrates a flowchart of an example hierarchy configuration where only a single hierarchy level is utilised;
[036] Fig. 2 illustrates a flowchart of an example hierarchy configuration where a lower level hierarchy sensor influences more than one upper level hierarchy sensors;
[037] Fig. 3 illustrates a flowchart of an example hierarchy configuration where sub- levels of hierarchy are utilised;
[038] Fig. 4 illustrates a flowchart of an example hierarchy configuration where sensors in a primary hierarchy ranking may influence each other and how various stages of sensors in a secondary hierarchy ranking may in turn influence the primary sensor(s);
[039] Fig. 5 illustrates a flowchart of an example method for transmitting and receiving sensor information at the occurrence of a motion;
[040] Fig. 6 illustrates a flowchart of an example method for a child sensor in a disassociated state to determine if the child sensor may be associated with a parent sensor;
[041] Fig. 7 illustrates a flowchart of an example method for a child sensor, in an associated state, to determine if the child sensor may become disassociated, or if the child sensor may maintain association with the parent sensor with which the child sensor is currently associated;
[042] Fig. 8 illustrates a flowchart of an example method for management of motion events and external events;
[043] Fig. 9 illustrates example objects and associated sensors;
[044] Fig. 10 illustrates an example system that can be utilised to embody or give effect to a particular embodiment;
[045] Fig. 11 illustrates an example processing system that can be utilised to embody or give effect to a particular embodiment;
[046] Fig. 12 illustrates an example defence implementation for visualising a plurality of motion events (weapon discharge) to determine external events (enemy force locations);
[047] Fig. 13 illustrates an example defence implementation of detecting a plurality of motion events (weapon discharge) and external events (enemy force locations) including clustering of a plurality of trajectories and probability estimation of external events (in this example locations) represented as error ellipses;
[048] Fig. 14 illustrates an example emergency services/mining implementation, which also applies to other industries involving an interaction between personnel, vehicles, and other objects.
Preferred Embodiments
[049] The following modes, given by way of example only, are described in order to provide a more precise understanding of the subject matter of a preferred embodiment or embodiments. In the figures, incorporated to illustrate features of an example embodiment, like reference numerals are used to identify like parts throughout the figures.
Overview
[050] An embodiment of the present invention relates to tracking of one or more mobile objects to determine the position and/or orientation of one or more of the objects, which could be each object (and, where applicable, articulated segments of an object), thereby allowing for position determination, motion capture, detection and analysis of motion events, estimation of motion trajectory, calculation of performance parameters, and/or deriving of statistical information in relation to the use of the mobile object(s).
[051] Where use of a mobile object requires, for example, alignment, aiming, a physical interaction with another object, throwing an object, and/or the firing or ejection of a projectile or substance, tracking of the object also may be used for detection of motion events related to use of that object and determination, visualisation or estimation of external events which initiated use of that object via the estimation of the trajectory of the object, and/or statistical analysis of several such trajectories. Where use of a mobile object involves, for example, acquisition of sensor measurements or data during manipulation of the mobile object, tracking of the mobile object may also provide a means of data acquisition and analysis, whilst relaxing or eliminating the requirement for static or calibrated/surveyed sensor positioning, thus enabling rapid data acquisition under dynamic conditions. Generally, the occurrence of an external event may be determined when the external event cannot reliably be directly measured or observed by the first or second sensor. For example, occurrence of an external event may be detected by events of the first or second sensor through use or manipulation of the first or second object, such as measuring motions by detecting physical contact, the firing of a weapon or object, abrupt changes in motion, or lack of motion (e.g. prolonged static or stationary periods). Reference to an external event should be read broadly to cover any type of event, occurrence or location determination that is external, removed from, at a distance to, or remote to an object associated with a sensor.
[052] A mobile object may include at least one sensor attached to, associated with, or integrated within the mobile object, or where applicable, multiple sensors can be provided for a mobile object where a plurality of sensors may be attached to or integrated within a mobile object or articulated regions of a mobile object if constructed of one or more segments and joints, such as an articulated vehicle or robot with permanent or semipermanent joints, or wearable sensors attached to the torso, limbs and/or extremities of a human body. Thus, a mobile object may be represented by one or more sensors which collectively represent that mobile object (e.g. sensors which are mounted at segments of the limbs, head, and torso of a human), or may be represented by a single sensor (e.g. a single body-worn sensor mounted at the waist of a human, a sensor attached to a hand-held object such as a weapon or tool, a sensor attached to a vehicle, etc.).
[053] For a plurality of sensors, which includes both scenarios where sensors are provided on a single object or sensors are provided on multiple objects, each sensor may operate independently and may have a defined relationship with one or more other sensors attached to one or more mobile objects, and such relationships may be arbitrarily defined based on the type of mobile object being tracked by a sensor, the type of sensor employed, and the hierarchy level of the relationship. Relationships between sensors may be dynamically assigned and configured as sensors become associated or disassociated based on sensor measurements. At times, several sensors may be consecutively linked by a sensor-to-sensor relationship chain (such as a kinematic chain), and/or several sensors may be concurrently linked by a relationship (for example, several mobile objects such as personnel on-board a mobile platform or vehicle, or several mobile objects carried by personnel).
[054] The tracking system includes one or more mobile objects which are to be tracked, each of which includes at least one sensor, or if required or appropriate, more than one sensor, for example, to track articulated limb movement and/or to track the extremities of limbs. For the purpose of assigning sensor-to-sensor relationships, each sensor can maintain a record of information including several fields which, depending on implementation, may include the following features.
Hierarchy
[055] A hierarchy level of a sensor and a level of hierarchy with which the sensor is able to be associated can be provided, which determines whether the sensor is influenced by other sensors and/or influences other sensors. The hierarchy level (i.e. rank) may be defined in several ways, for example, as a ranking of increasing/decreasing levels of hierarchy which may also include sub-levels of ranking.
[056] Fig. 1 illustrates an example hierarchy configuration 10 where only a single hierarchy pathway is utilised, such that sensors of lower level hierarchy (e.g. Hierarchy level 1) influence sensors of higher level hierarchy (e.g. Hierarchy level 2). For example, a vehicle mounted sensor (Hierarchy level 1) may influence a body-worn sensor (Hierarchy level 2) which may influence a sensor attached to a hand-held object (Hierarchy level 3).
[057] Fig. 2 illustrates an example hierarchy configuration 20 where a lower level hierarchy sensor (e.g. Hierarchy level 1) influences more than one higher level hierarchy sensors (e.g. Hierarchy level 2). For example, a vehicle mounted sensor (e.g. Hierarchy level 1) influences several personnel each equipped with a body-worn sensor (e.g. Hierarchy level 2).
[058] Fig. 3 illustrates an example hierarchy configuration 30 where hierarchy sub-levels are utilised (e.g. Hierarchy sub-levels 2a, 2b, 2c, 2d), where the primary hierarchy level (e.g. Hierarchy level 2) indicates the group/class which the sensor belongs to, where sensors belonging to a group/class of lower level hierarchy (e.g. Hierarchy level 2) influence sensors which belong to a group/class of higher level hierarchy (e.g. Hierarchy level 3). For example, a vehicle mounted sensor (e.g. Hierarchy level 1) may influence a body-worn sensor (e.g. Hierarchy level 2) which may influence a sensor attached to a hand-held object (e.g. Hierarchy level 3). A secondary level of hierarchy ranking which indicates the sensor hierarchy within its own group/class hierarchy can be provided. For example, multiple sensors which are within the group/class of body-worn sensors which influence each other, such as a foot-mounted sensor (e.g. Hierarchy sub-level 2a) influencing a waist-mounted sensor (e.g. Hierarchy sub-level 2b) which in turn influences a sensor integrated into a glove, helmet, head mounted display (HMD), etc. (e.g. Hierarchy sub-level 2c).
[059] Fig. 4 illustrates an example hierarchy configuration 40 showing how sensors in primary hierarchy levels (e.g. Hierarchy levels 1, 2, 3) may influence each other and how various stages of sensors in secondary hierarchy levels (e.g. Hierarchy sub-levels 2b, 2c, 2d, 3b) may in turn influence primary sensor(s), such that sensor association is dependent on both the primary and secondary hierarchy rankings and the number of sensors in the relationship chain. Depending on the nature of the tracking system and the number of sensors, the configuration may be varied, and the hierarchy level with which a sensor may be associated may take many configurations, as displayed by the dashed lines in Fig. 4, each of which indicate a possible order of hierarchy. Depending on the nature of the tracking system, a sensor may be influenced by another sensor of lower than or equal to its own level of hierarchy. For example:
a sensor might be only influenced by a sensor of immediately lower hierarchy level (e.g. where a chain of sensor-to-sensor relationships are defined, such as sensors tracking articulated limb movement, each sensor is only influenced by a preceding sensor);
a sensor might be influenced by a sensor of lower hierarchy level which may not necessarily be of immediately lower hierarchy level (e.g. a sensor for a hand-held object which is freely manipulated by a human and thus is typically associated with a sensor of immediately lower hierarchy such as a body-worn sensor, may be associated with a sensor of even lower hierarchy level such as when it is placed into a vehicle having a sensor or storage unit/docking station/base-station having a sensor); and/or,
a sensor may be influenced by another sensor of the same hierarchy level (e.g. a person with a body-worn sensor physically interacting with another person with a body- worn sensor, a hand-held object with a sensor physically interacting with another hand- held object with a sensor, etc.).
[060] Depending on the nature of the system, multiple levels and/or sub-levels of hierarchy may be required, or a single level and/or sub-level of hierarchy may be sufficient. Depending on the nature of the system, hierarchy levels may be configured such that they represent levels of compatibility between sensors/objects so that only sensors which are compatible may be associated. Depending on the nature of the system, hierarchy levels may be configured such that they represent "ownership", in which case sensors/objects which, for example, are allocated to an individual user (e.g. a weapon-mounted sensor and a body-worn sensor both allocated to an individual soldier) are only associated with each other. Thus, sensor association may be limited or restricted to only those sensors/objects which may be "paired" or "grouped". Note that this does not imply that such sensors/objects do not communicate with other sensors/objects, e.g. for receiving corrections, etc.
[061] The hierarchy configuration may be initialised and/or dynamically updated or adjusted. For example, consider the scenario where a person with a body-worn sensor is determined to be injured and/or immobile (for example determined by way of motion sensor measurements and/or by way of other sensor measurements such as health monitoring, etc.), and thus the hierarchy level may dynamically adjust such that the person requiring assistance may be associated with another body-worn sensor of the same level of hierarchy, such that a person which comes to their aid may physically interact with the injured and/or immobile person to move them to another location (e.g. drag/carry the person away).
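A minimal sketch of how such hierarchy information could be represented and checked is given below. The field names, the lower-or-equal rule, the optional immediately-lower restriction and the ownership pairing are assumptions made for this illustration; the specification allows arbitrary and dynamically adjusted hierarchy configurations.

    from dataclasses import dataclass

    @dataclass
    class HierarchyInfo:
        level: int          # primary hierarchy level (rank), as in Figs. 1 to 4
        sub_level: int = 0  # optional secondary ranking within a group/class
        owner_id: str = ""  # optional "ownership" pairing restriction

    def may_influence(parent, child, immediate_only=False):
        # One possible rule set: a sensor may be influenced by a sensor of lower
        # than or equal hierarchy level, optionally restricted to the immediately
        # lower level, and optionally restricted to sensors with the same owner.
        if parent.owner_id and child.owner_id and parent.owner_id != child.owner_id:
            return False
        if immediate_only:
            return child.level - parent.level == 1
        return parent.level <= child.level

    vehicle = HierarchyInfo(level=1)
    body_worn = HierarchyInfo(level=2, sub_level=1, owner_id="soldier-7")
    hand_held = HierarchyInfo(level=3, owner_id="soldier-7")
    assert may_influence(vehicle, body_worn) and may_influence(body_worn, hand_held)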
Sensor-to-sensor relationship
[062] A sensor-to-sensor relationship can be provided which defines the relationship a sensor may have with another sensor with which it may be associated. Such relationships can be arbitrarily defined and a sensor may have more than one relationship which is selected based on a number of factors, such as:
whether the sensor is associated or whether the sensor is disassociated;
if the sensor is associated, the hierarchy level of the sensor which it is associated with;
if the sensor is associated, various parameters or statistics which characterise the sensor and the sensor performance with which it is associated, such as sensor accuracy, drift, temperature, operational conditions, etc.;
multiple relationships may be integrated with a multi-model estimator, for example, the most appropriate relationship may be dynamically selected based on the type of motion the sensor is undergoing, and/or more than one relationship may be applied using a multi- hypothesis estimator if the motion is uncertain or unknown; and/or,
if the sensor is not associated with another sensor, then it may utilise one or more independent motion models.
Measure of Association (MoA)
[063] A measure of association can be provided, of which several implementations are possible. For example, one example method of achieving this is by defining a threshold of association which indicates a level that agreeing measurements must exceed for a sensor to associate itself with another sensor. The threshold may be dynamically adjusted and/or updated based on sensor measurements and operational conditions. During sensor association, the threshold is compared to a Measurement of Association (MoA) which is calculated for parent/child sensors with compatible hierarchy ranking, based on one or more factors such as measurement similarities, motion timing, motion classification, and/or proximity between the sensors.
[064] If the calculated MoA is above a threshold of association maintained by the child sensor, the child sensor becomes associated with the parent sensor. Depending on sensor measurements and implementation, the threshold of association and MoA may be equivalent to a single sensor measurement or a combination of several sensor measurements, for example, if only a single measurement is available (e.g. proximity between sensors), then the MoA and threshold of association provide a method of implementing proximity measurements for association.
Measure of Disassociation (MoD)
[065] A measure of disassociation can be provided, of which several implementations are possible. For example, one example method of achieving this is by defining a threshold of disassociation which indicates a level that disagreeing measurements must exceed for a sensor to disassociate with another sensor with which it is currently associated. The threshold may be dynamically adjusted and/or updated based on sensor measurements and operational conditions. During sensor disassociation, the threshold is compared to a Measurement of Disassociation (MoD) which is calculated for parent/child sensors based on one or more factors such as measurement similarities, motion timing, motion classification, and proximity between the sensors. If the calculated MoD is above the threshold of disassociation maintained by the child sensor, the child sensor becomes disassociated from the parent sensor. Depending on sensor measurements and implementation, the threshold of disassociation and MoD may be equivalent to a single sensor measurement or a combination of several sensor measurements, for example, if only a single measurement is available (e.g. proximity between sensors), then the MoD and threshold of disassociation provide a method of implementing proximity measurements for disassociation.
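The sketch below illustrates the association/disassociation cycle of Figs. 6 and 7 in simplified form: a MoA or MoD is computed from agreeing or disagreeing evidence and compared against the child sensor's current threshold. The particular weighting of motion similarity, timing and proximity, and the numeric thresholds, are assumptions made for the example; the specification leaves the exact calculation open.

    def measure_of_association(motion_similarity, timing_agreement, proximity_m,
                               proximity_scale=5.0):
        # Combine agreeing evidence into a single MoA in [0, 1]; the equal
        # weighting used here is an arbitrary illustrative choice.
        proximity_score = max(0.0, 1.0 - proximity_m / proximity_scale)
        return (motion_similarity + timing_agreement + proximity_score) / 3.0

    def update_association(child, parent, moa, mod):
        # Associate or disassociate the child sensor by comparing the computed
        # measures against the child's thresholds (cf. Figs. 6 and 7).
        if child["associated_with"] is None:
            if moa > child["association_threshold"]:
                child["associated_with"] = parent["id"]
        elif child["associated_with"] == parent["id"]:
            if mod > child["disassociation_threshold"]:
                child["associated_with"] = None
        return child

    child = {"id": "weapon-3", "associated_with": None,
             "association_threshold": 0.6, "disassociation_threshold": 0.7}
    parent = {"id": "body-worn-7"}
    moa = measure_of_association(motion_similarity=0.8, timing_agreement=0.9,
                                 proximity_m=1.2)
    child = update_association(child, parent, moa, mod=0.0)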
Sensor or object parameters
[066] Sensors may be optionally assigned with other parameters relating to a sensor or an object associated with the sensor such as the following:
[067] (a) The position and/or orientation of a sensor, if it is known or is able to be determined, and the time of calculation of the position and/or orientation estimate. If the sensor is unable to calculate its position and/or orientation, then the position and/or orientation and time of calculation reported may be based on the last known position and/or orientation of the sensor when it was last associated with a parent sensor for which a position and/or orientation was known. If the position and/or orientation are unknown, then an unknown status is reported. Depending on implementation, the position and/or orientation (and/or other state estimates) may form a core aspect or requirement of the sensor-to-sensor relationship and/or the calculation of the MoA/MoD.
[068] (b) The current motion being performed, if motion detection is supported and/or if motion is able to be determined, for example, by sensor measurements or by a motion classification algorithm. If the motion being performed is unknown, then an unknown status is reported. Depending on implementation, the current motion being performed (and/or other state estimates) may form a core aspect or requirement of the sensor-to-sensor relationship and/or the calculation of the MoA/MoD.
[069] (c) A sensor identifier which identifies the type of sensor employed (e.g. an inertial sensor, wireless measurement sensor, GPS sensor/receiver, wheel encoder(s), etc.).
[070] (d) An object identifier which defines or categorises the type of object which the sensor represents (e.g. a mobile vehicle/platform, a human, a freely manipulated hand held object such as a weapon, tool, mobile phone, etc.).
[071] (e) Sensor mounting information such as the coordinates indicating the position and/or orientation of where a sensor is fixed/attached to an object. Such sensor mounting information may involve calibration procedures to calculate the sensor position, orientation, misalignment, etc.
[072] (f) An associated/disassociated status and hierarchy indicator which indicates whether the sensor is currently associated or disassociated.
[073] (g) A hierarchy configuration or indication thereof which indicates where a sensor and/or other sensors lie in a sensor relationship (which may include a chain of sensor-to-sensor relationships), in a similar manner to that illustrated in Figs. 1 to 4. Such a configuration or indication may be static or dynamic and discovery may be supported. For example, a configuration hierarchy may be defined during initialisation and/or dynamically updated during association/disassociation of objects, and may be dynamically discoverable by a sensor/object discovery command via a communications interface.
[074] (h) Various other parameters or statistics which characterise the sensor and object and may be useful for defining sensor relationships and performing state estimation. The fields which store such parameters and statistics may hold fixed values and/or become updated during operation (for example, to indicate drift in the sensors, to indicate the quality of measurements, or to indicate the sensor association status and hierarchy configuration, etc.).
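Drawing together the fields listed in items (a) to (h) above, a sensor's record could be sketched as follows. Field names, types and defaults are assumptions made purely for illustration; an unknown position, orientation or motion is represented here by None, mirroring the unknown status described in items (a) and (b).

    from dataclasses import dataclass, field
    from typing import Optional, Tuple

    @dataclass
    class SensorRecord:
        sensor_id: str                     # (c) sensor type, e.g. "inertial"
        object_id: str                     # (d) type of object represented
        hierarchy_level: int               # hierarchy rank (Figs. 1 to 4)
        hierarchy_sub_level: int = 0
        associated: bool = False           # (f) association status
        associated_with: Optional[str] = None
        position: Optional[Tuple[float, float, float]] = None     # (a) last known position
        orientation: Optional[Tuple[float, float, float]] = None  # (a) e.g. roll/pitch/yaw
        pose_timestamp: Optional[float] = None                    # (a) time of calculation
        current_motion: Optional[str] = None                      # (b) e.g. "walking"
        mounting_offset: Tuple[float, float, float] = (0.0, 0.0, 0.0)  # (e) mounting position
        association_threshold: float = 0.5      # MoA threshold
        disassociation_threshold: float = 0.5   # MoD threshold
        quality: dict = field(default_factory=dict)  # (h) drift, bias, uncertainty, etc.

    weapon_sensor = SensorRecord(sensor_id="inertial", object_id="hand-held-weapon",
                                 hierarchy_level=3)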
Relationships
[075] It should be noted that when referring to sensor relationships, an object may include a single sensor (in which case a sensor-to-sensor relationship represents an object- to-object relationship) or an object may include a plurality of sensors (in which case a sensor-to-sensor relationship represents a relationship between multiple sensors of a single object). Thus, sensor-to-sensor relationships may represent an interaction between sensors which represent multiple objects and/or sensors which represent a single object.
[076] Sensors which are at a level of hierarchy which dictates that they may influence other sensors (for convenience, defined hereafter as parent sensors) may act upon sensors which are at a level of hierarchy which dictates that they may be influenced (for convenience, defined hereafter as child sensors) in such a manner that a relationship (e.g. any arbitrary relationship such as a kinematic relationship) is defined between the sensors.
[077] For example, a lower level hierarchy sensor may represent an object such as a mobile vehicle/platform which acts upon the next higher level sensor which may represent an object such as a human with a body-worn sensor, which in turn acts upon the next higher level sensor which may represent an object such as a hand-held object. In such a manner, a chain of sensor-to-sensor relationships may be defined and dynamically updated.
[078] As an illustrative example, referring to Fig. 14, such a scenario may occur when a group of fire-fighters 510, each equipped with body-worn sensors, enter an instrumented vehicle 515 such as a fire truck or helicopter and exit the vehicle once transported to the location of a site 525 (such as a building or other type of area of operation), and commence fire-fighting operations whilst equipped with hand-held extinguishers and/or fire hoses which are instrumented with a sensor. Another example of such a scenario is when a group of miners 510 within an underground mine, each equipped with body-worn sensors, enter an instrumented transportation vehicle 515 which may pick up and drop off miners at work sites 525 within the underground mine, where each miner may be operating heavy machinery 515 and/or equipment which may also be instrumented with sensors. In such examples, dynamic association and disassociation of sensors and subsequent state estimation allows for tracking of objects, notably higher level (i.e. child) objects, using relationships and hierarchy levels, which otherwise would not be feasible in an underground or dense urban area or in a GPS-denied environment without significant installation of infrastructure.
[079] Similarly, a mobile object such as a human may be represented by multiple sensors. A lower level hierarchy sensor could be assigned as a foot-mounted sensor (e.g. worn or integrated into protective footwear) which acts upon the next higher level sensor which could be attached to the torso (e.g. worn or integrated into a protective vest, or equipment mounted on a belt), which in turn acts upon a next higher level sensor such as a sensor associated with a protective helmet, augmented reality goggles and/or hand-held objects such as tools, weapons or sporting equipment. Such a scenario may occur with teams of personnel equipped for particular tasks, such as security/defence personnel, first responders such as fire-fighters, workers in industries such as mining, personnel utilising tools which may be freely manipulated, and athletes/sportspersons using freely manipulated equipment (e.g. a tennis racquet, a baseball bat, etc.).
[080] Each sensor may have an arbitrary motion model which operates independently or in conjunction with a sensor-to-sensor relationship, and/or one or more additional arbitrary motion models which may be assigned as sensor-to-sensor relationship(s). For example, a parent sensor may represent an object such as a human equipped with a waist-mounted body-worn sensor with a human motion model, and a child sensor may represent an object which is being influenced by the parent object once both sensors are associated, and may make use of a motion model which supports a freely-manipulated hand-held object such as a weapon, sporting equipment or a mobile phone, or a wearable object such as a glove, or an object with a vehicular motion model such as a cart or trolley.
[081] Depending on the nature of the relationship between a parent and a child sensor/object, the child sensor/object may make use of kinematic techniques for position and/or orientation estimation in relation to the parent sensor/object, and may also utilise a constraint condition to restrict the kinematic degrees of freedom (DOF). Such a kinematic relationship may also define one or more intermediate joints and segments within the kinematic chain which are treated as virtual joints and segments, for example, intermediate joints and segments which represent the torso, limbs, and extremities of a human body between a foot-mounted sensor and/or a waist-mounted sensor and a hand-held object with attached sensor.
[082] Depending on the relationship type, the child sensor/object may be considered to estimate its motion, position, and/or orientation in a sensor coordinate frame which, according to the nature of the relationship, becomes aligned with and/or constrained by the coordinate frame of the parent sensor/object. Furthermore, a child sensor/object may make use of the location, orientation and/or other information sources obtained from a parent sensor/object (such as the occurrence of a motion, motion detection, motion timing information, motion classification, etc.) as an observation within its own estimation platform (e.g. a probabilistic estimator which treats such sources of information as observations to perform an "update" using an appropriate observation model).
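As a non-limiting sketch of such an update (a linear Kalman-style measurement update with a direct position observation and hypothetical noise values; the specification permits any suitable probabilistic estimator and observation model), the child sensor could fold a parent-derived position into its own estimate as follows:

    import numpy as np

    def kalman_position_update(x, P, z, R):
        # Measurement update treating the parent-derived position z (covariance R)
        # as a direct observation of the child position estimate x (covariance P).
        H = np.eye(3)                    # position observed directly
        S = H @ P @ H.T + R              # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
        x_updated = x + K @ (z - H @ x)
        P_updated = (np.eye(3) - K @ H) @ P
        return x_updated, P_updated

    # Hypothetical values: a drifting child estimate corrected by an associated
    # parent sensor whose position is better known.
    x = np.array([12.4, 7.9, 0.0])      # child position estimate
    P = np.diag([4.0, 4.0, 1.0])        # child position covariance
    z = np.array([11.0, 8.5, 0.0])      # position derived from the parent
    R = np.diag([0.25, 0.25, 0.25])     # uncertainty of the parent-derived position
    x, P = kalman_position_update(x, P, z, R)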
[083] For example, consider the scenario of a freely manipulated sensor/object which includes a laser rangefinder which is either integrated into or mounted alongside the sensor/object and acquires range measurements which are taken with respect to the freely manipulated sensor/object. This may occur in the case of a weapon-mounted laser rangefinder which is used for range-to-target purposes, and in the case of target localisation purposes, typically requires a surveyed position which may be determined by GPS and/or other methods such as manual surveying and/or other aids. However, under highly dynamic situations, during urban combat, in heavy foliage or difficult terrain, in adverse weather conditions, in underground and indoor locations, or when GPS is jammed, acquiring such surveyed positions is not feasible and/or GPS may be unreliable or unavailable. By utilising the relationship between child and parent sensors and their respective position and orientation measurements, the laser rangefinder measurements may be processed without the requirement for a surveyed position and without manual map measurements or manual surveying or reliance on GPS, etc.
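Purely as an illustration of how such a measurement could be georeferenced once the weapon pose is available through the sensor-to-sensor relationship (simple spherical-to-Cartesian geometry with hypothetical inputs; the error modelling and map correlation discussed below are omitted):

    import math

    def target_from_range(sensor_position, azimuth, elevation, range_m):
        # Project the rangefinder measurement from the (child) sensor position
        # along the measured azimuth and elevation to obtain a target location.
        x, y, z = sensor_position
        horizontal = range_m * math.cos(elevation)
        return (x + horizontal * math.cos(azimuth),
                y + horizontal * math.sin(azimuth),
                z + range_m * math.sin(elevation))

    # Weapon position inherited via the parent (body-worn) sensor relationship.
    target = target_from_range((120.0, 45.0, 1.6),
                               azimuth=math.radians(75.0),
                               elevation=math.radians(2.0),
                               range_m=180.0)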
[084] Furthermore, in addition to range-to-target purposes, such measurements may also be used as an observation within the location estimation framework so that range measurements to fixed objects such as buildings, fixtures, terrain, and other references may then be incorporated within the estimation framework without the typical constraints of requiring a sensor which is fixed or rigidly mounted to a platform, or a sensor which is calibrated with respect to a platform such as a vehicle or a surveyed location point or a fixture. Additionally, the benefits of rapid measurement acquisition are apparent as the time required to set-up a stationary surveyed position is reduced or eliminated. Thus, incorporating range measurements to fixed references within the environment (where such measurements may also correlate with a map of the environment, if one is available or if one is being simultaneously generated using a technique such as Simultaneous Localisation and Mapping (SLAM)) provides an additional source of information which aids the location estimation. The estimation framework may also account for sensor characteristics and measurement errors (e.g. position, range, azimuth, and elevation errors) and utilise information from several weapons each equipped with a laser rangefinder, providing an overall framework catering for ranging for localisation purposes and ranging for targeting purposes from one or more sensors which may be networked (e.g. a body area network (BAN), to improve situational awareness as part of network centric warfare, etc.).
[085] When integrated in such a manner, range measurements may be strategically acquired for localisation purposes, for example, by measuring the range to a fixed reference and allowing the localisation framework to handle such information accordingly, including generating a map of the environment and/or correlating such measurements with a map of the environment (e.g. range measurements to a wall within an indoor building). Such range measurements may be manually or automatically acquired at certain intervals, for example, at regular time or distance intervals, with respect to a growing uncertainty in the localisation estimate, or if processing resources, power consumption, and sensor radiation/emissions allow, such measurements may be continuously acquired and handled by the location estimation platform (e.g. measurements may be rejected or accepted, several measurements to a single target may be acquired to reduce target uncertainty and eliminate spurious readings and clutter, etc.).
Illustrative examples
[086] In an illustrative example of a weapon-mounted rangefinder, range measurements for the purpose of localisation and/or map building may be automatically acquired without user intervention or may be strategically acquired with user intervention, for example, with respect to the location and orientation of the weapon (e.g. while the user points and scans the weapon whilst securing an environment or building or whilst traversing an indoor or outdoor urban environment such as when walking down a lane or corridor). In such a scenario, the weapon orientation and motion is determined with a weapon-mounted sensor, and the weapon location may be determined by exploiting the location of a parent body-worn sensor, and the corresponding sensor-to-sensor relationship in conjunction with measurements from the weapon-mounted sensor. If a map of the environment is available, then such measurements may be appropriately handled by a location estimation framework (e.g. as range bearing observations or as features or landmarks which are correlated to the map), or if a map is not available or in addition to availability of a map, such range bearing measurements may be stored, processed, and/or visualised (e.g. as 3D point clouds, or as a feature-based or landmark-based representation utilising techniques such as line extraction or curve fitting, etc.) to acquire information about the surrounding environment, to share such information with other sensors/objects, and to build a map of the environment which may be used for subsequent localisation (e.g. a laser scan map or a feature/landmark map which also may be used in several estimation platforms and techniques known in the prior art, such as scan matching and feature/landmark extraction for Simultaneous Localisation and Mapping (SLAM), etc.).
[087] Several examples may also make particular use of the dynamic nature of sensor association and the resultant sensor-to-sensor relationships. Referring to Fig. 14, consider the scenario where one or more sensors 505 configured as mobile beacons are carried by one or more mobile objects (e.g. carried by personnel 510, vehicles 515, etc.) and dispersed whilst traversing an environment (e.g. building 525). Whilst being transported, a sensor/beacon 505 may be configured as a child sensor and associated with another mobile object configured as a parent sensor, and once placed within the environment, the sensor/beacon 505 may then disassociate and dynamically adjust its hierarchy level such that it may then act as a parent sensor to other mobile objects (including the same mobile object which it was previously associated with as a child sensor). Configured in such a manner, the sensor may then act as a mobile beacon, inheriting the location of the parent sensor/object at the time of disassociation, and providing localisation information to other sensors (e.g. other mobile objects traversing within the environment, such as personnel, vehicles, etc.). In this scenario the sensor/beacon may be strategically placed within the environment so that when the location of the beacon is re-visited or newly visited by mobile sensors/objects traversing the environment, the location information configured and transmitted by the beacon aids the location estimate of these sensors/objects.
In placing the beacon(s), various strategies may be used by the parent sensor/object, for example, placing beacons after travelling a certain distance or after a certain period of time has elapsed, placing beacons with respect to the environment structure such as when travelling around corners, doorways or open spaces, placing beacons with respect to an increasing uncertainty of the localisation estimate, etc.
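A short sketch of a placement policy combining the strategies just listed is given below; the distance, time and uncertainty thresholds are hypothetical values chosen only for the example.

    def should_place_beacon(distance_since_last_m, time_since_last_s,
                            position_uncertainty_m, at_structural_feature,
                            max_distance=50.0, max_time=120.0, max_uncertainty=5.0):
        # Decide whether the carrying (parent) object should drop the next beacon,
        # based on distance travelled, elapsed time, growing localisation
        # uncertainty, or a structural feature such as a corner or doorway.
        return (distance_since_last_m >= max_distance
                or time_since_last_s >= max_time
                or position_uncertainty_m >= max_uncertainty
                or at_structural_feature)

    drop = should_place_beacon(distance_since_last_m=42.0, time_since_last_s=95.0,
                               position_uncertainty_m=6.2, at_structural_feature=False)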
[088] During operation, measurement information from the beacon (regardless of whether the beacon provides a range, bearing and/or relative proximity, etc.) may be handled by each sensor as an observation within its own estimation platform (e.g. a probabilistic estimator which treats such sources of information as observations to perform an "update" using an appropriate observation model). The dispersed beacons also may be configured to act as signal repeaters and routing nodes which enable data transmission/reception and communication in difficult/harsh environments where the transmission range and/or power is insufficient for direct transmission between units (e.g. in an underground mine or dense urban environment), or as an alternate means of transmission to fixed infrastructure or where direct transmission is unreliable, affording a degree of redundancy by routing messages via one or more nodes (e.g. in the case of destroyed or damaged infrastructure in a collapsed mine or building, in an emergency situation, or when operating in a hostile environment without access to infrastructure, etc.).
[089] Furthermore, as the beacons are dispersed within the environment and simultaneously mapped during localisation, the estimation framework may also be considered to be an implementation of Simultaneous Localisation and Mapping (SLAM). During operation, if the beacons are picked up and/or interacted with (e.g. collected or relocated), they may once again dynamically adjust/update their hierarchy level and/or relationship(s) and measurement(s) accordingly and the process may continue.
[090] A similar scenario to the distributed mobile beacons can be envisaged in the case of an instrumented vehicle which transports personnel to the location of an incident or work environment (e.g. a vehicle carrying first responders such as fire-fighters or an incident response team or other safety/defence/security team or industrial workers such as miners, etc.). Referring to Fig. 14, personnel 510 equipped with body-worn sensors may become dynamically associated with and disassociated from the vehicle 515 whilst performing their duties, so that once stationary one or more sensors mounted or integrated within the vehicle 515 may then provide localisation information to other sensors based on the position and/or orientation of the vehicle (e.g. coordinates and/or range and/or bearing and/or relative proximity measurements, etc.) so that the vehicle 515 may aid the location estimate of other sensors/objects in a similar manner to the mobile beacons 505, acting as a fixed base-station or command-and-control centre 520. Once mobile, the vehicle sensor(s) may then dynamically adjust/update hierarchy level(s) and/or relationship(s) and measurement(s) accordingly and the process may continue.
[091] In the above scenarios of sensors acting as mobile beacons (regardless of whether the sensors are distributed within the environment or are integrated into a vehicle, etc.), such sensors may also transmit sensor corrections, acting as a ground-based reference station in a similar manner to differential GPS (DGPS), e.g. to act as a reference station for RF signal correction or synchronisation, or to compensate for temperature drift and variations in environmental temperature or changes in atmospheric pressure due to weather conditions, etc.
[092] The nature of the relationship between a parent sensor/object and a child sensor/object ensures that unreliable, error-prone, or drifting measurements, and uncertain and/or approximate motion models, do not introduce a significant deviation into the motion, position and/or orientation estimates of the child sensor/object.
[093] System information such as measurement uncertainties, mapping information, corrections in sensor drift/bias, misalignment and related parameters may be shared between the parent sensor(s)/object(s) and child sensor(s)/object(s) to improve position and/or orientation estimates and other estimation parameters of each individual sensor/object. Such information may flow either from the parent sensor(s)/object(s) to child sensor(s)/object(s) or vice-versa.
[094] The dynamic nature of such relationships ensures that objects which may be influenced, handled, or interacted with by one or more parent objects (such as hand-held objects which may be picked up or placed down, manipulated, and/or shared, or personnel entering and leaving a vehicle, etc.) are able to be tracked accordingly in such a manner that a detailed analysis of motions and trajectories (including object and projectile trajectories) can be performed.
[095] A sensor, or memory associated with a sensor, may maintain a state estimate (e.g. position and/or orientation) and may provide such information to other sensors. For state estimation, sensors may make use of several frameworks/techniques based on the sensor type and information available, such as integration of inertial sensors, the application of one or more motion models, the application of an optimisation algorithm (which may include constraints), and may utilise one of several available estimation techniques (e.g. a probabilistic estimator such as a Kalman filter or Sequential Monte-Carlo (SMC) approaches such as a Particle Filter, Simultaneous Localisation and Mapping (SLAM), etc.) and various sources of information (e.g. a map of the environment or a map of features/landmarks within the environment, etc.). Each sensor may utilise a different estimation technique/methodology and measurement uncertainty based on factors such as sensor type and sensor characteristics, environmental conditions, the type of object being tracked, the type of motion and motion dynamics expected, the type of sensor-to-sensor relationship, and processor/memory availability and power consumption. Sensors may also make use of multi-model estimation and dynamically switch between models based on sensor measurements and/or sensor-to-sensor relationships.
[096] The sensor association/disassociation comprising sensor hierarchy and MoA/MoD may be implemented by one of several frameworks/techniques known in the literature which deal with information from multiple sensors and/or multiple targets, such as implementations and variations of Multi Target Tracking (MTT), Multiple Hypothesis Tracking (MHT), Interacting Multiple Model (IMM) estimation, sensor registration/association using Probabilistic Data Association (PDA) or Joint Probabilistic Data Association (JPDA), fuzzy logic, and/or Bayesian techniques and estimation methods such as a Kalman Filter or Sequential Monte-Carlo (SMC) approaches such as a Particle Filter, etc., and various strategies for event association such as maximum a posteriori probability (MAP), maximum likelihood (ML) estimation or Mutual Information (MI), etc. For example, the hierarchy and MoA/MoD may be used for determining switching probabilities and calculating model likelihoods for association/disassociation of sensors/objects and performing state estimation.
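As a hedged sketch of how an MoA/MoD-derived likelihood might feed an IMM-style estimator of the kind mentioned above, the following Python fragment mixes discrete model probabilities through a Markov transition matrix and then weights them by per-model likelihoods; the transition matrix and likelihood values are illustrative assumptions, not values from the specification.

def update_model_probabilities(mu, transition, likelihoods):
    # One IMM-style discrete update: mix the prior model probabilities through the
    # Markov transition matrix, then weight each model by its measurement likelihood.
    n = len(mu)
    predicted = [sum(transition[i][j] * mu[i] for i in range(n)) for j in range(n)]
    unnorm = [predicted[j] * likelihoods[j] for j in range(n)]
    total = sum(unnorm) or 1e-12
    return [u / total for u in unnorm]

# Usage: two hypotheses ("associated with parent", "independent motion"), a sticky
# transition matrix, and MoA-derived likelihoods currently favouring association.
mu = [0.5, 0.5]
transition = [[0.95, 0.05], [0.05, 0.95]]
mu = update_model_probabilities(mu, transition, likelihoods=[0.8, 0.2])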
[097] At any one time, several sensors/objects may be available, each of which may become associated or disassociated. A parent sensor/object may be concurrently associated with one or more child sensors/objects (e.g. a vehicle acting as a parent object is associated with several body-worn sensors acting as child objects and representing occupants within a vehicle, or a human with a waist-mounted body-worn sensor acting as a parent object is associated with several child objects such as body-worn items such as clothing or protective gear or equipment carried by personnel, mobile nodes/beacons which may be deployed or dispersed throughout the environment and used for reference measurements and/or relaying communications).
[098] An object may be tracked using a variety of types of sensors through either self-contained infrastructure-free techniques (e.g. inertial navigation, visual odometry by an onboard camera, laser measurements, node-to-node measurements, etc.), infrastructure-reliant techniques (e.g. GPS, RFID, Wi-Fi™, active or passive beacons which may be pre-existing or dispersed within the environment during localisation, cell phone network towers, etc.), or any combination of the above.
[099] Each object may contain multiple sensors of varying types; however, for the purpose of sensor association/disassociation, each object preferably employs a sensor which enables calculation of a MoA/MoD which may include motion and/or position/trajectory estimation, motion timing, motion estimation/classification, and/or distance/proximity measurements, etc. Depending on the type of object and motions analysed, this may be in the form of one or more sensors measuring across one or more axes of motion, such as inertial sensors (e.g. one or more accelerometers and/or gyroscopes available as discrete components, a complete inertial measurement unit, etc.), magnetic sensors (e.g. a dual-axis or tri-axis magnetometer measuring the Earth's magnetic field to determine the magnetic heading and/or tilt), RF/wireless measurements, GPS, etc. Each object may also make use of proximity sensors (e.g. RFID, sensor-to-sensor/node-to-node RF measurements, etc.) to determine a relative distance to one or more other mobile sensors/objects or base-stations.
Transmit/receive sensor information
[0100] A parent sensor may publish, broadcast or otherwise transmit information in relation to the motion it is experiencing. Each child sensor can then listen for such information. A sensor may simultaneously function as a parent and as a child (for example, a human represented by a body-worn sensor acts as a parent to a sensor attached to a hand-held object, and simultaneously also acts as a child to a parent sensor which represents a vehicle or platform such as a personnel carrier, a truck or other heavy equipment, a ship or water vessel, etc.).
[0101] Such information may include continuous sensor data and/or discrete features extracted from sensor data (e.g. a sensor feature vector which contains discrete information about the motion), which may also include information such as the estimated position and/or orientation, measurement uncertainty, motion timing information, and/or an estimation or classification of the motion type, with the aim of allowing a child sensor/object to adequately associate itself with a parent sensor/object and perform state estimation. Such information may already form part of the data included with the record of information maintained by each sensor.
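One possible shape for such a record is sketched below in Python; the field names are purely illustrative assumptions about what a sensor feature vector might carry, not a definition from the specification.

from dataclasses import dataclass, field
from typing import Optional, Tuple

@dataclass
class SensorFeatureVector:
    sensor_id: str
    hierarchy_level: int
    timestamp: float                      # seconds, sensor-local or synchronised clock
    motion_detected: bool
    motion_class: Optional[str] = None    # e.g. "walking", "entering_vehicle"
    position: Optional[Tuple[float, float, float]] = None     # estimated x, y, z
    orientation: Optional[Tuple[float, float, float]] = None  # roll, pitch, yaw (rad)
    position_uncertainty: Optional[float] = None               # e.g. 1-sigma radius (m)
    extracted_features: Tuple[float, ...] = field(default_factory=tuple)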
[0102] Such information may include wireless characteristics and/or measurements which indicate a level of proximity between objects, for example, Radio Frequency (RF) measurements such as the Received Signal Strength Indicator (RSSI), Time Of Flight (TOF) measurements, Time Difference Of Arrival (TDOA) measurements, Radio Frequency Identification (RFID) signals, and/or other related measurements and technologies such as generation of magnetic fields, acoustic transmissions, electromagnetic transmissions, optical transmissions, and Ultra Wideband (UWB) transmissions, etc.
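As an example of converting such a wireless measurement into a coarse proximity estimate, a common approach is a log-distance path-loss model. The sketch below is illustrative only; its calibration constants (reference RSSI at 1 m and path-loss exponent) are assumptions and would be environment-dependent in practice.

import math

def rssi_to_distance(rssi_dbm, rssi_at_1m=-45.0, path_loss_exponent=2.5):
    # Coarse range estimate from RSSI using a log-distance path-loss model.
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exponent))

# Usage: an RSSI of -65 dBm maps to roughly 6 m under these assumed constants.
print(round(rssi_to_distance(-65.0), 1))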
[0103] Such communication may take many forms, for example, over a wireless transmission means, a wired network, and/or on a common bus which communicates with multiple sensors (e.g. Wireless Local Area Network (WLAN/Wi-Fi™), Local Area Network (LAN), Zigbee™, Universal Serial Bus (USB™), Controller Area Network (CAN) bus, or a custom interface/protocol, etc.).
[0104] Such communication may be over a particular transmission medium, channel, frequency, modulation, or coding, etc., which is common to a particular group of sensors such that sensors may be arranged in groups which only communicate amongst each other. For example, two teams of personnel, each with body-worn sensors and weapon-mounted sensors, may operate as two independent groups, with each team utilising sensors which communicate only with sensors within that team/group.
[0105] Such communication may be performed only with sensors/objects which are indicated as being "paired", e.g. a weapon-mounted sensor paired with a body-worn sensor or an instrumented vehicle paired with a body-worn sensor, both of which are allocated to an individual user and thus may only communicate association/disassociation information with each other.
[0106] The communication protocol may also support a mesh network and dynamic configuration of communication nodes and be "self-healing" to support the addition of new nodes and the loss of existing nodes, and may be based on an accepted communication standard such as IEEE 802.15.4 (Zigbee™), etc. Sensors/objects may also act as communication nodes within a sensor network which route messages via one or more sensors and afford a degree of redundancy and/or an extended communication range. [0107] The communication protocol may also support or require encryption and/or the exchange of security credentials or keys prior to information exchange and may utilise an external server for key management and verification. The communication protocol may also support or require authentication and/or identification, for example, in conjunction with a biometric scanner/identifier or a pin/password for enabling a device such as a weapon or vehicle.
[0108] For efficiency purposes (e.g. in terms of battery life, power usage, processor utilisation, and/or bandwidth usage), a sensor may choose to transmit/receive such information only when the sensor has detected other sensor(s) within communication range and/or only when the sensor has detected that the sensor is undergoing motion. A sensor may also choose to switch to power saving mode and/or power down various electronic integrated circuits (ICs) and sub-systems such as RF communication ICs and individual sensors. Sensors may also choose not to transmit, publish or receive information in certain situations, such as those which require "stealth" in terms of electronic emissions and/or radio frequency radiation, in situations which indicate that portable power usage is excessive and/or needs to be conserved, or in situations where communication is not possible due to lack of transmission power/range (e.g. in an underground mine or dense urban environment or over large distances). In such examples, sensor data and appropriate information may be internally buffered and/or stored until such time when such information may be shared, and such situations may be dynamically detected during operation.
[0109] In some scenarios, the sharing of such information may not be required in real-time and thus may occur during post-processing or mission evaluation, negating the requirement for communication in real-time or during active operation.
[0110] For sensor/object and map management, a database of sensor and map information including the sensor position and/or orientation and map(s) (e.g. occupancy grid map(s), feature-based map(s), landmark-based map(s), etc.) may be stored on a remote server which is able to be accessed, updated, and shared with other sensors, such that sensor and map information is stored at a central repository and available for remote access for processing and observation from one or more remote computers or terminals. The server may be hosted locally, nationally, or internationally, and communicates with the host-system using either wired and/or wireless communication (LAN/WLAN, etc.). Similarly, such information may be routed from sensor-to-sensor via a mesh network or similar node-to-node communication methodology (e.g. Zigbee™).
[0111] Where an object includes more than one sensor and includes a level of hierarchy which defines sensor interaction within that object (e.g. a primary hierarchy level object such as a human with secondary hierarchy level sensors attached to limbs and/or extremities), a central unit (such as a processing/data acquisition/communication unit) may manage communication, synchronisation, data, and processing of all sensors which belong to that object, such that the central unit may then publish/broadcast relevant information to other sensors/objects (which may also be represented by a central unit and include more than one sensor and level of hierarchy) and may receive relevant information from other sensors/objects and provide such information to sensors within the object which are managed by the central unit. Thus, the central unit represents all data to and from an object and the sensors which represent that object, acting as a gateway for information and communication.
[0112] Referring to Fig. 5, there is illustrated an example method 50 for transmitting and receiving sensor information at the occurrence of a motion. At step 52, a check is made to determine if a motion has occurred. If a motion has occurred, then the remaining steps are completed:
1. At step 54, a check is made to determine if the sensor is a parent sensor. If true, at step 56, the sensor transmits sensor information so that child sensor(s) are able to receive such information;
2. At step 58, a check is made to determine if the sensor is a child sensor. If true, at step 60, the sensor receives such information.
[0113] Alternatively, steps 54 and 58 may be performed independently and concurrently (e.g. as separate processes and/or as event-driven/interrupt-driven procedures).
[0114] Alternatively, example method 50 may be implemented on a continuous basis, for example, executing at constant intervals during sensor data acquisition, independent of the occurrence of a motion. This may be particularly useful for applications such as motion capture.
[0115] Alternatively, example method 50 may be implemented by utilising a common processing unit which may receive and process parent and child sensor information accordingly without parent and child sensors communicating directly.
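A minimal sketch of the motion-triggered exchange of example method 50 is given below in Python; the SensorNode class and the transmit/receive callbacks are illustrative stand-ins for whatever hardware, link, and buffering a real implementation would use, and are not part of the specification.

class SensorNode:
    # Minimal illustrative stand-in for a sensor that may act as a parent and/or a child.
    def __init__(self, is_parent, is_child):
        self.is_parent = is_parent
        self.is_child = is_child
        self.inbox = []

def on_motion_detected(sensor, outgoing_info, transmit, receive):
    # Called once step 52 has already determined that a motion has occurred.
    # Step 54/56: a parent sensor publishes its information so that child sensors may receive it.
    if sensor.is_parent:
        transmit(outgoing_info)
    # Step 58/60: a child sensor listens for information from candidate parent sensors.
    if sensor.is_child:
        sensor.inbox.extend(receive())

# Usage with a trivial in-memory "channel" standing in for the real transport.
channel = []
node = SensorNode(is_parent=False, is_child=True)
on_motion_detected(node, outgoing_info={"id": "sensor-1"}, transmit=channel.append,
                   receive=lambda: list(channel))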
Association/disassociation of sensors/objects
[0116] In determining sensor association, a child sensor may consider two scenarios: when the child sensor is in a disassociated state; and when the child sensor is in an associated state.
[0117] Referring to Fig. 6, there is illustrated an example method 70 for a child sensor in a disassociated state to determine if the child sensor may be associated with a parent sensor. At step 72, a check is made to determine if the child sensor is undergoing motion. If the child sensor is undergoing motion, then at step 74, a check is made to determine if one or more valid parent sensors are available. A valid parent sensor is considered to be a sensor which is within communication and with compatible hierarchy level. If one or more valid parent sensors are not available, then at step 78, the child sensor may perform state estimation as a disassociated sensor using any of the various frameworks/techniques discussed herein. It should be noted that the term communication is generically used and may include wireless or wired communication in real-time or near real-time, or may include post-processing. The term communication is also used to include access to data stored on a medium such as a disk or memory card, so that all sensor data which is able to be accessed (e.g. during post-processing) is regarded as being within communication.
[0118] If valid parent sensor(s) are available, at step 76, a check is made to determine if the valid parent sensor(s) are undergoing motion. If true, all valid parent sensors undergoing motion are selected for possible association. If false, at step 78, the sensor may perform state estimation as a disassociated sensor using any of the various frameworks/techniques discussed herein.
[0119] At steps 80-86, valid parent sensors are analysed for possible association. At step 80, a Measurement of Association (MoA) is calculated for each valid parent sensor based on factors such as measurement similarities, motion timing, motion classification, and proximity between the sensors. At step 82 a check is made to determine if the MoA is above a required threshold. If false, at step 84 the sensor is rejected as a potential candidate for association. At step 86, a check is made to determine if more sensors are available. If so, steps 80-86 are repeated until all remaining parent sensors have been exhausted and all candidate parent sensors have been allocated a MoA. At step 88, once all sensors have been inspected, if there are no parent sensors which meet the required MoA threshold, then at step 78, the child sensor may perform state estimation as a disassociated sensor using any of the various frameworks/techniques discussed herein.
[0120] If at step 88 at least one parent sensor was selected, then at step 90, the most-likely sensor is selected based on the MoA, where a higher MoA indicates a closer match between sensors and therefore a higher likelihood that they are associated. In most cases, the sensor selected will be the sensor with the highest MoA. If only a single sensor is available for association, then that sensor is selected. The sensor selection process may include any of the various frameworks/techniques discussed herein. For example, in some probabilistic estimation methodologies which maintain multiple state estimates (which in this example may be seen as being equivalent to maintaining multiple hypotheses of association), the MoA may be considered equivalent to assigning a likelihood measure, and thus typically a sensor is selected after likelihood weighting, with the most-likely sensors having been assigned a higher probability of selection. With reference to Particle Filtering, this can be seen as the re-sampling procedure. With reference to an IMM tracker, the MoA may be used for determining switching probabilities and calculating model likelihoods.
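The candidate screening and selection of steps 80-90 can be sketched as follows (Python, illustrative only; the MoA values here are assumed to have already been computed from the factors listed above, and the threshold is a placeholder).

def select_parent(candidates, moa_threshold):
    # candidates: list of (parent_id, moa). Returns the most-likely parent id, or None.
    # Steps 80-86: keep only candidates whose MoA clears the association threshold.
    accepted = [(pid, moa) for pid, moa in candidates if moa >= moa_threshold]
    if not accepted:
        return None            # Step 78: continue as a disassociated sensor.
    # Step 90: pick the candidate with the highest MoA (highest likelihood of association).
    return max(accepted, key=lambda c: c[1])[0]

# Usage: three candidate parents scored 0.2, 0.7 and 0.9 against a threshold of 0.5.
print(select_parent([("vehicle", 0.2), ("person_A", 0.7), ("person_B", 0.9)], 0.5))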
[0121] Once a parent sensor has been selected for association, at step 92 the sensors are associated in accordance with their hierarchy, and at step 94 state estimation is performed using the appropriate sensor-to-sensor relationship and by applying the required estimation technique(s). The child sensor and parent sensor(s) are now in an associated state.
[0122] Referring to Fig. 7, there is illustrated an example method 100 for a child sensor, in an associated state, to determine if the child sensor may become disassociated, or if the child sensor may maintain association with the parent sensor with which the child sensor is currently associated. At step 102, a check is made to determine if the parent sensor which the child is associated with is undergoing motion. If false, then at step 104 a check is made to determine if the child sensor is undergoing motion. If false, then in terms of sensor association, no further action is taken as neither the child sensor nor the parent sensor is undergoing motion. Note that each sensor may continue to perform internal state estimation and motion analysis which may be especially important with respect to certain sensor technologies and estimation methodologies such as inertial sensing, motion estimation and classification, signal processing and filtering, etc.
[0123] At step 102, if it was determined that the parent sensor is undergoing motion, or at step 104, if it was determined that the child sensor is undergoing motion, then at step 106, a check is made to determine if the parent sensor is valid. A valid parent sensor is considered to be a sensor which is within communication and with compatible hierarchy level (e.g. the sensor may have dynamically changed hierarchy levels). If the parent sensor is no longer valid, then at step 118 the parent and child sensors are disassociated and at step 120, the child sensor may perform state estimation as a disassociated sensor using any of the various frameworks/techniques discussed herein. The parent sensor and the child sensor are now in a disassociated state.
[0124] If the parent sensor is a valid sensor, then at step 108 a Measurement of Association (MoA) is calculated for the parent sensor with which the child is currently associated, based on factors such as measurement similarities, motion timing, motion classification, and proximity between the sensors. At step 110 a check is made to determine if the MoA is above the required threshold. If true, then the sensors remain associated and at step 112 state estimation is performed using the appropriate sensor-to-sensor relationship and by applying the required estimation technique(s).
[0125] If at step 110 the MoA is not above the required threshold, then at step 114 a measurement of disassociation (MoD) is calculated for the parent sensor with which the child is currently associated, based on factors such as measurement similarities, motion timing, motion classification, and proximity between the sensors. At step 116 a check is made to determine if the MoD is above the required threshold for disassociation. If false, then the sensors remain associated and at step 112 state estimation is performed using the appropriate sensor-to-sensor relationship and by applying the required estimation technique(s). If the MoD is above the threshold, then at step 118 the parent and child sensors are disassociated and at step 120, the child sensor may perform state estimation as a disassociated sensor using any of the various frameworks/techniques discussed herein. The parent sensor and the child sensor are now in a disassociated state.
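The decision logic of steps 108-118 can be summarised as a simple hysteresis test, sketched below in Python with placeholder threshold values; how the MoA and MoD themselves might be computed is described in the following section.

def maintain_or_disassociate(moa, mod, moa_threshold, mod_threshold):
    # Returns True to keep the current parent-child association, False to disassociate.
    # Steps 108-118 of example method 100: using two separate thresholds gives hysteresis.
    if moa >= moa_threshold:
        return True                      # Steps 110/112: association still supported.
    if mod >= mod_threshold:
        return False                     # Steps 116/118: evidence of independent motion.
    return True                          # Ambiguous region: keep the existing association.

# Usage: weak association evidence, but not yet strong disassociation evidence.
print(maintain_or_disassociate(moa=0.4, mod=0.5, moa_threshold=0.6, mod_threshold=0.7))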
Measurement of association/disassociation
[0126] Collectively, one or more measurements may be used to calculate a Measurement of Association (MoA) or a Measurement of Disassociation (MoD) which is compared to the threshold of association and threshold of disassociation maintained by each sensor. Depending on availability, such measurements may include sensor measurements (e.g. raw measurements, post-processed or filtered measurements, and/or extracted features), trajectory estimates, motion estimation/classification, and timing and proximity information. Sensor measurements may also pass a number of stages of processing such as filtering, and/or may be converted or calculated between various relationships such as acceleration, velocity, displacement, force, energy, work, power, etc., and/or may utilise related principles (e.g. power transfer). In some estimation/data association frameworks, such measurements also may be required to satisfy a validation region. A child sensor may associate, disassociate, and re-associate itself with a parent sensor by examining the correspondence between its own measurements and those of available parent sensor(s) by analysing one or more of the following measurements (depending on availability and the type of sensors employed):
[0127] (1) Timing similarities of the occurrence of motions which allow for parent and child sensors/objects to synchronise motions, for example, by a child sensor/object detecting that the child sensor/object is undergoing motion and detecting that a parent sensor/object is also undergoing motion within a measurement window or time frame, where a lower time difference between motions of the sensors results in a higher MoA, and a higher time difference between motions of the sensors results in a higher MoD. The timing measurements may take into account message transmission/reception time (which may include message routing across a number of nodes) and latency involved with data acquisition, measurement, communication, and processing. For performing timing measurements, generally a sensor/object only needs to keep track of internal timing and may make use of an internal clock (e.g. an oscillator source such as a crystal) and perform a relative time measurement between the occurrence of its own motion and the detection/notification of the motion of another sensor. However, for data logging/storage and post-processing, the object may also make use of a common time which has been initialised and/or is synchronised via communication messages (such as wireless transmissions, time synchronisation protocols, etc.) and/or a common time reference such as a GPS clock and/or a pulse output, e.g. GPS 1 pulse per second (1PPS) output.
[0128] (2) Position/proximity measurements which indicate that the parent and child sensors/objects are within close proximity (either as a relative measurement and/or as an absolute measurement when absolute positions of sensors/objects are known), where a lower distance between sensors results in a higher MoA, and a larger distance between sensors results in a higher MoD. Such proximity measurements may be supported by RF/magnetic/acoustic signals and other transmissions and/or other forms such as optical/visual detection of features/beacons/tags on an object. For example, by detecting a matching pattern or colour, or by detecting an emitting light source such as a Light Emitting Diode (LED) either in the visual or non-visual spectrum (e.g. an Infra Red (IR) LED) which may transmit a coded pattern for synchronisation and/or identification purposes.
Where a sensor/object which is not able to maintain an accurate estimation of its position becomes associated with a parent sensor/object which is able to maintain a position estimate, then depending on the nature of the sensor-to-sensor relationship, the child sensor/object may inherit the position estimate of the parent sensor/object and maintain that position estimate after disassociation, thus allowing the child sensor/object to become associated with a parent sensor/object which is within close proximity. In such a scenario, the child sensor/object may also inherit from the parent sensor/object other parameters or statistics which define the position estimate, such as the statistical accuracy or uncertainty of the position estimate, etc. [0129] (3) Similarities in sensor measurements could be used. Child and parent sensors/objects may exhibit either similarities or differences in sensor measurements and/or trajectories which indicate both are attached by way of a physical relationship resulting in a higher MoA, or which indicate that they are undergoing independent manoeuvre(s) resulting in a higher MoD. Depending on the type and quality of sensors employed and the motions performed, the comparison of motions and/or trajectories may take into account only the most recent/short-term portion of the motion trajectory or only motions and/or trajectories which occurred when a motion disturbance is detected.
[0130] (4) Complementary motions performed by parent and child sensors/objects could be used. Sensor measurements can indicate specific motions which indicate that sensors/objects are interacting, resulting in a higher MoA (e.g. a body-worn sensor indicating that a person is transitioning to an upright position, and a hand-held object placed on the floor detecting that it is being picked up, or a body-worn sensor indicating that a person is entering a vehicle and sitting down). Sensor measurements can indicate specific motions which indicate that sensors/objects have ceased interacting, resulting in a higher MoD (e.g. a body-worn sensor indicating that a person is transitioning to a kneeling or bent over position, and a hand-held object detecting that it has been placed down on the floor, or a body-worn sensor indicating that a person is exiting a vehicle and standing up). Such measurements may be supported by motion detection and/or classification algorithms which identify the motion being performed, and/or by motion estimation algorithms which are able to estimate the motion being performed.
General information
[0131] Processing may be performed either on-board the child sensor/object, on-board the parent sensor/object, or at one or more independent local or remote host system(s) and processing platform(s). Processing may be performed in a decentralised and/or distributed manner as a combination of the above with either wired and/or wireless connectivity. The position of the parent and child sensor(s)/object(s) may be represented in 2D/3D and performance aspects and events also may be displayed. Where required, data from sensors is synchronised for processing, analysis, and/or visualisation. Several techniques to perform such synchronisation are available, including time-stamping of data during transmission, reception, and/or storage, initialisation of a global system time, synchronisation by RF measurements, synchronisation with a GPS-corrected system clock from an on-board GPS receiver, and/or using a standard time-synchronisation protocol such as the Network Time Protocol (NTP). Sensor/object synchronisation and state estimation may be performed offline by post-processing sensor data and corresponding timestamps which have been stored during operation.
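For the timestamp-based synchronisation mentioned above, a minimal sketch of an NTP-style two-way exchange is shown below in Python; the timestamps in the usage example are invented purely for illustration.

def clock_offset_and_delay(t1, t2, t3, t4):
    # NTP-style estimate from a two-way exchange:
    # t1 = request sent (local clock), t2 = request received (remote clock),
    # t3 = reply sent (remote clock),  t4 = reply received (local clock).
    offset = ((t2 - t1) + (t3 - t4)) / 2.0   # remote clock minus local clock
    delay = (t4 - t1) - (t3 - t2)            # round-trip transmission time
    return offset, delay

# Usage: the remote clock is roughly 0.25 s ahead, with about a 35 ms round-trip delay.
print(clock_offset_and_delay(t1=100.000, t2=100.270, t3=100.275, t4=100.040))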
[0132] Sensor/object synchronisation and/or state estimation may be performed "on demand" at the occurrence of a motion. For example, data may be buffered and/or stored until such time that a significant motion event occurs such as the firing of a weapon, at which time, a child sensor attached to or incorporated within the weapon may choose to perform state estimation (such as position and/or orientation and/or projectile trajectory estimation) in accordance with information from the parent object with which it is associated, or the child sensor may choose to provide such information to the parent object which it is associated with so that the parent sensor/object may perform such state estimation.
[0133] Sensor/object synchronisation and state estimation may be performed/executed on an event-driven basis (e.g. at the occurrence of a motion and/or with reference to one or more sensor measurements), on a periodic basis (e.g. at regular intervals), on a query-basis (e.g. when requested), or any combination of the above depending on the sensor(s) utilised, the type of object being tracked, the type of motion and motion dynamics expected, the processing resources and battery power available, and the nature of the algorithm(s) implemented.
Management of motion events and external events
[0134] At the occurrence of motion event(s) and/or estimation of resulting motion trajectories (including projectile/object motion trajectories), the location of occurrence of such events and trajectories and the location of external events may be stored, including, if available, the probability of occurrence and/or error/uncertainty estimate, for example, representing the uncertainty in position and/or orientation of the external event and/or point of contact of the projectile trajectory as an error ellipsoid. [0135] The resulting motion event(s) and/or motion trajectories (including projectile/object motion trajectories) and/or external event(s) then may be processed for analysis, for example, to generate performance metrics and/or to generate a statistical representation of events. Such an analysis is especially useful for management of such events in real-time, near real-time and/or during an after action review (AAR). Such an analysis may also include visualisation and/or generation of performance metrics and statistics of such event(s) and/or trajectories (including projectile/object motion trajectories) and/or external event(s). [0136] Where such motion event(s) and trajectories to external event(s) represent threats, then the location and/or probability of such threats may also be displayed, resulting in improved situational awareness and management of such threats. For example, if a weapon is discharged, this can represent the location and probability of an external event such as engaging a threat such as an enemy (by projectile trajectory estimation which reveals the location of the target/enemy) and/or as a reactive or defensive event which indicates that a soldier may require support (by knowledge of the location of the sensor/object representing that soldier). As another example, fire-fighters aiming fire extinguishing equipment (e.g. a fire hose or fire extinguisher) at particular locations and regions within a building indicate locations of external events such as threats, in this case, a fire, and the severity of the external event (threat), in this case, the distribution/extent and intensity of the fire, by estimating the trajectory of water spray/scatter and by monitoring the duration of use.
[0137] Where such motion event(s) and trajectories represent performance-oriented events (such as sports), such events may help to improve performance and analysis of a player's skills (e.g. by representing the location of a player and motion/direction/force/etc. of swings and hits of a tennis racquet, golf club or baseball bat, etc.).
[0138] When several of such motion events and/or trajectories (including projectile/object trajectories) and/or external events coincide either by way of timing, trajectory, or location, such events and/or trajectories and/or external events may be clustered and their location may be displayed with a level and probability of an external event which caused or influenced the occurrence of motion events and trajectories. For example, where several weapons are discharged and where a number of estimated projectile trajectories coincide, this can represent a location of an external event (e.g. threat or enemy) with high probability. As another example, fire-fighters pointing fire extinguishing equipment (such as several fire hoses) for long periods of time at particular regions within a building indicate the location(s) of fires (as external events) with high probability. In sports, for example, such information may be used to analyse a player's performance, strengths, and weaknesses. In industrial environments (e.g. operating heavy machinery such as mining or building equipment, operating tools in a manufacturing plant, etc.), such information may be used for analysing work skills, training staff, and ensuring quality of manufacture or analysing aspects such as yield efficiency or accuracy or worker fatigue, etc. Such an implementation may, for example, take into account the probability of each motion event (such as the firing of a weapon) and/or the estimated uncertainty in terms of the error ellipsoid of the position and/or orientation of the resultant projectile trajectories (which indicate external events), their points of intersection, and/or if a building/environment floor plan or map is available either as a 2D/3D representation, their estimated point of impact/collision with obstacles represented within that floor plan/map (e.g. as determined by a technique such as ray tracing).
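One simple way to turn several coinciding trajectories into a single external-event location is a least-squares intersection of the trajectories treated as 2D rays (origin at the sensor/object, direction along the estimated heading). The sketch below is illustrative only: it ignores ballistic drop, per-trajectory uncertainty weighting, and the degenerate case of near-parallel rays, all of which a practical implementation would need to handle.

import math

def intersect_rays_2d(rays):
    # Least-squares point closest to a set of 2D rays.
    # rays: list of ((ox, oy), heading_radians). Returns (x, y).
    # Accumulate sum(I - d d^T) and sum((I - d d^T) o) over all rays, then solve the 2x2 system.
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (ox, oy), heading in rays:
        dx, dy = math.cos(heading), math.sin(heading)
        m11, m12, m22 = 1.0 - dx * dx, -dx * dy, 1.0 - dy * dy
        a11 += m11; a12 += m12; a22 += m22
        b1 += m11 * ox + m12 * oy
        b2 += m12 * ox + m22 * oy
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Usage: two shooters at (0, 0) and (10, 0) both aiming towards roughly (5, 5).
print(intersect_rays_2d([((0.0, 0.0), math.atan2(5.0, 5.0)),
                         ((10.0, 0.0), math.atan2(5.0, -5.0))]))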
[0139] Such representations of motion events and external events can be used for better planning in real-time, near real-time, after action review, and for estimating the probability of future events. For example, an automated path planner which directs vehicles and/or troops may take into account such threats (external events) for avoidance of further conflict or for directing backup and support. As another example, an automated path planner may direct fire-fighters manoeuvring within a building to take a path which avoids threats (external events) such as sources of fire or locations of a building which are estimated to be structurally unsafe/unstable due to the intensity/duration of fire as estimated by trajectory estimation. Storing and processing such information also allows for estimating the probability of occurrence of future events. For example, given a history of events, the probability of occurrence and location of threats in a region of conflict may be estimated, or the probability of occurrence of a fire of similar duration and/or intensity in a building may be estimated.
[0140] Motion events may be detected in several ways, for example, by utilising signal processing and feature extraction techniques operating on continuous sensor data such as detecting peak motions along one or more axes of measurement using inertial sensors (e.g. detecting weapon discharge by peak acceleration measurements along the longitudinal axis of the weapon), by a motion classification algorithm, by physical interfacing (e.g. mechanical, electrical) to a switch, trigger, or other mechanism which is actuated during the motion event, or a combination of the above.
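The first of these detection approaches can be sketched very simply: declare a motion event when the measured acceleration along the axis of interest exceeds a threshold, and suppress further detections for a short refractory window. The threshold and window below are illustrative assumptions, not values from the specification.

def detect_peak_events(samples, threshold, refractory=5):
    # samples: iterable of (timestamp, accel_along_axis). Returns timestamps of events.
    # An event is declared when |accel| exceeds the threshold; the next 'refractory'
    # samples are skipped so one physical event is not counted several times.
    events, skip = [], 0
    for t, a in samples:
        if skip:
            skip -= 1
            continue
        if abs(a) >= threshold:
            events.append(t)
            skip = refractory
    return events

# Usage: a large spike at t = 0.02 s stands out against low-level handling motion.
data = [(0.00, 0.3), (0.01, 0.5), (0.02, 50.0), (0.03, 12.0), (0.04, 0.4)]
print(detect_peak_events(data, threshold=20.0))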
[0141] Referring to Fig. 8, there is illustrated an example method 130 for management of motion events and external events. At step 132, a check is made to determine if a motion event (such as the firing of a weapon) has occurred. If a motion event has occurred, then at step 134, the position, orientation and/or trajectory at the occurrence of the motion event is estimated. At step 136, a check is made to determine if the motion event results in the firing or ejection of a projectile/object/substance. If true, then at step 138, the position, orientation and/or trajectory of the external event is estimated, based on the trajectory estimate of the projectile/object/substance, taking into account various parameters and properties regarding the projectile/object/substance, such as aerodynamic properties and/or the force of the motion event and/or meteorological or environmental measurements, etc. If false, then at step 140, the position, orientation and/or trajectory of the external event is estimated. [0142] At step 142, a check is then made to determine if a building floor plan or layout of a building and/or a map of the surrounding environment is available. If true, then at step 144 the location and/or orientation of the external event and/or the point of impact or collision of the projectile/object/substance (if step 136 was true) with the building, terrain, and/or object(s) represented within the floor plan and/or map of the environment is estimated. If a projectile/object/substance was fired or ejected, this may also include an estimate of the coverage area, taking into account various parameters and properties regarding the projectile/object/substance, such as aerodynamic properties and/or the force of the motion event and/or meteorological or environmental measurements, building/construction materials at the point of impact, etc. (for example, the coverage area may represent the extent of damage in the case of a projectile fired by a weapon, or the coverage area due to the spread/distribution of liquid/chemical spray from a fire hose or other equipment, etc.). [0143] At step 146, a check is made to determine if multiple trajectories exist and if they coincide by trajectory and/or location of external events. If true, then at step 148, similar trajectories and/or external events are clustered and the location (2D/3D) and/or orientation including probability/error uncertainty estimate and/or probability of occurrence of the trajectory and/or external event and/or motion event are estimated. At step 150, the motion event position and/or orientation and/or probability/error uncertainty and/or probability of motion event occurrence, the resultant trajectory, and the external event location and/or orientation and/or probability/error uncertainty and/or the probability of external event occurrence are stored for future reference and analysis. At step 152, such information may then be published/transmitted and/or visualised or made available to one or more mobile units or command and control stations, etc.
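As a hedged illustration of the trajectory estimate of step 138, the sketch below propagates a drag-free point-mass trajectory from the estimated launch position and orientation and reports where it returns to a nominal ground height; a practical implementation would add the aerodynamic, force, and meteorological parameters discussed above, and the muzzle speed here is an assumed input.

import math

def dragfree_impact_point(pos, azimuth, elevation, muzzle_speed, ground_z=0.0, g=9.81):
    # pos: (x, y, z) of the launch point; azimuth/elevation in radians; speed in m/s.
    # Returns the (x, y, z) point where a drag-free projectile returns to ground_z.
    x0, y0, z0 = pos
    vh = muzzle_speed * math.cos(elevation)               # horizontal speed
    vz = muzzle_speed * math.sin(elevation)               # vertical speed
    # Time of flight until z(t) = ground_z:  z0 + vz*t - g*t^2/2 = ground_z
    disc = vz * vz + 2.0 * g * (z0 - ground_z)
    t = (vz + math.sqrt(disc)) / g
    return (x0 + vh * t * math.cos(azimuth), y0 + vh * t * math.sin(azimuth), ground_z)

# Usage: fired from shoulder height at a slight upward elevation along the x-axis.
print(dragfree_impact_point((0.0, 0.0, 1.5), azimuth=0.0,
                            elevation=math.radians(2.0), muzzle_speed=300.0))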
Visualisation and monitoring of object(s) and external event(s)
[0144] The sensor(s)/object(s) being tracked may be displayed within a 2D/3D visualisation platform which may be displayed at one or more command and control stations, within one or more mobile vehicles (e.g. for collision/proximity avoidance in mining, or at a mobile command and control centre), at one or more mobile units carried by personnel either as separate processing/computing systems or integrated into existing computing/processing systems carried by personnel, and/or displayed on a Head-Up Display (HUD)/wearable visualisation platform. Based on the level of visualisation hardware available, a 2D/3D visualisation may be displayed, indicating the pathway/trajectory followed by the object, including the current location (2D/3D), orientation, and altitude of tracked objects, the location and/or orientation/heading of detected motion events and/or external events, and estimated trajectories at each of those detected motion events (including projectile/object trajectories and/or points of impact). If required, uncertainty estimates (e.g. as an error ellipse) may also be displayed.
[0145] If a map is available, the visualisation may be performed with respect to the map in 2D or in 3D. In certain instances, estimates of location, orientation and/or trajectories of motion events and external events (including projectile/object trajectories) may coincide with physical aspects and representations of the environment, for example walls, floors, ceilings, fixtures and fittings, and mobile objects and/or objects traversing the environment (such as personnel and vehicles, including objects which are being independently tracked by an external system), and thus, may be represented and visualised accordingly, for example, by calculating the location of a collision of a projectile with the environment and objects within the environment (for example, by utilising techniques such as ray tracing). [0146] The user viewpoint may be displayed as a 3rd person viewpoint (i.e. as an observer) and may be freely moved around and tilted within the virtual 3D environment during such visualisation. The user viewpoint may be displayed as a 1st person viewpoint of the object being tracked and may automatically move and orient itself with the object, useful for viewing the point-of-view of the object, e.g. a weapon sight as the weapon is positioned and oriented within the environment, and a trajectory which emanates from the weapon and indicates the trajectory a projectile may follow and the collision point within the environment, and if a map is available, objects within the environment.
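Where the map is held as an occupancy grid, the point of collision mentioned above can be approximated by marching along the estimated trajectory until an occupied cell is reached. The following coarse fixed-step sketch (Python, with an illustrative cell size and map) stands in for a proper ray-tracing or DDA implementation.

import math

def raycast_occupancy(grid, start, heading, cell_size=0.25, max_range=50.0):
    # grid: 2D list where grid[row][col] == 1 means occupied; start: (x, y) in metres.
    # Returns the (x, y) of the first occupied cell hit, or None within max_range.
    dx, dy = math.cos(heading), math.sin(heading)
    step = cell_size * 0.5                                 # half-cell steps along the ray
    for i in range(int(max_range / step)):
        x = start[0] + dx * i * step
        y = start[1] + dy * i * step
        col, row = int(x / cell_size), int(y / cell_size)
        if row < 0 or col < 0 or row >= len(grid) or col >= len(grid[0]):
            return None                                    # left the mapped area
        if grid[row][col] == 1:
            return (x, y)
    return None

# Usage: a small map with a wall along column 8, ray fired along +x from (0.3, 0.3).
grid = [[1 if c == 8 else 0 for c in range(10)] for _ in range(10)]
print(raycast_occupancy(grid, (0.3, 0.3), heading=0.0))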
[0147] The visualisation of the external event(s) and estimated projectile/object/substance trajectories may take into account various aforementioned parameters such as object size, weight, aerodynamic properties and/or the force of the motion event and/or meteorological or environmental measurements, building/construction materials at the point of impact, etc., such that the projectile trajectory and/or coverage area (in terms of damage or spray of liquid/chemicals etc.) may be analysed and visualised either prior to the firing or ejection of a projectile/object/substance (e.g. estimated and visualised as a trajectory and coverage area in real-time in accordance with dynamic sensor and parameter measurements), and/or after the firing or ejection of a projectile/object/substance (e.g. for improved situational awareness in real-time or near real-time, and/or for post-processing such as when performing an after action review, etc.).
[0148] The 2D/3D visualisation may include a data overlay for each sensor/object being tracked, such as the object ID/name/type, location coordinates (e.g. in a Cartesian or geographic coordinate system), health status and vital signs, number of motion event(s) detected (e.g. number of shots fired), battery/power levels, etc. Such a visualisation may be performed in real-time, near real-time, or played back during post-processing and after action review (AAR). Such a visualisation may be included with a notification of an event and occur on-demand due to the occurrence of events, such as weapon discharge.
Example physical implementation
[0149] Referring to Fig. 9, there is illustrated a mobile parent object 162 where one or more parent sensors 164 are attached to, associated with or integrated within the mobile parent object 162. A mobile child object 166 likewise has one or more child sensors 168 attached to, associated with or integrated within the mobile child object 166. Information 170 may be transmitted or received between at least one of the parent sensors 164 and at least one of the child sensors 168. At least one of the parent sensors 164 may communicate information 172 with an external system or network, and at least one of the child sensors 168 may communicate information 174 with the same or another external system or network.
[0150] The sensors 164, 168 could be devices, or provided as sensor systems, that are rigidly attached to or embedded within an object and provide an interface for information exchange to other sensors/objects, and/or to an independent host system or network, either in real-time, near real-time, on-demand, or for storage or post-processing at a later time/date. There could be provided a wired or wireless interface to other sensors/objects and/or to host-system(s) for bi-directional information exchange. There could be provided an on-board storage facility to store data for information exchange at irregular intervals or during a docking procedure at a charging or synchronisation bay. Information could be processed in a centralised or decentralised manner on-board the sensor/object, offloaded to other sensors/objects, and/or sent to one or more host-systems.
Applications
[0151] The following applications are provided as non-limiting illustrative examples.
[0152] (1) Tracking of small-arms and passive or active hand-held weapons. Tracking of hand-held weapons such as small-arms, batons, lasers, TASERs™, etc. for determining the number of weapon uses/discharges, estimating a projectile trajectory where weapon discharge results in the firing of a projectile, and estimating the force and orientation of contact where weapon use results in physical contact, including the complete orientation and capture of the motion during use, including the time and/or location of such occurrences. Tracking of small arms for ballistic trajectory estimation during weapon discharge using a sensor mounted on or integrated within a weapon, and estimation of the location of threats (including the azimuth and elevation and/or a grid reference) based on ballistic trajectory estimation by weapons carried by one or more personnel. Where a number of occurrences of weapon discharges have occurred, multiple weapon discharges and/or trajectories within close vicinity may result in an escalation of the threat probability based on clustering of nearby trajectories, the number of weapon discharges, the number of personnel involved, and the vicinity of trajectory estimates. This may be performed by employing a form of unsupervised learning or a clustering algorithm, several of which are known in the literature, such as k-means clustering, nearest neighbour methods, and other forms of clustering, classification, and data mining techniques. Based on such information, further statistical information is able to be extracted to analyse the reaction time and accuracy of marksmanship of personnel during performance analysis and training exercises.
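As a minimal illustration of the clustering step, the sketch below runs a plain k-means over 2D locations (for example, estimated trajectory intersection or impact points); the number of clusters and the sample points are illustrative assumptions only.

import math, random

def kmeans_2d(points, k, iterations=20, seed=0):
    # Plain k-means over 2D points. Returns (centroids, labels).
    random.seed(seed)
    centroids = random.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iterations):
        # Assign each point to its nearest centroid.
        labels = [min(range(k), key=lambda j: math.dist(p, centroids[j])) for p in points]
        # Move each centroid to the mean of its assigned points.
        for j in range(k):
            members = [p for p, lbl in zip(points, labels) if lbl == j]
            if members:
                centroids[j] = (sum(x for x, _ in members) / len(members),
                                sum(y for _, y in members) / len(members))
    return centroids, labels

# Usage: two loose groups of estimated threat locations.
pts = [(5.1, 4.9), (4.8, 5.2), (5.3, 5.0), (20.2, 9.8), (19.7, 10.1), (20.0, 10.3)]
print(kmeans_2d(pts, k=2)[0])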
[0153] Referring to Fig. 12, a visualization for the tracking of personnel 305 (for example military, police or special services personnel) having hand-held weapons 310 is provided. For example, the method/system can track personnel motions or pathways 320, and can determine the number of weapon 310 uses/discharges, and/or projectile trajectories 330 where weapon discharges result in the firing or ejection of a projectile or object. Tracking of personnel 305 can utilise a first sensor, or multiple sensors placed about a body. Tracking of hand-held weapons 310 for trajectory determination or estimation during weapon discharge can utilise one or more sensors mounted on or integrated within weapon 310 carried by one or more personnel 305.
[0154] Referring to Fig. 13, where a number of occurrences of weapon discharges have occurred, multiple weapon discharges and/or trajectories may result in an escalation of the threat probability based on clustering of trajectories. Fig. 13 illustrates a visualisation of such an example, whereby body-worn sensors are used to locate friendly-force personnel 405 (displayed as small circles) and motion events such as weapon-discharge result in ballistic trajectory estimation for determination of external events, in this case, the location of an enemy force. Information from a plurality of motion events and trajectories is clustered and processed accordingly to determine a probability of threat location 410 (i.e. external event 410) and heading with error distribution as displayed by the ellipses and threat grid reference displayed by the crosshairs. Friendly force locations 405 and enemy force locations 410 are displayed with respect to an available building layout 420. This provides an example defence implementation for detecting a plurality of motion events (e.g. weapon discharge) and external events (e.g. enemy force locations) including clustering of a plurality of trajectories and probability estimation of the external events (in this example locations) represented as the error ellipses.
[0155] (2) Similarly, such a methodology can be applied to other fields which may benefit from the localisation of a threat using a device which requires aiming or alignment with a threat prior to or during actuation. For example, a fire hose with an attached sensor can indicate the location of threats by estimating the nozzle orientation and water trajectory during application, presenting a set of locations of threats allowing for analysis of the scale and distribution of a building fire, which may be further segregated or distributed across floors of a building, and estimating the extent of damage. [0156] (3) Tracking of items for sports performance analysis. For example, tracking a tennis racquet for recording the number of shots, the position and/or orientation of the racquet, and related statistics such as number of forehand or backhand shots, the force and/or velocity of shots, and estimating the trajectory of the ball. For example, tracking of gloves (e.g. boxing gloves) or boots (e.g. soccer/football boots) for recording the number of events (e.g. hits/kicks/shots), force, velocity, and related statistics, and position and/or orientation and trajectory of the object being tracked and/or the projectile trajectory.
[0157] (4) Active safety. For example, sensors integrated into the protective equipment and clothing worn by first responders (e.g. fire-fighters, police, security and defence personnel, etc.) and workers operating within industrial and hazardous environments (e.g. miners). For example, sensors integrated into the protective equipment and clothing of a motorcycle rider where sensors may be integrated into boots and/or a leather jacket/suit/back/torso protector to monitor the upper torso and/or gloves to monitor hand position and/or helmet to monitor head position, along with one or more sensors mounted onto the motorcycle, with the purpose of estimating the position of the rider with respect to the motorcycle and/or activating active safety devices if the rider and motorcycle become disassociated whilst in motion (i.e. the rider has fallen off and/or crashed). The implementation allows for activation of active safety devices and aids (such as air-bags) at the earliest possible time with a high level of confidence of the occurrence of such a threat, and/or automatically calling for help or emergency services. Such a methodology may be applied to other sports and hobbies both on land, water, and air (e.g. as motor-sports, speed boats, jet skis, skiing, etc.).
[0158] (5) Tracking personnel. Tracking personnel within challenging environments (e.g. underground locations, mines, urban environments, GPS-denied environments, etc.). Interaction between personnel and equipment, for example, fire-fighters, miners, security and defence personnel, and other individuals and/or teams of personnel which are transported via a vehicle, carry and interact with equipment, or operate heavy machinery.
[0159] (6) Other fields and applications. Interaction with smart devices using wearable clothing such as gloves, hand-held objects such as pointers, or regular objects/devices with the sensor attached or embedded within the object/device (e.g. passive or active objects such as devices, equipment, tools, containers, controllers, hand-held computers, mobile phones, PDAs and other items, etc.). Mapping of environments such as buildings, tunnels, urban areas, industrial environments, and underground environments such as mines, etc. using freely manipulated sensors such as laser scanners. May be used during actual real-life scenarios whilst deployed in the field and during training exercises, simulated environments such as computer entertainment and gaming devices and performance training facilities, and during day-to-day living. May be combined with audio/visual recording devices for synchronised data recording either integrated into or mounted alongside the described device/sensor or distributed within an environment of operation.
[0160] (7) Objects being tracked may be shared between users, for example, a firearm, a tennis racquet, or other object/weapon/equipment/tool/device/etc. which is used in the course of performing a service, during team sport, training exercises, electronic entertainment, and/or day-to-day living. Items such as weapons, tools, and equipment. Sporting equipment such as a bat or tennis racquet. Controllers used for electronic or entertainment devices. Team sports which handle and pass on an object such as a ball or a baton. Training facilities and athletic performance analysis and manipulating athletic equipment. Rehabilitation and medical analysis and monitoring progress. Smart homes and environments.
Processing or hardware systems
[0161] Referring to Fig. 10, the methods or processes described herein can be implemented using a system 180 including a variety of sensor hardware and processing systems. The sensors (e.g. sensing systems or hardware) may be physically mounted or attached to objects, either as a motion estimation/localisation-specific device or integrated with existing technology such as communication equipment including mobile phones and radios, weapons such as small-arms and purpose-specific "smart" weapons, safety equipment such as fire extinguishers, wearable safety equipment and clothing such as vests, boots, helmets, and tools such as drills (either hand-held or as an articulated unit attached/mounted to heavy equipment). Furthermore, there may be various other industry-specific implementations, for example, integration into a cap lamp as commonly used by miners and other industrial personnel or integration with an external battery/power compartment of the cap lamp. Additionally, the sensing systems or hardware may make use of parasitic power from such platforms or equipment, such as utilising the battery or power supply which is either internal to the equipment (e.g. battery in a mobile phone, radio transceiver, mp3 player, vehicle, etc.) or external to the equipment, if needed (e.g. an external battery pack such as those carried by mining personnel and other industrial workers, which may be mounted/attached to the belt or carried in a backpack or integrated into clothing).
[0162] Sensors, tracking systems or sensing hardware 182. Self-contained sensors can be utilised such as the inertial measurement units (IMUs) available from several manufacturers such as Honeywell™, Crossbow™, xSens™, MEMSense™, etc. Inertial measurement units or inertial-based navigation sensors which are assembled with discrete components based on Micro Electro-Mechanical Systems (MEMS) sensors can be utilised, for example those manufactured by companies such as Analog Devices™, and which may consist of magnetic and barometric sensors such as those manufactured by Honeywell™ and Freescale™ Semiconductor. Such sensors typically combine discrete components of tri-axial acceleration, angular rotation, and magnetic field sensors along with barometric sensing. Such sensors also commonly implement a data fusion algorithm to optimally combine the outputs of all sensors in a complementary fashion to measure attitude and heading, commonly referred to as an Attitude and Heading Reference System (AHRS). Such sensors may also be pre-existing within certain electronic devices such as mobile phones (e.g. the Apple™ iPhone™ or HTC™ Incredible™), mp3 players (e.g. the Apple™ iPod™), and other computing hardware (e.g. the Apple™ iPad™), typically for the purposes of usability, gaming, and supporting the user interface (UI) in terms of tilt/rotation and input detection using accelerometer(s) and/or gyroscope(s) and/or a magnetic compass for heading determination for navigation applications which typically combine self-contained sensors with GPS or assisted GPS (A-GPS), or for example, fitness/exercise and step/calorie counting pedometer applications such as the Nokia™ SportsTracker. Such electronic devices may allow for external access to the sensor data (e.g. via a wireless interface such as BlueTooth™ or Wi-Fi™ or a wired interface such as USB™) or allow access to the on-board sensing device(s) via an Application Programming Interface (API) and development and execution of software within the internal processor of the device, such that developed algorithms and software may execute internal to the device itself utilising the built-in sensor(s), processor(s), and communication system(s).
[0163] Self-contained sensors can be utilised which make direct observations of the environment, such as low-cost and high-end cameras manufactured by PointGrey™, and laser range-finders such as those manufactured by Sick™ and Hokuyo™ Automatic.
[0164] Beacon-based sensing and localisation technologies can be utilised which employ measurements from fixed base-stations and/or mobile nodes and are based on various technologies such as Wireless Local Area Network (WLAN/Wi-Fi™) and/or mobile phone towers and/or other communication network towers/infrastructure, for example those available from Ekahau™ or AeroScout™ and/or accessible via an Application Programmable Interface (API) from companies such as SkyHook Wireless™, Google™ or Apple™. Technologies which utilise Ultra Wideband (UWB) measurements are available from companies such as TimeDomain™, and other localisation technologies, such as those based on ZigBee™, are available from companies such as AwarePoint™.
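As a non-limiting illustration of how beacon measurements of the kind described above can yield a coarse position estimate, a generic weighted-centroid sketch is given below; this is not the method of any of the vendors named, and the base-station coordinates and RSSI values are assumed inputs:

def weighted_centroid(beacons):
    """Coarse 2-D position from beacon observations.

    beacons: list of (x, y, rssi_dbm) tuples for base stations at known
    positions. Stronger (less negative) RSSI contributes more weight.
    """
    weights = [10 ** (rssi / 10.0) for _, _, rssi in beacons]  # dBm -> mW
    total = sum(weights)
    x = sum(w * bx for w, (bx, _, _) in zip(weights, beacons)) / total
    y = sum(w * by for w, (_, by, _) in zip(weights, beacons)) / total
    return x, y

# Example: three access points at known positions (metres).
print(weighted_centroid([(0.0, 0.0, -45), (10.0, 0.0, -60), (0.0, 10.0, -70)]))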
[0165] Sensing platforms/objects/equipment. The aforementioned sensors may be used in conjunction with different types of manned or unmanned robotic platforms and vehicles, either with existing facilities for on-board dead reckoning or with retrofitted dead reckoning sensors such as wheel encoders and inertial measurement units, and with sensors such as cameras and lasers. The aforementioned sensors may also be used in conjunction with any freely-manipulated objects and equipment. Example hardware that could be utilised includes:
Vehicles and platforms for operation in land/sea/air environments which have been instrumented and are either in a manned or unmanned configuration and are either custom-designed or are standard production vehicles;
Body-worn sensors mounted on a human;
Sensors integrated into clothing such as safety/protective/sports clothing;
Hand-held objects such as weapons, sporting/athletic equipment, tools, safety equipment, and machinery;
Heavy equipment/machinery used in industrial and commercial environments.

[0166] Processing systems 184. The methods or processes described herein may be implemented to execute on a processing system 184, which could be part of or separate from the sensor system 182, in a variety of different ways. Processing system 184 could store and/or retrieve information or data from a local database 190. For example, the methods may run as an independent process executed on a host-system, with the host-system responsible for maintaining and managing the sensor interface, calculating a state estimate (such as a position, direction, trajectory, motion estimation/classification, etc.) and/or extracting features from sensor data, and providing such information, using inter-process communication, to one or more algorithms responsible for sensor association/disassociation and/or management of motion events and/or sensor-to-sensor estimation. Similarly, the host-system might also maintain a map of the environment in a supported map representation (e.g. occupancy grid map, feature-based map, landmark-based map), and provide such information to motion event management, visualisation and estimation algorithms.
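A minimal sketch of how such a host-system might be organised is given below; the class, method and attribute names are illustrative placeholders only and are not defined by the present disclosure:

class HostSystem:
    """Illustrative host-system skeleton: it owns the sensor interface,
    produces state estimates, and hands them to downstream algorithms
    (association/disassociation, motion-event management, sensor-to-sensor
    estimation, mapping)."""

    def __init__(self, estimator, consumers):
        self.estimator = estimator   # callable: measurement -> state estimate
        self.consumers = consumers   # e.g. association, motion-event handlers
        self.occupancy_grid = {}     # (cell_x, cell_y) -> occupied flag

    def step(self, raw_sample):
        measurement = self.parse(raw_sample)    # maintain the sensor interface
        estimate = self.estimator(measurement)  # calculate a state estimate
        for consumer in self.consumers:         # provide it to other algorithms
            consumer(estimate)
        return estimate

    def parse(self, raw_sample):
        return raw_sample  # placeholder for device-specific decoding

# Example wiring with trivial placeholder callables.
host = HostSystem(estimator=lambda m: {"position": m}, consumers=[print])
host.step((1.0, 2.0, 0.0))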
[0167] In a similar manner, methods or processes described herein may be implemented as a remote process executed on an independent system, such as server(s) 186, which communicates with a host-system using wired and/or wireless communication (LAN/WLAN, etc.), such as network 188. In this case, processor and/or memory utilisation of the host-system is reduced and a degree of redundancy may be afforded. Server(s) 186 can store and/or retrieve information or data from one or more databases 192.
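By way of a minimal sketch only (assuming Python sockets over a loopback connection; the message fields are illustrative assumptions, and a real deployment would use a persistent connection and message framing), forwarding a state estimate from a host-system to such a remote process might look like:

import json
import socket

# Hypothetical "remote" process: listens for state estimates from a host-system.
server = socket.socket()
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

# Host-system side: connect and forward a small state estimate.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(json.dumps({"sensor_id": 1, "position": [2.0, 3.5, 0.0]}).encode())
client.close()

# Remote-process side: receive and decode the estimate (single small message,
# so one recv suffices for this sketch).
conn, _ = server.accept()
print("remote process received:", json.loads(conn.recv(4096).decode()))
conn.close()
server.close()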
[0168] In a similar manner, methods or processes described herein may be implemented as an independent process on a remote server which may be hosted locally, nationally or internationally, and which communicates with the host-system using wired and/or wireless communication (LAN/WLAN, etc.). In such an example, maps of the environment and/or sensor or object parameters (e.g. position and/or orientation, sensor/object identifier, status, etc.) may be stored on the server and provided to the host-system on-demand, or pre-cached prior to entering an environment. Such functionality may be accessible via an Application Programmable Interface (API) from a map provider (e.g. a digital mapping company, crowd-sourced from users, Google™ Maps, etc.). In this case, processor and/or memory utilisation of the host-system is reduced and a degree of redundancy may be afforded. Additionally, one or more host-systems operating independently as part of one or more sensor systems may share a common map of the environment and/or sensor or object parameters (e.g. position and/or orientation, sensor/object identifier, status, etc.) with the remote server, and thus the remote server can provide and share such information with several host-systems and act as a common repository of information operating as a cloud computing service.
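As an illustrative sketch only, pre-caching map data from a remote server might be arranged as below; the server URL, endpoint layout and field names are hypothetical and do not correspond to any particular provider's API:

import json
import urllib.request

_MAP_CACHE = {}  # environment identifier -> map data shared by host-systems

def get_map(environment_id, server_url="https://example.invalid/maps"):
    """Return map data for an environment, fetching from the (hypothetical)
    remote server only on a cache miss, so the host-system can keep working
    from the pre-cached copy once inside the environment."""
    if environment_id in _MAP_CACHE:
        return _MAP_CACHE[environment_id]
    with urllib.request.urlopen(f"{server_url}/{environment_id}") as response:
        _MAP_CACHE[environment_id] = json.loads(response.read())
    return _MAP_CACHE[environment_id]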
[0169] In a similar manner, methods or processes described herein may also be implemented as a single process in conjunction with the tasks of the host-system, resulting in a tightly integrated set of tasks such as maintaining and managing the sensor interface, calculating a state estimate and/or extracting features from sensor data, association/disassociation of sensors and management of motion events, sensor-to-sensor estimation, and maintaining a map of the environment and/or sensor or object parameters.
[0170] The processor executing the motion estimation framework/algorithm(s) may consist of a standard Central Processing Unit (CPU), such as those manufactured by Intel™ or AMD™, and/or processors based on technology available from Advanced RISC Machines (ARM™), etc., commonly found in desktop and portable handheld computers. The operating system may be a standard operating system such as Microsoft Windows™, a UNIX-based operating system such as Linux™ or Apple™ MacOS™, a mobile operating system such as iOS™, Android™ or Windows Phone 7™, or an industrial operating system such as QNX Neutrino™. The operating system may also be an application-specific OS, an embedded OS or a Real-Time OS (RTOS). The motion estimation framework/algorithm(s) may be executed either online (in real-time or near real-time) or offline for post-processing or batch-processing.
[0171] The processor executing the motion estimation framework/algorithm(s) may include an embedded platform such as a microcontroller, microprocessor, Programmable Logic Device (PLD) or Digital Signal Processor (DSP), such as those manufactured by Atmel™, Microchip™ or Texas Instruments (TI)™, and/or based on technology available from Advanced RISC Machines (ARM™), etc.
[0172] In a multi-core processor, the motion estimation framework/algorithm(s) may execute on one or more cores whilst the remaining cores may be utilised for other tasks such as sensor communication and/or position estimation and/or orientation estimation and/or other state estimation, depending on the number of cores and processor availability and utilisation.
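A minimal sketch of such a task split, assuming Python's standard multiprocessing module with placeholder worker functions (the names and queue contents below are illustrative only), is:

import multiprocessing as mp

def sensor_communication(queue):
    # Placeholder: read packets from sensor hardware and enqueue them.
    for sample in range(5):
        queue.put({"sample": sample})
    queue.put(None)  # sentinel: no more data

def motion_estimation(queue):
    # Placeholder: consume samples and run the estimation framework.
    while (sample := queue.get()) is not None:
        pass  # update state estimate here

if __name__ == "__main__":
    q = mp.Queue()
    workers = [mp.Process(target=sensor_communication, args=(q,)),
               mp.Process(target=motion_estimation, args=(q,))]
    for w in workers:
        w.start()
    for w in workers:
        w.join()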
[0173] Thus, particular embodiments can be realised using a processing system, for example forming part of a sensing system, or integrated with, local to, or remote from the sensing system, an example of which is shown in Fig. 11. In particular, processing system 200 generally includes at least one processor 202, or processing unit or plurality of processors, memory 204, at least one input device 206 and at least one output device 208, coupled together via a bus or group of buses 210. In certain embodiments, input device 206 and output device 208 could be the same device. An interface 212 can also be provided for coupling the processing system 200 to one or more peripheral devices; for example, interface 212 could be a PCI card or PC card. At least one storage device 214 which houses at least one database 216 can also be provided. The memory 204 can be any form of memory device, for example, volatile or non-volatile memory, solid state storage devices, magnetic devices, etc. The processor 202 could include more than one distinct processing device, for example to handle different functions within the processing system 200.

[0174] Input device 206 receives input data 218 and can include, for example, a data receiver or antenna or wireless data adaptor, etc. Input data 218 could come from different sources, for example sensed data in conjunction with data received via a network. Output device 208 produces or generates output data 220 and can include, for example, a display device, a port (for example a USB port), a peripheral component adaptor, a data transmitter or antenna such as a wireless network adaptor, etc. Output data 220 could be distinct and derived from different output devices. A user could view data output, or an interpretation of the data output, on, for example, a monitor or using a printer. The storage device 214 can be any form of data or information storage means, for example, volatile or non-volatile memory, solid state storage devices, magnetic devices, etc.
[0175] In a networked information or data communications system, a user has access to one or more terminals which are capable of requesting and/or receiving information or data from the sensing system. In such a communications system, a terminal may be a type of processing system, computer or computerised device, mobile or cellular telephone, mobile data terminal, portable computer, Personal Digital Assistant (PDA), etc. An information source can be provided to be in communication with the sensor system and can include a server, or any type of terminal, that may be associated with one or more storage devices that are able to store information or data, for example in one or more databases residing on a storage device.
[0176] In use, the processing system 200 is adapted to allow data or information to be stored in and/or retrieved from, via wired or wireless communication means, the memory or the at least one database 216. The interface 212 may allow wired and/or wireless communication between the processing unit 202 and peripheral components that may serve a specialised purpose. The processor 202 receives instructions as input data 218 via input device 206 and can display processed results or other output to a user by utilising output device 208. More than one input device 206 and/or output device 208 can be provided. It should be appreciated that the processing system 200 may be any form of terminal, specialised hardware, or the like. The processing system 200 may be part of a networked communications system. Processing system 200 could connect to a network, for example the Internet or a LAN or WAN. Input data 218 and output data 220 could be communicated to other devices via the network.

[0177] Optional embodiments of the present invention may also be said to broadly consist in the parts, elements and features referred to or indicated herein, individually or collectively, in any or all combinations of two or more of the parts, elements or features, and where specific integers are mentioned herein which have known equivalents in the art to which the invention relates, such known equivalents are deemed to be incorporated herein as if individually set forth.
[0178] Although a preferred embodiment has been described in detail, it should be understood that many modifications, changes, substitutions or alterations will be apparent to those skilled in the art without departing from the scope of the present invention. The present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, firmware, or an embodiment combining software and hardware aspects.

Claims

The claims:
1. A computer-implemented method of tracking a first mobile object, provided with a first sensor, in relation to a second mobile object, provided with a second sensor, comprising the steps of:
determining a position, orientation or trajectory of the second sensor; and
determining a position, orientation or trajectory of the first sensor at least partially based on the position, orientation or trajectory of the second sensor.
2. The computer-implemented method of claim 1, wherein the first mobile object is provided with a plurality of first sensors.
3. The computer-implemented method of either claim 1 or 2, wherein the second mobile object is provided with a plurality of second sensors.
4. The computer-implemented method of any one of claims 1 to 3, wherein the first sensor and the second sensor are related by a fixed hierarchy configuration.
5. The computer-implemented method of any one of claims 1 to 3, wherein the first sensor and the second sensor are related by a variable hierarchy configuration.
6. The computer-implemented method of claim 5, also including checking if the first sensor is associated with the second sensor, and if the first sensor is associated with the second sensor, then determining the position, orientation or trajectory of the first sensor at least partially based on the position, orientation or trajectory of the second sensor.
7. The computer-implemented method of any one of claims 1 to 5, wherein the first sensor is a child sensor and the second sensor is a parent sensor.
8. The computer-implemented method of any one of claims 1 to 6, wherein if motion of the second mobile object is detected by the second sensor, then information is transmitted to the first sensor.
9. The computer-implemented method of claim 5, wherein a measure of association is determined between the first sensor and the second sensor.
10. The computer-implemented method of claim 9, wherein the measure of association is compared to a threshold of association to determine if the first sensor and the second sensor are to be associated or remain associated.
11. The computer-implemented method of claim 5, wherein a measure of disassociation is determined between the first sensor and the second sensor.
12. The computer-implemented method of claim 11, wherein the measure of disassociation is compared to a threshold of disassociation to determine if the first sensor and the second sensor are to be disassociated or remain disassociated.
13. The computer-implemented method of claim 5, wherein sensor-to-sensor relationships are assigned by associating or disassociating sensors based on sensor measurements.
14. The computer-implemented method of any one of claims 1 to 13, further including determining a location of an external event at least partially based on using the determined position, orientation or trajectory of the first sensor to determine an alignment or aim of the first mobile object.
15. The computer-implemented method of any one of claims 1 to 14, further including determining a trajectory of a projectile or substance fired or ejected by the first mobile object.
16. The computer-implemented method of claim 15, including estimating a coverage area of the projectile or substance based on parameters of the projectile or substance.
17. The computer-implemented method of claim 15, including determining a point of impact of the projectile or substance.
18. The computer-implemented method of any one of claims 15 to 17, including determining clustering of a plurality of trajectories, and determining an area of interest based on the determined clustering.
19. The computer-implemented method of any one of claims 1 to 18, wherein either or both the first sensor and the second sensor are an inertial measurement unit.
20. A system for tracking a plurality of mobile objects, wherein at least two mobile objects of the plurality of mobile objects each include at least one sensor, wherein one or more sensor-to-sensor relationships are defined depending on the types of the at least two mobile objects, thereby enabling a first mobile object of the at least two mobile objects to be tracked based at least partially on tracking of a second mobile object of the at least two mobile objects.
21. The system of claim 20, wherein the one or more sensor-to-sensor relationships are defined depending on one or more of:
whether the at least one sensor is associated or disassociated;
the hierarchy level of the at least one sensor;
parameters of the at least one sensor;
dynamic selection of a relationship based on a type of motion of the at least one sensor; and,
one or more independent motion models.
22. A system for tracking a first mobile object, comprising:
a first sensor associated with the first mobile object;
a second sensor associated with a second mobile object;
at least one processor to determine a position, orientation or trajectory of the second sensor; and
at least one processor to determine a position, orientation or trajectory of the first sensor;
wherein the determined position, orientation or trajectory of the first sensor is at least partially based on the determined position, orientation or trajectory of the second sensor.
23. A computer-implemented method of determining a location and a probability of an external event based on an occurrence of one or more events and trajectories, comprising the steps of:
detecting the occurrence of the one or more events;
determining one or more trajectories for the one or more events;
determining the location of the external event based on a trajectory end-point for at least one trajectory; and
determining the probability of the external event at least partially based on the one or more events and the one or more trajectories.
24. The computer-implemented method of claim 23, further including clustering a plurality of the one or more trajectories, and determining an area of interest based on the clustering.
25. The computer-implemented method of either claim 23 or 24, wherein the occurrence of the one or more events and the determining of the one or more trajectories for a first mobile object uses at least one first sensor associated with the first mobile object.
26. The computer-implemented method of either claim 23 or 24, wherein the occurrence of an event and determining a trajectory for a first mobile object, provided with a first sensor, is tracked in relation to a second mobile object, provided with a second sensor, and further comprises the steps of:
determining a position, orientation or trajectory of the second sensor; and
determining a position, orientation or trajectory of the first sensor at least partially based on the position, orientation or trajectory of the second sensor.
27. The computer-implemented method of any one of claims 23 to 26, wherein an event of the one or more events is the measurement of a motion.
28. The computer-implemented method of any one of claims 23 to 26, wherein an event of the one or more events is the release or firing of a projectile or substance.
29. The computer-implemented method of any one of claims 23 to 26, wherein an event of the one or more events is a form of physical contact.
30. The computer-implemented method of any one of claims 23 to 26, wherein an event of the one or more events is detected by motion estimation.
31. The computer-implemented method of any one of claims 23 to 26, wherein an event of the one or more events is detected by a classification algorithm.
32. The computer-implemented method of claim 28, wherein a trajectory of the one or more trajectories is estimated at least partially based on one or more parameters of the projectile or substance.
33. The computer-implemented method of claim 28, wherein a coverage area of the projectile or substance is estimated at least partially based on one or more parameters of the projectile or substance.
34. The computer-implemented method of any one of claims 23 to 33, wherein the external event is a threat.
35. The computer-implemented method of any one of claims 23 to 33, wherein the external event is a target or goal.
36. The computer-implemented method of any one of claims 23 to 33, wherein the external event is located by providing a grid reference.
37. The computer-implemented method of any one of claims 23 to 33, wherein the external event is located by providing an azimuth and elevation.
38. A system for determining a location and a probability of an external event based on an occurrence of one or more events and trajectories, comprising:
a sensor associated with a mobile object, the sensor able to detect the occurrence of the one or more events; and,
at least one processor configured to:
determine the one or more trajectories for the one or more events;
determine the location of the external event based on a trajectory end-point for at least one trajectory; and,
determine the probability of the external event at least partially based on the one or more events and the one or more trajectories.
PCT/AU2012/000481 2011-06-10 2012-05-04 Positioning, tracking and trajectory estimation of a mobile object WO2012167301A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2011902307 2011-06-10
AU2011902307A AU2011902307A0 (en) 2011-06-10 Positioning, tracking and trajectory estimation of a mobile object

Publications (1)

Publication Number Publication Date
WO2012167301A1 true WO2012167301A1 (en) 2012-12-13

Family

ID=47295255

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2012/000481 WO2012167301A1 (en) 2011-06-10 2012-05-04 Positioning, tracking and trajectory estimation of a mobile object

Country Status (1)

Country Link
WO (1) WO2012167301A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7947936B1 (en) * 2004-10-01 2011-05-24 The United States Of America As Represented By The Secretary Of The Navy Apparatus and method for cooperative multi target tracking and interception
US7538724B1 (en) * 2007-06-15 2009-05-26 Itt Manufacturing Enterprises, Inc. Method and system for relative tracking

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KUNG, H. T ET AL.: "Efficient Location Tracking Using Sensor Networks", PROCEEDINGS OF 2003 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE (WCNC), 2003, pages 1 - 8 *

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11287511B2 (en) 2013-06-06 2022-03-29 Zebra Technologies Corporation Method, apparatus, and computer program product improving real time location systems with multiple location technologies
US10212262B2 (en) 2013-06-06 2019-02-19 Zebra Technologies Corporation Modular location tag for a real time location system network
US10509099B2 (en) 2013-06-06 2019-12-17 Zebra Technologies Corporation Method, apparatus and computer program product improving real time location systems with multiple location technologies
US10333568B2 (en) 2013-06-06 2019-06-25 Zebra Technologies Corporation Method and apparatus for associating radio frequency identification tags with participants
US11023303B2 (en) 2013-06-06 2021-06-01 Zebra Technologies Corporation Methods and apparatus to correlate unique identifiers and tag-individual correlators based on status change indications
US11423464B2 (en) 2013-06-06 2022-08-23 Zebra Technologies Corporation Method, apparatus, and computer program product for enhancement of fan experience based on location data
US11391571B2 (en) 2014-06-05 2022-07-19 Zebra Technologies Corporation Method, apparatus, and computer program for enhancement of event visualizations based on location data
US10591578B2 (en) 2014-06-06 2020-03-17 Zebra Technologies Corporation Method, apparatus, and computer program product for employing a spatial association model in a real time location system
US11156693B2 (en) 2014-06-06 2021-10-26 Zebra Technologies Corporation Method, apparatus, and computer program product for employing a spatial association model in a real time location system
US9092898B1 (en) 2014-07-03 2015-07-28 Federico Fraccaroli Method, system and apparatus for the augmentation of radio emissions
EP3016415A1 (en) * 2014-11-03 2016-05-04 Peter Brettschneider Method and device for locating people
WO2016137660A1 (en) * 2015-02-25 2016-09-01 Qualcomm Incorporated Techniques for use in determining a position using visible light communication
US20160249164A1 (en) * 2015-02-25 2016-08-25 Qualcomm Incorporated Techniques for use in determining a position using visible light communication
US11035924B2 (en) 2015-06-12 2021-06-15 Smartbow Gmbh Method for locating animals using radio waves
CN107924455A (en) * 2015-07-14 2018-04-17 尤尼伐控股有限公司 Computer vision process
CN107924455B (en) * 2015-07-14 2024-02-23 尤尼伐控股有限公司 Computer vision process
US9707961B1 (en) 2016-01-29 2017-07-18 Ford Global Technologies, Llc Tracking objects within a dynamic environment for improved localization
US10077054B2 (en) 2016-01-29 2018-09-18 Ford Global Technologies, Llc Tracking objects within a dynamic environment for improved localization
US10754350B2 (en) 2016-06-09 2020-08-25 X Development Llc Sensor trajectory planning for a vehicle
US9880561B2 (en) 2016-06-09 2018-01-30 X Development Llc Sensor trajectory planning for a vehicle
US10108194B1 (en) 2016-09-02 2018-10-23 X Development Llc Object placement verification
WO2018134462A1 (en) * 2017-01-17 2018-07-26 Latorre Otero Alejandro Location system for kitesurfing
US11429084B2 (en) 2017-09-05 2022-08-30 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Assisted assigning of a workpiece to a mobile unit of an indoor location system
US11635297B2 (en) 2017-09-05 2023-04-25 TRUMPF Werkzeugmaschinen SE + Co. KG Image-supported assignment of a processing plan to a mobile unit data set of a mobile unit of an indoor location system
US10356556B2 (en) 2017-10-06 2019-07-16 Comissariat À L'Énergie Atomique Et Aux Énergies Alternatives Method for locating mobile devices in a common frame of reference
DE102018110145A1 (en) * 2018-04-26 2019-10-31 Trumpf Werkzeugmaschinen Gmbh + Co. Kg INTERIOR LOCATION SYSTEM FOR INDUSTRIAL MANUFACTURING
US11356811B2 (en) 2018-04-26 2022-06-07 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Indoor location systems for industrial production
CN111376249B (en) * 2018-12-28 2024-04-09 浙江菜鸟供应链管理有限公司 Mobile equipment positioning system, method and device and mobile equipment
CN111376249A (en) * 2018-12-28 2020-07-07 阿里巴巴集团控股有限公司 Mobile equipment positioning system, method and device and mobile equipment
US11755851B2 (en) 2019-08-23 2023-09-12 Cfa Properties, Inc. Object detection-based control of projected content
CN110585640B (en) * 2019-09-23 2021-08-27 应急管理部四川消防研究所 Monitoring system applied to rail train fire extinguisher and tracking and positioning method
CN110585640A (en) * 2019-09-23 2019-12-20 应急管理部四川消防研究所 Monitoring system applied to rail train fire extinguisher and tracking and positioning method
CN112114309A (en) * 2020-08-10 2020-12-22 西安电子科技大学 JPDA multi-target tracking method based on optimal contour coefficient self-adaptive K-means clustering
CN111948642A (en) * 2020-08-13 2020-11-17 贵州航天南海科技有限责任公司 Data processing method for detecting and tracking weak small target and high maneuvering target in strong clutter environment
CN116977902A (en) * 2023-08-14 2023-10-31 长春工业大学 Target tracking method and system for on-board photoelectric stabilized platform of coastal defense
CN116977902B (en) * 2023-08-14 2024-01-23 长春工业大学 Target tracking method and system for on-board photoelectric stabilized platform of coastal defense

Similar Documents

Publication Publication Date Title
WO2012167301A1 (en) Positioning, tracking and trajectory estimation of a mobile object
US9810767B1 (en) Location estimation system
Ferreira et al. Localization and positioning systems for emergency responders: A survey
Rantakokko et al. Accurate and reliable soldier and first responder indoor positioning: Multisensor systems and cooperative localization
EP2845067B1 (en) Distributed positioning and collaborative behavior determination
Liu et al. Cooperative relative positioning of mobile users by fusing IMU inertial and UWB ranging information
US8739672B1 (en) Field of view system and method
US9746330B2 (en) System and method for localizing two or more moving nodes
US9080886B1 (en) System and method for urban mapping and positioning
AU2015356865B2 (en) Electronic device for the near locating of a terrestrial object, and method of locating such an object
KR101229958B1 (en) Pedestrian position acquisition system in gps shadow area and method thereof
US6388611B1 (en) Method and system for dynamic surveillance of a remote object using GPS
Faramondi et al. An enhanced indoor positioning system for first responders
JP6054535B2 (en) Pedestrian motion recognition based pedestrian position estimation apparatus and method
CA2697060A1 (en) Method and apparatus for sending data relating to a target to a mobile device
US11774547B2 (en) Self-positioning method, self-positioning system and tracking beacon unit
WO2012121819A1 (en) Inertial navigation system and initialization/correction method therefor
Fisher et al. Precision position, navigation, and timing without the global positioning system
Ali et al. Systematic review of dynamic multi-object identification and localization: Techniques and technologies
KR20190094684A (en) System for measuring position
Bahillo et al. Low-cost Bluetooth foot-mounted IMU for pedestrian tracking in industrial environments
Beauregard Infrastructureless pedestrian positioning
Azizi et al. 3D inertial algorithm of SLAM for using on UAV
Jao et al. UWB-Foot-SLAM: Bounding position error of foot-mounted pedestrian INS with simultaneously localized UWB beacons
Fischer et al. SLAM for pedestrians and ultrasonic landmarks in emergency response scenarios

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12796134

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12796134

Country of ref document: EP

Kind code of ref document: A1