WO2009137616A2 - Novel sensor apparatus - Google Patents

Novel sensor apparatus

Info

Publication number
WO2009137616A2
WO2009137616A2 (PCT/US2009/043033)
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
data
sensors
network
pixel
Prior art date
Application number
PCT/US2009/043033
Other languages
French (fr)
Other versions
WO2009137616A3 (en)
Inventor
Andrew J. Griffis
Michael Powell
Original Assignee
Strongwatch Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Strongwatch Corporation filed Critical Strongwatch Corporation
Publication of WO2009137616A2 publication Critical patent/WO2009137616A2/en
Publication of WO2009137616A3 publication Critical patent/WO2009137616A3/en


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/003Transmission of data between radar, sonar or lidar systems and remote stations
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19663Surveillance related processing done local to the camera
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06Systems determining position data of a target
    • G01S13/46Indirect determination of position data
    • G01S2013/468Indirect determination of position data by Triangulation, i.e. two antennas or two sensors determine separately the bearing, direction or angle to a target, whereby with the knowledge of the baseline length, the position data of the target is determined
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating

Definitions

  • While holding the initialization probe module (Figure 14), the probe is used to send an initialization message 230 to the sensors in the network to enter initialization ("Init") mode, the sensors having already been installed and the probe having been loaded with the device addresses of the network.
  • the probe module is then set to record its initial position 231, which may include the entry or capture of GPS data, after which time the probe continuously records 232 position states and sensor data events (as a function of time) from the network of sensors.
  • the user then moves / walks through the region 233 encompassed by the network of sensors and is detected / measured by the network, e.g., from X to Y to W to Z to Y to W to X to Z and then finish by walking around the perimeter.
  • the calibration probe includes an output for driving such a known blackbody source.
  • a robotic device with known blackbody probe attached could have the calibration probe placed on it and used to explore the sensor network without any unknown (human, in this case) objects.
  • the lower part of this figure suggests an extension to the multi pixel architecture wherein the calibration pixel is mounted on a rotating element 276.
  • the calibration pixel in this case has two states: blackbody 275 and mirror 274.
  • in the blackbody state (i.e., when the blackbody side is pointed at the sensing pixel), the calibration pixel is used to radiate the sensing pixel so as to produce gain and offset calibration data.
  • in the mirror state (i.e., when the mirror is pointed at the sensing pixel), the calibration pixel is used to reflect the sensing pixel's radiation, thereby enabling a measurement of the sensing pixel's self-brightness, which is useful for its characterization generally and also for gaining additional accuracy in estimating the calibration offset measured with the blackbody state.
  • Figure 16 uses shared memory 303 between sensor data processing 302 and actuation control 304 so as to enable both functions to have "awareness" of each other and reach a joint optimum operating point rather than an independent optimum operating point, the latter being determined by the root sum square (RSS) of error variances, the former being less than the RSS by at least the amount of correlation between the two functions as concerns errors (e.g., departures from mathematically ideal operation). If joint prediction and correction is used, substantial improvements in joint performance can be attained without having changed any components; the improvement comes through measurement of root causes of system level behavior and compensation for those in real time through physics based modeling and computing.
  • Figure 18 illustrates a simplified state aware encoder wherein a pattern is embedded or imprinted on a disc 340; an illuminator 341 (or equivalent excitation in the case of, e.g., a magnetic disc) makes the pattern visible or otherwise available to a detector; a high speed detector 342 (e.g., an array of one or more elements) captures the pattern-induced energy, formats it for decoding and then forwards the information to a decoder 343 that can derive the mechanical state of the disc in terms of position, velocity, etc., and can include system state information if the information was encoded during integration of the encoder into the system (or at time of manufacture, etc.; wherever in the process the information is available and ready for imprint / storing).
  • One way to achieve GPS or geo-location cueing is to have the mobility device communicate its GPS position to the invention, from which a bearing can be calculated (knowing both GPS locations, and knowing altitudes from knowledge of local terrain in the field of view), a zoom established, and the correct pan/tilt position commanded to direct the sensor 101 field of view to the person holding the mobility device (an illustrative bearing / elevation sketch is given after this list).
  • Applications for this are many, but include tracking an individual (e.g., a security guard) within sensor range of the invention (if repeated GPS updates are sent, or if computer vision algorithms are used to detect and track the individual after an initial GPS cue or set of cues) so as to provide a recording of the actions taken by the person or to provide an extra observer for situational awareness.
  • GPS in mobility devices can enable enhanced security in retail applications, such as for observing a customer walking across the parking lot from a building entrance to their vehicle or destination, having registered their mobility device with the retailer's installation of the invention, e.g., via Bluetooth wireless 104 or WiFi wireless 103 commands, or via a wired connection to a wide area connection with access to the mobility device carrier network.
  • This registration can be enabled in multiple ways, and the preferred embodiment is a downloadable plug-in or application that is purchased by the owner of the device in concert with or through the retailer, such that the retailer and the carrier (e.g., the phone company) both have revenue opportunities associated with the service, said service being integral to their secure billing and telephony systems.
  • the physical packaging of 200 in Figure 8 having multiple sensors 208 and digitizers 209 can take on diverse forms, depending on requirements. However, for security / surveillance applications, the preferred embodiment involves separating the sensor 208 / digitizer 209 portion (or just the sensor in the case of a multiplexer) from the balance of the sensor module 200 and using a plurality of sensor 208 / digitizer 209 modules that connect mechanically and electrically to it. Since each sensor/digitizer constitutes a distinct field of view (e.g., 222 in Figure 9), a plurality of such fields of view (3 per sensor are shown in Figure 9) can be generated for each sensor module by stacking sensor/digitizer modules on top of the sensor module 200.
  • sensor/digitizer modules can be stacked and directed arbitrarily so as to produce many different distinct regions of detection uniquely associated with a given sensor/detector module.
  • a radial pattern (Figure 9 shows 3 sensor/digitizer pairs per sensor module, distributed radially in a 90 degree sector) is readily achieved with very few limitations on the number of sensors stacked.
  • the preferred embodiment of such a stackable sensor scenario is, e.g., a stack of cylinders containing either sensor modules or sensor/detector modules that mounts on a pole or tube of the same diameter, so that the stack of modules can be placed at a useful height for sensing at significant distances, out of the reach of would-be vandals.
  • said pole of the preferred embodiment would carry the solar or wind generation capability for a renewable energy based solution and would also provide a ready means of mounting to ground structures.
  • a related use of the invention envisioned does not require a laser but does rely on line of sight. If a person is equipped with head gear that contains inertial and compass sensors, and if such apparatus is placed so as to associate the line perpendicular to the wearer's face with the desired direction, then the inertial and compass data can be used with the perpendicular to construct a line of sight and direct smart cameras along that line of sight. Furthermore, if the line of sight information is relayed to a local or remote computer, digital elevation map or live mapped data can be directly used to project the line of sight to intersect the nearest object and this object can then be imaged and/or catalogued for archival or ongoing surveillance and study.
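By way of illustration only (this sketch is not part of the original disclosure), the bearing and elevation calculation referred to in the GPS cueing bullet above might look as follows; the coordinates, function names, and the flat-earth approximation are all assumptions suitable only for short ranges:

```python
import math

# Hedged sketch: given the GPS fix of the camera and of the mobility device
# (plus altitudes from local terrain knowledge), compute a pan bearing and a
# tilt elevation for the pan/tilt command. Constants are approximate.

def bearing_and_elevation(cam, target):
    """cam, target: (lat_deg, lon_deg, alt_m). Returns (pan_deg, tilt_deg)."""
    lat0 = math.radians(cam[0])
    m_per_deg_lat = 111_320.0                      # approximate meters/degree
    m_per_deg_lon = 111_320.0 * math.cos(lat0)
    north = (target[0] - cam[0]) * m_per_deg_lat
    east = (target[1] - cam[1]) * m_per_deg_lon
    up = target[2] - cam[2]
    pan = math.degrees(math.atan2(east, north)) % 360.0
    tilt = math.degrees(math.atan2(up, math.hypot(east, north)))
    return pan, tilt

# Example: camera on a pole; a guard with a GPS phone ~200 m to the northeast.
print(bearing_and_elevation((32.2000, -110.9000, 15.0),
                            (32.2013, -110.8985, 5.0)))
```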

Abstract

An invention is disclosed that provides automated object detection, tracking and reporting with integral computer vision, motion control / stabilization, communication, cueing, renewable energy, robust day/night security monitoring, measurement of the location of objects passing through the secure area, production of images of those objects (including their three-dimensional character), calibration of the sensor network, and a means of realizing the invention in monolithic form. Finally, an invention is disclosed that advances the state of the art in motion control and remote sensing through the introduction of joint sensor and motion optimal control, state-based encoding / decoding, harness-free operation of gimbals, gimbal design and fabrication that is tolerant of mass and inertial imbalance, and motor design that is PCB-centric, such that PCB processes can yield a complete single-axis control solution in a pair of PCBs integrated around a shaft and a coaxial mechanical advantage device.

Description

Title: Novel Sensor Apparatus
Inventor: Andrew J. Griffis, Michael Powell
Related Application/Claim of Priority
This application is related to and claims the priority of Provisional Application No. 61050850, filed 06-MAY-2008, entitled Novel Actuation Apparatus and Methods; No. 61051051, filed 07-MAY-2008, entitled Network Security Sensor; and No. 61051078, filed 07-MAY-2008, entitled Integrated Surveillance System; which provisional applications are incorporated by reference herein.
Background
[0001] The present invention relates to the fields of imaging, sensor networks, and actuation and control for facilitating the same. The invention has particular utility for security and surveillance but is readily used for other applications that require the same functions, including retail, elderly care, and many remote sensing applications. The invention is described in the following paragraphs in relation to imaging (e.g., cameras), sensor networks and actuation systems.
[0002] The use of imaging technology for surveillance and security has received a lot of attention in the past several years because of the growing desire for personal security and the demonstrable need to deal with petty theft and vandalism at an increased level. Having a camera installed to enable observation appeals to security-seeking customers because a camera seems to indicate that someone is watching out for thieves, vandals, etc. However, the presence of a camera typically only guarantees that it is possible to have someone watching, and that only if there is adequate light. In practice, if the lights are not on (night time) or there's not a security guard to carefully observe camera data and react promptly, the camera is only useful for recording what happens for later review (i.e., if a crime occurs within the field of view). If adequate funds and power are available, such as is the case for some businesses, one might purchase a smart video camera that can detect people or other objects and send an email or other message, provided the lights are on or some light is available. However, without funds for a sophisticated camera and assurance that a good image can be obtained, the use of video cameras provides little actual security.
[0003] The best means of realizing security is to provide a smart camera that doesn't require lighting and, ideally, allows for rapid (e.g., high slew rate) precision tracking of objects in and out of the field of view of the camera, such as might be achieved with a gimbal mechanism, so that a single camera can provide wide area coverage without loss of resolution, and so that the influence of motion on the camera can be mitigated (e.g., gyro stabilization), since many places that require a camera are not free of vibration or platform movement (both of which are problematic for many detection and tracking schemes), especially when viewing distant objects (e.g., at significant zoom). Finally, as it is often the case that installation is difficult or expensive, providing a means of simple power and communication for a security device is highly advantageous.
[0004] Camera systems for achieving the security objectives cited above can be built by a well funded, skilled system integrator. However, there is no product solution that integrates the necessary features into a whole and provides this integration at a total cost of ownership within reach of an individual. The present invention addresses this need by a novel integration, manufacture, and use of technologies for 1) day/night imaging without sources of illumination, 2) embedded image processing, 3) gyro stabilization with a one- or multi-axis gimbal, 4) network communication, 5) renewable energy (e.g., wind / solar), 6) integral means of interacting with low cost cueing systems, 7) integral simple cueing to extend coverage without additional FPAs, and 8) capability for using digitally stabilized visible light technology in concert with multi-stabilized IR technology.
[0005] The invention addresses not only the imaging device, or camera, but also facilitates use of sensor networks as follows.
[0006] The use of motion detection technology (e.g., PIR or equivalent thermal infrared detectors) for security (e.g., alarm panels or equivalent) has been widely adopted owing to the relative ease of installation (2-3 wires) and the corresponding low cost as compared to nearly any other solution for detecting the presence of warm bodies (e.g., human, other living warm-bodied organism) in a room, hallway or other space. Also, since they are widely available and nearly ubiquitous, significant research has been invested in their use for indoor spaces so as to reduce false detections, since these progressively erode confidence in (and therefore the utility of) alarm systems. However, even for the more benign indoor scenarios, false alarms do occur with non-negligible frequency, and if / when outdoor scenarios are attempted, a very high false alarm rate can be expected due to the presence of animals and other animate and inanimate matter moving about. Thus, the use of PIR-like devices represents a compromise of sorts.
[0007] Furthermore, since outdoor use is virtually prohibited by the high false alarm rate, and since other technologies are not effective in the dark (when the criminal element prefers to work), the PIR-enabled alarm panel is really only useful for detecting a criminal event after the fact. There is no opportunity to suspect a pending event or attempt interdiction, thereby precluding the criminal activity. There are technologies that might deliver what is needed (e.g., thermal infrared imaging systems with embedded detection and communication) but not at a cost that is acceptable to many small business owners or homeowners, where the device count is highest in the aggregate (and where most of the crimes occur).
[0008] The invention addresses the lack of PIR (or equivalent non-imaging detector) performance and capability for indoor and outdoor applications by a novel means of 1) using existing thermal infrared sensors (e.g., PIR detectors), 2) extracting and processing signals from such sensors, 3) using such sensors in concert with movable apertures or multiple fixed apertures, 4) deploying such sensors with wireless intra-sensor communication, 5) combining such sensors with inertial sensors, and 6) integrating such sensors with inertial and other moving elements into a single device (e.g., monolithic silicon, multi chip module, or microelectronic assembly).
[0009] In order to address the inevitable motion (e.g., vibration) of the mount for a networked camera or sensor, or the desired movement of a sensor so as to vary its field of view, the invention addresses actuation and control, with, for example, application to gimbals used for stabilizing imaging sensors, as follows. The invention has application beyond gimbals, extending to any use case that involves controlled motion (e.g., motors, controllers, automation, transportation, etc.).
[0010] Control of machines is fundamental to modern technology, and machines that involve motion are one type. The evolution of machines having motion control has proceeded from the diverse bases of motors, engines, mechanics, electronics and computation. These bases pull from disciplines that focus on very different topics, including combustion, magnetics, strength of materials, dynamics, statics, charge transport, electromagnetics and computer science. As a result, the control of machines has developed design traditions having barriers between disciplines, and these barriers limit the scaling of size, performance and cost.
[0011] For example, one can readily find state of the art airborne surveillance systems that have imaging performance limits defined by the jitter of the stabilizing gimbal, owing to the fact that control of the gimbal is separated from the collection of imagery: there is a barrier between the gimbal control discipline and the image acquisition discipline. A related example in gimbals used for imaging (or any other use of a gimbal, per typical usage) is the necessity of the harness for power and data. The harness is always required (in some fashion; there are rotary couplings that can be used), and it limits the range of motion or the design of the gimbal owing to the separation between the disciplines of electronics and mechanics.
[0012] A slightly different example is in the construction of a motor controller, wherein one always finds an encoder (or equivalent device) that will have a companion decoder inside the controller computer or on a nearby circuit. If maximum performance is to be realized, one will also find a vector-driven or state-driven algorithm in the computer that makes use of the decoded information to accommodate the peculiarities of the motor and/or gear cluster under control. But one will not find an encoder that accommodates the system peculiarities directly, as there is a barrier between the practices of the control theorist, the computer scientist, and the optical engineer (in the case of an optical encoder) in regard to how one implements the blocks in the system block diagram that includes the encoder / decoder / controller functions.
[0013] Finally, in the manufacture of motors, there will always be wire wound assemblies that provide magnetic fields for the required electromotive force. However, while this is often the best solution, there are times when, were the barrier between the electronics and magnetics engineering traditions not present, solutions could instead be based on modern electronics practices that have seen a disproportionate share of investment dollars in recent decades (especially as compared to motor manufacturing methods).
[0014] All of these examples of technical barriers are addressed by the invention in addition to other similar barrier resolving innovations. In short, the invention is a combination of discipline-crossing innovations that have bearing on the control of machines that move, with application to sensing and imaging for security and surveillance, for example.
Summary of the Invention
[0015] The present invention provides for the integration and use of technologies for: 1) day/night imaging without sources of illumination, 2) embedded imaging processing, 3) gyro stabilization with a multi axis gimbal, 4) network communication, 5) renewable energy, 6) integral means of interacting with low cost cueing systems, 7) integral simple cueing to extend coverage without additional FPAs, 8) capability for using digitally stabilized visible light technology in concert with multi-stabilized IR technology, 9) automation of device design and automation of system design based on the invention, 10) distribution of computation across a network of fixed or mobile computing devices, and 11) use of the invention in social networks to enable community based security networks to form at small and large scale.
[0016] As it touches upon sensors and sensor networks, and in its simplest form, the invention combines digital and analog signals from a plurality of sensor modules (sensor network), each of which is comprised of a sensor, digitizer, processor and wireless communicator. The combination of signals from the sensor network is accomplished by a computer that is either distinct from the sensor network or is one of the sensor modules designated to be the master module (e.g., thereby forming a master slave network of sensor modules). The combination of signals from the distinct but overlapping fields of view of sensor modules is accomplished by algorithms that detect objects, classify them (e.g., human or non-human object), and localize them within the area being viewed by the sensors (e.g., through ray tracing).
[0017] By combining signals from sensors having distinct but overlapping fields of view, statistically independent observations of objects can be obtained and used to eliminate false detections. In the simplest case, the elimination of false detections is achieved by applying the logical AND operation to the detection results of sensors that could detect the same object. For example, if solar glare in an outdoor setting is illuminating one sensor so as to trigger a detection, it is highly improbable that another sensor would see the same glare in the same location, since solar glare is a narrow field of view phenomenon and very dependent on viewing angle. Thus, by logically combining the outputs of two sensor modules viewing a glare region from substantially different angles, the glare is not likely to trigger a detection in the sensor network.
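As an illustration (not part of the original disclosure), the coincidence logic of paragraph [0017] can be sketched in a few lines of Python; the sensor labels, region cells and data structures here are hypothetical:

```python
# Minimal sketch of the logical-AND coincidence detection of [0017].
# Names and cell coordinates are illustrative, not from the patent.

def coincident_detection(detections_by_sensor, overlap_pairs):
    """Confirm a detection only when two sensors with overlapping fields of
    view both report an object in the same region (cell).

    detections_by_sensor: dict mapping sensor id -> set of region cells where
        that sensor currently reports a detection.
    overlap_pairs: iterable of (sensor_a, sensor_b) pairs whose fields of view
        overlap, e.g. as determined during network calibration.
    """
    confirmed = set()
    for a, b in overlap_pairs:
        # Viewing-angle-dependent artifacts such as solar glare rarely appear
        # in both sensors for the same cell, so the intersection rejects them.
        confirmed |= (detections_by_sensor.get(a, set())
                      & detections_by_sensor.get(b, set()))
    return confirmed

# Glare triggers only "S-A" in cell (4, 2); a person triggers both "S-A" and
# "S-B" in cell (7, 5), so only the person is confirmed.
dets = {"S-A": {(4, 2), (7, 5)}, "S-B": {(7, 5)}}
print(coincident_detection(dets, [("S-A", "S-B")]))   # -> {(7, 5)}
```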
[0018] Invention variations to the simplest form include: 1) sensor modules with a plurality of sensors having distinct fields of view, 2) sensor modules with a plurality of sensors each having substantially the same field of view but having masks (e.g., across all or part of the aperture / lens) so that an image can be formed, 3) sensor modules with one or more sensors and a movable lens / aperture that realizes a time varying field of view (e.g., spot / scanner), 4) sensor modules with one or more sensors and a fixed lens / aperture and a means of moving the focal plane (e.g., external actuator) so as to produce a time varying field of view that can be used to image, 5) sensor modules with one or more sensors and a fixed lens / aperture, means of moving the focal plane, and also means of measuring the motion of the focal plane (e.g., by inertial sensors such as gyros and accelerometers), 6) instantiations of the foregoing that are realized using discrete off the shelf components, 7) instantiations of the foregoing that are realized in monolithic form (e.g., CMOS MEMS ASIC), sans the external actuator.
[0019] The invention includes methods and means of calibrating the sensor network using a known source having an optional means (e.g., GPS, other radiolocation) of known position during calibration, such that the geometrical relations between sensors in the network can be determined unambiguously and used for detection and classification of objects in the field of view of the network.
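A toy sketch of the calibration idea in [0019] follows; it is not from the patent, and the local east/north coordinates, beam offsets, and averaging scheme are assumptions. A probe of known (e.g., GPS-derived) position is detected in various sensor beams, and each detection constrains the sensor's boresight azimuth:

```python
import math

# Hedged sketch: each calibration event pairs the detecting beam's nominal
# angular offset with the true bearing from the sensor to the probe's known
# position; the boresight estimate is the circular mean of the implied values.

def bearing(from_xy, to_xy):
    """Azimuth in degrees from one local east/north point to another."""
    dx, dy = to_xy[0] - from_xy[0], to_xy[1] - from_xy[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def estimate_boresight(sensor_xy, events):
    """events: list of (probe_xy, beam_offset_deg) recorded while the probe
    walked the network. Returns the estimated boresight azimuth in degrees."""
    # Each event says: boresight + beam_offset ~= bearing(sensor -> probe),
    # so boresight ~= bearing - beam_offset; average on the circle.
    s = c = 0.0
    for probe_xy, offset in events:
        theta = math.radians(bearing(sensor_xy, probe_xy) - offset)
        s += math.sin(theta)
        c += math.cos(theta)
    return math.degrees(math.atan2(s, c)) % 360.0

events = [((10.0, 20.0), -15.0), ((25.0, 18.0), 0.0), ((30.0, 5.0), 15.0)]
print(round(estimate_boresight((0.0, 0.0), events), 1))
```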
[0020] As it touches upon actuation and control, the invention comprises an ensemble of components for combined motion control and remote sensing. Figure 15 shows an example architecture of the invention components by function from the perspective of, or with emphasis upon, actuation and control. In this figure, the invention is an integration of components around the function of surveillance, wherein a sensor is used to gather surveillance data that is processed and simultaneously used to inform, via shared memory, the control of the mechanism that supports the sensor. Likewise, in a preferred embodiment, the figure shows a control processor that aims to close a control loop around the mechanism, while simultaneously using its commands and measurements to inform the sensor data processing. A state-ful (or state aware) position sensor is used to sense the mechanical and inertial position of system components (including the system as a whole) while the amplifiers are commanded by the controller to drive the motor.
[0021] Still further, in a preferred embodiment, a balance- and inertia-tolerant mechanism physically connects the control system to the sensor, this mechanism relying solely on mechanical advantage and computer feedback/control to achieve performance. The power and data connections are made throughout the system without harness / wiring, so that the motion of the sensor is unimpeded by harness mechanical constraints.
[0022] Other features of the present invention will become further apparent from the following detailed description and the accompanying drawings.
Brief Description of the Drawings
[0023] Figure 1 is an architecture for a multi-sensor (e.g., multiple focal plane) imaging device module according to the present invention;
[0024] Figure 2 is an illustration of the integration of a movable (e.g., via gimbal) camera with a network of sensors (e.g., for cueing), according to the principles of the present invention;
[0025] Figure 3 shows an algorithm for detecting objects with a sensor according to the present invention.
[0026] Figure 4 shows an algorithm for tracking objects detected with a sensor according to the present invention.
[0027] Figure 5 depicts an algorithm for producing super resolution sensor data according to the principles of the present invention.
[0028] Figure 6 shows an algorithm for managing the invention operation using sequential hypothesis testing in the context of a sensor network cueing a networked camera according to the present invention.
[0029] Figure 7 illustrates the use of feedback and feed-forward techniques between imaging processing and position control functions to maximize performance according to the present invention.
[0030] Figure 8 shows the architecture of a sensor module (e.g., for use in a network of sensors) according to the present invention, comprised of a sensor (e.g., PIR device, microbolometer, or other IR sensing device, including but not limited to infrared wavelengths), a digitizer, a microcontroller (or equivalent computing device), memory (volatile and non volatile), an aperture / lens, and a means of wireless communication (e.g., Zigbee, wireless USB, or other wireless network means). Interchangeable or optional features for the invention are shown in dashed lines.
[0031] Figure 9 depicts the invention when a plurality of sensor elements (e.g., A, B, C, D) with multiple fields of view (e.g., 1, 2, 3) are deployed in a space and used to detect objects (e.g., W, X, Y, Z) according to the present invention. The plurality of fields of view for each sensor is shown along with the notional amplitude associated with each field of view (shown as 1, 2, 3), akin to an antenna pattern that a radio frequency sensor might have.
[0032] Figure 10 shows the sequence of events for calibrating a sensor network using an initialization probe that contains a computer for reconciling the sensor states with respect to observed probe states according to the present invention.
[0033] Figure 11 shows a multi element pixel example for a compact (e.g., monolithic or multi chip module) version of the module, such that calibration is facilitated within the MCM or monolithic assembly, according to the present invention.
[0034] Figure 12 illustrates the flexibility of the system for connecting cameras and / or sensors so as to enable data sharing across a local or wide area network, according to the present invention.
[0035] Figure 13 depicts a means of including emissive and reflective elements near sensor (e.g., pixel) elements in a compact sensor according to the present invention.
[0036] Figure 14 shows the architecture of a sensor network initialization probe according to the present invention.
[0037] Figure 15 shows an architecture according to the present invention with the emphasis upon the gimbal for stabilization and control in a remote sensing application.
[0038] Figure 16 shows an example of sensor data processing wherein the shared memory between processing tasks allows sensor data to inform control data and vice versa so as to effect super resolution (e.g., interpolating between pixel elements or samples), according to the present invention.
[0039] Figure 17 shows the architecture of a harness free approach that uses transformers to couple power and data into a moving mechanism using circular symmetric coils, according to the present invention.
[0040] Figure 18 illustrates the use of mechanical advantage to render gimbal designs tolerant of mass and inertia imbalances, according to the present invention.
[0041] Figure 19 illustrates the notion of a state aware encoder, e.g., using an optical encoder approach, according to the present invention.
[0042] Figure 20 illustrates the integration of elements of the present invention with the motor to produce a means of actuation that is completely integrated on a per axis basis, with the controller, amplifiers, power / data, encoder / decoder and motor windings and mechanical advantage elements all integrated into a pair of disc shaped printed circuit boards (PCBs).
Detailed Description
AN ILLUSTRATIVE EXAMPLE
[0043] An example of the invention used in a realistic situation is helpful to understanding the relation of the drawings to the invention. Suppose there is a need to monitor a remote equipment yard such as would be used by a building contractor or, for instance, a telecommunications company with a field service office and outdoor warehouse. The equipment yard has a fence around its perimeter and a small building near the front gate that has power and a phone line. Some equipment is stored inside another fenced perimeter within the fenced equipment yard, and this inner fence has steel fence posts approximately 8 feet high. In order to reduce theft, the owner of the equipment yard wishes to monitor the perimeter for passersby and loiterers so that break-ins can be anticipated (so as to allow for interdiction) and, should they occur unexpectedly, have a video record of the activities of the perpetrators. In order to save electricity, there will be no yard lights other than what's on the corner of the small building. In order to save cost on the installation, the monitoring equipment must be able to be installed on one of the inner fence posts without connecting power or data: just a mechanical clamp-style connection.
[0044] The invention meets the requirements of the building owner as a result of the integration of many technologies into a single invention. Figure 1 shows the architecture 100 of the invention, in which at least one sensor 101 is connected to a computer / processor 102 that receives the sensor data and uses it to detect and track objects of interest (e.g., a human or a truck). In the preferred embodiment, if only one sensor is used, this sensor is an infrared sensor, so that yard lights are not required in order to detect entry. If more than one sensor is used, a color camera sensor can be used to provide color video for forensic support if adequate light is available for imaging. But at least one sensor capable of day / night operation is preferred.
[0045] Since the invention is mounted on a fence pole, when the wind blows there is motion in the sensor that must be compensated for in real time so that a false detection does not arise from the apparent sensor data changes that would otherwise occur. Thus, the invention connects the sensor processing 102 (Figure 1) to the control of the position of the sensor, the control being accomplished with one or more axes of motor control 109 informed by inertial data 106 and commanding motors/drivers that actuate one or more mechanisms 112 under a closed loop based on position sensors 110 at the motor or the load, leading to a stabilized field of view for the sensor: the sensor data do not show any artifacts of motion. By enabling sensor processing and control processing to share data through a common interface or shared resource 108 (e.g., dual port memory), the invention can attain performance better than that possible if the system performance is constrained by independent error variances; the errors between the sensor data and the position data, for instance, are correlated, and so there is an operating error based on joint estimation that is not attainable with independent estimation. An example of data flow for such joint estimation / control is shown in Figure 7.
[0046] Continuing, algorithms for detection and tracking similar to those shown in Figure 3 and Figure 4 operate on the stabilized sensor data and monitor the field of view provided for the invention. Since the field of view is often only a small part of 360 degrees, this will allow an intruder to go unobserved if the invention is not augmented with other similar devices. For this reason, the invention has an integral wireless interface (e.g., Zigbee) for communicating with other invention devices or with companion lower cost cueing devices that can detect events but may not be able to derive information for tracking and / or classification (e.g., human vs. non-human).
[0047] Figure 2 shows how a network of cueing devices 123 can be used to detect the location (via triangulation and coincidence detection) of an object, e.g., a human 126, and then relay this to a smart camera 120, e.g., a pan-tilt-zoom (PTZ) camera having a wireless interface 121 to the network of cueing devices, which also have a wireless interface 124, so that a better estimate of what the object is can be obtained and tracking can continue at higher resolution. In this scenario (Figure 2), each cueing device 123 (three devices, S-A, S-B and S-C, are shown) is comprised of one or more sensors having a reception pattern 122 (labeled 1, 2, 3 for each of S-A, S-B and S-C) that is distinct. The smart camera 120 (PTZ) also has a reception pattern, or field of view, 125 that it redirects toward the object of interest 126, so that the sensor network and smart camera 120 work jointly to capture data of interest (e.g., imagery) for the object of interest 126. Thus, using multiple inventions and/or an ensemble of cueing devices, a limited field of view invention can be used to survey a larger area, or a set of inventions can monitor a wide area and coordinate activities for detecting and tracking objects of interest.
[0048] Occasionally, an object of interest (e.g., an intruder near the outer fence) will be observed. The invention will have detected and tracked this object when it is in proximity to the observed perimeter, and thus a report can be generated describing the statistically significant variation from normal conditions (unattended equipment yard). Since there was no wiring during installation of the invention, it relies on an integral wireless connection to transmit the event data and supporting sensor data if required by the setup conditions. The wireless data can be sent via low bandwidth (e.g., Zigbee) or higher bandwidth (e.g., WiMax, cellular telephony) means, these capabilities having been incorporated into the invention architecture.
[0049] Finally, since the invention has no external wiring for this remote installation, the power is provided by an external battery that is recharged by a solar panel and / or a small wind energy device mounted with or near the invention. The management of the power (e.g., the charging / discharging associated with the diurnal cycle and / or wind variations) is accomplished within the architecture as shown in Figure 1. The power consumption can be reduced, if need be, by relying on a network of low cost (and low power) cueing devices, so that full system operation is only engaged when appropriate cueing signals are present. This is suggestive of a sequential hypothesis testing approach, such as is illustrated in Figure 6.
[0050] In this way, the invention can be used to provide intelligent monitoring of remote areas with minimal cost to operate or install the invention, and without significant equipment cost, owing to the tight integration of functions within the invention.
CAMERA ARCHITECTURE
[0051] Figure 1 shows the camera architecture 100 comprised of one or more sensor modules 101 in communication with a processor module 102 having a connection with a shared resource (e.g., dual port memory) or interface 108 that further connects with peripherals and / or a controller 109 for actuation of a mechanism 112 by means of driven motors 111 using feedback from position sensors 110. The peripherals are furthermore used to inform the processor(s) 102 of the camera 100 orientation via inertial sensor data 106, and can locate the camera 100 on the map using GPS data 105, in addition to facilitating communication with wired or wireless interfaces 104 to, e.g., sensor networks, or wired / wireless communication 103 with a local or wide area network. The power is managed by a power controller 107 and battery module that enables direct connection to renewable energy, e.g., solar, as need be.
[0052] Figure 2 shows an example of using a smart (e.g., PTZ or pan / tilt / zoom) camera 120 with fixed or variable field of view 125 and wireless connectivity 121 to the sensors 123 that also have wireless connectivity 124 and one or more sensors 122 that have distinct reception patterns (labeled 1,2,3). In the preferred embodiment, the sensors are low cost, low power devices readily powered using renewable sources, e.g., solar. The combined elements of Figure 2 operate by detecting objects of interest 126 (e.g., a human) that enter the monitored region, the detection occurring at one or more sensors (sensor 3 of S-A and sensor 1 of S-B both detect the human as shown); geometrical relations between the sensors are used to deduce a probable location for the detected object and the pan/tilt mechanism of the camera 120 is directed via wireless communication to place the camera field of view 125 directly on the object of interest 126. In this way a high value camera can be used to cover a broad area thereby reducing its effective cost per area monitored.
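The geometry step described above can be illustrated with a short sketch (not part of the original disclosure); the local east/north coordinates, azimuth conventions and function names are assumptions:

```python
import math

# Hedged sketch of the Figure 2 flow: two cueing sensors report bearings to
# the same object; intersecting the two rays gives a probable location, which
# is then converted into a pan angle for the PTZ camera.

def intersect_bearings(p1, az1_deg, p2, az2_deg):
    """Intersect two rays given by origin points and azimuths (degrees,
    measured clockwise from north)."""
    d1 = (math.sin(math.radians(az1_deg)), math.cos(math.radians(az1_deg)))
    d2 = (math.sin(math.radians(az2_deg)), math.cos(math.radians(az2_deg)))
    # Solve p1 + t*d1 = p2 + s*d2 for t by 2x2 elimination.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        raise ValueError("bearings are parallel; no unique intersection")
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

def pan_command(camera_xy, target_xy):
    dx, dy = target_xy[0] - camera_xy[0], target_xy[1] - camera_xy[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

# Sensors at (0,0) and (100,0) see the object at 45 and 315 degrees: the rays
# intersect at (50, 50), and the camera at (50, -20) pans to due north (0).
target = intersect_bearings((0.0, 0.0), 45.0, (100.0, 0.0), 315.0)
print(target, pan_command((50.0, -20.0), target))
```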
DETECTION OF OBJECTS USING SENSOR DATA
[0053] Figure 3 shows an example of a detection algorithm for detecting objects in sensor data. Digital sensor data 130 are acquired (e.g., a digital image) and sensor data (e.g., pixel) statistics are used to separate foreground data from background data 131 so that foreground blobs (e.g., clustered, co-located pixels) can be found 132, from which shape and emissivity (in the case of infrared data) or reflectivity (in the case of visible light data) information 133 can be gleaned. Having so detected an object, its detection data is then forwarded to a tracker.
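A minimal sketch of this pipeline follows; it is illustrative only, and the threshold factor, array shapes and 4-connected flood fill are assumptions rather than the patent's method:

```python
import numpy as np

# Hedged sketch of the Figure 3 pipeline: per-pixel background statistics
# separate foreground, then 4-connected clustering finds blobs. A real sensor
# would update the background model (bg_mean, bg_std) continuously.

def detect_blobs(frame, bg_mean, bg_std, k=3.0):
    """Return a list of blobs, each a list of (row, col) foreground pixels."""
    fg = np.abs(frame - bg_mean) > k * bg_std      # statistical foreground test
    blobs, seen = [], np.zeros(frame.shape, dtype=bool)
    for r, c in zip(*np.nonzero(fg)):
        if seen[r, c]:
            continue
        stack, blob = [(r, c)], []                  # flood-fill one blob
        while stack:
            y, x = stack.pop()
            if not (0 <= y < fg.shape[0] and 0 <= x < fg.shape[1]):
                continue
            if seen[y, x] or not fg[y, x]:
                continue
            seen[y, x] = True
            blob.append((y, x))
            stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
        blobs.append(blob)
    return blobs

frame = np.zeros((8, 8))
frame[2:4, 3:5] = 30.0                              # warm object on cold scene
print(detect_blobs(frame, bg_mean=0.0, bg_std=1.0))  # one blob of 4 pixels
```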
TRACKING OBJECTS THAT HAVE BEEN DETECTED
[0054] Figure 4 shows an algorithm for tracking the objects detected with a sensor, wherein object data 140 are provided by the object detector (e.g., Figure 3), compared with objects already being tracked 141, and then used to decide 142 whether to initiate a new track (this is a new object) or update an old track (this is a known object). Having decided upon the nature of the tracked object, the data vector(s) containing tracking data are then updated. Having updated the data vectors, an updated localization 144 is performed so that spatial relationships between tracked objects can be analyzed and used to assess redundancy of multiply tracked objects and also to apply decision making rules regarding non-allowed regions for tracked objects. Having applied rules, reporting can ensue 145 so that the system operator (e.g., security officer, landowner, etc.) can be made aware of tracked objects and events.
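The initiate-or-update decision can be sketched as nearest-neighbor association; this is a simplification of the Figure 4 logic, and the gating distance and track record structure are illustrative assumptions:

```python
import math

# Hedged sketch of track initiation/update: associate each detection with the
# nearest existing track if it falls within a gate, otherwise start a new one.

def update_tracks(tracks, detections, gate=5.0):
    """tracks: dict track_id -> (x, y) last known position.
    detections: list of (x, y) centroids from the detector (e.g., Figure 3)."""
    next_id = max(tracks, default=0) + 1
    for det in detections:
        best_id, best_d = None, float("inf")
        for tid, pos in tracks.items():            # nearest existing track
            d = math.dist(pos, det)
            if d < best_d:
                best_id, best_d = tid, d
        if best_id is not None and best_d <= gate:
            tracks[best_id] = det                  # known object: update track
        else:
            tracks[next_id] = det                  # new object: initiate track
            next_id += 1
    return tracks

print(update_tracks({1: (0.0, 0.0)}, [(1.0, 1.0), (40.0, 7.0)]))
# -> track 1 updated to (1.0, 1.0); new track 2 started at (40.0, 7.0)
```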
GENERATING SUPER RESOLUTION DATA FROM LOWER RESOLUTION SENSOR DATA
[0055] Figure 5 shows a method for attaining super resolution sensor data through the control of the sensor field of view. Both noise-like (but measured and observed directly or indirectly) and deterministic motions can be used, along with knowledge of the point spread function and / or field of view per pixel and per pixel ensemble. Attaining super resolution simply means that more effective pixels are obtained in an image or similar sensor output than are natively present in the sensor (e.g., focal plane array). The approach is to move the sensor in the focal plane such that fractional pixel movement occurs, so that pixel centers all occupy new parts of the imaged scene. Thus the process begins with movement, or modulation, of the position of the sensor field of view 150, whereupon the integration of sensor data values begins, noting the times of beginning and ending the integration 151. Through knowledge of the motion (e.g., coordinates) in the time spanning the sensor pixel integration, the coordinates for the new (position modulated) sensor data can be computed 152 and used to map the data thus derived into a super coordinate system 153 and associated memory / storage 154 that is nominally some multiple of the number of pixels in the original sensor coordinate system. For instance, if a 320x240 focal plane array were used, and the array was modulated in ½ pixel steps up/down/left/right, a 640x480 effective image might be obtained by virtue of having sampled the 320x240 array once extra per pixel in each dimension so that each dimension grows by a factor of two. The preferred embodiment uses the point spread function of the array to deconvolve each pixel response in its neighborhood prior to remapping so as to optimally remove correlated noise / data due to the finite spatial response of a pixel in the array.
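The 320x240-to-640x480 example can be sketched as a simple interleaving; this illustration assumes ideal half-pixel shifts and omits the point-spread-function deconvolution that the paragraph names as the preferred embodiment:

```python
import numpy as np

# Hedged sketch of the super-resolution remapping in [0055]: four frames taken
# at half-pixel offsets are interleaved onto a grid with twice the resolution
# in each dimension.

def interleave_half_pixel(frames_with_shifts, shape):
    """frames_with_shifts: list of (frame, (dy, dx)) where dy, dx are shifts
    in units of the native pixel pitch (0.0 or 0.5).
    shape: native (rows, cols). Returns a (2*rows, 2*cols) array."""
    out = np.zeros((2 * shape[0], 2 * shape[1]))
    for frame, (dy, dx) in frames_with_shifts:
        # A half-pixel shift lands on the odd-indexed rows/cols of the doubled
        # grid; a zero shift lands on the even-indexed ones.
        out[int(2 * dy)::2, int(2 * dx)::2] = frame
    return out

rows, cols = 240, 320
frames = [(np.random.rand(rows, cols), s)
          for s in [(0.0, 0.0), (0.0, 0.5), (0.5, 0.0), (0.5, 0.5)]]
print(interleave_half_pixel(frames, (rows, cols)).shape)   # -> (480, 640)
```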
USING CUEING DEVICES TO TRIGGER IMAGING DEVICES
[0056] Figure 6 shows an example of a sequence for using a cueing device to trigger or direct an imaging device having the ability to alter its field of view (e.g., via pan / tilt / zoom means or an equivalent / similar technique). This effectively makes use of a sensor with very few pixels to maximize the utilization of a sensor with very many pixels, such as would be the case for a PIR sensor used to trigger a thermal IR focal plane array. This also presents an opportunity to conserve power by only using a relatively high power device when lower power devices have been exhausted for information content; this is an example of the invention being used for successive hypothesis testing so as to conserve power, a valuable feature in the context of renewable energy. This approach is extensible to any wavelength, of course, and the invention anticipates use across many wavelengths, employing optical, microwave, acoustic and other means interchangeably to achieve performance. The sequence begins with the detection of an object with a cueing detector 160, after which time the more sophisticated imaging device is enabled 161 and cued to the location where the detection occurred, and an image is captured 162. The image is then forwarded to a detection and tracking algorithm 163 from which objects can be detected using statistical means or morphological methods. Subsequently the detection and tracking rules can be applied 164 to determine the need to report / notify users about events. When rules have been applied and messages sent, the imaging devices can be idled while the low cost, simpler cueing devices return to their original state.
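The two-stage sequence can be sketched as a simple loop; this is illustrative only, and the stand-in read functions below are hypothetical placeholders for the PIR cueing device and the imaging focal plane array:

```python
import random, time

# Hedged sketch of the cue-then-image, power-saving sequence of Figure 6.
# cue_detected() and capture_and_classify() stand in for real device reads.

def cue_detected():
    return random.random() < 0.1       # placeholder for a PIR trigger

def capture_and_classify():
    return random.random() < 0.5       # placeholder for FPA detect/classify

def surveillance_loop(max_cycles=100):
    for _ in range(max_cycles):
        if not cue_detected():
            time.sleep(0.01)           # imager stays idle: low power draw
            continue
        # Second stage of the successive hypothesis test: the high-power
        # imager wakes only after the cheap cueing stage fires.
        if capture_and_classify():
            print("confirmed detection: apply rules, report, keep tracking")
        # Imager returns to idle; cueing devices resume their original state.

surveillance_loop()
```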
FEEDBACK AND CONTROL USING JOINT ERRORS
[0057] Figure 7 shows an approach to sharing data between sensor data processing and sensor field of view control (motion control) such that jointly optimum performance can be obtained. Beyond accounting for correlation between noise sources, it is known that much of the error in motion control is due to very deterministic processes, which when quantified / modeled can be removed computationally and / or used to compensate within a control algorithm, or to compensate in signal processing after data products are already obtained. The invention anticipates this class of processing for achieving high performance without high cost. The preferred embodiment is a sensor array comprised of one or more sensor elements (e.g., pixels) held within a stabilized fixture, such as would occur for a camera mounted to a two axis gimbal. For this embodiment, the process begins by setting sensor parameters 170 (e.g., beginning and ending integration times), capturing sensor data 171 and using sensor data (e.g., spatial correlation within and/or between samples) and control data (fed back) to estimate sensor errors 172 (e.g., pixels imaging a different part of a scene for which stationary data are desired); subsequently, knowing something about the current error state and the historical errors, parameters of the control system for the actuators, or gimbal mechanism, are set 173 so as to minimize the sensor data error and the control errors 175 based on the measured position 174 before and after commands are issued by the controller. Using this approach, the joint errors of sensor capture and gimbal position are minimized.
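A toy numerical illustration of the joint-versus-independent claim follows; the noise magnitudes are invented for the demonstration and are not from the patent:

```python
import numpy as np

# Hedged sketch: when the sensor-processing task can read the gimbal's
# measured pointing jitter from shared memory, it can subtract that
# (correlated) error from the apparent pixel motion instead of carrying the
# full root-sum-square error stack of two independent estimators.

rng = np.random.default_rng(0)
n = 10_000
gimbal_err = rng.normal(0.0, 1.0, n)          # measured pointing jitter
pixel_noise = rng.normal(0.0, 0.3, n)         # sensor-intrinsic noise
apparent_motion = gimbal_err + pixel_noise    # what imagery alone observes

independent = apparent_motion                 # no sharing: full error stack
joint = apparent_motion - gimbal_err          # shared memory: jitter removed

print("independent RMS error:", independent.std())  # ~sqrt(1.0^2 + 0.3^2)
print("joint RMS error:      ", joint.std())         # ~0.3
```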
AUTOMATION OF DESIGN PROCESSES
[0058] Because of the scalable nature of the invention, the invention anticipates the use of physics based modeling of the invention and its components to allow a customer to pick / choose amongst components (e.g., sensor, motor, processor, material composition) or amongst features (e.g., field of view, range to target, type of target, use case for data, mounting, environment) and allow customization of the product design based on available modeled components and applications. This is readily accomplished via an internet or web-based dialogue, so that online ordering of custom products can be coupled to the manufacturing floor where the product is built and tested prior to shipping to the customer.
[0059] Furthermore, since geographic information systems are readily available across public networks, the invention anticipates the use of online GIS / mapping information to conduct system design of a complete surveillance system, including the elevation (e.g., digital elevation maps) data needed to assess lines of sight and planning for the placement of ground sensors, mast mounted sensors, airborne sensors and waterborne sensors, or simply to place sensors around a residence (e.g., at corners of a house, alleyways, etc.) and identify the type of sensors and the fields of view and elevations required for each, along with the anticipated bandwidth and data product flows for the overall design. This online system design is then used to determine the requirements for individual product designs (e.g., whether single pixel sensors or gyro stabilized smart imaging sensors); these requirements are subsequently used with design automation to produce the required products, price them for the customer and allow customers to design entire systems in a single user interface online, obtaining accurate pricing and delivery for the system.
[0060] Thus, using, for example, GoogleMaps and online purchasing systems coupled to the invention's technologies for automated design (e.g., a server that accesses and executes on-demand Simulink models for actuation performance, Matlab models for sensor performance, C models for thermal IR detection and classification performance, and Matlab codes for topology, geography and ray tracing), and using the known / measured performance of the manufactured product, one can design a complete security and surveillance system for protecting a community, a military installation, an equipment yard, warehouse, industrial site, or residence. This enables individuals to access the invention, and also enables businesses to be established and built around the automation facilitated by the invention.
DISTRIBUTED COMPUTING AND MOBILITY COMPUTING
[0061] The invention makes extensive use of embedded computing to reduce large volumes of (e.g., image) data to small packetized data suitable for shipping over a low or limited bandwidth network. However, there is still significant information content in this highly reduced data, which will sometimes include image data as well. As computing devices become more mobile, as power becomes more valuable, as consumers have more and more reason to do all their computing on a limited number of devices, and / or as servers on networks are utilized more efficiently, it will be increasingly important to do post processing of the initial embedded computing data products on distributed networks and / or on personal mobility devices.
[0062] The invention thus anticipates the assignment of advanced analytical functions, e.g., tracking, to user personal devices, user home computers, or company network computers having excess capacity with value that can be extracted (or, conversely, money that can be saved). In this scenario, the invention provides software modules running its advanced code, e.g., tracking, to the user / customer for use on a personal computing device, e.g., a PDA, so that cost savings and energy savings can be realized, thereby lowering the overall system cost for the customer. The network connection between surveillance sensors and processors provides the connection for data products to reach mobile or remote computing devices, and since both the edge of the network and the customer computing resource have local storage, even intermittent data transmission will still allow for good performance under most conditions.
[0063] Because the invention can thus leverage mobility devices, it is anticipated that social networking will be used to jointly protect assets (e.g., a family protecting a home using the many mobile phones within the household) and to share information and data that are of interest to the community. Since mobility devices typically have GPS information for user locations, the social networking of security and surveillance devices thus enables communities to form around the invention for the protection of homes, neighborhoods, campuses, towns, cities, counties, states and countries, these communities being demarcated by levels of encryption, access codes, or other means well known for digital / networked technologies.
[0064] Because of the combination of mobility devices with observation technologies that the invention provides, it is possible to protect a social network, or community, by combining sensed features (e.g., height, infrared or radio frequency emissivity, optical or RF reflectivity, voice signature), GPS location and digital ID (e.g., phone number in the simplest case) to effect an adequate biometric protection canopy for the community. When combined with known biometric (e.g., fingerprint) technologies for access control, the invention allows a community based security and surveillance solution to be delivered that scales from the residence to the region, including sovereign states.
AN ILLUSTRATIVE EXAMPLE: SENSOR NETWORKS.
[0065] With reference to the sensor described in Figure 8 and the network of sensors in Figure 2, an example of how the invention can be used will be elaborated. The sensor 208 of Figure 8 works by collecting light from an object in the field of view at the wavelength for which the single pixel sensor(s) has (have) sensitivity (e.g., 10 microns for a thermal infrared device), the collection being accomplished by a lens or other aperture that provides focus and a restricted field of view. The collected light produces a measurable signal (voltage or current) that can be used to form a digital measurement 209 of the signal amplitude as a function of time. Using calibration data measured at the time of installation / initialization and stored in nonvolatile memory, the digital measurement thus made can be interpreted by a computer 210 as to where in the field of view the object is (e.g., which beam or pixel) and how bright the apparent object is as measured by the pixel; for a scanned implementation such as is implied here, the encoder data 207 and inertial data 203 can be used to associate a pixel with a specific region in the field of view by calculating the angular position of moving components (encoder data) and deducing the module's current position (inertial data). The use of acoustic data 206 can complement the measurement and potentially provide additional cues for detection. Finally, the daylight sensor 204 can be helpful in accounting for the effects of the diurnal cycle (e.g., solar warming cycles, or solar glare events) in interpreting sensor module data. Taken as a whole, then, the observables the sensor module can provide include photonic / image-pixel data, acoustic data and day/night indicators, and these data (at least the photonic data) are evaluated with algorithms in the microcontroller and then conditionally made available to the sensor network after forming packets and time stamping.
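As a concrete (and purely illustrative) sketch of the pixel-to-scene association just described, the routine below combines an encoder reading, a pixel index and an inertially derived platform heading into a single bearing; the counts per revolution, pixel pitch and headings are assumed values, not those of the actual device.

    def pixel_bearing_deg(encoder_counts, pixel_index,
                          counts_per_rev=4096, pixel_pitch_deg=3.0,
                          platform_heading_deg=0.0):
        """Bearing of a detection: scan-axis angle from the encoder, plus the
        pixel's offset in the focal plane, plus the platform heading derived
        from inertial data."""
        scan_angle = 360.0 * encoder_counts / counts_per_rev
        pixel_offset = (pixel_index - 1) * pixel_pitch_deg   # pixels 0, 1, 2
        return (scan_angle + pixel_offset + platform_heading_deg) % 360.0

    # A hit on pixel 2 while the scanner sits at count 1024, platform at 15 deg:
    print(pixel_bearing_deg(1024, 2, platform_heading_deg=15.0))   # 108.0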
[0066] The data from one sensor module are made available to the network only when adequate signal is present to suggest the presence of a real object. Thus, the fundamental role of a sensor module is as a cueing device. In the preferred embodiment, a sensor module only operates in multi-pixel mode (e.g., scanned by mechanical or electronic means) if adequate signal is first detected in single pixel mode; if the sensor module is used only as a cueing device (thereby removing the need for any multi-pixel function), then it can cue other imaging devices (e.g., a video camera) and / or other network nodes in the same fashion that it would cue itself. This can produce significant power savings at the power supply 212 node and also acts to inherently reduce the false detection rate by virtue of the binary output of the first step in sensor module operation (cueing).
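A minimal sketch of this two-stage behavior follows, assuming a k-sigma threshold detector; the noise figure, threshold multiplier and samples are invented.

    class SensorModule:
        def __init__(self, noise_sigma=1.0, k=5.0):
            self.threshold = k * noise_sigma
            self.mode = "single-pixel"

        def step(self, sample, network):
            if abs(sample) > self.threshold:
                self.mode = "multi-pixel"        # escalate: begin scanning
                network.append("cue")            # binary cue to peer nodes
            else:
                self.mode = "single-pixel"       # stay in the cheap mode

    network_messages = []
    module = SensorModule()
    for s in [0.4, -1.2, 7.5, 0.3]:
        module.step(s, network_messages)
        print(f"sample={s:+.1f}  mode={module.mode}")
    print("cues sent:", len(network_messages))   # only the 7.5 sample cues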
[0067] When data are made available to the network through a wireless interface 211, the preferred destination is a master module, which can be one of the sensor modules, suitably equipped with additional memory resources if needed. It is anticipated that the optimum scenario, however, will be one in which there is a dedicated master module (again, this could be a sensor module, or a remote server connected via LAN / WAN to the sensor network) that has little or no detection responsibility, so that all of its processor power can be made available. This is likely to reduce the minimum processor load required of each module, thereby keeping module power and cost at a minimum.
[0068] The field of view for each single pixel sensor is determined by the lens / aperture (not shown, but assumed and implied). Figure 9 illustrates a three-fold field of view 222 (1,2,3) for each sensor module 220 (A,B,C,D), implying either that one pixel is being used with an internal scanning apparatus to form multiple pixels over time, or that there are three pixels separated in the focal plane so as to form three distinct, though slightly overlapping, fields of view (a polar-type plot of sensor response is assumed). For the sake of this example, either means (or a combination of the two) is adequate for realizing the invention, though in practice one might choose one over the other (e.g., if scene dynamics would not allow for the slower image formation speed of a scanned system). In the space being observed with the sensor network, there are four objects 221 (W,X,Y,Z) to which the sensors are able to respond by virtue of the objects' emissivity or reflectivity.
[0069] Continuing the example, and supposing that objects W, X, Y, Z are sufficiently bright to produce measurable signals on some or all of the sensor modules, an ensemble of responses for the sensor network would be obtained. Using a notation for detections in which the first letter identifies the detector and the number that follows identifies the particular part of the field of view (e.g., 1, 2, 3), the following responses, characterized as strong or weak (based on which part of the field of view), would be expected for the ensemble:
[0070] [Table: the expected ensemble of strong / weak responses per detector and field-of-view segment, reproduced in the source only as an image.]
[0071] Clearly, in this example, each object is observed multiple times and from several different angles, so that in each case object localization can be achieved by observing the pixel location, deducing the most probable location within a pixel (if required) and projecting rays from sensor to object for each detection. Using sensor data measured at the factory (e.g., beam profile, field of view characteristics) and calibration data obtained at the time of installation, a robust triangulation for each object can be derived from the implied geometric relationships (intersections of rays or probable ray bundles). Furthermore, given the multiplicity of detections per object, the probability of false detection diminishes dramatically (by the 4th power in this case, being the product of 4 individual probabilities).
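One way to realize the ray-intersection step, shown here only as a sketch: solve the standard least-squares intersection of bearing rays in the plane. The sensor positions and angles are invented, and angles are measured in the usual mathematical convention from the +x axis.

    import math

    def triangulate(rays):
        """Each ray is ((ox, oy), angle_rad).  Solve for the point minimizing
        the sum of squared perpendicular distances to all rays, via the
        normal equations of the resulting 2x2 linear system."""
        Sxx = Sxy = Syy = bx = by = 0.0
        for (ox, oy), th in rays:
            dx, dy = math.cos(th), math.sin(th)
            # projector onto the ray's normal: I - d d^T
            axx, axy, ayy = 1 - dx * dx, -dx * dy, 1 - dy * dy
            Sxx += axx; Sxy += axy; Syy += ayy
            bx += axx * ox + axy * oy
            by += axy * ox + ayy * oy
        det = Sxx * Syy - Sxy * Sxy
        return ((Syy * bx - Sxy * by) / det, (Sxx * by - Sxy * bx) / det)

    rays = [((0, 0), math.radians(45)),      # sensor A sees the object up-right
            ((10, 0), math.radians(135)),    # sensor B sees it up-left
            ((5, 10), math.radians(270))]    # sensor C looks straight down
    print(triangulate(rays))                 # (5.0, 5.0)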
[0072] If we extend the example to include additional layers of pixels (e.g., into the page or out of the page for Figure 9), or pixels extending vertically and horizontally from the sensor module, then images can be formed by the ensemble of detection events thus collected. The images would, for this example, constitute a simplistic 3D image (front / back / sides) of the object being imaged and would thus afford a high probability of interpreting the data correctly, leading to an even lower probability of false detection.
MODES OF ACQUIRING MULTIPLE FIELDS OF VIEW PER SENSOR
[0073] There are multiple means of acquiring the aforementioned multiple pixels per sensor module. These means fall into four categories: 1) mechanically moving an aperture about a single pixel so as to steer its field of view (or equivalently, moving a mirror through which an aperture is focused); 2) mechanically moving the focal plane containing one or more pixels so that the sensor element explores a range of focal plane locations (and thus, field locations); 3) using multiple sensor elements per module and using an aperture akin to a lenslet for each element (thus, direct imaging); and 4) using multiple sensor elements per module and using a masked aperture for each element, such as is used in techniques like compressive imaging. These are described briefly in the following paragraphs.
[0074] Mechanically moving an aperture is roughly equivalent to simply pointing the sensor in a particular direction (as one would do with a handheld video camera). However, there are ways to achieve this result without physically moving the entire sensor module. One approach envisioned here is to place a multi-faceted lenslet device on the sensor element (such as are commonly used for PIR motion detectors) and then place a rotating disk with an aperture (a circular hole, in simplest form) in front of the lenslet so that only one facet of the lenslet is illuminated at a time (transitions between facets could be handled in calibration or through the use of an encoder on the disk). Alternately, one might use counter-rotating Fresnel lenses or wedge optics to achieve beam steering. In each case the choice is made based on its impact on product quality, but there are multiple means of achieving the desired results by methods proven for one skilled in the art. In addition to the encoder, this approach implies the use of actuation (e.g., a motor with amplifier).
[0075] Another means of achieving the multiple fields of view per pixel is by moving the pixel in the focal plane, thereby mimicking the effect of a focal plane array (e.g., a CCD or CMOS image device) by allowing one pixel to sample the focal plane sequentially. This also requires the use of actuation, though of smaller size than for the foregoing, as the moving mass is smaller (only the sensor element). Thus the invention envisions the use of piezoelectric devices to actuate the sensor element, in addition to magnetics incorporated directly onto the sensor element substrate such that emf can be used to move the element under control.
[0076] A third method for obtaining multiple fields of view per pixel is simply to use an ensemble of individual pixels, each having a small lens attached to or associated with it, and to sample the focal plane directly this way. In such a case, the locations of the pixels are chosen so as to sample the focal plane optimally. Using asymmetry in the pattern of elements is envisioned, as this gives an installer the option of mixing orientations: an algorithm running during installation assesses the area coverage and, knowing the distribution (spatially and in angle) of the pixels in the sensor module, recommends a rotation for each sensor, so that the installer is aided in an optimization that would be very difficult to assess without the computer support provided by the network itself. This type of sensor module is most often preferred when moving parts are forbidden in a product. However, if moving parts are allowed, this sensor can have significant advantages for (relatively) high resolution imaging using a network of low pixel count sensors.
[0077] The fourth method for obtaining multiple fields of view is a variation on the third in terms of the sensor module topology: the lenslet per sensor element is replaced with a masked lenslet. The data are handled much differently, however, as the pixel data thus collected are interpreted as coefficients for basis functions used in reconstructing an image. The nature of the basis functions will have been loaded into nonvolatile memory, having been measured accurately at the factory (e.g., point source and distributed source measurements under calibrated conditions). Again, this method contains no moving parts and thus can be used where moving parts are prohibited. However, the benefits of extending utility through modest mechanical motions are relevant here as well.
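A toy version of this coefficient-based reconstruction, assuming orthogonal (Hadamard) mask patterns: each measurement is the scene's coefficient in one basis function, and the scene is rebuilt from the coefficients. Real masks are binary transmission patterns measured at the factory, so the +1/-1 patterns below (which a practical device would approximate by differencing pairs of 0/1 masks) are an illustrative simplification, and the 8-element scene is invented.

    def hadamard(n):
        """Sylvester construction; n must be a power of two."""
        H = [[1]]
        while len(H) < n:
            H = [row + row for row in H] + [row + [-v for v in row] for row in H]
        return H

    N = 8
    scene = [0, 0, 1.0, 0, 0, 0.6, 0, 0]      # two warm objects
    H = hadamard(N)
    # Each masked pixel measures one inner product of the scene with a pattern:
    coeffs = [sum(H[k][i] * scene[i] for i in range(N)) for k in range(N)]
    # Reconstruction from the coefficients (H is orthogonal: H^T H = N I):
    recon = [sum(H[k][i] * coeffs[k] for k in range(N)) / N for i in range(N)]
    print([round(v, 2) for v in recon])       # recovers the scene exactly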
SENSOR NETWORK CALIBRATION
[0078] Figure 10 shows a method for calibrating the sensor network envisioned by this invention. The initialization module shown in Figure 14 is assumed to be part of the calibration. The purpose of the calibration is to stimulate the sensor network, as a network, using known sources, such that its response function can be determined. Of particular interest is the relationship between the multiple fields of view of the various sensor elements, so that coincidence relations between sensors can be mapped out. Such mapping leads directly to the ability to correctly associate multiple observations of objects in the network as observed by multiple sensors.
[0079] An example of how calibration would be performed follows, in keeping with the flow suggested by Figure 10 and with reference to Figure 9. Holding the initialization probe module (Figure 14), the user sends an initialization message 230 directing the sensors in the network to enter initialization ("Init") mode, the sensors having already been installed and the probe having been loaded with the device addresses of the network. The probe module is then set to record its initial position 231, which may include the entry or capture of GPS data, after which time the probe continuously records 232 position states and sensor data events (as a function of time) from the network of sensors. The user then moves / walks through the region 233 encompassed by the network of sensors and is detected / measured by the network, e.g., from X to Y to W to Z to Y to W to X to Z, finishing by walking around the perimeter. Once the network has been explored adequately, the probe module data recording 234 is halted and the calibration of the network begins. The network calibration includes solving for the probe path 235 during calibration and solving for the network sensor geometric and radiometric properties. Subsequent to the calibration solution, the parameters are stored in the probe 236 and then broadcast to the network for use by the sensor modules.
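The sketch below shows, with invented event data, the bookkeeping this walk-through produces: the probe track, the detection events, and the coincidence relations from which the geometric solution 235 would then be computed.

    from collections import defaultdict

    probe_track = {0.0: (1, 1), 1.0: (4, 2), 2.0: (4, 6)}   # time -> (x, y)
    events = [(0.0, "A", 1), (0.0, "B", 3),                  # (time, sensor, fov)
              (1.0, "A", 2), (1.0, "C", 1),
              (2.0, "B", 1), (2.0, "C", 2)]

    seen_at = defaultdict(list)       # (sensor, fov) -> probe positions
    coincident = defaultdict(list)    # time -> segments that fired together
    for t, sensor, fov in events:
        seen_at[(sensor, fov)].append(probe_track[t])
        coincident[t].append((sensor, fov))

    for key in sorted(seen_at):
        print(key, "fires when the probe is at", seen_at[key])
    for t in sorted(coincident):
        print(f"t={t}: coincidence set {coincident[t]}")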
[0080] It is envisioned that, from time to time, it may be helpful to use a known blackbody probe in the "walk through" mentioned above, so that a known amplitude and integrated brightness (in the solid angle sense) are available for absolute radiometric calibration and / or improved geometric calibration, or simply because the user is not a bright enough object for robust calibration. For this reason, the calibration probe includes an output for driving such a known blackbody source.
[0081] Furthermore, it is envisioned that a robotic device with a known blackbody probe attached could carry the calibration probe and be used to explore the sensor network without any unknown (human, in this case) objects. With appropriate software loaded into the calibration module and a wireless interface to the robotic device, this is readily extended to the case in which the robotic device "explores" the space of the sensor network, guided by the sensor network and calibration module so as to fully characterize the space. This would enable periodic self calibration in situations where extended performance is needed, or where the environment often changes in significant ways.
INITIALIZATION (CALIBRATION) PROBE ARCHITECTURE
[0082] Figure 11 illustrates an architecture 280 for the probe used to initialize / calibrate the sensor network. The role of each block was suggested by the example cited above. The microcontroller 283 acquires data from local inertial sensors 286 for measuring probe position over time, a GPS device 287 is used (if available; it is not strictly required) to place local position measurements into the context of a global coordinate system (e.g., WGS 84), and communication of results is accomplished via the wireless network interface / device 284. The power supply 288 and battery 289 are requirements for a handheld device.
[0083] The probe emitter 281 is the output from the initialization probe that is used to drive / control a known calibration source (e.g., blackbody). The nonvolatile storage 285 block is included so as to maintain a minimum of microprocessor cost, such cost being driven disproportionately by the size of nonvolatile storage. However, it is possible that a microcontroller becomes available that meets all the system requirements (including cost) and yet obviates the need for additional nonvolatile memory.
MONOLITHIC INSTANTIATIONS
[0084] The networked sensor can be constructed using readily available commercial components. However, for markets where high volume and low cost are important, the integration of the invention substantially onto monolithic silicon (e.g., CMOS MEMS or an equivalent combination) is compelling. Figure 12 describes an example of a sensor module architecture 247 that can be implemented in monolithic silicon. At least one pixel 240 (e.g., a microbolometer or other sensing element) is connected to a digitizer 242 (e.g., comparator, analog to digital converter, or equivalent) via a multiplexer 241 (in the case that multiple pixels are present) that provides pixel data to a microprocessor 243, which can assess whether signal or noise has been observed and can process pixel data when signals are present so as to detect an object or simply measure the object's apparent brightness.
[0085] If external actuation is used to acquire additional pixels in the field of view, the signal and logic blocks for producing actuation control 246 (e.g., amplifier signals, pulse width modulation, etc.) and for interpreting actuator feedback (e.g., quadrature decoders, Hall sensor decoders, etc.) are included, along with an inertial measurement 244 (e.g., MEMS accelerometer and / or gyro) that enables the measurement of actual movement achieved through actuation. The feedback and control is facilitated by the microprocessor 243, and the parameters needed for addressing sensor module device peculiarities are measured at time of manufacture and included in nonvolatile memory for use by control algorithms running in the microprocessor.
[0086] The means of communication 245 is also readily incorporated onto the monolithic module using (now common) CMOS RF technologies, including the integration of RF subsystem technologies (e.g., Zigbee or equivalent).
[0087] The RF network device can be used not only for communication, but also for triangulation between a plurality of sensor modules so as to augment the calibration process and further facilitate the formation of a local coordinate system, such that auto calibration is possible based on algorithms that collect historical detection data and position data (via triangulation) and adaptively solve for the system parameters needed for localization. Triangulation can be achieved using RF signal strength in the preferred embodiment, where the distance between points is estimated from the mutually measured RF power. The addition of GPS data for any node in the network would thereby map the local coordinates of all sensors in the network onto the global coordinate system (e.g., WGS 84 or equivalent), if GPS / GIS services are required for the network.
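A sketch of the RF ranging and position solution, assuming a log-distance path-loss model with invented constants (transmit power, path-loss exponent, node coordinates); production code would fit these constants during calibration.

    def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.0):
        """Invert the log-distance model: RSSI = P0 - 10 n log10(d)."""
        return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

    def trilaterate(p1, r1, p2, r2, p3, r3):
        """Standard linearized three-circle solution in 2D."""
        (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
        A, B = 2 * (x2 - x1), 2 * (y2 - y1)
        C = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
        D, E = 2 * (x3 - x2), 2 * (y3 - y2)
        F = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
        x = (C * E - F * B) / (E * A - B * D)
        y = (C * D - A * F) / (B * D - A * E)
        return x, y

    # Three nodes at known local coordinates hear a fourth module:
    nodes = [(0, 0), (10, 0), (0, 10)]
    rssis = [-54.0, -58.1, -56.5]                     # measured powers, dBm
    d = [rssi_to_distance(r) for r in rssis]
    print(trilaterate(nodes[0], d[0], nodes[1], d[1], nodes[2], d[2]))  # ~(3, 4)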
MULTI PIXEL SENSOR ELEMENTS
[0088] For a monolithic or discrete implementation of the invention, the inclusion of self calibration tools within the product is generally advantageous. Most sensors, but especially radiometric ones such as are common for IR devices, experience drift in their operating state (gain, offset, linearity, hysteresis) over time and temperature. Thus, the ability to inject a known signal into a sensor element is valuable for accounting for state changes and correcting the sensor data accordingly.
[0089] Figure 13 shows a multi-pixel architecture for a sensor element such that continuous and / or periodic calibration of the sensing pixel can be accomplished. The arrangement articulated in Figure 13 is particularly useful for IR devices with good sensitivity to blackbody radiators, but is conceptually consistent with any form of built in calibration. In the upper part of the figure there are two pixels shown: a sensing pixel 270 with pixel bias control 271 and a calibration pixel 272 with its own bias control 273. The bias control is integral to microbolometer devices, and so it is convenient to discuss the multi-pixel approach in the context of a microbolometer pixel element (e.g., a thermal infrared microbolometer). In such a case, the bias control is required for measuring the change in received energy of the microbolometer element by energizing the pixel (with a bias voltage) and observing changes in the pixel resistance between successive measurements, appreciable change being attributed to changes in the brightness of the field of view of the pixel. This bias control is thus useful for calibration, as the applied bias raises the temperature of the pixel and thus its brightness. If the pixel is coated so as to approximate a blackbody 275 (perfect emitter / absorber), it can be a known source of radiation that, by virtue of its proximity, "injects" known signal into the sensing pixel. By varying the calibration pixel bias 274 over time and observing the incremental response of the sensing pixel, a calibration can be obtained indicative of the gain and offset response of the sensing pixel.
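The gain / offset solution reduces to a line fit over the bias sweep. A minimal sketch follows, with simulated readings standing in for real sensing-pixel responses:

    def fit_gain_offset(stimulus, response):
        """Ordinary least-squares line fit: response = gain * stimulus + offset."""
        n = len(stimulus)
        sx, sy = sum(stimulus), sum(response)
        sxx = sum(x * x for x in stimulus)
        sxy = sum(x * y for x, y in zip(stimulus, response))
        gain = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        offset = (sy - gain * sx) / n
        return gain, offset

    bias_levels = [0.0, 0.5, 1.0, 1.5, 2.0]        # known injected stimulus
    readings = [0.11, 0.71, 1.29, 1.92, 2.50]      # sensing-pixel output
    print(fit_gain_offset(bias_levels, readings))  # ~(1.2, 0.1)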
[0090] The lower part of this figure suggests an extension to the multi pixel architecture wherein the calibration pixel is mounted on a rotating element 276. The calibration pixel in this case has two states: blackbody 275 and mirror 274. In the blackbody state (i.e., when the blackbody side is pointed at the sensing pixel) the calibration pixel is used to radiate the sensing pixel so as to produce gain and offset calibration data. In the mirror state (mirror is pointed at the sensing pixel) the calibration pixel is used to reflect the sensing pixel radiation, thereby enabling a measurement of the sensing pixel self brightness, which is useful for its characterization generally and also for gaining additional accuracy in estimating the calibration offset measured with the blackbody state.
ADDITIONS FOR ACTUATION
[0091] In some settings the ideal actuator for producing the imaging capability is a so-called "wobble plate", wherein the solid angle of the object region for a sensor is explored by inducing motion akin to precession about the normal to the focal plane. It is envisioned that this will be carried out by various means, including the use of magnets and electromagnets (i.e., motor elements), with the possibility of magnets included directly in the sensor module (even if monolithic). It is also envisioned that this will be carried out by piezoelectric devices such as may be readily available for a MEMS implementation, but not exclusively so.
[0092] It is also envisioned that the implementation of the encoder in the sensor module may benefit from the use of capacitive and / or inductive sensing such as is common in consumer products (e.g., touch pads), so that an optical encoder or other electromechanical means is not required. This may be particularly valuable in a monolithic device, as CMOS processes permit the inclusion of capacitive structures very well.
CONNECTIONS FOR THE INSTALLED BASE
[0093] Oftentimes the invention will be used where a security sensor (e.g., a simple PIR motion sensor) is already in place, and / or the invention is being used to augment an existing sensor or ensemble of sensors, such sensors already having been connected to an alarm panel or equivalent device. In such a case, it will be advantageous to provide a means of connecting the new sensor (the invention) to the existing ones by providing an interface between the invention and the alarm panel, so that 1) the alarm panel is aware of the invention and 2) the invention is aware of the alarm panel and the devices already connected to it. The interface, having processing akin to what is available on the invention, can thereby use the existing sensors intelligently along with the invention so as to both enhance the capability of the existing sensors (by using the AND / OR logical combinations mentioned above) and also facilitate access to all the connected sensors. This access provides a means of do-it-yourself security when an alarm panel is no longer being serviced under an alarm service provider contract and yet the panel and sensors are still operational. The invention anticipates this type of interface, wherein the existing installation of an alarm panel and sensor ensemble is enhanced by the invention and / or simply made accessible by an access panel having a built in interface to the invention and, thus, to networked devices outside the alarm panel.
AN ACTUATION EXAMPLE.
[0094] The application of interest for an example is that of stabilizing a sensor for long standoff remote sensing, with the following requirements. First, the sensor platform experiences three dimensional motion and thus must be stabilized in three axes using inertial sensors and system sensors for measuring position and compensating for motion. Second, the sensor must be steerable across a wide angle so that a narrow field of view per pixel can be accommodated without loss of wide area viewing capability. Third, owing to cost constraints and the need to attain a high degree of repeatability across multiple instantiations, a custom motor and amplifier / control design cannot be used: standard production motors and commercial amplifiers must be used. Finally, it is desired that the stabilization be on the order of a milliradian or less over the span of several minutes under excitation of 3 degrees and 0-3 hertz, with microradian performance attained at
[0095] The invention approach to actuation uniquely solves the problem posed above.
[0096] In Figure 15, the components of the invention combine around the function of surveillance or remote sensing. The sensor 301 is used to gather surveillance data that is processed 302 and simultaneously used to inform, via shared memory 303, the control 304 of the mechanism 306 that supports the sensor. Likewise, a motion control processor 304 closes a control loop around the mechanism 306 and a position sensor 307, while simultaneously using its commands and measurements to inform the sensor data processing.
[0097] The position sensor includes the measurement of absolute or relative internal mechanism positions (e.g., gear teeth) and may also include inertial measurements of the system as a whole or parts thereof. The position sensor shares its state information with the control processor (dashed line) and, in the preferred embodiment, contains in its measurement method information about the system behavior from prior measurements (e.g., calibration data stored in the position sensor mechanism).
[0098] Thus, a state based position sensor (Figure 18) is used to sense the mechanical and inertial position of system components (including the system as a whole) while the amplifiers are commanded by the controller to drive the motor.
[0099] A balance- and inertia-tolerant mechanism (Figure 17) physically connects the control system to the sensor, this mechanism relying solely on mechanical advantage and computer feedback / control to achieve performance. The power and data connections are made throughout the system without harness / wiring through the use of signal modulation and air gap transformer coupling across orthogonal or mechanically decoupled segments of the system (Figure 4), so that the motion of the sensor is unimpeded by harness mechanical constraints.
[00100] For optimum scalability and manufacturability, the motor (windings, magnets), controller, (state based) encoder / decoder, amplifiers and power / data provisioning can be integrated into a single two disc per axis assembly, as shown in Figure 19, where all of the foregoing elements of the actuation electronics, magnetics and control have been integrated. With local control per axis of motion, the only remaining items for control are those associated with inter axis coupling, overall platform motions, and control connections between the sensor data and the actuation control subassemblies.
COMBINED SENSOR AND ACTUATION CONTROL
[00101] Figure 16 uses shared memory 303 between sensor data processing 302 and actuation control 304 so as to enable both functions to have "awareness" of each other and to reach a joint optimum operating point rather than an independent optimum operating point, the latter being determined by the root sum square (RSS) of error variances, the former being less than the RSS by at least the amount of correlation between the two functions as concerns errors (e.g., departures from mathematically ideal operation). If joint prediction and correction is used, substantial improvements in joint performance can be attained without changing any components: the improvement comes through measurement of the root causes of system level behavior and compensation for those in real time through physics based modeling and computing.
[00102] For instance, in a typical remote sensing scenario, such as an airborne surveillance camera mounted on a gyro stabilized gimbal, the gyros (and, often, accelerometers) measure how the airplane moves and adjust the position of the gimbal so as to keep the sensor pointed in the same direction while the airplane moves. The (stable imaging) performance of the system will be determined largely by the integration time of the sensor and the jitter of the gimbal, i.e., how stable it is under aircraft motion. However, if the error vectors from the gimbal are available to the imaging electronics so that image registration (e.g., where the sensor pixels map in physical coordinates) can be corrected in real time, the need for post processing of video and inertial data records to correct for those errors is alleviated. Moreover, if the periodicity of the motion (which almost always has periodic components) is modeled and used to predict impending motion (e.g., using autoregression on recent and / or archival historical data), then not only can the gimbal reduce its inherent pointing error, but the camera timing can be adjusted so as to image (i.e., integrate) during periods of minimum error / motion of the platform rather than at peak error / motion times. This approach can be extended to include the state variables for motor amplification and encoder / decoder operation, but the principal advantages of joint optimum estimation can be seen in this example.
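A sketch of this predictive timing idea, assuming the platform sway is well described by a second-order autoregressive model; the 1.5 Hz motion record, sample rate and prediction horizon are invented for the example.

    import math

    # 200 samples of 1.5 Hz sway at 100 Hz, standing in for gimbal error data:
    history = [math.sin(2 * math.pi * 1.5 * k * 0.01) for k in range(200)]

    # Fit AR(2): e[k] = a1 * e[k-1] + a2 * e[k-2] via least squares.
    s11 = sum(history[k - 1] ** 2 for k in range(2, 200))
    s22 = sum(history[k - 2] ** 2 for k in range(2, 200))
    s12 = sum(history[k - 1] * history[k - 2] for k in range(2, 200))
    b1 = sum(history[k] * history[k - 1] for k in range(2, 200))
    b2 = sum(history[k] * history[k - 2] for k in range(2, 200))
    det = s11 * s22 - s12 * s12
    a1 = (b1 * s22 - b2 * s12) / det
    a2 = (b2 * s11 - b1 * s12) / det

    # Predict 20 steps ahead and open the integration window at the minimum.
    e = history[-2:]
    pred = []
    for _ in range(20):
        e.append(a1 * e[-1] + a2 * e[-2])
        pred.append(e[-1])
    best = min(range(len(pred)), key=lambda i: abs(pred[i]))
    print(f"a1={a1:.3f} a2={a2:.3f}; integrate {best} steps ahead")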
STATE AWARE ENCODING
[00103] Figure 18 illustrates a simplified state aware encoder wherein a pattern is embedded or imprinted on a disc 340, an illuminator 341 (or equivalent excitation in the case of, e.g., a magnetic disc) makes the pattern visible or otherwise available to a detector, and a high speed detector 342 (e.g., an array of 1 or more elements) captures the pattern induced energy, formats it for decoding and then forwards the information to a decoder 343 that can derive the mechanical state of the disc in terms of position, velocity, etc., and can include system state information if that information was encoded during integration of the encoder into the system (or at time of manufacture, or wherever in the process the information is available and ready for imprint / storing).
[00104] The geometry shown in the figure is typical of optical encoders. However, existing designs do not have the capacity for programmability or state-wise encoding of encoder and system mechanical peculiarities (e.g., nonlinearities, gear / teeth non-uniformities, velocity and acceleration dependent behaviors, etc.). Programmability can be accommodated in various ways, including the use of a magnetic technology with a read/write head such as is used in modern disk drives for mass storage (this read/write approach would displace the illuminator / detector in the figure). Alternately, a pattern such as is used in two dimensional bar codes could be imprinted in the annular region visible to the detector array, and a focal plane array such as is used in optical computer mouse products could be used to image the pattern so that the decoder could extract the state information from it, leading to robust state decoding. The state information for the encoder that reflects departures from ideal mathematical behavior (e.g., concentricity of the shaft-disc interface, backlash non-uniformities, or repeatable measurements useful for other system calculations that can be stored in the encoder) is stored in the encoded pattern on the disc after it is first measured. The optimum scenario is one in which the encoder is present in the system with all system elements present, so that non-ideal behavior across the system can be captured and encoded on the disc. Encoding this way makes it possible to use a very modest controller in the control system, since many of the calculations for compensation have effectively been pre-computed and are retrieved in real time from the disc. The same result can be achieved by simply using a more powerful controller with more mass storage and high speed storage, but in cost sensitive applications that nonetheless benefit from relatively high performance (such as the invention can offer), the additional cost is not allowable and would impede the use of the invention. By encoding computationally complex results into a simple mechanical form, the cost of the measurement and processing is borne by a single manufacturing process (so its cost per unit is divided by the number of units) rather than placing the burden on every product that is manufactured.
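A sketch of the state-aware decode, assuming the disc carries a small table of angle corrections imprinted at integration time; the counts and correction values are invented.

    def decode(raw_count, correction_table, counts_per_rev=1024):
        """Raw count -> corrected angle.  correction_table maps a few raw
        counts to angle corrections (deg) capturing eccentricity, gear error,
        etc., as measured once with the encoder installed in the system."""
        c = raw_count % counts_per_rev
        angle = 360.0 * c / counts_per_rev
        keys = sorted(correction_table)
        lo = max(k for k in keys if k <= c)
        hi = min(k for k in keys if k >= c)
        if lo == hi:
            corr = correction_table[lo]
        else:
            t = (c - lo) / (hi - lo)         # linear interpolation between knots
            corr = (1 - t) * correction_table[lo] + t * correction_table[hi]
        return angle + corr

    # Corrections imprinted on the disc at manufacture (eccentricity-like):
    table = {0: 0.00, 256: 0.05, 512: 0.00, 768: -0.05, 1024: 0.00}
    print(decode(300, table))                # ~105.51 deg (105.47 nominal)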
WIRE FREE HARNESS FOR DATA AND POWER
[00105] Figure 16 shows the major features of the wire free harness invention. Notionally, the harness is removed from the system by using transformers between moving 322 and stationary 321 elements, with (ac) power coupled directly across the halves of a transformer winding pair, and data being modulated or coupled onto the transformer incrementally to the power, the data and power occupying orthogonal frequency and/or code spaces. The upper left part of the figure shows data and power entering the fixed mount part 320 of the two disc assembly, shown from a side view. The adjacent (upper right) end view shows the transformer windings concentric 324 with respect to the shaft (one set of windings per disc), and the arrows indicate the coupling of power and data onto wiring or equivalent conductive material on or in the shaft, from which it can be coupled to subsequent stages by similar means.
[00106] The lower left portion of the figure shows schematically the two halves 321, 322 of the transformer in relation to the discs in the upper left of the figure. A core 324 is shown in the transformer as an optional element, to be determined by system constraints. The transformers will have an air gap by their nature (one side moving, one side not moving), but variations on the theme are anticipated, such as when the shaft can be augmented with a ferrous element (akin to a so called keeper) to help with transformer efficiency.
[00107] The lower right portion of the figure illustrates the extraction of power 328 from the shaft 323 and/or disc regions 324 at points where power or data are needed. The transformer is assumed to be carrying an ac power signal, so that the use of power, most often in dc form, will require the use of rectification of ac to dc. The data will be separated from the power through demodulation or simple filtering and signal recovery (e.g., clock and data recovery).
MECHANICAL ADVANTAGE AND BALANCE TOLERANT DESIGN
[00108] Figure 17 illustrates the use of mechanical advantage (assumed under computer control) to compensate for inertias and centers of gravity that are significantly displaced from the center of rotation. This is not practiced in industry today when gimbals are designed, owing to the disconnection between the disciplines of mechanics, control and computer science. However, using a physical model of the system and a corresponding accurate computational model, high performance gimbal operation can be attained without locating the inertias and centers of gravity at the center of rotation. The selection of the motor, the gear ratio and the amplifier for driving the motor has significant impact, and these are factored in by the physical system model. The figure shows a load 332 with a center of gravity (CG) 333 significantly displaced from the center of rotation. The gear arrangement produces a 1:N mechanical advantage 331 and the structure of the frame supports the testing.
[00109] INTEGRATED HARNESS FREE PCB MOTOR AND CONTROLLER
[00110] Figure 19 illustrates the integration of all elements of a single axis motion control system around printed circuit board (PCB) technology and the invention elements mentioned in the foregoing. Multi-layer PCBs are used to instantiate motor windings 350 in terms of interconnected concentric PCB traces on rotor and stator, while permanent magnets are bonded to the rotor side to complete the two halves of the motor magnetic circuit. The harness free approach (Figure 17) is integrated so as to make dc power available on both sides of the assembly, which further permits the co-location of the motor amplifiers and the encoder / decoder assembly (Figure 3), along with the microcontroller used to control the single axis solution. Inertial measurement can be included on the stator or rotor side of the assembly, depending on the requirements mandated in system design. Furthermore, using a coaxial (with respect to the shaft) mechanical advantage system, the design approach of Figure 5 is captured in this invention so as to enable a complete integration of computer controlled motion. The invention is extensible to multiple axes since the harness free approach is incorporated, so that a highly manufacturable (using PCB technology, including pick and place machinery and electronic test equipment) motor assembly becomes possible at low cost and with high performance.
[00111] USE OF GPS: MOBILITY DEVICES
[00112] Because the invention combines (Figure 1) knowledge of direction 110 and location 105, including the use of magnetometer data 106 (many inertial solutions now include the magnetometer, so it is assumed present in the preferred embodiment) with network access to wired and wireless local and wide area networks, interactions with mobility devices (e.g., portable digital assistant, mobile phone, notebook computer, etc.) are greatly facilitated. Further, since mobile devices often include embedded GPS reception and localization information, the mobile device can serve as a cueing device to direct the field of view of the sensor 101 to a specific GPS location. This can be achieved variously.
[00113] One way to achieve GPS or geo-location cueing is to have the mobility device communicate its GPS position to the invention, from which a bearing can be calculated (knowing both GPS locations, and knowing altitudes from knowledge of local terrain in the field of view), a zoom established and the correct pan/tilt position commanded to direct the sensor 101 field of view to the person holding the mobility device. Applications for this are many, but include tracking an individual (e.g., a security guard) within sensor range of the invention (if repeated GPS updates are sent, or if computer vision algorithms are used to detect and track the individual after an initial GPS cue or set of cues), so as to provide a recording of the actions taken by the person or to provide an extra observer for situational awareness.
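A sketch of the pan / tilt computation from two GPS fixes, using a flat-earth approximation that is adequate at typical camera-to-subject distances; the coordinates, altitudes and distance below are invented.

    import math

    def bearing_deg(cam_lat, cam_lon, tgt_lat, tgt_lon):
        """Compass bearing from camera to target (0 = north, 90 = east)."""
        dlat = math.radians(tgt_lat - cam_lat)
        dlon = math.radians(tgt_lon - cam_lon) * math.cos(math.radians(cam_lat))
        return math.degrees(math.atan2(dlon, dlat)) % 360.0

    def tilt_deg(distance_m, cam_alt_m, tgt_alt_m):
        """Tilt from horizontal; altitudes from local terrain knowledge."""
        return math.degrees(math.atan2(tgt_alt_m - cam_alt_m, distance_m))

    # Camera on a mast; a phone reports its position roughly 200 m northeast:
    print(bearing_deg(32.2226, -110.9747, 32.2238, -110.9733))  # ~45 deg (NE)
    print(tilt_deg(200.0, 10.0, 1.5))                           # ~-2.4 deg (down)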
[00114] Additionally, the use of GPS in mobility devices can enable enhanced security in retail applications, such as observing a customer walking across the parking lot from a building entrance to their vehicle or destination, having registered their mobility device with the retailer's installation of the invention, e.g., via Bluetooth wireless 104 or WiFi wireless 103 commands, or via a wired connection to a wide area connection with access to the mobility device carrier network. This registration can be enabled in multiple ways; the preferred embodiment is a downloadable plug-in or application that is purchased by the owner of the device in concert with or through the retailer, such that the retailer and the carrier (e.g., the phone company) both have revenue opportunities associated with the service, said service being integral to their secure billing and telephony systems.
[00115] The retail scenario just mentioned could proceed, for example, as follows: 1) the customer enters the retailer site; 2) the customer purchases goods, possibly using a phone or other electronic means including but not limited to credit cards; 3) the customer indicates during the point of sale event that he wishes to be monitored and presses that key on the user interface of his phone or at the retailer point of sale device, 4) or the customer presses the "track me" button on the device application or plug-in during his exit from the retail location; then 5) the invention directs its field of view to the customer until the customer is observed (via GPS) to leave the location, or indicates manually that there is no need to watch further; and then 6) the image sequence is provided to the customer or retailer or carrier or combinations thereof according to suitable commercial arrangements made for data sharing. [00116] The same retail scenario can be used when the invention is cued through one of its network interfaces 103, 104 to track a person carrying stolen goods, such identity having been resolved manually (visual observation, and use of cameras to select) or automatically using RFID or equivalent merchandise tracking, the merchandise tracking devices (which can include wireless devices such as used in Figure 8) being used to indicate the position of the person carrying the stolen goods. In the case that the merchandise tracking is accomplished using wireless devices (e.g., Zigbee) equipped with motion sensing (e.g., accelerometer) devices, the localization of the carrier of the stolen goods can be determined using radio signal intensity, knowing that signal strength falls off with distance in well established ways according to the physics of radio wave propagation (e.g., approximately as the inverse square of distance in a non-scattering scenario) and having a plurality of such devices in communication with the one being transported with the stolen goods, similar to the arrangement shown in Figure 2 except that the person 126 also carries a wireless module.
[00117] Clearly, if an RFID-like device can be used to cue the system when said device is embedded in a product, then a product can be made that is intended to be carried out of a facility, such a product containing a location device: a "customer wand" that is carried to one's car and kept there until subsequent visits, or simply left in one's purse / wallet, so as to provide a security layer for use by customers at a retail site. There are many alternate uses for such a cueing scheme that rely on the same principles and capabilities of the invention, of which security is only one.
[00118] AIRBORNE APPLICATIONS
[00119] The invention (Figure 1) is readily used for security / surveillance / videography from an airborne (e.g., aerostat or aircraft) platform, though in such situations it is common to stabilize three or more axes of rotation or motion so as to compensate for (e.g., gyro stabilize against) the motion of bodies moving in air. The preferred embodiment is an aerostat, but any moving platform is applicable given the design of the invention for stabilizing a line of sight and in the context of networked communications.
[00120] The scenarios described earlier for cameras (100, 120) mounted on buildings or other traditional fixed structures apply readily to the situation where cameras are mounted on an airborne platform. The use of GPS and / or computer vision and sensor networks to track and image, for example, packages, vehicles and people based on their geographic location is envisioned for the invention. And as aerostats often provide a means of radio communication between distant ground locations, the use of the invention for wide area surveillance and coordinated observation of a multiplicity of objects, events and locales is intended.
[00121] Furthermore, having collected imagery and sensor data over time and space, the invention includes the data-mining of geo-located events to catalogue over time significant imaged or sensed activities by geographic location. For example, questions to be answered with this application of the invention include, but are not limited to: 1) how many times was a car parked in that space and which one is most often seen, 2) what packages were left in that location in the last 30 days, 3) who visited that part of the parking lot this month, 4) what types of vehicles cross this spot on my GoogleMap of the local roads, 5) which players were in the goalie net the most, 6) how has traffic varied at that intersection this month, etc.
[00122] The invention also includes the use of diverse cueing devices beyond the thermal point sensors illustrated in Figure 2 and elsewhere in this document, e.g., acoustic sensing, which has particular value from an airborne platform. The invention has been developed so as to enable cueing from one or more microphones or equivalent acoustic receiving elements, including an ensemble of elements used to determine the direction and location of sounds. It is known that acoustic energy propagation above ground, such as would occur with an acoustic array mounted in an aerostat, is enhanced relative to nominal lateral ground level propagation. Thus, when significant events occur even at 100s of meters or multiple km of distance, such as a collision, explosion, gunshot or other acoustic signal of interest, an aerostat system is well positioned to detect and localize the event, deduce the bearing and direct the invention to that location in order to record video or other sensor data for viewing and/or analysis.
[00123] The invention also contemplates the use of high altitude balloons (e.g., weather balloons) that are in place to provide radio communication or limited imaging capability (owing to the payload constraints on such balloons). High altitude balloons are capable of loitering for a day or more at altitudes in excess of 60,000 feet, which provides a very wide area vantage point. The invention has a radio interface that allows direct connection to such a high altitude balloon, or a network of the same, so as to 1) coordinate, collect and analyze reported events from a network of ground based sensors such as illustrated in Figure 2 and Figure 8 and/or an acoustic sensor on a low altitude (e.g., 500 feet) aerostat; 2) direct and/or coordinate the fields of view of one or more smart cameras (Figure 1) so as to optimize the use of discrete fields of view to address on-the-ground priorities indicated by sensor networks or by command and control personnel having a need to meet specific mission requirements, e.g., watching the route of a presidential parade and acquiring pseudo random location data at predefined radii around intersections or key structures / buildings in the vicinity of the route; 3) provide wide area residential or commercial video surveillance based on ground sensor (e.g., a motion sensor placed in proximity to a building entrance or storage lot gate) reporting over a wide area; and 4) generally provide access to and control over a wide area surveillance endeavor. The foregoing are very powerful when a high altitude balloon is used, but even a low altitude platform (100s of feet above ground level) can achieve the same benefits, albeit at reduced scale in some cases.
[00124] The invention also anticipates the use of holosonic technology to produce acoustic information, for both the projection of sound and the reception of sound at a local area (e.g., a person's head). This is useful for communicating with intruders and authorized personnel alike and enables a security scenario to covertly use sound to support interdiction and resist intrusion.
[00125] The invention also anticipates its use for sporting events, e.g., events for which video imagery is desired for athletes and/or their friends in attendance. The invention, if cued with wireless devices in a playing field object (e.g., ball, glove, shoe, etc.), can be used to track specific individuals uniquely and to cue to events according to pre-established rules or pseudo random sequencing. Applications include, but are not limited to, automating the filming of a sporting event from an aerostat placed nearby and capturing team imagery by player for use in a video yearbook.
[00126] SENSOR ARRANGEMENT FOR WIRELESS NETWORKED SENSORS
[00127] The physical packaging of 200 in Figure 8, having multiple sensors 208 and digitizers 209 (if a multiplexer is not used), can take on diverse forms, depending on requirements. However, for security / surveillance applications, the preferred embodiment involves separating the sensor 208 / digitizer 209 portion (or just the sensor, in the case of a multiplexer) from the balance of the sensor module 200 and using a plurality of sensor 208 / digitizer 209 modules that connect to it mechanically and electrically. Since each sensor/digitizer constitutes a distinct field of view (e.g., 222 in Figure 9), a plurality of such fields of view (3 per sensor are shown in Figure 9) can be generated for each sensor module by stacking sensor/digitizer modules on top of the sensor module 200. Furthermore, if the packaging is cylindrical and all modules have the same cylindrical shape and size, sensor/digitizer modules can be stacked and directed arbitrarily so as to produce many different distinct regions of detection uniquely associated with a given sensor/digitizer module. A radial pattern (Figure 9 shows 3 sensor/digitizer pairs per sensor module, distributed radially in a 90 degree sector) is readily achieved with very few limitations on the number of sensors stacked. The preferred embodiment of such a stackable sensor scenario is a stack of cylinders containing either sensor modules or sensor/digitizer modules that mounts on a pole or tube of the same diameter, so that the stack of modules can be placed at a useful height for sensing at significant distances, out of the reach of would-be vandals. Finally, said pole of the preferred embodiment would carry the solar or wind generation capability for a renewable energy based solution and would also provide a ready means of mounting to ground structures.
[00128] Furthermore, given a plurality of distinct sensor/digitizer modules, combinations of these can be arranged so as to gather the velocity (magnitude and direction) of objects in the area of regard. This enables prediction of path / trajectory so as to optimize the cueing of smart cameras, or simply to allow fuller reliance on the wireless sensor network alone for detection and tracking of events within a space or perimeter.
[00129] Finally, the use of a plurality of sensors enables the determination of wind / motion clutter by virtue of the coherence of detections across multiple sensors, and if sensor/digitizers are distributed radially, the assessment of wind direction may be estimated by the distribution of false detections around the perimeter, since sensors are not necessarily equally sensitive to scene changes in all directions.
[00130] LASER RANGE FINDING
[00131] The invention contemplates the use of laser range finders connected via wireless (e.g., Bluetooth) or wired means to a mobility device (e.g., a phone) so as to point at distant objects, determine the range to the object, use the inertial sensor and magnetometer in the mobility device, combine these with GPS, and thus calculate the location of the distant object pointed at. Having the location of the object thus calculated, the invention can cue a smart camera (Figure 1) to gather imagery of that object without having to view a monitor and slew a camera to that location manually. Such a pointing method has many other uses as well, including the cataloging of vehicles or objects in a closed space within range of a smart camera (Figure 1). Further, if the mobility device or the laser range finder has an integral camera, then the device can be used to catalog inventory and capture imagery of said inventory even while geolocating each inventoried object.
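A sketch of this geolocation step, combining an invented GPS fix, compass bearing, elevation angle and laser range in a flat-earth local tangent frame:

    import math

    def locate(lat, lon, alt_m, bearing_deg, elev_deg, range_m):
        """Local east/north/up offset -> approximate target lat/lon/alt."""
        horiz = range_m * math.cos(math.radians(elev_deg))
        east = horiz * math.sin(math.radians(bearing_deg))
        north = horiz * math.cos(math.radians(bearing_deg))
        up = range_m * math.sin(math.radians(elev_deg))
        dlat = math.degrees(north / 6371000.0)                # spherical earth
        dlon = math.degrees(east / (6371000.0 * math.cos(math.radians(lat))))
        return lat + dlat, lon + dlon, alt_m + up

    # Operator points at a vehicle 350 m away, bearing 60 deg, 2 deg below level:
    print(locate(32.2226, -110.9747, 780.0, 60.0, -2.0, 350.0))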
BODY WEAR
[00132] A related envisioned use of the invention does not require a laser but does rely on line of sight. If a person is equipped with head gear that contains inertial and compass sensors, and if such apparatus is placed so as to associate the line perpendicular to the wearer's face with the desired direction, then the inertial and compass data can be used with the perpendicular to construct a line of sight and direct smart cameras along that line of sight. Furthermore, if the line of sight information is relayed to a local or remote computer, a digital elevation map or live mapped data can be used directly to project the line of sight to its intersection with the nearest object, and this object can then be imaged and/or catalogued for archival or ongoing surveillance and study. In some security scenarios, security guards are equipped with vests that contain actuators that the wearer can feel; these actuators are distributed radially around the chest of the person and are associated with the corresponding direction, e.g., the actuator on the front of the vest correlates to "in front of you" and the one on the back of the vest correlates to "behind you". The invention contemplates the use of such vests or equivalent body gear having tactile or actuation devices to give the wearer feedback. The utility of such vests is in providing the wearer with feedback about threats or items of interest. For the invention, for example, if the vest is equipped with a wireless device compatible with that of Figure 1, observers watching the sensor 101 data, e.g., video, can spot threats and communicate them to the wearer without anyone else knowing that a communication has occurred. When used in conjunction with the mobility device and GPS data, a preferred embodiment is constructed wherein the wearer of the vest can enter a hostile area, signal a smart camera to watch him based on his GPS position or equivalent (e.g., RF triangulation based on RF energy), track his movements, tell his colleagues to watch the sensor data (video), and then allow colleagues to notify him of threats at his perimeter.

Claims
1. A system for automated object detection, tracking and reporting having integral computer vision, motion control, stabilization, communication, and cueing.
2. A means of using low cost sensors to detect and localize objects of interest in a region or along a perimeter of a region.
3. A means of constructing a multi axis gimbal without wired harnessing between moving and stationary elements.

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US5085008P 2008-05-06 2008-05-06
US61/050,850 2008-05-06
US5105108P 2008-05-07 2008-05-07
US5107808P 2008-05-07 2008-05-07
US61/051,078 2008-05-07
US61/051,051 2008-05-07

Publications (2)

Publication Number Publication Date
WO2009137616A2 true WO2009137616A2 (en) 2009-11-12
WO2009137616A3 WO2009137616A3 (en) 2009-12-30

Family

ID=41265386

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2009/043033 WO2009137616A2 (en) 2008-05-06 2009-05-06 Novel sensor apparatus

Country Status (1)

Country Link
WO (1) WO2009137616A2 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030056638A1 (en) * 1999-05-28 2003-03-27 Non-Lethal Defense, Inc. Non-lethal personal defense device
US20080054836A1 (en) * 2003-07-28 2008-03-06 Jim Rodnunsky System and method for facilitating fluid three-dimensional movement of an object via directional force
US20050029458A1 (en) * 2003-08-04 2005-02-10 Z Jason Geng System and a method for a smart surveillance system
US20050073585A1 (en) * 2003-09-19 2005-04-07 Alphatech, Inc. Tracking systems and methods
US20050244033A1 (en) * 2004-04-30 2005-11-03 International Business Machines Corporation System and method for assuring high resolution imaging of distinctive characteristics of a moving object
US20060093190A1 (en) * 2004-09-17 2006-05-04 Proximex Corporation Adaptive multi-modal integrated biometric identification detection and surveillance systems

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2510740A1 (en) * 2012-11-05 2014-10-21 Universidad De Salamanca System for identification and location in interior spaces (Machine-translation by Google Translate, not legally binding)
US11494830B1 (en) * 2014-12-23 2022-11-08 Amazon Technologies, Inc. Determining an item involved in an event at an event location
WO2016149157A1 (en) * 2015-03-13 2016-09-22 Aqueti Incorporated Multi-array camera imaging system and method therefor
US10462343B2 (en) 2015-03-13 2019-10-29 Aqueti Incorporated Multi-array camera imaging system and method therefor
EP3159709A1 (en) * 2015-10-21 2017-04-26 Everspring Industry Co., Ltd. Apparatus and method for detecting azimuthal angle of heat source
CN106527465A (en) * 2016-12-09 2017-03-22 中国电子科技集团公司第三十八研究所 Multi-order redundant captive balloon attitude control system and cooperative control method thereof
US20200129075A1 (en) * 2018-08-06 2020-04-30 Ohmk (Tianjin) Medical Technology Co., Ltd. Method and device for guiding and releasing energy based on three-dimensional skin temperature topographic map
WO2020200413A1 (en) * 2019-04-01 2020-10-08 Robert Bosch Gmbh Camera system and method for positioning an optical unit of the camera system
US10979642B2 (en) 2019-04-01 2021-04-13 Robert Bosch Gmbh Camera system for positioning an optical unit of the camera system
CN111523459A (en) * 2020-04-22 2020-08-11 中科三清科技有限公司 Remote sensing image bare area identification method and device, electronic equipment and storage medium
CN111860336A (en) * 2020-07-21 2020-10-30 西北工业大学 High-resolution remote sensing image inclined ship target detection method based on position sensing
CN113655508A (en) * 2021-08-10 2021-11-16 厦门市弘威崇安科技有限公司 Unattended sensor node auxiliary laying device and method

Also Published As

Publication number Publication date
WO2009137616A3 (en) 2009-12-30

Similar Documents

Publication Publication Date Title
WO2009137616A2 (en) Novel sensor apparatus
US11733370B2 (en) Building radar-camera surveillance system
Erdem et al. Automated camera layout to satisfy task-specific and floor plan-specific coverage requirements
US11194938B2 (en) Methods and apparatus for persistent location based digital content
Alam et al. Device-free localization: A review of non-RF techniques for unobtrusive indoor positioning
CN105393079B (en) Depth transducer control based on context
US20150285896A1 (en) Real-Time Location System In Wireless Sensor Network
US20210304577A1 (en) Integrated Camera and Ultra-Wideband Location Devices and Related Systems
US11640486B2 (en) Architectural drawing based exchange of geospatial related digital content
EP3111246A1 (en) Real-time location system in wireless sensor network
US11893317B2 (en) Method and apparatus for associating digital content with wireless transmission nodes in a wireless communication area
CN103827634A (en) Logo detection for indoor positioning
WO2019217200A1 (en) Systems and methods for locating devices in venues
US20240153167A1 (en) Methods and apparatus for secure persistent location based digital content associated with a two-dimensional reference
JP2016085602A (en) Sensor information integrating method, and apparatus for implementing the same
TWM580186U (en) 360 degree surround orientation and position sensing object information acquisition system
US11436389B2 (en) Artificial intelligence based exchange of geospatial related digital content
US20210271786A1 (en) Method and apparatus for construction and operation of connected infrastructure
Neumann et al. A rotating platform for swift acquisition of dense 3D point clouds
US10674117B2 (en) Enhanced video system
WO2020186856A1 (en) Three-dimensional indoor navigation system and implementation method thereof
US20220164492A1 (en) Methods and apparatus for two dimensional location based digital content
JP2012027546A (en) Parking lot monitoring system
Rafiee et al. Improving indoor security surveillance by fusing data from BIM, UWB and video
Picus et al. Novel Smart Sensor Technology Platform for Border Crossing Surveillance within FOLDOUT

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09743602

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09743602

Country of ref document: EP

Kind code of ref document: A2