WO2012176101A2 - Method for robust and fast presence detection with a sensor - Google Patents
- Publication number
- WO2012176101A2 (PCT/IB2012/053024)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- detection area
- sensor
- zone
- distance
- flight
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/02—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
- G01S15/04—Systems determining presence of a target
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
Definitions
- the present invention is directed generally to sensor technology. More particularly, various inventive methods disclosed herein relate to presence detectors.
- Presence detectors may employ a variety of technologies. For example, pneumatic tubes or hoses may be placed across a roadway to detect the pressure of a vehicle as its tires roll over the tubes or hoses. Such detectors operate through physical contact with the object being detected. In another example, an optical light beam emitter and sensor system may detect the presence of an object when the object interrupts a projected light beam. In addition, in-ground inductance loops may detect a vehicle in close proximity by detecting a change in magnetic inductance. Other examples of presence detectors include video detectors and audio detectors.
- Time-of-flight presence detectors generally include one or more sensors, for example, ultrasound sensors.
- Time-of-flight presence sensors are used in various applications to detect the presence of objects within a specified field of detection.
- the ultrasound sensors do not require physical contact with the item being detected.
- ultrasound sensors can sense objects regardless of the magnetic properties of the object.
- time-of-flight detectors can determine the distance from the detector to the object.
- Ultrasound sensors are typically oriented either horizontally or vertically.
- a horizontally oriented ultrasound sensor, also called a front measuring sensor, may detect objects entering a detection area in front of the sensor.
- a top measurement sensor is an example of a vertically oriented sensor.
- a top measurement sensor may be mounted on the ceiling of a room, or to an overhead fixture, and detect objects entering a detection area below the sensor.
- An ultrasound sensor generally emits a burst or pulse of energy in the ultrasound frequency band, typically in the 20 kHz to 200 kHz range. When the pulse encounters a physical surface, a portion of the energy of the pulse is reflected back to the sensor. The reflection is also known as an echo.
- the sensor measures the time elapsed between the pulse transmission and the reception of the pulse reflection, called the time-of-flight measurement. The distance between the object and the sensor may be calculated based upon the measured time-of-flight.
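As a minimal sketch of this calculation: the round-trip time-of-flight is halved and multiplied by the propagation speed. The speed-of-sound constant below is an assumption (air at roughly 20 °C); the patent does not specify a value.

```python
SPEED_OF_SOUND_M_S = 343.0  # assumed: approximate speed of sound in air at 20 C

def distance_from_tof(tof_seconds):
    # The pulse travels to the object and back, so the one-way
    # distance is half of the total round-trip path length.
    return SPEED_OF_SOUND_M_S * tof_seconds / 2.0

# A 10 ms round trip corresponds to about 1.7 m.
```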
- Ultrasound presence detectors may use a reference threshold corresponding to a maximum ultrasonic echo response time as the operable detection distance range. The presence of an object within the range is detected when a reflection is measured with an ultrasonic echo response time shorter than the reference threshold.
- problems may occur when a fixed object is introduced into the detection range, as it may be difficult to distinguish between detection of human beings and detection of fixed objects.
- for example, a person may carry a large object, such as a cardboard box, into the detection area, such as a room. If the person places the box within the detection area and then departs, the sensor will continue to sense the box and may erroneously continue to report the presence of a person in the room.
- such calibration is typically performed only during installation and customization.
- Prior presence detectors may not adequately define the boundaries of the detection area.
- a second problem scenario is distinguishing an object entering a detection area from an object merely passing close by the detection area but not actually entering the detection area.
- a simple binary detection system may interpret any detected motion as an object being present and will interpret a lack of motion as no object being present. Therefore, an object merely passing nearby the sensor area may falsely trigger the detector.
- Prior presence detectors may not adequately distinguish between large magnitude and small magnitude motions.
- a third scenario where prior presence detectors may be inadequate may occur when an animate object enters the sensor detection area but becomes temporarily dormant. For example, a person may enter a room monitored by a presence detector, sit down, and remain still for several minutes. This may cause the motion detector to switch to a no-object-detected state based upon lack of detected movement. For instance, a person quietly reading in a room with a prior presence detecting light switch may find himself in the dark after a period of time when the detector detects little or no motion. In particular, presence detectors that compare the magnitude of a detected motion to an average detected motion may fail to distinguish a dormant animate object from an inanimate object.
- the present disclosure is directed to inventive methods for quickly and accurately differentiating animate objects within a sensor detection area from inanimate objects that are moved into the sensor detection area.
- the methods distinguish objects passing nearby the detection area from objects entering or leaving the detection area.
- the methods further distinguish inanimate objects from dormant animate objects within the sensor detection area.
- the sensor may be a top measurement or front measurement ultrasound sensor configured to detect the presence of a person in a room.
- a method detects the presence of an object within a detection area of a sensor.
- the sensor is configured to transmit a signal and receive a signal reflection.
- the detection area includes a first zone and a second zone.
- the first zone includes an area within a first distance from the sensor, and the second zone includes an area beyond the first zone and within a second distance from the sensor, where the second distance is greater than the first distance.
- the method includes the steps of detecting an object with the sensor and determining if the object is in the detection area. If the object is within the detection area, a step includes characterizing the object as either an animate object or an inanimate object. If the object is within the detection area and is characterized as inanimate, a step includes declaring the object not present. If the object is within the detection area and is characterized as animate, a step includes declaring the object present.
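The declaration logic of these steps can be condensed into a single predicate; this is an illustrative sketch, with the characterization of the object assumed to be computed elsewhere.

```python
def declare_presence(in_detection_area, is_animate):
    # An object is declared present only when it is inside the
    # detection area and characterized as animate; inanimate objects
    # and objects outside the area are declared not present.
    return in_detection_area and is_animate
```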
- a step may include measuring a time-of-flight between the signal transmission and the signal reflection.
- the step of characterizing the object further includes detecting if the object is moving. If the object is moving, the step includes characterizing the object as animate, or if the object is not moving, measuring an inactivity time span, and if the inactivity time span exceeds an inactivity threshold, characterizing the object as inanimate.
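The characterization step above can be sketched as follows. The inactivity threshold value is an assumption for illustration; the text leaves it unspecified.

```python
INACTIVITY_THRESHOLD_S = 300.0  # assumed dwell time; not specified in the text

def characterize(is_moving, inactivity_time_s):
    # A moving object is immediately characterized as animate. A still
    # object is only demoted to inanimate once its inactivity span
    # exceeds the threshold, which keeps a dormant person (e.g. someone
    # quietly reading) classified as animate in the meantime.
    if is_moving:
        return "animate"
    if inactivity_time_s > INACTIVITY_THRESHOLD_S:
        return "inanimate"
    return "animate"
```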
- determining if the object is in the detection area further includes the step of calculating an object distance between the object and the sensor. If the object distance is less than the first distance, a step includes determining the object is detected in the first zone.
- a step includes determining the object is detected in the second zone.
- a step includes clearing an object leaving flag, and if the object is not detected in the first zone, determining if the object is leaving the detection area. If the object leaving flag is clear, determining if the object is leaving the detection area further includes the step of measuring an object level of movement. If the object level of movement is greater than a movement threshold, a step includes setting the object leaving flag. If the object is not active and not in the second zone, a step includes setting the object leaving flag.
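The flag handling described above might be sketched as a small state holder; the movement threshold and the boolean inputs are illustrative assumptions, not values from the text.

```python
MOVEMENT_THRESHOLD = 0.05  # assumed variance level marking significant movement

class LeavingTracker:
    def __init__(self):
        self.leaving = False  # the "object leaving" flag

    def update(self, in_first_zone, in_second_zone, level_of_movement):
        # An object detected in the first zone is not leaving:
        # clear the flag.
        if in_first_zone:
            self.leaving = False
            return
        # Outside the first zone with the flag clear: set the flag on
        # significant movement, or when the object is inactive and
        # also outside the second zone.
        if not self.leaving:
            if level_of_movement > MOVEMENT_THRESHOLD:
                self.leaving = True
            elif not in_second_zone:
                self.leaving = True
```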
- determining if the object is leaving the detection area further includes declaring the object not present.
- detecting whether the object is moving further includes the steps of calculating an average time-of-flight, and calculating a variance between the time-of-flight and the average time-of-flight.
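One plausible realization of these two steps, assuming a sliding window of recent samples (the window length of 16 is an assumption):

```python
from collections import deque

class TofStats:
    # Sliding-window statistics over time-of-flight samples.
    def __init__(self, window=16):
        self.samples = deque(maxlen=window)

    def add(self, tof):
        self.samples.append(tof)

    def mean(self):
        return sum(self.samples) / len(self.samples)

    def variance_of_latest(self):
        # Squared deviation of the newest sample from the window mean;
        # large values indicate movement.
        return (self.samples[-1] - self.mean()) ** 2
```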
- a method for detecting objects within a detection area of a time-of-flight sensor includes the steps of monitoring time-of-flight sensor measurements, calculating an average time-of-flight, calculating a variance from the average time-of-flight, detecting an object moving within the detection area, and determining if the object has stopped moving while remaining within the detection area.
- the time-of-flight sensor includes an ultrasound sensor.
- a step includes determining if the object has left the detection area. If the object is moving within the detection area, a step includes indicating that the object is present. If the object has stopped moving while remaining within the detection area, a step includes indicating that no object is present. If the object has left the detection area, a step includes indicating that no object is present.
- the step of determining if the object has stopped moving while remaining within the detection area further includes the step of determining if the variance in time-of-flight measurements has remained below a variance threshold for a predetermined time.
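A sketch of this check, approximating the predetermined time as a fixed count of consecutive quiet samples (both constants are assumptions):

```python
VARIANCE_THRESHOLD = 0.01    # assumed
QUIET_SAMPLES_REQUIRED = 50  # assumed stand-in for the predetermined time

def has_stopped_moving(variances):
    # True once the most recent QUIET_SAMPLES_REQUIRED variance
    # readings all stayed below the threshold.
    if len(variances) < QUIET_SAMPLES_REQUIRED:
        return False
    return all(v < VARIANCE_THRESHOLD for v in variances[-QUIET_SAMPLES_REQUIRED:])
```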
- a computer readable medium has stored thereon instructions that, when executed, direct a system comprising a processor and an ultrasound sensor to detect the presence of animate objects within a detection area.
- the detection area includes a first zone and a second zone, where the first zone includes an area within a first distance from the sensor and the second zone comprises an area beyond the first zone and within a second distance from the sensor.
- the instructions include the steps of transmitting a transmitted signal by the ultrasound sensor, receiving a reflected signal, wherein the reflected signal comprises a portion of the transmitted signal reflected back from an object, measuring a time-of-flight between the transmitted signal and the reflected signal, calculating an average time-of-flight, and calculating a variance between the time-of-flight and the average time-of-flight.
- Additional steps in the instructions stored on the computer readable medium include determining if the object is in the detection area. If the object is within the detection area, a step includes characterizing the object as one of a group consisting of animate objects and inanimate objects. If the object is within the detection area and is characterized as inanimate, a step includes declaring the object not present. If the object is within the detection area and is characterized as animate, a step includes declaring the object present.
- the term “spectrum” should be understood to refer to any one or more frequencies (or wavelengths) of radiation produced by one or more light sources. Accordingly, the term “spectrum” refers to frequencies (or wavelengths) not only in the visible range, but also frequencies (or wavelengths) in the infrared, ultraviolet, and other areas of the overall electromagnetic spectrum. Also, a given spectrum may have a relatively narrow bandwidth (e.g., a FWHM having essentially few frequency or wavelength components) or a relatively wide bandwidth (several frequency or wavelength components having various relative strengths). It should also be appreciated that a given spectrum may be the result of a mixing of two or more other spectra (e.g., mixing radiation respectively emitted from multiple light sources).
- the term "lighting fixture” is used herein to refer to an implementation or arrangement of one or more lighting units in a particular form factor, assembly, or package.
- the term “lighting unit” is used herein to refer to an apparatus including one or more light sources of same or different types.
- a given lighting unit may have any one of a variety of mounting arrangements for the light source(s), enclosure/housing arrangements and shapes, and/or electrical and mechanical connection configurations. Additionally, a given lighting unit optionally may be associated with (e.g., include, be coupled to and/or packaged together with) various other components (e.g., control circuitry) relating to the operation of the light source(s).
- LED-based lighting unit refers to a lighting unit that includes one or more LED-based light sources as discussed above, alone or in combination with other non LED-based light sources.
- a "multi-channel” lighting unit refers to an LED-based or non LED-based lighting unit that includes at least two light sources configured to respectively generate different spectrums of radiation, wherein each different source spectrum may be referred to as a "channel" of the multi-channel lighting unit.
- the term "controller" is used herein generally to describe various apparatus relating to the operation of one or more light sources.
- a controller can be implemented in numerous ways (e.g., such as with dedicated hardware) to perform various functions discussed herein.
- a "processor” is one example of a controller which employs one or more microprocessors that may be programmed using software (e.g., microcode) to perform various functions discussed herein.
- a controller may be implemented with or without employing a processor, and also may be implemented as a combination of dedicated hardware to perform some functions and a processor (e.g., one or more programmed microprocessors and associated circuitry) to perform other functions.
- Examples of controller components that may be employed in various embodiments of the present disclosure include, but are not limited to, conventional microprocessors, application specific integrated circuits (ASICs), and field-programmable gate arrays (FPGAs).
- a processor or controller may be associated with one or more storage media (generically referred to herein as "memory,” e.g., volatile and non-volatile computer memory such as RAM, PROM, EPROM, and EEPROM, floppy disks, compact disks, optical disks, magnetic tape, etc.).
- the storage media may be encoded with one or more programs that, when executed on one or more processors and/or controllers, perform at least some of the functions discussed herein.
- Various storage media may be fixed within a processor or controller or may be transportable, such that the one or more programs stored thereon can be loaded into a processor or controller so as to implement various aspects of the present invention discussed herein.
- the terms "program" or "computer program" are used herein in a generic sense to refer to any type of computer code (e.g., software or microcode) that can be employed to program one or more processors or controllers.
- the term "addressable” is used herein to refer to a device (e.g., a light source in general, a lighting unit or fixture, a controller or processor associated with one or more light sources or lighting units, other non-lighting related devices, etc.) that is configured to receive information (e.g., data) intended for multiple devices, including itself, and to selectively respond to particular information intended for it.
- the term “addressable” often is used in connection with a networked environment (or a "network,” discussed further below), in which multiple devices are coupled together via some communications medium or media.
- one or more devices coupled to a network may serve as a controller for one or more other devices coupled to the network (e.g., in a master/slave relationship).
- a networked environment may include one or more dedicated controllers that are configured to control one or more of the devices coupled to the network.
- multiple devices coupled to the network each may have access to data that is present on the communications medium or media; however, a given device may be "addressable" in that it is configured to selectively exchange data with (i.e., receive data from and/or transmit data to) the network, based, for example, on one or more particular identifiers (e.g., "addresses") assigned to it.
- the term "network" refers to any interconnection of two or more devices (including controllers or processors) that facilitates the transport of information (e.g. for device control, data storage, data exchange, etc.) between any two or more devices and/or among multiple devices coupled to the network.
- networks suitable for interconnecting multiple devices may include any of a variety of network topologies and employ any of a variety of communication protocols.
- any one connection between two devices may represent a dedicated connection between the two systems, or alternatively a non-dedicated connection.
- a non-dedicated connection may carry information not necessarily intended for either of the two devices (e.g., an open network connection).
- various networks of devices as discussed herein may employ one or more wireless, wire/cable, and/or fiber optic links to facilitate information transport throughout the network.
- the term "animate object” as used herein refers to an object capable of controlled motion without the assistance of an external force.
- a person or an animal may be an animate object.
- the term "inanimate object” is an object that is not capable of movement without the assistance of an external force. Examples of an inanimate object may include a cardboard box or a chair.
- inanimate objects may be moved by animate objects.
- An animate object that is not moving is herein distinguished from an inanimate object by referring to the non-moving animate object as dormant.
- the term "detection area" refers to a space in the vicinity of a presence sensor wherein the presence sensor may sense the presence of an object.
- the detection area may be physically bounded, for example, by a floor or a wall, or the detection area may not be physically bounded, but instead defined as a range of distances from the presence sensor.
- the detection area may be bounded according to the maximum detection range limitation of the presence sensor, or may be an area defined within the maximum detection range of the presence sensor.
- the term "presence sensor" refers to a device capable of sensing an object.
- presence sensors that may be employed in various implementations of the present disclosure include, but are not limited to light beam sensors, pressure sensors, sonic sensors, video sensors, motion sensors, and time-of-flight sensors.
- a presence sensor may provide Boolean results, for example, whether an object is sensed or not sensed, or may provide more detailed information, for example, the distance of the object from the presence sensor, or the amount of force exerted upon the sensor by the object.
- the term "presence detector" refers to a device or system including one or more presence sensors, generally including a processor for manipulating data provided by the presence sensor.
- a presence detector may include logical circuitry for making a determination whether an object is present or whether an object is not present based upon the manipulated presence sensor data.
- the term "flag" refers to a means of maintaining a logical Boolean state.
- a flag may refer to a binary semaphore or Boolean variable.
- Boolean states include, but are not limited to, on/off, true/false, etc.
- the terms "set" and "clear" in reference to a flag refer to changing the state of the flag. Therefore, setting a flag typically indicates changing the state of a flag to "on," or "true," while clearing a flag typically indicates changing the state of the flag to "off," or "false."
- a flag may be used to determine a course of action in a logical flowchart, such as at a decision branch.
- persons having ordinary skill in the art will recognize additional mechanisms capable of serving as flags.
- FIG. 1A illustrates a first embodiment of a lighting fixture with a front detecting presence detector from a side view.
- FIG. 1B is a schematic diagram of a lighting fixture and front sensor detection area from a top view.
- FIG. 2 illustrates a scenario where a presence detector may distinguish an animate object from an inanimate object.
- FIG. 3 is a first logical flowchart of an exemplary embodiment of a method for detecting the presence of an object with a sensor.
- FIG. 4 is a second logical flowchart of an exemplary embodiment of a method for detecting the presence of an object with a sensor.
- FIG. 5 is a schematic diagram of a computer system for detecting the presence of an object with a sensor.
Detailed Description
- a lighting fixture 100 includes a presence detector configured to distinguish animate objects from inanimate objects.
- the lighting fixture 100 includes a base 110, a column 120, and an overhead support 130, where the overhead support includes a lighting unit 140.
- the lighting fixture includes four ultrasound presence sensors 150 located within the column 120. Each ultrasound presence sensor 150 is associated with a front detection area 160, where the ultrasound presence sensor 150 is capable of detecting an object within the detection area 160.
- the presence detector in the lighting fixture 100 may be configured to turn the lighting unit 140 on or off depending upon whether one or more ultrasound sensors 150 sense the presence of an animate object within the corresponding detection area 160.
- FIG. 1A depicts a presence sensor configured as a front sensor, there is no objection to embodiments including presence sensors in other orientations, for example, a top sensor.
- FIG. 1B is a schematic diagram of the lighting fixture 100 from a top view, indicating the front detection area 160 projecting outward from the lighting fixture 100. While the detection area is depicted in FIG. 1B as covering an area defined by an arc, there is no objection to a detection area having other shapes, for example, a circle or semicircle.
- a second threshold distance 195 bounds the outer edge of the detection area 160, and a first threshold distance 185 defines a boundary between a first zone 180 and a second zone 190 within the detection area 160.
- the ultrasound sensors 150 (FIG. 1A) may be able to sense objects beyond the detection area 160; however, the presence detector may be configured to disregard objects beyond the detection area 160.
- the presence detector 250 includes a top sensing sensor, for example, an ultrasound sensor, positioned above a detection area, where the presence detector 250 is configured to detect objects above a reference threshold height 220.
- a person 240 enters the detection area carrying a box (not shown).
- in frame C, the person 240 passes directly beneath the presence detector 250 and places the box (not shown) on the ground beneath the presence detector 250.
- in frame D, the person 240 begins to depart the detection area, leaving the box 260 in the detection area.
- in frame E, the person 240 has departed the detection area, so that the presence detector 250 may detect the box 260 as the closest object to the presence detector 250.
- prior presence detectors may erroneously report the presence of an object or the lack of presence of an object within a detection area after an inanimate object has been introduced into or removed from the detection area. Since the box 260 is an inanimate object, it is desirable for the presence detector 250 to distinguish between an inanimate object, such as the box 260, and an animate object, such as the person 240, in presence sensing applications. More generally, Applicants have recognized and appreciated that it would be beneficial for presence detectors to adapt to the introduction or removal of one or more inanimate objects within the detection area.
- Objects detected in the detection area may be active animate objects, dormant animate objects, and inanimate objects.
- An inanimate object being moved by an animate object is classified as an animate object, although it may later be re-classified as an inanimate object.
- the person 240 carrying the box 260 may be initially characterized as an animate object. After the person 240 places the box 260 within the detection area and departs, the presence detector 250 will detect no movement. It would be advantageous, therefore, to eventually re-characterize the box as an inanimate object, thereby indicating no object is present. Similarly, it would be advantageous to distinguish between an inanimate object and an animate object in a dormant or inactive state.
- FIG. 3 is a flowchart of a first embodiment for a method for distinguishing animate objects from inanimate objects with a presence detector.
- the method under the first embodiment may be executed, for example, by a computer or an embedded microprocessor system.
- any process descriptions or blocks in flowcharts should be understood as representing modules, segments, portions of code, or steps that include one or more instructions for implementing specific logical functions in the process, and alternative implementations are included within the scope of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
- the method under the first embodiment determines if an animate object is present within a detection area of a sensor. Inanimate objects within the detection area are distinguished from animate objects, so that the method does not indicate the presence of an inanimate object within the detection area. The method further distinguishes between an inanimate object within the detection area and an animate object in a dormant state. The method also characterizes an animate object as leaving the detection area or not leaving the detection area. Examples of an indication of the presence of an animate object may include, but are not limited to, switching the power to a power outlet, turning an indicator light on or off, or sending a message through a wired or wireless network.
- the method under the first embodiment begins at block 305.
- An ultrasound sensor is configured to transmit a signal into a detection area and receive a reflection of the signal from an object within the detection area.
- a number of ultrasound measurements are taken, as shown by block 310.
- An example of an ultrasound measurement is an ultrasound sensor transmitting an ultrasound pulse and receiving the reflection of the ultrasound pulse.
- the time-of-flight between the transmitted pulse and the reflection may be measured.
- the time-of-flight may be used to calculate the distance between the ultrasound sensor and the object reflecting the pulse.
- Statistics are calculated using the current and previous ultrasound measurements. Examples of such statistics include, but are not limited to, the mean, the median, the mode, and the variance (block 310).
- Reflections that are received by the ultrasound sensor after a threshold amount of time has elapsed after the ultrasound pulse has been transmitted may be ignored.
- This threshold time defines the outside distance boundary of the detection area.
- the detection area may further be divided into a first zone and a second zone, where the first zone includes an area within a first distance from the sensor, and the second zone comprises an area beyond the first zone and within a second distance from the sensor, where the second distance is greater than the first distance.
- the second distance is generally the threshold distance 220 (FIG. 2).
- a determination is made whether the object is moving (block 330). This determination may be made, for example, by calculating the variance of the most recent time-of-flight value. It may be advantageous to use the variance to detect motion, as the variance is defined as the squared difference from the mean. Squaring the difference makes it possible to detect even relatively small movements. Large movements may be distinguished from small movements, for example, by setting a variance threshold level, above which movements are considered large.
- an object is therefore both within the first zone and exhibiting significant movement, so the object is deemed not to be leaving the detection area (block 334), and furthermore considered to be present within the detection area (block 344).
- An example of how an object is deemed to be not leaving includes clearing a state variable, for instance, in a software state machine, where setting the state variable indicates the object may be leaving the detection area, and clearing the state variable indicates the object may not be leaving the detection area.
- in FIG. 4, a flowchart continues the method shown in FIG. 3, where block 350 is expanded for a detailed description of the processing of distant objects.
- the blocks shown within block 350 are reached when an object is detected that is not in the first zone. If it has been previously determined that an object may be leaving the detection area (block 410), it is determined whether the object is inanimate or has disappeared (block 420). An object is deemed to have disappeared if it is not detected within the first zone or the second zone. As above, an object may be deemed inanimate if little or no variance in the time-of-flight measurement is detected over a time window. If the object is inanimate or has disappeared, the object is declared not present (block 460).
- the object is declared present (block 450).
- the level of motion detected in the object is examined, as shown by block 412. If the object is moving significantly, for example, above a leaving threshold, the object is armed for leaving (block 454) and is declared present (block 450). Arming an object for leaving may be done by, for example, setting a Boolean flag indicating that subsequent processing should consider the object as potentially leaving the detection area.
- if the object is not in the first zone, is not armed for leaving, and is not moving significantly, then the presence of the object within the second zone is checked, as shown by block 414. For example, an object may be determined to not be within the second zone if the measured time-of-flight indicates the distance between the object and the sensor is beyond the distance defining the end boundary 195 (FIG. 1B) of the second zone 190 (FIG. 1B). If the object is not in the second zone, the object is armed for leaving (block 454) and declared present (block 450). If the object is in the second zone, a determination is made whether the object is inanimate (block 416).
- an object is considered inanimate if the time-of-flight variance remains below an animation threshold over a time window. If the object is inanimate, it is declared not present (block 460). Otherwise, if the object is not inanimate, for example, an animate object in a dormant state, the object is declared present (block 450).
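A minimal sketch of the variance-over-a-window inanimate test, assuming a fixed-length sample window and an illustrative animation threshold (neither value is specified in the text):

```python
from collections import deque
from statistics import pvariance

class AnimationDetector:
    """Flags an object as inanimate when time-of-flight variance stays low."""

    def __init__(self, window: int = 16, animation_threshold: float = 1e-9):
        # Sliding time window of recent time-of-flight readings.
        self.samples = deque(maxlen=window)
        self.threshold = animation_threshold

    def add(self, tof_seconds: float) -> None:
        self.samples.append(tof_seconds)

    def is_inanimate(self) -> bool:
        # Withhold judgment until a full window of samples is available.
        if len(self.samples) < self.samples.maxlen:
            return False
        return pvariance(self.samples) < self.threshold
```

A perfectly steady reflection (for example, from a wall) yields zero variance and is flagged inanimate, while even slight breathing-scale motion keeps the variance above a small threshold.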
- an animated object may be characterized as leaving the detection area by detecting significant movement of the animated object in a direction away from the sensor.
- the animated object may be characterized as leaving if the distance between the animated object and the sensor is increasing, and if the variance is above a leaving threshold.
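Under this embodiment, a leaving test might be sketched as follows; the first-versus-last trend test and the threshold value are illustrative assumptions:

```python
def characterize_leaving(distances, variance, leaving_threshold=1e-4):
    """Return True if the object appears to be leaving the detection area.

    Implements the test above: the distance between the object and the
    sensor is increasing, and the variance exceeds a leaving threshold.
    """
    if len(distances) < 2:
        return False  # not enough samples to establish a trend
    # Crude increase test over the recent samples (an assumption; a real
    # implementation might fit a slope or require monotonic growth).
    increasing = distances[-1] > distances[0]
    return increasing and variance > leaving_threshold
```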
- the animated object may be characterized as leaving when the variance is above a leaving threshold and the distance between the animated object and the sensor is above a distance threshold. Characterizing an animated object as leaving may also be accomplished in other ways, for example, by dynamically adjusting the size of the first zone and/or the second zone based upon detected movement of the animated object.

Presence detecting system
- FIG. 5 is a schematic diagram illustrating an exemplary embodiment of a system for executing functionality of the present invention.
- the present system for executing the functionality described in detail above may be an embedded microprocessor system, an example of which is shown in the schematic diagram of FIG. 5.
- the exemplary system 500 contains a processor 502, a storage device 504, a memory 506 having software 508 stored therein that defines the abovementioned functionality, input and output (I/O) devices 510 (or peripherals), a sensor 514, and a local bus, or local interface 512 allowing for communication within the system 500.
- the local interface 512 can be, for example but not limited to, one or more buses or other wired or wireless connections, as is known in the art.
- the local interface 512 may have additional elements, which are omitted for simplicity, such as controllers, buffers (caches), drivers, repeaters, and receivers, to enable communications. Further, the local interface 512 may include address, control, and/or data connections to enable appropriate communications among the aforementioned components.
- the processor 502 is a hardware device for executing software, particularly that stored in the memory 506.
- the processor 502 can be any custom made or commercially available single core or multi-core processor, a central processing unit (CPU), an auxiliary processor among several processors associated with the present system 500, a semiconductor based microprocessor (in the form of a microchip or chip set), a macroprocessor, or generally any device for executing software instructions.
- the memory 506 can include any one or combination of volatile memory elements (e.g., random access memory (RAM, such as DRAM, SRAM, SDRAM, etc.)) and nonvolatile memory elements (e.g., ROM, hard drive, tape, CDROM, etc.). Moreover, the memory 506 may incorporate electronic, magnetic, optical, and/or other types of storage media. Note that the memory 506 can have a distributed architecture, where various components are situated remotely from one another, but can be accessed by the processor 502.
- the software 508 defines functionality performed by the system 500, in accordance with the present invention.
- the software 508 in the memory 506 may include one or more separate programs, each of which contains an ordered listing of executable instructions for implementing logical functions of the system 500, as described below.
- the memory 506 may contain an operating system (O/S) 520.
- the operating system essentially controls the execution of programs within the system 500 and provides scheduling, input-output control, file and data management, memory management, and communication control and related services.
- the I/O devices 510 may include input devices, for example but not limited to, a keyboard, mouse, scanner, microphone, etc. Furthermore, the I/O devices 510 may also include output devices, for example but not limited to, a printer, display, etc. Finally, the I/O devices 510 may further include devices that communicate via both inputs and outputs, for instance but not limited to, a modulator/demodulator (modem; for accessing another device, system, or network), a radio frequency (RF) or other transceiver, a telephonic interface, a bridge, a router, or other device.
- the sensor 514 may be, for example, an ultrasound presence sensor.
- the sensor 514 may be configured for one of several orientations, for example, a front sensor or a top sensor.
- the sensor 514 may convey sensing parameters, for example, time-of-flight data, to the processor 502 via the local interface 512.
- the sensor 514 may receive configuration information and commands from the processor 502.
- the processor 502 may send a command to the sensor 514 to collect a single set of measurements, or may send
- the processor 502 is configured to execute the software 508 stored within the memory 506, to communicate data to and from the memory 506, and to generally control operations of the system 500 pursuant to the software 508, as explained above. It should be noted that in other embodiments, one or more of the elements in the exemplary embodiment may not be present.
- While several inventive embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the inventive embodiments described herein.
- a reference to "A and/or B", when used in conjunction with open-ended language such as "comprising", can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
- the phrase "at least one," in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements.
- This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase "at least one" refers, whether related or unrelated to those elements specifically identified.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Geophysics And Detection Of Objects (AREA)
- Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
- Length Measuring Devices Characterised By Use Of Acoustic Means (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201280030882.5A CN103635828B (en) | 2011-06-21 | 2012-06-15 | Method for robust and fast presence detection with a sensor |
EP12741382.1A EP2724178A2 (en) | 2011-06-21 | 2012-06-15 | Method for robust and fast presence detection with a sensor |
US14/125,121 US20140247695A1 (en) | 2011-06-15 | 2012-06-15 | Method for robust and fast presence detection with a sensor |
JP2014516467A JP2014526034A (en) | 2011-06-21 | 2012-06-15 | Robust and fast presence detection method using sensors |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161499414P | 2011-06-21 | 2011-06-21 | |
US61/499,414 | 2011-06-21 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2012176101A2 true WO2012176101A2 (en) | 2012-12-27 |
WO2012176101A3 WO2012176101A3 (en) | 2013-03-07 |
Family
ID=46601859
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2012/053024 WO2012176101A2 (en) | 2011-06-15 | 2012-06-15 | Method for robust and fast presence detection with a sensor |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP2724178A2 (en) |
JP (1) | JP2014526034A (en) |
CN (1) | CN103635828B (en) |
WO (1) | WO2012176101A2 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
RU2653357C2 (en) * | 2013-05-03 | 2018-05-08 | Филипс Лайтинг Холдинг Б.В. | Mitigating disturbance in sensing |
US10430528B2 (en) | 2015-05-12 | 2019-10-01 | Signify Holding B.V. | Method and system for managing space configurations |
CN113009463A (en) * | 2021-01-29 | 2021-06-22 | 杭州涂鸦信息技术有限公司 | Human body detection method and device |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106199738A (en) * | 2015-02-15 | 2016-12-07 | 鲍星合 | Object presence detection device and method |
CN104977585B (en) * | 2015-06-11 | 2017-07-28 | 中国科学院声学研究所 | Robust moving-sonar target detection method |
JP6281846B2 (en) * | 2015-07-17 | 2018-02-21 | 学校法人千葉工業大学 | Information processing device |
JP6556573B2 (en) * | 2015-09-15 | 2019-08-07 | 株式会社デンソーアイティーラボラトリ | Intrusion detection device, intrusion detection system, intrusion detection method, and intrusion detection program |
US10242268B2 (en) * | 2017-02-03 | 2019-03-26 | Raytheon Company | Pixel-based event detection for tracking, hostile fire indication, glint suppression, and other applications |
CN207571884U (en) * | 2017-07-06 | 2018-07-03 | 杭州盛棠信息科技有限公司 | Road occupying/parking behavior detection device and system |
JP6348653B2 (en) * | 2017-11-29 | 2018-06-27 | 学校法人千葉工業大学 | Information processing device |
CN117917586A (en) * | 2022-10-21 | 2024-04-23 | 法雷奥汽车内部控制(深圳)有限公司 | In-cabin detection method, in-cabin detection device, computer program product, and motor vehicle |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3986182A (en) * | 1974-03-27 | 1976-10-12 | Sontrix, Inc. | Multi-zone intrusion detection system |
US3997866A (en) * | 1975-03-31 | 1976-12-14 | Automation Industries, Inc. | Acoustic bus passenger counter |
JP2575867B2 (en) * | 1989-03-09 | 1997-01-29 | オプテックス株式会社 | Automatic door start switch device |
US5043705A (en) * | 1989-11-13 | 1991-08-27 | Elkana Rooz | Method and system for detecting a motionless body in a pool |
JPH09156438A (en) * | 1995-12-07 | 1997-06-17 | Kansei Corp | Occupant detecting device |
AUPO073796A0 (en) * | 1996-06-27 | 1996-07-25 | Duskedge Pty Ltd | A collision avoidance system |
JP3196669B2 (en) * | 1996-11-25 | 2001-08-06 | 松下電工株式会社 | Combined human body sensing device |
JP3417274B2 (en) * | 1997-10-27 | 2003-06-16 | 松下電工株式会社 | Human body detection device |
US20050146429A1 (en) * | 2003-12-31 | 2005-07-07 | Spoltore Michael T. | Building occupant location and fire detection system |
US8077034B2 (en) * | 2006-09-28 | 2011-12-13 | Bea Sa | Sensor for presence detection |
CN101324666A (en) * | 2007-06-16 | 2008-12-17 | 电子科技大学 | Method for detecting concealed target life trace and concealed target detection device |
TW201015099A (en) * | 2008-09-10 | 2010-04-16 | Koninkl Philips Electronics Nv | System, device and method for emergency presence detection |
-
2012
- 2012-06-15 EP EP12741382.1A patent/EP2724178A2/en not_active Withdrawn
- 2012-06-15 CN CN201280030882.5A patent/CN103635828B/en not_active Expired - Fee Related
- 2012-06-15 WO PCT/IB2012/053024 patent/WO2012176101A2/en active Application Filing
- 2012-06-15 JP JP2014516467A patent/JP2014526034A/en active Pending
Non-Patent Citations (1)
Title |
---|
None |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
RU2653357C2 (en) * | 2013-05-03 | 2018-05-08 | Филипс Лайтинг Холдинг Б.В. | Mitigating disturbance in sensing |
US10430528B2 (en) | 2015-05-12 | 2019-10-01 | Signify Holding B.V. | Method and system for managing space configurations |
CN113009463A (en) * | 2021-01-29 | 2021-06-22 | 杭州涂鸦信息技术有限公司 | Human body detection method and device |
CN113009463B (en) * | 2021-01-29 | 2023-04-18 | 杭州涂鸦信息技术有限公司 | Human body detection method and device |
Also Published As
Publication number | Publication date |
---|---|
JP2014526034A (en) | 2014-10-02 |
EP2724178A2 (en) | 2014-04-30 |
CN103635828B (en) | 2016-10-26 |
CN103635828A (en) | 2014-03-12 |
WO2012176101A3 (en) | 2013-03-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20140247695A1 (en) | Method for robust and fast presence detection with a sensor | |
EP2724178A2 (en) | Method for robust and fast presence detection with a sensor | |
US9162620B2 (en) | Method and apparatus of determining position of obstacle, and parking assist method and system | |
US10234548B2 (en) | Ultrasonic detection device to determine interference source by an additional reception mode | |
WO2012174068A1 (en) | Background object sensor | |
JP2010071881A (en) | Obstacle detection system | |
KR20140012303A (en) | Device for detection of vehicle proximity obstacle and methed thereof | |
CN113253287B (en) | Object movement detection device and method, and non-transitory computer readable storage medium | |
CN109747639A (en) | Vehicle and its control method | |
KR102061514B1 (en) | Apparatus and method for detecting objects | |
JP2005091026A (en) | Two-frequency doppler range finder and detection system equipped with the same finder | |
CN117310670B (en) | Measuring method and device based on ultrasonic radar, vehicle-mounted terminal and storage medium | |
KR101509945B1 (en) | Object detection method of vehicle, and method for controlling parking assist system using the same | |
JP2010158917A (en) | Obstacle detection system and vehicle device | |
KR102263722B1 (en) | Nosie detecting device of ultrasonic sensor for vehicle and noise detecting method thereof | |
JP6143879B2 (en) | Sensor device for computer system, computer system having sensor device, and method for operating sensor device | |
EP4109123A1 (en) | System and method for facilitating localizing an external object | |
CN113552575A (en) | Parking obstacle detection method and device | |
US20070274157A1 (en) | Method For Detecting an Obstacle In the Detection Area of a Detection Device | |
NO346569B1 (en) | Proximity detection for computers or screens | |
Fernandes et al. | Wi-Fi intruder detection | |
KR20200068820A (en) | People counter for improving accuracy | |
KR20150058894A (en) | Obstacle detection device with function of noise detection and method thereof | |
JP7180368B2 (en) | Object detection device and program | |
JP2011257814A (en) | Vehicle sensor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12741382 Country of ref document: EP Kind code of ref document: A2 |
|
REEP | Request for entry into the european phase |
Ref document number: 2012741382 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012741382 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2014516467 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14125121 Country of ref document: US |