CN112785759A - System and method for passenger flow statistics


Info

Publication number
CN112785759A
Authority
CN
China
Prior art keywords
sensor
passenger flow
perception
flow statistics
data
Legal status
Granted
Application number
CN202110085015.2A
Other languages
Chinese (zh)
Other versions
CN112785759B (en)
Inventor
续立军
Current Assignee
Alipay Hangzhou Information Technology Co Ltd
Original Assignee
Alipay Hangzhou Information Technology Co Ltd
Application filed by Alipay Hangzhou Information Technology Co Ltd
Priority to CN202110085015.2A
Publication of CN112785759A
Application granted
Publication of CN112785759B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 9/00 Individual registration on entry or exit


Abstract

The system and method for passenger flow statistics provided by this specification monitor multiple target areas within a measured area through multiple perception systems arranged in that area. Each perception system comprises at least one perception sensor and at least one processor. Each perception sensor monitors whether a target object exists in its corresponding monitoring range, and the processor performs living body detection on the data of each sensor to determine whether the target object within that sensor's monitoring range is a living body; only living bodies are counted into the passenger flow. The control terminal receives the living body detection result of each perception system, determines the number of living users in each target area, and thereby determines the total number of users in the measured area.

Description

System and method for passenger flow statistics
Technical Field
This specification relates to the field of Internet technology, and in particular to a system and method for passenger flow statistics.
Background
With the rapid development of the Internet and information technology, passenger flow statistics are widely applied in scenarios such as catering, smart retail, security monitoring, shopping malls, hotels, roads, scenic spots, and vehicles; they can help merchants understand in-store conditions more accurately and make corresponding operation and marketing decisions. For the catering industry in particular, passenger flow statistics can help managers perceive in real time how many customers are dining in a store, which is of great significance for predicting future dining traffic, improving business strategies and benefits, and improving the customer experience.
At present, commonly used passenger flow statistics schemes mainly arrange a range finder, or visual monitoring combined with image recognition, at the entrances and exits. Data at the entrance is monitored by an infrared sensor, a visual sensor, or the like to track the flow of people entering the store; a processor processes the data output by the sensor to obtain the current passenger count, which is uploaded to a cloud server. If the total number of people in the store needs to be calculated, the sensor data over a period of time must be accumulated. However, over time, cumulative calculation leads to error accumulation and thus inaccurate passenger flow statistics. In addition, the approach combining visual monitoring with image recognition requires arranging multiple cameras, so the hardware cost is high, the computation is complex, the power consumption is high, installation is inconvenient, and maintenance is difficult.
Therefore, it is desirable to provide a low-cost, high-accuracy system and method for distributed passenger flow statistics.
Disclosure of Invention
The present specification provides a low-cost, high-accuracy system and method for distributed passenger flow statistics.
In a first aspect, the present specification provides a system for passenger flow statistics, including a plurality of sensing systems and a control terminal, where the plurality of sensing systems are distributed in a plurality of target areas within a detected area, each sensing system in the plurality of sensing systems includes at least one sensing sensor and at least one processor, each sensing sensor in the at least one sensing sensor monitors whether a target object exists in a corresponding monitoring range during operation, and generates sensor data, where the target object includes a living user and a non-living user; the at least one processor receives the sensor data sent by each perception sensor, performs living body detection on the sensor data based on a preset identification model, and generates an identification result to determine whether the target object is the living body user; the control terminal is in communication connection with each sensing system when working, obtains the identification result of each sensing sensor in each sensing system, and determines the total number of users in the detected area based on the identification result.
In some embodiments, the plurality of target regions at least partially cover the area under test.
In some embodiments, the detected area is a dining area of a restaurant, and the monitoring range corresponding to each perception sensor comprises a dining chair.
In some embodiments, the identification model is obtained by training based on sample data corresponding to the living body user and sample data corresponding to the non-living body user, and the sample data includes a mapping relationship between signal frequency and signal amplitude in the sensor data.
In some embodiments, the performing the in-vivo detection on the sensor data based on a preset identification model includes: determining a mapping relation between the signal frequency and the signal amplitude of the sensor data in a preset time window; and performing living body detection on the mapping relation based on the identification model, and determining whether a perception sensor corresponding to the sensor data perceives the living body user.
In some embodiments, the preset time window comprises a preset length of time prior to the current time.
In some embodiments, there is at most one of the living users within the monitoring range of each of the perception sensors.
In some embodiments, the determining the total number of users in the area under test comprises: determining a number of living users in each of the plurality of target regions based on the recognition result of the each perception sensor; and determining the total number of users based on the number of live users in each of the target regions.
In some embodiments, each of the sensing systems further comprises a wireless communication module operable to establish the communication connection with the control terminal, the communication connection comprising a wireless communication connection.
In some embodiments, each of the sensing systems further comprises a power module operatively electrically connected to the at least one sensing sensor and the at least one processor.
In some embodiments, said obtaining said identification result of said each sensing sensor in said each sensing system comprises: acquiring the identification result of each perception sensor based on a preset time period.
In some embodiments, the at least one perception sensor comprises at least one of at least one radar sensor, at least one infrared sensor, at least one pressure sensor, at least one temperature sensor, at least one vibration sensor, and at least one electric field sensor.
In a second aspect, the present specification provides a method for passenger flow statistics, which is applied to the system for passenger flow statistics described in the first aspect of the present specification, and includes the following steps executed by the at least one processor: obtaining the sensor data for each of the perception sensors in the corresponding perception system; performing in-vivo detection on the sensor data of each perception sensor based on the recognition model, and generating the recognition result of each perception sensor; and sending the identification result of each perception sensor to the control terminal.
In some embodiments, the plurality of target regions at least partially cover the area under test.
In some embodiments, the detected area is a dining area of a restaurant, and the monitoring range corresponding to each perception sensor comprises a dining chair.
In some embodiments, the identification model is obtained by training based on sample data corresponding to the living body user and sample data corresponding to the non-living body user, and the sample data includes a mapping relationship between signal frequency and signal amplitude in the sensor data.
In some embodiments, the in vivo detection of the sensor data of the each perception sensor based on the recognition model comprises: determining a mapping relation between the signal frequency and the signal amplitude of the sensor data in a preset time window; and performing living body detection on the mapping relation based on the identification model, and determining whether a perception sensor corresponding to the sensor data perceives the living body user.
In some embodiments, the preset time window comprises a preset length of time prior to the current time.
In some embodiments, there is at most one of the living users within the monitoring range of each of the perception sensors.
In some embodiments, said sending said identification result of each said perception sensor to said control terminal comprises: and sending the identification result of each perception sensor to the control terminal based on a preset time period.
In a third aspect, the present specification provides a method for passenger flow statistics, which is applied to the passenger flow statistics system described in the first aspect of the present specification, and includes that the control terminal executes: acquiring the identification result of each perception sensor in each perception system; and determining the total number of users in the tested area based on the identification result.
In some embodiments, the determining the total number of users in the area under test comprises: determining a number of living users in each of the plurality of target regions based on the recognition result of the each perception sensor; and determining the total number of users based on the number of live users in each of the target regions.
In some embodiments, said obtaining said identification result of said each sensing sensor in said each sensing system comprises: acquiring the identification result of each perception sensor based on a preset time period.
As can be seen from the above technical solution, the system and method for passenger flow statistics provided by this specification can monitor multiple target areas within the measured area through multiple perception systems deployed in the measured area. Each perception system uses at least one perception sensor to monitor whether a target object exists within each sensor's corresponding monitoring range, generates sensor data, and sends it to the processor; the processor performs living body detection on each sensor's data to determine whether the target object within each perception sensor's monitoring range is a living body, and only a living body is counted into the passenger flow. The control terminal determines the number of living users in each target area from the living body detection results of the processors in the perception systems, and thereby determines the total number of users in the measured area. By monitoring the passenger flow in multiple target areas of the measured area simultaneously, the system and method obtain the absolute value of the passenger flow in the measured area, rather than obtaining a relative value and deriving the total passenger flow by cumulative calculation; this avoids error accumulation and improves the accuracy of the passenger flow statistics. Meanwhile, by performing living body detection on each user, the system and method prevent non-human objects from being counted into the passenger flow volume, improving the comprehensiveness and accuracy of the passenger flow statistics.
Additional features of the system and method for passenger flow statistics provided herein will be set forth in part in the description that follows, and in part will become readily apparent to those of ordinary skill in the art from the description and examples below. The inventive aspects of the system and method for passenger flow statistics provided herein can be fully explained by the practice or use of the methods, apparatus, and combinations described in the detailed examples below.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of this specification, the drawings needed for the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of this specification, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1 is a schematic diagram illustrating an application scenario of a system for passenger flow statistics provided in accordance with an embodiment of the present specification;
FIG. 2 illustrates an apparatus diagram of a perception system provided in accordance with an embodiment of the present description;
FIG. 3 illustrates a schematic diagram of an installation location of a sensor provided in accordance with an embodiment of the present disclosure;
FIG. 4 is a schematic diagram illustrating a sensor data frequency distribution of a living user provided in accordance with an embodiment of the present description;
FIG. 5 illustrates a schematic diagram of an apparatus of the control terminal provided in accordance with an embodiment of the present description;
FIG. 6 illustrates a flow chart of a method of providing passenger flow statistics in accordance with embodiments of the present description;
FIG. 7 illustrates a schematic diagram of an effective signal and an interfering signal provided in accordance with an embodiment of the present description; and
FIG. 8 illustrates a flow chart of another method of providing passenger flow statistics in accordance with embodiments of the present description.
Detailed Description
The following description is presented to enable any person skilled in the art to make and use the present description, and is provided in the context of a particular application and its requirements. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present description. Thus, the present description is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the claims.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. For example, as used herein, the singular forms "a", "an" and "the" may include the plural forms as well, unless the context clearly indicates otherwise. The terms "comprises," "comprising," "includes," and/or "including," when used in this specification, are intended to specify the presence of stated integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
These and other features of the present specification, as well as the operation and function of the elements of the structure related thereto, and the combination of parts and economies of manufacture, may be particularly improved upon in view of the following description. Reference is made to the accompanying drawings, all of which form a part of this specification. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the specification. It should also be understood that the drawings are not drawn to scale.
The flow diagrams used in this specification illustrate the operation of system implementations according to some embodiments of the specification. It should be clearly understood that the operations of the flow diagrams may be performed out of order. Rather, the operations may be performed in reverse order or simultaneously. In addition, one or more other operations may be added to the flowchart. One or more operations may be removed from the flowchart.
The present specification provides a system and a method for passenger flow statistics, which can monitor passenger flow in a measured area through distributed sensing systems. Specifically, the system and method for passenger flow statistics provided by the present specification may arrange perception systems at multiple positions in the measured area to monitor the passenger flow in the area where each perception system is located, thereby obtaining the absolute value of the passenger flow in the measured area and avoiding accumulated error. In addition, the system and method can perform living body detection on each user, and only a user detected as a living body is counted into the passenger flow volume, which improves the comprehensiveness and accuracy of the passenger flow statistics. This solution avoids the deployment and maintenance costs of arranging multiple cameras in a store scenario, and is simple to implement and easy to popularize. It can also provide merchants with a low-cost, reproducible solution for measuring in-store passenger flow and help them rapidly digitize offline scenarios.
Fig. 1 is a schematic diagram illustrating an application scenario of a system 001 for passenger flow statistics provided according to an embodiment of the present specification. The system 001 for passenger flow statistics (hereinafter, the system 001) can be used for statistics on the flow of people in a measured area 003. The measured area 003 may be any spatial area, such as a supermarket, a shopping mall, or a restaurant, or a sub-area thereof, such as a supermarket's daily goods, cosmetics, or clothing area; a mall's children's, shoe, or decoration area; or a restaurant's dining, checkout, or waiting area. For convenience of illustration, in the following description the measured area 003 is described as a restaurant. The system 001 may include a plurality of perception systems 200 and a control terminal 300.
The plurality of sensing systems 200 are respectively distributed at a plurality of different positions in the measured area 003; for convenience of description, the position where each sensing system 200 is placed in the measured area 003 is referred to as a target position. Each perception system 200 may monitor the flow of people within a preset range of its target position, i.e., each perception system 200 may monitor the number of users within the target area 202 around the corresponding target position. The plurality of perception systems 200 correspond to a plurality of target areas 202. When a target object is present in the target area 202 of a perception system 200, its presence causes a change in the output signal of the perception system 200. It should be noted that the target object may be a living user, such as a human body, or a non-living user, such as a bag, clothes, or other articles placed on a seat in the restaurant. The output signal changes differently when the sensing system 200 senses a living user than when it senses a non-living user. When the target object is a living user, that is, a human body, body movement (limb movement, heartbeat), the temperature change caused by the presence of the human body, and the like cause the output signal of the sensing system 200 to change continuously as the human body moves. When the target object is a non-living user, the change in the output signal caused by the non-living user is simple and monotonous. The perception system 200 may have stored therein, and may execute or be used to execute, data or instructions for performing the methods of passenger flow statistics described herein. Specifically, each sensing system 200 may generate sensing data in real time according to the change of the output signal in the corresponding target region 202, perform living body detection on each item of sensing data based on a preset identification model for human body feature identification, and generate an identification result to determine whether the target object is a living body, thereby supporting the passenger flow statistics.
The target area 202 may be a monitoring range of the sensing system 200, or may be an effective working area of the sensing system 200, i.e. a sensing range of the sensing system 200. The target area 202 may be preset or modified. For example, we can modify the range of the target region 202 by adjusting the sensitivity or detection range of the sensor. The range of the target area 202 is adjusted, for example, by adjusting the transmission distance of a radar sensor or an infrared sensor. Each perception system 200 corresponds to a target area 202. The plurality of perception systems 200 correspond to a plurality of target areas 202.
As shown in fig. 1, a plurality of target regions 202 may at least partially cover the area under test 003. The plurality of target areas 202 may cover a part of the measured area 003, for example, when the measured area 003 is a space area in the whole restaurant, the plurality of target areas 202 may cover a dining area in the measured area 003, that is, the plurality of target areas 202 partially cover the measured area 003. The plurality of target areas 202 may also cover the entire area of the tested area 003, for example, the plurality of target areas 202 may cover the entire area within the tested area 003, including a dining area, a checkout area, a waiting area, and the like. By adjusting the distribution position of the sensing system 200, the monitoring range (target area 202) of the sensing system 200 and the distribution density of the sensing system 200, the coverage of the target areas 202 of the sensing systems 200 can be changed, so that the target areas 202 of the sensing systems 200 can cover any area, and the use requirement can be met. Meanwhile, the area which does not need to carry out passenger flow statistics can be avoided, so that the cost is reduced, the passenger flow statistics precision is improved, and wrong data are prevented from being included in statistical data.
Adjacent target areas 202 in the plurality of target areas 202 may or may not partially overlap. When adjacent target areas 202 partially overlap, the area of the overlapping part should not exceed a preset threshold, so that the same user is not repeatedly identified and counted, which would make the statistical result inaccurate. When adjacent target areas 202 do not overlap, the gap area between them should not exceed a preset threshold, so as to avoid missing users and making the statistical result inaccurate.
The control terminal 300 may store data or instructions for performing the method P200 of passenger flow statistics described herein and may execute or be used to execute the data and/or instructions. The method P200 for passenger flow statistics will be described in detail later in the description. The control terminal 300 is operatively connected to the plurality of sensing systems 200 to obtain the identification result of each sensing system 200 of the plurality of sensing systems 200, and to calculate the number of living users in each target area 202, i.e. the number of passenger flows in each target area 202, based on the identification result, thereby determining the passenger flow volume in the area 003 to be tested, i.e. the total number of users. The communication connection refers to any form of connection capable of receiving information directly or indirectly. In some embodiments, the control terminal 300 may communicate data with each other through wireless communication connections with a plurality of perception systems 200; in some embodiments, the control terminal 300 may also communicate data with each other via a direct wire connection with multiple sensing systems 200; in some embodiments, the control terminal 300 may also be directly connected to other circuits via wires to establish indirect connections with multiple sensing systems 200, thereby enabling data transfer to each other. For convenience of description, the present specification will describe the control terminal 300 and the plurality of perception systems 200 as an example of transferring the recognition result through a wireless communication connection. The wireless communication connection is established between the control terminal 300 and the plurality of sensing systems 200, the installation is simple and convenient, the adaptability is strong, and the difficulty caused by the wiring of the wired communication connection can be avoided.
The control terminal 300 may include a hardware device having a data information processing function and the necessary programs required to drive the hardware device to operate. Of course, the control terminal 300 may also be only a hardware device having data processing capability, or only a program running in a hardware device. In some embodiments, the control terminal 300 may include a mobile device, a tablet computer, a laptop computer, a built-in device of a motor vehicle, or the like, or any combination thereof. In some embodiments, the mobile device may include a smart home device, a smart mobile device, or the like, or any combination thereof. In some embodiments, the smart home device may include a smart television, a desktop computer, or the like, or any combination thereof. In some embodiments, the smart mobile device may include a smart phone, a personal digital assistant, a gaming device, a navigation device, etc., or any combination thereof. In some embodiments, the built-in device of the motor vehicle may include an on-board computer, an on-board television, etc. In some embodiments, the control terminal 300 may be a device having positioning technology for locating the position of the control terminal 300.
As shown in fig. 1, the control terminal 300 may include a local device 301. In some embodiments, the control terminal 300 may also include a cloud device 302. The wireless communication connection between the control terminal 300 and the plurality of perception systems 200 is established through the local device 301. The local device 301 and the cloud device 302 may exchange information or data via a network. For example, the cloud device 302 may obtain the identification result from the local device 301 through a network. In some embodiments, the network may be any type of wired or wireless network, as well as combinations thereof. For example, the network may include a cable network, a wireline network, a fiber optic network, a telecommunications network, an intranet, the Internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a Bluetooth network, a ZigBee network, a Near Field Communication (NFC) network, or the like. In some embodiments, the network may include one or more network access points.
The data processing algorithm for the control terminal 300 to perform the passenger flow statistics according to the identification result may be performed in the local device 301 or the cloud device 302. The cloud device 302 may have a greater, faster computing power than the local device 301. The data processing algorithm for passenger flow statistics may be performed in the local device 301 when the calculation amount of the data processing algorithm for passenger flow statistics is small. When the calculation amount of the data processing algorithm of the passenger flow statistics is large, the data processing algorithm of the passenger flow statistics may be performed in the cloud device 302, and the local device 301 may send the identification result to the cloud device 302 for calculation. The local device 301 may not perform data operation, and the local device 301 may send all the recognition results obtained from the multiple sensing systems 200 to the cloud device 302 for performing passenger flow statistics, so as to reduce the cost of the local device 301.
As shown in fig. 1, the system 001 described in this specification performs real-time monitoring on a target object in a plurality of target areas 202 by arranging a plurality of sensing systems 200 in a detected area 003, respectively, obtains monitored sensing data, and performs living body detection on the sensing data through the preset identification model to determine whether the target object is a living body user; the control terminal 300 acquires the identification results of the plurality of sensing systems 200 through wireless communication connection with the plurality of sensing systems 200, and determines the absolute value of the passenger flow quantity in each target area 202, so that the absolute value of the passenger flow in the detected area 003 is acquired, the accumulated error and the false recognition of the non-living body are avoided, and the accuracy of passenger flow statistics is improved.
Fig. 2 shows a schematic device diagram of a perception system 200 provided according to an embodiment of the present description. As shown in fig. 2, the sensing system 200 may include at least one sensing sensor 220. In some embodiments, the perception system 200 may further include at least one processor 280. In some embodiments, the perception system 200 may further include a wireless communication module 240. In some embodiments, the perception system 200 may further include a power module 260.
Each of the at least one perception sensor 220 may monitor whether a target object exists within its corresponding monitoring range during operation, and receive signals within the monitoring range to generate sensor data. The sensor data may be the time-varying behavior of a signal, including the signal's amplitude and its frequency distribution, and may be used to perform living body detection. Each perception system 200 may include one or more perception sensors 220. Each perception sensor 220 is used to monitor a target object within its corresponding monitoring range and may generate one item of sensor data, so the at least one perception sensor 220 corresponds one-to-one with at least one item of sensor data. The combination of the monitoring ranges of the at least one perception sensor 220 in each perception system 200 constitutes the target area 202 to which that perception system 200 corresponds. In particular, the at least one perception sensor 220 may monitor changes of the target object within the corresponding target region 202 and generate corresponding sensor data based on those changes.
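As an illustration of how such sensor data might be represented in software, the following Python sketch buffers timestamped amplitude samples for one perception sensor. The patent does not prescribe any data structure, so the class name, fields, and buffer size here are assumptions made only for the example.

```python
from collections import deque
from dataclasses import dataclass, field
import time


@dataclass
class SensorStream:
    """Timestamped amplitude samples from one perception sensor (illustrative only)."""
    sensor_id: str
    sample_rate_hz: float
    # Ring buffer: old samples are discarded automatically once it is full.
    samples: deque = field(default_factory=lambda: deque(maxlen=4096))

    def append(self, amplitude: float, timestamp: float | None = None) -> None:
        self.samples.append((time.time() if timestamp is None else timestamp, amplitude))

    def window(self, duration_s: float) -> list:
        """Return the (timestamp, amplitude) pairs from the last `duration_s` seconds."""
        cutoff = time.time() - duration_s
        return [(t, a) for t, a in self.samples if t >= cutoff]
```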
In order to facilitate the passenger flow statistics, at most one living user is located within the monitoring range of each of the at least one sensing sensor 220. Fig. 3 is a schematic diagram illustrating an installation position of a sensing sensor 220 provided according to an embodiment of the present disclosure. The sensing sensor 220 may be installed at any position within the measured area 003. Taking the measured area 003 as the dining area of a restaurant as an example, the monitoring range of the sensing sensor 220 may be a dining chair 006 in the dining area, and the perception sensor 220 can be used to monitor whether a target object is currently on the dining chair. The sensor 220 may be installed at the dining table 005 or the dining chair 006 in the dining area. As shown in fig. 3, the sensor 220 can be installed under the dining table 005 and aligned with the dining chair 006 to sense whether a living user is dining on the dining chair 006. The sensor 220 may also be installed on the back of the dining chair 006 facing the dining table 005, or under the dining chair 006, to sense whether a living user is dining at that location. When a living user is on the dining chair 006, body movement (limb movement, heartbeat) and the like cause the output signal of the sensor 220 to change, generating sensor data; the control terminal 300 can receive the sensor data and perform living body detection based on the preset identification model to identify the living user.
Of course, the sensor 220 can be installed at other locations, for example, the sensor 220 can be installed near the dining user, such as on the ceiling or above the dining chair 006, and sense whether a living user is dining on the dining chair 006, and so on.
For example, when the area 003 to be detected is a dining room, each sensing system 200 may be installed under a dining table 005, and each sensing system 200 may include a plurality of sensing sensors 220, each sensing sensor 220 facing a dining chair 006 around the current dining table 005, or directly installed on the dining chair 006 to monitor whether the corresponding dining chair 006 contains the living user. When the target object appears in the monitoring range of the perception sensor 220, the signal received by the perception sensor 220 will change. The change law of the signal caused by the living body user and the non-living body user is also different. The control terminal may identify whether the target object on the current dining chair 006 is a living user or a non-living user according to a change rule of the signal in the sensor data.
The system 001 may include a plurality of sensing systems 200, each sensing system 200 corresponding to a set of sensor data, each set of sensor data may include the at least one sensor data.
When the number of users in the target area 202 changes, a series of data changes in the target area 202 may be caused, such as gravity changes, distance changes, vibration changes, even temperature changes and dielectric constant changes in the surrounding environment, and so on. The at least one sensing sensor 220 may be any form of sensor that can sense a change in environmental data. For example, the at least one perception sensor 220 may be at least one of at least one radar sensor, at least one infrared sensor, at least one pressure sensor, at least one temperature sensor, at least one vibration sensor, and at least one electric field sensor.
The radar sensor can emit electromagnetic wave signals outwards and receive electromagnetic wave signals reflected back by other objects. When the sensing sensor 220 is a radar sensor, the radar sensor may transmit an electromagnetic wave signal in a preset direction and receive the reflected electromagnetic wave signal reflected by an object. When a target object appears in the preset direction of the radar sensor, the reflected electromagnetic wave signal received by the radar sensor in that direction changes. The control terminal 300 may determine whether a target object exists in the preset direction according to the reflected electromagnetic wave signal, and determine whether the target object is a living user according to the variation pattern of the reflected electromagnetic wave signal. The preset direction may be one direction or a plurality of different directions. The radar sensor may be one radar sensor that can emit electromagnetic wave signals in multiple directions, or multiple radar sensors that each emit an electromagnetic wave signal in a single direction.
When the target object is a living body user (human body) and a non-living body user, the reflected electromagnetic wave signals received by the radar sensor are also different. When the target object is a non-living user, the non-living user cannot move by itself, so that the reflected electromagnetic wave signal received by the radar sensor changes smoothly. When the target object is a living user, the change of the reflected electromagnetic wave signal is caused by the limb movement or heartbeat of the human body, so that the change of the reflected electromagnetic wave signal received by the radar sensor is complex.
Taking a radar sensor as an example, fig. 4 shows a schematic diagram of the sensor data frequency distribution of a living user provided according to an embodiment of the present specification. As shown in fig. 4, the horizontal axis represents the frequency f and the vertical axis represents the output signal amplitude a of the sensor 220. Generally, when a human body waits or dines in a restaurant, limb movement produces a signal with lower frequency but larger amplitude, while respiration and heartbeat produce signals with relatively higher frequency but small amplitude. The distribution of the limb movement signal 1, the respiration signal 2 and the heartbeat signal 3 on the frequency spectrum is shown in fig. 4, where the frequency of respiration is about 0.13-0.4 Hz, the frequency of heartbeat is about 0.8-3.3 Hz, and the limb movement frequency is less than 0.1 Hz. However, because the respiration and heartbeat signals are very weak, they are easily submerged in the limb movement signal when the human body moves continuously. Meanwhile, the human body may remain still for long stretches during a meal, so that no limb movement signal can be detected. Therefore, the three kinds of data need to be combined to judge whether the current radar sensor detects a human body.
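To make the combined judgment concrete, the sketch below computes the spectral energy in the three bands named above (limb movement below 0.1 Hz, respiration about 0.13-0.4 Hz, heartbeat about 0.8-3.3 Hz) and applies a simple rule. The thresholds and the rule itself are illustrative assumptions; the patent relies on a trained identification model rather than fixed thresholds.

```python
import numpy as np

# Band limits taken from the description above; the thresholds below are assumed.
BANDS = {"limb": (0.0, 0.1), "respiration": (0.13, 0.4), "heartbeat": (0.8, 3.3)}


def band_energies(samples: np.ndarray, sample_rate_hz: float) -> dict:
    """Spectral energy of one window of sensor samples in each physiological band.

    The window must span several seconds for the 0.13 Hz respiration band
    to be resolvable at all.
    """
    spectrum = np.abs(np.fft.rfft(samples - samples.mean()))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate_hz)
    return {name: float(spectrum[(freqs >= lo) & (freqs < hi)].sum())
            for name, (lo, hi) in BANDS.items()}


def looks_like_human(e: dict, limb_thr: float = 1.0,
                     resp_thr: float = 0.05, heart_thr: float = 0.02) -> bool:
    """Combined rule: obvious limb motion, or both vital-sign bands elevated
    while the body is otherwise still (hypothetical thresholds)."""
    return e["limb"] > limb_thr or (e["respiration"] > resp_thr and e["heartbeat"] > heart_thr)
```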
Therefore, the sensing system 200 can determine whether the target object is a human body according to the mapping relationship between the frequency of the signal in the sensor data and the amplitude of the signal, so as to prevent the target object that is not a living body from being included in the passenger flow volume, and improve the accuracy of the passenger flow statistics.
The infrared sensor can emit infrared signals outwards and receive the infrared signals reflected back by other objects. When the sensing sensor 220 is an infrared sensor, the infrared sensor may emit an infrared signal in a preset direction and receive the reflected infrared signal reflected by an object. When a target object appears in the preset direction of the infrared sensor, the reflected infrared signal received by the infrared sensor in that direction changes. The control terminal 300 may determine whether a target object exists in the preset direction according to the reflected infrared signal, and determine whether the target object is a living user according to the variation pattern of the infrared signal. The preset direction may be one direction or a plurality of different directions. The infrared sensor may be one infrared sensor that can emit infrared signals in multiple directions, or multiple infrared sensors that each emit an infrared signal in a single direction. The infrared sensor may be an infrared pyroelectric sensor or an infrared thermopile sensor, and the description is not limited herein.
The at least one pressure sensor may measure the data of the pressure variations experienced. When the sensor 220 is a pressure sensor, the pressure sensor can be installed on the dining chair 006 to measure the pressure change experienced by the dining chair 006. When a target object is present within the monitoring range of the sensor 220, the pressure data measured by the pressure sensor may also change. The control terminal 300 may determine whether the target object exists in the monitoring range according to the change of the pressure data, and determine whether the target object is a living user according to a change rule of the pressure signal.
The temperature sensor may measure temperature change data within the monitoring range. When the sensing sensor 220 is a temperature sensor, the temperature data measured by the temperature sensor may also change when the target object appears in the monitoring range. The control terminal 300 may determine whether the target object exists in the monitoring range according to the change of the temperature data, and determine whether the target object is a living user according to a change rule of the temperature signal.
The vibration sensor may measure vibration variation data within the monitoring range. When the sensing sensor 220 is a vibration sensor, when the target object appears in the monitoring range, the vibration data measured by the vibration sensor may also change. The control terminal 300 may determine whether the target object exists in the monitoring range according to the change of the vibration data, and determine whether the target object is a living user according to a change rule of the vibration signal.
The electric field sensor may measure voltage change data within the monitoring range. When the sensing sensor 220 is an electric field sensor, when the target object appears in the monitoring range, the dielectric constant in the monitoring range may also change, and the voltage data measured by the electric field sensor may also change. The control terminal 300 may determine whether the target object exists in the monitoring range according to the change of the voltage data, and determine whether the target object is a living user according to a change rule of the voltage signal.
The sensing sensor 220 may be any one of the above sensors, or a combination of the above sensors, or any other sensor capable of sensing human body information, such as a distance sensor, an ultrasonic sensor, a sound sensor, a light sensor, and the like. The perception sensor 220 may generate sensor data based on a change in the number of users within the monitoring range. The plurality of perception data includes at least the plurality of sets of sensor data. The perception sensor 220 may generate the sensor data in real time.
The at least one processor 280 may store data or instructions for performing the method for passenger flow statistics P100 described herein, and may execute or be used to execute the data and/or instructions. The method P100 for passenger flow statistics will be described in detail later in the description. The at least one processor 280 may be communicatively coupled to each of the at least one sensing sensors 220 and receive the sensor data transmitted by each sensing sensor 220. As previously described, the perception sensor 220 may monitor whether a target object exists within a corresponding monitoring range and generate the sensor data. The at least one processor 280 may acquire and store sensor data of each of the sensing sensors 220 from the at least one sensing sensor 220, perform living body detection on the sensor data based on a preset recognition model, and generate a recognition result to determine whether the target object is the living body user.
The identification model is used for performing living body detection on the sensor data. As described above, when the sensing sensor 220 senses that a target object exists in the monitoring range, the output signal in the sensor data changes. Specifically, the way the sensor data changes when the target object is a living user differs from the way it changes when the target object is a non-living user; taking the sensing sensor 220 as a radar sensor as an example, the mapping between signal frequency and signal amplitude in the sensor data differs between the two cases.
The identification model is obtained by training based on sample data corresponding to living users and sample data corresponding to non-living users. The sample data may include a mapping between signal frequency and signal amplitude in the sensor data. Specifically, the developer may install the sensing sensor 220 in different types of scenes (e.g., a plurality of different restaurants) and collect sample data, where the sample data includes sensor data corresponding to a number of labeled living users and sensor data corresponding to non-living users; the mapping between signal frequency and signal amplitude is obtained for each item of sensor data and used as training data to train a classification neural network model; after training is completed, the neural network classification model is deployed in the processor 280 to perform real-time prediction and classification, generate the identification result, and output whether a human body exists in the monitoring range of the current perception sensor 220.
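A minimal training sketch along these lines is shown below. The patent does not specify the network architecture or framework, so a small scikit-learn MLP classifier and a coarse binned spectrum are used here purely as stand-ins; `windows` and `labels` denote the collected sample windows and their live/non-live labels.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier


def to_feature_vector(samples: np.ndarray, sample_rate_hz: float, n_bins: int = 64) -> np.ndarray:
    """One window of sensor samples -> fixed-length frequency-to-amplitude mapping."""
    spectrum = np.abs(np.fft.rfft(samples - samples.mean()))
    # Average the spectrum into a fixed number of bins so windows of slightly
    # different lengths yield comparable feature vectors.
    return np.array([chunk.mean() for chunk in np.array_split(spectrum, n_bins)])


def train_recognition_model(windows, labels, sample_rate_hz):
    X = np.stack([to_feature_vector(w, sample_rate_hz) for w in windows])
    y = np.asarray(labels)  # 1 = labeled living user, 0 = non-living object
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
    model.fit(X_tr, y_tr)
    print("hold-out accuracy:", model.score(X_te, y_te))
    return model  # would then be deployed to the processor 280 for real-time prediction
```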
The at least one processor 280 may also transmit the recognition result to the control terminal 300 based on a preset time period. In particular, processor 280 is an embedded low power processor. The wireless communication module 240 and the at least one processor 280 within the sensing system 200 are normally in a standby mode. The control terminal 300 wakes up the sensing system 200 at intervals of a preset time, that is, the control terminal 300 may wake up the sensing system 200 based on a preset time period; after the sensing system 200 receives the wake-up signal sent by the control terminal 300, the processor 280 in the sensing system 200 may upload the identification result to the control terminal 300, and re-enter the sleep or standby state to wait for the next wake-up. Therefore, the low power consumption processor 280 can save power consumption of the sensing system 200 and prolong the standby time.
The wake-up signal may be a signal transmitted by the control terminal 300 to the sensing system 200. The preset time can be set or changed. Specifically, the control terminal 300 may obtain the preset time through machine learning on the historical passenger flow data of the measured area 003. The preset times for different time periods may differ: for example, the preset time may be shorter during peak dining periods, such as 10 s, 1 min, or 10 min at noon or in the evening; and the preset time may be longer during off-peak periods, such as 1 h or 2 h in the early morning or late at night.
When the sensing system 200 is in a standby state, the processor 280 may store the sensor data and the recognition result; when the sensing system 200 is awakened, the processor 280 may delete the transmitted identification result and the corresponding sensor data after transmitting the stored identification result to the control terminal 300.
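The standby/wake-up cycle described above might look roughly like the following device-side loop. The `radio` and `results_store` objects and their methods are hypothetical stand-ins for the wireless communication module 240 and the processor's local storage; the patent does not define these interfaces.

```python
def perception_node_loop(radio, results_store):
    """Sketch of the low-power cycle: sleep, wake on the terminal's signal,
    upload the stored identification results, then return to standby."""
    while True:
        radio.sleep()                        # module 240 and processor 280 stand by
        radio.wait_for_wake_signal()         # blocks until control terminal 300 wakes us

        pending = results_store.read_all()   # identification results stored while asleep
        if radio.send(pending):              # upload to the control terminal
            results_store.delete(pending)    # delete only after a confirmed transfer
```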
In some embodiments, the perception system 200 may further include at least one storage medium (not shown in FIG. 2). The storage medium may include a data storage device that may be used to store the recognition model and at least one set of instructions. The instructions are computer program code that may include programs, routines, objects, components, data structures, procedures, modules, etc. that perform the method P100 of passenger flow statistics provided herein.
The wireless communication module 240 may be electrically connected with the at least one processor 280. When the sensing system 200 works, the wireless communication connection can be established with the control terminal 300 through the wireless communication module 240, so that wiring is avoided, labor is saved while cost is reduced, and appearance attractiveness is improved. The wireless communication may be WiFi communication, bluetooth communication, NFC communication, optical communication, and the like. The sensing system 200 can send the identification result corresponding to the sensing system 200 to the control terminal 300 through the wireless communication module 240.
As shown in fig. 2, the power module 260 may be electrically connected to the at least one sensing sensor 220 and the at least one processor 280 to provide power to the entire sensing system 200. The power module 260 may be battery powered, wire powered, or self powered, and is not limited herein. Battery power and self power allow the equipment to be installed without wiring, which saves cost, improves appearance, makes use more convenient, makes the system easier to popularize, and lowers the deployment and usage requirements of the system 001.
Fig. 5 shows a schematic diagram of an apparatus of the control terminal 300. The control terminal 300 may execute the method P200 of passenger flow statistics described in this specification. The method P200 of passenger flow statistics is described elsewhere in this specification. The device diagram shown in fig. 5 may be used for the local device 301 and may also be used for the cloud device 302.
As shown in fig. 5, the control terminal 300 may include at least one storage medium 330 and at least one processor 320. In some embodiments, the control terminal 300 may also include a communication port 350 and an internal communication bus 310.
Internal communication bus 310 may connect various system components including storage medium 330, processor 320, and communication port 350.
The communication port 350 is used for data communication between the control terminal 300 and the outside; for example, the communication port 350 may be used for data communication between the control terminal 300 and the plurality of sensing systems 200. The communication port 350 may also be used for data communication between the local device 301 and the cloud device 302. The communication port 350 may be a wired communication port or a wireless communication port. In this specification, the communication port 350 is described as a wireless communication port by way of example. The control terminal 300 receives the recognition results of the plurality of sensing systems 200 through the wireless communication connection.
Storage media 330 may include data storage devices. The data storage device may be a non-transitory storage medium or a transitory storage medium. For example, the data storage devices may include one or more of a magnetic disk 332, a read-only storage medium (ROM)334, or a random access storage medium (RAM) 336. The storage medium 330 further comprises at least one set of instructions stored in the data storage device. The instructions are computer program code that may include programs, routines, objects, components, data structures, procedures, modules, and the like that perform the methods of passenger flow statistics provided herein.
The at least one processor 320 may be communicatively coupled to at least one storage medium 330 and a communication port 350 via an internal communication bus 310. The at least one processor 320 is configured to execute the at least one instruction set. When the system 001 is operating, the at least one processor 320 reads the at least one instruction set and obtains the recognition results of the plurality of perception systems 200 through the communication port 350 according to the instruction of the at least one instruction set, and performs the method of passenger flow statistics provided herein. The processor 320 may perform all the steps involved in the method of passenger flow statistics. Processor 320 may be in the form of one or more processors, and in some embodiments, processor 320 may include one or more hardware processors, such as microcontrollers, microprocessors, Reduced Instruction Set Computers (RISC), Application Specific Integrated Circuits (ASICs), application specific instruction set processors (ASIPs), Central Processing Units (CPUs), Graphics Processing Units (GPUs), Physical Processing Units (PPUs), microcontroller units, Digital Signal Processors (DSPs), Field Programmable Gate Arrays (FPGAs), Advanced RISC Machines (ARM), Programmable Logic Devices (PLDs), any circuit or processor capable of executing one or more functions, or the like, or any combination thereof. For illustrative purposes only, only one processor 320 is depicted in the control terminal 300 in this description. It should be noted, however, that the control terminal 300 may also include multiple processors, and thus, the operations and/or method steps disclosed in this specification may be performed by one processor as described in this specification, or may be performed by a combination of multiple processors. For example, if the processor 320 of the control terminal 300 performs steps a and B in this specification, it should be understood that steps a and B may also be performed by two different processors 320 in combination or separately (e.g., a first processor performs step a, a second processor performs step B, or both a first and second processor perform steps a and B together).
It should be noted that when the local device 301 does not have the operation capability, the local device 301 may not include the processor 320.
Fig. 6 shows a flow chart of a method P100 for providing passenger flow statistics according to an embodiment of the present description. The method P100 for passenger flow statistics is applied to the system 001 for passenger flow statistics. The system 001 for passenger flow statistics may perform the method P100 for passenger flow statistics provided herein. In particular, the at least one processor 280 may read a set of instructions stored in its local storage medium and then execute the method for passenger flow statistics P100 provided herein, as specified by the set of instructions. The method P100 may include performing, by at least one processor 280:
s120: the processor 280 acquires sensor data for each of the at least one perception sensor 220 in the corresponding perception system 200.
As previously described, each of the perception sensors 220 may generate sensor data in real time, which the processor 280 may acquire in real time.
S140: the processor 280 performs a living body detection on the sensor data of each sensing sensor 220 based on the recognition model and generates the recognition result of each sensing sensor 220.
Specifically, step S140 may include: determining a mapping relationship between the signal frequency and the signal amplitude of the sensor data within a preset time window based on the sensor data of each perception sensor 220; and performing living body detection on the mapping relationship based on the recognition model to determine whether a living user is perceived within the monitoring range of the perception sensor 220 corresponding to each piece of sensor data.
The preset time window includes a preset duration before the current time. Specifically, when performing living body detection on the sensor data, the processor 280 may extract the sensor data within a time window (i.e., the preset time window) extending backward from the current time, input the sensor data within the time window into the recognition model, and obtain the recognition result output by the recognition model, the recognition result indicating whether a living user exists in the detection range of the current perception sensor 220.
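For illustration only, the following is a minimal sketch of how step S140 could map a windowed sensor signal to a frequency-amplitude representation and feed it to a recognition model. The sampling rate, window length, synthetic training data, and the logistic-regression classifier are assumptions introduced for this example and are not part of the disclosure; any recognition model trained as described above could take their place.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

SAMPLE_RATE_HZ = 100      # assumed sampling rate of the perception sensor 220
WINDOW_SECONDS = 5        # assumed length of the preset time window

def frequency_amplitude_mapping(samples: np.ndarray) -> np.ndarray:
    """Map the windowed signal to an amplitude-per-frequency-bin vector via an FFT."""
    return np.abs(np.fft.rfft(samples)) / len(samples)

def detect_living_body(model, window: np.ndarray) -> bool:
    """Return True if the recognition model classifies the window as containing a living user."""
    features = frequency_amplitude_mapping(window).reshape(1, -1)
    return bool(model.predict(features)[0])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = SAMPLE_RATE_HZ * WINDOW_SECONDS
    t = np.arange(n) / SAMPLE_RATE_HZ
    # Synthetic training data: "living" windows contain a breathing-like
    # low-frequency component; "non-living" windows contain noise only.
    living = [np.sin(2 * np.pi * 0.3 * t) + 0.1 * rng.standard_normal(n) for _ in range(50)]
    non_living = [0.1 * rng.standard_normal(n) for _ in range(50)]
    X = np.array([frequency_amplitude_mapping(w) for w in living + non_living])
    y = np.array([1] * 50 + [0] * 50)
    model = LogisticRegression(max_iter=1000).fit(X, y)
    print(detect_living_body(model, living[0]))   # expected output: True
```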
Because various forms of interference inevitably exist in the environment, the output signal of the at least one perception sensor 220 may contain spurious interference signals. Fig. 7 illustrates a schematic diagram of an effective signal and an interference signal provided according to an embodiment of the present description. As shown in fig. 7, the horizontal axis represents time t, and the vertical axis represents the output signal amplitude A of the perception sensor 220. The effective signal 5, generated when the perception sensor 220 senses the motion of a user, is superimposed with the interference signal 6. To ensure the accuracy of the passenger flow statistics, the processor 280 may filter the sensor data. Specifically, before performing the living body detection on the sensor data, step S140 may further include: the processor 280 filters the sensor data of each perception sensor 220 to remove the interference signal 6; and performs the living body detection based on the filtered sensor data and the preset recognition model.
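As a further illustration of the filtering step, the sketch below assumes the effective signal 5 occupies a known low-frequency band while the interference signal 6 lies outside it; the band edges, the filter order, and the Butterworth design are assumptions made for this example rather than the filtering scheme of the disclosure.

```python
import numpy as np
from scipy.signal import butter, filtfilt

SAMPLE_RATE_HZ = 100   # assumed sampling rate of the perception sensor 220

def remove_interference(samples: np.ndarray,
                        low_hz: float = 0.1,
                        high_hz: float = 2.0) -> np.ndarray:
    """Band-pass filter the raw sensor data so that only the assumed effective band remains."""
    nyquist = SAMPLE_RATE_HZ / 2.0
    b, a = butter(4, [low_hz / nyquist, high_hz / nyquist], btype="bandpass")
    return filtfilt(b, a, samples)   # zero-phase filtering preserves the timing of the signal

if __name__ == "__main__":
    t = np.arange(0, 5, 1 / SAMPLE_RATE_HZ)
    effective = np.sin(2 * np.pi * 0.3 * t)            # slow body-motion component (signal 5)
    interference = 0.5 * np.sin(2 * np.pi * 25 * t)    # high-frequency interference (signal 6)
    filtered = remove_interference(effective + interference)
    # The 25 Hz interference component lies far outside the pass band and is strongly attenuated.
    print(round(float(np.std(effective + interference)), 3), round(float(np.std(filtered)), 3))
```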
S160: the processor 280 transmits the recognition result of each of the sensing sensors 220 to the control terminal 300.
As previously described, the recognition result may be stored in the processor 280. The control terminal 300 may obtain the recognition result from the processor 280 at intervals of a preset time. Specifically, step S160 may include: the processor 280 receives a wake-up signal sent by the control terminal 300 to the processor 280 based on a preset time period; and, after receiving the wake-up signal, the processor 280 sends the recognition result to the control terminal 300.
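As an illustration of this exchange on the perception-system side, the sketch below assumes a plain TCP transport in which the control terminal 300 sends a "WAKE" line as the wake-up signal and the processor 280 replies with its cached recognition results as a single JSON line; the port number, the message format, and the sensor identifiers are hypothetical and not part of the disclosure.

```python
import json
import socketserver

# Latest recognition results kept by the processor 280, keyed by a hypothetical sensor id.
latest_results = {"sensor_1": True, "sensor_2": False}

class WakeUpHandler(socketserver.StreamRequestHandler):
    def handle(self):
        request = self.rfile.readline().strip()
        if request == b"WAKE":
            # Reply with the cached recognition results of all perception sensors 220.
            self.wfile.write(json.dumps(latest_results).encode() + b"\n")

if __name__ == "__main__":
    with socketserver.TCPServer(("0.0.0.0", 9000), WakeUpHandler) as server:
        server.serve_forever()
```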
Fig. 8 shows a flowchart of a method P200 for passenger flow statistics according to an embodiment of the present description. The method P200 for passenger flow statistics is applied to the control terminal 300. Specifically, the processor 320 in the control terminal 300 may read the instruction set stored in its local storage medium and then execute the method P200 for passenger flow statistics provided in the present specification, as specified by the instruction set. The method P200 may include performing, by the control terminal 300:
S220: The control terminal 300 acquires the recognition result of each perception sensor 220 in each perception system 200.
As previously described, a plurality of perception systems 200 may be included in the measured area 003. The control terminal 300 may be in communication connection with each perception system 200, and may acquire a recognition result corresponding to each perception sensor 220 of the at least one perception sensor 220 in each perception system 200. As previously described, the recognition result may be stored in the processor 280. The control terminal 300 may obtain the recognition result from the processor 280 at intervals of a preset time. Specifically, step S220 may include: the control terminal 300 acquires the recognition result corresponding to each perception sensor 220 based on a preset time period. That is, the control terminal 300 may send the wake-up signal to each perception system 200 at intervals of the preset time; after receiving the wake-up signal, the perception system 200 sends the recognition result to the control terminal 300.
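A corresponding sketch for the control terminal 300 is given below, assuming each perception system 200 exposes the socket interface sketched above for step S160; the addresses of the perception systems and the polling period are hypothetical values used only for illustration.

```python
import json
import socket
import time

PERCEPTION_SYSTEMS = [("192.168.1.10", 9000), ("192.168.1.11", 9000)]  # assumed addresses
POLL_PERIOD_S = 60   # assumed preset time period

def poll_once() -> dict:
    """Send a wake-up signal to every perception system and collect its recognition results."""
    all_results = {}
    for host, port in PERCEPTION_SYSTEMS:
        with socket.create_connection((host, port), timeout=5) as conn:
            conn.sendall(b"WAKE\n")
            reply = conn.makefile().readline()
            all_results[(host, port)] = json.loads(reply)
    return all_results

if __name__ == "__main__":
    while True:
        print(poll_once())
        time.sleep(POLL_PERIOD_S)
```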
S240: the control terminal 300 determines the total number of users in the measured area based on the recognition result.
Specifically, step S240 may include:
S242: The control terminal 300 determines the number of living users in each of the plurality of target areas 202 based on the recognition result of each perception sensor 220.
As described above, there can be at most one living user within the detection range of each perception sensor 220. The recognition result of each perception sensor 220 indicates whether there is a living user in the monitoring range corresponding to that perception sensor 220. The control terminal 300 may therefore determine the number of living users in each target area 202 according to the recognition result of each perception sensor 220 in each perception system 200.
S244: the control terminal 300 determines the total number of users based on the number of live users in each of the target areas 202.
Specifically, the control terminal 300 may determine the total number of users by summing the numbers of living users in the plurality of target areas 202. In some embodiments, the control terminal 300 may further adjust the summed number of users with an adjustment coefficient, so as to eliminate errors caused by overlap or gaps between the target areas 202. The adjustment coefficient may be obtained by machine learning.
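The counting and adjustment of steps S242 and S244 can be sketched as follows, assuming that the recognition results are grouped per target area 202 and that the adjustment coefficient is a single multiplicative factor; the example areas, results, and coefficient value are assumptions made only for illustration.

```python
def count_total_users(area_results, adjustment_coefficient=1.0):
    """Sum the living-user counts of all target areas and apply the adjustment coefficient."""
    per_area = {area: sum(results) for area, results in area_results.items()}
    total = sum(per_area.values())
    return round(total * adjustment_coefficient)

# Example: two target areas, each monitored by several perception sensors;
# True means the corresponding sensor's recognition result indicates a living user.
example = {
    "area_A": [True, False, True],   # 2 living users detected
    "area_B": [True, True],          # 2 living users detected
}
print(count_total_users(example, adjustment_coefficient=0.95))  # -> 4
```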
In summary, the methods P100 and P200 and the system 001 for passenger flow statistics provided herein can monitor a plurality of target areas 202 in the measured area 003 through a plurality of perception systems 200 in the measured area 003. Each perception system 200 monitors the passenger flow in the corresponding target area 202 through at least one perception sensor 220, performs living body detection on the data of each sensor through a preset recognition model, and generates a recognition result to determine whether a living user exists in the detection range corresponding to each perception sensor 220. The control terminal 300 calculates the passenger flow volume in the target areas according to the recognition results. By simultaneously monitoring the passenger flow in the plurality of target areas 202, the system 001 and the methods P100 and P200 obtain an absolute value of the passenger flow in the measured area 003, rather than measuring relative values of the passenger flow and accumulating them to obtain the total passenger flow, thereby avoiding error accumulation and improving the accuracy of the passenger flow statistics. Meanwhile, the system 001 and the methods P100 and P200 may perform living body detection on the sensor data of each perception sensor 220 to confirm that the target object detected by the perception sensor 220 is a living user, so as to prevent non-living objects from being counted in the passenger flow and to further improve the accuracy of the passenger flow statistics.
Another aspect of the present disclosure provides a non-transitory storage medium storing at least one set of executable instructions for passenger flow statistics which, when executed by a processor, direct the processor 280 to perform the steps of the method P100 for passenger flow statistics described herein. In some possible implementations, various aspects of the description may also be implemented in the form of a program product including program code. The program code is adapted to cause the processor 280 to perform the steps of passenger flow statistics described herein when the program product is run on the perception system 200. A program product for implementing the above method may employ a portable compact disc read-only memory (CD-ROM) including program code and may be run on the perception system 200. However, the program product of the present specification is not so limited; in this specification, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system (e.g., the processor 280). The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing. Program code for carrying out operations of this specification may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the perception system 200, partly on the perception system 200, as a stand-alone software package, partly on the perception system 200 and partly on a remote computing device, or entirely on the remote computing device.
Another aspect of the present disclosure provides a non-transitory storage medium storing at least one set of executable instructions for passenger flow statistics which, when executed by a processor, direct the processor 320 to perform the steps of the method P200 for passenger flow statistics described herein. In some possible implementations, various aspects of the description may also be implemented in the form of a program product including program code. The program code is adapted to cause the processor 320 to perform the steps of passenger flow statistics described herein when the program product is run on the control terminal 300. A program product for implementing the above method may employ a portable compact disc read-only memory (CD-ROM) including program code and may be run on the control terminal 300. However, the program product of the present specification is not so limited; in this specification, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system (e.g., the processor 320). The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing. Program code for carrying out operations of this specification may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the control terminal 300, partly on the control terminal 300, as a stand-alone software package, partly on the control terminal 300 and partly on a remote computing device, or entirely on the remote computing device.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
In conclusion, upon reading this detailed disclosure, those skilled in the art will appreciate that the foregoing detailed disclosure is presented by way of example only and is not limiting. Those skilled in the art will appreciate that the present specification contemplates various reasonable variations, enhancements, and modifications to the embodiments, even though not explicitly described herein. Such alterations, improvements, and modifications are intended to be suggested by this specification and are within the spirit and scope of the exemplary embodiments of this specification.
Furthermore, certain terminology has been used in this specification to describe embodiments of the specification. For example, "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined as suitable in one or more embodiments of the specification.
It should be appreciated that in the foregoing description of embodiments of the specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the specification and aiding in the understanding of the various embodiments. This is not to be interpreted, however, as requiring that all of the features be used in combination, and it is entirely possible for those skilled in the art, upon reading this specification, to extract some of the features as separate embodiments. That is, embodiments in this specification may also be understood as an integration of a plurality of sub-embodiments, and each sub-embodiment remains valid with less than all features of a single foregoing disclosed embodiment.
Each patent, patent application, publication of a patent application, and other material, such as articles, books, specifications, publications, and documents, cited herein is hereby incorporated by reference in its entirety, except for any prosecution file history associated with the same, any of the same that is inconsistent with or in conflict with this document, or any of the same that may have a limiting effect on the broadest scope of the claims now or later associated with this document. For example, if there is any inconsistency or conflict between the description, definition, and/or use of a term associated with any of the incorporated material and that associated with this document, the description, definition, and/or use of the term in this document shall prevail.
Finally, it should be understood that the embodiments disclosed herein are illustrative of the principles of the embodiments of the present specification. Other modified embodiments are also within the scope of this specification. Accordingly, the disclosed embodiments are to be considered in all respects as illustrative and not restrictive. Those skilled in the art may implement the application in this specification in alternative configurations according to the embodiments herein. Therefore, the embodiments of the present specification are not limited to the embodiments precisely described in the application.

Claims (23)

1. A system of passenger flow statistics, comprising:
a plurality of perception systems distributed over a plurality of target areas within an area under test, each perception system of the plurality of perception systems comprising:
at least one perception sensor, wherein each perception sensor of the at least one perception sensor, during operation, monitors whether a target object exists in a corresponding monitoring range and generates sensor data, the target object comprising a living user and a non-living user; and
at least one processor that, during operation, receives the sensor data sent by each perception sensor, performs living body detection on the sensor data based on a preset recognition model, and generates a recognition result to determine whether the target object is the living user; and
a control terminal that, during operation, is in communication connection with each perception system, acquires the recognition result of each perception sensor in each perception system, and determines a total number of users within the area under test based on the recognition result.
2. The system of passenger flow statistics of claim 1, wherein the plurality of target areas at least partially cover the area under test.
3. The system of passenger flow statistics of claim 1, wherein the area under test is a dining area of a restaurant, and the monitoring range for each perception sensor includes dining chairs.
4. The system of passenger flow statistics of claim 1, wherein the recognition model is trained based on sample data corresponding to the living user and sample data corresponding to the non-living user, the sample data comprising a mapping of signal frequencies and signal amplitudes in the sensor data.
5. The system of passenger flow statistics of claim 4, wherein the performing living body detection on the sensor data based on the preset recognition model comprises:
determining a mapping relation between the signal frequency and the signal amplitude of the sensor data in a preset time window; and
performing living body detection on the mapping relation based on the recognition model, and determining whether the perception sensor corresponding to the sensor data perceives the living user.
6. The system of passenger flow statistics of claim 5, wherein the preset time window comprises a preset duration before the current time.
7. The system of passenger flow statistics of claim 4, wherein a maximum of one of the living users is within a monitoring range of each of the perception sensors.
8. The system of passenger flow statistics of claim 1, wherein the determining a total number of users within the area under test comprises:
determining a number of living users in each of the plurality of target areas based on the recognition result of each perception sensor; and
determining the total number of users based on the number of living users in each of the target areas.
9. The system of passenger flow statistics of claim 1, wherein each perception system further comprises:
a wireless communication module that, during operation, establishes the communication connection with the control terminal, the communication connection comprising a wireless communication connection.
10. The system of passenger flow statistics of claim 1, wherein each perception system further comprises:
a power supply module electrically connected to the at least one perception sensor and the at least one processor during operation.
11. The system of passenger flow statistics of claim 1, wherein said obtaining said recognition result of said each perception sensor in said each perception system comprises:
acquiring the recognition result of each perception sensor based on a preset time period.
12. The system of passenger flow statistics of claim 1, wherein the at least one perception sensor comprises at least one of at least one radar sensor, at least one infrared sensor, at least one pressure sensor, at least one temperature sensor, at least one vibration sensor, and at least one electric field sensor.
13. A method of passenger flow statistics, applied to the system of passenger flow statistics recited in claim 1, comprising executing, by the at least one processor:
obtaining the sensor data for each of the perception sensors in the corresponding perception system;
performing living body detection on the sensor data of each perception sensor based on the recognition model, and generating the recognition result of each perception sensor; and
sending the recognition result of each perception sensor to the control terminal.
14. The method of passenger flow statistics of claim 13, wherein the plurality of target areas at least partially cover the area under test.
15. The method of passenger flow statistics of claim 13, wherein the area under test is a dining area of a restaurant, and the monitoring range for each perception sensor includes dining chairs.
16. The method of passenger flow statistics according to claim 13, wherein the recognition model is obtained by training based on sample data corresponding to the living user and sample data corresponding to the non-living user, the sample data including a mapping relationship between signal frequency and signal amplitude in the sensor data.
17. The method of passenger flow statistics of claim 16, wherein the performing living body detection on the sensor data of each perception sensor based on the recognition model comprises:
determining a mapping relation between the signal frequency and the signal amplitude of the sensor data in a preset time window; and
performing living body detection on the mapping relation based on the recognition model, and determining whether the perception sensor corresponding to the sensor data perceives the living user.
18. The method of passenger flow statistics of claim 17, wherein the preset time window comprises a preset duration before the current time.
19. The method of passenger flow statistics of claim 16, wherein there is at most one of the living users within the monitoring range of each of the perception sensors.
20. The method of passenger flow statistics according to claim 13, wherein said sending said recognition result of each said perception sensor to said control terminal comprises:
sending the recognition result of each perception sensor to the control terminal based on a preset time period.
21. A method of passenger flow statistics, applied to the system of passenger flow statistics of claim 1, comprising executing, by the control terminal:
acquiring the recognition result of each perception sensor in each perception system; and
determining the total number of users within the area under test based on the recognition result.
22. The method of passenger flow statistics of claim 21, wherein the determining a total number of users within the area under test comprises:
determining a number of living users in each of the plurality of target areas based on the recognition result of each perception sensor; and
determining the total number of users based on the number of living users in each of the target areas.
23. The method of passenger flow statistics of claim 21, wherein said obtaining said recognition result for said each perception sensor in said each perception system comprises:
acquiring the recognition result of each perception sensor based on a preset time period.
CN202110085015.2A 2021-01-22 2021-01-22 System and method for passenger flow statistics Active CN112785759B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110085015.2A CN112785759B (en) 2021-01-22 2021-01-22 System and method for passenger flow statistics

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110085015.2A CN112785759B (en) 2021-01-22 2021-01-22 System and method for passenger flow statistics

Publications (2)

Publication Number Publication Date
CN112785759A true CN112785759A (en) 2021-05-11
CN112785759B CN112785759B (en) 2023-05-23

Family

ID=75758434

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110085015.2A Active CN112785759B (en) 2021-01-22 2021-01-22 System and method for passenger flow statistics

Country Status (1)

Country Link
CN (1) CN112785759B (en)

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6345839B1 (en) * 1997-01-13 2002-02-12 Furukawa Electronics Co., Ltd. Seat fitted with seating sensor, seating detector and air bag device
US20050008197A1 (en) * 2002-04-12 2005-01-13 Stmicroelectronics Ltd. Biometric sensor apparatus and methods
US20100106450A1 (en) * 2008-10-27 2010-04-29 Electronics And Telecommunications Research Institute Method and apparatus for sensing meal activity using pressure sensor
CN101706976A (en) * 2009-08-26 2010-05-12 深圳市飞瑞斯科技有限公司 Anti-trailing system and device based on number of video viewers
CN102054304A (en) * 2010-12-10 2011-05-11 深圳职业技术学院 Passenger flow trends acquiring device and system based on multisensor
US20130015946A1 (en) * 2011-07-12 2013-01-17 Microsoft Corporation Using facial data for device authentication or subject identification
CN104063933A (en) * 2014-06-30 2014-09-24 浙江大学 Canteen people counting system based on pressure induction floor tiles
CN105243418A (en) * 2015-11-18 2016-01-13 桂林电子科技大学 Pedestrian flow counting method and device based on infrared temperature measurement sensor
CN106805941A (en) * 2015-12-02 2017-06-09 许亚夫 A kind of continuous wave bioradar sign detection means
CN107348749A (en) * 2017-08-25 2017-11-17 深圳三七九美发生活有限公司 A kind of detection method of time counting cushion and its chair and shops's customer's stay time
CN107872776A (en) * 2017-12-04 2018-04-03 泰康保险集团股份有限公司 For the method, apparatus of Indoor Video, electronic equipment and storage medium
CN108876504A (en) * 2017-09-12 2018-11-23 北京旷视科技有限公司 A kind of unmanned selling system and its control method
CN109044298A (en) * 2018-09-12 2018-12-21 金陵科技学院 It is a kind of can long-range monitoring human vital sign unmanned plane device
US20190128552A1 (en) * 2017-11-01 2019-05-02 Carrier Corporation Biosome counting and device controlling for a predetermined space region
CN110012495A (en) * 2018-11-09 2019-07-12 阿里巴巴集团控股有限公司 A kind of demographic method, device and computer equipment
CN110045370A (en) * 2019-05-10 2019-07-23 成都宋元科技有限公司 Human perception method and its system based on millimetre-wave radar
CN110992678A (en) * 2019-12-23 2020-04-10 宁波市数极信息技术有限公司 Bus passenger flow statistical method based on big data face recognition
CN111553753A (en) * 2020-07-10 2020-08-18 支付宝(杭州)信息技术有限公司 Passenger flow statistical method and device and electronic equipment
CN111985297A (en) * 2020-06-16 2020-11-24 深圳数联天下智能科技有限公司 Human body existence detection method and device, storage medium and computer equipment

Patent Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6345839B1 (en) * 1997-01-13 2002-02-12 Furukawa Electronics Co., Ltd. Seat fitted with seating sensor, seating detector and air bag device
US20050008197A1 (en) * 2002-04-12 2005-01-13 Stmicroelectronics Ltd. Biometric sensor apparatus and methods
US20100106450A1 (en) * 2008-10-27 2010-04-29 Electronics And Telecommunications Research Institute Method and apparatus for sensing meal activity using pressure sensor
CN101706976A (en) * 2009-08-26 2010-05-12 深圳市飞瑞斯科技有限公司 Anti-trailing system and device based on number of video viewers
CN102054304A (en) * 2010-12-10 2011-05-11 深圳职业技术学院 Passenger flow trends acquiring device and system based on multisensor
US20130015946A1 (en) * 2011-07-12 2013-01-17 Microsoft Corporation Using facial data for device authentication or subject identification
CN104063933A (en) * 2014-06-30 2014-09-24 浙江大学 Canteen people counting system based on pressure induction floor tiles
CN105243418A (en) * 2015-11-18 2016-01-13 桂林电子科技大学 Pedestrian flow counting method and device based on infrared temperature measurement sensor
CN106805941A (en) * 2015-12-02 2017-06-09 许亚夫 A kind of continuous wave bioradar sign detection means
CN107348749A (en) * 2017-08-25 2017-11-17 深圳三七九美发生活有限公司 A kind of detection method of time counting cushion and its chair and shops's customer's stay time
CN108876504A (en) * 2017-09-12 2018-11-23 北京旷视科技有限公司 A kind of unmanned selling system and its control method
US20190128552A1 (en) * 2017-11-01 2019-05-02 Carrier Corporation Biosome counting and device controlling for a predetermined space region
CN107872776A (en) * 2017-12-04 2018-04-03 泰康保险集团股份有限公司 For the method, apparatus of Indoor Video, electronic equipment and storage medium
CN109044298A (en) * 2018-09-12 2018-12-21 金陵科技学院 It is a kind of can long-range monitoring human vital sign unmanned plane device
CN110012495A (en) * 2018-11-09 2019-07-12 阿里巴巴集团控股有限公司 A kind of demographic method, device and computer equipment
CN110045370A (en) * 2019-05-10 2019-07-23 成都宋元科技有限公司 Human perception method and its system based on millimetre-wave radar
CN110992678A (en) * 2019-12-23 2020-04-10 宁波市数极信息技术有限公司 Bus passenger flow statistical method based on big data face recognition
CN111985297A (en) * 2020-06-16 2020-11-24 深圳数联天下智能科技有限公司 Human body existence detection method and device, storage medium and computer equipment
CN111553753A (en) * 2020-07-10 2020-08-18 支付宝(杭州)信息技术有限公司 Passenger flow statistical method and device and electronic equipment
CN112232882A (en) * 2020-07-10 2021-01-15 支付宝(杭州)信息技术有限公司 Passenger flow statistical method and device and electronic equipment

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114783088A (en) * 2022-04-20 2022-07-22 杭州天迈网络有限公司 Global travel industry data monitoring method

Also Published As

Publication number Publication date
CN112785759B (en) 2023-05-23

Similar Documents

Publication Publication Date Title
JP6962413B2 (en) Guidance system, guidance method and program
US11531082B2 (en) Device location network
CN103109244A (en) Method and apparatus for object tracking and recognition
US20210373919A1 (en) Dynamic user interface
US10943153B2 (en) Ultrasound analytics for actionable information
US11676360B2 (en) Assisted creation of video rules via scene analysis
Komai et al. Elderly person monitoring in day care center using Bluetooth Low Energy
US10205891B2 (en) Method and system for detecting occupancy in a space
US20220101709A1 (en) INDOOR OCCUPANCY ESTIMATION, TRAJECTORY TRACKING and EVENT MONITORING AND TRACKING SYSTEM
US11016129B1 (en) Voltage event tracking and classification
CN112785759B (en) System and method for passenger flow statistics
US20220319172A1 (en) Retroactive event detection in video surveillance
US20240005648A1 (en) Selective knowledge distillation
US20230111865A1 (en) Spatial motion attention for intelligent video analytics
US20180365741A1 (en) Method and apparatus for collecting voc
Feagin Jr et al. A Review of Existing Test Methods for Occupancy Sensors
CN112508618A (en) System and method for passenger flow statistics
US20220101066A1 (en) Reducing false detections for night vision cameras
CN109328308A (en) Position measurement system, mobile communication terminal and program
US20230360430A1 (en) Face detector using positional prior filtering
US20220230410A1 (en) Object localization in video
US20230351634A1 (en) Loop closure using multi-modal sensor data
US20240046485A1 (en) Real-motion prediction
CN115151939A (en) Information processing system and information processing method
KR20150071475A (en) Apparatus and method of moving direction decision using intergrated sensors

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant