Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort fall within the scope of protection of the present invention.
As shown in fig. 1, an embodiment of the present invention provides a method for determining a probability that a pedestrian enters or leaves at a pixel point, where the method may include the following steps:
Step 101: several frames of the video stream are acquired.
The video stream may be captured by an RGB camera, an RGBD camera, or the like. In a practical application scenario, different frames may be taken from the video stream according to a preset period, for example, one frame is taken every 20 ms.
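As an illustrative sketch only (not part of the claimed embodiment), the periodic sampling described above might be implemented as follows; the helper name and the 30 fps / 200 ms figures are assumptions for the example:

```python
def sample_frames(frames, fps, period_ms):
    """Keep one frame per `period_ms` milliseconds of a stream decoded at `fps`."""
    step = max(1, round(fps * period_ms / 1000.0))
    return frames[::step]

# A 30 fps stream sampled every 200 ms keeps every 6th frame.
sampled = sample_frames(list(range(60)), fps=30, period_ms=200)
```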
Step 102: characteristic information of pedestrians in a plurality of frames is determined.
In the embodiment of the invention, pedestrians in each frame are identified through a target detection method in deep learning, and the pixel positions of the pedestrians are determined. The deep learning method includes, but is not limited to, the Fast R-CNN algorithm, the YOLO (You Only Look Once) algorithm, etc. In the target detection process, the pedestrian may be positioned at multiple points, and the pixel position of the pedestrian is determined according to the multi-point positions. The pixel position may be characterized by pixel coordinates or other identifications of the pixels. In addition, the number of pixel points included in each frame is the same.
As shown in fig. 2, the frame includes 9 pixel points, and the pedestrian is located at the 5th pixel point.
And determining the appearance characteristics of pedestrians in each frame according to the images of the pedestrians in each frame. Appearance characteristics may include the positions of facial features, contours, etc.
Step 103: and determining the pedestrian walking route and the initial probability of the pedestrian entering or leaving at the pixel point according to the characteristic information.
The method covers two cases: first, pedestrians entering, with a corresponding pedestrian-entry probability map; second, pedestrians leaving, with a corresponding pedestrian-exit probability map. The pixel points where pedestrians enter or leave correspond to entrances and exits in practical application scenarios, such as elevator entrances, shopping mall entrances, etc.
The characteristic information includes: appearance characteristics and pixel location. Based on this, step 103 specifically includes:
A1: Pedestrians in a plurality of frames are distinguished according to the appearance characteristics and the pixel positions.
In an actual application scenario, pedestrians in a plurality of frames may also be distinguished according to appearance characteristics alone.
In step 102, although pedestrians in each frame can be detected by the target detection method, it cannot be determined whether pedestrians in different frames are the same person.
A1 specifically comprises:
A11: And matching the appearance characteristics and the pixel positions of the first pedestrian in the target frame with the appearance characteristics and the pixel positions of all pedestrians in other frames respectively, and determining the similarity of the first pedestrian and all pedestrians in other frames.
When there are multiple other frames, the target frame is matched with the other frames one by one. That is, if the target frame is T1 and the other frames include T2, T3, and T4, each pedestrian in T1 is compared with each pedestrian in T2, then each pedestrian in T3, and finally each pedestrian in T4. Only the matching of two frames will be described below as an example.
For example, pedestrians a and b are present in the target frame, and pedestrians e, f, and g are present in the other frame. In distinguishing pedestrians in the target frame from those in the other frame, the appearance characteristics and pixel position of a are matched with those of e, f, and g respectively, and then the appearance characteristics and pixel position of b are matched with those of e, f, and g respectively. For example, the appearance characteristics and pixel position of a are matched with those of e, and the similarity between the two is determined. In an actual application scenario, feature vectors may be generated according to the appearance characteristics and pixel positions, so that the similarity between the feature vectors corresponding to two pedestrians is determined.
A12: when a second pedestrian exists in each pedestrian in other frames, determining that the second pedestrian and the first pedestrian are the same pedestrian; the similarity between the second pedestrian and the first pedestrian is larger than a preset similarity threshold.
A2: and fitting to obtain a pedestrian walking route according to the pixel positions of each pedestrian in each frame.
A2 specifically comprises:
A21: And counting the number of pedestrians appearing at each pixel point according to the pixel positions of each pedestrian in each frame.
A22: and determining the walking direction of the pedestrians appearing at each pixel point according to the pixel positions of each pedestrian in each frame.
The walking direction may be different in different application scenarios, e.g. in one scenario the walking direction comprises four directions and in another scenario the walking direction comprises eight directions.
As shown in fig. 3, the walking direction of the pedestrian includes eight possibilities based on the arrangement of the pixels.
A23: and fitting according to the number and the walking direction of pedestrians appearing at each pixel point to obtain a pedestrian walking route.
A23 specifically includes:
A231: And determining the pedestrian-count ratio of each walking direction at the target pixel point according to the number and walking directions of pedestrians appearing at the target pixel point.
For example, 10 pedestrians appear at the pixel point M, of which 5 walk eastward, so the pedestrian-count ratio of the eastward direction is 1/2.
A232: and determining the fitting direction of the target pixel point according to the pedestrian quantity ratio of each walking direction of the target pixel point.
The walking direction with the largest pedestrian-count ratio may be taken as the fitting direction.
A233: fitting is carried out according to the fitting direction of each pixel point, and a pedestrian walking route is obtained.
In an actual application scenario, in order to obtain a plurality of pedestrian walking routes, fitting may be performed according to the ratio levels of the pedestrian counts to obtain different pedestrian walking routes.
In this case, the ratio level of each walking direction at the current pixel point is determined according to the pedestrian counts, and fitting is performed according to the walking directions of the pixel points having the same ratio level, so as to obtain the pedestrian walking routes corresponding to different ratio levels.
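Steps A231 and A232 can be sketched as follows; the direction labels and the helper name are illustrative assumptions:

```python
from collections import Counter

def fitting_direction(directions):
    """Compute the count ratio of each walking direction observed at a
    pixel point and return the majority direction with its ratio
    (steps A231-A232)."""
    counts = Counter(directions)
    direction, count = counts.most_common(1)[0]
    return direction, count / len(directions)

# 10 pedestrians at pixel point M, 5 of them heading east -> ratio 1/2.
direction, ratio = fitting_direction(["E"] * 5 + ["N"] * 3 + ["S"] * 2)
```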
A3: and determining the initial probability of the pedestrian entering or leaving at each pixel point according to the pixel position of each pedestrian in each frame.
A3 specifically comprises:
A31: And determining the pixel points where each pedestrian enters or leaves according to the pixel positions of each pedestrian in each frame.
A31 specifically comprises:
A311: When a third pedestrian appears in the first frame and the third pedestrian does not appear in a first number of frames after the first frame, determining that the pixel point corresponding to the pixel position of the third pedestrian in the first frame is the pixel point from which the third pedestrian leaves.
A312: And when the third pedestrian appears in the second frame and the third pedestrian does not appear in a second number of frames before the second frame, determining that the pixel point corresponding to the pixel position of the third pedestrian in the second frame is the pixel point at which the third pedestrian enters.
The first number and the second number are related to the actual application scenario, and may each be one or more.
For example, when the first number and the second number are both 1, the first frame and the second frame are two consecutive frames. Referring to fig. 2, if the pedestrian W appears at the 1st pixel point in the first frame and does not appear in the second frame, the pedestrian W leaves at the 1st pixel point.
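A simplified sketch of steps A311 and A312, under the assumption that each pedestrian's track records every frame in which that pedestrian appears (so the first and last recorded frames bound the entry and exit events):

```python
def entry_exit_pixels(tracks):
    """For each pedestrian, return the pixel point of entry (pixel at the
    first frame in which the pedestrian appears) and of exit (pixel at
    the last frame), per steps A311-A312 with the first/second number
    taken large enough to cover the whole track."""
    entries, exits = {}, {}
    for pid, track in tracks.items():   # track: {frame_index: pixel_point}
        frames = sorted(track)
        entries[pid] = track[frames[0]]
        exits[pid] = track[frames[-1]]
    return entries, exits

# Pedestrian W enters at pixel 1 (frame 0) and leaves at pixel 7 (frame 2).
entries, exits = entry_exit_pixels({"W": {0: 1, 1: 4, 2: 7}})
```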
A32: and determining the number of pedestrians entering or leaving at each pixel point according to the pixel points where each pedestrian enters or leaves.
Referring to fig. 2, the number of pedestrians entering at the pixel points 1 to 9 is 0, 1, 0, 3, 2, 6, 3, 9, 2, respectively.
A33: and determining the initial probability of the pedestrian entering or leaving at each pixel point according to the number of the pedestrians entering or leaving at each pixel point.
A33 specifically includes:
A331: And determining the adjustment coefficients corresponding to the number of pedestrians entering or leaving at each pixel point according to the corresponding relation between the preset number interval and the adjustment coefficients.
For example, the adjustment coefficient corresponding to the number interval [0, 5] is 0.8, and the adjustment coefficient corresponding to the number interval [6, 10] is 1.1.
By means of the adjustment coefficients, the weight of pixel points where more pedestrians enter or leave can be increased, and the influence of occlusion can be reduced.
A332: and determining the total number of the pedestrians entering or leaving according to the number of the pedestrians entering or leaving at each pixel point and the corresponding adjustment coefficients thereof.
Total number of pedestrians entering = (1+3+2+3+2)×0.8 + (6+9)×1.1 = 25.3.
A333: and determining the initial probability of the entering or leaving of the pedestrians at each pixel point according to the number of the entering or leaving pedestrians at each pixel point and the total number of the entering or leaving pedestrians.
Probability of a pedestrian entering at the 4th pixel point = 3/25.3.
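The worked example of steps A331 to A333 can be reproduced as follows; the interval boundaries and coefficients are those given above:

```python
def adjustment_coefficient(n):
    """Preset correspondence between count intervals and coefficients:
    [0, 5] -> 0.8, [6, 10] -> 1.1 (the values from the example)."""
    return 0.8 if n <= 5 else 1.1

def initial_probabilities(counts):
    """Steps A331-A333: weight each pixel point's count by its coefficient
    to obtain the total, then divide each raw count by that total."""
    total = sum(n * adjustment_coefficient(n) for n in counts)
    return [n / total for n in counts], total

# Entry counts at pixel points 1..9 from the fig. 2 example.
probs, total = initial_probabilities([0, 1, 0, 3, 2, 6, 3, 9, 2])
# total = (1+3+2+3+2)*0.8 + (6+9)*1.1 = 25.3; the 4th pixel point gets 3/25.3
```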
Step 104: and correcting the initial probability of the pedestrian entering or leaving at the pixel point according to the pedestrian walking route to obtain the final probability of the pedestrian entering or leaving at the pixel point.
Because the camera is fixed while capturing the video stream, pedestrians may occlude one another when their number is large, causing errors in the statistics. Following the above example, the pedestrian W may not actually leave at the 1st pixel point, but may instead be occluded by a pedestrian Q appearing in other frames; since the pedestrian W is occluded, only the pedestrian Q is recognized during target detection, so the subsequent statistical result is incorrect. For example, the probability that a pedestrian leaves at the 1st pixel point is then in error.
In view of this, the present application corrects the initial probability of a pedestrian entering or exiting at a pixel point by a pedestrian walking route.
Step 104 specifically includes:
B1: And determining the pixel points to be corrected according to the pedestrian walking route.
Taking one pedestrian walking route as an example, a correction area is determined according to the end point of the pedestrian walking route, and the pixel points within the correction area are the pixel points to be corrected. The correction area may be a circle passing through the end point of the pedestrian walking route, or another geometric shape passing through the end point, without limitation.
In an actual application scene, the pixel to be corrected may also be determined in other manners, for example, the pixel to be corrected is a pixel located on a pedestrian walking path and having a pedestrian entering or exiting.
B2: and correcting the initial probability of the pedestrian entering or leaving at the pixel point to be corrected according to the preset correction amplitude to obtain the final probability of the pedestrian entering or leaving at the pixel point.
Different pedestrian walking routes may correspond to different correction amplitudes.
Through the correction amplitude, the final probability of the pedestrian entering or leaving at the pixel point to be corrected is made larger than the initial probability of the pedestrian entering or leaving at that pixel point.
Referring to the explanation in A233, when there are a plurality of pedestrian walking routes, the correction amplitude may be set corresponding to the ratio level (or the pedestrian-count ratio), with different correction amplitudes for the pixels to be corrected corresponding to different ratio levels (or pedestrian-count ratios). For example, a larger pedestrian-count ratio corresponds to a larger correction amplitude than a smaller one.
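Step B2 can be sketched as follows. The amplitude 1.2 is an assumed preset value; the embodiment requires only that the correction make the final probability exceed the initial one:

```python
def correct_probabilities(initial_probs, pixels_to_correct, amplitude=1.2):
    """Raise the initial probability at each pixel point to be corrected
    (those determined from the pedestrian walking route in step B1);
    other pixel points keep their initial probability."""
    return [p * amplitude if i in pixels_to_correct else p
            for i, p in enumerate(initial_probs)]

final_probs = correct_probabilities([0.1, 0.2, 0.05], pixels_to_correct={0})
```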
The method corrects the initial probability of a pedestrian entering or leaving at a pixel point by utilizing the pedestrian walking route, and can reduce the influence of erroneous pixel-point statistics caused by occlusion and crossing of pedestrians.
As shown in fig. 4, an embodiment of the present invention provides a pedestrian number counting method, including:
step 401: several frames of the video stream are acquired.
Step 402: characteristic information of pedestrians in a plurality of frames is determined.
Step 403: and determining the pedestrian walking route and the initial probability of the pedestrian entering or leaving at the pixel point according to the characteristic information.
Step 404: and correcting the initial probability of the pedestrian entering or leaving at the pixel point according to the pedestrian walking route to obtain the final probability of the pedestrian entering or leaving at the pixel point.
The contents of steps 401 to 404 have been described in the foregoing embodiments, and are not described herein.
Step 405: and counting the number of pedestrians entering or leaving in a monitoring area preset in a future time period according to the final probability of the pedestrians entering or leaving in the pixel points.
It should be noted that the future time period is relative to the time corresponding to the acquired frames; for example, if the time period corresponding to the acquired frames is 8:00 to 9:00 yesterday, the future time period may be 8:00 to 9:00 today.
Step 405 specifically includes:
C1: When it is detected that a current pedestrian enters or leaves at a current pixel point in the future time period, determining the final probability of the pedestrian entering or leaving at the current pixel point according to the final probability of the pedestrian entering or leaving at the pixel point.
The method for detecting whether the current pedestrian enters or leaves at the current pixel point has been described in the foregoing embodiment; refer to A31.
C2: when the final probability of entering or leaving the pedestrian at the current pixel point and the random number generated by the preset random function meet the preset increasing condition, adding 1 to the number of the entering or leaving pedestrians in the monitoring area.
For example, the final probability of the pedestrian entering or leaving at the current pixel point is 0.8 and the random number generated by the random function is 0.7; if the increasing condition is that the final probability of the pedestrian entering or leaving at the current pixel point is larger than the random number generated by the random function, the number of pedestrians entering or leaving the monitoring area is increased by 1.
Of course, the final probability of the pedestrian entering or leaving at the current pixel point may also be used directly as the pedestrian count, i.e., the final probability of the pedestrian entering or leaving at the current pixel point is added to the number of pedestrians entering or leaving the monitoring area.
Or judging whether the final probability of entering or leaving the pedestrian at the current pixel point is larger than a preset probability threshold value, and adding 1 to the number of the entering or leaving pedestrians in the monitoring area when the final probability of entering or leaving the pedestrian at the current pixel point is larger than the probability threshold value.
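The random-number counting of step C2 can be sketched as follows; the seeded generator and the helper name are illustrative choices:

```python
import random

def count_entering(final_probs, detected_entry_pixels, seed=0):
    """Step C2: for each detected entry event, add 1 when the final
    probability at that pixel point exceeds a freshly drawn random
    number in [0, 1)."""
    rng = random.Random(seed)
    count = 0
    for pixel in detected_entry_pixels:
        if final_probs[pixel] > rng.random():
            count += 1
    return count

# With probability 0.8 at the pixel point, roughly 80% of events count.
n = count_entering([0.8], [0] * 1000)
```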
The method corrects the initial probability of a pedestrian entering or leaving at a pixel point by utilizing the pedestrian walking route, and can reduce the influence of erroneous pixel-point statistics caused by occlusion and crossing of pedestrians. Counting the number of pedestrians according to the corrected final probability of pedestrians entering or leaving at the pixel points can improve the reliability of the statistical result and reduce labor cost.
The embodiment of the invention provides a pedestrian number counting method, which comprises the following steps of:
s1: several frames of the video stream are acquired.
S2: determining characteristic information of pedestrians in a plurality of frames, wherein the characteristic information comprises: appearance characteristics and pixel location.
S3: and matching the appearance characteristics and the pixel positions of the first pedestrian in the target frame with the appearance characteristics and the pixel positions of all pedestrians in other frames respectively, and determining the similarity of the first pedestrian and all pedestrians in other frames.
S4: when a second pedestrian exists in each pedestrian in other frames, determining that the second pedestrian and the first pedestrian are the same pedestrian; the similarity between the second pedestrian and the first pedestrian is larger than a preset similarity threshold.
S5: and counting the number of pedestrians appearing at each pixel point according to the pixel positions of each pedestrian in each frame.
S6: and determining the walking direction of the pedestrians appearing at each pixel point according to the pixel positions of each pedestrian in each frame.
S7: and determining the number ratio of pedestrians in each walking direction of the target pixel point according to the number of pedestrians and the walking directions of the target pixel point.
S8: and determining the fitting direction of the target pixel point according to the pedestrian quantity ratio of each walking direction of the target pixel point.
S9: fitting is carried out according to the fitting direction of each pixel point, and a pedestrian walking route is obtained.
S10: When a third pedestrian appears in the first frame and the third pedestrian does not appear in a first number of frames after the first frame, determining that the pixel point corresponding to the pixel position of the third pedestrian in the first frame is the pixel point from which the third pedestrian leaves.
S11: And when the third pedestrian appears in the second frame and the third pedestrian does not appear in a second number of frames before the second frame, determining that the pixel point corresponding to the pixel position of the third pedestrian in the second frame is the pixel point at which the third pedestrian enters.
S12: and determining the number of pedestrians entering or leaving at each pixel point according to the pixel points where each pedestrian enters or leaves.
S13: and determining the adjustment coefficients corresponding to the number of pedestrians entering or leaving at each pixel point according to the corresponding relation between the preset number interval and the adjustment coefficients.
S14: and determining the total number of the pedestrians entering or leaving according to the number of the pedestrians entering or leaving at each pixel point and the corresponding adjustment coefficients thereof.
S15: and determining the initial probability of the entering or leaving of the pedestrians at each pixel point according to the number of the entering or leaving pedestrians at each pixel point and the total number of the entering or leaving pedestrians.
S16: and determining the pixel points to be corrected according to the walking route of the pedestrian.
S17: and correcting the initial probability of the pedestrian entering or leaving at the pixel point to be corrected according to the preset correction amplitude to obtain the final probability of the pedestrian entering or leaving at the pixel point.
S18: when the current pedestrian enters or leaves at the current pixel point in the future time period, determining the final probability of the pedestrian entering or leaving at the current pixel point according to the final probability of the pedestrian entering or leaving at the pixel point.
S19: when the final probability of entering or leaving the pedestrian at the current pixel point and the random number generated by the preset random function meet the preset increasing condition, adding 1 to the number of the entering or leaving pedestrians in the monitoring area.
As shown in fig. 5, an embodiment of the present invention provides an apparatus for determining a probability of a pedestrian entering or exiting at a pixel point, including:
an acquisition unit 501, configured to acquire a number of frames of a video stream;
a determining unit 502 for determining characteristic information of pedestrians in a plurality of frames;
a pedestrian tracking unit 503, configured to determine a pedestrian walking route and an initial probability of a pedestrian entering or exiting at a pixel point according to the feature information;
and the correcting unit 504 is configured to correct the initial probability of the pedestrian entering or exiting at the pixel point according to the pedestrian walking route, so as to obtain the final probability of the pedestrian entering or exiting at the pixel point.
In one embodiment of the invention, the characteristic information includes: appearance features and pixel locations;
a pedestrian tracking unit 503 for distinguishing pedestrians in a plurality of frames according to the appearance characteristics and the pixel positions; fitting to obtain a pedestrian walking route according to the pixel positions of each pedestrian in each frame; and determining the initial probability of the pedestrian entering or leaving at each pixel point according to the pixel position of each pedestrian in each frame.
In one embodiment of the present invention, the pedestrian tracking unit 503 is configured to match the appearance feature and the pixel position of the first pedestrian in the target frame with the appearance feature and the pixel position of each pedestrian in the other frames, and determine the similarity between the first pedestrian and each pedestrian in the other frames; and when a second pedestrian exists among the pedestrians in the other frames, determine that the second pedestrian and the first pedestrian are the same pedestrian, the similarity between the second pedestrian and the first pedestrian being larger than a preset similarity threshold.
In one embodiment of the present invention, the pedestrian tracking unit 503 is configured to count the number of pedestrians occurring at each pixel point according to the pixel position of each pedestrian in each frame; determining the walking direction of pedestrians appearing at each pixel point according to the pixel position of each pedestrian in each frame; and fitting according to the number and the walking direction of pedestrians appearing at each pixel point to obtain a pedestrian walking route.
In one embodiment of the present invention, the pedestrian tracking unit 503 is configured to determine the pedestrian-count ratio of each walking direction at the target pixel point according to the number and walking directions of pedestrians occurring at the target pixel point; determine the fitting direction of the target pixel point according to the pedestrian-count ratio of each walking direction of the target pixel point; and perform fitting according to the fitting direction of each pixel point to obtain the pedestrian walking route.
In one embodiment of the present invention, the pedestrian tracking unit 503 is configured to determine, according to the pixel positions of the pedestrians in each frame, the pixel points where the pedestrians enter or leave; according to the pixel points where each pedestrian enters or leaves, determining the number of pedestrians entering or leaving at each pixel point; and determining the initial probability of the pedestrian entering or leaving at each pixel point according to the number of the pedestrians entering or leaving at each pixel point.
In one embodiment of the present invention, the pedestrian tracking unit 503 is configured to determine that the pixel point corresponding to the pixel position of a third pedestrian in the first frame is the pixel point from which the third pedestrian leaves when the third pedestrian appears in the first frame and the third pedestrian does not appear in a first number of frames after the first frame; and when the third pedestrian appears in the second frame and the third pedestrian does not appear in a second number of frames before the second frame, determine that the pixel point corresponding to the pixel position of the third pedestrian in the second frame is the pixel point at which the third pedestrian enters.
In one embodiment of the present invention, the pedestrian tracking unit 503 is configured to determine, according to a correspondence between a preset number interval and adjustment coefficients, adjustment coefficients corresponding to the number of pedestrians entering or leaving at each pixel point; determining the total number of pedestrians entering or leaving according to the number of pedestrians entering or leaving at each pixel point and the corresponding adjustment coefficients thereof; and determining the initial probability of the entering or leaving of the pedestrians at each pixel point according to the number of the entering or leaving pedestrians at each pixel point and the total number of the entering or leaving pedestrians.
In one embodiment of the present invention, the correction unit 504 is configured to determine a pixel to be corrected according to a walking path of a pedestrian; and correcting the initial probability of the pedestrian entering or leaving at the pixel point to be corrected according to the preset correction amplitude to obtain the final probability of the pedestrian entering or leaving at the pixel point.
As shown in fig. 6, an embodiment of the present invention provides a pedestrian number counting apparatus, including:
an acquisition unit 601, configured to acquire a plurality of frames of a video stream;
a determining unit 602, configured to determine characteristic information of pedestrians in a plurality of frames;
the pedestrian tracking unit 603 is configured to determine, according to the feature information, a pedestrian walking route and an initial probability that a pedestrian enters or leaves at a pixel point;
The correcting unit 604 is configured to correct an initial probability of the pedestrian entering or exiting at the pixel according to the pedestrian walking route, so as to obtain a final probability of the pedestrian entering or exiting at the pixel;
the statistics unit 605 is configured to count the number of pedestrians entering or leaving in a monitoring area preset in a future time period according to the final probability of the pedestrians entering or leaving in the pixel point.
In one embodiment of the present invention, the statistics unit 605 is configured to determine, when it is detected that a current pedestrian enters or leaves at a current pixel point in a future period of time, a final probability of the pedestrian entering or leaving at the current pixel point according to a final probability of the pedestrian entering or leaving at the pixel point; when the final probability of entering or leaving the pedestrian at the current pixel point and the random number generated by the preset random function meet the preset increasing condition, adding 1 to the number of the entering or leaving pedestrians in the monitoring area.
In the 1990s, an improvement to a technology could be clearly distinguished as an improvement in hardware (e.g., an improvement to a circuit structure such as a diode, transistor, or switch) or an improvement in software (an improvement to a method flow). However, with the development of technology, many improvements of method flows can now be regarded as direct improvements of hardware circuit structures. Designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module. For example, a programmable logic device (Programmable Logic Device, PLD) (e.g., a field programmable gate array (Field Programmable Gate Array, FPGA)) is an integrated circuit whose logic function is determined by the user's programming of the device. A designer programs to "integrate" a digital system onto a PLD, without requiring a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of manually manufacturing integrated circuit chips, such programming is nowadays mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development; the source code to be compiled must likewise be written in a specific programming language, called a hardware description language (Hardware Description Language, HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used.
It will also be apparent to those skilled in the art that a hardware circuit implementing the logic method flow can be readily obtained by merely slightly programming the method flow into an integrated circuit using several of the hardware description languages described above.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer readable medium storing computer readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a programmable logic controller, or an embedded microcontroller. Examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art will also appreciate that, in addition to implementing the controller purely in computer readable program code, it is entirely possible to achieve the same functionality by logically programming the method steps so that the controller takes the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers, etc. Such a controller may thus be regarded as a hardware component, and the means included therein for performing various functions may also be regarded as structures within the hardware component. Or even the means for achieving various functions may be regarded both as software modules implementing the method and as structures within the hardware component.
The system, apparatus, module, or unit set forth in the above embodiments may specifically be implemented by a computer chip or an entity, or by a product having a certain function. One typical implementation is a computer. Specifically, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being functionally divided into various units. Of course, when implementing the present application, the functions of the units may be implemented in one or more pieces of software and/or hardware.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or nonvolatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, Phase-change Memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
In this specification, the embodiments are described in a progressive manner; identical or similar parts of the embodiments may be referred to mutually, and each embodiment focuses on its differences from the other embodiments. In particular, the system embodiments are described relatively simply because they are substantially similar to the method embodiments; for relevant details, reference may be made to the corresponding description of the method embodiments.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and changes may be made to the present application by those skilled in the art. Any modifications, equivalent substitutions, improvements, etc. which are within the spirit and principles of the present application are intended to be included within the scope of the claims of the present application.