CN107610151B - Pedestrian trajectory processing method/system, computer-readable storage medium and device - Google Patents

Pedestrian trajectory processing method/system, computer-readable storage medium and device Download PDF

Info

Publication number
CN107610151B
CN107610151B CN201710734605.7A
Authority
CN
China
Prior art keywords
pedestrian
line
block
detection frame
trajectory
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710734605.7A
Other languages
Chinese (zh)
Other versions
CN107610151A (en)
Inventor
姚磊
王作辉
袁德胜
游浩泉
陈子健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Winner Technology Co ltd
Original Assignee
Winner Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Winner Technology Co ltd filed Critical Winner Technology Co ltd
Priority to CN201710734605.7A priority Critical patent/CN107610151B/en
Publication of CN107610151A publication Critical patent/CN107610151A/en
Application granted granted Critical
Publication of CN107610151B publication Critical patent/CN107610151B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a pedestrian trajectory line processing method/system, a computer-readable storage medium and a device, wherein the pedestrian trajectory line processing method comprises the following steps: acquiring image data to be processed; detecting the image data to be processed, extracting pedestrian detection frames from the image data to be processed, and generating scattered trajectory lines by using the pedestrian trajectory information contained in the pedestrian detection frames; performing block tracking on the pedestrian detection frames to generate tracking trajectory lines, the scattered trajectory lines being connected by the tracking trajectory lines to form initial pedestrian trajectory lines; and refining the initial pedestrian trajectory lines to obtain the real pedestrian trajectory lines in the image data to be processed. By processing the pedestrian movement trajectory lines of an entire time period in a single pass, the invention effectively improves the stability of the pedestrian movement trajectory lines, maintains high tracking accuracy even in scenes where a large number of pedestrians appear simultaneously, and obtains high-quality pedestrian movement trajectory lines.

Description

Pedestrian trajectory processing method/system, computer-readable storage medium and device
Technical Field
The invention belongs to the technical field of image processing, relates to a processing method and a processing system, and particularly relates to a pedestrian trajectory processing method/system, a computer-readable storage medium and equipment.
Background
In the O2O era, commercial passenger flow analysis is increasingly valued by enterprises. Shopping malls, brand chain stores, supermarkets and other retail establishments in direct contact with customers can provide a large amount of high-value business data.
The customer's behavioral trajectories provide factual basis for business data analysis and mining, such as what movement trajectories the customer has in the shopping mall, which shopping areas the customer is interested in, whether the promotional program increases customer traffic, and the like. Thorough understanding of customer behavior is a key to improving enterprise competitiveness and enhancing profitability.
The passenger flow counting system serves as the data provider for commercial passenger flow analysis, and one important evaluation index is the accuracy of its passenger flow data. Because actual application scenes are complex and changeable, the prior art cannot produce stable and accurate pedestrian movement trajectory lines, and the tracking accuracy of the passenger flow counting system is therefore low.
Therefore, the present invention provides a pedestrian trajectory processing method/system, a computer-readable storage medium and a device, so as to overcome the defects of the prior art, in which the pedestrian movement trajectory line cannot be obtained stably and accurately under complicated and variable application scenarios and the tracking accuracy of the passenger flow counting system is low; overcoming these defects has become a technical problem to be urgently solved by those skilled in the art.
Disclosure of Invention
In view of the above drawbacks of the prior art, an object of the present invention is to provide a method/system, a computer-readable storage medium, and a device for processing a pedestrian trajectory, which are used to solve the problems in the prior art that a pedestrian movement trajectory cannot be stably and accurately determined and the tracking accuracy of a passenger flow counting system is not high in the case that an actual application scenario is complicated and changeable.
To achieve the above and other related objects, an aspect of the present invention provides a pedestrian trajectory line processing method, including: acquiring image data to be processed; detecting the image data to be processed, extracting pedestrian detection frames from the image data to be processed, and generating scattered trajectory lines by using the pedestrian trajectory information contained in the pedestrian detection frames; performing block tracking on the pedestrian detection frames to generate tracking trajectory lines, the scattered trajectory lines being connected by the tracking trajectory lines to form initial pedestrian trajectory lines;
and refining the initial pedestrian trajectory line to obtain a real pedestrian trajectory line in the image data to be processed.
In an embodiment of the present invention, in the process of acquiring the image data to be processed, the pedestrian trajectory processing method includes:
acquiring image data from an image acquisition device within a preset acquisition time; and calculating a foreground image of the image data to acquire image data to be processed comprising the image data and the foreground image.
In an embodiment of the present invention, the step of detecting the image data to be processed and extracting the pedestrian detection frame from the image data to be processed includes: pedestrian detection based on the features of the directional gradient histograms is carried out on all image data in the image data to be processed so as to detect a pedestrian detection frame of the image data; and screening the pedestrian detection frames by using the foreground proportion of the foreground image so as to extract the pedestrian detection frames with the foreground proportion larger than a preset foreground proportion threshold value.
In an embodiment of the present invention, the step of performing pedestrian detection based on histogram of oriented gradient features on all image data in the image data to be processed includes: carrying out color space normalization processing on the image data to form preprocessed image data, the color space normalization processing comprising image graying and Gamma correction of the image data; acquiring the gradient and the gradient direction of the preprocessed image data; dividing the preprocessed image data into a plurality of image units and counting a gradient histogram for each image unit; forming image blocks from several image units, and connecting the feature vectors of all the image units in each image block in series to obtain the histogram of oriented gradients feature of the image block; combining all the histogram of oriented gradients features in the image data to form a feature vector representing the image data; and classifying the feature vector of the image data by using a support vector machine classifier to detect the pedestrian detection frame of the image data. The pedestrian detection frame comprises the size and the confidence of the currently detected pedestrian detection frame and/or the current detection moment within the preset acquisition time.
In an embodiment of the present invention, the step of generating the scattered trajectory lines by using the pedestrian trajectory information included in the pedestrian detection frames includes: connecting the pedestrian detection frames that conform to a connection rule together to form scattered trajectory lines. A scattered trajectory line contains pedestrian trajectory line information within the preset acquisition time, including: the position coordinates of the i-th scattered trajectory line at the current detection time, the confidence of the i-th scattered trajectory line, and the size of the currently detected pedestrian detection frame. The connection rule includes: the time difference between the pedestrian detection frame at the current moment and the pedestrian detection frame at the previous moment is within three frames; the change in the movement direction of the pedestrian detection frame between consecutive moments does not exceed a preset angle; and the confidence of the pedestrian detection frame at each moment is not lower than a confidence threshold.
In an embodiment of the present invention, the step of performing block tracking on the pedestrian detection frame to generate the tracking trajectory lines includes:
performing image segmentation on the pedestrian detection frame at the current moment according to the head, the body, the left arm, the right arm and the legs to form a head-shoulder block, a body block, a left arm block, a right arm block and a leg block, wherein the pedestrian detection frame at the current moment contains the tail end of a scattered trajectory line; tracking the head-shoulder block in a first tracking mode and the body block, the left arm block, the right arm block and the leg block in a second tracking mode, so as to acquire the positions of the update blocks of the head-shoulder block, the body block, the left arm block, the right arm block and the leg block in the pedestrian detection frame at the next moment; calculating the center of the pedestrian detection frame at the next moment according to the positions of the update blocks in the pedestrian detection frame at the next moment and the relative displacements between the centers of the head-shoulder block, the body block, the left arm block, the right arm block and the leg block and the center of the pedestrian detection frame at the current moment; calculating the offset of each update block from the center of the pedestrian detection frame at the next moment; if the relative displacements of the head-shoulder block, the body block, the left arm block, the right arm block and the leg block from the center of the pedestrian detection frame at the current moment and the offsets of the update blocks from the center of the pedestrian detection frame at the next moment satisfy a predetermined offset determination condition, correcting each update block to be the initial position of the next block tracking; executing the above steps in a loop and, after the updating of the pedestrian detection frame is finished, connecting the updated pedestrian detection frames to form a tracking trajectory line; searching among the updated pedestrian detection frames for the head end of another scattered trajectory line that matches the tail end of the tracking trajectory line; if a match is found, connecting the tracking trajectory line with that scattered trajectory line and continuing the block tracking from the tail end of that scattered trajectory line; otherwise, continuing to update the pedestrian detection frame.
In an embodiment of the present invention, the calculation formula for the center of the pedestrian detection frame at the next moment is: C = Σ(y_i + d_i) × w_i; wherein C represents the center of the pedestrian detection frame at the next moment; i represents the serial number of the head-shoulder block, the body block, the left arm block, the right arm block or the leg block; y_i represents the position of the update block of the corresponding block in the pedestrian detection frame at the next moment; d_i represents the relative displacement between the center of the corresponding block and the center of the pedestrian detection frame at the current moment; and w_i represents the weight of the corresponding block.
In an embodiment of the present invention, the predetermined offset determination condition requires that the offset z_i of each update block from the center of the pedestrian detection frame at the next moment deviates from the corresponding relative displacement d_i by no more than a preset threshold (the condition formula is given only as an image in the original); wherein z_i represents the offset of the update block of the head-shoulder block, the body block, the left arm block, the right arm block or the leg block from the center of the pedestrian detection frame at the next moment, and d_i represents the relative displacement between the center of the corresponding block and the center of the pedestrian detection frame at the current moment.
In an embodiment of the invention, the pedestrian trajectory processing method further includes marking the state of the scattered trajectory lines as completed when all the scattered trajectory lines are connected pairwise by the generated tracking trajectory lines or are each connected with a tracking trajectory line, and otherwise marking the state as uncompleted.
In an embodiment of the present invention, the step of refining the initial pedestrian trajectory line to obtain a real pedestrian trajectory line in the image data to be processed includes: smoothing the initial pedestrian trajectory line; cutting the smoothed initial pedestrian trajectory line; and reconnecting the cut initial pedestrian trajectory line according to a predetermined reconnection determination condition to form a real pedestrian trajectory line.
In an embodiment of the invention, the step of smoothing the initial pedestrian trajectory line includes: and calculating a spline curve by taking the coordinate points on the initial pedestrian trajectory line as control points, and enabling the spline curve to replace the initial pedestrian trajectory line.
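The smoothing step can be illustrated with a short spline sketch. The patent does not name the spline type, so a Catmull-Rom spline (which passes through its control points, matching the description of using the coordinate points as control points) is assumed here; the function name and sampling density are illustrative:

```python
def catmull_rom(points, samples_per_seg=8):
    """Smooth an initial trajectory line by a spline through its coordinate
    points (Catmull-Rom sketch; the spline type is an assumption)."""
    if len(points) < 3:
        return list(points)
    p = [points[0]] + list(points) + [points[-1]]   # duplicate endpoints
    out = []
    for i in range(1, len(p) - 2):
        p0, p1, p2, p3 = p[i - 1], p[i], p[i + 1], p[i + 2]
        for s in range(samples_per_seg):
            t = s / samples_per_seg
            out.append(tuple(
                0.5 * ((2 * p1[k]) + (-p0[k] + p2[k]) * t
                       + (2 * p0[k] - 5 * p1[k] + 4 * p2[k] - p3[k]) * t * t
                       + (-p0[k] + 3 * p1[k] - 3 * p2[k] + p3[k]) * t ** 3)
                for k in range(2)))
    out.append(points[-1])
    return out
```

The smoothed curve then replaces the initial pedestrian trajectory line in the subsequent cutting step.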
In an embodiment of the invention, the step of cutting the smoothed initial pedestrian trajectory line includes: setting a sliding window and judging whether the length of the spline curve is greater than a preset window length; if not, not cutting the spline curve; if yes, executing the next step: intercepting a section of the spline curve with the preset window length, acquiring the average motion direction of the front half of the section and the average motion direction of the rear half of the section, and, if the included angle between the two average motion directions is larger than a preset cutting angle, setting the midpoint of the section as a cutting point to form cutting lines; then sliding the sliding window backwards by the preset window length and returning to judge whether the remaining length of the spline curve is greater than the preset window length.
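The sliding-window cutting can be sketched as follows. The window length and cutting angle are illustrative values (the patent only requires them to be preset), and the cut point is taken at the window midpoint:

```python
import math

def find_cut_points(points, window=8, max_angle_deg=90.0):
    """Sliding-window cutting sketch: within each window-length stretch of the
    smoothed line, compare the average motion direction of the first half with
    that of the second half; if they differ by more than the preset cutting
    angle, mark the window midpoint as a cut point."""
    def mean_dir(seg):
        dx = sum(b[0] - a[0] for a, b in zip(seg, seg[1:]))
        dy = sum(b[1] - a[1] for a, b in zip(seg, seg[1:]))
        return math.degrees(math.atan2(dy, dx))
    cuts = []
    i = 0
    while i + window <= len(points):
        seg = points[i:i + window]
        half = window // 2
        d1, d2 = mean_dir(seg[:half]), mean_dir(seg[half:])
        delta = abs(d1 - d2) % 360.0
        if min(delta, 360.0 - delta) > max_angle_deg:
            cuts.append(i + half)
        i += window          # slide the window backwards by one window length
    return cuts
```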
In an embodiment of the present invention, the step of reconnecting the cut initial pedestrian trajectory line according to a predetermined reconnection determination condition to form a real pedestrian trajectory line includes: setting the cutting line to be in an initial state;
searching for two cutting lines in the initial state and judging whether they meet the predetermined reconnection determination condition; if yes, connecting the two cutting lines to form a real pedestrian trajectory line; if not, continuing the search.
In an embodiment of the present invention, the predetermined reconnection determination condition includes: a time determination condition: the tail end time of the first-searched cutting line is prior to the head end time of the later-searched cutting line; a motion direction included angle determination condition: the included angle between the motion direction of the first-searched cutting line and the motion direction of the later-searched cutting line is smaller than a preset motion direction included angle threshold; if both the time determination condition and the motion direction included angle determination condition are met, the later-searched cutting line is taken as a candidate connecting line; and a distance determination condition: searching, among the candidate connecting lines, for the cutting line for which the distance between the tail end of the first-searched cutting line and the head end of the candidate connecting line is smaller than a preset distance threshold.
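The three reconnection determination conditions can be illustrated as follows. The dictionary field names and the threshold values are assumptions for illustration, not values from the patent:

```python
import math

def can_reconnect(first, later, max_angle_deg=45.0, max_dist=50.0):
    """Sketch of the reconnection judgment: `first` and `later` are cutting
    lines as dicts with head/tail points and head/tail times (illustrative
    field names). Checks the time condition, the motion-direction included
    angle condition, and the tail-to-head distance condition in turn."""
    if first["tail_t"] >= later["head_t"]:                      # time condition
        return False
    def direction(line):
        return math.degrees(math.atan2(line["tail"][1] - line["head"][1],
                                       line["tail"][0] - line["head"][0]))
    delta = abs(direction(first) - direction(later)) % 360.0
    if min(delta, 360.0 - delta) >= max_angle_deg:              # angle condition
        return False
    dx = later["head"][0] - first["tail"][0]
    dy = later["head"][1] - first["tail"][1]
    return math.hypot(dx, dy) < max_dist                        # distance condition
```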
In another aspect, the present invention provides a pedestrian trajectory line processing system, including: an acquisition module for acquiring image data to be processed; a detection module for detecting the image data to be processed, extracting pedestrian detection frames from the image data to be processed, and generating scattered trajectory lines by using the pedestrian trajectory information contained in the pedestrian detection frames; a trajectory line initial forming module for performing block tracking on the pedestrian detection frames to generate tracking trajectory lines and connecting the scattered trajectory lines through the tracking trajectory lines to form initial pedestrian trajectory lines; and a refining module for refining the initial pedestrian trajectory lines to obtain the real pedestrian trajectory lines in the image data to be processed.
Yet another aspect of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the pedestrian trajectory line processing method.
A final aspect of the invention provides a device comprising: a processor and a memory; the memory is used for storing a computer program, and the processor is used for executing the computer program stored by the memory so as to cause the device to execute the pedestrian trajectory line processing method.
As described above, the pedestrian trajectory line processing method/system, the computer-readable storage medium, and the device of the present invention have the following advantageous effects:
the pedestrian trajectory processing method/system, the computer-readable storage medium and the equipment select to process the pedestrian movement trajectory in a time period at one time, can effectively improve the stability of the pedestrian movement trajectory, keep high tracking precision for scenes in which a large number of pedestrians simultaneously appear, and obtain the high-quality pedestrian movement trajectory.
Drawings
Fig. 1A is a flow chart illustrating a pedestrian trajectory processing method according to an embodiment of the invention.
Fig. 1B is a schematic flow chart of S12 in the pedestrian trajectory processing method according to the present invention.
Fig. 1C is a schematic flow chart of S13 in the pedestrian trajectory processing method according to the present invention.
Fig. 1D is a schematic flow chart of S14 in the pedestrian trajectory processing method according to the present invention.
Fig. 2 is a block diagram of the pedestrian detection frame according to the present invention.
Fig. 3 is a schematic diagram of the connection of scattered trajectory lines according to the present invention.
FIG. 4 is a schematic diagram of the initial pedestrian trajectory without refinement according to the present invention.
Fig. 5 is a schematic diagram of the trace line cutting of the present invention.
FIG. 6 is a schematic diagram of a refined pedestrian trajectory according to the present invention.
FIG. 7 is a schematic diagram of a pedestrian trajectory processing system according to an embodiment of the present invention.
Description of the element reference numerals
7 pedestrian trajectory processing system
71 acquisition module
72 detection module
73 track line initial forming module
74 refinement module
S11 to S14 steps
S121 to S123 steps
S131 to S136 steps
S141 to S143 steps
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
Example one
The embodiment provides a pedestrian trajectory processing method, which comprises the following steps:
acquiring image data to be processed;
detecting the image data to be processed, extracting a pedestrian detection frame from the image data to be processed, and generating scattered trajectory lines by using pedestrian trajectory information contained in the pedestrian detection frame;
performing block tracking on the pedestrian detection frame to generate tracking trajectory lines, the scattered trajectory lines being connected by the tracking trajectory lines to form initial pedestrian trajectory lines;
and refining the initial pedestrian trajectory line to obtain a real pedestrian trajectory line in the image data to be processed.
The pedestrian trajectory line processing method provided by the present embodiment will be described in detail below with reference to the drawings. Referring to fig. 1A, a flow chart of a pedestrian trajectory line processing method in an embodiment is shown. As shown in fig. 1A, the pedestrian trajectory line processing method includes the following steps:
S11, acquiring the image data to be processed. In the present embodiment, the image data to be processed is obtained by decoding a video stream acquired by an image data acquisition apparatus (e.g., a video camera).
Specifically, this step includes acquiring image data I_T from the image acquisition device within a predetermined acquisition time T, and calculating the foreground image FG_T of the image data I_T, so as to obtain the image data to be processed D = {I_T, FG_T}, which comprises the image data and the foreground image. In the present embodiment, the foreground image FG_T of the image data I_T is calculated using a single-Gaussian model. Single-Gaussian background modeling is a commonly used background modeling algorithm: an independent single-Gaussian distribution model is established for the color distribution of each pixel in the image.
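The single-Gaussian background modeling described above can be sketched in NumPy as follows. This is a minimal illustrative implementation, not the patent's own code; the class name, learning rate `alpha`, deviation factor `k` and initial variance are all assumed values:

```python
import numpy as np

class SingleGaussianBG:
    """Per-pixel single-Gaussian background model (illustrative sketch)."""
    def __init__(self, first_frame, alpha=0.05, k=2.5, init_var=20.0):
        self.mean = first_frame.astype(np.float64)
        self.var = np.full(first_frame.shape, init_var)
        self.alpha, self.k = alpha, k

    def apply(self, frame):
        """Return the foreground mask FG_T for one frame and update the model."""
        frame = frame.astype(np.float64)
        diff = frame - self.mean
        fg = np.abs(diff) > self.k * np.sqrt(self.var)   # foreground pixels
        bg = ~fg
        # update the Gaussian only where the pixel matches the background
        self.mean[bg] += self.alpha * diff[bg]
        self.var[bg] = (1 - self.alpha) * self.var[bg] + self.alpha * diff[bg] ** 2
        return fg
```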
S12, detecting the image data to be processed, extracting a pedestrian detection frame from the image data to be processed, and generating scattered track lines by using pedestrian track information contained in the pedestrian detection frame.
Please refer to fig. 1B, which shows a schematic flow chart of S12. As shown in fig. 1B, the S12 includes the following steps:
and S121, performing pedestrian detection based on histogram of oriented gradient features (HOG features) on all image data in the image data to be processed to detect a pedestrian detection frame of the image data.
Specifically, the S121 includes the following:
firstly, carrying out color space normalization processing on the image data to form preprocessed image data; the color space normalization process includes image graying and gamma correction of the image data.
In the present embodiment, image graying refers to converting the RGB components of a color image into a grayscale image, and the conversion formula is:
Gray = 0.3R + 0.59G + 0.11B    formula (1)
Gamma correction increases or decreases the brightness of the entire image when the illumination of the image is not uniform. In practice, Gamma normalization can be performed in two different ways: square root or logarithm. For example, with the square-root approach, the formula is as follows (where γ = 0.5):
Y(x, y) = I(x, y)^γ    formula (2)
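The graying and square-root Gamma normalization of formulas (1) and (2) can be sketched in NumPy as follows (function names are illustrative):

```python
import numpy as np

def to_gray(rgb):
    # Gray = 0.3*R + 0.59*G + 0.11*B, per formula (1)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.3 * r + 0.59 * g + 0.11 * b

def gamma_correct(img, gamma=0.5):
    # square-root Gamma normalization, per formula (2): Y = I^gamma
    return np.power(img.astype(np.float64), gamma)
```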
Second, the gradient of the preprocessed image data and its gradient direction are obtained. In this embodiment, the calculation is performed in the horizontal and vertical directions respectively, and the gradient operators are: horizontal direction: [-1, 0, 1]; vertical direction: [-1, 0, 1]^T.
G_x(x, y) = I(x+1, y) - I(x-1, y)    formula (3)
G_y(x, y) = I(x, y+1) - I(x, y-1)    formula (4)
G(x, y) = sqrt(G_x(x, y)^2 + G_y(x, y)^2)    formula (5)
θ(x, y) = arctan(G_y(x, y) / G_x(x, y))    formula (6)
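Formulas (3) to (6) amount to the following NumPy sketch; leaving the border pixels at zero is an assumption the patent does not specify:

```python
import numpy as np

def gradients(img):
    """Gradient magnitude and direction with the [-1, 0, 1] operator
    (formulas (3) to (6)); borders are left at zero."""
    img = img.astype(np.float64)
    gx = np.zeros_like(img); gy = np.zeros_like(img)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]     # G_x(x,y) = I(x+1,y) - I(x-1,y)
    gy[1:-1, :] = img[2:, :] - img[:-2, :]     # G_y(x,y) = I(x,y+1) - I(x,y-1)
    mag = np.hypot(gx, gy)                     # formula (5)
    theta = np.degrees(np.arctan2(gy, gx))     # formula (6), in degrees
    return mag, theta
```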
Thirdly, dividing the preprocessed image data into a plurality of image units (ceil), and counting a gradient histogram of each image unit.
Fourthly, a plurality of image units are combined into image blocks (blocks), and the feature vectors of all the image units in each image block are connected in series to obtain the directional gradient histogram feature (HOG feature) of the image block.
And fifthly, combining all the directional gradient histogram features in the image data to form a feature vector of the image data, wherein the feature vector is used for representing the image data.
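The third and fourth steps (per-cell gradient histograms, later concatenated into block features) can be illustrated with the following minimal sketch. The cell size of 8 pixels and the 9 unsigned-orientation bins are conventional HOG choices assumed here, not values stated in the patent:

```python
import numpy as np

def cell_histograms(mag, theta, cell=8, bins=9):
    """Orientation histogram per image unit (cell): each cell x cell region
    votes its gradient magnitudes into `bins` orientation bins spanning
    0-180 degrees (unsigned gradients)."""
    h, w = mag.shape
    ch, cw = h // cell, w // cell
    hist = np.zeros((ch, cw, bins))
    bin_idx = (np.mod(theta, 180.0) / 180.0 * bins).astype(int) % bins
    for i in range(ch):
        for j in range(cw):
            m = mag[i*cell:(i+1)*cell, j*cell:(j+1)*cell]
            b = bin_idx[i*cell:(i+1)*cell, j*cell:(j+1)*cell]
            for k in range(bins):
                hist[i, j, k] = m[b == k].sum()
    return hist
```

Concatenating the cell histograms of each block, and then all block features, yields the feature vector described above.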
Sixthly, the feature vector of the image data is classified by using a Support Vector Machine (SVM) classifier to detect the pedestrian detection frames P_i of the image data. Each pedestrian detection frame contains pedestrian trajectory information, i.e., the size of the currently detected pedestrian detection frame at the current detection time t, the confidence of the pedestrian detection frame at the current detection time t, and/or the current detection time t within the predetermined acquisition time T.
S122, screening the pedestrian detection frames by using the foreground image FG_T, so as to extract the pedestrian detection frames whose foreground proportion is larger than a preset foreground proportion threshold.
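The foreground-proportion screening of S122 can be sketched as follows; the threshold value is an assumed example, since the patent only requires it to be preset:

```python
import numpy as np

def screen_by_foreground(boxes, fg_mask, min_ratio=0.5):
    """Keep only detection boxes whose foreground proportion inside FG_T
    exceeds a preset threshold. Boxes are (x, y, w, h) tuples; min_ratio
    is an illustrative value."""
    kept = []
    for (x, y, w, h) in boxes:
        region = fg_mask[y:y+h, x:x+w]
        if region.size and region.mean() > min_ratio:
            kept.append((x, y, w, h))
    return kept
```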
S123, connecting the pedestrian detection frames that conform to the connection rule together by using the pedestrian trajectory information contained in the pedestrian detection frames, so as to form scattered trajectory lines. The scattered trajectory lines contain the pedestrian trajectory line information within the preset acquisition time, including: K scattered trajectory lines and the attribute information of each scattered trajectory line at the current detection time, the attribute information including the position coordinates of the i-th scattered trajectory line at the current detection time, the confidence of the i-th scattered trajectory line, and the size of the currently detected pedestrian detection frame.
Wherein the connection rule includes:
ensuring that the time difference between the pedestrian detection frame at the current moment and the pedestrian detection frame at the previous moment is within three frames;
the change in the movement direction of the pedestrian detection frame between consecutive moments does not exceed a predetermined angle (in the present embodiment, the predetermined angle is 60 degrees);
the confidence of the pedestrian detection frame at each time is not lower than the confidence threshold (in this embodiment, the confidence threshold is 0.9). The connection rule in particular records each zeroLocation of head and tail of stray trajectory
Figure BDA0001387836260000083
Figure BDA0001387836260000084
Head coordinates for the ith scattered trajectory line,
Figure BDA0001387836260000085
is the end coordinate of the ith scattered trajectory line.
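The three connection rules can be illustrated with the following Python sketch. The field names of the detection records and the handling of the trajectory head (where no previous direction exists) are assumptions for illustration:

```python
import math

def can_connect(prev, curr, max_frame_gap=3, max_angle_deg=60.0, min_conf=0.9,
                prev_dir=None):
    """Check the connection rules for appending detection `curr` to the
    trajectory ending at `prev`. Detections are dicts with keys t (frame
    index), x, y (center) and conf (confidence); prev_dir is the previous
    motion direction in degrees, or None at the trajectory head."""
    if curr["t"] - prev["t"] > max_frame_gap:          # within three frames
        return False
    if curr["conf"] < min_conf or prev["conf"] < min_conf:
        return False                                   # confidence >= 0.9
    direction = math.degrees(math.atan2(curr["y"] - prev["y"],
                                        curr["x"] - prev["x"]))
    if prev_dir is not None:
        delta = abs(direction - prev_dir) % 360.0
        if min(delta, 360.0 - delta) > max_angle_deg:  # direction change <= 60 deg
            return False
    return True
```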
S13, blocking and tracking the pedestrian detection frame to generate tracking track lines, and connecting the scattered track lines through the tracking track lines to form initial pedestrian track lines. Please refer to fig. 1C, which shows a schematic flow chart of S13. As shown in fig. 1C, the S13 specifically includes the following steps:
and S131, performing image segmentation on the pedestrian detection frame at the current moment according to the head, the body, the left arm, the right arm and the legs to form a head-shoulder segment, a left arm segment, a right arm segment, a body segment and a leg segment. Wherein, the pedestrian detection frame at the current moment has the tail end of a scattered track line. Please refer to fig. 2, which is a block diagram of the pedestrian detection frame. As shown in fig. 2, the pedestrian detection frame R is divided into a head-shoulder section R1, a left-arm section R2, a right-arm section R3, a body-part section R4, a leg section R5, and a center C of the pedestrian detection frame.
S132, the head-shoulder block R1 is tracked in a first tracking manner, and the left arm block R2, the right arm block R3, the body block R4 and the leg block R5 are each tracked in a second tracking manner, so as to obtain the positions Y = {y_i} of the update blocks of the head-shoulder block, the body block, the left arm block, the right arm block and the leg block in the pedestrian detection frame at the next moment, where i denotes the number of the head-shoulder block R1, the left arm block R2, the right arm block R3, the body block R4 or the leg block R5. In this embodiment, the first tracking manner uses the KCF tracking algorithm, and the second tracking manner uses the particle filter tracking algorithm. This is a compromise between accuracy and computation speed: the KCF tracking algorithm is superior to the particle filter tracking algorithm in tracking accuracy, and since the appearance information of the head-shoulder block is rich and the head-shoulder block is not easily occluded, the KCF tracking algorithm is adopted to track it. In the present embodiment, the relative displacements d_i of the head-shoulder block R1, the left arm block R2, the right arm block R3, the body block R4 and the leg block R5 with respect to the center of the pedestrian detection frame R are also recorded.
S133, the position Y of the pedestrian detection frame at the next time point is { Y ═ Y ] according to the updated blocks of the head-shoulder block, the body block, the left-arm block, the right-arm block, and the leg blockiAnd relative displacement d between the head and shoulder block, the body block, the left arm block, the right arm block, the leg block and the center of the pedestrian detection frame at the current momentiAnd calculating the center of the pedestrian detection frame at the next moment. In this embodiment, the calculation formula for calculating the center of the pedestrian detection frame at the next time is:
C = ∑(y_i + d_i) × w_i    formula (7)
wherein C represents the center of the pedestrian detection frame at the next moment; i represents the serial numbers of the head-shoulder block, body block, left-arm block, right-arm block and leg block; y_i represents the positions of the updated blocks of the head-shoulder block, body block, left-arm block, right-arm block and leg block in the pedestrian detection frame at the next moment; d_i represents the relative displacement between the head-shoulder block, body block, left-arm block, right-arm block, leg block and the center of the pedestrian detection frame at the current moment; w_i represents the corresponding weight of the head-shoulder block, body block, left-arm block, right-arm block and leg block. The weight of the head-shoulder block is twice the weight of each of the other blocks, and ∑ w_i = 1.
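As a concrete illustration, formula (7) can be sketched in Python. The function name, the five-block ordering, the resulting weight values (head-shoulder 1/3, each other block 1/6, from the stated 2:1 ratio and ∑ w_i = 1) and the 2-D point representation are assumptions of this sketch, not text from the patent.

```python
def predict_center(updated_positions, relative_displacements):
    """Formula (7): C = sum_i (y_i + d_i) * w_i.

    updated_positions: list of (x, y) block positions y_i at the next moment,
    head-shoulder block first. relative_displacements: list of (dx, dy)
    displacements d_i of each block relative to the detection-frame center
    at the current moment. The head-shoulder weight is twice that of each
    other block, and all weights sum to 1.
    """
    n = len(updated_positions)                 # 5 blocks in the patent
    base = 1.0 / (n + 1)                       # other blocks: 1/6 each
    weights = [2 * base] + [base] * (n - 1)    # head-shoulder block: 1/3
    cx = sum((y[0] + d[0]) * w
             for y, d, w in zip(updated_positions, relative_displacements, weights))
    cy = sum((y[1] + d[1]) * w
             for y, d, w in zip(updated_positions, relative_displacements, weights))
    return (cx, cy)
```

Because the weights sum to 1, five identical block estimates reproduce the common center unchanged, while the head-shoulder estimate pulls the result twice as strongly as any other block.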
S134, calculating the offset of each updated block from the center of the pedestrian detection frame at the next moment. As shown in fig. 2, z_1 to z_5 denote the offsets of the updated blocks of the head-shoulder block, body block, left-arm block, right-arm block and leg block from the center of the pedestrian detection frame at the next moment.
S135, if the offsets of the head-shoulder block, body block, left-arm block, right-arm block and leg block from the center of the pedestrian detection frame at the current moment, and the offsets of each updated block from the center of the pedestrian detection frame at the next moment, respectively satisfy a preset offset determination condition, each updated block is corrected to serve as the initial position of the next block tracking. In this embodiment, the preset offset determination condition is:
[preset offset determination condition on z_i and d_i, given as a formula image]
wherein z_i denotes the offset of the updated block of the head-shoulder block, body block, left-arm block, right-arm block or leg block from the center of the pedestrian detection frame at the next moment, and d_i denotes the relative displacement between the center of the pedestrian detection frame at the current moment and the head-shoulder block, body block, left-arm block, right-arm block or leg block.
S136, the above steps are executed cyclically. After the updating of the pedestrian detection frame is finished, the updated pedestrian detection frames are connected to form a tracking trajectory line, and the head end of another scattered trajectory line matching the tail end of the tracking trajectory line is searched for among the updated pedestrian detection frames. If the head end of another scattered trajectory line matches the tail end of the tracking trajectory line, the tracking trajectory line is connected with that scattered trajectory line, and block tracking continues from the tail end of that scattered trajectory line; otherwise, the pedestrian detection frame continues to be updated. In the process of connecting scattered trajectory lines, a scattered trajectory line is marked as finished when it is connected pairwise with another scattered trajectory line through a generated tracking trajectory line, or is connected only with a tracking trajectory line; when all scattered trajectory lines are marked as finished, the trajectory-line connection stage ends, forming the initial pedestrian trajectory lines; otherwise, it is marked as unfinished. Please refer to fig. 3, which shows a schematic connection diagram of the scattered trajectory lines. As shown in fig. 3, h1 and h2 represent the pedestrians corresponding to the pedestrian detection frames f1 and f2. The solid lines s1, s2, s3 represent the scattered trajectory lines generated in S12, and the dashed lines j1, j2, j3 represent the tracking trajectory lines generated in S13. For example, q12 and q31 correspond to the head end and tail end of tracking trajectory line j1; tracking trajectory line j1 connects s1 and s3, and tracking then resumes from the tail end q32 of solid line s3 to generate tracking trajectory line j3. If no further scattered trajectory lines can be connected, the connection process ends; for example, tracking trajectory lines j2 and j3 end at e2 and e1, respectively.
FIG. 4 is a schematic diagram of an initial pedestrian trajectory without refinement. As shown in fig. 4, S1 and S2 are initial pedestrian trajectory lines that require refinement processing.
S14, refining the initial pedestrian trajectory line to obtain a real pedestrian trajectory line in the image data to be processed. Please refer to fig. 1D, which is a schematic flow chart of S14. As shown in fig. 1D, S14 specifically includes the following steps:
S141, smoothing the initial pedestrian trajectory line.
Specifically, a B-spline curve is calculated using the coordinate points on the initial pedestrian trajectory line as control points, and the spline curve replaces the initial pedestrian trajectory line. If the number of control points is m, the order is set to n = ceil(m × 0.8) to obtain a low-order spline curve. Replacing the initial pedestrian trajectory line with the spline curve achieves the purpose of smoothing it.
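Evaluating a B-spline through a set of control points can be sketched with De Boor's algorithm in pure Python. The clamped uniform knot vector, the cubic default degree and the function names are assumptions of this sketch (the patent derives its order from the number of control points instead):

```python
def clamped_knots(n_ctrl, p):
    """Clamped uniform knot vector for n_ctrl control points, degree p."""
    interior = [i / (n_ctrl - p) for i in range(1, n_ctrl - p)]
    return [0.0] * (p + 1) + interior + [1.0] * (p + 1)

def de_boor(t, knots, ctrl, p):
    """Evaluate the B-spline defined by 2-D control points ctrl at parameter t."""
    n = len(ctrl)
    if t >= knots[n]:                 # clamp at the right end of the domain
        return tuple(ctrl[-1])
    k = p                             # find the knot span containing t
    while not (knots[k] <= t < knots[k + 1]):
        k += 1
    d = [list(ctrl[j + k - p]) for j in range(p + 1)]
    for r in range(1, p + 1):         # De Boor triangular recursion
        for j in range(p, r - 1, -1):
            i = j + k - p
            denom = knots[i + p - r + 1] - knots[i]
            alpha = 0.0 if denom == 0.0 else (t - knots[i]) / denom
            d[j] = [(1 - alpha) * a + alpha * b for a, b in zip(d[j - 1], d[j])]
    return tuple(d[p])

def smooth_trajectory(points, degree=3, samples=50):
    """Replace a raw trajectory with points sampled from its B-spline."""
    p = min(degree, len(points) - 1)
    knots = clamped_knots(len(points), p)
    return [de_boor(i / (samples - 1), knots, points, p) for i in range(samples)]
```

With a clamped knot vector the smoothed curve still starts and ends exactly at the first and last trajectory points, so the head and tail ends used later for connection are preserved.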
S142, cutting off the smoothed initial pedestrian trajectory line.
Specifically, the step of cutting the smoothed initial pedestrian trajectory line includes:
setting a sliding window, and judging whether the length of the spline curve is greater than a preset window length w; if not, the spline curve is not cut; if yes, the next step is executed;
a part of the spline curve with the preset window length w is intercepted, and the average motion direction of the first half of this part (the first w/2, namely w1) and the average motion direction of the second half (the last w/2, namely w2) are obtained; if the included angle between the two average motion directions is larger than a preset cut-off included angle (in this embodiment, 60 degrees), the point where the two halves meet (point w0) is set as a cut-off point, forming cut-off lines, as shown in fig. 5. In this embodiment, the average motion direction is obtained as follows: for two consecutive points p1(x1, y1) and p2(x2, y2), the motion direction is represented by the vector v12 = (x2 − x1, y2 − y1), and the average motion direction is the average of these vectors.
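The sliding-window cut test above can be sketched as follows; the function names, the even window length and returning cut points as list indices are assumptions of this sketch.

```python
import math

def average_direction(points):
    """Average of the step vectors (x2 - x1, y2 - y1) along a polyline."""
    vx = sum(b[0] - a[0] for a, b in zip(points, points[1:]))
    vy = sum(b[1] - a[1] for a, b in zip(points, points[1:]))
    steps = len(points) - 1
    return (vx / steps, vy / steps)

def angle_between(u, v):
    """Unsigned angle between two 2-D vectors, in degrees."""
    nu, nv = math.hypot(*u), math.hypot(*v)
    if nu == 0.0 or nv == 0.0:
        return 0.0
    cos = max(-1.0, min(1.0, (u[0] * v[0] + u[1] * v[1]) / (nu * nv)))
    return math.degrees(math.acos(cos))

def find_cut_points(curve, window=10, cut_angle=60.0):
    """Slide a window of `window` points along the curve; mark its midpoint
    (point w0) as a cut point when the average directions of the two halves
    differ by more than `cut_angle` degrees."""
    cuts = []
    start = 0
    while len(curve) - start > window:
        part = curve[start:start + window]
        v1 = average_direction(part[:window // 2])   # first half, w1
        v2 = average_direction(part[window // 2:])   # second half, w2
        if angle_between(v1, v2) > cut_angle:
            cuts.append(start + window // 2)
        start += window                              # slide by one window length
    return cuts
```

A trajectory that turns sharply inside a window produces a cut point at the turn, while a straight trajectory passes through uncut.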
The sliding window is then slid backward by the preset window length w, and the process returns to judging whether the remaining length of the spline curve is greater than the preset window length.
In this embodiment, a cut-off line obtained by cutting an initial pedestrian trajectory line has an initial state and a retained state.
S143, reconnecting the cut-off initial pedestrian trajectory lines according to preset reconnection determination conditions to form real pedestrian trajectory lines. During reconnection, reconnection judgment is performed on all the cut-off trajectory lines, and the cut-off lines meeting the preset reconnection determination conditions are connected pairwise. This step optimizes staggered pedestrian trajectory lines, reducing the probability of generating wrong trajectory lines; meanwhile, effectively connecting unconnected trajectory lines ensures the integrity of the pedestrian motion trajectory.
Specifically, each cut-off line is set to an initial state; in this embodiment, the initial state is referred to as the unconnected state.
Two cut-off lines S1 and S2 in the initial state are searched for, and whether the cut-off lines in the initial state meet the preset reconnection determination conditions is judged; if so, they are connected pairwise to form a real pedestrian trajectory line; if not, the search continues.
The preset reconnection determination condition includes:
time determination condition: the tail-end time of the cut-off line found first is earlier than the head-end time of the cut-off line found later;
motion direction included angle determination condition: the included angle between the motion direction of the cut-off line found first and the motion direction of the cut-off line found later is smaller than a preset motion direction included angle th_theta;
if the time determination condition and the motion direction included angle determination condition are met, the cut-off line found later is taken as a candidate connecting line;
distance determination condition: among the candidate connecting lines, searching for the cut-off line whose distance between the tail end of the cut-off line found first and the head end of the candidate connecting line is smaller than a preset distance threshold th_dist.
Specifically, the process of determining whether cut-off lines in the initial state satisfy the preset reconnection determination conditions is:
1) Search for a cut-off line S1 in the initial state; if it exists, execute 2); if not, execute 5). That is, cut-off lines are matched pairwise, requiring that the tail-end time of one cut-off line is earlier than the head-end time of the other, and that the distance between the tail end of the one and the head end of the other is smaller than a preset distance threshold (i.e., the minimum distance threshold th_dist). In this embodiment, all cut-off lines are traversed; if the state of a cut-off line is the initial state, a cut-off line S1 has been found successfully.
2) Search for another cut-off line S2 in the initial state such that the tail-end time of the first-found cut-off line S1 is earlier than the head-end time of the later-found cut-off line S2; if found, proceed to step 3); if not, execute step 4).
3) Calculate the motion directions at the tail end of cut-off line S1 and the head end of cut-off line S2, and judge whether they satisfy the motion direction included angle determination condition; if so, add cut-off line S2 to the candidate cut-off lines of S1 and record the distance d between the tail end of S1 and the head end of S2; then return to step 2).
4) If cut-off line S1 has candidate cut-off lines, search among them for the candidate S2 whose recorded distance d between the tail end of S1 and the head end of S2 is the smallest and below the distance threshold th_dist; if such a candidate S2 exists, connect the two cut-off lines to obtain a new trajectory line S1', set the state of S1' to the initial state, and the connection forms a real pedestrian trajectory line, as shown in fig. 6 for the refined pedestrian trajectory lines S1' and S2'; if not, set cut-off line S1 to the retained state and return to step 1).
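The pairwise reconnection test in steps 1) to 4) can be sketched as a predicate over two cut-off lines; the dataclass fields, the end-direction convention (taken from the last or first step of each line) and the threshold values are assumptions of this sketch.

```python
import math
from dataclasses import dataclass

@dataclass
class CutLine:
    points: list     # [(x, y), ...] in time order
    t_head: float    # time of the first point
    t_tail: float    # time of the last point

def end_direction(points, tail=True):
    """Motion direction at the tail (or head) end, from the last (first) step."""
    a, b = (points[-2], points[-1]) if tail else (points[0], points[1])
    return (b[0] - a[0], b[1] - a[1])

def can_reconnect(first, later, th_theta=45.0, th_dist=30.0):
    """True when `later` may continue `first`: tail-before-head time order,
    direction included angle below th_theta degrees, and tail-to-head
    distance below th_dist."""
    if first.t_tail >= later.t_head:            # time determination condition
        return False
    u = end_direction(first.points, tail=True)
    v = end_direction(later.points, tail=False)
    norm = math.hypot(*u) * math.hypot(*v)
    if norm == 0.0:
        return False
    cos = max(-1.0, min(1.0, (u[0] * v[0] + u[1] * v[1]) / norm))
    if math.degrees(math.acos(cos)) >= th_theta:  # angle determination condition
        return False
    dist = math.hypot(later.points[0][0] - first.points[-1][0],
                      later.points[0][1] - first.points[-1][1])
    return dist < th_dist                       # distance determination condition
```

In the full procedure this predicate would gather the candidate connecting lines of S1, and the candidate at minimum distance would be chosen for the actual connection.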
The present embodiment also provides a computer-readable storage medium on which a computer program is stored, which when executed by a processor, implements the above-described pedestrian trajectory line processing method. Those of ordinary skill in the art will understand that: all or part of the steps for implementing the above method embodiments may be performed by hardware associated with a computer program. The aforementioned computer program may be stored in a computer readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
The pedestrian trajectory processing method and the computer-readable storage medium in the embodiment select to process the pedestrian motion trajectory in a time period at one time, so that the stability of the pedestrian motion trajectory can be effectively improved, the high tracking accuracy of scenes where a large number of pedestrians simultaneously appear is maintained, and the high-quality pedestrian motion trajectory is obtained.
Example two
The present embodiment provides a pedestrian trajectory line processing system, including:
the acquisition module is used for acquiring image data to be processed;
the detection module is used for detecting the image data to be processed, extracting a pedestrian detection frame from the image data to be processed, and generating scattered track lines by using pedestrian track information contained in the pedestrian detection frame;
a track line initial forming module, which is used for tracking the pedestrian detection frame in blocks to generate tracking track lines, and connecting the scattered track lines through the tracking track lines to form initial pedestrian track lines;
and the refining module is used for performing refining processing on the initial pedestrian trajectory line so as to obtain a real pedestrian trajectory line in the image data to be processed.
The pedestrian trajectory line processing system provided by the present embodiment will be described in detail below with reference to the drawings. It should be noted that the division of the modules of the above apparatus is only a logical division, and the actual implementation may be wholly or partially integrated into one physical entity, or may be physically separated. And these modules can be realized in the form of software called by processing element; or may be implemented entirely in hardware; and part of the modules can be realized in the form of calling software by the processing element, and part of the modules can be realized in the form of hardware. For example, the x module may be a processing element that is set up separately, or may be implemented by being integrated in a chip of the apparatus, or may be stored in a memory of the apparatus in the form of program code, and the function of the x module may be called and executed by a processing element of the apparatus. Other modules are implemented similarly. In addition, all or part of the modules can be integrated together or can be independently realized. The processing element described herein may be an integrated circuit having signal processing capabilities. In implementation, each step of the above method or each module above may be implemented by an integrated logic circuit of hardware in a processor element or an instruction in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field-programmable gate arrays (FPGAs). For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor capable of calling program code. For another example, these modules may be integrated together and implemented in the form of a system-on-chip (SoC).
Referring to fig. 7, a schematic structural diagram of a pedestrian trajectory processing system in one embodiment is shown. As shown in fig. 7, the pedestrian trajectory line processing system 7 includes: an acquisition module 71, a detection module 72, a trajectory line initial forming module 73 and a refinement module 74.
The acquisition module 71 acquires image data to be processed. In this embodiment, the image data to be processed is obtained by decoding a video stream acquired by an image data acquisition apparatus (e.g., a video camera).
Specifically, the acquisition module 71 acquires image data I_T from the image acquisition device within a predetermined acquisition time T, and calculates the foreground image FG_T of the image data I_T to obtain the image data to be processed D = {I_T, FG_T}, comprising the image data and the foreground image.
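The patent does not specify how the foreground image FG_T is computed; a common stand-in, shown here purely as an illustrative assumption, is background subtraction by per-pixel frame differencing with a fixed threshold, together with the foreground-proportion screen later applied to each detection box:

```python
def foreground_mask(frame, background, threshold=25):
    """Binary foreground mask from per-pixel absolute frame difference.
    frame/background: 2-D lists of grayscale values of equal shape."""
    return [[1 if abs(p - b) > threshold else 0 for p, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

def foreground_ratio(mask, box):
    """Fraction of foreground pixels inside box = (x, y, w, h)."""
    x, y, w, h = box
    fg = sum(mask[r][c] for r in range(y, y + h) for c in range(x, x + w))
    return fg / float(w * h)
```

A detection box would then be kept only when `foreground_ratio` exceeds the preset foreground proportion threshold.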
The detection module 72 coupled to the acquisition module 71 detects the image data to be processed, extracts a pedestrian detection frame from the image data to be processed, and generates a scattered trajectory line by using pedestrian trajectory information included in the pedestrian detection frame.
Specifically, the detection module 72 is configured to perform pedestrian detection based on the histogram of oriented gradients feature (HOG feature) on all image data in the image data to be processed, to detect the pedestrian detection frames of the image data; to screen the pedestrian detection frames using the foreground image FG_T, extracting the pedestrian detection frames whose foreground proportion is larger than a preset foreground proportion threshold; and to generate scattered trajectory lines using the pedestrian trajectory information contained in the pedestrian detection frames. The pedestrian detection frames meeting the connection rule are connected together to form scattered trajectory lines. The scattered trajectory lines contain the pedestrian trajectory-line information within the preset acquisition time, including: the K scattered trajectory lines, and the attribute information of each scattered trajectory line at the current detection moment, the attribute information comprising the position coordinates and the confidence of the ith scattered trajectory line at the current detection moment, and the size of the currently detected pedestrian detection frame.
Wherein the connection rule includes:
the time difference between the pedestrian detection frame at the current moment and the pedestrian detection frame at the previous moment is within three frames;
the change in the motion direction of the pedestrian detection frame between consecutive moments does not exceed a predetermined angle (in this embodiment, 60 degrees);
the confidence of the pedestrian detection frame at each moment is not lower than a confidence threshold (in this embodiment, 0.9).
In particular, the head and tail positions of each scattered trajectory line are recorded, i.e., the head coordinate and the tail coordinate of the ith scattered trajectory line.
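The connection rule above can be sketched as a predicate over consecutive detections; the detection record fields and the direction comparison are assumptions of this sketch.

```python
import math

def can_link(prev, curr, max_frame_gap=3, max_angle=60.0, min_conf=0.9):
    """prev/curr: dicts with 'frame' (frame index), 'dir' (dx, dy motion
    direction), and 'conf' (detection confidence). Returns True when curr
    may extend the scattered trajectory line ending at prev."""
    if not (0 < curr["frame"] - prev["frame"] <= max_frame_gap):
        return False                    # time difference within three frames
    if curr["conf"] < min_conf or prev["conf"] < min_conf:
        return False                    # confidence not below the threshold
    u, v = prev["dir"], curr["dir"]
    norm = math.hypot(*u) * math.hypot(*v)
    if norm == 0.0:
        return True                     # no motion information: accept on time and confidence
    cos = max(-1.0, min(1.0, (u[0] * v[0] + u[1] * v[1]) / norm))
    return math.degrees(math.acos(cos)) <= max_angle   # direction change bounded
```

Detections that pass this predicate for consecutive frames would be chained into one scattered trajectory line.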
A trajectory-line initial forming module 73, coupled to the acquisition module 71 and the detection module 72, is configured to block-track the pedestrian detection frame to generate tracking trajectory lines, and to connect the scattered trajectory lines through the tracking trajectory lines to form initial pedestrian trajectory lines.
Specifically, the trajectory-line initial forming module 73 is configured to perform image segmentation on the pedestrian detection frame at the current moment according to the head, body, left arm, right arm and legs, forming a head-shoulder block, a left-arm block, a right-arm block, a body block and a leg block, wherein the pedestrian detection frame at the current moment is located at the tail end of a scattered trajectory line. The head-shoulder block R1 is tracked in a first tracking manner, and the left-arm block R2, right-arm block R3, body block R4 and leg block R5 are each tracked in a second tracking manner, to obtain the positions Y = {y_i} of the updated blocks of the head-shoulder block, body block, left-arm block, right-arm block and leg block in the pedestrian detection frame at the next moment, where i denotes the serial numbers of the head-shoulder block, left-arm block, right-arm block, body block and leg block. In this embodiment, the first tracking manner uses the KCF tracking algorithm, and the second tracking manner uses the particle filter tracking algorithm, as a compromise between accuracy and computation rate: the KCF tracking algorithm is superior to the particle filter tracking algorithm in tracking precision, and since the apparent information of the head-shoulder block is rich and not easily occluded, the KCF tracking algorithm is adopted to track the head-shoulder block. In this embodiment, the relative displacements d_i of the head-shoulder block, left-arm block, right-arm block, body block and leg block with respect to the center of the pedestrian detection frame are also recorded.
The center of the pedestrian detection frame at the next moment is calculated from the positions Y = {y_i} of the updated blocks of the head-shoulder block, body block, left-arm block, right-arm block and leg block in the pedestrian detection frame at the next moment, and the relative displacements d_i between the head-shoulder block, body block, left-arm block, right-arm block, leg block and the center of the pedestrian detection frame at the current moment. The offset of each updated block from the center of the pedestrian detection frame at the next moment is then calculated. If the offsets of the head-shoulder block, body block, left-arm block, right-arm block and leg block from the center of the pedestrian detection frame at the current moment, and the offsets of each updated block from the center of the pedestrian detection frame at the next moment, respectively satisfy the preset offset determination condition, each updated block is corrected to serve as the initial position of the next block tracking. The above functions are executed cyclically: after the updating of the pedestrian detection frame is finished, the updated pedestrian detection frames are connected to form a tracking trajectory line, and the head end of another scattered trajectory line matching the tail end of the tracking trajectory line is searched for among the updated pedestrian detection frames; if a match is found, the tracking trajectory line is connected with that scattered trajectory line, and block tracking continues from its tail end; otherwise, the pedestrian detection frame continues to be updated.
In the process of connecting scattered trajectory lines, a scattered trajectory line is marked as finished when it is connected pairwise with another scattered trajectory line through a generated tracking trajectory line, or is connected only with a tracking trajectory line; when all scattered trajectory lines are marked as finished, the trajectory-line connection stage ends, forming the initial pedestrian trajectory lines; otherwise, it is marked as unfinished.
A refinement module 74 coupled to the trajectory line initial forming module 73 is configured to refine the initial pedestrian trajectory line to obtain a real pedestrian trajectory line in the image data to be processed.
The refinement module 74 is used to smooth the initial pedestrian trajectory line. Specifically, a B-spline curve is calculated using the coordinate points on the initial pedestrian trajectory line as control points, and the spline curve replaces the initial pedestrian trajectory line. If the number of control points is m, the order is set to n = ceil(m × 0.8) to obtain a low-order spline curve. Replacing the initial pedestrian trajectory line with the spline curve achieves the purpose of smoothing it.
The refinement module 74 is configured to cut off the smoothed initial pedestrian trajectory line. Specifically, the refinement module 74 sets a sliding window and judges whether the length of the spline curve is greater than a preset window length w; if not, the spline curve is not cut; if yes, a part of the spline curve with the preset window length w is intercepted, the average motion direction of the first half of this part (the first w/2, namely w1) and the average motion direction of the second half (the last w/2, namely w2) are obtained, and if the included angle between the two average motion directions is larger than a preset cut-off included angle (in this embodiment, 60 degrees), the point where the two halves meet (point w0) is set as a cut-off point, forming a cut-off line. The sliding window is then slid backward by the preset window length w, and the process returns to judging whether the remaining length of the spline curve is greater than the preset window length. In this embodiment, a cut-off line obtained by cutting an initial pedestrian trajectory line has an initial state and a retained state.
The refinement module 74 is further configured to reconnect the cut-off initial pedestrian trajectory lines according to the preset reconnection determination conditions to form real pedestrian trajectory lines. During reconnection, reconnection judgment is performed on all the cut-off trajectory lines, and the cut-off lines meeting the preset reconnection determination conditions are connected pairwise. This step optimizes staggered pedestrian trajectory lines, reducing the probability of generating wrong trajectory lines; meanwhile, effectively connecting unconnected trajectory lines ensures the integrity of the pedestrian motion trajectory.
Specifically, the refinement module 74 searches for two cut-off lines S1 and S2 in the initial state and judges whether they meet the preset reconnection determination conditions; if so, they are connected pairwise to form a real pedestrian trajectory line; if not, the search continues.
The preset reconnection determination condition includes:
time determination condition: the tail-end time of the cut-off line found first is earlier than the head-end time of the cut-off line found later;
motion direction included angle determination condition: the included angle between the motion direction of the cut-off line found first and the motion direction of the cut-off line found later is smaller than a preset motion direction included angle th_theta;
if the time determination condition and the motion direction included angle determination condition are met, the cut-off line found later is taken as a candidate connecting line;
distance determination condition: among the candidate connecting lines, searching for the cut-off line whose distance between the tail end of the cut-off line found first and the head end of the candidate connecting line is smaller than a preset distance threshold th_dist.
Example three
The present embodiment provides a device, comprising: a processor, a memory, a transceiver, a communication interface and a system bus; the memory is used for storing a computer program, the communication interface is used for communicating with other devices, and the processor and the transceiver are used for running the computer program so that the device executes the steps of the above pedestrian trajectory line processing method.
The above-mentioned system bus may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The system bus may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus. The communication interface is used for realizing communication between the database access device and other equipment (such as a client, a read-write library and a read-only library). The memory may include a Random Access Memory (RAM), and may further include a non-volatile memory (non-volatile memory), such as at least one disk memory.
The processor may be a general-purpose processor, and includes a Central Processing Unit (CPU), a Network Processor (NP), and the like; the integrated circuit may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic device, or discrete hardware components.
In summary, the pedestrian trajectory processing method/system, the computer-readable storage medium and the device of the present invention select to process the pedestrian motion trajectory within a time period at a time, which can effectively improve the stability of the pedestrian motion trajectory, maintain high tracking accuracy for scenes where a large number of pedestrians are present at the same time, and obtain a high-quality pedestrian motion trajectory. Therefore, the invention effectively overcomes various defects in the prior art and has high industrial utilization value.
The foregoing embodiments are merely illustrative of the principles and utilities of the present invention and are not intended to limit the invention. Any person skilled in the art can modify or change the above-mentioned embodiments without departing from the spirit and scope of the present invention. Accordingly, it is intended that all equivalent modifications or changes which can be made by those skilled in the art without departing from the spirit and technical spirit of the present invention be covered by the claims of the present invention.

Claims (13)

1. A pedestrian trajectory line processing method, comprising:
acquiring image data to be processed;
detecting the image data to be processed, extracting a pedestrian detection frame from the image data to be processed, and generating scattered trajectory lines by using pedestrian trajectory information contained in the pedestrian detection frame;
blocking the pedestrian detection frame to generate tracking trajectory lines, the scattered trajectory lines being connected by the tracking trajectory lines to form initial pedestrian trajectory lines;
the refining processing is carried out on the initial pedestrian trajectory line to obtain a real pedestrian trajectory line in the image data to be processed, and the method comprises the following steps:
smoothing the initial pedestrian trajectory line; wherein smoothing the initial pedestrian trajectory line comprises: calculating a spline curve by taking the coordinate point on the initial pedestrian trajectory line as a control point, and enabling the spline curve to replace the initial pedestrian trajectory line;
cutting the smoothed initial pedestrian trajectory line; the step of cutting the smoothed initial pedestrian trajectory line includes:
setting a sliding window, and judging whether the length of the spline curve is greater than a preset window length; if not, not cutting the spline curve; if yes, executing the next step;
intercepting a section of the spline curve with the preset window length, acquiring the average motion direction of the front half of the section and the average motion direction of the rear half of the section, and, if the included angle between the two average motion directions is larger than a preset cutting included angle, setting the midpoint of the section as a cutting point to form cut lines;
sliding the sliding window backward by the preset window length, and returning to judge whether the remaining spline curve is longer than the preset window length;
reconnecting the cut initial pedestrian trajectory line according to preset reconnection judgment conditions to form a real pedestrian trajectory line;
wherein the preset reconnection judgment conditions include:
a time judgment condition: the tail-end time of the cut line found first is prior to the head-end time of the cut line found later;
a motion direction included angle judgment condition: the included angle between the motion direction of the cut line found first and that of the cut line found later is smaller than a preset motion direction included angle; if both the time judgment condition and the motion direction included angle judgment condition are met, the cut line found later is judged to be a candidate connecting line;
a distance judgment condition: among the candidate connecting lines, searching for the cut line whose head end is within a preset distance threshold of the tail end of the cut line found first.
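The sliding-window cutting step above can be sketched in code. This is an illustrative reading, not the patented implementation; the window length, the cut-angle threshold, and the choice of the window midpoint as the cutting point are assumptions:

```python
import math

def average_direction(points):
    """Mean motion direction (radians) from the first to the last point."""
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    return math.atan2(dy, dx)

def angle_between(a, b):
    """Smallest absolute angle between two directions, in radians."""
    d = abs(a - b) % (2 * math.pi)
    return min(d, 2 * math.pi - d)

def cut_trajectory(points, window=10, cut_angle=math.pi / 2):
    """Slide a fixed-length window along the trajectory; whenever the average
    motion direction of the window's first half differs from that of its
    second half by more than cut_angle, cut at the window's midpoint, then
    slide the window backward by one preset window length."""
    segments, start, i = [], 0, 0
    while i + window <= len(points):
        first = points[i : i + window // 2]
        second = points[i + window // 2 : i + window]
        if angle_between(average_direction(first),
                         average_direction(second)) > cut_angle:
            mid = i + window // 2
            segments.append(points[start:mid])
            start = mid
        i += window  # slide by the preset window length
    segments.append(points[start:])  # curves shorter than the window stay uncut
    return segments
```

Applied to a trajectory that turns sharply inside a window, this yields one cut at the turn and leaves straight stretches whole.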
2. The pedestrian trajectory line processing method according to claim 1, wherein the step of acquiring the image data to be processed includes:
acquiring image data from an image acquisition device within a preset acquisition time;
and calculating a foreground image of the image data to acquire image data to be processed comprising the image data and the foreground image.
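The claim leaves the foreground computation unspecified; one common choice is differencing each frame against a median background model. A minimal numpy sketch, where the difference threshold is an assumption:

```python
import numpy as np

def foreground_mask(frames, threshold=25):
    """Estimate a per-frame binary foreground mask by differencing each frame
    against a median background built from the whole acquisition window.
    This is one plausible way to obtain the foreground image the claim
    refers to; the patent does not fix the method."""
    stack = np.stack(frames).astype(np.int16)
    background = np.median(stack, axis=0)
    return [(np.abs(f - background) > threshold).astype(np.uint8)
            for f in stack]
```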
3. The pedestrian trajectory line processing method according to claim 2, wherein the step of detecting the image data to be processed, extracting a pedestrian detection frame from the image data to be processed includes:
performing pedestrian detection based on histogram of oriented gradients features on all image data in the image data to be processed, so as to detect pedestrian detection frames of the image data;
and screening the pedestrian detection frames by using the foreground proportion of the foreground image so as to extract the pedestrian detection frames with the foreground proportion larger than a preset foreground proportion threshold value.
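The foreground-proportion screening can be illustrated directly: keep a detection box only if the share of foreground pixels inside it exceeds a preset ratio. The `(x, y, w, h)` box convention and the `min_ratio` value are assumptions:

```python
import numpy as np

def screen_boxes(boxes, fg_mask, min_ratio=0.3):
    """Keep only detection boxes whose foreground-pixel proportion exceeds
    min_ratio. Each box is (x, y, w, h); fg_mask is a binary 2-D array."""
    kept = []
    for x, y, w, h in boxes:
        patch = fg_mask[y : y + h, x : x + w]
        # patch.mean() is the fraction of foreground pixels inside the box
        if patch.size and patch.mean() > min_ratio:
            kept.append((x, y, w, h))
    return kept
```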
4. The pedestrian trajectory line processing method according to claim 3, wherein the step of performing pedestrian detection based on histogram of oriented gradient features on all image data in the image data to be processed comprises:
carrying out color space normalization processing on the image data to form preprocessed image data; the color space normalization processing comprises image graying and gamma correction of the image data;
acquiring the gradient and the gradient direction of the preprocessed image data;
dividing the preprocessed image data into a plurality of image units, and counting a gradient histogram of each image unit;
forming a plurality of image units into image blocks, and connecting the feature vectors of all the image units in each image block in series to obtain the directional gradient histogram feature of the image block;
combining all the directional gradient histogram features in the image data to form a feature vector of the image data for representing the image data;
distinguishing the feature vector of the image data by using a support vector machine classifier so as to detect a pedestrian detection frame of the image data; the pedestrian detection frame comprises the size and the confidence of the currently detected pedestrian detection frame and/or the current detection moment in the preset acquisition time.
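The HOG pipeline of claim 4 (gradients, per-cell orientation histograms, concatenation into a feature vector) can be sketched as follows. Block normalization, gamma correction and the SVM classifier stage are omitted for brevity, and the cell size and bin count are illustrative choices:

```python
import numpy as np

def hog_features(gray, cell=8, bins=9):
    """Minimal HOG sketch: central-difference gradients, unsigned gradient
    orientation, magnitude-weighted per-cell histograms, concatenation."""
    gray = gray.astype(np.float32)
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    gx[:, 1:-1] = gray[:, 2:] - gray[:, :-2]
    gy[1:-1, :] = gray[2:, :] - gray[:-2, :]
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180  # unsigned orientation
    h, w = gray.shape
    feats = []
    for r in range(0, h - cell + 1, cell):
        for c in range(0, w - cell + 1, cell):
            m = mag[r : r + cell, c : c + cell].ravel()
            a = ang[r : r + cell, c : c + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            feats.append(hist)
    return np.concatenate(feats)
```

In a full detector the resulting vectors would be fed to a trained SVM, as the claim states.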
5. The pedestrian trajectory line processing method according to claim 3, wherein the step of generating the scattered trajectory line using the pedestrian trajectory information included in the pedestrian detection frame includes:
connecting the pedestrian detection frames that accord with a connection rule to form scattered trajectory lines; the scattered trajectory lines contain the pedestrian trajectory line information within the preset acquisition time, including: k scattered trajectory lines and attribute information of each scattered trajectory line at the current detection time, the attribute information including the position coordinates of the i-th scattered trajectory line at the current detection time, the confidence of the i-th scattered trajectory line and the size of the currently detected pedestrian detection frame;
wherein the connection rule includes:
ensuring that the time difference between the pedestrian detection frame at the current moment and the pedestrian detection frame at the previous moment is within three frames;
the change of the front and back movement directions of the pedestrian detection frame does not exceed a preset angle;
the confidence of the pedestrian detection frame at each moment is not lower than the confidence threshold.
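The three connection rules above can be expressed as one predicate over consecutive detection frames. The dict layout and all threshold values are assumptions for illustration:

```python
import math

def can_connect(prev, cur, prev_dir=None, max_gap=3,
                max_turn=math.pi / 3, min_conf=0.5):
    """Check the three claimed connection rules between detection frames.
    Each frame is a dict with 't' (frame index), 'center' (x, y) and 'conf';
    prev_dir is the trajectory's previous motion direction in radians."""
    if cur['conf'] < min_conf or prev['conf'] < min_conf:
        return False                     # confidence rule
    if not 0 < cur['t'] - prev['t'] <= max_gap:
        return False                     # time-gap rule: within three frames
    if prev_dir is not None:
        dx = cur['center'][0] - prev['center'][0]
        dy = cur['center'][1] - prev['center'][1]
        turn = abs(math.atan2(dy, dx) - prev_dir) % (2 * math.pi)
        if min(turn, 2 * math.pi - turn) > max_turn:
            return False                 # direction-change rule
    return True
```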
6. The pedestrian trajectory line processing method according to claim 3, wherein the step of performing block tracking on the pedestrian detection frame to generate tracking trajectory lines includes:
performing image segmentation on the pedestrian detection frame at the current moment according to the head, body, left arm, right arm and legs to form a head-shoulder block, a body block, a left arm block, a right arm block and a leg block; wherein the pedestrian detection frame at the current moment contains the tail end of a scattered trajectory line;
tracking the head-shoulder block with a first tracking mode, and tracking the body block, the left arm block, the right arm block and the leg block each with a second tracking mode, so as to acquire the positions of the update blocks of the head-shoulder block, the body block, the left arm block, the right arm block and the leg block in the pedestrian detection frame at the next moment;
calculating the center of the pedestrian detection frame at the next moment according to the positions of the update blocks of the head-shoulder block, the body block, the left arm block, the right arm block and the leg block in the pedestrian detection frame at the next moment and the relative displacement of the centers of the head-shoulder block, the body block, the left arm block, the right arm block and the leg block and the pedestrian detection frame at the current moment;
calculating the offset of each updating block from the center of the pedestrian detection frame at the next moment;
if the offsets of the head-shoulder block, the body part block, the left arm block, the right arm block, the leg block and the center of the pedestrian detection frame at the current moment respectively satisfy the preset offset judgment condition with the offsets of each updating block and the center of the pedestrian detection frame at the next moment, correcting each updating block to be the initial position of the next block tracking;
executing the above steps in a loop; after the updating of the pedestrian detection frame is finished, connecting the updated pedestrian detection frames to form a tracking trajectory line, and searching, among the updated pedestrian detection frames, for the head end of another scattered trajectory line matching the tail end of the tracking trajectory line; if found, connecting the tracking trajectory line with the other scattered trajectory line, and continuing the block tracking from the tail end of the other scattered trajectory line; if not, continuing to update the pedestrian detection frame.
7. The pedestrian trajectory line processing method according to claim 6, wherein the center of the pedestrian detection frame at the next moment is calculated as:
C = ∑(y_i + d_i) × w_i
wherein C represents the center of the pedestrian detection frame at the next moment; i indexes the head-shoulder block, the body block, the left arm block, the right arm block and the leg block; y_i represents the position of the update block of block i in the pedestrian detection frame at the next moment; d_i represents the relative displacement between block i and the center of the pedestrian detection frame at the current moment; and w_i represents the weight corresponding to block i.
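The weighted-center formula applies per coordinate; that the weights sum to 1 is an assumption consistent with C being a position:

```python
def detection_center(update_positions, relative_displacements, weights):
    """Weighted-center formula from the claim: C = sum_i (y_i + d_i) * w_i,
    applied to x and y separately. y_i is block i's tracked position, d_i its
    stored displacement to the previous box center, w_i its weight."""
    cx = sum((y[0] + d[0]) * w for y, d, w in
             zip(update_positions, relative_displacements, weights))
    cy = sum((y[1] + d[1]) * w for y, d, w in
             zip(update_positions, relative_displacements, weights))
    return cx, cy
```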
8. The pedestrian trajectory line processing method according to claim 6, wherein the preset offset judgment condition is:
[formula shown in image FDA0003039941870000041]
wherein z_i represents the offset of the update block of the head-shoulder block, body block, left arm block, right arm block or leg block from the center of the pedestrian detection frame at the next moment, and d_i represents the relative displacement between the corresponding block and the center of the pedestrian detection frame at the current moment.
9. The pedestrian trajectory line processing method according to claim 6, further comprising: when all the scattered trajectory lines are connected pairwise by the generated tracking trajectory lines, or each is connected only with a tracking trajectory line, labeling the state of the scattered trajectory lines as a completed state, and otherwise labeling it as an uncompleted state.
10. The pedestrian trajectory processing method according to claim 1, wherein the step of reconnecting the initial pedestrian trajectory line after the cutting to form a real pedestrian trajectory line according to a preset reconnection determination condition includes:
setting the cut lines to an initial state;
searching for two cut lines in the initial state, and judging whether they satisfy the preset reconnection judgment conditions; if so, connecting the two cut lines to form a real pedestrian trajectory line; if not, continuing the search.
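A greedy sketch of this reconnection loop, under an assumed cut-line record layout ('head'/'tail' endpoints, 'head_t'/'tail_t' times, 'dir' in radians); the thresholds are illustrative:

```python
import math

def reconnect(cut_lines, max_angle=math.pi / 6, max_dist=50.0):
    """All cut lines start in the initial state; pairs satisfying the time,
    direction-angle and distance conditions are joined and leave the
    initial state. Returns the list of connected pairs."""
    lines = [dict(l, state='initial') for l in cut_lines]
    result = []
    for a in lines:
        if a['state'] != 'initial':
            continue
        for b in lines:
            if b is a or b['state'] != 'initial':
                continue
            if a['tail_t'] >= b['head_t']:
                continue                                  # time condition
            turn = abs(a['dir'] - b['dir']) % (2 * math.pi)
            if min(turn, 2 * math.pi - turn) > max_angle:
                continue                                  # angle condition
            if math.dist(a['tail'], b['head']) > max_dist:
                continue                                  # distance condition
            a['state'] = b['state'] = 'connected'
            result.append((a, b))
            break
    return result
```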
11. A pedestrian trajectory processing system, comprising:
the acquisition module is used for acquiring image data to be processed;
the detection module is used for detecting the image data to be processed, extracting a pedestrian detection frame from the image data to be processed, and generating scattered track lines by using pedestrian track information contained in the pedestrian detection frame;
a track line initial forming module, which is used for tracking the pedestrian detection frame in blocks to generate tracking track lines, and connecting the scattered track lines through the tracking track lines to form initial pedestrian track lines;
the refining module is used for refining the initial pedestrian trajectory line to obtain a real pedestrian trajectory line in the image data to be processed; the refining includes smoothing the initial pedestrian trajectory line and cutting the smoothed initial pedestrian trajectory line; the refining module calculates a spline curve by taking coordinate points on the initial pedestrian trajectory line as control points and replaces the initial pedestrian trajectory line with the spline curve, so as to smooth the initial pedestrian trajectory line;
the refining module sets a sliding window and judges whether the length of the spline curve is greater than a preset window length; if not, the spline curve is not cut; if yes, the next step is executed: intercepting a section of the spline curve with the preset window length, acquiring the average motion direction of the front half of the section and that of the rear half of the section, and, if the included angle between the two average motion directions is larger than a preset cutting included angle, setting the midpoint of the section as a cutting point to form cut lines; sliding the sliding window backward by the preset window length, and returning to judge whether the remaining spline curve is longer than the preset window length; reconnecting the cut initial pedestrian trajectory line according to preset reconnection judgment conditions to form a real pedestrian trajectory line; wherein the preset reconnection judgment conditions include: a time judgment condition: the tail-end time of the cut line found first is prior to the head-end time of the cut line found later; a motion direction included angle judgment condition: the included angle between the motion direction of the cut line found first and that of the cut line found later is smaller than a preset motion direction included angle; if both the time judgment condition and the motion direction included angle judgment condition are met, the cut line found later is judged to be a candidate connecting line; a distance judgment condition: among the candidate connecting lines, searching for the cut line whose head end is within a preset distance threshold of the tail end of the cut line found first; thereby cutting the smoothed initial pedestrian trajectory line.
12. A computer-readable storage medium on which a computer program is stored, the program, when being executed by a processor, implementing the pedestrian trajectory processing method according to any one of claims 1 to 10.
13. An apparatus, comprising: a processor and a memory;
the memory for storing a computer program, the processor for executing the computer program stored by the memory to cause the apparatus to perform the pedestrian trajectory processing method according to any one of claims 1 to 10.
CN201710734605.7A 2017-08-24 2017-08-24 Pedestrian trajectory processing method/system, computer-readable storage medium and device Active CN107610151B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710734605.7A CN107610151B (en) 2017-08-24 2017-08-24 Pedestrian trajectory processing method/system, computer-readable storage medium and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710734605.7A CN107610151B (en) 2017-08-24 2017-08-24 Pedestrian trajectory processing method/system, computer-readable storage medium and device

Publications (2)

Publication Number Publication Date
CN107610151A CN107610151A (en) 2018-01-19
CN107610151B true CN107610151B (en) 2021-07-13

Family

ID=61065802

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710734605.7A Active CN107610151B (en) 2017-08-24 2017-08-24 Pedestrian trajectory processing method/system, computer-readable storage medium and device

Country Status (1)

Country Link
CN (1) CN107610151B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108648213A (en) * 2018-03-16 2018-10-12 Xidian University Implementation method of the KCF tracking algorithm on TMS320C6657
CN108509896B (en) 2018-03-28 2020-10-13 腾讯科技(深圳)有限公司 Trajectory tracking method and device and storage medium
CN109117882B (en) * 2018-08-10 2022-06-03 北京旷视科技有限公司 Method, device and system for acquiring user track and storage medium
CN110781806A (en) * 2019-10-23 2020-02-11 浙江工业大学 Pedestrian detection tracking method based on YOLO
CN111553291B (en) * 2020-04-30 2023-10-17 北京爱笔科技有限公司 Pedestrian track generation method, device, equipment and computer storage medium
CN111914699B (en) * 2020-07-20 2023-08-08 同济大学 Pedestrian positioning and track acquisition method based on video stream of camera
CN112037267B (en) * 2020-11-06 2021-02-02 广州市玄武无线科技股份有限公司 Method for generating panoramic graph of commodity placement position based on video target tracking
JP2023066240A (en) * 2021-10-28 2023-05-15 株式会社日立ハイテク Walking mode visualizing method, program and device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106682573A (en) * 2016-11-15 2017-05-17 中山大学 Pedestrian tracking method of single camera

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4208898B2 (en) * 2006-06-09 2009-01-14 株式会社ソニー・コンピュータエンタテインメント Object tracking device and object tracking method

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106682573A (en) * 2016-11-15 2017-05-17 中山大学 Pedestrian tracking method of single camera

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on Video-Based Pedestrian Detection and Tracking Algorithms; Luo Zhaocai; Wanfang Data Master's Dissertations; 20161103; pp. 1-66 *
Anti-Occlusion Visual Tracking with Spatio-Temporal Context; Liu Wanjun et al.; Journal of Image and Graphics; 20160831; Vol. 21, No. 8; pp. 1057-1067 *

Also Published As

Publication number Publication date
CN107610151A (en) 2018-01-19

Similar Documents

Publication Publication Date Title
CN107610151B (en) Pedestrian trajectory processing method/system, computer-readable storage medium and device
Hannuna et al. DS-KCF: a real-time tracker for RGB-D data
US20200356818A1 (en) Logo detection
CN107103326B (en) Collaborative significance detection method based on super-pixel clustering
JP6188400B2 (en) Image processing apparatus, program, and image processing method
CN109446889B (en) Object tracking method and device based on twin matching network
CN109934065B (en) Method and device for gesture recognition
CN110197149B (en) Ear key point detection method and device, storage medium and electronic equipment
Seo et al. Effective and efficient human action recognition using dynamic frame skipping and trajectory rejection
US20170323149A1 (en) Rotation invariant object detection
CN110544268B (en) Multi-target tracking method based on structured light and SiamMask network
Li et al. Robust vehicle detection in high-resolution aerial images with imbalanced data
Yang et al. Binary descriptor based nonparametric background modeling for foreground extraction by using detection theory
CN111915657A (en) Point cloud registration method and device, electronic equipment and storage medium
EP4209959A1 (en) Target identification method and apparatus, and electronic device
CN106033613B (en) Method for tracking target and device
Liao et al. Multi-scale saliency features fusion model for person re-identification
Huan et al. Human action recognition based on HOIRM feature fusion and AP clustering BOW
CN106991684B (en) Foreground extracting method and device
JP2019220174A (en) Image processing using artificial neural network
CN116071569A (en) Image selection method, computer equipment and storage device
Shyam et al. Retaining image feature matching performance under low light conditions
CN112214639B (en) Video screening method, video screening device and terminal equipment
JP2016081472A (en) Image processing device, and image processing method and program
TW202303451A (en) Nail recognation methods, apparatuses, devices and storage media

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 201505 Room 216, 333 Tingfeng Highway, Tinglin Town, Jinshan District, Shanghai

Applicant after: Huina Technology Co., Ltd.

Address before: 201505 Room 216, 333 Tingfeng Highway, Tinglin Town, Jinshan District, Shanghai

Applicant before: SHANGHAI WINNER INFORMATION TECHNOLOGY CO., INC.

GR01 Patent grant
CP02 Change in the address of a patent holder

Address after: 201203 No. 6, Lane 55, Chuanhe Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai

Patentee after: Winner Technology Co.,Ltd.

Address before: 201505 Room 216, 333 Tingfeng Highway, Tinglin Town, Jinshan District, Shanghai

Patentee before: Winner Technology Co.,Ltd.
