CN114387311B - LKJ file and locomotive video automatic time setting method, device and computer equipment - Google Patents

LKJ file and locomotive video automatic time setting method, device and computer equipment

Info

Publication number
CN114387311B
CN114387311B (application CN202111570909.7A)
Authority
CN
China
Prior art keywords
image frame
locomotive
intensity
motion
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111570909.7A
Other languages
Chinese (zh)
Other versions
CN114387311A (en)
Inventor
李俊成
张晋楷
闫帅
闫龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guoneng Xinshuo Railway Co ltd
Original Assignee
Guoneng Xinshuo Railway Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guoneng Xinshuo Railway Co ltd filed Critical Guoneng Xinshuo Railway Co ltd
Priority to CN202111570909.7A
Publication of CN114387311A
Application granted
Publication of CN114387311B

Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 7/00 Image analysis
                    • G06T 7/20 Analysis of motion
                        • G06T 7/269 Analysis of motion using gradient-based methods
                • G06T 2207/00 Indexing scheme for image analysis or image enhancement
                    • G06T 2207/10 Image acquisition modality
                        • G06T 2207/10016 Video; Image sequence
            • G06F ELECTRIC DIGITAL DATA PROCESSING
                • G06F 18/00 Pattern recognition
                    • G06F 18/20 Analysing
                        • G06F 18/23 Clustering techniques
                        • G06F 18/24 Classification techniques
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
        • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
            • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
                • Y02T 10/00 Road transport of goods or passengers
                    • Y02T 10/10 Internal combustion engine [ICE] based vehicles
                        • Y02T 10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The application relates to a method, a device and computer equipment for automatically time-synchronizing an LKJ file with locomotive video. The method comprises the following steps: acquiring locomotive video, and calculating the optical flow intensity of adjacent image frames in the locomotive video at a plurality of identical pixel positions; for each pair of adjacent image frames, determining the motion intensity between the adjacent image frames according to the plurality of optical flow intensities corresponding to those frames; classifying each motion intensity, and determining a separation score between two adjacent motion intensities according to the type of each motion intensity and a separation algorithm, the two adjacent motion intensities being the two motion intensities corresponding to each image frame other than the first image frame and the last image frame of the locomotive video; determining an optimal separation according to the separation scores, and determining a target image frame according to the optimal separation; and time-synchronizing the locomotive video with the LKJ file according to the target image frame. By adopting the method, the locomotive video and the LKJ file can be automatically time-synchronized.

Description

LKJ file and locomotive video automatic time setting method, device and computer equipment
Technical Field
The application relates to the technical field of image analysis, in particular to an LKJ file and locomotive video automatic time setting method, device and computer equipment.
Background
During locomotive operation, the locomotive video monitoring host records information such as the road conditions along the route and the behavior of the crew in video form. Meanwhile, the LKJ (train operation monitoring and recording device) of the train records the operating conditions of the train and writes the recorded data into an LKJ file. When the locomotive video files from the video monitoring host are analyzed, the time of the locomotive video must be synchronized with the time of the LKJ file, so that the relationship between the LKJ recorded data and the locomotive video can be judged accurately. In practice, however, the locomotive video monitoring host often cannot obtain the time of the LKJ file because of communication failures, equipment failures and the like, so the time of the locomotive video is completely unsynchronized with the time of the LKJ.
In the existing method, image OCR technology is used to extract the text information superimposed on the locomotive video, which generally includes the locomotive speed, the kilometer post, the train number and the like, and the time of the locomotive video is then automatically synchronized with the time of the LKJ file by matching the kilometer post and train number in the video against the kilometer post and train number in the LKJ file.
However, the inventors have found that the existing method cannot automatically synchronize the time of the locomotive video with the time of the LKJ file when the locomotive video carries no text information.
Disclosure of Invention
Based on the foregoing, it is necessary to provide a method, a device and computer equipment for automatically time-synchronizing an LKJ file with locomotive video.
In a first aspect, the application provides an LKJ file and locomotive video automatic time synchronization method. The method comprises the following steps:
acquiring locomotive video, and calculating optical flow intensity of adjacent image frames in the locomotive video at a plurality of same pixel positions; the locomotive video comprises a target image frame, wherein the target image frame is an image frame of the locomotive at the starting moment;
for each adjacent image frame, determining the motion intensity between the adjacent image frames according to a plurality of optical flow intensities corresponding to the adjacent image frames;
Classifying each motion intensity, and determining a separation score between two adjacent motion intensities according to the type of each motion intensity and a separation algorithm; the two adjacent motion intensities are two motion intensities corresponding to each image frame except the image frame at the beginning time and the image frame at the ending time of the locomotive video;
determining an optimal separation according to the separation score, and determining a target image frame according to the optimal separation;
and time-synchronizing the locomotive video with the LKJ file according to the target image frame.
In one embodiment, the step of determining the motion intensity between adjacent image frames according to the plurality of optical flow intensities corresponding to the adjacent image frames includes:
Comparing the plurality of optical flow intensities with a threshold value, respectively;
According to the comparison result, the number of pixel points with the optical flow intensity exceeding the threshold value is counted, and the number is taken as the motion intensity between the adjacent image frames.
In one embodiment, the step of classifying each intensity of motion comprises:
taking the logarithm of each motion intensity, and clustering a plurality of the motion intensities after taking the logarithm;
and classifying the motion intensities according to the clustering result.
In one embodiment, prior to the step of logarithmically taking each motion intensity, the method comprises:
For each motion intensity, if the motion intensity is 0, the motion intensity is updated to 1.
In one embodiment, the step of calculating optical flow intensities of adjacent image frames in locomotive video at a plurality of identical pixel locations comprises:
removing noise of each image frame in locomotive video to obtain a denoising video;
For each adjacent image frame in the denoising video, calculating the optical flow intensity of the adjacent image frame in the denoising video at a plurality of identical pixel positions.
In one embodiment, the step of synchronizing the locomotive video with the LKJ file according to the target image frame includes:
acquiring an LKJ file, and analyzing the LKJ file;
Extracting a target time period according to the analysis result; the target time period comprises a starting time;
and aligning the moment corresponding to the target image frame with the starting moment in the target time period so as to time the locomotive video and the LKJ file.
In a second aspect, the application also provides an LKJ file and locomotive video automatic time setting device. The device comprises:
the optical flow calculation module is used for acquiring locomotive video and calculating optical flow intensity of adjacent image frames in the locomotive video at a plurality of same pixel point positions; the locomotive video comprises a target image frame, wherein the target image frame is an image frame of the locomotive at the starting moment;
The intensity determining module is used for determining the motion intensity between the adjacent image frames according to a plurality of optical flow intensities corresponding to the adjacent image frames for each adjacent image frame;
The score determining module is used for classifying each motion intensity and determining the separation score between two adjacent motion intensities according to the type of each motion intensity and a separation algorithm; the two adjacent motion intensities are the two motion intensities corresponding to each image frame other than the first image frame and the last image frame of the locomotive video;
the target frame determining module is used for determining optimal separation according to the separation score and determining target image frames according to the optimal separation;
and the time setting module is used for setting time of the locomotive video and the LKJ file according to the target image frame.
In a third aspect, the present application also provides a computer device. The computer device comprises a memory in which a computer program is stored and a processor which, when executing the computer program, carries out the steps of the method described above.
In a fourth aspect, the present application also provides a computer-readable storage medium. A computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method described above.
In a fifth aspect, the present application also provides a computer program product. Computer program product comprising a computer program which, when executed by a processor, implements the steps of the method described above.
According to the LKJ file and locomotive video automatic time setting method, device and computer equipment, locomotive video is acquired, wherein the locomotive video comprises a target image frame, the target image frame being the image frame at the moment the locomotive starts moving; the optical flow intensity of adjacent image frames in the locomotive video at a plurality of identical pixel positions can be calculated; the motion intensity between each pair of adjacent image frames is determined according to the plurality of optical flow intensities corresponding to that pair; each motion intensity is classified, and a separation score between two adjacent motion intensities is determined according to the type of each motion intensity and a separation algorithm, the two adjacent motion intensities being the two motion intensities corresponding to each image frame other than the first image frame and the last image frame of the locomotive video; an optimal separation can be determined according to the separation scores, and the target image frame determined according to the optimal separation; and the locomotive video is time-synchronized with the LKJ file according to the target image frame. In this way, the motion intensity between image frames can be determined from the optical flow intensities, the optimal separation can then be determined from the motion intensities, and the target image frame at the moment the locomotive starts moving is thereby obtained; the starting moment of the locomotive in the locomotive video is therefore known, and the locomotive video and the LKJ file can be automatically time-synchronized according to that starting moment.
Drawings
FIG. 1 is a flow chart of a method for automatically synchronizing LKJ files with locomotive video in one embodiment;
FIG. 2 is a flowchart illustrating a step of determining motion intensity between adjacent image frames according to a plurality of optical flow intensities corresponding to the adjacent image frames in one embodiment;
FIG. 3 is a flow chart illustrating the steps for classifying each intensity of motion in one embodiment;
FIG. 4 is a flowchart illustrating steps for synchronizing locomotive video with LKJ files according to a target image frame in one embodiment;
FIG. 5 is a flow chart of an exemplary method for automatically synchronizing LKJ files with locomotive video according to one embodiment;
FIG. 6 is a block diagram of an LKJ file and locomotive video auto-time synchronization device in one embodiment;
Fig. 7 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
In current locomotive video applications, the locomotive video device records scenes of the driving process of the crew, and after the crew member goes off duty a professional analyst reviews the video to confirm whether the crew member committed violations during driving, such as using a mobile phone, dozing, or failing to perform the required pointing-and-calling checks. However, the analysis must be carried out in synchronization with the data of the LKJ file in order to obtain a correct result. The current processing flow is as follows:
1) The locomotive returns to the depot, and the crew member and the administrator respectively transfer the LKJ file and the locomotive video files to a ground server;
2) A video analyst retrieves the locomotive video recording the crew's behavior together with the LKJ file and carries out driving analysis on the crew. The analysis result is obtained either by manually playing the LKJ file and the locomotive video file synchronously on a computer, or by locating key moments in the locomotive video and then looking up the corresponding points in the LKJ file for comparison.
It can be seen from the above flow that locomotive video analysis requires the locomotive video time and the LKJ file time to be synchronized, so that the relationship between the LKJ data and the video can be judged accurately. In practice, however, the on-board video monitoring host often cannot obtain the LKJ file time because of communication faults, equipment faults and the like, so the video time and the LKJ time are completely unsynchronized; that is, the video monitoring host does not use the clock of the LKJ. When this defect is present, the analyst can only synchronize the two by matching the locomotive road-condition video against the LKJ file from experience.
In the current method for synchronously analyzing an LKJ file and a locomotive video file, image OCR technology is used to extract the text information on the video, which generally includes the locomotive speed, the kilometer post, the train number and the like, and the video is then automatically aligned with the LKJ file by matching the kilometer post and train number in the video against those in the LKJ file. However, when communication between the video monitoring host and the LKJ device is abnormal and no caption is superimposed on the video, no text can be captured by OCR, so automatic time synchronization cannot be achieved. In practice, most of the loss of synchronization is caused precisely by abnormal communication between the video monitoring host and the LKJ device, so this technique does not solve that part of the problem.
To solve the above problems, the application provides an LKJ file and locomotive video automatic time setting method, device and computer equipment. According to the application, the locomotive video of the running road conditions can be automatically time-synchronized with the LKJ file, so that all locomotive videos in the video monitoring host can be automatically time-synchronized with the LKJ file.
In one embodiment, as shown in fig. 1, an automatic time synchronization method for LKJ files and locomotive video is provided. The method is described here as applied to a terminal by way of illustration; it is understood that the method may also be applied to a server, or to a system including the terminal and the server and implemented through interaction between the terminal and the server. In this embodiment, the method includes the following steps:
Step S102, acquiring locomotive video and calculating optical flow intensity of adjacent image frames in the locomotive video at a plurality of same pixel positions.
The locomotive video comprises a target image frame, where the target image frame is the image frame captured at the moment the locomotive starts moving.
The locomotive video may be a segment of the road-condition video obtained by recording the running road conditions of the locomotive, and its content includes the process of the locomotive starting to move. The image frame sequence in the locomotive video is extracted to obtain consecutive adjacent image frames. The optical flow intensity is the amount of movement of the same pixel point from the previous frame to the next frame of the video image, and is generally calculated by an optical flow algorithm; optical flow algorithms include the HS (Horn-Schunck) optical flow method, the Lucas-Kanade algorithm and the Pyramidal LK algorithm. It will be appreciated that, in locomotive video, optical flow arises in the video image as the foreground and background move relative to each other.
Specifically, a locomotive video is acquired, and optical flow intensities of a plurality of identical pixel points in adjacent image frames of the locomotive video are calculated through an optical flow algorithm.
In a specific embodiment, when the speed of the locomotive changes, the objects captured by the road-condition camera of the locomotive video device undergo relative displacement, so the optical flow at the image pixel points of the locomotive video can be calculated. However, because the speed of the locomotive changes considerably, the plain Lucas-Kanade algorithm would introduce a large error; to reduce this error, the optical flow intensity at a plurality of identical pixel points in adjacent image frames can be calculated with the pyramid-based optical flow algorithm proposed by Jean-Yves Bouguet (the Pyramidal LK algorithm).
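The following is a minimal sketch of this step, assuming OpenCV (cv2) and NumPy; sampling on a regular grid, the grid spacing, the window size and the pyramid depth are illustrative assumptions rather than values given in the application.

    import cv2
    import numpy as np

    def optical_flow_intensities(prev_frame, next_frame, step=16):
        """Optical flow intensity (displacement magnitude) at a fixed grid of pixel
        positions between two adjacent frames, using the pyramidal Lucas-Kanade
        method (Bouguet). Grid step and LK parameters are illustrative assumptions."""
        prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
        next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)

        # Sample the same pixel positions in every frame pair: a regular grid.
        h, w = prev_gray.shape
        ys, xs = np.mgrid[step // 2:h:step, step // 2:w:step]
        pts = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(np.float32)
        pts = pts.reshape(-1, 1, 2)

        # Pyramidal LK: track each grid point from the previous to the next frame.
        next_pts, status, _err = cv2.calcOpticalFlowPyrLK(
            prev_gray, next_gray, pts, None, winSize=(21, 21), maxLevel=3)

        # Optical flow intensity = magnitude of the per-point displacement.
        flow = (next_pts - pts).reshape(-1, 2)
        intensities = np.linalg.norm(flow, axis=1)
        intensities[status.ravel() == 0] = 0.0  # untracked points carry no motion
        return intensities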
Step S104, for each adjacent image frame, determining the motion intensity between the adjacent image frames according to the optical flow intensities corresponding to the adjacent image frames.
Wherein the intensity of motion will vary between each adjacent image frame. For example, when the locomotive is moving at a high speed, the intensity of motion between adjacent image frames is high; when only constructors walk in the locomotive video and the locomotive is in a stop state, the motion intensity between adjacent image frames is low (comprising the motion intensity of 0).
Specifically, each pair of adjacent image frames has a plurality of optical flow intensities calculated at a plurality of identical pixel points, so the motion intensity between each pair of adjacent image frames can be determined from the plurality of optical flow intensities of that pair.
Step S106, classifying each motion intensity, and determining a separation score between two adjacent motion intensities according to the type of each motion intensity and a separation algorithm.
The two adjacent motion intensities are the two motion intensities corresponding to each image frame other than the first image frame and the last image frame of the locomotive video; it will be appreciated that every image frame of the locomotive video except the first and the last corresponds to two adjacent motion intensities, one shared with the preceding frame and one shared with the following frame. The separation score between two adjacent motion intensities is the score of the separation between them; the separation between the two motion intensities corresponds to the image frame of the locomotive video to which both belong.
Specifically, classification can be performed according to the similarity of the motion intensities; for example, motion intensities of similar magnitude may be regarded as highly similar. After classification, the separation score between every two adjacent motion intensities is calculated according to the type of each motion intensity and a separation algorithm.
In a specific embodiment, the types of motion intensity include high intensity and low intensity. The type of each motion intensity is determined as high intensity or low intensity, motion intensities of the high-intensity type are set to 1, motion intensities of the low-intensity type are set to 0, and the separation score between two adjacent motion intensities is calculated according to the following expression (1):
S = A + B - C - D (1)
where S is the separation score; A is the number of low-intensity motion intensities before the separation, i.e. the number of 0s before the separation; B is the number of high-intensity motion intensities after the separation, i.e. the number of 1s after the separation; C is the number of high-intensity motion intensities before the separation, i.e. the number of 1s before the separation; and D is the number of low-intensity motion intensities after the separation, i.e. the number of 0s after the separation.
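A sketch of expression (1), assuming the motion-intensity types have already been encoded as 0 (low intensity) and 1 (high intensity); the function and variable names are ours, not the application's.

    import numpy as np

    def separation_scores(labels):
        """Separation score S = A + B - C - D at each candidate split, where the
        split between labels[k-1] and labels[k] corresponds to image frame k.
        labels: array of 0/1 motion-intensity types, one per adjacent frame pair."""
        labels = np.asarray(labels)
        n = len(labels)
        ones_before = np.cumsum(labels)                # number of 1s in labels[:k]
        zeros_before = np.arange(1, n + 1) - ones_before
        total_ones = ones_before[-1]
        scores = []
        for k in range(1, n):                          # split between k-1 and k
            A = zeros_before[k - 1]                    # low intensities before the split
            C = ones_before[k - 1]                     # high intensities before the split
            B = total_ones - C                         # high intensities after the split
            D = (n - k) - B                            # low intensities after the split
            scores.append(A + B - C - D)
        return np.array(scores)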
Step S108, determining optimal separation according to the separation score, and determining a target image frame according to the optimal separation;
specifically, according to each calculated separation score, determining an optimal separation; and determining the target image frame based on the optimal separation.
In a specific embodiment, the separation corresponding to the separation score with the largest value is determined as the optimal separation, and the image frame of the locomotive video corresponding to the optimal separation is determined as the target image frame.
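Continuing the sketch above, choosing the optimal separation then reduces to an argmax over the scores; the index arithmetic assumes, as in the sketch, that the split between motion intensities k-1 and k corresponds to image frame k.

    # labels: 0/1 motion-intensity types, one per adjacent frame pair (sketch above)
    scores = separation_scores(labels)
    best_split = int(scores.argmax()) + 1    # split between labels[best_split-1] and labels[best_split]
    target_frame_index = best_split          # image frame at the optimal separation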
Step S110, time-synchronizing the locomotive video with the LKJ file according to the target image frame.
The target image frame is the image frame at the moment the locomotive starts moving; the LKJ file comprises a time axis, and the times on the time axis include that starting moment.
Specifically, the time axis of the LKJ file may be extracted, and the starting moment of the locomotive on the time axis in the LKJ file may be aligned with the target image frame to obtain a synchronized locomotive video and LKJ file.
Further, the locomotive video is one segment of the road-condition video obtained by recording the running road conditions of the locomotive, so the complete road-condition video can be synchronized with the LKJ file according to the synchronized locomotive video and LKJ file.
Further, the behavior video obtained by recording the behavior of the crew and the road-condition video are connected to the same locomotive video monitoring host, so the behavior video can be synchronized with the LKJ file according to the synchronized road-condition video and LKJ file; it is then possible to analyze, based on the data in the LKJ file, whether the crew member committed violations.
In this embodiment, locomotive video is obtained, wherein the locomotive video comprises a target image frame, the target image frame being the image frame at the moment the locomotive starts moving; the optical flow intensity of adjacent image frames in the locomotive video at a plurality of identical pixel positions can be calculated; the motion intensity between each pair of adjacent image frames is determined according to the plurality of optical flow intensities corresponding to that pair; each motion intensity is classified, and a separation score between two adjacent motion intensities is determined according to the type of each motion intensity and a separation algorithm, the two adjacent motion intensities being the two motion intensities corresponding to each image frame other than the first image frame and the last image frame of the locomotive video; an optimal separation can be determined according to the separation scores, and the target image frame determined according to the optimal separation; and the locomotive video is time-synchronized with the LKJ file according to the target image frame. In this way, the motion intensity between image frames can be determined from the optical flow intensities, the optimal separation can then be determined from the motion intensities, the target image frame at the moment the locomotive starts moving is thereby obtained, the starting moment of the locomotive in the locomotive video is known, and the locomotive video and the LKJ file can be automatically time-synchronized according to that starting moment.
In one embodiment, as shown in fig. 2, the step of determining the motion intensity between adjacent image frames according to the plurality of optical flow intensities corresponding to the adjacent image frames includes:
step S202, comparing the plurality of optical flow intensities with threshold values respectively;
Step S204, according to the comparison result, counting the number of pixel points with the optical flow intensity exceeding a threshold value, and taking the number as the motion intensity between the adjacent image frames.
The threshold can be set according to the optical flow intensity of the video image frames when the locomotive is actually running. Even when the locomotive is stopped, a moving object may appear in the locomotive video; in the image frames at the moments when such an object is moving, the optical flow intensities at several pixel positions between adjacent frames may exceed the threshold. An optical flow intensity exceeding the threshold is an optical flow intensity greater than the threshold; an optical flow intensity less than or equal to the threshold is one that does not exceed the threshold.
Specifically, each pair of adjacent image frames has a plurality of optical flow intensities, and each optical flow intensity is compared with the preset threshold to obtain a comparison result. According to the comparison result for each optical flow intensity in each pair of adjacent image frames, the number of pixel points whose optical flow intensity is greater than the threshold is counted, and this number is taken as the motion intensity of that pair of adjacent image frames. It is understood that the motion intensity between adjacent image frames may be 0.
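A minimal sketch of this counting step, reusing the per-point intensities from the optical flow sketch above; the threshold value is an assumption to be tuned against real footage, not one given in the application.

    import numpy as np

    FLOW_THRESHOLD = 1.0  # assumed value; the application leaves the threshold to tuning

    def motion_intensity(point_intensities, threshold=FLOW_THRESHOLD):
        """Motion intensity of one pair of adjacent frames: the number of sampled
        pixel positions whose optical flow intensity exceeds the threshold."""
        return int(np.count_nonzero(np.asarray(point_intensities) > threshold))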
In this embodiment, the number of pixels whose optical flow intensity exceeds the threshold is used as the motion intensity between the adjacent image frames, so that the motion intensity between the adjacent image frames can be reflected more intuitively. Meanwhile, the motion intensity is reflected more accurately, so that the target image frame can be determined more accurately, and the time setting accuracy is improved.
In one embodiment, as shown in fig. 3, the step of classifying each motion intensity includes:
step S302, taking the logarithm of each motion intensity, and clustering a plurality of motion intensities after taking the logarithm;
Step S304, classifying the motion intensities according to the clustering result.
The clustering is an algorithm for processing data, and the clustering algorithm comprises a K-Means clustering algorithm, a mean shift clustering algorithm and a K-means++ clustering algorithm.
Specifically, taking the logarithm of the motion intensity between each adjacent image frame, and clustering a plurality of motion intensities after taking the logarithm to obtain a clustering result. Based on the result of the clustering, each motion intensity may be classified, and the type of each motion intensity may be further determined.
In a specific embodiment, the logarithm of each motion intensity is taken, the K-Means clustering algorithm is used to cluster the logarithmic motion intensities, and the motion intensities are classified into high intensity and low intensity according to the clustering result.
In this embodiment, taking the logarithm of each motion intensity allows the clustering to be performed more accurately and makes the cluster boundaries simpler and clearer, so the accuracy of locating the target image frame can be improved, and the accuracy of the time synchronization is improved in turn.
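A sketch of this classification step, assuming scikit-learn's KMeans (any two-class clustering would do); it also applies the zero-to-one update described in the next embodiment so that the logarithm is always defined.

    import numpy as np
    from sklearn.cluster import KMeans

    def classify_motion_intensities(motion_intensities):
        """Classify motion intensities into low (0) and high (1) by clustering
        their logarithms with K-Means (k=2). A zero intensity is first raised
        to 1 so that the logarithm is defined."""
        m = np.maximum(np.asarray(motion_intensities, dtype=float), 1.0)
        log_m = np.log(m).reshape(-1, 1)
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(log_m)
        # Make label 1 denote the cluster with the larger mean (high intensity).
        high_cluster = int(np.argmax(km.cluster_centers_.ravel()))
        return (km.labels_ == high_cluster).astype(int)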
In one embodiment, before the step of taking the logarithm of each motion intensity, the method comprises: for each motion intensity, if the motion intensity is 0, the motion intensity is updated to 1.
Specifically, the motion intensity between adjacent image frames may be 0, and the logarithm of 0 cannot be taken; an error would therefore occur when taking the logarithm, causing the classification result after clustering to deviate.
If a motion intensity is 0, it is updated to 1, and the logarithm of the updated value is taken, so errors in the classification after clustering can be reduced.
In one embodiment, the step of calculating optical flow intensities of adjacent image frames in locomotive video at a plurality of identical pixel locations comprises:
removing noise of each image frame in locomotive video to obtain a denoising video;
For each adjacent image frame in the denoising video, calculating the optical flow intensity of the adjacent image frame in the denoising video at a plurality of identical pixel positions.
Specifically, each image frame of the locomotive video may contain noise; the noise is removed from each image frame of the locomotive video to obtain the denoising video. The optical flow intensity at a plurality of identical pixel points is then calculated between each pair of adjacent image frames in the denoising video.
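A minimal sketch of the denoising step; the application does not name a specific denoising method, so a Gaussian blur is used here purely as an illustrative stand-in.

    import cv2

    def denoise_frame(frame):
        """Remove sensor noise from one video frame before optical flow is computed.
        A Gaussian blur is an assumed stand-in for whatever denoising is used."""
        return cv2.GaussianBlur(frame, (5, 5), 0)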
In one embodiment, as shown in FIG. 4, the step of synchronizing locomotive video with the LKJ file according to the target image frame includes:
step S402, an LKJ file is obtained, and the LKJ file is analyzed;
step S404, extracting a target time period according to the analysis result; the target time period comprises a starting time;
In step S406, the moment corresponding to the target image frame is aligned with the starting moment in the target time period, so as to time-synchronize the locomotive video with the LKJ file.
The LKJ file is formed from the data recorded by the LKJ for the same train number and the same running time as the locomotive video. The target time period includes the starting moment of the locomotive in the locomotive video, that starting moment being the moment corresponding to the target image frame; if the locomotive starts several times, the target time period is the one that includes the moment corresponding to the target image frame obtained from the locomotive video.
Specifically, the LKJ file with the same train number and the same running time as the locomotive video is obtained and parsed. According to the result of parsing the LKJ file, the target time period including the moment corresponding to the target image frame is extracted. The moment corresponding to the target image frame is aligned with the starting moment in the target time period, so the locomotive video can be time-synchronized with the LKJ file.
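A sketch of the alignment step. Parsing the LKJ file itself is format-specific and not shown here; the sketch assumes the starting moment has already been extracted from the parsed file as a datetime, and simply anchors the video timeline to it.

    from datetime import timedelta

    def synchronize(num_frames, fps, target_frame_index, lkj_start_time):
        """Assign a wall-clock time to every video frame so that the target image
        frame (the locomotive start) coincides with the starting moment recorded
        in the LKJ file; lkj_start_time is assumed to be a datetime.datetime
        already extracted from the parsed LKJ file."""
        return [lkj_start_time + timedelta(seconds=(i - target_frame_index) / fps)
                for i in range(num_frames)]

With this choice the video's own (untrusted) clock is never used: every frame timestamp is derived from the LKJ starting moment and the frame rate alone.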
In a specific embodiment, as shown in fig. 5, the method for automatically synchronizing LKJ file with locomotive video includes the following steps:
Step S501, acquiring locomotive video, removing noise of each image frame in the locomotive video, and obtaining denoising video; the locomotive video comprises a target image frame, wherein the target image frame is an image frame of the locomotive at the starting moment;
step S502, for each pair of adjacent image frames in the denoising video, calculating the optical flow intensity of the adjacent image frames at a plurality of identical pixel positions by using the Pyramidal LK algorithm;
step S503, comparing the plurality of optical flow intensities with a threshold value for each adjacent image frame; according to the comparison result, counting the number of pixel points with the optical flow intensity exceeding a threshold value, and taking the number as the motion intensity between adjacent image frames;
step S504, for each motion intensity, if the motion intensity is 0, updating the motion intensity to 1;
Step S505, taking the logarithm of each motion intensity, and clustering the logarithmic motion intensities; classifying the motion intensities into high intensity and low intensity according to the clustering result;
Step S506, determining the type of each motion intensity as high intensity or low intensity, setting motion intensities of the high-intensity type to 1 and motion intensities of the low-intensity type to 0, and calculating the separation score between two adjacent motion intensities according to expression (1); the two adjacent motion intensities are the two motion intensities corresponding to each image frame other than the first image frame and the last image frame of the locomotive video;
Step S507, determining the separation corresponding to the separation score with the largest value as the optimal separation, and determining the image frame of the locomotive video corresponding to the optimal separation as the target image frame;
step S508, an LKJ file is obtained, and the LKJ file is analyzed; extracting a target time period according to the analysis result; the target time period comprises a starting time;
step S509, aligning the moment corresponding to the target image frame with the starting moment in the target time period, so as to time-synchronize the locomotive video with the LKJ file.
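Putting steps S501 to S509 together, the following sketch chains the illustrative helpers sketched above (denoise_frame, optical_flow_intensities, motion_intensity, classify_motion_intensities, separation_scores, synchronize); reading frames through OpenCV's VideoCapture and falling back to an assumed frame rate are additional assumptions, not part of the application.

    import cv2

    def autoset_time(video_path, lkj_start_time):
        """End-to-end sketch of steps S501-S509 using the helpers sketched above."""
        cap = cv2.VideoCapture(video_path)
        fps = cap.get(cv2.CAP_PROP_FPS) or 25.0        # fall back to an assumed frame rate

        frames = []                                    # S501: read and denoise every frame
        ok, frame = cap.read()
        while ok:
            frames.append(denoise_frame(frame))
            ok, frame = cap.read()
        cap.release()

        motions = []                                   # S502-S503: per-pair motion intensity
        for prev, nxt in zip(frames[:-1], frames[1:]):
            motions.append(motion_intensity(optical_flow_intensities(prev, nxt)))

        labels = classify_motion_intensities(motions)  # S504-S505: 0 = low, 1 = high
        scores = separation_scores(labels)             # S506: score every candidate split
        target_frame_index = int(scores.argmax()) + 1  # S507: frame at the optimal separation

        # S508-S509: anchor the video timeline to the LKJ starting moment
        return synchronize(len(frames), fps, target_frame_index, lkj_start_time)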
It should be understood that, although the steps in the flowcharts related to the embodiments described above are sequentially shown as indicated by arrows, these steps are not necessarily sequentially performed in the order indicated by the arrows. The steps are not strictly limited to the order of execution unless explicitly recited herein, and the steps may be executed in other orders. Moreover, at least some of the steps in the flowcharts described in the above embodiments may include a plurality of steps or a plurality of stages, which are not necessarily performed at the same time, but may be performed at different times, and the order of the steps or stages is not necessarily performed sequentially, but may be performed alternately or alternately with at least some of the other steps or stages.
Based on the same inventive concept, the embodiment of the application also provides an LKJ file and locomotive video automatic time-setting device for realizing the above-mentioned LKJ file and locomotive video automatic time-setting method. The implementation scheme of the device for solving the problem is similar to that described in the above method, so the specific limitation in the embodiments of the device for automatically timing one or more LKJ files and locomotive video provided below can be referred to the above limitation on the method for automatically timing the LKJ files and locomotive video, and will not be repeated here.
In one embodiment, as shown in fig. 6, there is provided an LKJ file and locomotive video automatic time setting device, including: an optical flow calculation module 610, an intensity determination module 620, a score determination module 630, a target frame determination module 640, and a time synchronization module 650, wherein:
the optical flow calculation module 610 is configured to obtain a locomotive video, and calculate optical flow intensities of adjacent image frames in the locomotive video at a plurality of identical pixel positions; the locomotive video comprises a target image frame, wherein the target image frame is an image frame of the locomotive at the starting moment;
The intensity determining module 620 is configured to determine, for each of the adjacent image frames, a motion intensity between the adjacent image frames according to a plurality of optical flow intensities corresponding to the adjacent image frames;
The score determining module 630 is configured to classify each motion intensity, and determine a separation score between two adjacent motion intensities according to the type of each motion intensity and a separation algorithm; the two adjacent motion intensities are the two motion intensities corresponding to each image frame other than the first image frame and the last image frame of the locomotive video;
The target frame determining module 640 is configured to determine an optimal separation according to the separation score, and determine a target image frame according to the optimal separation;
The time synchronization module 650 is configured to time the locomotive video with the LKJ file according to the target image frame.
In one embodiment, the intensity determination module 620 includes a threshold comparison unit and a statistics unit.
The threshold comparison unit is used for comparing the plurality of optical flow intensities with a threshold respectively;
The statistics unit is used for counting the number of pixel points with the optical flow intensity exceeding a threshold value according to the comparison result, and taking the number as the motion intensity between adjacent image frames.
In one embodiment, the score determination module 630 includes a logarithmic unit, a clustering unit, and a classification unit;
The logarithmic unit is used for taking the logarithm of each motion intensity;
The clustering unit is used for clustering the plurality of the motion intensities after taking the logarithm;
the classifying unit is used for classifying each motion intensity according to the clustering result.
In one embodiment, optical flow calculation module 610 includes a denoising unit and an optical flow unit;
the denoising unit is used for removing noise of each image frame in the locomotive video to obtain a denoising video;
The optical flow unit is used for calculating the optical flow intensity of each adjacent image frame in the denoising video at a plurality of identical pixel point positions.
In one embodiment, the time synchronization module 650 includes a file acquisition unit, an parsing unit, a time extraction unit, and a synchronization unit;
the file acquisition unit is used for acquiring an LKJ file;
the analysis unit is used for analyzing the LKJ file;
The time extraction unit is used for extracting a target time period according to the analysis result; the target time period comprises a starting time;
The synchronization unit is used for aligning the moment corresponding to the target image frame with the starting moment in the target time period so as to time the locomotive video and the LKJ file.
All or part of each module in the LKJ file and locomotive video automatic time setting device can be realized by software, hardware and a combination thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a server, the internal structure of which may be as shown in fig. 7. The computer device includes a processor, a memory, and a network interface connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The database of the computer equipment is used for storing locomotive video, LKJ files and other data. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program when executed by the processor implements a method for automatically synchronizing LKJ files with locomotive video.
It will be appreciated by those skilled in the art that the structure shown in FIG. 7 is merely a block diagram of some of the structures associated with the present inventive arrangements and is not limiting of the computer device to which the present inventive arrangements may be applied, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
In an embodiment, there is also provided a computer device comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the method embodiments described above when the computer program is executed.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, carries out the steps of the method embodiments described above.
In an embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, database, or other medium used in embodiments provided herein may include at least one of non-volatile and volatile memory. The nonvolatile Memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash Memory, optical Memory, high density embedded nonvolatile Memory, resistive random access Memory (ReRAM), magneto-resistive random access Memory (Magnetoresistive Random Access Memory, MRAM), ferroelectric Memory (Ferroelectric Random Access Memory, FRAM), phase change Memory (PHASE CHANGE Memory, PCM), graphene Memory, and the like. Volatile memory can include random access memory (Random Access Memory, RAM) or external cache memory, and the like. By way of illustration, and not limitation, RAM can be in various forms such as static random access memory (Static Random Access Memory, SRAM) or dynamic random access memory (Dynamic Random Access Memory, DRAM), etc. The databases referred to in the embodiments provided herein may include at least one of a relational database and a non-relational database. The non-relational database may include, but is not limited to, a blockchain-based distributed database, and the like. The processor referred to in the embodiments provided in the present application may be a general-purpose processor, a central processing unit, a graphics processor, a digital signal processor, a programmable logic unit, a data processing logic unit based on quantum computing, or the like, but is not limited thereto.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The foregoing examples illustrate only a few embodiments of the application and are described in detail herein without thereby limiting the scope of the application. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the application, which are all within the scope of the application. Accordingly, the scope of the application should be assessed as that of the appended claims.

Claims (10)

1. An automatic time alignment method for LKJ files and locomotive videos, which is characterized by comprising the following steps:
acquiring locomotive video, and calculating optical flow intensity of adjacent image frames in the locomotive video at a plurality of same pixel point positions; the locomotive video comprises a target image frame, wherein the target image frame is an image frame of the locomotive at the starting moment;
For each of the adjacent image frames, determining a motion intensity between the adjacent image frames according to a plurality of optical flow intensities corresponding to the adjacent image frames;
Classifying each motion intensity, and determining a separation score between two adjacent motion intensities according to the type of each motion intensity and a separation algorithm; the two adjacent motion intensities are two motion intensities corresponding to each image frame except the image frame at the beginning time and the image frame at the ending time of the locomotive video;
Determining an optimal separation according to the separation score, and determining the target image frame according to the optimal separation;
And according to the target image frame, the locomotive video and the LKJ file are paired.
2. The method of claim 1, wherein determining the intensity of motion between the adjacent image frames from a plurality of the optical flow intensities corresponding to the adjacent image frames comprises:
comparing a plurality of said optical flow intensities with a threshold value, respectively;
And according to the comparison result, counting the number of pixel points with the optical flow intensity exceeding a threshold value, and taking the number as the motion intensity between the adjacent image frames.
3. The method of claim 1, wherein the step of classifying each intensity of motion comprises:
taking the logarithm of each motion intensity, and clustering a plurality of motion intensities after taking the logarithm;
and classifying each motion intensity according to the clustering result.
4. A method according to claim 3, comprising, prior to the step of logarithmically taking each of the motion intensities:
for each motion intensity, if the motion intensity is 0, the motion intensity is updated to 1.
5. The method of claim 1, wherein the step of calculating optical flow intensities for adjacent image frames in the locomotive video at a plurality of identical pixel locations comprises:
Removing noise of each image frame in the locomotive video to obtain a denoising video;
For each adjacent image frame in the denoising video, calculating optical flow intensity of the adjacent image frame in the denoising video at a plurality of identical pixel point positions.
6. The method of claim 1, wherein the step of synchronizing the locomotive video with the LKJ file based on the target image frame comprises:
Acquiring the LKJ file, and analyzing the LKJ file;
extracting a target time period according to the analysis result; the target time period comprises the starting time;
and aligning the moment corresponding to the target image frame with the starting moment in the target time period so as to time the locomotive video and the LKJ file.
7. An apparatus for automatically synchronizing LKJ files with locomotive video, said apparatus comprising:
The optical flow calculation module is used for acquiring locomotive video and calculating optical flow intensity of adjacent image frames in the locomotive video at a plurality of same pixel point positions; the locomotive video comprises a target image frame, wherein the target image frame is an image frame of the locomotive at the starting moment;
an intensity determining module, configured to determine, for each of the adjacent image frames, a motion intensity between the adjacent image frames according to a plurality of optical flow intensities corresponding to the adjacent image frames;
The score determining module is used for classifying each motion intensity and determining a separation score between two adjacent motion intensities according to the type of each motion intensity and a separation algorithm; the two adjacent motion intensities are two motion intensities corresponding to each image frame except the image frame at the beginning time and the image frame at the ending time of the locomotive video;
A target frame determining module for determining an optimal separation according to the separation score and determining the target image frame according to the optimal separation;
and the time setting module is used for setting time of the locomotive video and the LKJ file according to the target image frame.
8. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1 to 6 when the computer program is executed.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 6.
10. A computer program product comprising a computer program, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 6.
CN202111570909.7A 2021-12-21 2021-12-21 LKJ file and locomotive video automatic time setting method, device and computer equipment Active CN114387311B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111570909.7A CN114387311B (en) 2021-12-21 2021-12-21 LKJ file and locomotive video automatic time setting method, device and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111570909.7A CN114387311B (en) 2021-12-21 2021-12-21 LKJ file and locomotive video automatic time setting method, device and computer equipment

Publications (2)

Publication Number Publication Date
CN114387311A CN114387311A (en) 2022-04-22
CN114387311B 2024-06-25

Family

ID=81196961

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111570909.7A Active CN114387311B (en) 2021-12-21 2021-12-21 LKJ file and locomotive video automatic time setting method, device and computer equipment

Country Status (1)

Country Link
CN (1) CN114387311B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105323420A (en) * 2014-07-29 2016-02-10 腾讯科技(深圳)有限公司 Video image processing method and apparatus

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110019060B (en) * 2017-12-27 2021-04-13 郑州畅想高科股份有限公司 Method and device for automatically synchronizing locomotive video file and operation record file
CN111340101B (en) * 2020-02-24 2023-06-30 广州虎牙科技有限公司 Stability evaluation method, apparatus, electronic device, and computer-readable storage medium
CN113763988B (en) * 2020-06-01 2024-05-28 中车株洲电力机车研究所有限公司 Time synchronization method and system for locomotive cab monitoring information and LKJ monitoring information
CN113450579B (en) * 2021-08-30 2021-12-14 腾讯科技(深圳)有限公司 Method, device, equipment and medium for acquiring speed information

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105323420A (en) * 2014-07-29 2016-02-10 腾讯科技(深圳)有限公司 Video image processing method and apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
铁路机车安全视频报警记录*** (railway locomotive safety video alarm recording ***); 陆海空; 朱蒙; China Science and Technology Information (中国科技信息); 2010-09-15 (Issue 18); full text *

Also Published As

Publication number Publication date
CN114387311A (en) 2022-04-22

Similar Documents

Publication Publication Date Title
US10803357B2 (en) Computer-readable recording medium, training method, and object detection device
CN108229456B (en) Target tracking method and device, electronic equipment and computer storage medium
US9418297B2 (en) Detecting video copies
EP2891990B1 (en) Method and device for monitoring video digest
Fuhl et al. Automatic Generation of Saliency-based Areas of Interest for the Visualization and Analysis of Eye-tracking Data.
US11527000B2 (en) System and method for re-identifying target object based on location information of CCTV and movement information of object
US11068713B1 (en) Video-based intelligent road traffic universal analysis
US10271018B2 (en) Method of detecting critical objects from CCTV video using metadata filtering
CN110430339B (en) Digital video intra-frame tampering detection method and system
CN110418129B (en) Digital video interframe tampering detection method and system
KR101821989B1 (en) Method of providing detection of moving objects in the CCTV video data by reconstructive video processing
CN110348392B (en) Vehicle matching method and device
CN111832492A (en) Method and device for distinguishing static traffic abnormality, computer equipment and storage medium
CN102301697A (en) Video identifier creation device
CN110969645A (en) Unsupervised abnormal track detection method and unsupervised abnormal track detection device for crowded scenes
CN114387311B (en) LKJ file and locomotive video automatic time setting method, device and computer equipment
Jin et al. Object-based video forgery detection via dual-stream networks
US20220375202A1 (en) Hierarchical sampling for object identification
CN112581489A (en) Video compression method, device and storage medium
CN111553408B (en) Automatic test method for video recognition software
CN111062294B (en) Passenger flow queuing time detection method, device and system
CN111597979B (en) Target object clustering method and device
CN112381058A (en) Black and white list control method and device based on pedestrian re-identification
CN112380970A (en) Video target detection method based on local area search
CN117411987B (en) Drop-out time detection method, drop-out time detection equipment and storage medium for monitoring video

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant