CN111127507A - Method and system for determining a projectile - Google Patents

Method and system for determining a projectile

Info

Publication number
CN111127507A
CN111127507A (application number CN201911313526.4A)
Authority
CN
China
Prior art keywords
projectile
image
determination
background image
detected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911313526.4A
Other languages
Chinese (zh)
Inventor
王博文
谭志国
石永禄
毛河
陈志超
周彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Topplusvision Science & Technology Co ltd
Original Assignee
Chengdu Topplusvision Science & Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Topplusvision Science & Technology Co ltd
Priority to CN201911313526.4A
Publication of CN111127507A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/20 - Analysis of motion
    • G06T 7/215 - Motion-based segmentation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/10 - Segmentation; Edge detection
    • G06T 7/194 - Segmentation; Edge detection involving foreground-background segmentation

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiments of the present application disclose a method and a system for determining a projectile. The projectile determination method includes: acquiring at least one monitoring area image; determining a current background image based on the at least one monitoring area image; determining a change region in the current background image based on a historical background image; and detecting a projectile in the change region. With the projectile determination method, projectiles appearing on a road can be found and reported in time, potential road safety hazards are reduced, the efficiency of projectile detection is improved, and the accuracy of the detection results is improved.

Description

Method and system for determining a projectile
Technical Field
The present application relates to the field of image processing, and in particular, to a method and a system for determining a projectile.
Background
With the continuous development of the economy and of public roads, traffic volume has grown rapidly, and traffic accidents have grown with it. Among them, objects accidentally spilled or thrown onto roads pose a large potential safety hazard to moving vehicles, may cause traffic accidents or even more serious consequences, and have become a traffic safety problem in urgent need of a solution. It is therefore necessary to provide a projectile determination method that can find and report projectiles on a road in time and reduce potential road safety hazards.
Disclosure of Invention
One aspect of the present application provides a projectile determination method. The projectile determination method includes: acquiring at least one monitoring area image; determining a current background image based on the at least one monitoring area image; determining a change region in the current background image based on a historical background image; and detecting a projectile in the change region.
In some embodiments, the current background image is an image obtained by removing moving objects from the monitoring area image, and the historical background image is an image obtained by removing moving objects from a historical monitoring area image.
In some embodiments, the changed region includes a region of the current background image that is different from the historical background image.
In some embodiments, the detecting of a projectile in the change region further comprises: obtaining a set of detected projectiles, the set of detected projectiles including one or more detected projectiles and their associated information, the associated information of a detected projectile at least reflecting the position of the detected projectile; comparing the change region with the associated information of the one or more detected projectiles; in response to the comparison result being a match, determining that the change region contains a detected projectile; in response to the comparison result being a mismatch, detecting a projectile in the change region using a projectile determination model, the projectile determination model being a machine learning model, and taking the detected projectile as a new projectile; and updating the set of detected projectiles based on the new projectile.
In some embodiments, the projectile determination model is obtained by: acquiring a sample image, the sample image including at least one projectile; labeling the projectile in the sample image; and training a machine learning model using the sample image as input data and the labeled projectile as output data or as a reference standard, to obtain a trained projectile determination model.
In some embodiments, the objects in the sample image comprise at least one of: a vehicle, a pedestrian, a road, a lane line, a guard rail/median, a plant, a traffic cone, a road billboard, dirt/a soil slope, or a spill/drop.
In some embodiments, the projectile determination method provided herein further comprises: outputting a detection result, wherein the detection result comprises one or more of the following information: whether there is a projectile, the type of the projectile, the location of the projectile, the number of projectiles, and a picture of the projectile.
In some embodiments, the output mode of the detection result comprises one or a combination of the following: highlighting the projectile, marking the projectile with a prompt identifier, a text prompt for the projectile, or a sound prompt for the projectile.
In some embodiments, the method of projectile determination provided herein further comprises updating the historical background image based on the current background image.
Another aspect of the application provides a projectile determination system comprising an acquisition module, a background extraction module, a change region determination module, and a projectile detection module; the acquisition module is used for acquiring at least one monitoring area image; the background extraction module is used for determining a current background image based on at least one monitoring area image; the change region determination module is used for determining a change region in the current background image based on a historical background image; the projectile detection module is for detecting a projectile in the varying area.
In some embodiments, the projectile determination system provided herein further comprises an update module for updating the set of detected projectiles based on the new projectile. In some embodiments, the update module is further configured to update the historical background image based on the current background image.
In some embodiments, the projectile determination system provided herein further comprises an output module for outputting the detection results in a manner including one or a combination of: highlighting the projectile, prompting the projectile with a prompt identifier, text prompting the projectile, or sound prompting the projectile.
Another aspect of the application provides a projectile determination device, the device comprising a processor and a memory; the memory is configured to store computer instructions that, when executed by the processor, cause the apparatus to implement a projectile determination method as previously described.
Another aspect of the application provides a computer-readable storage medium storing computer instructions, at least a portion of which, when executed by at least one processor, implement a method of projectile determination as previously described.
Drawings
The present application will be further explained by way of exemplary embodiments, which will be described in detail by way of the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like numerals are used to indicate like structures, wherein:
FIG. 1 is a schematic view of an application scenario of a projectile determination system according to some embodiments of the present application;
FIG. 2 is a block diagram of a projectile determination system according to some embodiments of the present application;
FIG. 3 is an exemplary flow chart of a method of projectile determination shown in accordance with some embodiments of the present application;
FIG. 4 is an exemplary flow chart of a projectile determination method according to some embodiments of the present application;
FIG. 5 is an exemplary flow chart of a method of varying area projectile detection according to some embodiments of the present application;
FIG. 6 is an exemplary flow chart of a projectile determination method according to some embodiments of the present application;
FIG. 7 is an exemplary flow chart of a method of acquiring a projectile determination model according to some embodiments of the present application.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. The drawings in the following description are only examples or embodiments of the application; based on these drawings, a person skilled in the art can apply the application to other similar scenarios without inventive effort. Unless it is obvious from the context or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "apparatus", "unit" and/or "module" as used herein is a method for distinguishing different components, elements, parts, portions or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this application and the appended claims, the singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; the steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Flow charts are used herein to illustrate operations performed by systems according to embodiments of the present application. It should be understood that the operations are not necessarily performed in the exact order shown. Rather, the various steps may be processed in reverse order or simultaneously. Other operations may also be added to the processes, or one or more steps of operations may be removed from the processes.
FIG. 1 is a schematic diagram of an application scenario of a projectile determination system according to some embodiments of the present application. The projectile determination system 100 can determine projectiles in the road monitoring area image, so that monitoring management personnel can clear the road projectiles in time, and traffic accidents are reduced.
The projectile determination system 100 may be an online monitoring platform for roadway management. For example, the projectile determination system 100 may be an intelligent video surveillance platform for safety precaution. In some embodiments, the projectile determination system 100 may be applied to crowded public areas with complex pedestrian flows, such as airports, stadiums, waiting halls, and exhibition venues. In some embodiments, the projectile determination system 100 may also be applied to projectile detection on roads such as streets and highways. The projectile determination system 100 may include a server 110, a network 120, a user terminal 130, a storage device 140, and an image acquisition device 150. The server 110 may include a processing device 112.
In some embodiments, the server 110 may be used to process information and/or data related to determining a projectile. The server 110 may be a stand-alone server or a group of servers. The group of servers can be centralized or distributed (e.g., server 110 can be a distributed system). In some embodiments, the server 110 may be local or remote. For example, server 110 may access information and/or data stored in the user terminal 130 and the storage device 140 through the network 120. In some embodiments, server 110 may be directly connected to the user terminal 130, the storage device 140, and the image acquisition device 150 to access information and/or data stored therein. In some embodiments, the server 110 may execute on a cloud platform. For example, the cloud platform may include one or any combination of a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an internal cloud, and the like.
In some embodiments, the server 110 may include a processing device 112. The processing device 112 may process data and/or information related to determining a projectile (e.g., a request to detect a projectile on a roadway) to perform one or more of the functions described herein. For example, the processing device 112 may receive the images captured by the image acquisition device 150, process the images to determine whether they contain a projectile and to obtain information related to the projectile, and output the presence of the projectile in the monitored area to the user terminal 130. In some embodiments, the processing device 112 may include one or more sub-processing devices (e.g., a single-core processing device or a multi-core processing device). By way of example only, the processing device 112 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, or the like, or any combination thereof.
Network 120 may facilitate the exchange of data and/or information. In some embodiments, one or more components in the projectile determination system 100 (e.g., the server 110, the user terminal 130, the storage device 140, the image acquisition device 150) may send data and/or information to other components in the projectile determination system 100 via the network 120. In some embodiments, network 120 may be any type of wired or wireless network. For example, network 120 may include a cable network, a wired network, a fiber optic network, a telecommunications network, an intranet, the internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a bluetooth network, a ZigBee network, a Near Field Communication (NFC) network, the like, or any combination thereof. In some embodiments, network 120 may include one or more network access points. For example, the network 120 may include wired or wireless network access points, such as base stations and/or Internet switching points 120-1, 120-2, …, through which one or more components of the spill determination system 100 may connect to the network 120 to exchange data and/or information.
In some embodiments, the user may obtain the projectile information for the monitored area through the user terminal 130. In some embodiments, the user terminal 130 may include one or any combination of a mobile device 130-1, a tablet computer 130-2, a laptop computer 130-3, an in-vehicle device 130-4, and the like. In some embodiments, the mobile device 130-1 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, and the like, or any combination thereof. In some embodiments, the smart home device may include a smart lighting device, a control device for a smart appliance, a smart monitoring device, a smart television, a smart camera, an intercom, or the like, or any combination thereof. In some embodiments, the wearable device may include a smart bracelet, smart footwear, smart glasses, a smart helmet, a smart watch, smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof. In some embodiments, the smart mobile device may comprise a smart phone, a personal digital assistant (PDA), a gaming device, a navigation device, a POS device, and the like, or any combination thereof. In some embodiments, the virtual reality device and/or the augmented reality device may include a virtual reality helmet, virtual reality glasses, virtual reality eyewear, an augmented reality helmet, augmented reality glasses, augmented reality eyewear, and the like, or any combination thereof.
Storage device 140 may store data and/or instructions. In some embodiments, the storage device 140 may store data retrieved from the user terminal 130. In some embodiments, the storage device 140 may store information and/or instructions for execution or use by the server 110 to perform the exemplary methods described herein. In some embodiments, the storage device 140 may include mass storage, removable storage, volatile read-and-write memory (e.g., random access memory, RAM), read-only memory (ROM), or the like, or any combination thereof. In some embodiments, the storage device 140 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an internal cloud, or the like, or any combination thereof.
In some embodiments, a storage device 140 may be connected to the network 120 to communicate with one or more components of the projectile determination system 100 (e.g., the server 110, the user terminal 130, the image capture device 150, etc.). One or more components of the projectile determination system 100 may access data or instructions stored in the storage device 140 via the network 120. In some embodiments, the storage device 140 may be directly connected to or in communication with one or more components (e.g., the server 110, the user terminal 130, the image capture device 150) in the projectile determination system 100. In some embodiments, the storage device 140 may be part of the server 110.
The image acquisition device 150 may capture an image of the monitored area. In some embodiments, the image acquisition device 150 may send the captured monitored area image data to the storage device 140 for storage. In some embodiments, the image acquisition device 150 may send captured image data to the server 110. In some embodiments, the image acquisition device 150 may include any combination of one or more of a spherical camera, a dome camera, a surveillance camera, a smart camera, a pinhole camera, and the like. In some embodiments, the image acquisition device 150 may also include any combination of one or more of a vehicle event recorder (dashcam), smart glasses, a smart helmet, a cell phone, a tablet, and the like. In some embodiments, the image acquisition device 150 may include any combination of one or more of a digital camera, a single-lens reflex camera, a mirrorless camera, and the like. In some embodiments, the image acquisition device may comprise any apparatus with a camera. In some embodiments, the camera may comprise any device having image capture capabilities.
FIG. 2 is a block diagram of a projectile determination system according to some embodiments of the present application. As shown in fig. 2, the projectile determination system 200 may include an acquisition module 210, a background extraction module 220, a change region determination module 230, a projectile detection module 240, an output module 250, an update module 260, and a training module 270.
The acquisition module 210 may be used to acquire images of a monitored area.
In some embodiments, the acquisition module 210 may acquire images of the monitored area acquired by the image acquisition device 150. In some embodiments, the images acquired by image acquisition device 150 may include static images and/or dynamic images. In some embodiments, the images captured by image capture device 150 may include multiple frames of video images within the monitored area. In some embodiments, the acquisition mode of image acquisition device 150 may include any combination of one or more of timing acquisition, real-time acquisition, panoramic acquisition, sliced acquisition, and the like.
The background extraction module 220 may be used to determine a background image.
In some embodiments, the background extraction module 220 may determine the current background image of the monitored area image using a foreground detection algorithm (e.g., frame differencing). In some embodiments, the background extraction module 220 may determine the current background image of the monitored area image using background modeling (e.g., gaussian mixture modeling). In some embodiments, the background extraction module 220 may determine the current background image using foreground detection and background modeling. In some embodiments, the background extraction module 220 may obtain the background image of the monitored area based on a plurality (e.g., 500, 1000, etc.) of video images of the monitored area.
The changed region determination module 230 may be used to determine a changed region in the current background image.
In some embodiments, the change region determination module 230 may determine the change region in the current background image using a contrast detection method. For example, the change region determination module 230 may determine the change region by comparing the current background image with the historical background image and locating the inconsistent portions. In some embodiments, the contrast detection method may include one or any combination of color contrast, shape contrast, and the like. In some embodiments, the contrast detection method may further include one or any combination of histogram methods, image template matching, peak signal-to-noise ratio (PSNR), structural similarity (SSIM), perceptual hashing algorithms, and the like. In some embodiments, the change region determination module 230 may determine the change region in the current background image in other feasible ways, which are not limited by the present disclosure.
The projectile detection module 240 may be used to detect projectiles in varying areas.
In some embodiments, the projectile detection module 240 may detect projectiles in the areas of change using contrast detection. For example, the projectile detection module 240 may determine a projectile in the varying region by comparing the varying region to the detected projectile. In some embodiments, the projectile detection module 240 may detect projectiles in the varying regions using the projectile determination model.
The output module 250 may be used to output the detection result.
In some embodiments, the output module 250 may output the projectile detection result to the user terminal 130. In some embodiments, the output module 250 may output the projectile detection result by a combination of one or more of highlighting, marking with a prompt identifier, text prompting, audible prompting, and the like.
The update module 260 may be used to update the set of detected projectiles as well as the historical background image.
In some embodiments, the update module 260 may update the set of detected projectiles by a combination of one or more of adding, deleting, merging, replacing, and the like. In some embodiments, the update module 260 may update the historical background image based on the current background image. For example, the update module 260 may make the current background image the new historical background image by one or more of adding, deleting, merging, replacing, and the like.
The training module 270 may be used to train and obtain the projectile determination model.
In some embodiments, the training module 270 may be used to acquire sample images. In some embodiments, the sample image may include one or any combination of a vehicle, a pedestrian, a road, a lane line, a guard rail/median, a plant, a traffic cone, a road billboard, dirt/a soil slope, a spill/throw, and the like. In some embodiments, the training module 270 may be used to label a projectile in the sample image. In some embodiments, the training module 270 may train a machine learning model using the sample image as input data and the labeled projectile as output data or as a reference standard, resulting in a trained projectile determination model.
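The training step described above can be sketched as follows. This is a hedged illustration only: the patent does not prescribe a framework or detector, so the use of PyTorch/torchvision, the Faster R-CNN model, the class count, and the dataset format are assumptions made for the example.

```python
# Hedged sketch: fine-tuning an off-the-shelf detector on labeled road images.
# Model choice, class count and data layout are assumptions for illustration;
# the text only specifies "sample image in, labeled projectile out".
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 11  # background + the 10 object types listed above (assumption)

def build_model(num_classes: int = NUM_CLASSES):
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    return model

def train_one_epoch(model, loader, optimizer, device):
    # loader is assumed to yield (list of image tensors, list of target dicts),
    # each target dict holding "boxes" (Tensor[N, 4]) and "labels" (Tensor[N]).
    model.train()
    for images, targets in loader:
        images = [img.to(device) for img in images]
        targets = [{k: v.to(device) for k, v in t.items()} for t in targets]
        loss_dict = model(images, targets)  # detection losses returned in train mode
        loss = sum(loss_dict.values())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```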
In some embodiments, the projectile determination system 200 may also include other execution modules. For example, the projectile determination system 200 may further include one or any combination of an image pre-processing module, an image post-processing module, a projectile ensemble determination module, and the like.
It should be understood that the system and its modules shown in FIG. 2 may be implemented in a variety of ways. For example, in some embodiments, the system and its modules may be implemented in hardware, software, or a combination of software and hardware. The hardware portion may be implemented using dedicated logic; the software portions may be stored in a memory for execution by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer executable instructions and/or embodied in processor control code, such code being provided, for example, on a carrier medium such as a diskette, CD- or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system and its modules of the present application may be implemented not only by hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field programmable gate arrays and programmable logic devices, but also by software executed by various types of processors, or by a combination of the above hardware circuits and software (e.g., firmware).
It should be noted that the above description of the projectile determination system 200 and its modules is merely for convenience of description and is not intended to limit the present application to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the teachings of the present system, any combination of modules or sub-system configurations may be used to connect to other modules without departing from such teachings. For example, in some embodiments, the acquisition module 210, the background extraction module 220, the change region determination module 230, the projectile detection module 240, the output module 250, the update module 260, and the training module 270 disclosed in fig. 2 may be different modules in a system, or may be a module that implements the functions of two or more of the above modules. The modules of the projectile determination system 200 may share a memory module, or each module may have a separate memory module. Such variations are within the scope of the present application.
Fig. 3 is an exemplary flow chart of a method of projectile determination according to some embodiments of the present application.
Step 310, acquiring a monitoring area image. In some embodiments, step 310 may be implemented by acquisition module 210.
In some embodiments, the image of the monitored area may be used to reflect a scene change at a location over a period of time. In some embodiments, the monitored area image may include both static objects and dynamic objects in the scene. In some embodiments, a static object may be an object that remains in a fixed position for a long period of time. For example, the static object may include one or any combination of trees, buildings, street lights, traffic signs, guardrails, display boards, parked vehicles, and the like. In some embodiments, a dynamic object may be an object whose position changes over time, for example, one or any combination of a moving vehicle, a walking person, a moving animal, an object driven by wind (e.g., fallen leaves, an empty carton), and the like.
In some embodiments, the images of the monitored area can be continuously captured and acquired by a fixed image acquisition device preset in the road. In some embodiments, the fixed image acquisition device may include one or any combination of a road condition monitoring device, a security monitoring device, an intersection violation monitoring device, a speed measurement monitoring device, a traffic flow monitoring device, and the like. In some embodiments, the monitoring/surveillance device may be any device with camera functionality. In some embodiments, the monitored area images may be acquired by continuous shooting with a moving image capture device. In some embodiments, the mobile image capture device may include one or any combination of a vehicle event recorder, a camera, a cell phone, a tablet, a computer, smart glasses, and the like.
In some embodiments, the acquired images of the monitored area may be panoramic images acquired by the fixed/mobile image acquisition device within its shooting field of view. The shooting field of view is the maximum shooting angle and/or the maximum shooting distance that the fixed/mobile image acquisition device can reach. In some embodiments, the acquired monitoring area image may be a road video comprising a plurality of video frames (e.g., 500, 800, or 1000 frames). In some embodiments, the obtaining module 210 may obtain video frames from the recorded video at regular time intervals to obtain one or more (e.g., 15, 20, 30, etc.) images of the monitoring area. For example, the time interval may be 0.5 seconds, 1 second, 5 seconds, 10 seconds, 20 seconds, 30 seconds, 40 seconds, or the like. In some embodiments, the time interval may be set based on the processing power of the system. In some embodiments, the time interval may be determined based on road traffic flow. For example, a larger traffic flow may correspond to a smaller time interval.
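As an illustration of the frame sampling described above, the following minimal sketch grabs one frame per fixed time interval from a video source using OpenCV. The source path, the interval, and the frame cap are assumptions for the example and are not specified by the patent.

```python
# Minimal sketch: grab one monitoring-area frame every `interval_s` seconds
# from a camera stream or recorded video. Source, interval and cap are assumptions.
import cv2

def sample_frames(source: str, interval_s: float = 1.0, max_frames: int = 30):
    cap = cv2.VideoCapture(source)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0   # fall back if FPS is unreported
    step = max(1, int(round(fps * interval_s)))
    frames, index = [], 0
    while len(frames) < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            frames.append(frame)
        index += 1
    cap.release()
    return frames
```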
At step 320, a current background image is determined based on the monitored area image. In some embodiments, step 320 may be implemented by the context extraction module 220.
The current background image is an image obtained by removing moving objects from the image of the monitored area. A moving object may be an object whose position changes continuously across multiple images of the monitoring area, for example, one or any combination of a moving vehicle, a walking person, a moving animal (e.g., running, flying, walking, etc.), a wind-driven object (e.g., fallen leaves, empty cartons), and the like. In some embodiments, the current background image may be determined using a frame difference method. For example, the background extraction module 220 may determine the moving objects in the monitored area image as a foreground image using the frame difference method, separate the foreground portion from the monitored area image, and use the remaining image as the current background image. In some embodiments, the current background image may be determined using a Gaussian mixture background model algorithm. In some embodiments, the background extraction module 220 may determine the current background image in other feasible ways (e.g., the ViBe algorithm).
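A minimal sketch of the Gaussian mixture background modelling mentioned above, using OpenCV's MOG2 background subtractor. The history length and variance threshold are illustrative assumptions; frame differencing or ViBe could be substituted.

```python
# Minimal sketch of Gaussian mixture background modelling (one of the options
# named above). Feeding a batch of monitoring-area frames through MOG2 and then
# reading back its background estimate yields a "current background image" with
# moving objects largely removed. Parameters are illustrative assumptions.
import cv2

def estimate_background(frames):
    subtractor = cv2.createBackgroundSubtractorMOG2(
        history=500, varThreshold=16, detectShadows=False)
    for frame in frames:
        subtractor.apply(frame)             # updates the per-pixel Gaussian mixture
    return subtractor.getBackgroundImage()  # background with moving objects suppressed
```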
In step 330, a change area of the current background image is determined based on the historical background image. In some embodiments, step 330 may be implemented by the change region determination module 230.
In some embodiments, the historical background image may be an image of the monitored area obtained after removal of moving objects, such as a current background image obtained during a previous round of projectile detection. In some embodiments, the historical background image may be a monitored area image acquired in the absence of moving objects and projectiles, for example, when road projectile detection is performed for the first time using the projectile determination system 200. In some embodiments, the change region may be used to reflect a region where the current background image differs from the historical background image. For example, the change region may be used to reflect a region where the current background image contains more and/or fewer objects than the historical background image. In some embodiments, the change region determination module 230 may determine the change region of the current background image by comparing the historical background image and the current background image. In some embodiments, the comparison method may include one or any combination of color contrast, shape contrast, and the like. In some embodiments, the difference between the current background image and the historical background image may be calculated pixel by pixel, so as to obtain the region of the current background image that differs from the historical background image. In some embodiments, the system may extract the change region from the current background image to obtain a change region image. For example, if comparing the current background image with the historical background image reveals an unknown object at an intersection position in the current background image that is not present in the historical background image, the intersection position is determined as the change region of the current background image, and the change region is segmented to generate a change region image of the intersection position.
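The pixel-wise comparison described above can be sketched as follows: the absolute difference between the current and historical background images is thresholded and grouped into connected regions, each reported as a change-region bounding box. The threshold and minimum-area values are assumptions made for illustration.

```python
# Minimal sketch of the pixel-wise comparison described above. The absolute
# difference between the two background images is thresholded and grouped into
# connected regions; each surviving region is returned as a change-area box.
# Threshold and minimum area are assumptions.
import cv2

def find_change_regions(current_bg, historical_bg, diff_thresh=30, min_area=400):
    gray_cur = cv2.cvtColor(current_bg, cv2.COLOR_BGR2GRAY)
    gray_his = cv2.cvtColor(historical_bg, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray_cur, gray_his)
    _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)  # merge nearby fragments
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
```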
In step 340, a projectile in the area of change is detected. In some embodiments, step 340 may be implemented by the projectile detection module 240.
In some embodiments, the projectile may include one or any combination of litter thrown from vehicle windows, wind-blown objects, vehicle parts, truck spills, animal carcasses, and the like on the road; projectiles left on the road can pose a great safety hazard to vehicles. In some embodiments, litter thrown from vehicle windows may include one or any combination of empty plastic bottles, fruit peels, food packaging, and other objects dropped from vehicle windows. In some embodiments, wind-blown objects may include one or more of objects blown in from elsewhere (e.g., plastic bags, empty cartons, etc.), objects that should not appear on the roadway (e.g., plants or billboards that fall onto the roadway due to storms, thunderstorms, construction problems, etc.), and the like. In some embodiments, the vehicle parts may include various components such as tires, bolts, vehicle covers, and the like; vehicle parts may be scattered on the road due to factors such as tire blowouts, loose components, strong wind, etc. In some embodiments, the truck spill may include various goods spilled while a truck is in transit, such as fruit, vegetables, coal, and the like. In some embodiments, the projectile detection module 240 may detect projectiles in the change region using the projectile determination model. The projectile determination model may include a machine learning model. In some embodiments, the machine learning model may include a deep learning model. For example, the deep learning model may include, but is not limited to, one or any combination of convolutional neural networks (CNN), recurrent neural networks (RNN), region-based CNNs (R-CNN), Fast R-CNN, BP neural networks, the k-nearest neighbors algorithm (KNN), support vector machines (SVM), and the like.
In some embodiments, the projectile determination system 200 may determine whether a projectile is contained in the change region based on the output of the projectile determination model. For example, the projectile determination system 200 may set a threshold value; when the probability in the output result that the change region contains a projectile is greater than the preset threshold value, the change region is considered to contain a projectile. The threshold may be configured dynamically in the program (for example, set and stored directly at the front end of the system) or set manually. In some embodiments, the projectile determination model may determine two or more projectiles in the change region. In this case, the projectile determination model can not only detect the presence of projectiles in the change region, but also determine the type of each projectile and its position in the change region. For more details on the projectile determination model, reference may be made to FIG. 7 and its associated description.
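A hedged sketch of applying a trained detection model to a change-region crop and keeping only outputs above the preset probability threshold, as described above. The torchvision-style model interface, the class indices treated as projectiles, and the 0.5 threshold are assumptions; the patent leaves the model family and threshold open.

```python
# Hedged sketch: run a trained detector (see the training sketch above) on a
# cropped change region and keep only detections above a preset probability
# threshold. Model interface, projectile class indices and threshold are
# assumptions made for illustration only.
import torch
import torchvision.transforms.functional as F

PROJECTILE_CLASSES = {10}  # e.g. index of the "spill/drop" class (assumption)

@torch.no_grad()
def detect_projectiles(model, change_crop_bgr, score_thresh=0.5, device="cpu"):
    # change_crop_bgr: numpy HxWx3 BGR crop of the change region (OpenCV style)
    model.eval()
    image = F.to_tensor(change_crop_bgr[:, :, ::-1].copy()).to(device)  # BGR -> RGB
    output = model([image])[0]  # {"boxes": ..., "labels": ..., "scores": ...}
    results = []
    for box, label, score in zip(output["boxes"], output["labels"], output["scores"]):
        if score >= score_thresh and int(label) in PROJECTILE_CLASSES:
            results.append((box.tolist(), int(label), float(score)))
    return results
```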
In some embodiments, the projectile detection module 240 may detect a projectile in the varying region based on the set of detected projectiles. For more details on detecting a projectile in a variation region based on a set of detected projectiles, reference may be made to fig. 5 and its associated description, which are not repeated herein.
In some embodiments, step 350 of outputting the detection result may be further included. In some embodiments, step 350 may be implemented by output module 250.
In some embodiments, the output module 250 may output a detection result for the projectile when a projectile is present in the change region. In some embodiments, the output detection result may include one or any combination of information on the presence or absence of a projectile, the type of the projectile, the location of the projectile, the number of projectiles, a picture of the projectile, and the like. In some embodiments, the output mode of the detection result may include one or any combination of highlighting the projectile, marking the projectile with a prompt identifier, a text prompt for the projectile, a sound prompt for the projectile, and the like. For example, highlighting the projectile may include magnifying it in the display, circling it, or the like. In some embodiments, the output module 250 may also automatically generate images of the change before and after the appearance of the projectile, for visually displaying the projectile information.
In some embodiments, the projectile determination system 200 (or the projectile determination system 100) may report the output detection results to a backend server, and an administrator at the backend server may further process the detected projectile based on the detection results. For example, service personnel or vehicles may be dispatched to the corresponding location to clear the detected road projectile. In some embodiments, a specific area may be configured in the projectile determination system 200 (or the projectile determination system 100) to control which projectile detection results are reported. For example, the projectile determination system 200 (or the projectile determination system 100) may filter the projectile detection results and report to the backend server only when the projectile is within the specific area.
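The "specific area" filtering mentioned above might be implemented as a simple point-in-polygon test, as in the following sketch; the polygon coordinates and the use of the box centre are assumptions.

```python
# Minimal sketch of the "specific area" filter mentioned above: only projectiles
# whose box centre falls inside a configured polygon are reported. The polygon
# coordinates are assumptions; the box is given as (x, y, w, h).
import cv2
import numpy as np

REPORT_AREA = np.array([[100, 400], [1180, 400], [1180, 700], [100, 700]], np.int32)

def in_report_area(box):
    x, y, w, h = box
    centre = (float(x + w / 2), float(y + h / 2))
    return cv2.pointPolygonTest(REPORT_AREA, centre, False) >= 0
```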
It should be noted that the above description related to the flow 300 is only for illustration and explanation, and does not limit the applicable scope of the present application. Various modifications and changes to flow 300 will be apparent to those skilled in the art in light of this disclosure. However, such modifications and variations are intended to be within the scope of the present application. For example, in step 350, the detection result may be output to a mobile terminal of the administrator (e.g., a laptop, a mobile phone, a tablet computer, etc.), and an alarm may be issued to the administrator. For another example, in step 320, the projectile determination system 200 may obtain the current background image through multiple iterative calculations based on the monitored area image and the historical background image.
Fig. 4 is an exemplary flow chart of a method of projectile determination according to some embodiments of the present application. In some embodiments, the projectile determination method 400 may be performed by the projectile determination system 200.
The steps of the projectile determination method 400 for determining a projectile (e.g., steps 410, 420, 430, 440) are the same as the corresponding steps in FIG. 3 (e.g., steps 310, 320, 330, 340). In contrast to the projectile determination method 300, however, after detecting a projectile (i.e., after step 440), the projectile determination method 400 updates the set of detected projectiles and the historical background image based on the detected projectile and the current background image, respectively. As shown in FIG. 4, after step 440 (detecting a projectile in the change region), the projectile determination system 200 may perform step 450: update the projectile set based on the detected new projectile, and update the historical background image based on the current background image. In the projectile determination method 400, updating the set of detected projectiles based on the newly detected projectile and updating the historical background image based on the current background image can, to a certain degree, improve the accuracy of projectile detection.
In step 450, when the projectile determination model determines that the change region image contains a projectile, the projectile determination system 200 (e.g., the update module 260) may update the set of detected projectiles with the detected new projectile. For example, the new projectile may be added to the set of detected projectiles. In some embodiments, the updated projectile set may be used for the next round of projectile detection. In some embodiments, the set of detected projectiles may be updated periodically (e.g., by deleting, replacing, adding, etc.). For example, a newly detected projectile may be added. As another example, the set of detected projectiles may be cleaned at regular intervals (e.g., every 6 hours, a day, a week, or 10 days) to remove projectiles that have already been cleared from the monitored area. The projectile set may be updated online in real time or offline, periodically or aperiodically.
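A minimal sketch of one possible representation of the set of detected projectiles, supporting the add and periodic clean-up operations described above; the record fields and the retention period are assumptions.

```python
# Minimal sketch of the detected-projectile set: each record keeps a bounding
# box, a type label and a timestamp; new projectiles are appended, and stale
# entries (e.g. already cleared from the road) are purged periodically.
# Field names and the retention period are assumptions.
import time
from dataclasses import dataclass, field

@dataclass
class DetectedProjectile:
    box: tuple            # (x, y, w, h) in background-image coordinates
    label: int
    detected_at: float = field(default_factory=time.time)

class ProjectileSet:
    def __init__(self, retention_s: float = 24 * 3600):
        self.items: list[DetectedProjectile] = []
        self.retention_s = retention_s

    def add(self, projectile: DetectedProjectile):
        self.items.append(projectile)

    def purge_stale(self, now=None):
        if now is None:
            now = time.time()
        self.items = [p for p in self.items
                      if now - p.detected_at < self.retention_s]
```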
In step 450, the projectile determination system 200 (e.g., the update module 260) may update the historical background image with the current background image. In some embodiments, the background image may be updated in one or any combination of manners such as merging, deleting, replacing, adding, and the like. For example, the update module 260 may treat the current background image as the new historical background image. In some embodiments, the updated historical background image may be used to determine the change region in the current background image at the next round of projectile detection. In some embodiments, updating the historical background image with the current background image can eliminate interference caused by changes in road conditions and/or other changes, improving the accuracy and real-time performance of projectile detection. In some embodiments, because the historical background image is updated with the current background image, an already identified projectile that has not changed can be treated as part of the background, thereby avoiding repeated identification of the same projectile, simplifying the projectile detection process, and improving identification efficiency.
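The replace and merge update strategies mentioned above could look like the following sketch; the running-average weight is an assumption.

```python
# Minimal sketch of two of the update strategies mentioned above: either replace
# the historical background outright, or merge the current background into it
# with a running average so that slow scene changes are absorbed. Both images
# must share size and dtype; the blending weight alpha is an assumption.
import cv2

def update_historical_background(historical_bg, current_bg, mode="replace", alpha=0.1):
    if mode == "replace":
        return current_bg.copy()
    # "merge": exponential moving average of the two backgrounds
    return cv2.addWeighted(current_bg, alpha, historical_bg, 1.0 - alpha, 0)
```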
For details of the steps in fig. 4 corresponding to fig. 3, reference may be made to fig. 3 and the related description thereof, which are not repeated herein.
FIG. 5 is an exemplary flow chart of a method of varying area projectile detection according to some embodiments of the present application.
Step 510, a set of detected projectiles is acquired. In some embodiments, step 510 may be implemented by the projectile detection module 240.
In some embodiments, the set of detected projectiles may include one or more detected projectiles and their associated information. In some embodiments, the detected projectile may include a manually detected projectile and/or a machine (e.g., projectile determination system 100, 200) detected projectile. In some embodiments, the information relating to the detected projectile may include one or any combination of the location of the detected projectile, the type of projectile, and the like. In some embodiments, the location of the detected projectile may include the location of the detected projectile in the historical background image. In some embodiments, the location of the detected projectile may include a location in the current background image mapped according to the location of the detected projectile in the historical background image. In some embodiments, the location of the detected projectile may include a location that maps to a location in real space based on the location of the detected projectile in the historical background image.
In some embodiments, the projectile detection module 240 may retrieve the set of detected projectiles from the storage device 140. In some embodiments, the projectile detection module 240 may retrieve the set of detected projectiles from the source database. In some embodiments, the projectile detection module 240 may obtain a set of detected projectiles via the network 120.
Step 520, compare if the area of change matches the detected projectile. In some embodiments, step 520 may be implemented by the projectile detection module 240.
In some embodiments, the projectile detection module 240 may compare the change region with the location of a detected projectile to determine whether the change region matches the detected projectile. For example, if the area of the change region in the current background image highly coincides with the area of the detected projectile in the historical background image, the change region is considered to contain the detected projectile. In some embodiments, the two regions are considered to be highly coincident when the area of their common portion reaches a set ratio. The set ratio may be a value between 60% and 100%, such as 70%, 80%, etc. The ratio may be the ratio of the area of the common portion to the area of the change region. For example, the change region and the projectile position may each be described by a rectangular frame in an image (such as the current background image and the historical background image), each rectangular frame consisting of three elements: a top-left vertex, a width, and a height. Matching the change region against the position of a detected projectile is then a matter of matching two rectangular frames: the ratio of the area of their overlapping portion to the combined area of the two frames can be calculated and compared with a set threshold ratio (such as 80%) to determine whether the two regions coincide.
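The rectangle-overlap comparison described above can be sketched as follows. Boxes are taken as (x, y, w, h); whether the overlap is normalised by the change-region area or by the union of the two boxes, and the 0.8 threshold, are design choices left open by the text.

```python
# Minimal sketch of the rectangle-overlap test described above. The overlap is
# compared either to the change-region area or to the union of the two boxes,
# then matched against a preset ratio such as 0.8. Which variant and which
# threshold are used is a design choice, not fixed by the text.
def overlap_ratio(change_box, projectile_box, mode="change_area"):
    x1 = max(change_box[0], projectile_box[0])
    y1 = max(change_box[1], projectile_box[1])
    x2 = min(change_box[0] + change_box[2], projectile_box[0] + projectile_box[2])
    y2 = min(change_box[1] + change_box[3], projectile_box[1] + projectile_box[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_change = change_box[2] * change_box[3]
    area_proj = projectile_box[2] * projectile_box[3]
    denom = area_change if mode == "change_area" else area_change + area_proj - inter
    return inter / denom if denom else 0.0

def matches_detected(change_box, projectile_box, thresh=0.8):
    return overlap_ratio(change_box, projectile_box) >= thresh
```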
In some embodiments, the change region being highly coincident with the corresponding region of a detected projectile (e.g., its region in the historical background image) may include the two regions being completely coincident and/or being mathematically coincident. Complete coincidence of the regions means that their shape, size, content, and the like are exactly the same; for example, a suspected projectile in the change region completely coincides with a detected projectile. A suspected projectile refers to a projectile that may be present in the change region. Mathematical coincidence of the regions means that the two regions coincide in shape, size, and/or area. For example, the change region may correspond to a detected projectile in the historical background image that has since disappeared (e.g., blown away by wind, cleared by staff, etc.). In this case, the projectile detection module 240 may further detect the projectile in the change region, for example, using the projectile determination model.
In some embodiments, the position of a detected projectile in the historical background image may be mapped to the current background image based on its position information, and the two positions may then be matched. In some embodiments, when the image of the monitored area is acquired by a fixed image acquisition device, the position information of a detected projectile in the current background image may be considered the same as that in the historical background image. In some embodiments, the detected projectile may also be mapped into the actual monitored area according to its position information in the historical background image, the position information of the change region in the current background image may likewise be mapped into the actual monitored area, and the two positions may then be compared in the actual monitored area. In some embodiments, the projectile detection module 240 may determine whether the change region matches a detected projectile by comparing the change region with the projectiles in the set of detected projectiles one by one.
If there is a match, then the area of change is determined to contain the detected projectile, step 530. In some embodiments, step 530 may be implemented by the projectile detection module 240.
In some embodiments, when the change region matches one or more projectiles in the set of detected projectiles, the projectile detection module 240 may determine that the change region of the current background image contains a detected projectile. In some embodiments, when a detected projectile is contained in the change region, the projectile determination system 200 proceeds directly to the next round of detection. In some embodiments, when a detected projectile is contained in the change region (for example, when the change region is mathematically coincident with the region of the projectile), the projectile detection module 240 may further detect the projectile in the change region, for example, using the projectile determination model. In some embodiments, when a detected projectile is contained in the change region, indicating that the projectile had already been detected before this round of detection, the detection result for this projectile is not output. For example, when the change region completely coincides with the region where the projectile is located, the suspected projectile in the change region of the current background image is consistent with the detected projectile in that region of the historical background image, i.e., it had already been detected before this round; in this case, to avoid repeated reporting or output, the detection result for this projectile is not output.
If not, the projectile determination model is used to detect the projectile in the area of variation, step 540. In some embodiments, step 540 may be implemented by the projectile detection module 240.
In some embodiments, the mismatch of the area of change with the detected projectile may include one or any combination of disappearance of the detected projectile, displacement of the detected projectile (e.g., from an original position by wind), appearance of a new suspected projectile, and the like. In some embodiments, the projectile detection module 240 may detect a projectile in the varying region using the projectile determination model when the varying region does not match any of the set of detected projectiles. In some embodiments, the projectile determination model may include a machine learning model. In some embodiments, the projectile determination model may include a deep learning model. For example, the deep learning model may include one or any combination of CNN, RNN, RCNN, Fast-RCNN, BP neural network, KNN, SVM, and the like. For more details on the projectile determination model, reference may be made to fig. 7 and its associated description.
In some embodiments, the projectile in the changed region detected by the projectile determination model is a new projectile different from any detected projectile. In some embodiments, a new projectile subsequently detected using the projectile determination model may be output as a detection result. In some embodiments, the projectile determination system 200 may report the output detection results to a backend server, and the backend server administrator may further process the detection results according to the projectile.
The set of projectiles is updated based on the new projectile detected, step 550. In some embodiments, step 550 may be implemented by update module 260.
In some embodiments, the update module 260 may add the detected new projectile to the set of detected projectiles. In some embodiments, the update module 260 may update the set of detected projectiles based on the time at which the new projectile was detected. For example, the update module 260 may add the currently detected projectile at the front of the set of detected projectiles. In some embodiments, the update module 260 may update the set of detected projectiles based on the type of the new projectile. For example, the update module 260 may add the currently detected new projectile to the corresponding type within the set of detected projectiles.
It should be noted that the above description related to the flow 500 is only for illustration and explanation, and does not limit the applicable scope of the present application. Various modifications and changes to flow 500 may occur to those skilled in the art upon review of the present application. However, such modifications and variations are intended to be within the scope of the present application.
In some embodiments, the process 300, the process 400, and the process 500 may be an integrated projectile determination process that may be integrated into a projectile determination system (e.g., the projectile determination system 100) to enable determination of a projectile.
Specifically, as shown in FIG. 6, at step 610, a monitored area image is acquired, corresponding to step 310 in flow 300 or step 410 in flow 400. At step 620, a current background image is determined based on the monitored area image, corresponding to step 320 of flow 300 or step 420 of flow 400. For more details, see the description related to flow 300 or flow 400.
In step 630, it is determined whether there is a difference between the current background image and the historical background image. In some embodiments, whether the current background image differs from the historical background image may be determined by comparing the two images. In some embodiments, the comparison method may include one or any combination of color contrast, shape contrast, and the like. In response to the current background image not differing from the historical background image, which indicates that there is no projectile in the current monitored area (e.g., a previous projectile has been cleared, no projectile has been left on the road, etc.), step 610 is re-executed to start a new round of projectile detection.
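A minimal sketch of the whole-image check in step 630: a cheap comparison decides whether the current background differs from the historical background at all before the more expensive change-region search. Using a grayscale mean absolute difference, and the specific threshold, are assumptions; the color/shape comparisons mentioned above could be used instead.

```python
# Minimal sketch of step 630: decide whether the two backgrounds differ at all
# before locating change regions. The grayscale mean-absolute-difference test
# and the threshold value are assumptions made for illustration.
import cv2
import numpy as np

def backgrounds_differ(current_bg, historical_bg, mean_diff_thresh=2.0):
    gray_cur = cv2.cvtColor(current_bg, cv2.COLOR_BGR2GRAY)
    gray_his = cv2.cvtColor(historical_bg, cv2.COLOR_BGR2GRAY)
    mean_diff = float(np.mean(cv2.absdiff(gray_cur, gray_his)))
    return mean_diff > mean_diff_thresh
```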
In step 640, in response to the difference between the current background image and the historical background image, a changed area in the current background image is determined. In some embodiments, the changed region of the current background image may be determined by the changed region determination module 230 based on the historical background image. For more details about the determination of the change area, reference may be made to step 330 or step 430 and the related description thereof, which are not described herein again.
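As an illustrative, non-limiting sketch of steps 630 and 640, the two background images can be differenced pixel by pixel and the sufficiently large connected regions kept as candidate change regions. The sketch assumes OpenCV, equal-sized color (BGR) background images, and a fixed pixel-difference threshold and minimum area, none of which are mandated by the present application.

    # Minimal sketch of steps 630-640 under the assumptions stated above.
    import cv2
    import numpy as np

    def find_change_regions(current_bg, historical_bg, diff_thresh=30, min_area=400):
        cur = cv2.cvtColor(current_bg, cv2.COLOR_BGR2GRAY)
        hist = cv2.cvtColor(historical_bg, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(cur, hist)                                   # per-pixel difference
        _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))  # remove noise
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        # Each sufficiently large contour yields one candidate change region (x, y, w, h).
        return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]

An empty result corresponds to the "no difference" branch of step 630, in which case a new round starts at step 610.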
Step 650, determine whether the change region matches a detected projectile. In some embodiments, the projectile detection module 240 may determine whether the change region matches a detected projectile based on the set of detected projectiles. In response to the change region matching a detected projectile, indicating that the projectile has already been detected in a previous round, step 610 is performed to begin a new round of projectile detection. Step 650 corresponds to step 520 in flow 500; for further details, reference may be made to step 520.
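One possible way to realize the matching of step 650 is to compare the bounding box of the change region with the recorded positions of the detected projectiles, for example by intersection over union (IoU). The box representation and the threshold below are assumptions for illustration only.

    # Minimal sketch of step 650; boxes are (x, y, w, h) tuples.
    def iou(box_a, box_b):
        ax, ay, aw, ah = box_a
        bx, by, bw, bh = box_b
        ix1, iy1 = max(ax, bx), max(ay, by)
        ix2, iy2 = min(ax + aw, bx + bw), min(ay + ah, by + bh)
        inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
        union = aw * ah + bw * bh - inter
        return inter / union if union else 0.0

    def matches_detected_projectile(change_box, detected_projectiles, iou_thresh=0.5):
        # The change region "matches" if it sufficiently overlaps any recorded projectile.
        return any(iou(change_box, p["bbox"]) >= iou_thresh for p in detected_projectiles)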
Step 660, in response to the change region not matching any detected projectile, the projectile in the change region is detected. In some embodiments, the projectile in the change region may be detected by the projectile detection module 240 using the projectile determination model. For more details of step 660, see step 540 and its associated description.
Step 670, determine whether the change region of the current background image contains a real projectile based on the detection result of step 660. In some embodiments, newly added foliage, billboards, pedestrians, vehicles, road barriers, changes in road lighting, and the like may cause regions of the current background image to differ from the historical background image and thus be identified as change regions. By detecting the change region with the projectile determination model, it can be determined whether the change region contains a real projectile. If the object in the change region is not a real projectile, no output is needed, and a new round of projectile detection is initiated at step 610.
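By way of illustration only, the judgment of step 670 can be made by running the trained model on the cropped change region and thresholding its confidence. The sketch assumes a PyTorch image classifier (such as the one sketched after the description of step 730 below) whose classes include a "not a projectile" label; the class names and preprocessing are hypothetical.

    # Minimal sketch of steps 660-670 under the assumptions stated above.
    import cv2
    import torch
    from torchvision import transforms

    CLASS_NAMES = ["plastic_bag", "cardboard", "tire_fragment", "not_a_projectile"]  # hypothetical labels

    _preprocess = transforms.Compose([
        transforms.ToPILImage(),
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ])

    def classify_with_projectile_model(model, background_image, box):
        x, y, w, h = box
        crop = cv2.cvtColor(background_image[y:y + h, x:x + w], cv2.COLOR_BGR2RGB)
        tensor = _preprocess(crop).unsqueeze(0)        # 1 x 3 x 224 x 224
        model.eval()
        with torch.no_grad():
            probs = torch.softmax(model(tensor), dim=1)[0]
        idx = int(probs.argmax())
        return CLASS_NAMES[idx], float(probs[idx])     # predicted type and its probability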
Step 680, in response to the change region containing a real projectile, the detection result is output; step 685, the set of detected projectiles is updated based on the detected projectile; step 690, the historical background image is updated based on the current background image. For more details about step 680, step 685, and step 690, reference may be made elsewhere in this specification (e.g., step 350, step 450, step 550, and their related descriptions), and further description is omitted here.
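Putting the pieces together, the integrated process of FIG. 6 can be sketched as the loop below. It reuses the illustrative helpers defined above and assumes additional hypothetical helpers acquire_monitoring_image, extract_background, and report_result; the names, thresholds, and control flow are not intended to limit the actual implementation.

    # Illustrative loop over steps 610-690; runs until externally stopped.
    def run_projectile_detection(model, historical_bg, conf_thresh=0.5):
        while True:
            frame = acquire_monitoring_image()                          # step 610
            current_bg = extract_background(frame)                      # step 620
            regions = find_change_regions(current_bg, historical_bg)    # steps 630-640
            if not regions:                                             # no difference: new round
                continue
            for box in regions:
                if matches_detected_projectile(box, detected_projectiles):  # step 650
                    continue                                            # already detected earlier
                proj_type, score = classify_with_projectile_model(model, current_bg, box)  # step 660
                if proj_type != "not_a_projectile" and score >= conf_thresh:               # step 670
                    report_result(box, proj_type, score)                # step 680
                    add_new_projectile(box, proj_type)                  # step 685
            historical_bg = current_bg                                  # step 690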
FIG. 7 is an exemplary flow chart of a method of acquiring a projectile determination model according to some embodiments of the present application. In particular, the relevant steps of FIG. 7 may be implemented by the training module 270.
Step 710, a sample image is acquired. In this step, the projectile determination system 200 (e.g., the training module 270) may acquire sample images for model training.
In some embodiments, the sample image may be any image that reflects conditions on a road. In some embodiments, the sample image may include, but is not limited to, one or any combination of a vehicle image, a pedestrian image, a road image (without lane lines), a lane line image, a guard rail/median image, a vegetation image, a traffic cone image, a road billboard image, a dirt or soil slope image, a window-thrown/vehicle-spilled object image, and the like.
In some embodiments, the sample image may be obtained from a road monitoring device. For example, the road monitoring device may include monitoring devices for road monitoring, public security monitoring, intersection violation monitoring, speed measurement monitoring, traffic flow monitoring, and the like. In some embodiments, the sample image may be obtained from other devices that can capture road conditions, such as a driving recorder, a camera, a cell phone, a tablet, and the like. In some embodiments, different projectiles are included in different sample images. In some embodiments, the same sample image may contain two or more different projectiles. In some embodiments, the sample image may be obtained from historical detection records of road projectiles. In some embodiments, the sample images may be obtained from various open-source databases for road monitoring. In some embodiments, the sample image may also be obtained from images of various sources published on the network. In some alternative embodiments, the sample image may also be obtained by other means, which are not limited in this application.
Step 720, label the projectiles in the sample image to generate a training set. The training process may be executed on a client, in the cloud, or on another training server, and may be online real-time training or offline training. In some embodiments, the sample images may be labeled manually, and a training server or the projectile determination system 200 may obtain the manually labeled sample images and generate a training set. In some embodiments, the annotation of the sample image may be implemented by a computer program. In some embodiments, the annotation of the sample image may be implemented in other feasible manners (e.g., sample annotation software), which are not limited in this application.
In some embodiments, a user and/or a computer program may annotate a sample image according to the content contained in the sample image. In some embodiments, the computer program may label the sample image according to rules set by the user. In some embodiments, the user may set different sample labeling rules according to the specific conditions of the road covered by the monitoring device. For example, if plants, billboards, and the like are unavoidably present on the road section covered by the monitoring device, the user may set such targets as types that do not belong to projectiles when setting the sample labeling rules; conversely, if no plants, billboards, or the like are present on the road section covered by the monitoring device, the user may set plants, billboards, and the like as projectile types (for example, plants or billboards may fall onto the road due to storms, thunderstorms, construction problems, and the like). In some embodiments, a user and/or a computer program may mark the projectile in the sample image. For example, the projectile may simply be circled within the sample image; as another example, the type of the projectile may be annotated at the same time. In some embodiments, a user and/or a computer program may label the sample image according to the location of the projectile in the sample image. The manner in which the user and/or the computer program annotates the sample image may include, but is not limited to, any combination of one or more of the foregoing manners.
In some embodiments, the projectile determination system 200 may randomly divide the sample images annotated by the user and/or computer program into a training set and a test set in a certain proportion. In some embodiments, the division ratio may be 80% for the training set and 20% for the test set, or any other ratio. The training set may be used to train a projectile determination model; the test set may be used to test the projectile determination model obtained from training.
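A minimal sketch of the random division described above, assuming the annotated samples are held as a list of (image, annotation) pairs; the 80%/20% ratio is only the example given here.

    import random

    def split_samples(labeled_samples, train_ratio=0.8, seed=42):
        # Randomly divide annotated samples into a training set and a test set.
        samples = list(labeled_samples)
        random.Random(seed).shuffle(samples)
        cut = int(len(samples) * train_ratio)
        return samples[:cut], samples[cut:]    # (training set, test set)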
Step 730, input the training set images into a machine learning model for training to obtain a projectile determination model. In this step, the projectile determination system 200 (e.g., the training module 270) may obtain the projectile determination model generated by training.
In some embodiments, the machine learning model may include a deep learning model. For example, the deep learning model may include one or any combination of CNN, RNN, RCNN, Fast-RCNN, BP neural network, KNN, SVM, and the like. In some embodiments, the machine learning model may include, but is not limited to, one or any combination of a supervised learning model for classification, a supervised learning model for regression, and the like, for example, a linear classifier (e.g., LR), Naive Bayes (NB), Decision Tree (DT), ensemble models (e.g., RF, GBDT), and the like.
In some embodiments, the training set generated in step 720 is used as the training input of the model, and the labeled results are used as the reference standard to train the model. In some embodiments, the output of the trained projectile determination model may be a binary result indicating the presence or absence of a projectile in the image, a probability value that the image contains a projectile, the type to which a projectile contained in the image belongs together with a probability value for that type, or an image with the location of the projectile marked. The output form of the trained projectile determination model is determined by the algorithm of the model itself and the training samples, and the present application is not limited in this respect.
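Because the present application does not fix a particular model architecture or output form, the following is only a minimal training sketch under assumed conditions: a recent PyTorch/torchvision installation, labeled samples stored as cropped image patches arranged one folder per class (so the folder name serves as the reference standard), and a small ResNet classifier standing in for whichever model is actually chosen.

    import torch
    from torch import nn
    from torchvision import datasets, models, transforms

    def train_projectile_model(train_dir, num_classes, epochs=5, lr=1e-3):
        # Step 730 (sketch): supervised training of a classifier on the labeled training set.
        tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
        train_set = datasets.ImageFolder(train_dir, transform=tfm)
        loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

        model = models.resnet18(weights=None)                     # randomly initialized backbone CNN
        model.fc = nn.Linear(model.fc.in_features, num_classes)   # one output per projectile type
        optimizer = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = nn.CrossEntropyLoss()

        model.train()
        for _ in range(epochs):
            for images, labels in loader:
                optimizer.zero_grad()
                loss = loss_fn(model(images), labels)   # the labels act as the reference standard
                loss.backward()
                optimizer.step()
        return model

A model trained this way returns, for a cropped change region, the predicted projectile type and its probability, which is one of the output forms mentioned above.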
It should be noted that the above description related to the flow 700 is only for illustration and explanation, and does not limit the applicable scope of the present application. Various modifications and changes to flow 700 may occur to those skilled in the art upon review of the present application. However, such modifications and variations are intended to be within the scope of the present application. For example, the projectile determination system 200 may separate the labeled sample images into a training set, a validation set, and a test set, and the projectile determination system 200 may use the validation set to validate the model after initial training of the projectile determination model is completed.
The beneficial effects that may be brought by the embodiments of the present application include, but are not limited to: (1) projectiles on the road can be found and reported in time, reducing potential road safety hazards; (2) projectile detection efficiency can be improved by using the projectile determination model to detect projectiles; (3) the set of detected projectiles and the historical background image are continuously updated, which can improve the efficiency and accuracy of projectile identification. It should be noted that different embodiments may produce different advantages; in different embodiments, any one or a combination of the above advantages, or any other advantages, may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be considered merely illustrative and not restrictive of the broad application. Various modifications, improvements and adaptations to the present application may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present application and thus fall within the spirit and scope of the exemplary embodiments of the present application.
Also, this application uses specific language to describe embodiments of the application. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the present application is included in at least one embodiment of the present application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
Moreover, those skilled in the art will appreciate that aspects of the present application may be illustrated and described in terms of several patentable species or situations, including any new and useful combination of processes, machines, manufacture, or materials, or any new and useful improvement thereon. Accordingly, various aspects of the present application may be embodied entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present application may be represented as a computer product, including computer readable program code, embodied in one or more computer readable media.
The computer storage medium may comprise a propagated data signal with the computer program code embodied therewith, for example, on baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, etc., or any suitable combination. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated over any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or any combination of the preceding.
Computer program code required for the operation of various portions of the present application may be written in any one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python, and the like; a conventional programming language such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP; a dynamic programming language such as Python, Ruby, and Groovy; or other programming languages, and the like. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or processing device. In the latter scenario, the remote computer may be connected to the user's computer through any network format, such as a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet), or in a cloud computing environment, or as a service, such as software as a service (SaaS).
Additionally, the order in which elements and sequences of the processes described herein are processed, the use of alphanumeric characters, or the use of other designations, is not intended to limit the order of the processes and methods described herein, unless explicitly claimed. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing processing device or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not intended to imply that more features are required than are expressly recited in the claims. Indeed, an embodiment may be characterized by fewer than all of the features of a single embodiment disclosed above.
Numerals describing the number of components, attributes, and the like are used in some embodiments; it should be understood that such numerals used in the description of the embodiments are modified in some instances by the modifier "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the number allows a variation of ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and employ a general digit-preserving approach. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of the application are approximations, in the specific examples such numerical values are set forth as precisely as practicable.
The entire contents of each patent, patent application publication, and other material cited in this application, such as articles, books, specifications, publications, documents, and the like, are hereby incorporated by reference into this application, except for any application history documents that are inconsistent with or conflict with the content of this application, and any documents that limit the broadest scope of the claims of this application (whether currently or later appended to this application). It is noted that if the description, definition, and/or use of a term in the material accompanying this application is inconsistent with or contrary to that set forth in this application, the description, definition, and/or use of the term in this application shall control.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present application. Other variations are also possible within the scope of the present application. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the present application can be viewed as being consistent with the teachings of the present application. Accordingly, the embodiments of the present application are not limited to only those embodiments explicitly described and depicted herein.

Claims (26)

1. A method of projectile determination, the method comprising:
acquiring at least one monitoring area image;
determining a current background image based on the at least one monitored area image;
determining a change region in the current background image based on a historical background image;
detecting a projectile in the area of change.
2. The projectile determination method as defined in claim 1 wherein said current background image is an image with a moving object removed from said monitored area image and said historical background image is an image with a moving object removed from a historical monitored area image.
3. The projectile determination method of claim 1 wherein said varying areas comprise areas of said current background image that are different from said historical background image.
4. The projectile determination method of claim 1 wherein said detecting a projectile in said varying area further comprises:
obtaining a set of detected projectiles, the set of detected projectiles including one or more detected projectiles and associated information thereof, the associated information of the detected projectiles at least reflecting the position of the detected projectiles;
comparing the area of change with information relating to one or more detected projectiles;
in response to the comparison being a match, determining that the area of change contains the detected projectile.
5. The projectile determination method of claim 4 wherein said detecting a projectile in said varying area further comprises:
in response to the comparison result being a mismatch, detecting a projectile in the change region using a projectile determination model, and treating the detected projectile as a new projectile; the projectile determination model is a machine learning model.
6. The projectile determination method of claim 5 wherein said method further comprises:
updating the set of detected projectiles based on the new projectile.
7. The projectile determination method of claim 1 wherein said detecting a projectile in said varying area further comprises:
detecting a projectile in the varying area using a projectile determination model; the projectile determination model is a machine learning model.
8. The projectile determination method according to any one of claims 5 to 7, wherein said projectile determination model is obtained by:
acquiring a sample image, the sample image including at least one projectile;
labeling the projectile in the sample image;
and training a machine learning model by taking the sample image as input data and the labeled projectile as output data or a reference standard, so as to obtain a trained projectile determination model.
9. The projectile determination method of claim 8 wherein the projectile in the sample image comprises at least one of: vehicles, pedestrians, roads, lane lines, guard rails/medians, plants, traffic cones, road billboards, dirt/soil slopes, and spilled/dropped objects.
10. The projectile determination method of claim 1 wherein said method further comprises:
outputting a detection result, wherein the detection result comprises one or more of the following information: whether there is a projectile, the type of projectile, the location of the projectile, the number of projectiles, and the picture of the projectile.
11. The projectile determination method of claim 10 wherein said detection results are output in a manner including one or a combination of: highlighting the projectile, prompting the projectile with a prompt identifier, text prompting the projectile, or sound prompting the projectile.
12. The projectile determination method of claim 1 wherein said method further comprises:
updating the historical background image based on the current background image.
13. A system for determining a projectile is characterized by comprising an acquisition module, a background extraction module, a change region determination module and a projectile detection module;
the acquisition module is used for acquiring at least one monitoring area image;
the background extraction module is used for determining a current background image based on the at least one monitoring area image;
the change region determination module is used for determining a change region in the current background image based on a historical background image;
the projectile detection module is for detecting a projectile in the varying area.
14. The projectile determination system of claim 13 wherein the current background image is an image of the monitored area after removal of the moving object from the image and the historical background image is an image of the historical monitored area after removal of the moving object from the image.
15. The projectile determination system of claim 13 wherein said varying areas include areas of said current background image that are different from said historical background image.
16. The projectile determination system of claim 13 wherein said projectile detection module is further adapted to:
obtaining a set of detected projectiles, the set of detected projectiles including one or more detected projectiles and associated information thereof, the associated information of the detected projectiles at least reflecting the position of the detected projectiles;
comparing the area of change with information relating to one or more detected projectiles;
in response to the comparison being a match, determining that the area of change contains the detected projectile.
17. The projectile determination system of claim 16 wherein said projectile detection module is further adapted to:
in response to the comparison result being a mismatch, detecting a projectile in the change region using a projectile determination model, and treating the detected projectile as a new projectile; the projectile determination model is a machine learning model.
18. The projectile determination system of claim 17 further comprising an update module for:
updating the set of detected projectiles based on the new projectile.
19. The projectile determination system of claim 13 wherein said projectile detection module is further adapted to:
detecting a projectile in the varying area using a projectile determination model; the projectile determination model is a machine learning model.
20. The projectile determination system as recited in any one of claims 17-19, further comprising a training module for:
acquiring a sample image, the sample image including at least one projectile;
labeling the projectile in the sample image;
and training a machine learning model by taking the sample image as input data and the labeled projectile as output data or a reference standard, so as to obtain a trained projectile determination model.
21. The projectile determination system of claim 20 wherein the projectile in the sample image includes at least one of: vehicles, pedestrians, roads, lane lines, guard rails/medians, plants, traffic cones, road billboards, dirt/soil slopes, and spilled/dropped objects.
22. The projectile determination system of claim 13 further comprising an output module for:
outputting a detection result, wherein the detection result comprises one or more of the following information: whether there is a projectile, the type of projectile, the location of the projectile, the number of projectiles, and the picture of the projectile.
23. The projectile determination system of claim 22 wherein said detection results are output in a manner including one or a combination of: highlighting the projectile, prompting the projectile with a prompt identifier, text prompting the projectile, or sound prompting the projectile.
24. The projectile determination system of claim 13 further comprising an update module; the update module is configured to update the historical background image based on the current background image.
25. A projectile determination device, the device comprising a processor and a memory; the memory for storing computer instructions which, when executed by the processor, cause the apparatus to implement a method of projectile determination as claimed in any one of claims 1 to 12.
26. A computer-readable storage medium storing computer instructions, wherein when at least a portion of the computer instructions are executed by at least one processor, the projectile determination method of any one of claims 1 to 12 is implemented.
CN201911313526.4A 2019-12-18 2019-12-18 Method and system for determining throwing object Pending CN111127507A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911313526.4A CN111127507A (en) 2019-12-18 2019-12-18 Method and system for determining throwing object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911313526.4A CN111127507A (en) 2019-12-18 2019-12-18 Method and system for determining throwing object

Publications (1)

Publication Number Publication Date
CN111127507A true CN111127507A (en) 2020-05-08

Family

ID=70498375

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911313526.4A Pending CN111127507A (en) 2019-12-18 2019-12-18 Method and system for determining throwing object

Country Status (1)

Country Link
CN (1) CN111127507A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111968078A (en) * 2020-07-28 2020-11-20 北京恒通智控机器人科技有限公司 Appearance detection method, device, equipment and storage medium for power transformation equipment
CN112822407A (en) * 2021-01-28 2021-05-18 华设设计集团股份有限公司 Video detection system and method for road throwing event
CN113743151A (en) * 2020-05-27 2021-12-03 顺丰科技有限公司 Method and device for detecting road surface sprinkled object and storage medium
CN113808409A (en) * 2020-06-17 2021-12-17 华为技术有限公司 Road safety monitoring method, system and computer equipment
CN114512006A (en) * 2022-04-18 2022-05-17 深圳市城市交通规划设计研究中心股份有限公司 Road surface sprinkle early warning method and device and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102063614A (en) * 2010-12-28 2011-05-18 天津市亚安科技电子有限公司 Method and device for detecting lost articles in security monitoring
US20120045090A1 (en) * 2010-08-17 2012-02-23 International Business Machines Corporation Multi-mode video event indexing
CN102411703A (en) * 2010-09-21 2012-04-11 索尼公司 Device and method for detecting specific object in image sequence as well as video camera equipment
CN104392630A (en) * 2014-11-26 2015-03-04 天津艾思科尔科技有限公司 Throw-out intelligent detection device and method
CN106485697A (en) * 2016-09-22 2017-03-08 成都通甲优博科技有限责任公司 A kind of roadbed subsidence based on binocular vision and foreign matter detecting method
CN107527009A (en) * 2017-07-11 2017-12-29 浙江汉凡软件科技有限公司 A kind of remnant object detection method based on YOLO target detections
CN110232359A (en) * 2019-06-17 2019-09-13 ***通信集团江苏有限公司 It is detained object detecting method, device, equipment and computer storage medium

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120045090A1 (en) * 2010-08-17 2012-02-23 International Business Machines Corporation Multi-mode video event indexing
CN102411703A (en) * 2010-09-21 2012-04-11 索尼公司 Device and method for detecting specific object in image sequence as well as video camera equipment
US20120093362A1 (en) * 2010-09-21 2012-04-19 Sony Corporation Device and method for detecting specific object in sequence of images and video camera device
CN102063614A (en) * 2010-12-28 2011-05-18 天津市亚安科技电子有限公司 Method and device for detecting lost articles in security monitoring
CN104392630A (en) * 2014-11-26 2015-03-04 天津艾思科尔科技有限公司 Throw-out intelligent detection device and method
CN106485697A (en) * 2016-09-22 2017-03-08 成都通甲优博科技有限责任公司 A kind of roadbed subsidence based on binocular vision and foreign matter detecting method
CN107527009A (en) * 2017-07-11 2017-12-29 浙江汉凡软件科技有限公司 A kind of remnant object detection method based on YOLO target detections
CN110232359A (en) * 2019-06-17 2019-09-13 ***通信集团江苏有限公司 It is detained object detecting method, device, equipment and computer storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
孙丽婷;宋焕生;关琦;闻江;: "Detection algorithm for stopped vehicles and dropped objects based on steady-state differences" *
汪贵平;马力旺;郭璐;王会峰;张?|;: "Image detection algorithm for expressway thrown-object events" *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113743151A (en) * 2020-05-27 2021-12-03 顺丰科技有限公司 Method and device for detecting road surface sprinkled object and storage medium
CN113808409A (en) * 2020-06-17 2021-12-17 华为技术有限公司 Road safety monitoring method, system and computer equipment
CN111968078A (en) * 2020-07-28 2020-11-20 北京恒通智控机器人科技有限公司 Appearance detection method, device, equipment and storage medium for power transformation equipment
CN112822407A (en) * 2021-01-28 2021-05-18 华设设计集团股份有限公司 Video detection system and method for road throwing event
CN112822407B (en) * 2021-01-28 2022-07-26 华设设计集团股份有限公司 Video detection system and method for road throwing event
CN114512006A (en) * 2022-04-18 2022-05-17 深圳市城市交通规划设计研究中心股份有限公司 Road surface sprinkle early warning method and device and storage medium

Similar Documents

Publication Publication Date Title
CN111127507A (en) Method and system for determining throwing object
US8855361B2 (en) Scene activity analysis using statistical and semantic features learnt from object trajectory data
US9365217B2 (en) Mobile pothole detection system and method
CN101587622B (en) Forest rocket detecting and identifying method and apparatus based on video image intelligent analysis
CN109255288A (en) A kind of road surface breakage detection method, device and terminal device
CN105448103B (en) Vehicle fake-license detection method and system
Morishita et al. SakuraSensor: Quasi-realtime cherry-lined roads detection through participatory video sensing by cars
US20180060986A1 (en) Information processing device, road structure management system, and road structure management method
US20150178572A1 (en) Road surface condition classification method and system
CN109377694B (en) Monitoring method and system for community vehicles
CN111753612B (en) Method and device for detecting casting object and storage medium
CN104200466A (en) Early warning method and camera
CN112447041A (en) Method and device for identifying operation behavior of vehicle and computing equipment
CN113505638B (en) Method and device for monitoring traffic flow and computer readable storage medium
CN109948455A (en) One kind leaving object detecting method and device
CN114782897A (en) Dangerous behavior detection method and system based on machine vision and deep learning
CN112084892B (en) Road abnormal event detection management device and method thereof
WO2021076573A1 (en) Systems and methods for assessing infrastructure
Soilán et al. Automatic road sign inventory using mobile mapping systems
CN113869275A (en) Vehicle object detection system that throws based on remove edge calculation
CN107221175A (en) A kind of pedestrian is intended to detection method and system
CN112597926A (en) Method, device and storage medium for identifying airplane target based on FOD image
CN110782653A (en) Road information acquisition method and system
CN114926791A (en) Method and device for detecting abnormal lane change of vehicles at intersection, storage medium and electronic equipment
CN113674314A (en) Method and device for detecting throwing event, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200508