CN113949830A - Image processing method - Google Patents

Image processing method

Info

Publication number
CN113949830A
Authority
CN
China
Prior art keywords
image
ith frame
information
determining
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111162368.4A
Other languages
Chinese (zh)
Other versions
CN113949830B (en)
Inventor
郭义明
吴应龙
郁启华
占磊
胡江海
黄光球
唐鑫鑫
邵书成
彭冬
李朝锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangxi Guoneng Energy Development Co ltd
Guoneng Zhishen Control Technology Co ltd
State Energy Group Guangxi Electric Power Co ltd
Original Assignee
Guangxi Guoneng Energy Development Co ltd
Guoneng Zhishen Control Technology Co ltd
State Energy Group Guangxi Electric Power Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangxi Guoneng Energy Development Co ltd, Guoneng Zhishen Control Technology Co ltd, State Energy Group Guangxi Electric Power Co ltd filed Critical Guangxi Guoneng Energy Development Co ltd
Priority to CN202111162368.4A priority Critical patent/CN113949830B/en
Publication of CN113949830A publication Critical patent/CN113949830A/en
Application granted granted Critical
Publication of CN113949830B publication Critical patent/CN113949830B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N25/68Noise processing, e.g. detecting, correcting, reducing or removing noise applied to defects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/77Retouching; Inpainting; Scratch removal
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)

Abstract

An embodiment of the present application discloses a method for processing image information. The method comprises the following steps: acquiring a gray value θi of an ith frame image, wherein i is a positive integer; determining gray scale change information between the ith frame image and an image before the ith frame; and if the gray scale change information does not meet a preset gray scale change condition, determining to execute a target detection operation on the ith frame image, wherein the gray scale change condition is determined according to the light irradiation condition corresponding to the acquisition time of the ith frame image.

Description

Image processing method
Technical Field
The present disclosure relates to the field of information processing, and more particularly, to an image processing method.
Background
A substation contains much high-voltage equipment, and for safety reasons many cameras may be installed. These cameras continuously monitor the images of designated areas. Because some power stations cover a large area and have numerous monitoring points, the recordings are often only reviewed after an accident has occurred, so accidents are discovered with a delay. Some cameras have an abnormality detection function and capture abnormal pictures based on light changes in the acquired images, but their accuracy is poor.
Disclosure of Invention
In order to solve any one of the above technical problems, an embodiment of the present application provides an image processing method.
In order to achieve the object of the embodiment of the present application, an embodiment of the present application provides a method for processing image information, including:
acquiring a gray value θi of an ith frame image, wherein i is a positive integer;
determining gray scale change information between the image information of the ith frame and the image before the ith frame;
and if the gray scale change information does not meet the preset gray scale change condition, determining to execute target detection operation on the ith frame of image, wherein the gray scale change condition is determined according to the light irradiation condition corresponding to the acquisition time of the ith frame of image.
A storage medium having a computer program stored therein, wherein the computer program is arranged to perform the method as described above when executed.
An electronic device comprising a memory having a computer program stored therein and a processor arranged to execute the computer program to perform the method as described above.
One of the above technical solutions has the following advantages or beneficial effects:
by obtaining the gray value θi of the ith frame image, determining gray scale change information between the ith frame image and an image before the ith frame, and, if the gray scale change information does not meet a preset gray scale change condition, determining to execute a target detection operation on the ith frame image, the occurrence of false detections caused by light interference can be reduced.
Additional features and advantages of the embodiments of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the embodiments of the application. The objectives and other advantages of the embodiments of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The accompanying drawings are included to provide a further understanding of the embodiments of the present application and are incorporated in and constitute a part of this specification; they illustrate embodiments of the present application and, together with the examples of the embodiments, do not constitute a limitation of the embodiments of the present application.
Fig. 1 is a flowchart of an image processing method provided in an embodiment of the present application;
Fig. 2 is another flowchart of an image processing method according to an embodiment of the present application;
Fig. 3 is a flowchart of a method for determining whether there is a change in an image in the method shown in Fig. 2;
Fig. 4 is a flowchart of a target tracking method in the method shown in Fig. 2.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application more apparent, the embodiments of the present application will be described in detail below with reference to the accompanying drawings. It should be noted that, in the embodiments of the present application, features in the embodiments and the examples may be arbitrarily combined with each other without conflict.
Fig. 1 is a flowchart of a method for processing image information according to an embodiment of the present application. As shown in fig. 1, the method shown in fig. 1 includes:
Step 101: obtain a gray value θi of the ith frame image, wherein i is a positive integer.
in an exemplary embodiment, gray value distribution information in a picture of a current frame is acquired, and pixels of a gray image of the current frame can be traversed; accumulating and summing sum is carried out on the gray value of each pixel point; calculating the total number n of pixels in the image; and acquiring the average gray value sum/n of the image.
Step 102: determine gray scale change information between the ith frame image and an image before the ith frame.
the gray-scale change information may be determined by comparing with gray-scale values of images of one or more frames prior to the ith frame, and determining the gray-scale change information.
In one exemplary embodiment, the image before the ith frame is an image of N consecutive frames before the ith frame, and N is a positive integer. Since the N consecutive frames are the image frames closest to the ith frame and their gray values are most similar to that of the ith frame, comparing the ith frame with the gray values of these N consecutive frame images can effectively eliminate the interference of external light and yield accurate gray change information.
In an exemplary embodiment, the image before the ith frame is an image of N non-consecutive frames before the ith frame, where the N non-consecutive frame images all fall within a preset acquisition period. For example, for the ith acquisition period, image frames acquired in the mth acquisition period may be selected as references to represent the external light conditions during that period.
Step 103: if the gray scale change information does not meet a preset gray scale change condition, determine to execute a target detection operation on the ith frame image, wherein the gray scale change condition is determined according to the light irradiation condition corresponding to the acquisition time of the ith frame image.
Because each image acquisition device is installed at a different position, its light irradiation conditions differ. The light irradiation conditions corresponding to different acquisition times can therefore be determined from the acquired image information, and the corresponding gray-value change condition determined accordingly, so that the condition better matches the environment in which the image acquisition device is installed.
For example, the gray values of images acquired by the image acquisition device at the acquisition time of the ith frame, over a period of time (e.g., one week or one month), may be collected; the maximum and minimum of these gray values may be determined, and the gray scale change condition determined based on the maximum and minimum values.
For example, the gray scale change condition may be a threshold determined based on the difference between the maximum value and the minimum value.
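A minimal sketch of deriving such a threshold from the historical gray values (the history container, the `margin` scaling factor, and the function name are assumptions; the patent does not specify concrete values):

```python
def gray_change_threshold(historical_grays, margin: float = 1.0) -> float:
    """Threshold for one acquisition time slot (sketch).

    historical_grays: average gray values collected at the same time of day
    over a longer period (e.g. one week or one month).  The threshold is
    based on the spread (max - min); `margin` is an assumed tuning factor.
    """
    g_max = max(historical_grays)
    g_min = min(historical_grays)
    return margin * (g_max - g_min)
```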
The gray scale change condition is determined according to the gray-value variation caused by external light. If the gray scale change information does not meet the gray scale change condition, this indicates that a person or object is interfering with the image acquisition, and the target detection function needs to be executed; otherwise, the change is attributable to normal light variation alone and the target detection function does not need to be executed.
Because the gray scale change condition is set, gray scale changes caused purely by light variation do not trigger the target detection operation, which effectively reduces false detections caused by light interference.
The method provided by the embodiment of the application obtains the gray value θi of the ith frame image, determines gray scale change information between the ith frame image and an image before the ith frame, and, if the gray scale change information does not meet a preset gray scale change condition, determines to execute a target detection operation on the ith frame image, thereby reducing the occurrence of false detections caused by light interference.
The method provided by the embodiments of the present application is explained as follows:
in an exemplary embodiment, the determining gray-scale change information between the image information of the ith frame and the image before the ith frame includes:
acquiring gray values of images of at least two frames before the ith frame;
determining an average value Vi of the gray values of the images of the at least two frames;
calculating the difference between the gray value θi and the average value Vi, and obtaining the gray scale change information according to the difference.
The gray value of the image of each frame before the ith frame can be obtained in the manner described in step 101, and the average value is obtained by averaging the gray values of the at least two frames, which is simple and convenient to implement.
In one exemplary embodiment, when the at least two frames are the N consecutive frames before the ith frame, the average value Vi is obtained by the following expression:
Vi = βVi-1 + (1-β)θi
wherein 0 < β < 1, and Vi-1 represents the average of the gray values corresponding to the images of the N consecutive frames preceding the (i-1)th frame.
Here β is a weight whose value can be set according to actual needs.
By adopting the method, the gray value information of N continuous frames before the ith frame can be more accurately represented, and the accuracy of subsequent judgment is improved.
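A minimal sketch of this moving-average update and the subsequent change check (the function layout is an assumption; β = 0.9 follows the application example given later):

```python
def update_moving_average(v_prev: float, theta_i: float, beta: float = 0.9) -> float:
    """Vi = beta * Vi-1 + (1 - beta) * theta_i, with 0 < beta < 1."""
    return beta * v_prev + (1.0 - beta) * theta_i

def needs_target_detection(theta_i: float, v_prev: float, threshold: float,
                           beta: float = 0.9):
    """Return (run_detection, v_i).

    Detection is triggered when the gray-scale change |theta_i - Vi| exceeds
    the light-based threshold, i.e. the gray scale change condition is not met.
    """
    v_i = update_moving_average(v_prev, theta_i, beta)
    change = abs(theta_i - v_i)
    return change > threshold, v_i
```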
In the course of implementing the invention, it was found that some prior-art cameras have a face detection function, but their detection of small animals and of people seen from the back or side is limited, and the cameras are either fixed or must be adjusted manually. In view of these findings, the method provided by the embodiments of the application can effectively suppress the interference of light changes on abnormal-event capture, and can automatically track a detected target and adjust the camera angle according to its position.
In an exemplary embodiment, after determining whether to perform the target detection operation on the ith frame image according to the change information, the method further includes:
determining the position information of the target in the ith frame image;
and adjusting the image acquisition angle according to the position information.
The acquisition angle of the image acquisition device is adjusted based on the position information of the target in the ith frame image, thereby achieving automatic tracking.
In an exemplary embodiment, the adjusting the acquisition angle of the image according to the position information includes:
judging whether the position information meets the boundary condition of the ith frame of image;
if the position information meets the boundary condition of the ith frame of image, determining a target boundary corresponding to the position information;
and adjusting the image acquisition angle according to the target boundary.
The boundary condition may be expressed by coordinate information of the image. If the position information falls within the coordinate range corresponding to the boundary condition, the boundary condition of the ith frame image is met, which indicates that the target may move out of the acquisition range of the image acquisition device as time goes on. The direction in which the target is leaving, i.e. the target boundary, therefore needs to be determined, and the acquisition angle adjusted according to the target boundary, so that the target remains within the acquisition range.
In an exemplary embodiment, the determining the target boundary corresponding to the position information includes:
if the position information of at least two targets meets the boundary condition of the ith frame image, determining the boundary corresponding to each target;
and determining the target boundary according to the number of targets on the same boundary.
If multiple targets are detected at the boundary of the image acquisition range, the direction in which each target is leaving, i.e. the boundary corresponding to each target, is determined; the boundary with the largest number of targets may then be selected as the target boundary according to the number of targets on each boundary.
In an exemplary embodiment, an importance order of the targets may be determined, and the boundary corresponding to the target with the highest importance used as the target boundary. The importance order may be set by an external request or determined according to a locally preset rule. For example, the rule may be based on target type, with importance from high to low being: driven equipment (such as unmanned aerial vehicles and automobiles), people, animals.
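A minimal sketch combining the count-based rule above with the importance order (the boundary labels, the input data shape, and the IMPORTANCE table are illustrative assumptions):

```python
from collections import Counter

# Assumed importance ranking (highest first), following the preset rule above.
IMPORTANCE = {"vehicle": 0, "person": 1, "animal": 2}

def select_target_boundary(out_of_bounds_targets):
    """Pick the target boundary from (boundary, target_type) pairs, e.g.
    [("left", "person"), ("left", "animal"), ("right", "vehicle")]."""
    if not out_of_bounds_targets:
        return None
    # Rule 1: prefer the boundary touched by the most targets.
    counts = Counter(boundary for boundary, _ in out_of_bounds_targets)
    best_count = max(counts.values())
    candidates = [b for b, c in counts.items() if c == best_count]
    if len(candidates) == 1:
        return candidates[0]
    # Rule 2 (assumed tie-break): use the most important target type among
    # the tied boundaries.
    tied = [(b, t) for b, t in out_of_bounds_targets if b in candidates]
    tied.sort(key=lambda bt: IMPORTANCE.get(bt[1], len(IMPORTANCE)))
    return tied[0][0]
```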
In an exemplary embodiment, the determining the position information of the target in the ith frame image comprises:
determining size information of the position information in an ith frame image;
and carrying out target detection on the acquired new image according to the size information.
By determining the size information corresponding to the position information, the size of the target can be obtained and the target identified according to its size; this improves the efficiency of subsequent target tracking and reduces the image recognition workload.
The method provided by the embodiment of the application is explained by using an application example as follows:
This application example uses the statistical distribution of gray values to judge whether the gray values of the current frame and the previous 10 frames differ by more than a fixed threshold, replacing the light-based abnormality detection judgment; this better filters out false alarms caused by light. The target is tracked by matching the image obtained after target detection against the target in the previous frame image, and dynamically comparing and tracking the coordinate position after detection reduces the target detection load on the camera. The camera can be dynamically and adaptively adjusted during tracking from the moment of target detection, and once target tracking ends, the camera is reset to its originally configured position.
Fig. 2 is another flowchart of an image processing method according to an embodiment of the present application. As shown in fig. 2, the method includes:
and S1, acquiring the terminal image. And acquiring the video stream from the terminal camera.
And S2, whether the image has change or not. The difference of each frame of image is analyzed when capturing the video stream graph of S1. If the difference exceeds a given threshold, the current frame image output is intercepted, otherwise, the next frame is continuously waited.
And S3, detecting the target. Models using deep learning methods, such as the YOLO series and the SSD series, are recommended according to the target design detection model that needs to be detected actually.
And S4, judging whether the target exists. And judging the detected result in the S3, if the target is not detected, returning to continuously wait for the camera to respond, if the target is detected, performing S5 and further judging.
And S5, target tracking. And detecting the target and tracking and recording the target.
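For step S3, a minimal detection sketch using a pre-trained YOLO model (the ultralytics package, the yolov8n.pt weights, and the confidence threshold are assumptions; the patent only recommends YOLO- or SSD-style models without naming an implementation):

```python
# pip install ultralytics  (assumed dependency; the patent names no library)
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # assumed pre-trained weights

def detect_targets(frame_bgr, conf: float = 0.25):
    """Run detection on one frame; return [(x1, y1, x2, y2, label), ...]."""
    results = model(frame_bgr, conf=conf, verbose=False)[0]
    detections = []
    for box in results.boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        label = model.names[int(box.cls)]
        detections.append((x1, y1, x2, y2, label))
    return detections
```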
Fig. 3 is a flowchart of a method for determining whether there is a change in an image in the method shown in fig. 2. As shown in fig. 3, the method includes:
S21: compute the gray value of the current frame picture. Traverse the pixels of the grayscale image of the frame to be processed; accumulate the gray values of all pixel points into a sum; count the total number of pixels n in the image; and compute the average gray value sum/n of the image.
S22: compare the current gray value with the gray-value distribution of the previous 10 frames.
The value obtained in S21 is compared with the previous 10 frame images: a moving average Vt is computed from the average gray values of the previous 10 frames, and the moving average Vt is then subtracted from the current gray value to obtain the change information. The exponentially weighted moving average is computed in time order as follows:
Vt = βVt-1 + (1-β)θt
where t = 1, 2, ..., 10, β is a weight generally taken as 0.9, and θt is the actual value at time t.
S23: determine whether the change information is greater than the threshold.
If not, the current gray value is saved and the oldest stored gray value is discarded.
If yes, proceed to S24.
S24: output the current frame picture.
Fig. 4 is a flowchart of a target tracking method in the method shown in fig. 2. As shown in fig. 4, the method includes:
S51: crop the current frame image. After a target is detected, the current frame image is cropped according to the detected target coordinates (xn1, yn1, xn2, yn2).
S52: determine whether the target is near a boundary. The coordinates (xn1, yn1, xn2, yn2) are compared with the image range coordinates (0, 0, w, h); if the target is close to a boundary, 'yes' is returned together with the target coordinates, otherwise 'no' is returned. If multiple targets go out of bounds in different directions, the coordinates of the direction with the largest offset can be returned according to the target weights (which target takes priority is specified in advance).
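A minimal sketch of the S52 boundary check (the pixel margin used to decide "close to the boundary" is an assumption; the patent gives no value):

```python
def near_boundary(box, frame_w: int, frame_h: int, margin: int = 20):
    """Return which boundaries of the (0, 0, w, h) image range the target
    box (xn1, yn1, xn2, yn2) approaches within `margin` pixels."""
    xn1, yn1, xn2, yn2 = box
    hits = []
    if xn1 <= margin:
        hits.append("left")
    if yn1 <= margin:
        hits.append("top")
    if xn2 >= frame_w - margin:
        hits.append("right")
    if yn2 >= frame_h - margin:
        hits.append("bottom")
    return hits  # an empty list means the target is safely inside the frame
```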
S53: adjust the lens. The movement interface of the terminal camera is invoked according to the result returned by S52, and the corresponding angle is adjusted so that the target stays within the lens.
S54: output the result. The detection result of S51 is returned.
S55: acquire the current frame image. The motion-detection image returned by S4 is acquired.
S56: store the detection result. The detection result of S51 is saved, and target tracking is performed in conjunction with the current frame image acquired in S55.
S57: sliding-comparison detection. Sliding-window cropping is performed over the entire S55 image using the result of S56: windows the size of the S56 crop are sampled on the S55 image with a stride of 10 pixels, and the content similarity of each sampled window is matched against the stored crop. Various matching methods can be used, such as a neural network or gray-level mean matching; the embodiment described here employs Hausdorff distance matching.
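A minimal sketch of the S57 sliding-window comparison; for brevity it scores windows with a mean gray-level difference, one of the matching options the text mentions, rather than the Hausdorff distance used in the embodiment:

```python
import numpy as np

def sliding_match(frame_gray: np.ndarray, template_gray: np.ndarray,
                  stride: int = 10):
    """Slide the previously detected target crop over the new frame in
    10-pixel steps and return (best_score, (x1, y1, x2, y2)); a lower
    score means a better match."""
    th, tw = template_gray.shape
    fh, fw = frame_gray.shape
    best_score, best_box = np.inf, None
    template = template_gray.astype(np.float32)
    for y in range(0, fh - th + 1, stride):
        for x in range(0, fw - tw + 1, stride):
            window = frame_gray[y:y + th, x:x + tw].astype(np.float32)
            # Mean absolute gray-level difference as the similarity score.
            score = float(np.mean(np.abs(window - template)))
            if score < best_score:
                best_score, best_box = score, (x, y, x + tw, y + th)
    return best_score, best_box
```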
S58: determine whether a target is detected. If no target is detected from the result returned by S57, S54 is executed. If a target is detected, the coordinates of its upper-left and lower-right corners are computed and returned to S51 for cropping.
The method provided by this application example can solve the problem of false alarms caused by light interference in video monitoring of conventional power stations, and it combines deep learning and artificial-intelligence algorithms to detect targets in the video. After a target is detected by the camera, the target is tracked across the subsequent frame images, which reduces the number of calls to the detection model; adjusting the camera angle by moving it in linkage with the target's position in the lens reduces the manual adjustment needed to track the target.
An embodiment of the present application provides a storage medium, in which a computer program is stored, wherein the computer program is configured to perform the method described in any one of the above when the computer program runs.
An embodiment of the application provides an electronic device, comprising a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute the computer program to perform the method described in any one of the above.
It will be understood by those of ordinary skill in the art that all or some of the steps of the methods, systems, functional modules/units in the devices disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. In a hardware implementation, the division between functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed by several physical components in cooperation. Some or all of the components may be implemented as software executed by a processor, such as a digital signal processor or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit. Such software may be distributed on computer readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). The term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, as is well known to those of ordinary skill in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. In addition, communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media as known to those skilled in the art.

Claims (10)

1. A method of processing image information, comprising:
acquiring a gray value θi of an ith frame image, wherein i is a positive integer;
determining gray scale change information between the image information of the ith frame and the image before the ith frame;
and if the gray scale change information does not meet the preset gray scale change condition, determining to execute target detection operation on the ith frame of image, wherein the gray scale change condition is determined according to the light irradiation condition corresponding to the acquisition time of the ith frame of image.
2. The method of claim 1, wherein the image before the ith frame is an image of N consecutive frames before the ith frame, wherein N is a positive integer.
3. The method according to claim 1 or 2, wherein the determining gray scale change information between the image information of the ith frame and the image before the ith frame comprises:
acquiring gray values of images of at least two frames before the ith frame;
determining an average value Vi of the gray values of the images of the at least two frames;
calculating the difference between the gray value θi and the average value Vi, and obtaining the gray scale change information according to the difference.
4. The method according to claim 3, wherein when the at least two frames are N consecutive frames before the ith frame, the average value Vi is obtained by the following expression:
Vi = βVi-1 + (1-β)θi
wherein 0 < β < 1, and Vi-1 represents the average of the gray values corresponding to the images of the N consecutive frames preceding the (i-1)th frame.
5. The method according to claim 1, wherein after determining whether to perform the target detection operation on the ith frame image according to the change information, the method further comprises:
determining the position information of the target in the ith frame image;
and adjusting the image acquisition angle according to the position information.
6. The method of claim 5, wherein the adjusting the image acquisition angle according to the position information comprises:
judging whether the position information meets the boundary condition of the ith frame of image;
if the position information meets the boundary condition of the ith frame of image, determining a target boundary corresponding to the position information;
and adjusting the image acquisition angle according to the target boundary.
7. The method of claim 6, wherein the determining the boundary of the object corresponding to the position information comprises:
if the position information of at least two targets meets the boundary condition of the ith frame of image, determining the boundary information corresponding to each target;
and determining the target boundary according to the number of the targets on the same boundary.
8. The method of claim 6, wherein the determining the position information of the target in the ith frame of image comprises:
determining size information of the position information in an ith frame image;
and carrying out target detection on the acquired new image according to the size information.
9. A storage medium, in which a computer program is stored, wherein the computer program is arranged to perform the method of any of claims 1 to 8 when executed.
10. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and wherein the processor is arranged to execute the computer program to perform the method of any of claims 1 to 8.
CN202111162368.4A 2021-09-30 2021-09-30 Image processing method Active CN113949830B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111162368.4A CN113949830B (en) 2021-09-30 2021-09-30 Image processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111162368.4A CN113949830B (en) 2021-09-30 2021-09-30 Image processing method

Publications (2)

Publication Number Publication Date
CN113949830A true CN113949830A (en) 2022-01-18
CN113949830B CN113949830B (en) 2023-11-24

Family

ID=79329656

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111162368.4A Active CN113949830B (en) 2021-09-30 2021-09-30 Image processing method

Country Status (1)

Country Link
CN (1) CN113949830B (en)

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090324102A1 (en) * 2008-06-27 2009-12-31 Shintaro Okada Image processing apparatus and method and program
KR20120032178A (en) * 2010-09-28 2012-04-05 엘지디스플레이 주식회사 Light emitting display device and method for driving the same
CN102779272A (en) * 2012-06-29 2012-11-14 惠州市德赛西威汽车电子有限公司 Switching method for vehicle detection modes
CN108933897A (en) * 2018-07-27 2018-12-04 南昌黑鲨科技有限公司 Method for testing motion and device based on image sequence
CN109409238A (en) * 2018-09-28 2019-03-01 深圳市中电数通智慧安全科技股份有限公司 A kind of obstacle detection method, device and terminal device
CN109660736A (en) * 2017-10-10 2019-04-19 凌云光技术集团有限责任公司 Method for correcting flat field and device, image authentication method and device
CN110149486A (en) * 2019-05-17 2019-08-20 凌云光技术集团有限责任公司 A kind of automatic testing method, bearing calibration and the system of newly-increased abnormal point
US20190355104A1 (en) * 2016-09-29 2019-11-21 Huawei Technologies Co., Ltd. Image Correction Method and Apparatus
CN111223129A (en) * 2020-01-10 2020-06-02 深圳中兴网信科技有限公司 Detection method, detection device, monitoring equipment and computer readable storage medium
CN111405218A (en) * 2020-03-26 2020-07-10 深圳市微测检测有限公司 Touch screen time delay detection method, system, device, equipment and storage medium
CN111724430A (en) * 2019-03-22 2020-09-29 株式会社理光 Image processing method and device and computer readable storage medium
CN111866383A (en) * 2020-07-13 2020-10-30 Oppo广东移动通信有限公司 Image processing method, terminal and storage medium
CN112132858A (en) * 2019-06-25 2020-12-25 杭州海康微影传感科技有限公司 Tracking method of video tracking equipment and video tracking equipment
US20210029272A1 (en) * 2019-07-25 2021-01-28 Shenzhen Skyworth-Rgb Electronic Co., Ltd. Detection method for static image of a video and terminal, and computer-readable storage medium
US20210089820A1 (en) * 2019-09-19 2021-03-25 Konica Minolta, Inc. Image processing apparatus and storage medium


Also Published As

Publication number Publication date
CN113949830B (en) 2023-11-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant