CN110392303B - Method, device and equipment for generating heat map video and storage medium - Google Patents

Method, device and equipment for generating heat map video and storage medium Download PDF

Info

Publication number
CN110392303B
CN110392303B · CN201910521249.XA
Authority
CN
China
Prior art keywords
video
time period
heat
event
video event
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910521249.XA
Other languages
Chinese (zh)
Other versions
CN110392303A (en)
Inventor
章伦啟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN201910521249.XA priority Critical patent/CN110392303B/en
Publication of CN110392303A publication Critical patent/CN110392303A/en
Application granted granted Critical
Publication of CN110392303B publication Critical patent/CN110392303B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/74Browsing; Visualisation therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44008Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/488Data services, e.g. news ticker
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)

Abstract

The application relates to a method, an apparatus, a device, and a storage medium for generating a heat map video. The method includes: obtaining video pictures shot by a camera; detecting a preset video event in the video pictures and generating heat data of the video event for each shooting time period; and encoding the heat data of the video event for each shooting time period together with the video pictures of the corresponding time period to generate a heat map video of the video event. With this method, the activity of intelligent events in the video shot by the camera can be counted and displayed dynamically.

Description

Method, device and equipment for generating heat map video and storage medium
Technical Field
The present application relates to the field of video processing technologies, and in particular, to a method, an apparatus, a device, and a storage medium for generating a heat map video.
Background
With the development of video surveillance technology, intelligent video surveillance devices are deployed in places such as shopping malls, railway stations, hospitals, road intersections, and office buildings. Besides capturing conventional surveillance video, they can monitor intelligent events (such as area intrusion, tripwire intrusion, people counting, and face recognition). While watching the surveillance video, a user usually also wants to know how the activity of intelligent events in the monitored view changes, that is, how the occurrence frequency of the intelligent event in each local area of the view varies over time, so as to allocate manpower and material resources more reasonably. At present, statistics and dynamic display of intelligent event activity in surveillance video are not available.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a method, an apparatus, a device, and a storage medium for generating a heat map video that can perform statistics on and dynamically display intelligent event activity in a surveillance video.
A method of generating a heat map video, the method comprising:
acquiring a video picture shot by a camera;
detecting a preset video event in the video picture, and generating heat data of the video event in each shooting time period;
and coding the heat data of the video event in each shooting time period and the video pictures in the corresponding time period to generate a heat map video of the video event.
In one embodiment, the step of generating heat data of the video event in each shooting time period comprises:
detecting the video event in a video picture within the shooting time period;
when the video event is detected, acquiring the current occurrence frequency and the occurrence position of the video event;
and counting the occurrence times of the video events at each picture position in the shooting time period according to the current occurrence times and the occurrence positions of the video events to obtain heat data of the video events in the shooting time period.
A method of generating a heat map video, the method comprising:
receiving a video event to be observed and an observation time period input by a user;
sending a data acquisition request to the camera equipment, and receiving the video data in the observation time period and the heat data of the video event to be observed in the observation time period, which are returned by the camera equipment;
and coding the shot video in the observation time period and the heat data of the video event to be observed in the observation time period, generating a heat map video of the video event to be observed in the observation time period, and playing the heat map video.
An apparatus for generating a heat map video, the apparatus comprising:
the video image acquisition module is used for acquiring a video image shot by the camera;
the heat data generation module is used for detecting a preset video event in the video picture and generating heat data of the video event in each shooting time period; and
and the heat map video generation module is used for encoding the heat data of the video event in each shooting time period and the video pictures in the corresponding time period to generate the heat map video of the video event.
An apparatus for generating a heat map video, the apparatus comprising:
the input data receiving module is used for receiving a video event to be observed and an observation time period which are input by a user;
the camera shooting data acquisition module is used for sending a data acquisition request to camera shooting equipment and receiving video data in the observation time period and heat data of the video event to be observed in the observation time period, which are returned by the camera shooting equipment;
and the heat map video generation module is used for encoding the shooting video in the observation time period and the heat data of the video event to be observed in the observation time period, generating the heat map video of the video event to be observed in the observation time period and playing the heat map video.
An image pickup apparatus includes a memory storing a computer program and a processor implementing the steps of the above-described method of generating a heat map video when the processor executes the computer program.
A computer device comprising a memory storing a computer program and a processor implementing the steps of the above-described method for generating a heat map video when the processor executes the computer program.
A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the above-mentioned method for generating a heat map video.
According to the above method, apparatus, device, and medium for generating a heat map video, a video event is detected in the video pictures shot by a camera to obtain heat data of the video event for each shooting time period, and the heat data of each shooting time period is encoded together with the video pictures of the corresponding time period to generate a heat map video of the video event. Each video frame of the heat map video combines the scene picture with the heat data, and the video event is the intelligent event currently being detected, so the heat map video achieves both statistics and dynamic display of intelligent event activity in the video.
Drawings
FIG. 1 is a diagram of an exemplary embodiment of a method for generating a heat map video;
FIG. 2 is a flowchart illustrating a method for generating a heat map video according to an embodiment;
FIG. 3 is a flowchart illustrating a method for generating a heat map video according to another embodiment;
FIG. 4 is a block diagram showing an example of a structure of a device for generating a heat map video;
FIG. 5 is a block diagram showing the construction of a heat map video generating apparatus according to another embodiment; and
FIG. 6 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The method for generating the heat map video can be applied to the application environment shown in fig. 1. The camera device 102 communicates with the server 104 through a network. The camera device 102 captures a video picture of the current scene through its camera and can perform intelligent event detection in the captured video picture, for example intrusion detection (including area intrusion detection and tripwire intrusion detection, in which dangerous areas and intrusion detection rules are preset and it is detected whether a person enters a dangerous area) and person loitering detection (in which dangerous areas and loitering detection rules are preset and it is detected whether a person loiters in a dangerous area). The camera device 102 may send the captured and detected data to the server 104 for storage, or to the user device 106 via the server 104. The camera device 102 may be, but is not limited to, a bullet camera, a fixed dome camera, or a PTZ dome camera; the server 104 may be implemented as a stand-alone server or as a server cluster composed of multiple servers; and the user device may be, but is not limited to, a personal computer, a notebook computer, a smartphone, a tablet computer, or a portable wearable device.
In one embodiment, as shown in fig. 2, a method for generating a heat map video is provided, which is described by taking an example of the method applied to the image capturing apparatus 102 in fig. 1, and includes the following steps:
and step 202, acquiring a video picture shot by the camera.
The camera is located on the camera device and used for shooting videos of a scene where the camera device is located.
And 204, detecting a preset video event in the video picture, and generating heat data of the video event in each shooting time period.
In particular, the video event, i.e., the intelligent event currently being detected, may be specified by a user or by the system. A detection period for the video event is preset; within each detection period, the occurrence of the video event in the video pictures is monitored, and the number of occurrences of the video event at each picture position during that period is counted, yielding the heat data of the video event for each detection period and thus for the different shooting time periods. The duration of a shooting time period equals the duration of a detection period.
In one embodiment, the detection period of the video event is defined in terms of video frames, and the heat data of the video event is counted over a preset number of frames each time. This makes it convenient to establish, frame by frame, the correspondence between the counted heat data and the shot video pictures, which improves the accuracy of the subsequent encoding of heat data and video pictures and therefore the accuracy of the generated heat map video. For example, the heat data of the monitored event may be counted once every 8 frames of video.
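As an illustration of this frame-based counting, the following is a minimal sketch (not taken from the patent); `detect_events` is a hypothetical detector returning the picture positions at which the video event occurs in a frame, and the heat data for successive 8-frame detection periods is accumulated as follows:

```python
import numpy as np

PERIOD_FRAMES = 8  # detection period measured in video frames

def heat_data_per_period(frames, detect_events, height, width):
    """Yield one 2-D occurrence-count matrix (heat data) per detection period.

    detect_events(frame) is a hypothetical detector returning a list of
    (row, col) picture positions at which the preset video event occurred.
    """
    counts = np.zeros((height, width), dtype=np.uint32)
    for i, frame in enumerate(frames, start=1):
        for row, col in detect_events(frame):
            counts[row, col] += 1              # one more occurrence at this position
        if i % PERIOD_FRAMES == 0:             # a full detection period has elapsed
            yield counts
            counts = np.zeros((height, width), dtype=np.uint32)
```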
And step 206, encoding the heat data of the video event in each shooting time period and the video pictures in the corresponding time period to generate a heat map video of the video event.
Specifically, the heat data of the video event for each shooting time period includes the number of occurrences of the video event at different picture positions within that shooting time period. From these occurrence counts, a heat map of the video event for each shooting time period can be obtained; the heat map for any shooting time period covers all heat data of the video event in and before that time period, that is, the number of occurrences of the video event at each picture position in that time period and all earlier time periods. The heat map of each shooting time period is superimposed on the video pictures of the corresponding time period, completing the encoding of the heat map and the video pictures into heat map video frames, and these heat map video frames form the heat map video of the video event.
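For concreteness, the per-period encoding described in step 206 could be organized roughly as in the sketch below. It is only an assumed arrangement: `to_increment_image`, `blend_heat_maps`, and `overlay_on_frame` stand in for the conversion, accumulation, and superimposition steps detailed in later embodiments, and OpenCV's `VideoWriter` is just one possible encoder (`frame_size` is the (width, height) of the video pictures).

```python
import cv2

def encode_heat_map_video(periods, to_increment_image, blend_heat_maps,
                          overlay_on_frame, out_path, fps, frame_size):
    """Encode per-period heat data and video pictures into one heat map video.

    periods: iterable of (heat_data, frames) pairs, one pair per shooting period.
    The three callables are placeholders for the conversion, accumulation, and
    superimposition steps described in the following embodiments.
    """
    fourcc = cv2.VideoWriter_fourcc(*"mp4v")
    writer = cv2.VideoWriter(out_path, fourcc, fps, frame_size)  # frame_size = (width, height)
    heat_map = None                                        # cumulative heat map so far
    for heat_data, frames in periods:
        increment = to_increment_image(heat_data)          # grayscale heat increment image
        heat_map = increment if heat_map is None else blend_heat_maps(heat_map, increment)
        for frame in frames:                               # pictures of the same period
            writer.write(overlay_on_frame(frame, heat_map))
    writer.release()
```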
In the above method for generating the heat map video, a video event is detected in the video pictures shot by a camera to obtain heat data of the video event for each shooting time period; from this heat data, the number of occurrences of the video event at each picture position in and before any shooting time period can be obtained. The heat data of each shooting time period is then encoded with the video pictures of the corresponding time period to generate a heat map video of the video event, so that the intelligent event in the video shot by the camera is counted dynamically and displayed dynamically through the generated heat map video.
In one embodiment, when detecting heat data of a video event in a shooting time period, the video event is detected in the shooting time period, when the occurrence of the video event is detected, the current occurrence frequency and the occurrence position of the video event are acquired, and the occurrence frequency of the video event in each picture position in the shooting time period is counted according to the current occurrence frequency and the occurrence position of the video event to obtain the heat data of the video event in the shooting time period.
In one embodiment, when detecting that a video event occurs, the image capturing apparatus may obtain the occurrence region of the video event (typically a rectangular region) and obtain the occurrence position of the video event in the video picture by calculating the center coordinates of that region, which improves the accuracy of the detected occurrence position.
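A minimal sketch of this center-point calculation, assuming the detector reports the occurrence region as an axis-aligned rectangle (x, y, w, h):

```python
def occurrence_position(region):
    """Map a rectangular occurrence region (x, y, w, h) to its center pixel."""
    x, y, w, h = region
    return x + w // 2, y + h // 2

# e.g. a 40x20 region whose top-left corner is (100, 60) has its center at (120, 70)
assert occurrence_position((100, 60, 40, 20)) == (120, 70)
```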
In one embodiment, when generating the heat maps of the video events in the respective shooting periods according to the heat data of the video events in the respective shooting periods, since the heat data of the video events in the respective shooting periods is the number of times of occurrence of the video events at the respective picture positions in the respective shooting periods, the heat data of the video events in the respective shooting periods can be converted into heat increment images of the video events in the respective shooting periods, and the heat increment images of the video events in the respective shooting periods represent the occurrence of the video events in the respective shooting periods. When the heat map of the video event in the current shooting time period is generated, the heat map of the video event in the last shooting time period is obtained, the heat increment image of the video event in the current shooting time period is overlaid on the heat map of the video event in the last shooting time period, the heat map of the video event in the current shooting time period is obtained, and then the heat map of the video event in each shooting time period can be obtained. The heat map of the video event in any shooting time period is used for representing the occurrence of the video event in any shooting time period and before any shooting time period.
In one embodiment, the heat data of the video event at each capture time period is a two-dimensional matrix. The matrix scale of the two-dimensional matrix is the same as that of the image matrix of the video picture, and the value of each element in the two-dimensional matrix represents the occurrence frequency of the video event at the corresponding picture position, so that the occurrence condition of the video event can be clearly and accurately recorded through the two-dimensional matrix.
As an example, taking a shooting time period of 8 frames, each time the video event occurs within the period, the count at the corresponding occurrence position is incremented by 1. The resulting heat data may be a two-dimensional count matrix (reproduced only as an image in the original publication) in which most entries are 0 and a few entries are 8, 4, 3, 2, and 6,
where the entries 8, 4, 3, 2, and 6 indicate that the video event occurred 8, 4, 3, 2, and 6 times, respectively, at the corresponding picture positions.
In one embodiment, the heat data of the video event for each shooting time period is a two-dimensional matrix. When converting the heat data of each shooting time period into a heat increment image, the mean of the maximum and minimum elements of the heat data of the current shooting time period is calculated, and the deviation of each element from this mean is computed. The gray scale corresponding to each element is then calculated from its deviation relative to the mean, yielding the heat increment image of the video event for the current shooting time period. The heat increment image of every shooting time period is obtained in the same way.
In one embodiment, in the heat data of the video event in the current shooting time period, the calculation formula of the gray scale corresponding to each element is as follows:
(gray-scale conversion formula, reproduced only as an image in the original publication)
wherein χ is the deviation of an element of the heat data from the mean value N, N is the mean of the maximum and minimum elements of the heat data, and P is the gray scale corresponding to that element.
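Since the conversion formula itself appears only as an image in the original publication, the sketch below substitutes a simple linear rescaling that is consistent with the variables named above (the deviation χ from the mid-range mean N mapped onto the 0-255 gray range); it is an assumption, not the patented formula:

```python
import numpy as np

def to_increment_image(heat_data):
    """Convert a 2-D occurrence-count matrix into an 8-bit grayscale image.

    Assumed mapping (not the patented formula): the deviation chi of each count
    from the mid-range mean N = (max + min) / 2 is rescaled linearly so that the
    minimum count maps to gray 0 and the maximum count maps to gray 255.
    """
    counts = heat_data.astype(np.float64)
    if counts.max() == counts.min():
        return np.zeros(counts.shape, dtype=np.uint8)   # no activity variation this period
    n = (counts.max() + counts.min()) / 2.0             # mid-range mean N
    chi = counts - n                                    # deviation of each element from N
    half_range = (counts.max() - counts.min()) / 2.0
    gray = 255.0 * (chi + half_range) / (2.0 * half_range)
    return np.clip(gray, 0, 255).astype(np.uint8)
```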
In one embodiment, when overlaying the heat increment image of the video event in the current shooting period on the heat map of the video event in the last shooting period, the overlaying formula is expressed as:
(superposition formula, reproduced only as an image in the original publication)
where P1 is the pixel gray scale of the heat map of the video event in the previous shooting time period, P2 is the pixel gray scale of the heat increment image of the video event in the current shooting time period, N1 is the heat mean of the heat map of the previous shooting time period, and N2 is the heat mean of the heat increment image of the current shooting time period. The heat mean is the mean of the maximum and minimum heat values on the heat map or heat increment image, where the heat at a single image pixel is the number of occurrences of the video event at that pixel.
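The superposition formula is likewise reproduced only as an image. One blending rule consistent with the variables defined above, offered purely as an assumption, is a weighted average of the two gray values with the heat means N1 and N2 as weights:

```python
import numpy as np

def heat_mean(img):
    """Mean of the maximum and minimum heat (gray) values on a heat image."""
    return (float(img.max()) + float(img.min())) / 2.0

def blend_heat_maps(prev_heat_map, increment_image):
    """Overlay the current period's heat increment image onto the previous heat map.

    Assumed rule (the patent gives the formula only as an image):
    P = (N1 * P1 + N2 * P2) / (N1 + N2), with the heat means N1, N2 as weights.
    """
    n1, n2 = heat_mean(prev_heat_map), heat_mean(increment_image)
    if n1 + n2 == 0:
        return prev_heat_map.copy()
    blended = (n1 * prev_heat_map.astype(np.float64)
               + n2 * increment_image.astype(np.float64)) / (n1 + n2)
    return np.clip(blended, 0, 255).astype(np.uint8)
```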
In one embodiment, the heat map of the video event for each shooting time period is superimposed on the video pictures of the corresponding time period in a mask-overlay manner, which improves the visual effect of combining the heat map with the video pictures. An image transformation algorithm is applied during the mask overlay, for example a rainbow (pseudo-color) mapping algorithm or a linear image transformation algorithm.
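A sketch of the mask-based superimposition using a rainbow pseudo-color mapping (OpenCV's built-in colormap is used here only for illustration; the patent does not prescribe a particular library, and `alpha` and `threshold` are illustrative parameters):

```python
import cv2

def overlay_on_frame(frame_bgr, heat_map_gray, alpha=0.5, threshold=8):
    """Superimpose a grayscale heat map onto a video picture as a colored mask.

    Pixels whose heat gray value is below `threshold` are treated as background
    and left untouched; the remaining pixels are blended with a rainbow-colored
    heat layer.
    """
    colored = cv2.applyColorMap(heat_map_gray, cv2.COLORMAP_RAINBOW)   # rainbow pseudo-color
    blended = cv2.addWeighted(frame_bgr, 1.0 - alpha, colored, alpha, 0.0)
    mask = heat_map_gray >= threshold                                  # where heat is visible
    out = frame_bgr.copy()
    out[mask] = blended[mask]
    return out
```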
In one embodiment, after the heat data of the video event for each shooting time period and the video pictures shot by the camera are obtained, these data are sent to the user device that communicates with the image capturing apparatus where the camera is located, so that the heat data of each shooting time period and the video pictures of the corresponding time period are encoded on the user device to obtain and play the heat map video of the video event. The process of encoding the heat data and the video pictures to generate the heat map video is therefore not limited to the image capturing apparatus; it may also be implemented on the user device, or even on a server.
In one embodiment, as shown in fig. 3, a method for generating a heat map video is provided, which is described by taking the method as an example applied to the user equipment 106 in fig. 1, and includes the following steps:
step 302, receiving a video event to be observed and an observation time period input by a user.
Specifically, when a user needs to watch, in real time, the video currently being shot by the image pickup device, or a video shot by the image pickup device and stored on the server, the user can input a video event to be observed and an observation time period. The video event to be observed may be selected from preset video events or entered by text or voice. For example, when the preset video events include area intrusion detection, tripwire intrusion detection, and person loitering detection, the user may select any one of them as the video event to be observed.
And step 304, sending a data acquisition request to the image pickup equipment, and receiving the video data in the observation time period and the heat data of the video event to be observed in the observation time period, which are returned by the image pickup equipment.
Specifically, when the user equipment sends a data acquisition request to the camera equipment, video data acquired by the camera equipment in an observation time period and heat data of a video event to be observed detected by the camera equipment in the observation time period are acquired. If the data is stored on the server, the user device may send a data acquisition request to the server to obtain the data. The heat data of the event to be observed in the observation time period comprise the heat data of the event to be observed in each detection period in the observation time period.
And step 306, encoding the video data in the observation time period and the heat data of the video event to be observed in the observation time period, generating a heat map video of the video event to be observed in the observation time period, and playing the heat map video.
Specifically, after obtaining video data in an observation time period and heat data of a video event to be observed in the observation time period, encoding the heat data of the video event to be observed in each detection period in the observation time period and a corresponding video picture in the video data to obtain a heat map video of the video event to be observed in the observation time period and playing the heat map video. The process of obtaining the heat map video by encoding the heat data and the video picture can refer to the corresponding content of the above embodiments, and will not be described herein again.
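The client-side flow of steps 302-306 could be organized roughly as below; every name here is hypothetical, since the patent does not define a concrete data-acquisition or playback API, and the encoding step can reuse the per-period pipeline sketched earlier:

```python
def build_heat_map_video_on_client(event_type, start, end,
                                   request_video, request_heat_data, encode):
    """Fetch observation-period data from the camera (or server) and encode it.

    request_video(start, end)                 -> video pictures grouped per detection period
    request_heat_data(event_type, start, end) -> one heat-data matrix per detection period
    encode(periods)                           -> builds and plays the heat map video
    All three callables are hypothetical and supplied by the caller.
    """
    frames_per_period = request_video(start, end)
    heat_per_period = request_heat_data(event_type, start, end)
    return encode(list(zip(heat_per_period, frames_per_period)))
```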
It should be understood that although the steps in the flowcharts of fig. 1-3 are shown sequentially as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated otherwise herein, the steps are not strictly limited in order and may be performed in other orders. Moreover, at least some of the steps in fig. 1-3 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times, and these sub-steps or stages are not necessarily performed sequentially but may be performed in turn or in alternation with other steps or with at least some sub-steps or stages of other steps.
In one embodiment, as shown in fig. 4, there is provided a heat map video generation apparatus 400, including: a video picture acquisition module 402, a heat data generation module 404 and a heat map video generation module 406, wherein:
a video image obtaining module 402, configured to obtain a video image captured by a camera;
a heat data generating module 404, configured to detect the occurrence of a preset video event in the video pictures and generate heat data of the video event for each shooting time period;
and a heat map video generating module 406, configured to encode the heat data of the video event for each shooting time period with the video pictures of the corresponding time period to generate a heat map video of the video event.
In one embodiment, the heat data generation module 404 includes:
the event detection module is used for detecting a video event in a video picture in a shooting time period;
the event information acquisition module is used for acquiring the current occurrence frequency and the occurrence position of the video event when the video event is detected; and
and the frequency counting module is used for counting the number of occurrences of the video event at each picture position in the shooting time period according to the current occurrence frequency and occurrence position of the video event, to obtain the heat data of the video event in the shooting time period.
In one embodiment, the heat map video generation module 406 includes:
the heat map generation module is used for generating heat maps of the video events in all the shooting time periods according to the heat data of the video events in all the shooting time periods; and
and the video frame generation module is used for superposing the heat maps of the video events in the shooting time periods on the video pictures in the corresponding time periods to obtain the video frames of the heat map videos.
In one embodiment, the heat map generation module includes:
the image conversion module is used for converting the heat data of the video event in the current shooting time period into a heat increment image of the video event in the current shooting time period; and
and the heat map overlaying module is used for acquiring a heat map of the video event in the last shooting time period, and overlaying the heat increment image of the video event in the current shooting time period onto the heat map of the video event in the last shooting time period to obtain the heat map of the video event in the current shooting time period.
In one embodiment, the heat map video generation module 406 includes:
and the data sending module is used for sending the heat data of the video event and the video data acquired by the camera to the user equipment in communication with the image capturing apparatus where the camera is located, so that the heat data of the video event in each shooting time period and the video pictures in the corresponding time period are encoded on the user equipment to obtain and play the heat map video of the video event.
In one embodiment, as shown in fig. 5, there is provided a heat map video generation apparatus 500, including: an input data receiving module 502, a camera data acquiring module 504 and a heat map video generating module 506, wherein:
an input data receiving module 502, configured to receive a video event to be observed and an observation time period input by a user;
the camera data acquisition module 504 is configured to send a data acquisition request to the camera device, and receive video data in an observation time period and heat data of a video event to be observed in the observation time period, which are returned by the camera device; and
and the heat map video generating module 506 is configured to encode the video data in the observation time period and the heat data of the video event to be observed in the observation time period, generate a heat map video of the video event to be observed in the observation time period, and play the heat map video.
For specific limitations of the generating device of the heat map video, reference may be made to the above limitations of the generating method of the heat map video, and details are not repeated here. The modules in the device for generating the heat map video can be wholly or partially implemented by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure diagram may be as shown in fig. 6. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a method of generating a heat map video. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, can display a heat map video, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the structure shown in fig. 6 is merely a block diagram of part of the structure related to the solution of the present application and does not limit the computer device to which the solution is applied; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, an image pickup apparatus is provided, which includes a memory in which a computer program is stored and a processor that implements the steps in the above-described method embodiment in which the image pickup apparatus is an execution subject when the computer program is executed by the processor.
In one embodiment, a computer device is provided, which includes a memory and a processor, where the memory stores a computer program, and the processor implements the steps in the method embodiment described above with the user device as an execution subject when executing the computer program.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which, when executed by a processor, implements the steps in the above-described method embodiment with the image pickup apparatus as an execution subject, or the method embodiment with the user apparatus as an execution subject.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing relevant hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the above method embodiments. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments express only several embodiments of the present application, and their description is specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art may make several variations and modifications without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (9)

1. A method for generating a heat map video, the method comprising:
acquiring a video picture shot by a camera;
detecting a preset video event in the video picture, and generating heat data of the video event in each shooting time period;
encoding the heat data of the video event in each shooting time period and the video pictures in the corresponding time period to generate a heat map video of the video event;
wherein encoding the heat data of the video event in each shooting time period and the video picture in the corresponding shooting time period comprises: converting the heat data of the video event in the current shooting time period into a heat increment image of the video event in the current shooting time period, wherein the pixel gray value of each image pixel on the heat increment image is used for representing the number of times of the video event occurring at the image pixel in the current shooting time period; acquiring a heat map of the video event in the last shooting time period, and overlaying a heat increment image of the video event in the current shooting time period onto the heat map of the video event in the last shooting time period to obtain the heat map of the video event in the current shooting time period; and superposing the heat map of the video event in each shooting time period on the video picture in the corresponding shooting time period to obtain the video frame of the heat map video.
2. The method of claim 1, wherein generating heat data for the video event for each capture period comprises:
detecting the video event in a video picture within the shooting time period;
when the video event is detected, acquiring the current occurrence frequency and the occurrence position of the video event;
and counting the occurrence times of the video events at each picture position in the shooting time period according to the current occurrence times and the occurrence positions of the video events to obtain heat data of the video events in the shooting time period.
3. The method of claim 1, wherein encoding the heat data of the video event in each shooting time period with the video pictures in the corresponding time period and generating the heat map video of the video event comprises:
and sending the heat data of the video event and the video data acquired by the camera to user equipment in communication with the camera where the camera is located, so as to encode the heat data of the video event in each shooting time period and the video pictures in the corresponding time period on the user equipment, obtain the heat map video of the video event and play the heat map video.
4. A method for generating a heat map video, the method comprising:
receiving a video event to be observed and an observation time period input by a user;
sending a data acquisition request to the camera equipment, and receiving the video pictures in the observation time period and the heat data of the video event to be observed in the observation time period, which are returned by the camera equipment;
coding the video picture in the observation time period and the heat data of the video event to be observed in the observation time period, generating a heat map video of the video event to be observed in the observation time period, and playing the heat map video;
wherein encoding the video picture in the observation time period and the heat data of the video event to be observed in the observation time period comprises: converting the heat data of the video event to be observed in the current detection period into a heat increment image of the video event to be observed in the current detection period, wherein the pixel gray value of each image pixel on the heat increment image is used for representing the occurrence frequency of the video event to be observed at the image pixel in the current detection period; acquiring a heat map of the video event to be observed in a last detection period, and overlaying a heat increment image of the video event to be observed in the current detection period onto the heat map of the video event to be observed in the last detection period to obtain the heat map of the video event to be observed in the current detection period; and superposing the heat map of the video event to be observed in each detection period on the video picture in the corresponding detection period to obtain the video frame of the heat map video.
5. An apparatus for generating a heat map video, the apparatus comprising:
the video image acquisition module is used for acquiring a video image shot by the camera;
the heat data generation module is used for detecting a preset video event in the video picture and generating heat data of the video event in each shooting time period; and
the heat map video generation module is used for encoding heat data of the video event in each shooting time period and the video pictures in the corresponding time period to generate a heat map video of the video event;
wherein encoding the heat data of the video event in each shooting time period and the video picture in the corresponding shooting time period comprises: converting the heat data of the video event in the current shooting time period into a heat increment image of the video event in the current shooting time period, wherein the pixel gray value of each image pixel on the heat increment image is used for representing the number of times of the video event occurring at the image pixel in the current shooting time period; acquiring a heat map of the video event in the last shooting time period, and overlaying a heat increment image of the video event in the current shooting time period onto the heat map of the video event in the last shooting time period to obtain the heat map of the video event in the current shooting time period; and superposing the heat map of the video event in each shooting time period on the video picture in the corresponding shooting time period to obtain the video frame of the heat map video.
6. An apparatus for generating a heat map video, the apparatus comprising:
the input data receiving module is used for receiving a video event to be observed and an observation time period which are input by a user;
the camera shooting data acquisition module is used for sending a data acquisition request to camera shooting equipment and receiving the video pictures in the observation time period and the heat data of the video event to be observed in the observation time period, which are returned by the camera shooting equipment;
the heat map video generation module is used for encoding the video pictures in the observation time period and the heat data of the video event to be observed in the observation time period, generating and playing the heat map video of the video event to be observed in the observation time period;
wherein encoding the video picture in the observation time period and the heat data of the video event to be observed in the observation time period comprises: converting the heat data of the video event to be observed in the current detection period into a heat increment image of the video event to be observed in the current detection period, wherein the pixel gray value of each image pixel on the heat increment image is used for representing the occurrence frequency of the video event to be observed at the image pixel in the current detection period; acquiring a heat map of the video event to be observed in a last detection period, and overlaying a heat increment image of the video event to be observed in the current detection period onto the heat map of the video event to be observed in the last detection period to obtain the heat map of the video event to be observed in the current detection period; and superposing the heat map of the video event to be observed in each detection period on the video picture in the corresponding detection period to obtain the video frame of the heat map video.
7. An image capturing apparatus comprising a memory storing a computer program and a processor, wherein the processor implements the steps of the method for generating a heat map video according to claim 1 or 2 when executing the computer program.
8. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor when executing the computer program implements the steps of the method of generating a heat map video of claim 1, claim 2 or claim 4.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of generating a heat map video of claim 1, claim 2 or claim 4.
CN201910521249.XA 2019-06-17 2019-06-17 Method, device and equipment for generating heat map video and storage medium Active CN110392303B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910521249.XA CN110392303B (en) 2019-06-17 2019-06-17 Method, device and equipment for generating heat map video and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910521249.XA CN110392303B (en) 2019-06-17 2019-06-17 Method, device and equipment for generating heat map video and storage medium

Publications (2)

Publication Number Publication Date
CN110392303A CN110392303A (en) 2019-10-29
CN110392303B true CN110392303B (en) 2021-07-06

Family

ID=68285437

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910521249.XA Active CN110392303B (en) 2019-06-17 2019-06-17 Method, device and equipment for generating heat map video and storage medium

Country Status (1)

Country Link
CN (1) CN110392303B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111294563B (en) * 2020-02-24 2021-06-18 浙江大华技术股份有限公司 Video monitoring method and device, storage medium and electronic device
CN113591549B (en) * 2021-06-16 2024-06-18 浙江大华技术股份有限公司 Video event detection method, computer equipment and device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107123126A (en) * 2017-03-29 2017-09-01 天棣网络科技(上海)有限公司 A kind of stream of people's moving scene temperature method of estimation
CN107256225A (en) * 2017-04-28 2017-10-17 济南中维世纪科技有限公司 A kind of temperature drawing generating method and device based on video analysis

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070292024A1 (en) * 2006-06-20 2007-12-20 Baer Richard L Application specific noise reduction for motion detection methods

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107123126A (en) * 2017-03-29 2017-09-01 天棣网络科技(上海)有限公司 A kind of stream of people's moving scene temperature method of estimation
CN107256225A (en) * 2017-04-28 2017-10-17 济南中维世纪科技有限公司 A kind of temperature drawing generating method and device based on video analysis

Also Published As

Publication number Publication date
CN110392303A (en) 2019-10-29

Similar Documents

Publication Publication Date Title
US20210343027A1 (en) Object tracking method and apparatus, storage medium and electronic device
AU2022246412A1 (en) A method and apparatus for conducting surveillance
US8675065B2 (en) Video monitoring system
Xu et al. Viewport-based CNN: A multi-task approach for assessing 360° video quality
CN110392303B (en) Method, device and equipment for generating heat map video and storage medium
KR101308946B1 (en) Method for reconstructing three dimensional facial shape
CN111414873A (en) Alarm prompting method, device and alarm system based on wearing state of safety helmet
US11816877B2 (en) Method and apparatus for object detection in image, vehicle, and robot
CN112422909B (en) Video behavior analysis management system based on artificial intelligence
CN111212246B (en) Video generation method and device, computer equipment and storage medium
CN111080571A (en) Camera shielding state detection method and device, terminal and storage medium
CN109002776B (en) Face recognition method, system, computer device and computer-readable storage medium
CN110659564A (en) Method and device for tracking users in area, computer equipment and storage medium
CN112163503A (en) Method, system, storage medium and equipment for generating insensitive track of personnel in case handling area
JP7255841B2 (en) Information processing device, information processing system, control method, and program
CN110688950B (en) Face living body detection method and device based on depth information
CN114092720A (en) Target tracking method and device, computer equipment and storage medium
CN115035580A (en) Figure digital twinning construction method and system
CN111582024B (en) Video stream processing method, device, computer equipment and storage medium
US10783365B2 (en) Image processing device and image processing system
CN110889352A (en) Image blurring processing method, computer device, and computer-readable storage medium
CN113421241A (en) Abnormal event reporting method and device, computer equipment and storage medium
CN108921036B (en) Random number generation method and generation system based on face image recognition
CN113191210A (en) Image processing method, device and equipment
CN116456127B (en) Video processing system, method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant