CN109544879B - Alarm data processing method and system - Google Patents

Alarm data processing method and system

Info

Publication number
CN109544879B
Authority
CN
China
Prior art keywords
alarm
node server
video network
information
video
Prior art date
Legal status
Active
Application number
CN201811270027.7A
Other languages
Chinese (zh)
Other versions
CN109544879A
Inventor
庾少华
张洋
亓娜
王艳辉
Current Assignee
Hainan Shilian Communication Technology Co.,Ltd.
Original Assignee
Visionvera Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Visionvera Information Technology Co Ltd
Priority to CN201811270027.7A
Publication of CN109544879A
Application granted
Publication of CN109544879B

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B 25/01 Alarm systems in which the location of the alarm condition is signalled to a central station, characterised by the transmission medium
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 Data switching networks
    • H04L 12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2852 Metropolitan area networks
    • H04L 12/2854 Wide area networks, e.g. public data networks
    • H04L 12/2856 Access arrangements, e.g. Internet access
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/02 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • H04L 67/14 Session management
    • H04L 67/141 Setup of application sessions

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephonic Communication Services (AREA)

Abstract

The embodiment of the invention provides a method and a system for processing alarm data. In the method, a first video network node server receives alarm data from a mobile terminal, the alarm data comprising alarm position information and alarm image-text information; the first video network node server stores the alarm image-text information and generates an access address of the alarm image-text information in the first video network node server; the first video network node server then sends the alarm position information and the access address to a second video network node server, which obtains the alarm image-text information according to the access address and displays the alarm position information together with the alarm image-text information. Because alarm image-text information is added to the alarm data, the second video network node server can display the on-site alarm data more richly, grasp the on-site alarm situation in more detail, and improve the accuracy of alarm processing.

Description

Alarm data processing method and system
Technical Field
The invention relates to the technical field of video networking, in particular to a method and a system for processing alarm data.
Background
The video network is a special network, built on Ethernet hardware, that transmits high-definition video at high speed over a dedicated protocol; it is a higher-level form of the Internet and a real-time network.
At present, in a video networking GIS (geographic information system) sky-eye scheduling application platform, the alarm data reported by a mobile terminal contain only basic information about an alarm, such as the position where the alarm occurred. As a result, the platform cannot grasp the on-site alarm situation in detail, and the accuracy of alarm processing is low.
Disclosure of Invention
In view of the above, embodiments of the present invention are proposed to provide a method of processing alarm data and a corresponding system of processing alarm data that overcome or at least partially solve the above problems.
In order to solve the above problem, an embodiment of the present invention discloses a method for processing alarm data, where the method is applied to an internet and a video network, the internet includes a mobile terminal, the video network includes a first video network node server and a second video network node server, the first video network node server communicates with the mobile terminal and the second video network node server, respectively, and the method includes: the first video network node server receives alarm data from the mobile terminal, wherein the alarm data comprises alarm position information and alarm image-text information, and the alarm image-text information at least comprises one of the following information: alarm picture information, alarm audio and video information and alarm character information; the first video network node server stores the alarm image-text information and generates an access address of the alarm image-text information in the first video network node server; and the first video network node server sends the alarm position information and the access address to the second video network node server, and the second video network node server is used for acquiring the alarm image-text information according to the access address and displaying the alarm position information and the alarm image-text information.
Optionally, the sending, by the first video network node server, of the alarm location information and the access address to the second video network node server includes: the first video network node server sends the alarm position information and the access address to a preprocessing service object of the second video network node server, where the preprocessing service object is used to read the alarm image-text information from the access address and forward the alarm position information and the alarm image-text information to a main processing service object of the second video network node server.
Optionally, the video network further includes a video network client, and the main processing service object is configured to transmit the alarm location information and the alarm image-text information to the video network client by using a websocket protocol, so that the alarm location information and the alarm image-text information are displayed on the video network client.
Optionally, the alarm data further includes identification information of the mobile terminal; the method further comprises the following steps: and the first video network node server sends the identification information to the second video network node server, and the second video network node server is also used for initiating a video call request to the mobile terminal according to the identification information.
Optionally, the mobile terminal is configured to compress the alarm graphics and text information before sending the alarm graphics and text information to the first node server of the video network.
The embodiment of the invention also discloses a system for processing alarm data, which is applied to the internet and the video network, wherein the internet comprises a mobile terminal, the video network comprises a first video network node server and a second video network node server, the first video network node server is respectively communicated with the mobile terminal and the second video network node server, and the first video network node server comprises: the receiving module is used for receiving alarm data from the mobile terminal, wherein the alarm data comprises alarm position information and alarm image-text information, and the alarm image-text information at least comprises one of the following information: alarm picture information, alarm audio and video information and alarm character information; the generating module is used for storing and processing the alarm image-text information and generating an access address of the alarm image-text information in the first video network node server; and the sending module is used for sending the alarm position information and the access address to the second video network node server, and the second video network node server is used for acquiring the alarm image-text information according to the access address and displaying the alarm position information and the alarm image-text information.
Optionally, the sending module is configured to send the alarm location information and the access address to a preprocessing service object of the second video network node server, where the preprocessing service object is configured to read the alarm image-text information from the access address and forward the alarm location information and the alarm image-text information to a main processing service object of the second video network node server.
Optionally, the video network further includes a video network client, and the main processing service object is configured to transmit the alarm location information and the alarm image-text information to the video network client by using a websocket protocol, so that the alarm location information and the alarm image-text information are displayed on the video network client.
Optionally, the alarm data further includes identification information of the mobile terminal; the sending module is further configured to send the identification information to the second node server of the video networking, and the second node server of the video networking is further configured to initiate a video call request to the mobile terminal according to the identification information.
Optionally, the mobile terminal is configured to compress the alarm graphics and text information before sending the alarm graphics and text information to the first node server of the video network.
The embodiment of the invention has the following advantages:
the embodiment of the invention is applied to the Internet and the video network, wherein the Internet can comprise a mobile terminal, the video network can comprise a first video network node server and a second video network node server, and the first video network node server is respectively communicated with the mobile terminal and the second video network node server.
In the embodiment of the invention, the mobile terminal is used for acquiring the on-site alarm data and transmitting the acquired alarm data to the first video network node server. The alarm data can comprise alarm position information and alarm image-text information, and the alarm image-text information at least comprises one or more of alarm image information, alarm audio-video information and alarm text information. After receiving the alarm data, the first node server of the video network can store the alarm graph and text information in the alarm data at a specified position and generate an access address for the specified position, and then the first node server of the video network sends the access address and the alarm position information to the second node server of the video network. The second video network node server can acquire the alarm image-text information according to the access address and display the alarm position information and the alarm image-text information.
The embodiment of the invention applies the characteristics of the video network, and the mobile terminal adds alarm image-text information such as alarm picture information, alarm audio-video information, alarm text information and the like on the basis that the reported alarm data contains the alarm position information. The first node server of the video network receives the alarm data, stores the alarm image-text information therein, generates an access address of the alarm image-text information in the first node server of the video network, and sends the alarm position information and the access address to the second node server of the video network. And the second video network node server displays the alarm position information and the alarm image-text information acquired by using the access address. According to the embodiment of the invention, the alarm image-text information is added in the alarm data, the second video network node server can display the on-site alarm data more abundantly, control the on-site alarm condition in more detail and improve the accuracy of alarm processing.
Drawings
FIG. 1 is a schematic networking diagram of a video network of the present invention;
FIG. 2 is a schematic diagram of a hardware architecture of a node server according to the present invention;
fig. 3 is a schematic diagram of a hardware structure of an access switch of the present invention;
fig. 4 is a schematic diagram of a hardware structure of an ethernet protocol conversion gateway according to the present invention;
FIG. 5 is a flow chart of the steps of one embodiment of a method of alarm data processing of the present invention;
FIG. 6 is a schematic design diagram of a method for receiving alarm data of a handheld device based on a GIS platform of a video network according to the present invention;
fig. 7 is a block diagram of a first node server of the video network according to an embodiment of the system for processing alarm data of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
The video networking is an important milestone in network development. It is a real-time network that can realize real-time transmission of high-definition video, and it pushes many Internet applications toward high-definition video and high-definition face-to-face interaction.
The video networking adopts real-time high-definition video switching technology and can integrate dozens of required services, such as video, voice, pictures, text, communication and data, on a single network platform, for example high-definition video conferencing, video monitoring, intelligent monitoring analysis, emergency command, digital broadcast television, time-shifted television, network teaching, live broadcast, VOD on demand, television mail, Personal Video Recorder (PVR), intranet (self-office) channels, intelligent video broadcast control and information distribution, and realizes high-definition-quality video broadcast through a television or a computer.
To better understand the embodiments of the present invention, the video network is described below:
some of the technologies applied in the video networking are as follows:
network Technology (Network Technology)
Network technology innovation in video networking improves on traditional Ethernet (Ethernet) to cope with the potentially enormous video traffic on the network. Unlike pure network Packet Switching or network Circuit Switching, the video networking technology employs packet switching to satisfy the demands of streaming media (a data transmission technology that turns received data into a stable, continuous stream and transmits it continuously, so that the sound or image perceived by the user is smooth and the user can start browsing on the screen before all of the data has been transmitted). The video networking technology has the flexibility, simplicity and low cost of packet switching while offering the quality and security guarantees of circuit switching, thereby realizing seamless whole-network switched virtual circuits and a unified data format.
Switching Technology (Switching Technology)
The video network adopts two advantages of asynchronism and packet switching of the Ethernet, eliminates the defects of the Ethernet on the premise of full compatibility, has end-to-end seamless connection of the whole network, is directly communicated with a user terminal, and directly bears an IP data packet. The user data does not require any format conversion across the entire network. The video networking is a higher-level form of the Ethernet, is a real-time exchange platform, can realize the real-time transmission of the whole-network large-scale high-definition video which cannot be realized by the existing Internet, and pushes a plurality of network video applications to high-definition and unification.
Server Technology (Server Technology)
The server technology on the video networking and unified video platform is different from the traditional server, the streaming media transmission of the video networking and unified video platform is established on the basis of connection orientation, the data processing capacity of the video networking and unified video platform is independent of flow and communication time, and a single network layer can contain signaling and data transmission. For voice and video services, the complexity of video networking and unified video platform streaming media processing is much simpler than that of data processing, and the efficiency is greatly improved by more than one hundred times compared with that of a traditional server.
Storage Technology (Storage Technology)
To adapt to media content of very large capacity and very large traffic, the ultra-high-speed storage technology of the unified video platform adopts the most advanced real-time operating system. The program information in the server instruction is mapped to specific hard disk space, and the media content no longer passes through the server but is sent directly and instantly to the user terminal, with a typical user waiting time of less than 0.2 second. Optimized sector distribution greatly reduces the mechanical seek movement of the hard disk head; resource consumption is only 20% of that of an IP Internet system of the same grade, yet concurrent traffic three times larger than that of a traditional hard disk array is generated, and overall efficiency is improved by more than 10 times.
Network Security Technology (Network Security Technology)
The structural design of the video network completely eliminates the network security problem troubling the internet structurally by the modes of independent service permission control each time, complete isolation of equipment and user data and the like, generally does not need antivirus programs and firewalls, avoids the attack of hackers and viruses, and provides a structural carefree security network for users.
Service Innovation Technology (Service Innovation Technology)
The unified video platform integrates service and transmission: whether for a single user, a private network user or a network aggregate, the connection is established automatically in one step. The user terminal, set-top box or PC connects directly to the unified video platform to obtain a variety of multimedia video services. The unified video platform adopts a menu-style configuration table instead of traditional complex application programming, so complex applications can be realized with very little code, enabling unlimited new service innovation.
Networking of the video network is as follows:
the video network is a centralized control network structure, and the network can be a tree network, a star network, a ring network and the like, but on the basis of the centralized control node, the whole network is controlled by the centralized control node in the network.
As shown in fig. 1, the video network is divided into an access network and a metropolitan network.
The devices of the access network part can be mainly classified into 3 types: node server, access switch, terminal (including various set-top boxes, coding boards, memories, etc.). The node server is connected to an access switch, which may be connected to a plurality of terminals and may be connected to an ethernet network.
The node server is a node which plays a centralized control function in the access network and can control the access switch and the terminal. The node server can be directly connected with the access switch or directly connected with the terminal.
Similarly, devices of the metropolitan network portion may also be classified into 3 types: a metropolitan area server, a node switch and a node server. The metro server is connected to a node switch, which may be connected to a plurality of node servers.
The node server here is the same node server as in the access network part; that is, the node server belongs both to the access network part and to the metropolitan area network part.
The metropolitan area server is a node which plays a centralized control function in the metropolitan area network and can control a node switch and a node server. The metropolitan area server can be directly connected with the node switch or directly connected with the node server.
Therefore, the whole video network is a network structure with layered centralized control, and the network controlled by the node server and the metropolitan area server can be in various structures such as tree, star and ring.
The access network part can form a unified video platform (circled part), and a plurality of unified video platforms can form a video network; each unified video platform may be interconnected via metropolitan area and wide area video networking.
1. Video networking device classification
1.1 Devices in the video network of the embodiment of the present invention can be mainly classified into 3 types: servers, switches (including Ethernet protocol conversion gateways), and terminals (including various set-top boxes, coding boards, memories, etc.). The video network as a whole can be divided into a metropolitan area network (or national network, global network, etc.) and an access network.
1.2 The devices of the access network part can be mainly classified into 3 types: node servers, access switches (including Ethernet protocol conversion gateways), and terminals (including various set-top boxes, coding boards, memories, etc.).
The specific hardware structure of each access network device is as follows:
a node server:
as shown in fig. 2, the system mainly includes a network interface module 201, a switching engine module 202, a CPU module 203, and a disk array module 204.
The network interface module 201, the CPU module 203, and the disk array module 204 all enter the switching engine module 202; the switching engine module 202 performs an operation of looking up the address table 205 on the incoming packet, thereby obtaining the direction information of the packet; and stores the packet in a queue of the corresponding packet buffer 206 based on the packet's steering information; if the queue of the packet buffer 206 is nearly full, it is discarded; the switching engine module 202 polls all packet buffer queues for forwarding if the following conditions are met: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero. The disk array module 204 mainly implements control over the hard disk, including initialization, read-write, and other operations on the hard disk; the CPU module 203 is mainly responsible for protocol processing with an access switch and a terminal (not shown in the figure), configuring an address table 205 (including a downlink protocol packet address table, an uplink protocol packet address table, and a data packet address table), and configuring the disk array module 204.
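The lookup, enqueue and poll behaviour of the switching engine described above can be summarised in a short sketch. This is only an illustrative model of the stated forwarding conditions (port send buffer not full, queue packet counter greater than zero); the class and attribute names are assumptions, not the actual device implementation.

```python
from collections import deque

class SwitchingEngine:
    """Minimal model of the node server switching engine (module 202)."""

    def __init__(self, address_table, buffer_limit=1024):
        self.address_table = address_table        # packet DA -> output port (address table 205)
        self.queues = {}                          # output port -> packet buffer queue
        self.buffer_limit = buffer_limit
        self.port_send_buffer_full = {}           # output port -> bool

    def receive(self, packet):
        # Look up the address table to obtain the packet's direction (output port).
        port = self.address_table.get(packet["da"])
        if port is None:
            return                                # unknown destination: drop
        queue = self.queues.setdefault(port, deque())
        if len(queue) >= self.buffer_limit:       # packet buffer queue nearly full
            return                                # discard
        queue.append(packet)

    def poll(self):
        # Poll every packet buffer queue; forward only when both conditions hold:
        # 1) the port send buffer is not full, 2) the queue packet counter is greater than zero.
        forwarded = []
        for port, queue in self.queues.items():
            if not self.port_send_buffer_full.get(port, False) and len(queue) > 0:
                forwarded.append((port, queue.popleft()))
        return forwarded
```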
The access switch:
as shown in fig. 3, the network interface module (downstream network interface module 301, upstream network interface module 302), the switching engine module 303, and the CPU module 304 are mainly included.
Wherein, a packet (uplink data) coming from the downlink network interface module 301 enters the packet detection module 305; the packet detection module 305 detects whether the Destination Address (DA), Source Address (SA), packet type and packet length of the packet meet the requirements; if so, a corresponding stream identifier (stream-id) is allocated and the packet enters the switching engine module 303, otherwise the packet is discarded; a packet (downlink data) coming from the uplink network interface module 302 enters the switching engine module 303; a data packet coming from the CPU module 304 enters the switching engine module 303; the switching engine module 303 looks up the address table 306 for the incoming packet, thereby obtaining the direction information of the packet; if the packet entering the switching engine module 303 goes from the downlink network interface to the uplink network interface, the packet is stored in the queue of the corresponding packet buffer 307 in association with the stream-id; if that queue of the packet buffer 307 is nearly full, the packet is discarded; if the packet entering the switching engine module 303 does not go from the downlink network interface to the uplink network interface, the data packet is stored in the queue of the corresponding packet buffer 307 according to the direction information of the packet; if that queue of the packet buffer 307 is nearly full, the packet is discarded.
The switching engine module 303 polls all packet buffer queues, which in this embodiment of the present invention is divided into two cases:
if the queue is from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queued packet counter is greater than zero; 3) and obtaining the token generated by the code rate control module.
If the queue is not from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero.
The code rate control module 308 is configured by the CPU module 304 and generates tokens at programmable intervals for all packet buffer queues going from downlink network interfaces to uplink network interfaces, so as to control the rate of uplink forwarding.
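The three-condition forwarding rule for uplink-bound queues, together with the token generation of the rate control module, amounts to a simple token scheme. The sketch below is illustrative only; the token interval, class name and time source are assumptions.

```python
import time

class RateControlledQueue:
    """Sketch of a downlink-interface-to-uplink-interface queue whose forwarding
    additionally requires a token from the code rate control module."""

    def __init__(self, token_interval_s=0.001):
        self.packets = []
        self.tokens = 0
        self.token_interval_s = token_interval_s   # programmable interval (assumed unit)
        self._last_token = time.monotonic()

    def generate_tokens(self):
        # The rate control module adds one token per configured interval,
        # which caps the rate of uplink forwarding.
        now = time.monotonic()
        while now - self._last_token >= self.token_interval_s:
            self.tokens += 1
            self._last_token += self.token_interval_s

    def try_forward(self, port_send_buffer_full):
        # Forward only if: 1) the send buffer is not full, 2) the queue packet
        # counter is greater than zero, 3) a token from the rate control module is available.
        self.generate_tokens()
        if port_send_buffer_full or not self.packets or self.tokens == 0:
            return None
        self.tokens -= 1
        return self.packets.pop(0)
```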
The CPU module 304 is mainly responsible for protocol processing with the node server, configuration of the address table 306, and configuration of the code rate control module 308.
Ethernet protocol conversion gateway
As shown in fig. 4, the apparatus mainly includes a network interface module (a downlink network interface module 401 and an uplink network interface module 402), a switching engine module 403, a CPU module 404, a packet detection module 405, a rate control module 408, an address table 406, a packet buffer 407, a MAC adding module 409, and a MAC deleting module 410.
Wherein, a data packet coming from the downlink network interface module 401 enters the packet detection module 405; the packet detection module 405 detects whether the Ethernet MAC DA, Ethernet MAC SA, Ethernet length or frame type, video networking destination address DA, video networking source address SA, video networking packet type and packet length of the packet meet the requirements, and if so, allocates a corresponding stream identifier (stream-id); the MAC deletion module 410 then strips the MAC DA, MAC SA and length or frame type (2 bytes), and the packet enters the corresponding receive buffer; otherwise, the packet is discarded;
the downlink network interface module 401 detects the sending buffer of the port, and if there is a packet, obtains the ethernet MAC DA of the corresponding terminal according to the video networking destination address DA of the packet, adds the ethernet MAC DA of the terminal, the MAC SA of the ethernet coordination gateway, and the ethernet length or frame type, and sends the packet.
The other modules in the ethernet protocol gateway function similarly to the access switch.
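The MAC deletion on the uplink path and the MAC addition on the downlink path amount to stripping and re-attaching a 14-byte Ethernet header around the video networking packet. A minimal byte-level sketch follows; the field sizes come from the description above, while the frame type value and the terminal MAC lookup are assumptions.

```python
ETH_HEADER_LEN = 14   # MAC DA (6) + MAC SA (6) + length/frame type (2)

def strip_ethernet_header(frame: bytes) -> bytes:
    """Uplink path: the MAC deletion module removes MAC DA, MAC SA and the
    length/frame type, leaving the bare video networking packet."""
    return frame[ETH_HEADER_LEN:]

def add_ethernet_header(vn_packet: bytes, terminal_mac: bytes,
                        gateway_mac: bytes, frame_type: bytes = b"\x10\x00") -> bytes:
    """Downlink path: the terminal's Ethernet MAC DA is looked up from the packet's
    video networking DA (lookup not shown); prepend the terminal MAC DA, the
    gateway's MAC SA and the length/frame type, then send."""
    return terminal_mac + gateway_mac + frame_type + vn_packet
```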
A terminal:
the system mainly comprises a network interface module, a service processing module and a CPU module; for example, the set-top box mainly comprises a network interface module, a video and audio coding and decoding engine module and a CPU module; the coding board mainly comprises a network interface module, a video and audio coding engine module and a CPU module; the memory mainly comprises a network interface module, a CPU module and a disk array module.
1.3 The devices of the metropolitan area network part can be mainly classified into 3 types: node servers, node switches and metropolitan area servers. The node switch mainly comprises a network interface module, a switching engine module and a CPU module; the metropolitan area server mainly comprises a network interface module, a switching engine module and a CPU module.
2. Video networking packet definition
2.1 Access network packet definition
The data packet of the access network mainly comprises the following parts: destination Address (DA), Source Address (SA), reserved bytes, payload (pdu), CRC.
As shown in the following table, the data packet of the access network mainly includes the following parts:
DA | SA | Reserved | Payload | CRC
the Destination Address (DA) is composed of 8 bytes (byte), the first byte represents the type of the data packet (e.g. various protocol packets, multicast data packets, unicast data packets, etc.), there are at most 256 possibilities, the second byte to the sixth byte are metropolitan area network addresses, and the seventh byte and the eighth byte are access network addresses.
The Source Address (SA) is also composed of 8 bytes (byte), defined as the same as the Destination Address (DA).
The reserved byte consists of 2 bytes.
The payload has a different length according to the type of the datagram: it is 64 bytes if the datagram is one of the various protocol packets, and 1056 bytes if the datagram is a unicast packet, although the payload is not limited to these 2 types.
The CRC consists of 4 bytes and is calculated in accordance with the standard ethernet CRC algorithm.
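The access network packet layout just described (8-byte DA, 8-byte SA, 2 reserved bytes, payload, 4-byte CRC, with the first DA byte carrying the packet type, bytes 2 to 6 the metropolitan area network address and bytes 7 to 8 the access network address) can be captured in a small pack/unpack sketch. The CRC helper uses zlib's CRC-32, which uses the same polynomial as the standard Ethernet CRC; everything else follows the field sizes stated above, and the example addresses are placeholders.

```python
import zlib

def pack_access_packet(packet_type: int, metro_addr: bytes, access_addr: bytes,
                       payload: bytes) -> bytes:
    """Build an access network packet: DA(8) + SA(8) + Reserved(2) + Payload + CRC(4).
    Only the destination address is filled in here; the source address is zeroed for brevity."""
    assert len(metro_addr) == 5 and len(access_addr) == 2
    da = bytes([packet_type]) + metro_addr + access_addr   # 1 + 5 + 2 = 8 bytes
    sa = bytes(8)                                          # same layout as DA
    reserved = bytes(2)
    body = da + sa + reserved + payload
    crc = zlib.crc32(body).to_bytes(4, "big")              # Ethernet-style CRC-32
    return body + crc

def unpack_access_packet(packet: bytes):
    da, sa = packet[:8], packet[8:16]
    payload, crc = packet[18:-4], packet[-4:]
    return {"type": da[0], "metro_addr": da[1:6], "access_addr": da[6:8],
            "sa": sa, "payload": payload, "crc": crc}

# A protocol packet carries a 64-byte payload; a unicast packet carries 1056 bytes.
protocol_pkt = pack_access_packet(0x01, b"\x00\x00\x00\x00\x01", b"\x00\x02", bytes(64))
```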
2.2 metropolitan area network packet definition
The topology of the metropolitan area network is a graph, and there may be 2 or even more connections between two devices; that is, there may be more than 2 connections between a node switch and a node server, between two node switches, and so on. However, the metro network address of a metro network device is unique, so in order to accurately describe the connection relationship between metro network devices, the embodiment of the present invention introduces a parameter: a label, to uniquely describe a metropolitan area network device.
In this specification, the definition of the label is similar to that of a label in Multi-Protocol Label Switching (MPLS). Assuming that there are two connections between device A and device B, there are 2 labels for packets going from device A to device B, and 2 labels for packets going from device B to device A. Labels are divided into incoming labels and outgoing labels: assuming that the label of a packet entering device A (the incoming label) is 0x0000, the label of the packet when it leaves device A (the outgoing label) may become 0x0001. The network access process of the metro network is a process under centralized control, that is, both address allocation and label allocation in the metro network are dictated by the metropolitan area server, and the node switch and the node server passively execute them; this differs from MPLS, where label allocation is the result of negotiation between the switch and the server.
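The incoming-label/outgoing-label behaviour described above resembles a label-switching table that the metropolitan area server provisions on each device. The sketch below illustrates the idea with assumed table contents (0x0000 in, 0x0001 out, as in the example); the class and method names are illustrative only.

```python
class MetroDevice:
    """Sketch of centrally provisioned label switching on a metro network device."""

    def __init__(self, name):
        self.name = name
        self.label_table = {}   # incoming label -> (outgoing port, outgoing label)

    def provision(self, in_label, out_port, out_label):
        # Label allocation is dictated by the metropolitan area server; the node
        # switch / node server only installs the entries it is given.
        self.label_table[in_label] = (out_port, out_label)

    def switch(self, in_label, packet):
        out_port, out_label = self.label_table[in_label]
        return out_port, out_label, packet

device_a = MetroDevice("A")
device_a.provision(in_label=0x0000, out_port=1, out_label=0x0001)
print(device_a.switch(0x0000, b"payload"))   # -> (1, 1, b'payload')
```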
As shown in the following table, the data packet of the metro network mainly includes the following parts:
DA | SA | Reserved | Label | Payload | CRC
Namely Destination Address (DA), Source Address (SA), reserved bytes (Reserved), label, payload (PDU) and CRC. The format of the label may be defined as follows: the label is 32 bits, with the upper 16 bits reserved and only the lower 16 bits used; its position is between the reserved bytes and the payload of the packet.
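Relative to the access network packet, the metro network packet inserts a 32-bit label (upper 16 bits reserved, lower 16 bits used) between the reserved bytes and the payload. A sketch, reusing the access-network field sizes and the CRC-32 stand-in from the earlier example:

```python
import zlib

def pack_metro_packet(da: bytes, sa: bytes, label: int, payload: bytes) -> bytes:
    """DA(8) + SA(8) + Reserved(2) + Label(4) + Payload + CRC(4).
    Only the lower 16 bits of the 32-bit label are significant."""
    assert len(da) == 8 and len(sa) == 8
    label_field = (label & 0xFFFF).to_bytes(4, "big")   # upper 16 bits stay reserved (zero)
    body = da + sa + bytes(2) + label_field + payload
    return body + zlib.crc32(body).to_bytes(4, "big")

pkt = pack_metro_packet(bytes(8), bytes(8), label=0x0001, payload=bytes(64))
```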
Based on the characteristics of the video network, one of the core concepts of the embodiment of the invention is provided, and the mobile terminal transmits alarm data containing alarm position information and alarm image-text information to a first video network node server according to a protocol of the video network. The first video network node server stores the alarm image-text information, generates an access address of the alarm image-text information in the first video network node server, and sends the alarm position information and the access address to the second video network node server. And the second video network node server acquires the alarm image-text information by using the access address and displays the alarm position information and the alarm image-text information.
Referring to fig. 5, a flowchart illustrating steps of an embodiment of an alarm data processing method according to the present invention is shown, where the method may be applied to the internet and a video network, the internet may include a mobile terminal, and the video network may include a first video network node server and a second video network node server, where the first video network node server communicates with the mobile terminal and the second video network node server, respectively, and the method may specifically include the following steps:
step 501, the first video network node server receives alarm data from the mobile terminal.
In the embodiment of the present invention, the mobile terminal may be a terminal device such as a smart phone or a tablet computer, and it may run the Android operating system or the iOS operating system. The mobile terminal may be located at any position where monitoring and alarming are needed, such as a station, a square or a school. The mobile terminal may generate alarm data when a safety incident or early warning occurs at that location. The alarm data may be generated by manual triggering by the operator of the mobile terminal, for example when the operator clicks an alarm button on the mobile terminal. The alarm data may also be generated by the mobile terminal itself, for example when the mobile terminal uses its own sensor to detect that the nearby temperature is higher than a preset temperature threshold. The embodiment of the present invention does not specifically limit the way in which the alarm data are generated.
In the embodiment of the present invention, the alarm data generated by the mobile terminal may include alarm location information, that is, location information where an alarm occurs, which may be specific latitude and longitude information or rough range information. For example, the specific latitude and longitude information may be obtained by the positioning device of the mobile terminal, and the operator may perform secondary revision on the latitude and longitude information obtained by the mobile terminal. The alarm data generated by the mobile terminal can also contain alarm image-text information, and the alarm image-text information can comprise one or more of alarm image information, alarm audio-video information and alarm text information. The alarm picture information can be one or more pictures shot by the mobile terminal on an alarm site; the alarm audio and video information can be a section of audio, a section of video or a plurality of sections of audio and a plurality of sections of video recorded by the mobile terminal on the alarm site; the alarm text information can be a section of text or a plurality of sections of text which is input by an operator on the mobile terminal and used for describing and explaining the alarm field. The embodiment of the invention does not specifically limit the type of information contained in the alarm image-text information, and the embodiment of the invention also does not specifically limit the acquisition mode of the alarm image-text information.
In the embodiment of the invention, after the mobile terminal generates the alarm data, it can transmit the alarm data to the first video network node server through a wireless network. For example, the mobile terminal may transmit the alarm data to the first video network node server over a 4G network. In a preferred embodiment of the present invention, when the mobile terminal transmits the alarm image-text information in the alarm data, it may transmit the alarm image-text information to the first video network node server in the form of streaming media; after the first video network node server receives the streaming media, it converts and restores the received streaming media to obtain the alarm image-text information. In practical application, in order to improve the transmission rate, the mobile terminal may compress the alarm image-text information and transmit the compressed alarm image-text information to the first video network node server in the form of streaming media.
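As an illustration of the alarm data the mobile terminal assembles and uploads, the sketch below bundles the alarm position with the compressed alarm image-text information and posts it to the first video network node server. The field names, the endpoint URL and the use of HTTP are assumptions made for illustration; the description only requires that the data reach the server over a wireless network (for example 4G), optionally compressed and carried as streaming media.

```python
import json
import zlib
import urllib.request

def build_alarm_data(latitude, longitude, terminal_id, text,
                     picture_bytes=b"", av_bytes=b""):
    """Assemble alarm data: alarm position information plus alarm image-text
    information (picture / audio-video / text), compressed before transmission."""
    teletext = {
        "text": text,
        "picture": picture_bytes.hex(),
        "audio_video": av_bytes.hex(),
    }
    return {
        "terminal_id": terminal_id,                       # identification information
        "position": {"lat": latitude, "lng": longitude},  # alarm position information
        "teletext": zlib.compress(json.dumps(teletext).encode()).hex(),
    }

def send_alarm(alarm_data, server_url="http://first-node-server.example/alarm"):
    # Hypothetical upload endpoint; the real system may use a streaming-media channel instead.
    req = urllib.request.Request(server_url,
                                 data=json.dumps(alarm_data).encode(),
                                 headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(req)
```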
And 502, the first video network node server stores and processes the alarm image-text information in the alarm data and generates an access address of the alarm image-text information in the first video network node server.
In the embodiment of the invention, after the first video network node server receives the alarm data, the alarm data is not directly forwarded to the second video network node server, but the alarm image-text information in the alarm data is stored, and the access address of the alarm image-text information is generated. Specifically, the first node server of the video network may store the alarm graphics and text information locally in the first node server of the video network, and the first node server of the video network may also store the alarm graphics and text information in a database of the first node server of the video network.
In a preferred embodiment of the present invention, when storing the alarm image-text information, the first video network node server may store it by submission, that is, the alarm image-text information transmitted by the mobile terminal in one submission is stored in one directory, and the attribute information of that directory may include the identification information of the source mobile terminal, time information and the like. Alternatively, the information may be stored by source mobile terminal, that is, all alarm image-text information transmitted by the same mobile terminal is stored in one directory, and the attribute information of that directory may include the identification information of the source mobile terminal, time information and the like. The embodiment of the present invention does not specifically limit the storage manner in which the first video network node server stores the alarm image-text information.
In the embodiment of the invention, the first video network node server and the second video network node server can communicate with each other, and the first video network node server and the second video network node server can access each other, so that the second video network node server can access the alarm image-text information stored in the first video network node server. Specifically, the second node server of the video network can access the alarm image-text information stored in the first node server of the video network by using the access address. The access address of the alarm graphic and text information generated by the first video network node server can be a network address inside the video network, and the access address can be set to be only open to the second video network node server, and accordingly, the second video network node server has the right to access the access address generated by the first video network node server. The embodiment of the present invention does not specifically limit the technical means used by the first node server of the video network to generate the access address, the form of the access address, the authority possessed by the second node server of the video network, and the like.
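The storage and addressing behaviour of step 502 can be sketched as follows: the alarm image-text information of one submission is written into its own directory whose attributes record the source terminal and time, and an access address inside the video network is generated that only the second video network node server is allowed to read. The paths, the URL scheme and the permission check are illustrative assumptions.

```python
import json
import time
import uuid
from pathlib import Path

STORAGE_ROOT = Path("/data/alarm_teletext")      # assumed storage location
ALLOWED_READERS = {"second-node-server"}         # address open only to the second node server

def store_teletext(terminal_id: str, teletext: bytes) -> str:
    """Store one submission of alarm image-text information in its own directory
    and return the access address of that directory inside the video network."""
    submission_id = uuid.uuid4().hex
    directory = STORAGE_ROOT / submission_id
    directory.mkdir(parents=True, exist_ok=True)
    (directory / "teletext.bin").write_bytes(teletext)
    # Directory attribute information: source terminal identification and time.
    (directory / "attributes.json").write_text(json.dumps({
        "terminal_id": terminal_id,
        "stored_at": time.strftime("%Y-%m-%dT%H:%M:%S"),
    }))
    # The access address is a network address inside the video network (scheme assumed).
    return f"vnet://first-node-server/alarm_teletext/{submission_id}"

def read_teletext(access_address: str, reader: str) -> bytes:
    if reader not in ALLOWED_READERS:
        raise PermissionError("access address not open to this reader")
    submission_id = access_address.rsplit("/", 1)[-1]
    return (STORAGE_ROOT / submission_id / "teletext.bin").read_bytes()
```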
Step 503, the first node server of the video network sends the alarm location information and the access address in the alarm data to the second node server of the video network, so as to display on the second node server of the video network.
In the embodiment of the invention, when the first video network node server sends the alarm position information and the access address to the second video network node server, the alarm position information and the access address can be sent to the preprocessing service object of the second video network node server, the preprocessing service object preprocesses the alarm position information and the access address to obtain the processing result, and then the processing result is transmitted to the main processing service object of the second video network node server. The preprocessing service object can be a preprocessing application program in the second video network node server, the preprocessing service object can determine specific positions such as a point and a range where an alarm is located according to the alarm position information, the preprocessing service object can also browse the access address to read alarm image-text information stored in the first video network node server, and the obtained alarm image-text information is decompressed and the like. The embodiment of the present invention does not specifically limit the carrier for preprocessing the service object, the processing procedure of the preprocessing, and the like. The main processing service object can be a main processing application program in the second video network node server, the main processing service object can transmit the alarm position information and the alarm image-text information to a video network client in the video network by using a websocket protocol, and the main processing service object can also transmit the alarm position information and the alarm image-text information to a browser of the second video network node server. The video network client and/or the browser can display the alarm position information and the alarm image-text information, specifically, the video network client and/or the browser can display a specific position corresponding to the alarm position information, such as a certain point or a certain range, and the video network client and/or the browser can also play the alarm image information and the alarm audio-video information in the alarm image-text information and display the alarm text information in response to relevant operations of an administrator.
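The division of labour on the second video network node server (a preprocessing service object that pins down the alarm position and fetches and decompresses the alarm image-text information from the access address, and a main processing service object that pushes the result to the video networking client over websocket) could look roughly like the sketch below. It relies on the widely used third-party `websockets` package and on the hypothetical `read_teletext` helper sketched earlier; the message format, URL and port are assumptions.

```python
import asyncio
import json
import zlib

import websockets   # third-party package: pip install websockets

def preprocess(alarm_position, access_address, fetch_teletext):
    """Preprocessing service object: resolve the alarm location and read the
    alarm image-text information from the access address."""
    teletext = json.loads(zlib.decompress(fetch_teletext(access_address)))
    return {"position": alarm_position, "teletext": teletext}

async def push_to_client(result, websocket_url="ws://gis-display.example:8765/alarm"):
    """Main processing service object: forward the preprocessing result to the
    video networking client over the websocket protocol for display."""
    async with websockets.connect(websocket_url) as ws:
        await ws.send(json.dumps(result))

# Example wiring (not executed here):
# result = preprocess({"lat": 39.9, "lng": 116.4}, address, fetch_teletext=my_reader)
# asyncio.run(push_to_client(result))
```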
In a preferred embodiment of the present invention, when the mobile terminal generates the alarm data and transmits it to the first video network node server, the identification information of the mobile terminal itself may be carried in the alarm data. The identification information is used to determine a unique mobile terminal. When the alarm data received by the first video network node server contain the identification information of the mobile terminal, the first video network node server can send the identification information to the second video network node server; the second video network node server can determine the unique mobile terminal according to the identification information and then establish a video call with that mobile terminal, so that the actual situation at the alarm site can be learned in time and the speed of alarm processing is improved. When the second video network node server establishes the video call with the mobile terminal, devices such as the video call server and the video networking protocol gateway in the video network can be used.
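For completeness, a tiny hedged sketch of how the second video network node server might use the carried identification information to initiate a video call; the call-server interface is entirely hypothetical, since the description only states that a video call server and a video networking protocol gateway are involved.

```python
def initiate_video_call(alarm_data: dict, call_server) -> None:
    """Determine the unique mobile terminal from the identification information in
    the alarm data and ask the (hypothetical) video call server to set up a call."""
    terminal_id = alarm_data["terminal_id"]        # identification information
    call_server.request_call(callee=terminal_id)   # assumed interface
```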
Based on the above description of the embodiment of the alarm data processing method, a method for receiving alarm data from a handheld device based on the video networking GIS platform is introduced below. As shown in fig. 6, when an alarm situation is found, the alarm application on the handheld device raises an alarm, takes pictures or records video of the scene, describes the scene with voice or text, compresses the pictures, video and voice, and sends alarm data such as the alarm position, pictures, video, voice and text to the video networking streaming media server. When receiving the alarm data, the video networking streaming media server stores the pictures, video and voice on the video networking streaming media server, generates a corresponding access address, and sends the access address, the alarm position, the text and so on to the GIS Forward application program (an auxiliary application program in the GIS server). The GIS Forward application program preprocesses the received information and forwards the preprocessing result to the GIS main application program of the GIS server. After receiving the preprocessing result, the GIS main application program sends it to the GIS display application program through the websocket protocol, and the GIS display application program displays the corresponding information such as pictures, voice and video.
The embodiment of the invention is applied to the Internet and the video network, wherein the Internet can comprise a mobile terminal, the video network can comprise a first video network node server and a second video network node server, and the first video network node server is respectively communicated with the mobile terminal and the second video network node server.
In the embodiment of the invention, the mobile terminal is used for acquiring the on-site alarm data and transmitting the acquired alarm data to the first video network node server. The alarm data can comprise alarm position information and alarm image-text information, and the alarm image-text information at least comprises one or more of alarm image information, alarm audio-video information and alarm text information. After receiving the alarm data, the first node server of the video network can store the alarm graph and text information in the alarm data at a specified position and generate an access address for the specified position, and then the first node server of the video network sends the access address and the alarm position information to the second node server of the video network. The second video network node server can acquire the alarm image-text information according to the access address and display the alarm position information and the alarm image-text information.
The embodiment of the invention applies the characteristics of the video network, and the mobile terminal adds alarm image-text information such as alarm picture information, alarm audio-video information, alarm text information and the like on the basis that the reported alarm data contains the alarm position information. The first node server of the video network receives the alarm data, stores the alarm image-text information therein, generates an access address of the alarm image-text information in the first node server of the video network, and sends the alarm position information and the access address to the second node server of the video network. And the second video network node server displays the alarm position information and the alarm image-text information acquired by using the access address. According to the embodiment of the invention, the alarm image-text information is added in the alarm data, the second video network node server can display the on-site alarm data more abundantly, control the on-site alarm condition in more detail and improve the accuracy of alarm processing.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 7, a block diagram of the first video network node server in an embodiment of a system for processing alarm data according to the present invention is shown. The system may be applied to the Internet and the video network; the Internet may include a mobile terminal, and the video network may include a first video network node server and a second video network node server, where the first video network node server communicates with the mobile terminal and the second video network node server respectively. The first video network node server in the system may specifically include the following modules:
a receiving module 701, configured to receive alarm data from a mobile terminal, where the alarm data includes alarm location information and alarm graphics and text information, and the alarm graphics and text information at least includes one of the following: alarm picture information, alarm audio and video information and alarm character information.
And the generating module 702 is configured to store the alarm image-text information and generate an access address of the alarm image-text information in the first node server of the video network.
The sending module 703 is configured to send the alarm position information and the access address to the second video network node server, where the second video network node server is configured to obtain the alarm image-text information according to the access address and to display the alarm position information and the alarm image-text information.
In a preferred embodiment of the present invention, the sending module 703 is configured to send the alarm location information and the access address to a preprocessing service object of the second node server in the video network, where the preprocessing service object is configured to read the alarm graphics information from the access address and forward the alarm location information and the alarm graphics information to a main processing service object of the second node server in the video network.
In a preferred embodiment of the present invention, the video network further includes a video network client, and the main processing service object is configured to transmit the alarm location information and the alarm graphics and text information to the video network client by using a websocket protocol, so as to display the alarm location information and the alarm graphics and text information on the video network client.
In a preferred embodiment of the invention, the alarm data further comprises identification information of the mobile terminal; the sending module 703 is further configured to send the identification information to a second node server of the video networking, where the second node server of the video networking is further configured to initiate a video call request to the mobile terminal according to the identification information.
In a preferred embodiment of the invention, the mobile terminal is adapted to compress the alarm image-text information before sending it to the first video network node server.
For the system embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, those skilled in the art may make additional variations and modifications to these embodiments once they become aware of the basic inventive concept. Therefore, the appended claims are intended to be interpreted as covering the preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that relational terms such as first and second are used herein solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of additional identical elements in the process, method, article, or terminal that comprises the element.
The alarm data processing method and the alarm data processing system provided by the present invention have been described in detail above. Specific examples have been used herein to explain the principles and implementation of the invention, and the description of the above embodiments is intended only to help in understanding the method and its core idea. Meanwhile, a person skilled in the art may, following the idea of the present invention, make changes to the specific embodiments and the scope of application. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

1. A method for processing alarm data, wherein the method is applied to the Internet and a video network, the Internet comprises a mobile terminal, the video network comprises a first video network node server and a second video network node server, the first video network node server communicates with the mobile terminal through the Internet and communicates with the second video network node server through the video network, and the method comprises the following steps:
the first video network node server receives alarm data from the mobile terminal, wherein the alarm data comprises alarm location information and alarm image-text information, and the alarm image-text information comprises at least one of the following: alarm picture information, alarm audio and video information, and alarm text information;
the first video network node server stores the alarm image-text information and generates an access address of the alarm image-text information in the first video network node server, wherein the access address is a network address in the video network;
and the first video network node server sends the alarm location information and the access address to the second video network node server, wherein the second video network node server is used for acquiring the alarm image-text information according to the access address and displaying the alarm location information and the alarm image-text information.
2. The method for processing alarm data according to claim 1, wherein the step of the first video network node server sending the alarm location information and the access address to the second video network node server comprises:
the first video network node server sends the alarm location information and the access address to a preprocessing service object of the second video network node server, wherein the preprocessing service object is used for reading the alarm image-text information from the access address and forwarding the alarm location information and the alarm image-text information to a main processing service object of the second video network node server.
3. The method of processing alarm data of claim 2, wherein the video network further comprises a video network client,
and the main processing service object is used for transmitting the alarm location information and the alarm image-text information to the video network client by using the WebSocket protocol, so as to display the alarm location information and the alarm image-text information on the video network client.
4. The method for processing alarm data according to claim 1, wherein the alarm data further includes identification information of the mobile terminal;
the method further comprises the following steps:
and the first video network node server sends the identification information to the second video network node server, and the second video network node server is also used for initiating a video call request to the mobile terminal according to the identification information.
5. The method of processing alarm data of claim 1, wherein the mobile terminal is configured to compress the alarm image-text information before sending the alarm image-text information to the first video network node server.
6. A system for processing alarm data, wherein the system is applied to the Internet and a video network, the Internet comprises a mobile terminal, the video network comprises a first video network node server and a second video network node server, the first video network node server communicates with the mobile terminal through the Internet and communicates with the second video network node server through the video network, and the first video network node server comprises:
the receiving module is used for receiving alarm data from the mobile terminal, wherein the alarm data comprises alarm location information and alarm image-text information, and the alarm image-text information comprises at least one of the following: alarm picture information, alarm audio and video information, and alarm text information;
the generating module is used for storing the alarm image-text information and generating an access address of the alarm image-text information in the first video network node server, wherein the access address is a network address in the video network;
and the sending module is used for sending the alarm location information and the access address to the second video network node server, wherein the second video network node server is used for acquiring the alarm image-text information according to the access address and displaying the alarm location information and the alarm image-text information.
7. The system for processing alarm data according to claim 6, wherein the sending module is configured to send the alarm location information and the access address to a preprocessing service object of the second video network node server, and the preprocessing service object is configured to read the alarm image-text information from the access address and forward the alarm location information and the alarm image-text information to a main processing service object of the second video network node server.
8. The alarm data processing system of claim 7, wherein the video network further comprises a video network client,
and the main processing service object is used for transmitting the alarm location information and the alarm image-text information to the video network client by using the WebSocket protocol, so as to display the alarm location information and the alarm image-text information on the video network client.
9. The system for processing alarm data according to claim 6, wherein said alarm data further comprises identification information of said mobile terminal;
the sending module is further configured to send the identification information to the second video network node server, and the second video network node server is further configured to initiate a video call request to the mobile terminal according to the identification information.
10. The alarm data processing system of claim 6, wherein the mobile terminal is configured to compress the alarm image-text information before sending the alarm image-text information to the first video network node server.
CN201811270027.7A 2018-10-29 2018-10-29 Alarm data processing method and system Active CN109544879B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811270027.7A CN109544879B (en) 2018-10-29 2018-10-29 Alarm data processing method and system

Publications (2)

Publication Number Publication Date
CN109544879A (en) 2019-03-29
CN109544879B (en) 2020-11-13

Family

ID=65845910

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811270027.7A Active CN109544879B (en) 2018-10-29 2018-10-29 Alarm data processing method and system

Country Status (1)

Country Link
CN (1) CN109544879B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112447033B (en) * 2019-09-05 2024-03-19 比亚迪股份有限公司 Security data processing method, system, computer equipment and storage medium
CN110795008B (en) * 2019-09-29 2021-11-16 视联动力信息技术股份有限公司 Picture transmission method and device and computer readable storage medium
CN111836201B (en) * 2020-06-12 2024-05-10 视联动力信息技术股份有限公司 Alarm method, system and device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101448144A (en) * 2008-12-23 2009-06-03 北京中星微电子有限公司 Method for realizing alarm in video monitoring system and video monitor alarm system
CN201839376U (en) * 2010-07-22 2011-05-18 曹峰华 Monitoring system and monitoring video server
CN202514010U (en) * 2012-02-09 2012-10-31 成都三零凯天通信实业有限公司 System with support of linkage of alarm, real-time video, historical video, and short message
CN103024053A (en) * 2012-12-18 2013-04-03 华为技术有限公司 Cloud storage method, resource scheduling system and cloud storage node and system
CN103188284A (en) * 2011-12-27 2013-07-03 华为终端有限公司 Method and equipment for sharing media resource information between home networks
CN107993416A (en) * 2016-10-26 2018-05-04 北京视联动力国际信息技术有限公司 A kind for the treatment of method and apparatus of alert event
CN108063743A (en) * 2016-11-07 2018-05-22 北京视联动力国际信息技术有限公司 The method and apparatus that a kind of web camera communicates with depending on networked terminals

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105868288A (en) * 2016-03-23 2016-08-17 乐视控股(北京)有限公司 Image-text information management method, apparatus and system
CN108924451A (en) * 2017-05-09 2018-11-30 高德信息技术有限公司 A kind of image acquiring method and device


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201230

Address after: 571924 building C07, Zone C, Hainan Ecological Software Park, hi tech Industrial Demonstration Zone, old town, Haikou City, Hainan Province

Patentee after: Hainan Shilian Communication Technology Co.,Ltd.

Address before: 100000 Beijing Dongcheng District Qinglong Hutong 1 Song Hua Building A1103-1113

Patentee before: VISIONVERA INFORMATION TECHNOLOGY Co.,Ltd.