CN108574689B - Method and device for video call - Google Patents

Method and device for video call

Info

Publication number
CN108574689B
CN108574689B (application number CN201711159099.XA)
Authority
CN
China
Prior art keywords
server
video
terminal
data
terminal equipment
Prior art date
Legal status
Active
Application number
CN201711159099.XA
Other languages
Chinese (zh)
Other versions
CN108574689A (en)
Inventor
刘坤
王睿智
亓娜
王艳辉
Current Assignee
Visionvera Information Technology Co Ltd
Original Assignee
Visionvera Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Visionvera Information Technology Co Ltd filed Critical Visionvera Information Technology Co Ltd
Priority to CN201711159099.XA priority Critical patent/CN108574689B/en
Publication of CN108574689A publication Critical patent/CN108574689A/en
Application granted granted Critical
Publication of CN108574689B publication Critical patent/CN108574689B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40 Support for services or applications
    • H04L65/1066 Session management
    • H04L65/1083 In-session procedures
    • H04L65/1096 Supplementary features, e.g. call forwarding or call holding
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/147 Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals

Abstract

Embodiments of the invention provide a method and an apparatus for video calls, applied to a video network that comprises a terminal device and a server. The method includes the following steps: displaying at least one terminal device on map data; determining a target terminal device on the map data according to a selection request from a user; and invoking the server to conduct a video call with the target terminal device. By applying the embodiments of the invention, a video call can be quickly established with the target terminal device, the user's operation steps are simple, and the user experience is improved.

Description

Method and device for video call
Technical Field
The present invention relates to the field of video networking technologies, and in particular, to a method and an apparatus for video call.
Background
With the rapid development of network technology, terminal devices have spread across the whole country and are widely used in users' daily life, work, and study, for example in conferencing, teaching, and monitoring.
In practical application, two-party communication in the video network generally requires communication between terminal devices. However, the current video network does not allow a user to quickly locate a terminal device and quickly establish a communication connection, so for the user the terminal device cannot be located quickly, the connection cannot be established quickly, and the user experience is poor.
Disclosure of Invention
In view of the above problems, embodiments of the present invention are proposed to provide a method for video calls and a corresponding apparatus for video calls that overcome, or at least partially solve, the above problems.
In order to solve the above problem, an embodiment of the present invention discloses a method for a video call, where the method is applied to a video network, the video network includes a terminal device and a server, and the method includes:
displaying at least one terminal device on map data;
determining target terminal equipment on the map data according to a selection request of a user;
and calling the server to carry out video call with the target terminal equipment.
Optionally, the displaying at least one terminal device on the map data includes:
sending a terminal data acquisition request to a server; the server is used for collecting terminal data of the terminal equipment based on the acquisition request;
receiving the terminal data returned by the server according to the downlink communication link configured for the terminal equipment;
and calling map data, and displaying the terminal equipment on the map data based on the terminal data.
Optionally, the invoking the server to perform a video call with the target terminal device includes:
sending a videophone request to the server; the server is used for forwarding the video telephone request to a target terminal device and judging a call state according to a response message fed back by the target terminal device;
and when the call state is agreement to answer, conducting the video call with the target terminal device through the server.
Optionally, the invoking the server to perform a video call with the target terminal device further includes:
and when the call state is refusal to answer, receiving a refusal-to-answer message fed back by the server.
Optionally, the invoking the server to perform a video call with the target terminal device further includes:
and if the server does not receive the response message fed back by the target terminal device within a preset time, receiving an answer-failure message fed back by the server.
The embodiment of the invention also discloses a device for video calls. The device is applied to a video network, the video network includes a terminal device and a server, and the device includes:
the terminal equipment display module is used for displaying at least one terminal equipment on the map data;
the target terminal equipment determining module is used for determining target terminal equipment on the map data according to a selection request of a user;
and the video call module is used for invoking the server to conduct a video call with the target terminal device.
Optionally, the terminal device display module includes:
the acquisition request sending submodule is used for sending an acquisition request of terminal data to the server; the server is used for collecting terminal data of the terminal equipment based on the acquisition request;
the terminal data receiving submodule is used for receiving the terminal data returned by the server according to a downlink communication link configured for the terminal equipment;
and the terminal equipment display submodule is used for calling map data and displaying the terminal equipment on the map data based on the terminal data.
Optionally, the video call module includes:
a videophone request sending submodule for sending a videophone request to the server; the server is used for forwarding the video telephone request to a target terminal device and judging a call state according to a response message fed back by the target terminal device;
and the video call sub-module is used for conducting a video call with the target terminal device through the server when the call state is agreement to answer.
The embodiment of the invention also discloses an electronic device, which includes a memory and one or more programs, where the one or more programs are stored in the memory and are configured to be executed by one or more processors to perform the method for video calls described in one or more of the above embodiments.
The embodiment of the invention also discloses a readable storage medium. When the instructions in the storage medium are executed by a processor of an electronic device, the electronic device is enabled to perform the method for video calls described in one or more of the above embodiments.
The embodiment of the invention has the following advantages:
in the embodiment of the invention, terminal devices in the video network are first displayed on map data, so that a user can look up the terminal devices on the map data. Further, the user can select a target terminal device from the terminal devices on the map data and invoke the server to try to establish a video call with the target terminal device.
Drawings
FIG. 1 is a schematic networking diagram of a video network of the present invention;
FIG. 2 is a schematic diagram of a hardware structure of a node server according to the present invention;
FIG. 3 is a schematic diagram of a hardware structure of an access switch of the present invention;
FIG. 4 is a schematic diagram of a hardware structure of an Ethernet protocol conversion gateway according to the present invention;
FIG. 5 is a flowchart illustrating the steps of a method for a video call according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of communication between a server and terminal devices according to the present invention;
FIG. 7 is a flowchart of call state determination for a target terminal device according to the present invention;
FIG. 8 is a block diagram of an embodiment of a video call apparatus of the present invention;
FIG. 9 is a block diagram illustrating the structure of an electronic device for a video call according to an example embodiment.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Video networking is an important milestone in network development. It is a real-time network that can realize real-time transmission of high-definition video, pushing many Internet applications toward high-definition, face-to-face video.
Video networking adopts real-time high-definition video switching technology and can integrate dozens of required services, such as video, voice, pictures, text, communication, and data, on one system platform over the network, for example high-definition video conferencing, video monitoring, intelligent monitoring analysis, emergency command, digital broadcast television, delayed television, network teaching, live broadcast, VOD on demand, television mail, Personal Video Recorder (PVR), intranet (self-run) channels, intelligent video broadcast control, and information distribution, and realizes high-definition-quality video playback through a television or a computer.
To better understand the embodiments of the present invention, the video network is described below:
some of the technologies applied in the video networking are as follows:
network Technology (Network Technology)
Network technology innovation in video networking improves on traditional Ethernet to cope with the potentially enormous video traffic on the network. Unlike pure network packet switching or network circuit switching, video networking technology adopts packet switching to meet streaming requirements. Video networking technology has the flexibility, simplicity, and low price of packet switching, while also offering the quality and security guarantees of circuit switching, thereby realizing the seamless connection of whole-network switched virtual circuits and data formats.
Switching Technology (Switching Technology)
The video network combines the two advantages of Ethernet, asynchronism and packet switching, and eliminates Ethernet's defects on the premise of full compatibility. It provides end-to-end seamless connection across the whole network, communicates directly with user terminals, and directly carries IP data packets. User data requires no format conversion anywhere across the network. Video networking is a more advanced form of Ethernet; it is a real-time switching platform that can realize whole-network, large-scale, real-time transmission of high-definition video, which the existing Internet cannot, and pushes many network video applications toward high definition and unification.
Server Technology (Server Technology)
Server technology on the video networking and unified video platform differs from traditional server technology: its streaming media transmission is built on a connection-oriented basis, its data processing capability is independent of traffic and communication time, and a single network layer can carry both signaling and data transmission. For voice and video services, streaming media processing on the video networking and unified video platform is much simpler than data processing, and its efficiency is improved by more than a hundredfold compared with a traditional server.
Storage Technology (Storage Technology)
To handle media content of very large capacity and very large traffic, the ultra-high-speed storage technology of the unified video platform adopts the most advanced real-time operating system. Program information in a server instruction is mapped to a specific hard disk space, and the media content no longer passes through the server but is sent directly and instantly to the user terminal, with a typical user waiting time of less than 0.2 seconds. Optimized sector distribution greatly reduces the mechanical seek motion of the hard disk head; resource consumption is only 20% of that of an IP Internet system of the same grade, while generating concurrent traffic 3 times larger than a traditional hard disk array, for an overall efficiency improvement of more than 10 times.
Network Security Technology (Network Security Technology)
The structural design of the video network completely eliminates, at the structural level, the network security problems that trouble the Internet, through measures such as independent permission control for each service and complete isolation of devices and user data. It generally needs no antivirus programs or firewalls, avoids attacks by hackers and viruses, and provides a structurally worry-free, secure network for users.
Service Innovation Technology (Service Innovation Technology)
The unified video platform integrates services and transmission: whether for a single user, a private-network user, or a network aggregate, each connection is simply an automatic connection. The user terminal, set-top box, or PC connects directly to the unified video platform to obtain a rich variety of multimedia video services. The unified video platform adopts a menu-style configuration table instead of traditional, complex application programming, so complex applications can be realized with very little code, enabling virtually unlimited new service innovation.
Networking of the video network is as follows:
the video network is a centralized control network structure, and the network can be a tree network, a star network, a ring network and the like, but on the basis of the centralized control node, the whole network is controlled by the centralized control node in the network.
As shown in fig. 1, the video network is divided into an access network and a metropolitan network.
The devices of the access network part can be mainly classified into 3 types: node server, access switch, terminal (including various set-top boxes, coding boards, memories, etc.). The node server is connected to an access switch, which may be connected to a plurality of terminals and may be connected to an ethernet network.
The node server is a node which plays a centralized control function in the access network and can control the access switch and the terminal. The node server can be directly connected with the access switch or directly connected with the terminal.
Similarly, devices of the metropolitan network portion may also be classified into 3 types: a metropolitan area server, a node switch and a node server. The metro server is connected to a node switch, which may be connected to a plurality of node servers.
The node server is a node server of the access network part, namely the node server belongs to both the access network part and the metropolitan area network part.
The metropolitan area server is a node which plays a centralized control function in the metropolitan area network and can control a node switch and a node server. The metropolitan area server can be directly connected with the node switch or directly connected with the node server.
Therefore, the whole video network is a network structure with layered centralized control, and the network controlled by the node server and the metropolitan area server can be in various structures such as tree, star and ring.
The access network part can form a unified video platform (the part in the dotted circle), and a plurality of unified video platforms can form a video network; each unified video platform may be interconnected via metropolitan area and wide area video networking.
1. Video networking device classification
1.1 devices in the video network of the embodiment of the present invention can be mainly classified into 3 types: servers, switches (including ethernet gateways), terminals (including various set-top boxes, code boards, memories, etc.). The video network as a whole can be divided into a metropolitan area network (or national network, global network, etc.) and an access network.
1.2 wherein the devices of the access network part can be mainly classified into 3 types: node servers, access switches (including ethernet gateways), terminals (including various set-top boxes, code boards, memories, etc.).
The specific hardware structure of each access network device is as follows:
a node server:
as shown in fig. 2, the system mainly includes a network interface module 201, a switching engine module 202, a CPU module 203, and a disk array module 204;
the network interface module 201, the CPU module 203, and the disk array module 204 all enter the switching engine module 202; the switching engine module 202 performs an operation of looking up the address table 205 on the incoming packet, thereby obtaining the direction information of the packet; and stores the packet in a queue of the corresponding packet buffer 206 based on the packet's steering information; if the queue of the packet buffer 206 is nearly full, it is discarded; the switching engine module 202 polls all packet buffer queues for forwarding if the following conditions are met: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero. The disk array module 204 mainly implements control over the hard disk, including initialization, read-write, and other operations on the hard disk; the CPU module 203 is mainly responsible for protocol processing with an access switch and a terminal (not shown in the figure), configuring an address table 205 (including a downlink protocol packet address table, an uplink protocol packet address table, and a data packet address table), and configuring the disk array module 204.
The access switch:
as shown in fig. 3, the network interface module mainly includes a network interface module (a downlink network interface module 301 and an uplink network interface module 302), a switching engine module 303 and a CPU module 304;
A packet (uplink data) coming from the downlink network interface module 301 enters the packet detection module 305. The packet detection module 305 checks whether the Destination Address (DA), Source Address (SA), packet type, and packet length of the packet meet the requirements; if so, it allocates a corresponding stream identifier (stream-id) and the packet enters the switching engine module 303, otherwise the packet is discarded. A packet (downlink data) coming from the uplink network interface module 302 enters the switching engine module 303, as does a data packet coming from the CPU module 304. The switching engine module 303 looks up the address table 306 for each incoming packet to obtain the packet's direction information. If a packet entering the switching engine module 303 is going from the downlink network interface to the uplink network interface, it is stored in the queue of the corresponding packet buffer 307 in association with its stream-id; if that queue is nearly full, the packet is discarded. If a packet entering the switching engine module 303 is not going from the downlink network interface to the uplink network interface, it is stored in the queue of the corresponding packet buffer 307 according to its direction information; if that queue is nearly full, the packet is discarded.
The switching engine module 303 polls all packet buffer queues, which in this embodiment of the present invention is divided into two cases:
if the queue is from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queued packet counter is greater than zero; 3) obtaining a token generated by a code rate control module;
if the queue is not from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero.
The rate control module 308 is configured by the CPU module 304 and generates tokens at programmable intervals for all packet buffer queues going from downlink network interfaces to uplink network interfaces, in order to control the rate of uplink forwarding.
The CPU module 304 is mainly responsible for protocol processing with the node server, configuration of the address table 306, and configuration of the code rate control module 308.
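The token-based uplink rate control described above can be sketched as follows. This is a minimal illustration in TypeScript; the class and function names, and the token-bucket shape, are assumptions rather than the patent's implementation.

```typescript
// Sketch of the access switch's rate control for downlink-to-uplink queues: tokens are
// generated at a programmable interval, and such a queue forwards only when it also
// holds a token (condition 3 above).
class RateControl {
  private tokens = 0;
  constructor(intervalMs: number, private maxTokens: number) {
    setInterval(() => {                       // programmable interval set by the CPU module
      if (this.tokens < this.maxTokens) this.tokens++;
    }, intervalMs);
  }
  tryTake(): boolean {                        // consume one token if available
    if (this.tokens > 0) { this.tokens--; return true; }
    return false;
  }
}

function mayForwardUpstream(sendBufferFull: boolean, queuedPackets: number, rate: RateControl): boolean {
  return !sendBufferFull          // 1) port send buffer is not full
      && queuedPackets > 0        // 2) queue packet counter is greater than zero
      && rate.tryTake();          // 3) a token from the rate control module is available
}
```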
Ethernet protocol conversion gateway
As shown in fig. 4, the apparatus mainly includes a network interface module (a downlink network interface module 401 and an uplink network interface module 402), a switching engine module 403, a CPU module 404, a packet detection module 405, a rate control module 408, an address table 406, a packet buffer 407, a MAC adding module 409, and a MAC deleting module 410.
A data packet coming from the downlink network interface module 401 enters the packet detection module 405. The packet detection module 405 checks whether the Ethernet MAC DA, Ethernet MAC SA, Ethernet length or frame type, video network destination address DA, video network source address SA, video network packet type, and packet length of the packet meet the requirements; if so, it allocates a corresponding stream identifier (stream-id), then the MAC deletion module 410 strips the MAC DA, MAC SA, and length or frame type (2 bytes), and the packet enters the corresponding receive buffer; otherwise, the packet is discarded.
the downlink network interface module 401 detects the sending buffer of the port, and if there is a packet, obtains the ethernet MAC DA of the corresponding terminal according to the destination address DA of the packet, adds the ethernet MAC DA of the terminal, the MACSA of the ethernet coordination gateway, and the ethernet length or frame type, and sends the packet.
The other modules in the Ethernet protocol conversion gateway function similarly to those of the access switch.
A terminal:
the system mainly comprises a network interface module, a service processing module and a CPU module; for example, the set-top box mainly comprises a network interface module, a video and audio coding and decoding engine module and a CPU module; the coding board mainly comprises a network interface module, a video and audio coding engine module and a CPU module; the memory mainly comprises a network interface module, a CPU module and a disk array module.
1.3 The devices of the metropolitan area network part can be mainly classified into 3 types: node server, node switch, and metropolitan area server. The node switch mainly includes a network interface module, a switching engine module, and a CPU module; the metropolitan area server mainly includes a network interface module, a switching engine module, and a CPU module.
2. Video networking packet definition
2.1 Access network packet definition
As shown in the following table, the data packet of the access network mainly comprises the following parts: Destination Address (DA), Source Address (SA), reserved bytes, payload (PDU), and CRC:
DA | SA | Reserved | Payload | CRC
wherein:
the Destination Address (DA) is composed of 8 bytes (byte), the first byte represents the type of the data packet (such as various protocol packets, multicast data packets, unicast data packets, etc.), there are 256 possibilities at most, the second byte to the sixth byte are metropolitan area network addresses, and the seventh byte and the eighth byte are access network addresses;
the Source Address (SA) is also composed of 8 bytes (byte), defined as the same as the Destination Address (DA);
the reserved byte consists of 2 bytes;
the payload part has a different length depending on the type of datagram: it is 64 bytes for the various protocol packets and 32+1024 = 1056 bytes for a unicast data packet; of course, the length is not limited to these 2 types;
the CRC consists of 4 bytes and is calculated in accordance with the standard ethernet CRC algorithm.
2.2 metropolitan area network packet definition
The topology of a metropolitan area network is a graph and there may be 2, or even more than 2, connections between two devices, i.e., there may be more than 2 connections between a node switch and a node server, a node switch and a node switch, and a node switch and a node server. However, the metro network address of the metro network device is unique, and in order to accurately describe the connection relationship between the metro network devices, parameters are introduced in the embodiment of the present invention: a label to uniquely describe a metropolitan area network device.
In this specification, the definition of the Label is similar to that of the Label of MPLS (Multi-Protocol Label Switch), and assuming that there are two connections between the device a and the device B, there are 2 labels for the packet from the device a to the device B, and 2 labels for the packet from the device B to the device a. The label is classified into an incoming label and an outgoing label, and assuming that the label (incoming label) of the packet entering the device a is 0x0000, the label (outgoing label) of the packet leaving the device a may become 0x 0001. The network access process of the metro network is a network access process under centralized control, that is, address allocation and label allocation of the metro network are both dominated by the metro server, and the node switch and the node server are both passively executed, which is different from label allocation of MPLS, and label allocation of MPLS is a result of mutual negotiation between the switch and the server.
As shown in the following table, the data packet of the metro network mainly includes the following parts:
DA | SA | Reserved | Label | Payload | CRC
That is, Destination Address (DA), Source Address (SA), reserved bytes (Reserved), label, payload (PDU), and CRC. The format of the label may be defined by reference to the following: the label is 32 bits, with the upper 16 bits reserved and only the lower 16 bits used; it sits between the reserved bytes and the payload of the data packet.
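Continuing the illustration, a metropolitan-area packet can be decoded the same way, with the 32-bit label read between the reserved bytes and the payload; the names below are again assumptions for illustration only.

```typescript
// Sketch of the metropolitan-area packet layout: a 32-bit label is inserted between the
// reserved bytes and the payload; only the lower 16 bits of the label are used.
function parseMetroPacket(frame: Uint8Array) {
  const da = frame.slice(0, 8);                            // 8-byte Destination Address
  const sa = frame.slice(8, 16);                           // 8-byte Source Address
  const reserved = frame.slice(16, 18);                    // 2 reserved bytes
  const label =
    new DataView(frame.buffer, frame.byteOffset + 18, 4).getUint32(0) & 0xffff; // lower 16 bits
  const payload = frame.slice(22, frame.length - 4);
  const crc = frame.slice(frame.length - 4);               // 4-byte CRC
  return { da, sa, reserved, label, payload, crc };
}
```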
Based on the characteristics of the video network, one of the core concepts of the embodiment of the invention is proposed: the locations of all terminal devices are shown on a map in a web client, and the user can dial a videophone call by clicking the link of a terminal selected on the map.
Referring to fig. 5, a flowchart illustrating steps of an embodiment of a method for video call according to the present invention is shown, where the method is applied to a video network, where the video network includes a terminal device and a server, and the method specifically includes the following steps:
step 501, at least one terminal device is displayed on map data.
In a specific implementation, the terminal device of the embodiment of the present invention may also be referred to as a video network terminal, and includes a mobile terminal, a monitoring terminal, a video communication terminal, and so on. The terminal device is provided with a web client, and the terminal data fed back by the server can be shown on the map data loaded into the web client.
Specifically, the terminal data may include the terminal type, data statistics, terminal number, terminal longitude and latitude, terminal name, the telephone number of the person in charge, and so on, which is not limited in the embodiment of the present invention. The terminal data can be collected by a server of the video network or added manually by the user.
In a preferred embodiment of the present invention, the step 501 may include the following sub-steps:
substep S11, sending a request for acquiring terminal data to the server; the server is used for collecting terminal data of the terminal equipment based on the acquisition request;
substep S12, receiving the terminal data returned by the server according to the downlink communication link configured for the terminal device;
and a substep S13 of calling map data and presenting the terminal device on the map data based on the terminal data.
When terminal devices are to be displayed on the map, terminal data related to the terminal devices can be requested from the server. After receiving the request, the server collects the terminal data and returns it to the terminal device; the web client on the terminal device then sorts the terminal data and fills it into the map data, so that each terminal device in the video network can be displayed on the map data.
In the embodiment of the present invention, after the terminal data of the terminal device is acquired from the server, the existing GIS (Geographic Information System) map data may be called, and the terminal data is rendered on the map data for display.
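A minimal web-client sketch of this step is given below; the HTTP endpoint, the field names of the terminal data, and the GisMap interface are assumptions for illustration, not interfaces defined by the patent.

```typescript
// Sketch of step 501: request terminal data from the server and render each terminal
// on GIS map data, wiring a click handler that can later dial the terminal.
interface TerminalData {
  terminalNumber: string;
  terminalName: string;
  terminalType: string;
  longitude: number;
  latitude: number;
  contactPhone?: string;     // telephone number of the person in charge (optional)
}

interface GisMap {
  addMarker(lng: number, lat: number, label: string, onClick: () => void): void;
}

async function showTerminalsOnMap(map: GisMap, dialVideoCall: (t: TerminalData) => void): Promise<void> {
  // send the terminal-data acquisition request; the server collects and returns terminal data
  const response = await fetch('/api/terminals');               // hypothetical endpoint
  const terminals: TerminalData[] = await response.json();

  // sort the terminal data and fill it into the map data
  for (const t of terminals) {
    map.addMarker(t.longitude, t.latitude, t.terminalName, () => dialVideoCall(t));
  }
}
```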
In practical application, the video network is a network with a centralized control function and includes a master control server and lower-level network devices, where the lower-level network devices include terminals. One of the core concepts of the video network is that the master control server notifies the switching devices to configure a table for the downlink communication link of the current service, and data packets are then transmitted based on the configured tables.
Namely, the communication method in the video network includes:
the main control server configures a downlink communication link of the current service;
and transmitting the data packet of the current service sent by the source terminal to a target terminal (such as a video network terminal) according to the downlink communication link.
In the embodiment of the present invention, configuring the downlink communication link of the current service includes: informing the switching equipment related to the downlink communication link of the current service to allocate a table;
further, transmitting according to the downlink communication link includes: the configured table is consulted, and the switching equipment transmits the received data packet through the corresponding port.
In particular implementations, the services include unicast communication services and multicast communication services. That is, whether for multicast or unicast communication, the core concept of configuring tables and looking up tables can be adopted to realize communication in the video network.
As mentioned above, the video network includes an access network portion, in which the master server is a node server and the lower-level network devices include an access switch and a terminal.
For the unicast communication service in the access network, the step of configuring the downlink communication link of the current service by the master server may include the following steps:
the main control server obtains the downlink communication link information of the current service according to a service request protocol packet initiated by a source terminal, where the downlink communication link information includes the downlink communication port information of the main control server and of the access switch participating in the current service;
the main control server sets a downlink port to which a data packet of the current service is directed in a data packet address table in the main control server according to the downlink communication port information of the main control server; sending a port configuration command to a corresponding access switch according to the downlink communication port information of the access switch;
the access switch sets the downstream port to which the data packet of the current service is directed in the data packet address table in the access switch according to the port configuration command.
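The following TypeScript sketch illustrates this table-configuration flow for a unicast service; the interfaces and method names are assumptions introduced for illustration only.

```typescript
// Sketch of unicast link configuration: the node server sets the downlink port for the
// service in its own packet address table and sends port configuration commands to the
// access switches on the link.
interface PacketAddressTable {
  setDownlinkPort(serviceId: string, port: number): void;
}

interface AccessSwitchClient {
  sendPortConfigCommand(serviceId: string, port: number): Promise<void>;
}

interface DownlinkLinkInfo {
  serverPort: number;                                       // downlink port on the node server
  switchPorts: { sw: AccessSwitchClient; port: number }[];  // downlink ports on access switches
}

async function configureUnicastLink(
  serviceId: string,
  link: DownlinkLinkInfo,
  serverTable: PacketAddressTable,
): Promise<void> {
  // 1) set the downlink port for the service's packets in the node server's own table
  serverTable.setDownlinkPort(serviceId, link.serverPort);
  // 2) notify each access switch on the link to configure its packet address table
  for (const { sw, port } of link.switchPorts) {
    await sw.sendPortConfigCommand(serviceId, port);
  }
}
```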
For multicast communication services (such as video conference and video monitoring) in the access network, the step of the master server acquiring downlink communication link information of the current service may include the following sub-steps:
the method comprises the steps that a main control server obtains a service request protocol packet which is initiated by a target terminal and applies for multicast communication service, wherein the service request protocol packet comprises service type information, service content information and an access network address of the target terminal; wherein, the service content information comprises a service number;
the main control server extracts the access network address of the source terminal in a preset content-address mapping table according to the service number;
the main control server acquires a multicast address corresponding to a source terminal and distributes the multicast address to a target terminal; and acquiring the communication link information of the current multicast service according to the service type information and the access network addresses of the source terminal and the target terminal.
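A sketch of the multicast case is shown below, assuming a simple in-memory content-address mapping table and a caller-supplied multicast address allocator; all names are illustrative.

```typescript
// Sketch of the multicast case: resolve the source terminal from a content-address
// mapping table by service number, allocate a multicast address to the target terminal,
// and derive the link info for the current multicast service.
interface MulticastRequest {
  serviceType: string;
  serviceNumber: string;          // part of the service content information
  targetAccessAddress: string;    // access network address of the requesting (target) terminal
}

function handleMulticastRequest(
  req: MulticastRequest,
  contentAddressTable: Map<string, string>,          // service number -> source terminal access address
  allocateMulticastAddress: (sourceAddr: string) => string,
) {
  const sourceAccessAddress = contentAddressTable.get(req.serviceNumber);
  if (!sourceAccessAddress) throw new Error('unknown service number');
  const multicastAddress = allocateMulticastAddress(sourceAccessAddress);
  // link info derived from the service type and the source / target access network addresses
  return {
    multicastAddress,
    sourceAccessAddress,
    targetAccessAddress: req.targetAccessAddress,
    serviceType: req.serviceType,
  };
}
```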
Step 502, according to the selection request of the user, determining the target terminal device on the map data.
For a video network user, the terminal device to be called can be selected on the map, and the terminal device selected by the user is the target terminal device.
Step 503, invoking the server to perform a video call with the target terminal device.
After the target terminal device is determined on the map data, the video network user platform (web client) sends a videophone request to the dedicated service at the bottom layer of the video network (the video network server). After receiving the request, the dedicated service calls the target terminal device and returns the result to the video network user platform, and a video call between the web client and the target terminal device is established.
In a preferred embodiment of the present invention, the step 503 may include the following sub-steps:
substep S21, sending a videophone request to the server; the server is used for forwarding the video telephone request to a target terminal device and judging a call state according to a response message fed back by the target terminal device;
and a substep S22 of performing a video call with the target terminal device through the server when the call state is agreement to answer.
After the video network user determines a target terminal device on the map data in the web client, a videophone request is sent to the dedicated service at the bottom layer of the video network, which forwards the request to the target terminal device. Whether the target terminal device agrees or refuses to answer, it feeds a response message back to the dedicated service. After receiving the response, the dedicated service determines from it whether the target terminal agrees to answer; if it agrees, the web client can conduct a video call with the target terminal through the dedicated service at the bottom layer.
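From the web client's point of view, the call flow of sub-steps S21 and S22 (together with the failure cases described next) can be sketched as follows; the call-state values and function names are assumptions for illustration.

```typescript
// Sketch of step 503 on the web client: send the videophone request and act on the call
// state reported back by the video networking service.
type CallState = 'agree' | 'reject' | 'timeout';

async function dialVideoCall(
  sendRequest: (terminalNumber: string) => Promise<CallState>,  // forwarded by the server to the target terminal
  terminalNumber: string,
  startVideoSession: (terminalNumber: string) => void,
  notify: (message: string) => void,
): Promise<void> {
  const state = await sendRequest(terminalNumber);
  switch (state) {
    case 'agree':                                    // call state: agreement to answer
      startVideoSession(terminalNumber);             // video call proceeds through the server
      break;
    case 'reject':                                   // call state: refusal to answer
      notify('The target terminal refused to answer.');
      break;
    case 'timeout':                                  // no response within the preset time
      notify('Answer failed: no response from the target terminal.');
      break;
  }
}
```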
In a preferred embodiment of the present invention, the step 503 may further include the following sub-steps:
and a substep S23, receiving a message of refusing to answer, which is fed back by the server, when the call state is refusing to answer.
As described above, the user of the target terminal device can choose to accept or refuse the call; if the user refuses to answer, the server feeds back to the web client a message that the target terminal device has rejected the video call.
In a preferred embodiment of the present invention, the step 503 may further include the following sub-steps:
and a substep S24 of receiving an answer-failure message fed back by the server if the server does not receive the response message fed back by the target terminal within a preset time.
In practice, there is also the case where the target terminal device does not respond to the videophone request at all. To avoid sending the request indefinitely, it can be arranged that if no agreement or rejection message from the target terminal device is received within a preset time, an answer-failure message is fed back to the web client.
For the embodiment of the present invention based on the dedicated service at the bottom layer of the video network, the flow of data interaction between the video network user platform and other terminal devices (e.g., palmtop, mobile terminal, monitoring device, etc.) is shown in fig. 6.
In order to make the embodiment of the present invention better understood by those skilled in the art, a specific example is used below to describe a process of linking terminals to perform a video call. Referring to fig. 7, a flow chart of determining a call state of a target terminal device according to the present invention is shown, and the specific process includes:
1. a user selects target terminal equipment at a web client;
2. calling the other side, namely the target terminal equipment, through the special service of the bottom layer of the video network;
3. the other party receives the call request and selects to approve the connection or refuse the connection;
4. the web client receives a message of agreeing (or refusing) to connect fed back by the other party.
During the calling process, if the call is not connected, the calling continues until the user hangs up or the other party rejects. Of course, considering that the other party may simply not answer, it can be arranged that if no rejection or agreement message is received within a preset time, the call is determined to have failed and is ended.
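On the service side, the decision in fig. 7 amounts to racing the target terminal's response against a preset timer, as in the following sketch; the function names and the 30-second default are assumptions.

```typescript
// Server-side sketch of the call-state decision: forward the request to the target
// terminal and wait for its response, treating silence beyond a preset time as an
// answer failure.
async function determineCallState(
  forwardToTarget: () => Promise<'agree' | 'reject'>,   // resolves when the target feeds back a response
  presetTimeMs = 30_000,                                // assumed preset time
): Promise<'agree' | 'reject' | 'timeout'> {
  const timeout = new Promise<'timeout'>((resolve) =>
    setTimeout(() => resolve('timeout'), presetTimeMs),
  );
  // whichever settles first decides the call state reported back to the web client
  return Promise.race([forwardToTarget(), timeout]);
}
```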
In the embodiment of the invention, in order to guarantee the real-time performance of the data, WebSocket is selected for data communication. WebSocket is a protocol specification proposed with HTML5. It specifies a communication standard whereby, through a handshake mechanism, a connection similar to TCP can be established between a client (browser) and a server (web server), facilitating communication between them.
Of course, other communication protocols may also be used, such as AJAX data communication. AJAX is an encapsulation of the XMLHttpRequest (XHR) object used to exchange data with the server in the background: the client sends a request to the server, and the server responds to the client's request and returns data. These are short-lived connections.
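A minimal example of WebSocket-based signaling of this kind is sketched below; the URL and the JSON message shapes are assumptions, not defined by the patent.

```typescript
// Sketch of using WebSocket for real-time signaling between the web client and the
// video networking service.
const socket = new WebSocket('wss://example.invalid/videonet-signaling');  // hypothetical endpoint

socket.addEventListener('open', () => {
  // send the videophone request once the handshake has established the connection
  socket.send(JSON.stringify({ type: 'videophoneRequest', target: 'terminal-0001' }));
});

socket.addEventListener('message', (event) => {
  const msg = JSON.parse(event.data as string);
  if (msg.type === 'callState') {
    console.log('call state reported by the server:', msg.state);  // e.g. agree / reject / timeout
  }
});
```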
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 8, a block diagram of a structure of an embodiment of a device for video call according to the present invention is shown, where the device is applied to a video network, where the video network includes a terminal device and a server, and the device may specifically include the following modules:
a terminal device display module 601, configured to display at least one terminal device on map data;
a target terminal device determining module 602, configured to determine a target terminal device on the map data according to a selection request of a user;
and a video call module 603, configured to invoke the server to perform a video call with the target terminal device.
In a preferred embodiment of the present invention, the terminal device displaying module 601 includes:
the acquisition request sending submodule is used for sending an acquisition request of terminal data to the server; the server is used for collecting terminal data of the terminal equipment based on the acquisition request;
the terminal data receiving submodule is used for receiving the terminal data returned by the server according to a downlink communication link configured for the terminal equipment;
and the terminal equipment display submodule is used for calling map data and displaying the terminal equipment on the map data based on the terminal data.
In a preferred embodiment of the present invention, the video call module 603 includes:
a videophone request sending submodule for sending a videophone request to the server; the server is used for forwarding the video telephone request to a target terminal device and judging a call state according to a response message fed back by the target terminal device;
and the video call sub-module is used for conducting a video call with the target terminal device through the server when the call state is agreement to answer.
In a preferred embodiment of the present invention, the video call module 603 further includes:
and the refusal-to-answer sub-module is used for receiving the refusal-to-answer message fed back by the server when the call state is refusal to answer.
In a preferred embodiment of the present invention, the video call module 603 further includes:
and the answer-failure sub-module is used for receiving the answer-failure message fed back by the server if the server does not receive the response message fed back by the target terminal within a preset time.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
Therefore, in the embodiment of the invention, terminal devices in the video network are first displayed on map data, so that the user can look up the terminal devices on the map data. Further, the user can select a target terminal device from the terminal devices on the map data and invoke the server to try to establish a video call with the target terminal device.
Fig. 9 is a block diagram illustrating a structure of an electronic device 800 for data processing according to an example embodiment. For example, the electronic device 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 9, electronic device 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing elements 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operation at the device 800. Examples of such data include instructions for any application or method operating on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power component 806 provides power to the various components of the electronic device 800. The power component 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the electronic device 800.
The multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the electronic device 800 is in an operation mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the electronic device 800. For example, the sensor assembly 814 may detect an open/closed state of the device 800, the relative positioning of components, such as a display and keypad of the electronic device 800, the sensor assembly 814 may also detect a change in the position of the electronic device 800 or a component of the electronic device 800, the presence or absence of user contact with the electronic device 800, orientation or acceleration/deceleration of the electronic device 800, and a change in the temperature of the electronic device 800. Sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast-associated information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the electronic device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer readable storage medium in which instructions, when executed by a processor of an electronic device, enable the electronic device to perform a method of video telephony, the method comprising: displaying at least one terminal device on map data; determining target terminal equipment on the map data according to a selection request of a user; and calling the server to carry out video call with the target terminal equipment.
Optionally, the displaying at least one terminal device on the map data includes: sending a terminal data acquisition request to a server; the server is used for collecting terminal data of the terminal equipment based on the acquisition request; receiving the terminal data returned by the server according to the downlink communication link configured for the terminal equipment; and calling map data, and displaying the terminal equipment on the map data based on the terminal data.
Optionally, the invoking the server to perform a video call with the target terminal device includes: sending a videophone request to the server; the server is used for forwarding the video telephone request to a target terminal device and judging a call state according to a response message fed back by the target terminal device; and when the call state is the answer agreement, carrying out the video call with the target terminal equipment through the server.
Optionally, the invoking the server to perform a video call with the target terminal device further includes: and when the call state is the answer rejection, receiving an answer rejection message fed back by the server.
Optionally, the invoking the server to perform a video call with the target terminal device further includes: and if the server does not receive the response message fed back by the target terminal within the preset time, receiving a message of failed answering fed back by the server.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The method and apparatus for video call and the electronic device provided by the present invention have been described in detail above. Specific examples are used herein to explain the principle and implementation of the present invention, and the description of the above embodiments is intended only to help in understanding the method and its core idea. Meanwhile, a person skilled in the art may, in light of the idea of the present invention, make changes to the specific embodiments and the scope of application. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (6)

1. A method of video telephony, the method being applied to a web client on a terminal device in a video network, the video network comprising the terminal device and a server, the method comprising:
displaying at least one terminal device on map data;
determining a target terminal device on the map data according to a selection request of a user;
invoking the server to perform a video call with the target terminal device;
the displaying of at least one terminal device on map data includes:
sending a terminal data acquisition request to the server, the server being configured to collect terminal data of the terminal device based on the acquisition request, wherein the terminal data comprises at least the longitude and latitude of the terminal;
receiving, over a downlink communication link configured for the terminal device, the terminal data returned by the server;
invoking map data, collating the terminal data, and filling the terminal data into the map data so as to display the terminal device on the map data;
wherein the invoking of the server to perform the video call with the target terminal device includes:
sending a videophone request to the server, wherein after receiving the videophone request the server calls the target terminal device to return to the web client, and a video call is established between the web client and the target terminal device;
the invoking of the server to perform the video call with the target terminal device further includes:
sending the videophone request to the server, the server being configured to forward the videophone request to the target terminal device and to determine a call state according to a response message fed back by the target terminal device;
and, when the call state is agree-to-answer, conducting the video call with the target terminal device through the server.
2. The method of claim 1, wherein the invoking of the server to perform the video call with the target terminal device further comprises:
when the call state is refuse-to-answer, receiving an answer-rejection message fed back by the server.
3. The method of claim 1, wherein the invoking of the server to perform the video call with the target terminal device further comprises:
if the server does not receive a response message fed back by the target terminal device within a preset time, receiving an answer-failure message fed back by the server.
4. An apparatus for video call, wherein the apparatus is applied to a web client on a terminal device in a video network, the video network comprising the terminal device and a server, the apparatus comprising:
a terminal device display module, configured to display at least one terminal device on map data;
a target terminal device determining module, configured to determine a target terminal device on the map data according to a selection request of a user;
a video call module, configured to invoke the server to conduct a video call with the target terminal device and to send a videophone request to the server, wherein after receiving the videophone request the server calls the target terminal device to return to the web client and establishes the video call between the web client and the target terminal device;
wherein the terminal device display module comprises:
an acquisition request sending submodule, configured to send a terminal data acquisition request to the server, the server being configured to collect terminal data of the terminal device based on the acquisition request, wherein the terminal data comprises at least the longitude and latitude of the terminal;
a terminal data receiving submodule, configured to receive, over a downlink communication link configured for the terminal device, the terminal data returned by the server;
a terminal device display submodule, configured to invoke map data, collate the terminal data, and fill the terminal data into the map data so as to display the terminal device on the map data;
wherein the video call module comprises:
a videophone request sending submodule, configured to send the videophone request to the server, the server being configured to forward the videophone request to the target terminal device and to determine a call state according to a response message fed back by the target terminal device;
and a video call submodule, configured to conduct the video call with the target terminal device through the server when the call state is agree-to-answer.
5. An electronic device, comprising a memory and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors to perform the method of video telephony according to one or more of claims 1-3.
6. A readable storage medium, wherein instructions in the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the method of video telephony according to one or more of claims 1-3.
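For readers approaching the claims from an implementation angle, the terminal-display steps recited in claims 1 and 4 above (requesting terminal data, receiving each terminal's longitude and latitude over the downlink, collating the data, and filling it into the map) can be sketched as follows. The HTTP endpoint, the field names, and the map handle are hypothetical and serve only to illustrate the flow.

// Sketch of the map-display flow recited in claims 1 and 4.
interface TerminalData {
  terminalId: string;
  name: string;
  longitude: number;
  latitude: number;
}

// Hypothetical map handle exposing a marker-plotting method.
interface MapHandle {
  addMarker(longitude: number, latitude: number, label: string): void;
}

async function displayTerminalsOnMap(map: MapHandle): Promise<void> {
  // Send the terminal data acquisition request; the server is assumed to
  // return the collected terminal data, including longitude and latitude.
  const response = await fetch("/api/terminals"); // hypothetical endpoint
  const terminals: TerminalData[] = await response.json();

  // Collate the terminal data (here simply sorted by name) before filling
  // it into the map data.
  terminals.sort((a, b) => a.name.localeCompare(b.name));

  for (const t of terminals) {
    map.addMarker(t.longitude, t.latitude, t.name);
  }
}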
CN201711159099.XA 2017-11-20 2017-11-20 Method and device for video call Active CN108574689B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711159099.XA CN108574689B (en) 2017-11-20 2017-11-20 Method and device for video call

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711159099.XA CN108574689B (en) 2017-11-20 2017-11-20 Method and device for video call

Publications (2)

Publication Number Publication Date
CN108574689A (en) 2018-09-25
CN108574689B (en) 2020-07-03

Family

ID=63576442

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711159099.XA Active CN108574689B (en) 2017-11-20 2017-11-20 Method and device for video call

Country Status (1)

Country Link
CN (1) CN108574689B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109068086A (en) * 2018-09-26 2018-12-21 视联动力信息技术股份有限公司 A kind of processing method and system of visual telephone service
CN109450995B (en) * 2018-10-25 2020-10-27 视联动力信息技术股份有限公司 Method and system for acquiring server data
CN109743523A (en) * 2018-11-30 2019-05-10 视联动力信息技术股份有限公司 A kind of communication means and device
CN109698962A (en) * 2018-12-10 2019-04-30 视联动力信息技术股份有限公司 Live video communication method and system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2513090B (en) * 2013-01-28 2019-12-11 Microsoft Technology Licensing Llc Conditional concealment of lost video data
CN106341515B (en) * 2015-07-06 2019-02-22 视联动力信息技术股份有限公司 A kind of monitoring method and device of terminal

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103491339A (en) * 2012-06-11 2014-01-01 华为技术有限公司 Video acquisition method, video acquisition equipment and video acquisition system
CN103226792A (en) * 2012-12-27 2013-07-31 富阳市供电局 Electric power work management and control method combining motion trail
CN203883928U (en) * 2014-03-28 2014-10-15 王慧 Community electronic commerce system based on video calls
CN107276886A (en) * 2017-06-30 2017-10-20 深圳前海弘稼科技有限公司 Managing device that management method is fallen behind by Team Member and Team Member falls behind

Also Published As

Publication number Publication date
CN108574689A (en) 2018-09-25

Similar Documents

Publication Publication Date Title
CN108574688B (en) Method and device for displaying participant information
CN108809686B (en) Resource synchronization method and device, electronic equipment and computer readable storage medium
CN108632558B (en) Video call method and device
CN109120946B (en) Method and device for watching live broadcast
CN108574689B (en) Method and device for video call
CN109068186B (en) Method and device for processing packet loss rate
CN110493554B (en) Method and system for switching speaking terminal
CN110049271B (en) Video networking conference information display method and device
CN110062191B (en) Multi-party group meeting method and server based on video network
CN110572607A (en) Video conference method, system and device and storage medium
CN110099240B (en) Control method and device for video conference
CN110049273B (en) Video networking-based conference recording method and transfer server
CN109743522B (en) Communication method and device based on video networking
CN109194915B (en) Video data processing method and system
CN111131754A (en) Control split screen method and device of conference management system
CN113194278A (en) Conference control method and device and computer readable storage medium
CN110557612B (en) Control method of monitoring equipment and video networking system
CN109963108B (en) One-to-many talkback method and device
CN110049268B (en) Video telephone connection method and device
CN109698818B (en) Method and device for acquiring online user and cross-streaming media communication
CN108989737B (en) Data playing method and device and electronic equipment
CN110049275B (en) Information processing method and device in video conference and storage medium
CN109889755B (en) Communication connection method and video networking terminal
CN110198384B (en) Communication method based on video networking and transfer server
CN109963123B (en) Camera control method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: 100000 Dongcheng District, Beijing, Qinglong Hutong 1, 1103 house of Ge Hua building.
Applicant after: Video Link Power Information Technology Co., Ltd.
Address before: 100000 Beijing Dongcheng District Qinglong Hutong 1 Song Hua Building A1103-1113
Applicant before: BEIJING VISIONVERA INTERNATIONAL INFORMATION TECHNOLOGY CO., LTD.
GR01 Patent grant