US20150077509A1 - System for a Virtual Multipoint Control Unit for Unified Communications


Info

Publication number
US20150077509A1
Authority
US
United States
Prior art keywords
virtual
audio
applications
data stream
communication
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/341,818
Inventor
Avishay BEN NATAN
Derek Graham
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ClearOne Communications Hong Kong Ltd
Original Assignee
ClearOne Communications Hong Kong Ltd
ClearOne Ltd
ClearOne Inc
Application filed by ClearOne Communications Hong Kong Ltd, ClearOne Ltd, ClearOne Inc filed Critical ClearOne Communications Hong Kong Ltd
Priority to US14/341,818 priority Critical patent/US20150077509A1/en
Assigned to ClearOne Inc. reassignment ClearOne Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GRAHAM, DEREK
Publication of US20150077509A1 publication Critical patent/US20150077509A1/en
Assigned to CLEARONE LTD. FKA (FORMERLY KNOWN AS) C-V PRIVATE (ISRAEL) LTD. reassignment CLEARONE LTD. FKA (FORMERLY KNOWN AS) C-V PRIVATE (ISRAEL) LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BEN-NATAN, Avishay
Assigned to CLEARONE COMMUNICATIONS HONG KONG LTD. reassignment CLEARONE COMMUNICATIONS HONG KONG LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CLEARONE LTD.
Assigned to CLEARONE COMMUNICATIONS HONG KONG LTD. reassignment CLEARONE COMMUNICATIONS HONG KONG LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ClearOne Inc.
Priority to US15/062,066 priority patent/US9781386B2/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • H04N7/152Multipoint control units therefor
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L19/00Speech or audio signals analysis-synthesis techniques for redundancy reduction, e.g. in vocoders; Coding or decoding of speech or audio signals, using source filter models or psychoacoustic analysis
    • G10L19/04Speech or audio signals analysis-synthesis techniques for redundancy reduction, e.g. in vocoders; Coding or decoding of speech or audio signals, using source filter models or psychoacoustic analysis using predictive techniques
    • G10L19/16Vocoder architecture
    • G10L19/173Transcoding, i.e. converting between two coded representations avoiding cascaded coding-decoding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/141Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/147Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals

Definitions

  • This disclosure relates to unified communication systems. More specifically, this disclosure relates to a system for a virtual multipoint control unit for unified communications.
  • Unified Communications (UC) software applications have proliferated into the enterprise communications market. These applications allow end-users to communicate using voice, video, application and data sharing, instant messaging, etc. A few examples of these applications include Microsoft Lync, Skype, IBM Sametime, Cisco IP Communicator, etc.
  • the multipoint control unit typically allows multiple UC applications to access local hardware devices in a mutually exclusive manner. For example, video data streams from one UC application are temporarily restricted from being played on a local hardware device, such as a display device, while the device is in use by another UC application. As a result, the multipoint control unit induces an unwanted delay when executing simultaneously received data streams from different UC applications.
  • This disclosure describes a system for a virtual multipoint control unit for unified communications.
  • a virtual multipoint control unit is in communication with a plurality of devices over a network.
  • the virtual multipoint control unit comprises a plurality of unified communication (UC) applications, at least one virtual imaging device, at least one virtual audio device, a virtual video mixer, and a virtual audio mixer.
  • the plurality of unified communication (UC) applications is executed on at least one of the plurality of devices.
  • Each of the plurality of UC applications receives an encoded audio data stream and an encoded video data stream.
  • the plurality of UC applications decodes the received audio data and the video data streams.
  • the at least one virtual imaging device is mapped to each of the plurality of UC applications.
  • the at least one virtual audio device is mapped to each of the plurality of UC applications.
  • the virtual video mixer is in communication with a physical imaging device.
  • the virtual video mixer receives the decoded video data stream from each of the plurality of UC applications via the at least one virtual imaging device.
  • the virtual audio mixer is in communication with a physical audio device.
  • the virtual audio mixer receives the decoded audio data stream from each of the plurality of UC applications via the at least one virtual audio device.
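The device-to-application mapping described in the bullets above can be sketched in a few lines of Python. This is an illustrative model only, not the patented implementation; all class and variable names (`VirtualDevice`, `UCApplication`, `VirtualMixer`, etc.) are invented for this sketch.

```python
from dataclasses import dataclass

@dataclass
class VirtualDevice:
    """Logical stand-in for a camera or sound card; no physical counterpart."""
    name: str

@dataclass
class UCApplication:
    """One UC application together with the virtual devices mapped to it."""
    name: str
    virtual_camera: VirtualDevice
    virtual_sound_card: VirtualDevice

class VirtualMixer:
    """Collects one decoded stream per UC application via its virtual device."""
    def __init__(self):
        self.streams = {}

    def receive(self, app, stream):
        self.streams[app.name] = stream

    def mix(self):
        # Naive byte concatenation stands in for real audio/video mixing.
        return b"".join(self.streams[k] for k in sorted(self.streams))

# Map one virtual imaging device and one virtual audio device to each UC app.
apps = [UCApplication(f"uc-{i}", VirtualDevice(f"vcam-{i}"), VirtualDevice(f"vsnd-{i}"))
        for i in range(3)]

video_mixer = VirtualMixer()
for app in apps:
    video_mixer.receive(app, f"frame-from-{app.name};".encode())
```

The point of the sketch is the one-to-one mapping: every UC application owns its own pair of virtual devices, so no two applications contend for the same physical hardware.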
  • a system for managing communication among a plurality of devices is disclosed.
  • the system is employed within a communication network.
  • the system comprises a communication device and a virtual multipoint control unit.
  • the communication device receives encoded audio and video data streams via a plurality of unified communication (UC) applications.
  • the plurality of UC applications decodes the received encoded audio data stream and the received encoded video data stream.
  • the virtual multipoint control unit is in communication with the communication device.
  • the virtual multipoint control unit comprises at least one virtual imaging device, at least one virtual audio device, a virtual video mixer, and a virtual audio mixer.
  • the at least one virtual imaging device is mapped to each of the plurality of UC applications.
  • the at least one virtual audio device is mapped to each of the plurality of UC applications.
  • the virtual video mixer is in communication with a physical imaging device.
  • the virtual video mixer receives the decoded video data stream from each of the plurality of UC applications via the at least one virtual imaging device.
  • the virtual audio mixer is in communication with a physical audio device.
  • the virtual audio mixer receives the decoded audio data stream from each of the plurality of UC applications via the at least one virtual audio device.
  • a non-transitory computer readable medium stores a program of instructions executable by a computing device to perform a method for employing a virtual multipoint control unit for unified communications.
  • the method comprises receiving, from a plurality of terminals, an encoded audio data stream and an encoded video data stream.
  • the method also comprises decoding, using a plurality of unified communication (UC) applications, the received encoded audio data stream and the received encoded video data stream.
  • the method further comprises mapping, using a virtual multipoint control unit, at least one virtual imaging device and at least one virtual audio device to each of the plurality of UC applications.
  • the method also comprises communicating, using a virtual video mixer, the decoded video data stream to a physical imaging device via the at least one virtual imaging device.
  • the method comprises communicating, using a virtual audio mixer, the decoded audio data stream to a physical audio device via the at least one virtual audio device.
  • a method for managing communication among a plurality of devices comprises receiving, from a plurality of terminals, an encoded audio data stream and an encoded video data stream.
  • the method also comprises decoding, using a plurality of unified communication (UC) applications, the received encoded audio data stream and the received encoded video data stream.
  • the method further comprises mapping, using a virtual multipoint control unit, at least one virtual imaging device and at least one virtual audio device to each of the plurality of UC applications.
  • the method also comprises communicating, using a virtual video mixer, the decoded video data stream to a physical imaging device via the at least one virtual imaging device.
  • the method comprises communicating, using a virtual audio mixer, the decoded audio data stream to a physical audio device via the at least one virtual audio device.
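The five method steps above (receive, decode, map, communicate video, communicate audio) can be illustrated as a single pipeline. This is a toy sketch: hex encoding stands in for real audio/video codecs, and the function and parameter names are assumptions, not from the patent.

```python
import binascii

def decode(encoded):
    """Stand-in codec; a real UC application would use e.g. Opus or H.264."""
    return binascii.unhexlify(encoded)

def handle_conference(terminals):
    """terminals maps a terminal id to (encoded_audio, encoded_video)."""
    to_physical_audio, to_physical_video = [], []
    for tid, (enc_audio, enc_video) in sorted(terminals.items()):
        audio = decode(enc_audio)   # step 2: decoding done by the UC application
        video = decode(enc_video)
        # Step 3: each terminal's UC application is mapped to its own virtual
        # audio and imaging device; here the terminal id stands in for that map.
        to_physical_audio.append((tid, audio))   # step 5: via the virtual audio mixer
        to_physical_video.append((tid, video))   # step 4: via the virtual video mixer
    return {"audio": to_physical_audio, "video": to_physical_video}
```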
  • FIG. 1A is a schematic that illustrates a first environment for implementing an exemplary virtual multipoint control unit, according to an embodiment of the present disclosure.
  • FIG. 1B is a schematic that illustrates a second environment for implementing the exemplary virtual multipoint control unit of FIG. 1A, according to an embodiment of the present disclosure.
  • FIG. 2A is a schematic that illustrates the exemplary virtual multipoint control unit of FIG. 1A, according to an embodiment of the present disclosure.
  • FIG. 2B is a schematic that illustrates an exemplary unified communication application included in the virtual multipoint control unit of FIG. 1A, according to an embodiment of the present disclosure.
  • FIG. 3 is a schematic that illustrates a virtual multipoint control unit 300 managing a multipoint video conference, according to an embodiment of the present disclosure.
  • FIG. 4 is a schematic illustrating implementation of a device driver module of the virtual multipoint control unit for communicating with hardware devices for conducting a video conference, according to an embodiment of the present disclosure.
  • This disclosure describes a system for a virtual multipoint control unit for unified communications. This disclosure describes numerous specific details in order to provide a thorough understanding of the present invention. One skilled in the art will appreciate that one may practice the present invention without these specific details. Additionally, this disclosure does not describe some well-known items in detail in order not to obscure the present invention.
  • An endpoint refers to one or more computing devices capable of establishing a communication channel for exchange of audio, video, textual, or symbolic data in a communication session.
  • the computing devices may include, but are not limited to, a desktop PC, a personal digital assistant (PDA), a server, a mainframe computer, a mobile computing device (e.g., mobile phones, laptops, tablets, etc.), an internet appliance, and calling devices (e.g., a telephone, an internet phone, a video telephone, etc.).
  • FIG. 1A is a schematic that illustrates a first environment for implementing an exemplary virtual multipoint control unit, according to an embodiment of the present disclosure.
  • Embodiments are disclosed in the context of environments that represent a multipoint video conference among multiple users via respective endpoints capable of executing one or more computer applications for unified communications in the same communication session.
  • other embodiments may be applied in the context of other scenarios (e.g., an audio conference, a webinar, a multiplayer online game, etc.) involving at least one of audio, video, textual, or symbolic (e.g., emoticons, images, etc.) data being communicated among various endpoints in the same communication session.
  • at least one of the endpoints may execute a unified communication application (UC application) during the session.
  • the first network environment 100 may comprise multiple endpoints including a communication device 102 configured to communicate with terminals 104-1, 104-2, 104-3, 104-4, and 104-5 (collectively, terminals 104) via a network 106.
  • the network 106 may comprise, for example, one or more of the Internet, Wide Area Networks (WANs), Local Area Networks (LANs), analog or digital wired and wireless telephone networks (e.g., a PSTN, Integrated Services Digital Network (ISDN), a cellular network, and Digital Subscriber Line (xDSL)), radio, television, cable, satellite, and/or any other delivery or tunneling mechanism for carrying data.
  • the network 106 may comprise multiple networks or sub-networks, each of which may comprise, for example, a wired or wireless data pathway.
  • the network 106 may comprise a circuit-switched voice network, a packet-switched data network, or any other network that is able to carry electronic communications.
  • the network 106 may comprise networks based on the Internet protocol (IP) or asynchronous transfer mode (ATM), and may support voice using, for example, VoIP, Voice-over-ATM, or other comparable protocols used for voice data communications.
  • the network 106 may comprise a cellular telephone network configured to enable exchange of textual data, audio data, video data, or any combination thereof between the communication device 102 and at least one of the terminals 104 .
  • the communication device 102 may comprise or be coupled with one or more hardware devices either wirelessly or in a wired fashion for enabling a user to dynamically interact with other users via the endpoints connected to the network 106 .
  • the communication device 102 may be coupled with an imaging device 108 (including, but not limited to, a video camera, a webcam, a scanner, or any combination thereof) and an audio device 110 (including, but not limited to, a speaker, a microphone, or any combination thereof).
  • the communication device 102 may be compatible with any other device (not shown) connected to the network 106 to exchange audio, video, textual or symbolic data streams with each other or any other compatible devices.
  • the communication device 102 may comprise a virtual multipoint control unit 112 configured to perform at least one of the following: (1) create a logical representation of one or more hardware devices in communication with the communication device 102; (2) establish a communication bridge or channel between the UC applications executed by the communication device 102 and the corresponding UC applications being executed by at least one of the terminals 104; (3) store, manage, and process the multimodal input data streams received from the endpoints and/or associated devices, such as the imaging device 108 and the audio device 110, connected to the network 106; and (4) request services from, deliver services to, or both, various devices connected to the network 106.
  • the virtual multipoint control unit 112 may facilitate integration of real-time and non-real-time communication services by bridging communication among various UC applications being simultaneously executed on various endpoints, such as the communication device 102 and the terminals 104 .
  • real-time services may include, but are not limited to, instant messaging, internet protocol (IP) telephony, video conferencing, desktop sharing, data sharing, call control, and speech recognition.
  • non-real-time services may include, but are not limited to, voicemail, E-mail, SMS, and fax.
  • examples of UC applications may include, but are not limited to, Microsoft Lync, Skype, IBM Sametime, and Cisco IP Communicator. Each of the UC applications may operate with same or different communication protocols and media formats.
  • the virtual multipoint control unit 112 may be implemented as a standalone and dedicated “black box” including hardware and installed software, where the hardware is closely matched to the requirements and/or functionality of the software.
  • the virtual multipoint control unit 112 may be implemented as a software application or a device driver.
  • the virtual multipoint control unit 112 may enhance or increase the functionality and/or capacity of the network 106 to which it is connected.
  • the virtual multipoint control unit 112 may be further configured, for example, to perform e-mail tasks, security tasks, network management tasks including IP address management, and other tasks.
  • the virtual multipoint control unit 112 may be configured to expose its computing environment or operating code to the user, and may comprise related art I/O devices, such as camera, speaker, scanner, keyboard or display.
  • the virtual multipoint control unit 112 may, however, comprise software, firmware or other resources that support remote administration and/or maintenance of the virtual multipoint control unit.
  • the virtual multipoint control unit 112 may comprise at least one processor (not shown) executing machine readable program instructions for performing various operations, such as those discussed above, on the received multimodal input audio, video, textual, or symbolic data stream.
  • the virtual multipoint control unit 112 may comprise, in whole or in part, a software application working alone or in conjunction with one or more hardware resources. Such software applications may be executed by the processor on different hardware platforms or emulated in a virtual environment, discussed below in greater detail. Aspects of the virtual multipoint control unit 112 may leverage known, related art, or later developed off-the-shelf software.
  • the virtual multipoint control unit 112 may be integrated with, or installed on, a network appliance 152 that is associated with or used to establish the network 106 .
  • the network appliance 152 may be capable of operating as an interface device to assist exchange of program instructions and data between the communication device 102 and the terminals 104 .
  • the network appliance 152 may be preconfigured or dynamically configured to comprise the virtual multipoint control unit 112 integrated with other devices.
  • the virtual multipoint control unit 112 may be integrated with the communication device 102 or any other device, such as at least one of the terminals 104 connected to the network 106 .
  • the communication device 102 may comprise a module (not shown), which introduces the communication device 102 to the network appliance 152 , thereby enabling the network appliance 152 to invoke the virtual multipoint control unit 112 as a service.
  • Examples of the network appliance 152 may include, but are not limited to, a DSL modem, a wireless access point, a router, a base station, and a gateway having a predetermined computing power sufficient for implementing the virtual multipoint control unit 112 .
  • the virtual multipoint control unit 112 and the communication device 102 may collectively constitute a unified communication system, which may reside in a single device or may be distributed across multiple devices.
  • the unified communication system may be implemented in hardware or a suitable combination of hardware and software, and may comprise one or more software systems operating on a digital signal processing platform.
  • the “hardware” may comprise a combination of discrete components, an integrated circuit, an application-specific integrated circuit, a field programmable gate array, a digital signal processor, or other suitable hardware.
  • the “software” may comprise one or more objects, agents, threads, lines of code, subroutines, separate software applications, two or more lines of code or other suitable software structures operating in one or more software applications or on one or more processors.
  • Embodiments may comprise the unified communication system operating as or in a mobile switching center, network gateway system, Internet access node, application server, IMS core, service node, or some other communication system, including any combination thereof.
  • each of the terminals 104 may be associated with various devices which may include, but are not limited to, a camera, display device, microphone, speakers, and one or more codecs, or any other type of conferencing hardware, or in any combination thereof.
  • the terminals 104 may comprise video, voice, and data communications capabilities (e.g., videoconferencing capabilities) by being coupled to, or including, various audio devices (e.g., microphones, audio input devices, speakers, audio output devices, telephones, speaker telephones, etc.), various video devices (e.g., monitors, projectors, displays, televisions, video output devices, video input devices, cameras, etc.), various networks (IP, PSTN, etc.), or any combination thereof.
  • Each of the terminals 104 may comprise or implement one or more real-time protocols, e.g., session initiation protocol (SIP), H.261, H.263, H.264, and H.323, among others.
  • FIG. 2A illustrates the exemplary virtual multipoint control unit of FIG. 1A, according to an embodiment of the present disclosure.
  • the virtual multipoint control unit 112 may comprise one or more processor(s) 202, one or more interface(s) 204, and a memory module 206.
  • the processor(s) 202 may execute a machine readable program comprising instructions for manipulating the received video signal.
  • the processor(s) 202 may comprise, for example, microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuits, and/or any devices that manipulate signals based on operational instructions.
  • the virtual multipoint control unit 112 may further comprise, in whole or in part, a software application working alone or in conjunction with one or more hardware resources.
  • Such software applications may be executed by one or more processors on different hardware platforms or emulated in a virtual environment. Aspects of the virtual multipoint control unit 112 may leverage known, related art, or later developed off-the-shelf software. Among other capabilities, the processor(s) 202 may be configured to fetch and execute instructions in computer readable memory module 206 .
  • the interface(s) 204 may coordinate interactions of the virtual multipoint control unit 112 with at least one of the communication device 102 and the terminals 104 over the network 106 .
  • the interface(s) 204 may comprise a variety of known, related art, or later developed interfaces, such as (1) software interfaces, for example, an application programming interface, a graphical user interface, etc.; (2) hardware interfaces, for example, cable connectors, keyboards, touchscreen, scanners, display screens, etc.; or both.
  • the interface(s) 204 facilitate receiving of the audio, video, or data signals, and reliable broadcast or multicast transmissions of one or more encoded output signals.
  • the memory module 206 may comprise any computer-readable medium known in the art, comprising, for example, volatile memory (e.g., RAM) and/or non-volatile memory (e.g., flash, etc.).
  • the memory module 206 may comprise UC applications 208-1, . . . , 208-N (collectively referred to as UC applications 208), a virtual video mixer 210, a virtual audio mixer 212, virtual imaging devices 214-1, . . . , 214-N (collectively, virtual imaging devices 214), virtual audio devices 216-1, . . . , 216-N (collectively, virtual audio devices 216), and a device driver module 218.
  • the virtual multipoint control unit 112 may be configured to implement various real-time and non-real time communication protocols for rendering and transmitting various audio, video, textual, or symbolic communication signals from the UC applications 208 .
  • the virtual multipoint control unit 112 may be configured to determine various characteristics of the endpoints, such as the communication device 102 and the terminals 104, for handling the respective received signals. The characteristics may include, but are not limited to, the type of endpoint (e.g., a mobile phone, a laptop, an IP television, etc.), supported video resolution, supported codecs, network connection speed, and so on.
  • the virtual video mixer 210 may be configured to receive a decoded video stream from each of the UC applications 208 or a physical local imaging device such as the imaging device 108 , which is associated with the communication device 102 .
  • the virtual video mixer 210 may be further configured to switch between various video streams received from different UC applications 208 to either continuously display a received video stream while operating in a continuous presence mode, or tile a video stream received from each of the UC applications 208 from which a respective audio stream is actively received while operating in a voice-switched mode.
  • the virtual audio mixer 212 may be configured to receive an input audio stream either from a physical local audio device, such as the audio device 110 (e.g., a microphone), or from a virtual audio device 216 (e.g., a virtual sound card) in communication with each of the UC applications 208.
  • the virtual audio mixer may use the received audio streams to create a mixed and re-encoded audio stream to be played or transmitted to one or more UC applications 208 simultaneously.
  • the re-encoded audio stream excludes the input audio stream received from its corresponding virtual source, discussed below in greater detail.
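The exclusion described above is the classic mix-minus pattern: each participant receives the sum of everyone else's audio and never an echo of its own. A minimal sketch, assuming each stream is an equal-length list of PCM samples (the function name and data layout are illustrative, not from the patent):

```python
def mix_minus(streams):
    """For each source, return the sum of all other sources' samples.

    `streams` maps a source name to a list of PCM samples; the mix sent
    back to each UC application excludes that application's own audio.
    """
    n = len(next(iter(streams.values())))
    # Total mix of every source, then subtract each source's own samples.
    total = [sum(s[i] for s in streams.values()) for i in range(n)]
    return {name: [total[i] - s[i] for i in range(n)]
            for name, s in streams.items()}

mixes = mix_minus({"uc-1": [1, 2], "uc-2": [10, 20], "mic": [100, 200]})
```

Computing one total and subtracting per source costs O(N) per sample instead of the O(N²) of summing "everyone but me" separately for each of the N destinations.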
  • the virtual imaging devices 214 and the virtual audio devices 216 may be logical representations of an imaging device (e.g., a camera) and an audio device (e.g., a speaker, a microphone, etc.) with no physical counterparts.
  • the virtual imaging device 214 may be an image derived from the virtual video mixer 210 .
  • the virtual imaging device 214 may provide either a tiled image of active participants, or an image of the active speaker, received from the respective UC applications, depending on whether the virtual video mixer 210 is operating in voice-switched or mixer (i.e., continuous presence) mode.
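The two modes can be illustrated with a small selector. This sketch assumes decoded frames are keyed by participant name; the mode strings and function name are invented for illustration:

```python
def compose(mode, frames, active_speaker=None):
    """Pick the video layout the virtual imaging device should present.

    In continuous-presence (mixer) mode every participant's frame is tiled;
    in voice-switched mode only the active speaker's frame is shown.
    """
    if mode == "continuous_presence":
        return [frames[name] for name in sorted(frames)]   # tiled image
    if mode == "voice_switched":
        return [frames[active_speaker]]                    # active speaker only
    raise ValueError(f"unknown mode: {mode!r}")
```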
  • the virtual audio devices 216 may be virtual representations of an audio device, which may be emulated by each of the UC applications 208 . Similar to the virtual imaging device 214 , the virtual audio device 216 has no physical counterpart and derives its input audio stream from the virtual audio mixer 212 as well as outputs audio streams to the virtual audio mixer 212 . Each of the virtual imaging devices 214 and the virtual audio devices 216 may be created for each of the UC applications 208 for independently supporting the audio and video data streams being received from or sent to the endpoints or any other unified communication system simultaneously.
  • the device driver module 218 may comprise device drivers, which are software programs that introduce hardware devices such as endpoints (e.g., the communication device 102 and the terminals 104 ) and associated devices (e.g., the imaging device 108 and the audio device 110 ) to the virtual multipoint control unit 112 .
  • the device drivers handle software instructions received from the UC applications 208 for accessing the hardware devices and associated resources (e.g., attached peripheral devices, hardware memory, etc.) without causing conflicts.
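Conflict-free shared access of this kind is commonly implemented by serializing instructions through a per-device lock. The following sketch is one plausible approach, not the patent's driver design; all names are illustrative:

```python
import threading

class DeviceDriver:
    """Serializes access to one physical device so that several UC
    applications can issue instructions without conflicting."""

    def __init__(self, device_name):
        self.device_name = device_name
        self._lock = threading.Lock()
        self.log = []

    def submit(self, app_name, instruction):
        with self._lock:   # one instruction at a time reaches the hardware
            self.log.append((app_name, instruction))
            return f"{self.device_name}: {instruction} for {app_name}"

# Three UC applications submit instructions concurrently to one driver.
driver = DeviceDriver("camera-308")
threads = [threading.Thread(target=driver.submit, args=(f"uc-{i}", "open"))
           for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```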
  • At least one of the UC applications such as the UC application 208 - 1 may comprise an audio encoder/decoder 220 , a video encoder/decoder 222 , a display window 224 , and a virtual source 226 .
  • the audio encoder/decoder 220 may be configured to decode and encode audio streams received from or sent to the virtual audio mixer 212 respectively using known, related art, or later developed encoding/decoding protocols and standards.
  • the video encoder/decoder 222 may be configured to decode and encode the video streams received from or sent to the virtual video mixer 210 respectively using known, related art, or later developed encoding/decoding protocols and standards.
  • the display window 224 refers to an output software interface configured to display the video streams/signals.
  • the virtual source 226 may be configured to provide decoded video data stream to the virtual video mixer 210 via the display window 224 .
  • FIG. 3 is a schematic that illustrates a virtual multipoint control unit 300 managing a multipoint video conference, according to an embodiment of the present disclosure.
  • the virtual multipoint control unit 300 may comprise a virtual video mixer 302 , a virtual audio mixer 304 , and a device driver module 306 .
  • the virtual video mixer 302 may be in communication with one or more physical local imaging devices such as a camera 308 and the device driver module 306 may be in communication with one or more physical local audio devices such as a local speaker 310 and a local microphone 312 .
  • the device driver module 306 comprises device drivers for operating or managing the respective one or more physical hardware devices such as the camera 308 , the local speaker 310 , and the microphone 312 .
  • the virtual multipoint control unit 300 may further comprise various videoconferencing software applications, such as UC applications 314-1, 314-2, and 314-3 (collectively, UC applications 314), for conducting a video conference among multiple endpoints, e.g., the communication device 102 and the terminals 104.
  • the first UC application 314-1 comprises a first audio encoder/decoder 316-1, a first video encoder 318-1, a first video decoder 320-1, a first display window 322-1, and a first virtual source 324-1.
  • the second UC application 314-2 comprises a second audio encoder/decoder 316-2, a second video encoder 318-2, a second video decoder 320-2, a second display window 322-2, and a second virtual source 324-2.
  • the third UC application 314-3 comprises a third audio encoder/decoder 316-3, a third video encoder 318-3, a third video decoder 320-3, a third display window 322-3, and a third virtual source 324-3.
  • the display windows 322 - 1 , 322 - 2 , 322 - 3 may be configured to receive the decoded video streams from the respective video decoders 320 - 1 , 320 - 2 , 320 - 3 (collectively, video decoders 320 ) for displaying the video streams on a local display device such as an HDTV display.
  • the virtual sources 324 - 1 , 324 - 2 , 324 - 3 may be configured to receive the decoded video data streams via the respective display windows 322 for being sent to the virtual video mixer 302 , which combines the received video streams and renders a single video stream (e.g., to display images from each of the video streams in a tiled format) to reduce the required network bandwidth for display or recording.
  • Each of the UC applications 314 may communicate with a respective set of UC virtual devices to allow the UC applications 314 to access the physical hardware devices, which are in communication with the virtual multipoint control unit 300 simultaneously.
  • the UC application 314 - 1 may communicate with a first set of UC virtual devices including a first UC virtual camera 326 - 1 and a first UC virtual sound card 328 - 1
  • the UC application 314 - 2 may communicate with a second set of UC virtual devices including a second UC virtual camera 326 - 2 and a second UC virtual sound card 328 - 2
  • the UC application 314 - 3 may communicate with a third set of UC virtual devices including a third UC virtual camera 326 - 3 and a third UC virtual sound card 328 - 3 .
  • each set of the UC virtual devices may emulate a physical device by defining a software configuration and a related hardware configuration, both being compatible with the respective UC applications.
  • each of the UC virtual cameras 326 - 1 , 326 - 2 , 326 - 3 may be a logical representation or an image of an imaging device (e.g., a camera) without being mapped or corresponding to a physical camera.
  • each of the UC virtual sound cards 328 - 1 , 328 - 2 , 328 - 3 may refer to a logical representation or an image of an audio device (e.g., a speaker or a microphone) without being mapped or corresponding to a physical audio device.
  • the UC virtual cameras 326 may be derived from the virtual video mixer 302 and the UC virtual sound cards 328 may be derived from the virtual audio mixer 304 by the respective UC applications 314 .
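The virtual-device mapping described above can be sketched as a software shim: each UC application sees a camera-like interface, but frames are forwarded to a shared mixer rather than to physical hardware. The following is a minimal illustration only; the class and method names (`VirtualVideoMixer`, `UCVirtualCamera`, `write_frame`) are hypothetical and not taken from the disclosure.

```python
class VirtualVideoMixer:
    """Shared sink that collects decoded frames from all virtual cameras."""
    def __init__(self):
        self.frames = {}  # app_id -> most recent decoded frame

    def submit(self, app_id, frame):
        self.frames[app_id] = frame


class UCVirtualCamera:
    """Logical camera exposed to one UC application; no physical camera
    is mapped -- written frames are routed to the shared video mixer."""
    def __init__(self, app_id, mixer):
        self.app_id = app_id
        self.mixer = mixer

    def write_frame(self, frame):
        # The UC application "plays" its decoded video into this device;
        # the device forwards it to the mixer instead of real hardware.
        self.mixer.submit(self.app_id, frame)


mixer = VirtualVideoMixer()
cams = {i: UCVirtualCamera(i, mixer) for i in (1, 2, 3)}
cams[1].write_frame("frame-from-app-1")  # each application writes independently
cams[2].write_frame("frame-from-app-2")
```

Because every virtual camera shares one mixer sink, multiple UC applications can "use the camera" at the same time, which is the simultaneous-access property the disclosure emphasizes.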
  • multiple endpoints such as the communication device 102 and the terminals 104 may execute various videoconferencing software applications similar to the UC applications 314 in the virtual multipoint control unit 300 .
  • Each of these videoconferencing software applications may interact with the corresponding UC applications 314 in the virtual multipoint control unit 300 during a single communication session to conduct the video conference, because the videoconferencing software applications may not be compatible with each other due to the different control protocols and media formats being used.
  • the virtual multipoint control unit 300 may operate in four distinct modes, namely, inbound audio mode, inbound video mode, outbound audio mode, and outbound video mode, to handle any audio, video, or data streams to and from these UC applications 314 , while simultaneously providing one or more local hardware devices to each of the participating UC applications 314 .
  • the endpoints such as the communication device 102 and the terminals 104 may capture a local audio data stream and provide encoded audio streams via UC applications 314 (e.g., Microsoft Lync, Skype, Cisco IP Communicator, etc.) executing on each of these endpoints.
  • the provided encoded audio streams may be received by the audio encoder/decoders 316 - 1 , 316 - 2 , 316 - 3 (collectively, audio encoders/decoders 316 ) of the same UC applications 314 running in the virtual multipoint control unit 300 .
  • the received audio streams may be encoded in any of the known, related art, or later developed real-time audio protocols such as RTP (real-time transport protocol).
  • the audio encoder/decoders 316 may be configured to decode the received audio streams and send the decoded audio streams to the respective UC virtual sound cards 328 , which may route the decoded audio streams to the virtual audio mixer 304 via the virtual video mixer 302 .
  • the UC virtual sound cards 328 allow the decoded audio streams from multiple UC applications 314 to be provided to the virtual audio mixer 304 simultaneously.
  • the decoded audio streams may enable the virtual video mixer 302 to operate in the voice-switched mode.
  • the virtual audio mixer 304 may be configured to perform a summation of the received audio streams to generate a single audio stream so as to minimize audio interference or background noise as known in the art.
  • the generated audio stream may be sent to one or more physical audio devices such as the local speaker 310 .
  • the virtual audio mixer 304 may send a notification to the virtual video mixer 302 to allow synchronization between the audio and the video streams.
  • the virtual video mixer 302 may add delay to a received video stream to synchronize it with the audio stream.
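The summation and delay-based synchronization steps above can be sketched as follows. This is an illustrative sketch only: the sample representation (signed 16-bit PCM integers), the clipping behavior, and the frame-repetition approach to delay are assumptions, not details from the disclosure.

```python
def mix_audio(streams):
    """Sum the decoded streams sample-by-sample into one stream,
    clipping each mixed sample to the signed 16-bit PCM range."""
    length = max(len(s) for s in streams)
    mixed = []
    for i in range(length):
        total = sum(s[i] for s in streams if i < len(s))
        mixed.append(max(-32768, min(32767, total)))  # clip to int16
    return mixed


def delay_video(frames, delay_frames):
    """Add delay to a video stream by repeating its first frame, so the
    video lags enough to line up with the mixed audio stream."""
    return [frames[0]] * delay_frames + frames


mixed = mix_audio([[1000, 2000], [500, -500], [32000, 100]])
# the first mixed sample sums to 33500 and is clipped to 32767
```

A production mixer would typically also normalize or attenuate before clipping; plain summation with clipping is used here only to keep the sketch short.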
  • the local microphone 312 may capture an audio data stream of the local participants and send the captured audio data stream to the virtual audio mixer 304 .
  • the virtual audio mixer 304 may send the captured audio stream to the audio encoder/decoders, such as the audio encoder/decoder 316 - 1 , via the respective UC virtual sound card 328 - 1 of a selected UC application, such as the UC application 314 - 1 , among the available UC applications 314 .
  • Such selection of the UC application may be performed on-the-fly by a user or may be preprogrammed.
  • the audio encoder/decoder 316 - 1 may encode the received audio data stream using a predetermined real-time protocol known in the art, related art or later developed, and transmit the encoded audio data stream to a predetermined endpoint such as the communication device 102 or the terminals 104 .
  • the audio streams may be encoded using any of the known, related art, or later developed real-time audio protocols such as RTP (real-time transport protocol).
  • the audio encoder/decoders 316 - 2 and 316 - 3 may also operate in a similar manner.
  • the endpoints such as the communication device 102 and the terminals 104 may capture a video data stream of participants local to the endpoints and provide encoded video streams via the videoconferencing software applications being executed on each of these endpoints.
  • the provided encoded video streams may be received by the video decoders 320 - 1 , 320 - 2 , 320 - 3 (collectively, video decoders 320 ) of the respective UC applications 314 , which were used at the endpoints to send the video streams to the virtual multipoint control unit 300 .
  • the received video streams may be encoded in any of the known, related art, or later developed real-time video protocols such as H.264, H.261, or Scalable Video Coding (SVC).
  • the video decoders 320 may be configured to decode the received video streams and send the decoded video streams to the respective UC virtual cameras 326 - 1 , 326 - 2 , 326 - 3 (collectively, UC virtual cameras 326 ), which may route the decoded video streams to the virtual video mixer 302 .
  • the UC virtual cameras 326 allow the decoded video streams from multiple UC applications 314 to be provided for use simultaneously.
  • the virtual video mixer 302 may perform a summation of the received video streams to generate a single video stream for at least one of (1) minimizing video interference or background noise; and (2) representing the video images from different decoded video streams as tiles for display.
  • the decoded video streams may be displayed on a display device (not shown) (e.g., an interactive display, HDTV display, etc.) on the basis of the virtual video mixer 302 being preset to operate in continuous presence mode or the voice-switched mode.
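The tiled ("continuous presence") composition performed by the virtual video mixer can be sketched as laying out one rectangle per decoded stream in a near-square grid. The canvas dimensions and the grid arithmetic below are illustrative assumptions, not values from the disclosure.

```python
import math

def tile_layout(num_streams, canvas_w, canvas_h):
    """Return one (x, y, w, h) tile per stream, arranged row-major
    in a near-square grid over the output canvas."""
    cols = math.ceil(math.sqrt(num_streams))
    rows = math.ceil(num_streams / cols)
    tile_w, tile_h = canvas_w // cols, canvas_h // rows
    tiles = []
    for i in range(num_streams):
        r, c = divmod(i, cols)  # row-major placement
        tiles.append((c * tile_w, r * tile_h, tile_w, tile_h))
    return tiles

# Three participant streams on a 1920x1080 canvas -> a 2x2 grid
# with three cells used, each 960x540.
layout = tile_layout(3, 1920, 1080)
```

Rendering would then scale each decoded frame into its tile; in voice-switched mode the mixer would instead show only the active speaker's stream full-canvas.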
  • the local camera 308 may capture the video data stream of the local participants and send the captured video data stream to the virtual video mixer 302 .
  • the virtual video mixer 302 sends the captured video stream to a video encoder such as the video encoder 318 - 1 via the UC virtual camera 326 - 1 of a selected UC application, such as the UC application 314 - 1 , among the available UC applications 314 .
  • selection of the UC application 314 - 1 may be performed on-the-fly by a user or may be preprogrammed.
  • the video encoders 318 may encode the received video data stream using a predetermined real-time protocol known in the art, related art, or later developed, and transmit the encoded video data stream to a predetermined endpoint such as the communication device 102 or the terminals 104 .
  • the video streams may be encoded using any of the known in the art, related art, or later developed real-time protocols, e.g., session initiation protocol (SIP), H.261, H.263, H.264, H.323, etc.
  • FIG. 4 is a schematic implementation of a device driver module of the virtual multipoint control unit for communicating with hardware devices, according to an embodiment of the present disclosure. Illustrated embodiments are disclosed in the context of a video conference environment 400 including (1) a videoconferencing system 402 (e.g., CLEARONE Collaborate Room) comprising a video encoder 404 and a video decoder 406 ; (2) a UC application 408 comprising a UC video encoder 410 and a UC video decoder 412 ; (3) a physical camera 414 ; (4) a display device 416 ; and (5) the device driver module 306 of the virtual multipoint control unit 300 .
  • the device driver module 306 comprises a video mixer driver 418 and a video display mixer driver 420 .
  • the physical camera 414 may be in communication with the videoconferencing system 402 and the UC application 408 via the video mixer driver 418 .
  • the display device 416 may be in communication with the videoconferencing system 402 and the UC application 408 via the video display mixer driver 420 .
  • the operation of the virtual multipoint control unit 300 may be described in four different modes.
  • the physical camera 414 may capture a local image of the participants and provide a video stream for transmission.
  • the video mixer driver 418 may be configured to transmit the captured video stream to physical hardware such as the videoconferencing system 402 or to the UC application 408 based on a transmission mode selected by a user.
  • the transmission mode indicates which hardware is to be used for transmission.
  • the video mixer driver 418 guides the captured video stream to the video encoder 404 , which may encode the video stream using any of the known in the art, related art, or later developed encoding algorithms for real-time network transmission using a predetermined network protocol such as H.263, SIP, etc.
  • the video mixer driver 418 guides the captured video stream to the UC video encoder 410 , which may encode the video stream using any of the known in the art, related art, or later developed encoding algorithms for real-time network transmission using a predetermined network protocol such as H.263, SIP, etc.
  • the captured video stream may be simultaneously sent to the video encoder 404 and the UC video encoder 410 for encoding and network transmission.
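The mode-based routing performed by the video mixer driver can be sketched as a small dispatch over the selected transmission mode. The mode names ("room", "uc", "both") and the encoder interface below are hypothetical stand-ins, chosen only to illustrate the three routing cases described above.

```python
def route_capture(frame, mode, room_encoder, uc_encoder):
    """Route a captured frame to the room-system encoder, the UC
    application encoder, or both, depending on the selected mode."""
    outputs = []
    if mode in ("room", "both"):
        outputs.append(room_encoder(frame))  # e.g., videoconferencing system 402
    if mode in ("uc", "both"):
        outputs.append(uc_encoder(frame))    # e.g., UC application 408
    return outputs


# Stand-ins for the two encode paths; the codec labels are illustrative.
room = lambda f: ("H.263", f)
uc = lambda f: ("H.264", f)

both_paths = route_capture("frame", "both", room, uc)
```

The "both" case corresponds to the simultaneous transmission described in the bullet above, where one captured stream feeds two independent encode/transmit paths.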
  • the video display mixer driver 420 may be configured to manage an encoded video stream received from a network such as the network 106 via the videoconferencing system 402 .
  • the received video stream may be encoded using any of the known in the art, related art, or later developed algorithms according to predetermined network protocols.
  • the video decoder 406 may receive and decode the encoded video stream to generate a decoded video stream, which may be sent to the video mixer driver 418 .
  • the video mixer driver 418 may be configured to guide the decoded video stream to the video display mixer driver 420 , which may transmit the decoded video stream to the display device 416 for display.
  • the encoded video stream may be received by the UC video decoder 412 , which may be configured to decode the encoded video stream and send a generated decoded video stream to the display device 416 via the video display mixer driver 420 for display.
  • a virtual multipoint control unit 112 , 300 is in communication with a plurality of devices over a network 106 .
  • the virtual multipoint control unit 112 , 300 comprises a plurality of UC applications 208 , 314 , at least one virtual imaging device 214 , at least one virtual audio device 216 , a virtual video mixer 210 , and a virtual audio mixer 212 .
  • the plurality of UC applications 208 , 314 is executed on at least one of the plurality of devices 102 , 104 .
  • Each of the plurality of UC applications 208 , 314 receives an encoded audio data stream and an encoded video data stream.
  • the plurality of UC applications 208 , 314 decodes the received audio data and the video data streams.
  • the at least one virtual imaging device 214 is mapped to each of the plurality of UC applications 208 , 314 .
  • the at least one virtual audio device 216 is mapped to each of the plurality of UC applications 208 , 314 .
  • the virtual video mixer 210 is in communication with a physical imaging device 108 .
  • the virtual video mixer 210 receives the decoded video data stream from each of the plurality of UC applications 208 , 314 via the at least one virtual imaging device 214 .
  • the virtual audio mixer 212 is in communication with a physical audio device 110 .
  • the virtual audio mixer 212 receives the decoded audio data stream from each of the plurality of UC applications 208 , 314 via the at least one virtual audio device 216 .
  • a system for managing communication among a plurality of devices is disclosed.
  • the system is employed within a communication network 106 .
  • the system comprises a communication device 102 and a virtual multipoint control unit 112 , 300 .
  • the communication device 102 receives encoded audio and video data streams via a plurality of UC applications 208 , 314 .
  • the plurality of UC applications 208 , 314 decodes the received encoded audio data stream and the received encoded video data stream.
  • the virtual multipoint control unit 112 , 300 is in communication with the communication device 102 .
  • the virtual multipoint control unit 112 , 300 comprises at least one virtual imaging device 214 , at least one virtual audio device 216 , a virtual video mixer 210 , and a virtual audio mixer 212 .
  • the at least one virtual imaging device 214 is mapped to each of the plurality of UC applications 208 , 314 .
  • the at least one virtual audio device 216 is mapped to each of the plurality of UC applications 208 , 314 .
  • the virtual video mixer 210 is in communication with a physical imaging device 108 .
  • the virtual video mixer 210 receives the decoded video data stream from each of the plurality of UC applications 208 , 314 via the at least one virtual imaging device 214 .
  • the virtual audio mixer 212 is in communication with a physical audio device 110 .
  • the virtual audio mixer 212 receives the decoded audio data stream from each of the plurality of UC applications 208 , 314 via the at least one virtual audio device 216 .
  • a non-transitory computer readable medium storing a program of instructions executable by a computing device to perform a method for employing a virtual multipoint control unit 112 , 300 for unified communications is disclosed.
  • the method comprises receiving, from a plurality of terminals 104 , an encoded audio data stream and an encoded video data stream.
  • the method also comprises decoding, using a plurality of UC applications 208 , 314 , the received encoded audio data stream and the received encoded video data stream.
  • the method further comprises mapping, using a virtual multipoint control unit 112 , 300 , at least one virtual imaging device 214 and at least one virtual audio device 216 to each of the plurality of UC applications 208 , 314 .
  • the method also comprises communicating, using a virtual video mixer 210 , the decoded video data stream to a physical imaging device 108 via the at least one virtual imaging device 214 . Furthermore, the method comprises communicating, using a virtual audio mixer 212 , the decoded audio data stream to a physical audio device 110 via the at least one virtual audio device 216 .
  • a method for managing communication among a plurality of devices comprises receiving, from a plurality of terminals 104 , an encoded audio data stream and an encoded video data stream.
  • the method also comprises decoding, using a plurality of UC applications 208 , 314 , the received encoded audio data stream and the received encoded video data stream.
  • the method further comprises mapping, using a virtual multipoint control unit 112 , 300 , at least one virtual imaging device 214 and at least one virtual audio device 216 to each of the plurality of UC applications 208 , 314 .
  • the method also comprises communicating, using a virtual video mixer 210 , the decoded video data stream to a physical imaging device 108 via the at least one virtual imaging device 214 .
  • the method comprises communicating, using a virtual audio mixer 212 , the decoded audio data stream to a physical audio device 110 via the at least one virtual audio device 216 .
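The method steps recited above (receive, decode, map a virtual device, communicate to the mixer) can be sketched end-to-end in a few lines. Every name here is an illustrative placeholder; the decoder and sink are injected so the sketch stays independent of any particular UC application or codec.

```python
def run_session(encoded_streams, decoder, mixer_sink):
    """Receive -> decode per UC application -> route each decoded
    stream through its mapped virtual device into the shared mixer."""
    for app_id, payload in encoded_streams.items():
        decoded = decoder(payload)   # per-application decode step
        mixer_sink(app_id, decoded)  # virtual device forwards to mixer


collected = {}
run_session(
    {"lync": "enc-a", "skype": "enc-b"},        # streams from two terminals
    decoder=lambda p: p.replace("enc-", "dec-"),  # stand-in decode
    mixer_sink=lambda k, v: collected.__setitem__(k, v),
)
```

After the loop, the mixer-side dictionary holds one decoded stream per UC application, mirroring how the virtual video mixer 210 and virtual audio mixer 212 each receive a decoded stream from every application via its mapped virtual device.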

Abstract

This disclosure describes a virtual multipoint control unit (112, 300) for unified communications. The virtual multipoint control unit (112, 300) communicates with multiple devices (102, 104) over a network (106). The control unit (112, 300) includes multiple unified communication (UC) applications (208, 314) being executed on the devices (102, 104). The UC applications (208, 314) decode the received audio and video data streams. A virtual imaging device (214) and a virtual audio device (216) are mapped to the UC applications (208, 314). A virtual video mixer (210) receives the decoded video data stream from the UC applications (208, 314) via the virtual imaging device (214). A virtual audio mixer (212) receives the decoded audio data stream from the UC applications (208, 314) via the virtual audio device (216).

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims priority and the benefits of the earlier filed Provisional U.S. Application No. 61/859,358, filed Jul. 29, 2013, which is incorporated by reference for all purposes into this specification.
  • TECHNICAL FIELD
  • This disclosure relates to unified communication systems. More specifically, this disclosure relates to a system for a virtual multipoint control unit for unified communications.
  • BACKGROUND ART
  • Several Unified Communications (UC) software applications have proliferated into the enterprise communications market. These applications allow end-users to communicate using voice, video, application and data sharing, instant messaging, etc. A few examples of these applications include Microsoft Lync, Skype, IBM Sametime, Cisco IP Communicator, etc.
  • Most of these applications use different call control protocols and media formats (i.e. video/audio encoding and decoding formats), making them incompatible with each other. For example, Microsoft Lync uses a proprietary implementation of the Session Initiation Protocol (SIP) that is different from that being implemented by Skype for call negotiation. In addition, various UC applications typically operate with different media formats, thereby causing these applications to become incompatible with each other for communicating audio, video, or data streams. As a result, the communication among these applications is facilitated by a multipoint control unit, which may be implemented as standalone hardware or a web-based service via a central host performing call signaling and media transcoding. The standalone hardware is extremely costly whereas the web-based central host does not support ad-hoc communication among the UC applications running on terminals that are not connected to the same central host.
  • Further, the multipoint control unit typically allows multiple UC applications to access local hardware devices in a mutually exclusive manner. For example, video data streams from one UC application are temporarily restricted from being played on a local hardware device such as a display device while the device is in use by another UC application. As a result, the multipoint control unit induces an unwanted delay during execution of simultaneously received data streams from different UC applications.
  • Therefore, there exists a need for a system that allows multiple UC applications to access the local hardware devices simultaneously for ad-hoc unified communications.
  • SUMMARY OF INVENTION
  • This disclosure describes a system for a virtual multipoint control unit for unified communications.
  • In one embodiment, a virtual multipoint control unit is in communication with a plurality of devices over a network. The virtual multipoint control unit comprises a plurality of unified communication (UC) applications, at least one virtual imaging device, at least one virtual audio device, a virtual video mixer, and a virtual audio mixer. The plurality of unified communication (UC) applications is executed on at least one of the plurality of devices. Each of the plurality of UC applications receives an encoded audio data stream and an encoded video data stream. The plurality of UC applications decodes the received audio data and the video data streams. The at least one virtual imaging device is mapped to each of the plurality of UC applications. The at least one virtual audio device is mapped to each of the plurality of UC applications. The virtual video mixer is in communication with a physical imaging device. The virtual video mixer receives the decoded video data stream from each of the plurality of UC applications via the at least one virtual imaging device. The virtual audio mixer is in communication with a physical audio device. The virtual audio mixer receives the decoded audio data stream from each of the plurality of UC applications via the at least one virtual audio device.
  • In another embodiment, a system for managing communication among a plurality of devices is disclosed. The system is employed within a communication network. The system comprises a communication device and a virtual multipoint control unit. The communication device receives encoded audio and video data streams via a plurality of unified communication (UC) applications. The plurality of UC applications decodes the received encoded audio data stream and the received encoded video data stream. The virtual multipoint control unit is in communication with the communication device. The virtual multipoint control unit comprises at least one virtual imaging device, at least one virtual audio device, a virtual video mixer, and a virtual audio mixer. The at least one virtual imaging device is mapped to each of the plurality of UC applications. The at least one virtual audio device is mapped to each of the plurality of UC applications. The virtual video mixer is in communication with a physical imaging device. The virtual video mixer receives the decoded video data stream from each of the plurality of UC applications via the at least one virtual imaging device. The virtual audio mixer is in communication with a physical audio device. The virtual audio mixer receives the decoded audio data stream from each of the plurality of UC applications via the at least one virtual audio device.
  • In yet another embodiment, a non-transitory computer readable medium storing a program of instructions executable by a computing device to perform a method for employing a virtual multipoint control unit for unified communications is disclosed. The method comprises receiving, from a plurality of terminals, an encoded audio data stream and an encoded video data stream. The method also comprises decoding, using a plurality of unified communication (UC) applications, the received encoded audio data stream and the received encoded video data stream. The method further comprises mapping, using a virtual multipoint control unit, at least one virtual imaging device and at least one virtual audio device to each of the plurality of UC applications. The method also comprises communicating, using a virtual video mixer, the decoded video data stream to a physical imaging device via the at least one virtual imaging device. Furthermore, the method comprises communicating, using a virtual audio mixer, the decoded audio data stream to a physical audio device via the at least one virtual audio device.
  • In still another embodiment, a method for managing communication among a plurality of devices is disclosed. The method comprises receiving, from a plurality of terminals, an encoded audio data stream and an encoded video data stream. The method also comprises decoding, using a plurality of unified communication (UC) applications, the received encoded audio data stream and the received encoded video data stream. The method further comprises mapping, using a virtual multipoint control unit, at least one virtual imaging device and at least one virtual audio device to each of the plurality of UC applications. The method also comprises communicating, using a virtual video mixer, the decoded video data stream to a physical imaging device via the at least one virtual imaging device. Furthermore, the method comprises communicating, using a virtual audio mixer, the decoded audio data stream to a physical audio device via the at least one virtual audio device.
  • Other and further aspects and features of the disclosure will be evident from reading the following detailed description of the embodiments, which are intended to illustrate, and not limit, the present disclosure.
  • BRIEF DESCRIPTION OF DRAWINGS
  • To further aid in understanding the disclosure, the attached drawings help illustrate specific features of the disclosure and the following is a brief description of the attached drawings:
  • FIG. 1A is a schematic that illustrates a first environment for implementing an exemplary virtual multipoint control unit, according to an embodiment of the present disclosure.
  • FIG. 1B is a schematic that illustrates a second environment for implementing the exemplary virtual multipoint control unit of FIG. 1A, according to an embodiment of the present disclosure.
  • FIG. 2A is a schematic that illustrates the exemplary virtual multipoint control unit of FIG. 1A, according to an embodiment of the present disclosure.
  • FIG. 2B is a schematic that illustrates an exemplary unified communication application included in the virtual multipoint control unit of FIG. 1A, according to an embodiment of the present disclosure.
  • FIG. 3 is a schematic that illustrates a virtual multipoint control unit 300 managing a multipoint video conference, according to an embodiment of the present disclosure.
  • FIG. 4 is a schematic illustrating implementation of a device driver module of the virtual multipoint control unit for communicating with hardware devices for conducting a video conference, according to an embodiment of the present disclosure.
  • DISCLOSURE OF EMBODIMENTS
  • This disclosure describes a system for a virtual multipoint control unit for unified communications. This disclosure describes numerous specific details in order to provide a thorough understanding of the present invention. One skilled in the art will appreciate that one may practice the present invention without these specific details. Additionally, this disclosure does not describe some well known items in detail in order not to obscure the present invention.
  • In various embodiments of the present disclosure, definitions of one or more terms that will be used in the document are provided below.
  • An endpoint refers to one or more computing devices capable of establishing a communication channel for exchange of audio, video, textual, or symbolic data in a communication session. Examples of the computing devices may include, but are not limited to, a desktop PC, a personal digital assistant (PDA), a server, a mainframe computer, a mobile computing device (e.g., mobile phones, laptops, tablets, etc.), an internet appliance, and calling devices (e.g., a telephone, an internet phone, a video telephone, etc.).
  • The numerous references in the disclosure to a system for a virtual multipoint control unit are intended to cover any and/or all devices capable of performing respective operations on endpoints in a unified communication application-based conferencing environment relevant to the applicable context, regardless of whether or not the same are specifically provided.
  • FIG. 1A is a schematic that illustrates a first environment for implementing an exemplary virtual multipoint control unit, according to an embodiment of the present disclosure. Embodiments are disclosed in the context of environments that represent a multipoint video conference among multiple users via respective endpoints capable of executing one or more computer applications for unified communications in the same communication session. However, other embodiments may be applied in the context of other scenarios (e.g., an audio conference, a webinar, a multiplayer online game, etc.) involving at least one of audio, video, textual, or symbolic (e.g., emoticons, images, etc.) data being communicated among various endpoints in the same communication session. In some embodiments, at least one of the endpoints may execute a unified communication application (UC application) during the session.
  • The first network environment 100 may comprise multiple endpoints including a communication device 102 configured to communicate with terminals 104-1, 104-2, 104-3, 104-4, and 104-5 (collectively, terminals 104) via a network 106. The network 106 may comprise, for example, one or more of the Internet, Wide Area Networks (WANs), Local Area Networks (LANs), analog or digital wired and wireless telephone networks (e.g., a PSTN, Integrated Services Digital Network (ISDN), a cellular network, and Digital Subscriber Line (xDSL)), radio, television, cable, satellite, and/or any other delivery or tunneling mechanism for carrying data.
  • The network 106 may comprise multiple networks or sub-networks, each of which may comprise, for example, a wired or wireless data pathway. The network 106 may comprise a circuit-switched voice network, a packet-switched data network, or any other network that is able to carry electronic communications. For example, the network 106 may comprise networks based on the Internet protocol (IP) or asynchronous transfer mode (ATM), and may support voice using, for example, VoIP, Voice-over-ATM, or other comparable protocols used for voice data communications. In some embodiments, the network 106 may comprise a cellular telephone network configured to enable exchange of textual data, audio data, video data, or any combination thereof between the communication device 102 and at least one of the terminals 104.
  • The communication device 102 may comprise or be coupled with one or more hardware devices either wirelessly or in a wired fashion for enabling a user to dynamically interact with other users via the endpoints connected to the network 106. For example, the communication device 102 may be coupled with an imaging device 108 (including, but not limited to, a video camera, a webcam, a scanner, or any combination thereof) and an audio device 110 (including, but not limited to, a speaker, a microphone, or any combination thereof). The communication device 102 may be compatible with any other device (not shown) connected to the network 106 to exchange audio, video, textual or symbolic data streams with each other or any other compatible devices.
  • In one embodiment, the communication device 102 may comprise a virtual multipoint control unit 112 configured to perform at least one of the following: (1) create a logical representation of one or more hardware devices in communication with the communication device 102; (2) establish a communication bridge or channel between the UC applications executed by the communication device 102 and the corresponding UC applications being executed by at least one of the terminals 104; (3) store, manage, and process the multimodal input data streams received from the endpoints and/or associated devices such as the imaging device 108 and the audio device 110 connected to the network 106; and (4) request services from, deliver services to, or both, various devices connected to the network 106.
  • In one embodiment, the virtual multipoint control unit 112 may facilitate integration of real-time and non-real-time communication services by bridging communication among various UC applications being simultaneously executed on various endpoints, such as the communication device 102 and the terminals 104. Examples of such real-time services may include, but are not limited to, instant messaging, internet protocol (IP) telephony, video conferencing, desktop sharing, data sharing, call control, and speech recognition. Examples of these non-real-time services may include, but are not limited to, voicemail, E-mail, SMS, and fax. Further, examples of UC applications may include, but are not limited to, Microsoft Lync, Skype, IBM Sametime, and Cisco IP Communicator. Each of the UC applications may operate with same or different communication protocols and media formats.
  • In another embodiment, the virtual multipoint control unit 112 may be implemented as a standalone and dedicated “black box” including hardware and installed software, where the hardware is closely matched to the requirements and/or functionality of the software. Alternatively, the virtual multipoint control unit 112 may be implemented as a software application or a device driver. The virtual multipoint control unit 112 may enhance or increase the functionality and/or capacity of the network 106 to which it is connected. The virtual multipoint control unit 112 may be further configured, for example, to perform e-mail tasks, security tasks, network management tasks including IP address management, and other tasks.
  • In yet another embodiment, the virtual multipoint control unit 112 may be configured to expose its computing environment or operating code to the user, and may comprise related art I/O devices, such as a camera, speaker, scanner, keyboard, or display. The virtual multipoint control unit 112 may, however, comprise software, firmware, or other resources that support remote administration and/or maintenance of the virtual multipoint control unit 112.
  • In a further embodiment, the virtual multipoint control unit 112 may comprise at least one processor (not shown) executing machine readable program instructions for performing various operations, such as those discussed above, on the received multimodal input audio, video, textual, or symbolic data stream. The virtual multipoint control unit 112 may comprise, in whole or in part, a software application working alone or in conjunction with one or more hardware resources. Such software applications may be executed by the processor on different hardware platforms or emulated in a virtual environment, discussed below in greater detail. Aspects of the virtual multipoint control unit 112 may leverage known, related art, or later developed off-the-shelf software.
  • Now turning to FIG. 1B, a second network environment 150 is provided. In this embodiment, the virtual multipoint control unit 112 may be integrated with, or installed on, a network appliance 152 that is associated with or used to establish the network 106. The network appliance 152 may be capable of operating as an interface device to assist exchange of program instructions and data between the communication device 102 and the terminals 104. In some embodiments, the network appliance 152 may be preconfigured or dynamically configured to comprise the virtual multipoint control unit 112 integrated with other devices. For example, the virtual multipoint control unit 112 may be integrated with the communication device 102 or any other device, such as at least one of the terminals 104 connected to the network 106. The communication device 102 may comprise a module (not shown), which introduces the communication device 102 to the network appliance 152, thereby enabling the network appliance 152 to invoke the virtual multipoint control unit 112 as a service. Examples of the network appliance 152 may include, but are not limited to, a DSL modem, a wireless access point, a router, a base station, and a gateway having a predetermined computing power sufficient for implementing the virtual multipoint control unit 112.
  • The virtual multipoint control unit 112 and the communication device 102 may collectively constitute a unified communication system, which may reside in a single device or may be distributed across multiple devices. The unified communication system may be implemented in hardware or a suitable combination of hardware and software, and may comprise one or more software systems operating on a digital signal processing platform. The “hardware” may comprise a combination of discrete components, an integrated circuit, an application-specific integrated circuit, a field programmable gate array, a digital signal processor, or other suitable hardware. The “software” may comprise one or more objects, agents, threads, lines of code, subroutines, separate software applications, two or more lines of code or other suitable software structures operating in one or more software applications or on one or more processors. Embodiments may comprise the unified communication system operating as or in a mobile switching center, network gateway system, Internet access node, application server, IMS core, service node, or some other communication system, including any combination thereof.
  • Similar to the communication device 102, each of the terminals 104 may be associated with various devices, which may include, but are not limited to, a camera, a display device, a microphone, speakers, one or more codecs, or any other type of conferencing hardware, in any combination thereof. The terminals 104 may provide video, voice, and data communications capabilities (e.g., videoconferencing capabilities) by being coupled to or including various audio devices (e.g., microphones, audio input devices, speakers, audio output devices, telephones, speaker telephones, etc.), various video devices (e.g., monitors, projectors, displays, televisions, video output devices, video input devices, cameras, etc.), various networks (IP, PSTN, etc.), or any combination thereof. Each of the terminals 104 may comprise or implement one or more real-time protocols, e.g., session initiation protocol (SIP), H.261, H.263, H.264, and H.323, among others.
  • FIG. 2A illustrates the exemplary virtual multipoint control unit of FIG. 1A, according to an embodiment of the present disclosure. The virtual multipoint control unit 112 may comprise one or more processor(s) 202, one or more interface(s) 204, and a memory module 206. The processor(s) 202 may execute a machine readable program comprising instructions for manipulating the received video signal. The processor(s) 202 may comprise, for example, microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuits, and/or any devices that manipulate signals based on operational instructions. The virtual multipoint control unit 112 may further comprise, in whole or in part, a software application working alone or in conjunction with one or more hardware resources. Such software applications may be executed by one or more processors on different hardware platforms or emulated in a virtual environment. Aspects of the virtual multipoint control unit 112 may leverage known, related art, or later developed off-the-shelf software. Among other capabilities, the processor(s) 202 may be configured to fetch and execute instructions in computer readable memory module 206.
  • The interface(s) 204 may coordinate interactions of the virtual multipoint control unit 112 with at least one of the communication device 102 and the terminals 104 over the network 106. The interface(s) 204 may comprise a variety of known, related art, or later developed interfaces, such as (1) software interfaces, for example, an application programming interface, a graphical user interface, etc.; (2) hardware interfaces, for example, cable connectors, keyboards, touchscreens, scanners, display screens, etc.; or both. The interface(s) 204 facilitate receiving the audio, video, or data signals and the reliable broadcast or multicast transmission of one or more encoded output signals.
  • The memory module 206 may comprise any computer-readable medium known in the art, comprising, for example, volatile memory (e.g., RAM) and/or non-volatile memory (e.g., flash, etc.). The memory module 206 may comprise UC applications 208-1, . . . , 208-N (collectively referred to as UC applications 208), a virtual video mixer 210, a virtual audio mixer 212, virtual imaging devices 214-1, . . . 214-N (collectively, virtual imaging devices 214), virtual audio devices 216-1, . . . , 216-N (collectively, virtual audio devices 216), and a device driver module 218.
  • The virtual multipoint control unit 112 may be configured to implement various real-time and non-real-time communication protocols for rendering and transmitting various audio, video, textual, or symbolic communication signals from the UC applications 208. In some embodiments, the virtual multipoint control unit 112 may be configured to determine various characteristics of the endpoints, such as the communication device 102 and the terminals 104, for handling the respective received signals. The characteristics may include, but are not limited to, the type of endpoint (e.g., a mobile phone, a laptop, an IP television, etc.), supported video resolution, supported codecs, network connection speed, and so on.
  • The virtual video mixer 210 may be configured to receive a decoded video stream from each of the UC applications 208 or from a physical local imaging device such as the imaging device 108, which is associated with the communication device 102. The virtual video mixer 210 may be further configured to switch between the various video streams received from different UC applications 208, either tiling the video streams received from each of the UC applications 208 while operating in a continuous presence mode, or continuously displaying the video stream of the UC application 208 from which an audio stream is actively received while operating in a voice-switched mode.
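The voice-switched behavior described above, in which the displayed stream follows whichever participant is actively speaking, can be sketched in a few lines. This is a minimal illustrative Python sketch, not the patent's implementation; the function name, the per-participant audio-level inputs, and the activity threshold are all assumptions.

```python
def select_stream(audio_levels, threshold=0.1):
    """Pick the video stream to display in voice-switched mode.

    `audio_levels` is a hypothetical mapping from participant id to a
    normalized audio energy level. Returns the loudest participant whose
    level clears the activity threshold, or None when nobody is speaking.
    """
    if not audio_levels:
        return None
    speaker = max(audio_levels, key=audio_levels.get)
    return speaker if audio_levels[speaker] >= threshold else None
```

In a real mixer the decision would typically be smoothed over time (hold-over timers, hysteresis) so that the display does not flicker between speakers on short pauses.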
  • The virtual audio mixer 212 may be configured to receive an input audio stream either from a physical local audio device such as the audio device 110 (e.g., a microphone) or from a virtual audio device 216 (e.g., a virtual sound card) in communication with each of the UC applications 208. The virtual audio mixer may use the received audio streams to create a mixed and re-encoded audio stream to be played locally or transmitted to one or more UC applications 208 simultaneously. The re-encoded audio stream excludes the input audio stream received from its corresponding virtual source, discussed below in greater detail.
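The exclusion described in the last sentence, where each recipient's mix omits that recipient's own contribution, is commonly called a mix-minus. A minimal Python sketch of the idea, assuming equal-length integer PCM sample lists; the helper name and data layout are hypothetical and not taken from the patent:

```python
def mix_minus(streams):
    """For each source, return a mix of all the OTHER sources' samples.

    `streams` maps a source name to a list of PCM samples (all equal length).
    Computing the full sum once and subtracting each source's own samples
    avoids re-summing per recipient.
    """
    total = [sum(samples) for samples in zip(*streams.values())]
    return {
        name: [t - s for t, s in zip(total, samples)]
        for name, samples in streams.items()
    }

mixes = mix_minus({"uc1": [1, 2], "uc2": [10, 20], "mic": [100, 200]})
```

Each UC application thus receives the other participants and the local microphone, but never an echo of its own audio.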
  • The virtual imaging devices 214 and the virtual audio devices 216 may be logical representations of an imaging device (e.g., a camera) and an audio device (e.g., a speaker, a microphone, etc.) with no physical counterparts. The virtual imaging device 214 may be an image derived from the virtual video mixer 210. The virtual imaging device 214 may provide either a tiled image of the active participants or an image of the active speaker, received from the respective UC applications, depending on whether the virtual video mixer 210 is operating in the mixer (i.e., continuous presence) mode or the voice-switched mode.
  • The virtual audio devices 216 may be virtual representations of an audio device, which may be emulated by each of the UC applications 208. Similar to the virtual imaging device 214, the virtual audio device 216 has no physical counterpart; it derives its input audio stream from the virtual audio mixer 212 and also outputs audio streams to the virtual audio mixer 212. One of the virtual imaging devices 214 and one of the virtual audio devices 216 may be created for each of the UC applications 208 to independently support the audio and video data streams being received from or sent to the endpoints, or any other unified communication system, simultaneously.
  • The device driver module 218 may comprise device drivers, which are software programs that introduce hardware devices such as endpoints (e.g., the communication device 102 and the terminals 104) and associated devices (e.g., the imaging device 108 and the audio device 110) to the virtual multipoint control unit 112. The device drivers handle software instructions received from the UC applications 208 for accessing the hardware devices and associated resources (e.g., attached peripheral devices, hardware memory, etc.) without causing conflicts.
  • Further, as shown in FIG. 2B, at least one of the UC applications, such as the UC application 208-1, may comprise an audio encoder/decoder 220, a video encoder/decoder 222, a display window 224, and a virtual source 226. The audio encoder/decoder 220 may be configured to decode and encode the audio streams received from or sent to the virtual audio mixer 212, respectively, using known, related art, or later developed encoding/decoding protocols and standards. Similarly, the video encoder/decoder 222 may be configured to decode and encode the video streams received from or sent to the virtual video mixer 210, respectively, using known, related art, or later developed encoding/decoding protocols and standards. The display window 224 refers to an output software interface configured to display the video streams/signals. The virtual source 226 may be configured to provide a decoded video data stream to the virtual video mixer 210 via the display window 224.
  • FIG. 3 is a schematic that illustrates a virtual multipoint control unit 300 managing a multipoint video conference, according to an embodiment of the present disclosure. In one embodiment, the virtual multipoint control unit 300 may comprise a virtual video mixer 302, a virtual audio mixer 304, and a device driver module 306. The virtual video mixer 302 may be in communication with one or more physical local imaging devices such as a camera 308 and the device driver module 306 may be in communication with one or more physical local audio devices such as a local speaker 310 and a local microphone 312. The device driver module 306 comprises device drivers for operating or managing the respective one or more physical hardware devices such as the camera 308, the local speaker 310, and the microphone 312.
  • The virtual multipoint control unit 300 may further comprise various videoconferencing software applications, such as UC applications 314-1, 314-2, and 314-3 (collectively, UC applications 314), for conducting a video conference among multiple endpoints, e.g., the communication device 102 and the terminals 104. The first UC application 314-1 comprises a first audio encoder/decoder 316-1, a first video encoder 318-1, a first video decoder 320-1, a first display window 322-1, and a first virtual source 324-1. Similarly, the second UC application 314-2 comprises a second audio encoder/decoder 316-2, a second video encoder 318-2, a second video decoder 320-2, a second display window 322-2, and a second virtual source 324-2. The third UC application 314-3 comprises a third audio encoder/decoder 316-3, a third video encoder 318-3, a third video decoder 320-3, a third display window 322-3, and a third virtual source 324-3.
  • The display windows 322-1, 322-2, 322-3 (collectively, display windows 322) may be configured to receive the decoded video streams from the respective video decoders 320-1, 320-2, 320-3 (collectively, video decoders 320) for displaying the video streams on a local display device such as an HDTV display.
  • The virtual sources 324-1, 324-2, 324-3 (collectively, virtual sources 324) may be configured to receive the decoded video data streams via the respective display windows 322 for being sent to the virtual video mixer 302, which combines the received video streams and renders a single video stream (e.g., to display images from each of the video streams in a tiled format) to reduce the required network bandwidth for display or recording.
  • Each of the UC applications 314 may communicate with a respective set of UC virtual devices to allow the UC applications 314 to access the physical hardware devices, which are in communication with the virtual multipoint control unit 300 simultaneously. For example, the UC application 314-1 may communicate with a first set of UC virtual devices including a first UC virtual camera 326-1 and a first UC virtual sound card 328-1, and the UC application 314-2 may communicate with a second set of UC virtual devices including a second UC virtual camera 326-2 and a second UC virtual sound card 328-2. Similarly, the UC application 314-3 may communicate with a third set of UC virtual devices including a third UC virtual camera 326-3 and a third UC virtual sound card 328-3.
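The per-application device sets described above can be sketched as a simple provisioning step: one virtual camera and one virtual sound card are created for each UC application. This is an illustrative Python sketch under assumed names (`VirtualDevice`, `provision`); the patent does not specify this structure.

```python
class VirtualDevice:
    """A logical device with no physical counterpart (illustrative only)."""

    def __init__(self, kind, app):
        self.kind = kind  # e.g. "camera" or "sound_card"
        self.app = app    # the UC application this device is mapped to


def provision(apps):
    """Create one virtual camera and one virtual sound card per UC
    application, so each application can hold its own device handles
    while sharing the underlying physical hardware through the mixers."""
    return {
        app: {
            "camera": VirtualDevice("camera", app),
            "sound_card": VirtualDevice("sound_card", app),
        }
        for app in apps
    }
```

Because every application gets its own virtual pair, no application needs exclusive access to the single physical camera or sound device.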
  • Further, each set of the UC virtual devices may emulate a physical device by defining a software configuration and a related hardware configuration, both being compatible with the respective UC applications. Correspondingly, each of the UC virtual cameras 326-1, 326-2, 326-3 (collectively, UC virtual cameras 326) may be a logical representation or an image of an imaging device (e.g., a camera) without being mapped to, or corresponding to, a physical camera. Similarly, each of the UC virtual sound cards 328-1, 328-2, 328-3 (collectively, UC virtual sound cards 328) may refer to a logical representation or an image of an audio device (e.g., a speaker or a microphone) without being mapped to, or corresponding to, a physical audio device. The UC virtual cameras 326 may be derived from the virtual video mixer 302, and the UC virtual sound cards 328 may be derived from the virtual audio mixer 304, by the respective UC applications 314.
  • During a video conference, multiple endpoints such as the communication device 102 and the terminals 104 may execute various videoconferencing software applications similar to the UC applications 314 in the virtual multipoint control unit 300. Each of these videoconferencing software applications may interact with the corresponding UC applications 314 in the virtual multipoint control unit 300 during a single communication session to conduct the video conference, since the videoconferencing software applications may not be compatible with each other due to different control protocols and media formats being used. The virtual multipoint control unit 300 may operate in four distinct modes, namely, inbound audio mode, inbound video mode, outbound audio mode, and outbound video mode, to handle any audio, video, or data streams to and from these UC applications 314, while simultaneously providing one or more local hardware devices to each of the participating UC applications 314.
  • In the inbound audio mode, the endpoints such as the communication device 102 and the terminals 104 may capture a local audio data stream and provide encoded audio streams via the UC applications 314 (e.g., Microsoft Lync, Skype, Cisco IP Communicator, etc.) executing on each of these endpoints. The provided encoded audio streams may be received by the audio encoder/decoders 316-1, 316-2, 316-3 (collectively, audio encoders/decoders 316) of the same UC applications 314 running in the virtual multipoint control unit 300. The received audio streams may be encoded in any of the known, related art, or later developed real-time audio protocols such as RTP (real-time transport protocol). The audio encoder/decoders 316 may be configured to decode the received audio streams and send the decoded audio streams to the respective UC virtual sound cards 328, which may route the decoded audio streams to the virtual audio mixer 304 via the virtual video mixer 302. The UC virtual sound cards 328 make it possible to provide the decoded audio streams to the virtual audio mixer 304 from multiple UC applications 314 simultaneously.
  • The decoded audio streams may also drive the virtual video mixer 302 when it is configured to operate in the voice-switched mode. The virtual audio mixer 304 may be configured to perform a summation of the received audio streams to generate a single audio stream so as to minimize audio interference or background noise, as known in the art. The generated audio stream may be sent to one or more physical audio devices such as the local speaker 310. When the generated audio stream is played by the local speaker 310, the virtual audio mixer 304 may send a notification to the virtual video mixer 302 to allow synchronization between the audio and video streams. In some embodiments, the virtual video mixer 302 may add delay to a received video stream to synchronize it with the audio stream.
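The summation step above can be sketched as follows: the decoded streams are summed sample-by-sample into one output stream, with the result clamped to the valid sample range so that loud overlapping speech does not wrap around. A minimal Python sketch assuming signed 16-bit PCM; the function name and the simple clamp (rather than a proper limiter or normalization, which a production mixer would use) are assumptions.

```python
def sum_streams(streams, limit=32767):
    """Sum several decoded 16-bit PCM streams (equal-length sample lists)
    into a single stream, clamping each mixed sample to [-32768, 32767]."""
    mixed = []
    for samples in zip(*streams):
        s = sum(samples)
        mixed.append(max(-limit - 1, min(limit, s)))
    return mixed
```

The single mixed stream is then what would be handed to the local speaker, with the mixer free to notify the video side once playback begins.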
  • In the outbound audio mode, the local microphone 312 may capture an audio data stream of the local participants and send the captured audio data stream to the virtual audio mixer 304. The virtual audio mixer 304 may send the captured audio stream to the audio encoder/decoders, such as the audio encoder/decoder 316-1, via the respective UC virtual sound card 328-1 of a selected UC application, such as the UC application 314-1, among the available UC applications 314. Such selection of the UC application may be performed on the fly by a user or may be preprogrammed. The audio encoder/decoder 316-1 may encode the received audio data stream using a predetermined real-time protocol, known in the art, related art, or later developed, and transmit the encoded audio data stream to a predetermined endpoint such as the communication device 102 or the terminals 104. The audio streams may be encoded using any of the known, related art, or later developed real-time audio protocols such as RTP (real-time transport protocol). The audio encoder/decoders 316-2 and 316-3 may operate in a similar manner.
  • In the inbound video mode, the endpoints such as the communication device 102 and the terminals 104 may capture a video data stream of participants local to the endpoints and provide encoded video streams via the videoconferencing software applications being executed on each of these endpoints. The provided encoded video streams may be received by the video decoders 320-1, 320-2, 320-3 (collectively, video decoders 320) of the respective UC applications 314, which were used at the endpoints to send the video streams to the virtual multipoint control unit 300. The received video streams may be encoded in any of the known, related art, or later developed real-time video protocols, such as H.264, H.261, or Scalable Video Coding (SVC). The video decoders 320 may be configured to decode the received video streams and send the decoded video streams to the respective UC virtual cameras 326-1, 326-2, 326-3 (collectively, UC virtual cameras 326), which may route the decoded video streams to the virtual video mixer 302.
  • The UC virtual cameras 326 make it possible to provide the decoded video streams for use by multiple UC applications 314 simultaneously. The virtual video mixer 302 may perform a summation of the received video streams to generate a single video stream for at least one of (1) minimizing video interference or background noise; and (2) representing the video images from the different decoded video streams as tiles for display. The decoded video streams may be displayed on a display device (not shown) (e.g., an interactive display, HDTV display, etc.) depending on whether the virtual video mixer 302 is preset to operate in the continuous presence mode or the voice-switched mode.
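The tiled (continuous presence) composition above amounts to partitioning the output frame into a grid, one cell per participant stream. The layout arithmetic can be sketched as follows; this is an illustrative Python sketch of the grid geometry only (the actual pixel compositing, scaling, and letterboxing are omitted), and the function name is an assumption.

```python
import math

def tile_layout(n, width, height):
    """Compute (x, y, w, h) rectangles for n participant streams arranged
    in a near-square continuous-presence grid on a width x height frame."""
    cols = math.ceil(math.sqrt(n))       # e.g. 4 streams -> 2 columns
    rows = math.ceil(n / cols)           # -> 2 rows
    w, h = width // cols, height // rows
    return [((i % cols) * w, (i // cols) * h, w, h) for i in range(n)]
```

For four participants on a 1920x1080 frame this yields the familiar 2x2 grid of 960x540 tiles.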
  • In the outbound video mode, the local camera 308 may capture the video data stream of the local participants and send the captured video data stream to the virtual video mixer 302. The virtual video mixer 302 sends the captured video stream to a video encoder, such as the video encoder 318-1, via the UC virtual camera 326-1 of a selected UC application, such as the UC application 314-1, among the available UC applications 314. Such selection of the UC application 314-1 may be performed on the fly by a user or may be preprogrammed. The video encoders 318 may encode the received video data stream using a predetermined real-time protocol, known in the art, related art, or later developed, and transmit the encoded video data stream to a predetermined endpoint such as the communication device 102 or the terminals 104. The video streams may be encoded and transmitted using any of the known, related art, or later developed real-time protocols, e.g., session initiation protocol (SIP), H.261, H.263, H.264, H.323, etc.
  • FIG. 4 is a schematic implementation of a device driver module of the virtual multipoint control unit for communicating with hardware devices, according to an embodiment of the present disclosure. The illustrated embodiments are disclosed in the context of a video conference environment 400 including (1) a videoconferencing system 402 (e.g., CLEARONE Collaborate Room) comprising a video encoder 404 and a video decoder 406; (2) a UC application 408 comprising a UC video encoder 410 and a UC video decoder 412; (3) a physical camera 414; (4) a display device 416; and (5) the device driver module 306 of the virtual multipoint control unit 300. The device driver module 306 comprises a video mixer driver 418 and a video display mixer driver 420. The physical camera 414 may be in communication with the videoconferencing system 402 and the UC application 408 via the video mixer driver 418. The display device 416 may be in communication with the videoconferencing system 402 and the UC application 408 via the video display mixer driver 420.
  • The operation of the virtual multipoint control unit 300 may be described in four different modes. In the first mode, the physical camera 414 may capture a local image of the participants and provide a video stream for transmission. The video mixer driver 418 may be configured to transmit the captured video stream to physical hardware such as the videoconferencing system 402, or to the UC application 408, based on a transmission mode selected by a user. When the transmission mode indicates that hardware is to be used for transmission, the video mixer driver 418 guides the captured video stream to the video encoder 404, which may encode the video stream using any of the known, related art, or later developed encoding algorithms for real-time network transmission using a predetermined network protocol such as H.263, SIP, etc.
  • When the transmission mode indicates that the UC application 408 is to be used for transmission, the video mixer driver 418 guides the captured video stream to the UC video encoder 410, which may encode the video stream using any of the known, related art, or later developed encoding algorithms for real-time network transmission using a predetermined network protocol such as H.263, SIP, etc. In some embodiments, the captured video stream may be simultaneously sent to the video encoder 404 and the UC video encoder 410 for encoding and network transmission.
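The routing decision the video mixer driver makes, sending a captured frame to the hardware encoder, to the UC application's encoder, or to both, can be sketched as a small dispatch function. This is an illustrative Python sketch; the mode names and the encoder callables are hypothetical stand-ins for the video encoder 404 and the UC video encoder 410.

```python
def route_capture(frame, mode, hw_encode, uc_encode):
    """Route a captured frame per the selected transmission mode.

    mode: "hardware" (encoder 404 only), "uc" (UC encoder 410 only),
    or "both" (simultaneous transmission, as in some embodiments).
    Returns a dict of the encoded outputs produced.
    """
    out = {}
    if mode in ("hardware", "both"):
        out["hardware"] = hw_encode(frame)
    if mode in ("uc", "both"):
        out["uc"] = uc_encode(frame)
    return out
```

A user-selected mode thus controls which encoding path (or paths) each captured frame traverses, without the camera itself needing to know about either consumer.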
  • Similarly, the video display mixer driver 420 may be configured to manage an encoded video stream received from a network such as the network 106 via the videoconferencing system 402. The received video stream may be encoded using any of the known in the art, related art, or later developed algorithms according to predetermined network protocols. The video decoder 406 may receive and decode the encoded video stream to generate a decoded video stream, which may be sent to the video mixer driver 418. The video mixer driver 418 may be configured to guide the decoded video stream to the video display mixer driver 420, which may transmit the decoded video stream to the display device 416 for display. Alternatively, the encoded video stream may be received by the UC video decoder 412, which may be configured to decode the encoded video stream and send a generated decoded video stream to the display device 416 via the video display mixer driver 420 for display.
  • In one embodiment, a virtual multipoint control unit 112, 300 is in communication with a plurality of devices over a network 106. The virtual multipoint control unit 112, 300 comprises a plurality of UC applications 208, 314, at least one virtual imaging device 214, at least one virtual audio device 216, a virtual video mixer 210, and a virtual audio mixer 212. The plurality of UC applications 208, 314 is executed on at least one of the plurality of devices 102, 104. Each of the plurality of UC applications 208, 314 receives an encoded audio data stream and an encoded video data stream. The plurality of UC applications 208, 314 decodes the received audio data and the video data streams. The at least one virtual imaging device 214 is mapped to each of the plurality of UC applications 208, 314. The at least one virtual audio device 216 is mapped to each of the plurality of UC applications 208, 314. The virtual video mixer 210 is in communication with a physical imaging device 108. The virtual video mixer 210 receives the decoded video data stream from each of the plurality of UC applications 208, 314 via the at least one virtual imaging device 214. The virtual audio mixer 212 is in communication with a physical audio device 110. The virtual audio mixer 212 receives the decoded audio data stream from each of the plurality of UC applications 208, 314 via the at least one virtual audio device 216.
  • In another embodiment, a system for managing communication among a plurality of devices is disclosed. The system is employed within a communication network 106. The system comprises a communication device 102 and a virtual multipoint control unit 112, 300. The communication device 102 receives encoded audio and video data streams via a plurality of UC applications 208, 314. The plurality of UC applications 208, 314 decodes the received encoded audio data stream and the received encoded video data stream. The virtual multipoint control unit 112, 300 is in communication with the communication device 102. The virtual multipoint control unit 112, 300 comprises at least one virtual imaging device 214, at least one virtual audio device 216, a virtual video mixer 210, and a virtual audio mixer 212. The at least one virtual imaging device 214 is mapped to each of the plurality of UC applications 208, 314. The at least one virtual audio device 216 is mapped to each of the plurality of UC applications 208, 314. The virtual video mixer 210 is in communication with a physical imaging device 108. The virtual video mixer 210 receives the decoded video data stream from each of the plurality of UC applications 208, 314 via the at least one virtual imaging device 214. The virtual audio mixer 212 is in communication with a physical audio device 110. The virtual audio mixer 212 receives the decoded audio data stream from each of the plurality of UC applications 208, 314 via the at least one virtual audio device 216.
  • In yet another embodiment, a non-transitory computer readable medium storing a program of instructions executable by a computing device to perform a method for employing a virtual multipoint control unit 112, 300 for unified communications is disclosed. The method comprises receiving, from a plurality of terminals 104, an encoded audio data stream and an encoded video data stream. The method also comprises decoding, using a plurality of UC applications 208, 314, the received encoded audio data stream and the received encoded video data stream. The method further comprises mapping, using a virtual multipoint control unit 112, 300, at least one virtual imaging device 214 and at least one virtual audio device 216 to each of the plurality of UC applications 208, 314. The method also comprises communicating, using a virtual video mixer 210, the decoded video data stream to a physical imaging device 108 via the at least one virtual imaging device 214. Furthermore, the method comprises communicating, using a virtual audio mixer 212, the decoded audio data stream to a physical audio device 110 via the at least one virtual audio device 216.
  • In still another embodiment, a method for managing communication among a plurality of devices is disclosed. The method comprises receiving, from a plurality of terminals 104, an encoded audio data stream and an encoded video data stream. The method also comprises decoding, using a plurality of UC applications 208, 314, the received encoded audio data stream and the received encoded video data stream. The method further comprises mapping, using a virtual multipoint control unit 112, 300, at least one virtual imaging device 214 and at least one virtual audio device 216 to each of the plurality of UC applications 208, 314. The method also comprises communicating, using a virtual video mixer 210, the decoded video data stream to a physical imaging device 108 via the at least one virtual imaging device 214. Furthermore, the method comprises communicating, using a virtual audio mixer 212, the decoded audio data stream to a physical audio device 110 via the at least one virtual audio device 216.
  • Other embodiments of the present invention will be apparent to those skilled in the art after considering this disclosure or practicing the disclosed invention. The specification and examples above are exemplary only, with the true scope of the present invention being determined by the following claims.
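The decode-map-mix flow that these embodiments describe can be illustrated with a small sketch. This is a hypothetical toy model, not the claimed implementation: the class names (`VirtualDevice`, `VirtualMixer`) and the string "frames" are invented for the example, and in practice the virtual imaging and audio devices would be OS-level driver endpoints rather than Python objects.

```python
# Illustrative sketch of a virtual multipoint control unit: one virtual
# capture device is mapped to each UC application, and a virtual mixer
# combines the decoded streams toward a single physical device.

class VirtualDevice:
    """Stands in for a virtual imaging or audio device mapped to one UC app."""
    def __init__(self, uc_app):
        self.uc_app = uc_app
        self.frames = []

    def write(self, decoded_frame):
        # A UC application writes its decoded output here.
        self.frames.append(decoded_frame)

class VirtualMixer:
    """Combines the streams from every mapped virtual device into one output."""
    def __init__(self, physical_device):
        self.physical_device = physical_device  # e.g. a display or speaker
        self.devices = {}

    def map_device(self, uc_app):
        # Mirrors the "at least one virtual device mapped to each UC
        # application" limitation: one virtual device per application.
        device = VirtualDevice(uc_app)
        self.devices[uc_app] = device
        return device

    def mix_and_forward(self):
        # Naive mix: take the latest decoded frame from each virtual device
        # and forward the combined result to the physical device.
        mixed = [d.frames[-1] for d in self.devices.values() if d.frames]
        self.physical_device.extend(mixed)
        return mixed

# One virtual device per UC application, one mixer per physical device.
display = []                        # stands in for the physical imaging device
video_mixer = VirtualMixer(display)
for app in ("uc_app_a", "uc_app_b"):
    cam = video_mixer.map_device(app)
    cam.write(f"decoded-frame-from-{app}")
mixed = video_mixer.mix_and_forward()
```

The same mapping pattern would apply symmetrically to the audio path, with a second mixer attached to the physical audio device.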

Claims (10)

We claim the following invention:
1. A virtual multipoint control unit in communication with a plurality of devices over a network, comprising:
a plurality of unified communication (UC) applications being executed on at least one of the plurality of devices, each of the plurality of UC applications receiving an encoded audio data stream and an encoded video data stream, wherein the plurality of UC applications decode the received audio and video data streams;
at least one virtual imaging device mapped to each of the plurality of UC applications;
at least one virtual audio device mapped to each of the plurality of UC applications;
a virtual video mixer in communication with a physical imaging device, wherein the virtual video mixer receives the decoded video data stream from each of the plurality of UC applications via the at least one virtual imaging device; and
a virtual audio mixer in communication with a physical audio device, wherein the virtual audio mixer receives the decoded audio data stream from each of the plurality of UC applications via the at least one virtual audio device.
2. The virtual multipoint control unit according to claim 1, wherein the plurality of UC applications comprises at least one of Microsoft Lync, IBM Sametime, Cisco IP Communicator, and Skype.
3. A system for managing communication among a plurality of devices, the system being employed within a communication network, comprising:
a communication device receiving encoded audio and video data streams via a plurality of unified communication (UC) applications, wherein the plurality of UC applications decode the received encoded audio data stream and the received encoded video data stream; and
a virtual multipoint control unit in communication with the communication device, wherein the virtual multipoint control unit comprises:
at least one virtual imaging device mapped to each of the plurality of UC applications;
at least one virtual audio device mapped to each of the plurality of UC applications;
a virtual video mixer in communication with a physical imaging device, wherein the virtual video mixer receives the decoded video data stream from each of the plurality of UC applications via the at least one virtual imaging device; and
a virtual audio mixer in communication with a physical audio device, wherein the virtual audio mixer receives the decoded audio data stream from each of the plurality of UC applications via the at least one virtual audio device.
4. The system according to claim 3, wherein the plurality of UC applications comprises at least one of Microsoft Lync, IBM Sametime, Cisco IP Communicator, and Skype.
5. A non-transitory computer readable medium storing a program of instructions executable by a computing device to perform a method for employing a virtual multipoint control unit for unified communications, comprising:
receiving, from a plurality of terminals, an encoded audio data stream and an encoded video data stream;
decoding, using a plurality of unified communication (UC) applications, the received encoded audio data stream and the received encoded video data stream;
mapping, using a virtual multipoint control unit, at least one virtual imaging device and at least one virtual audio device to each of the plurality of UC applications;
communicating, using a virtual video mixer, the decoded video data stream to a physical imaging device via the at least one virtual imaging device; and
communicating, using a virtual audio mixer, the decoded audio data stream to a physical audio device via the at least one virtual audio device.
6. The non-transitory computer readable medium according to claim 5, wherein the plurality of UC applications comprises at least one of Microsoft Lync, IBM Sametime, Cisco IP Communicator, and Skype.
7. A method for managing communication among a plurality of devices, the method comprising:
receiving, from a plurality of terminals, an encoded audio data stream and an encoded video data stream;
decoding, using a plurality of unified communication (UC) applications, the received encoded audio data stream and the received encoded video data stream;
mapping, using a virtual multipoint control unit, at least one virtual imaging device and at least one virtual audio device to each of the plurality of UC applications;
communicating, using a virtual video mixer, the decoded video data stream to a physical imaging device via the at least one virtual imaging device; and
communicating, using a virtual audio mixer, the decoded audio data stream to a physical audio device via the at least one virtual audio device.
8. The method according to claim 7, wherein the plurality of UC applications comprises at least one of Microsoft Lync, IBM Sametime, Cisco IP Communicator, and Skype.
9. A method for manufacturing a system for managing communication among a plurality of devices, the system being employed within a communication network, comprising:
providing a communication device receiving encoded audio and video data streams via a plurality of unified communication (UC) applications, wherein the plurality of UC applications decode the received encoded audio data stream and the received encoded video data stream; and
providing a virtual multipoint control unit in communication with the communication device, wherein the virtual multipoint control unit comprises:
at least one virtual imaging device mapped to each of the plurality of UC applications;
at least one virtual audio device mapped to each of the plurality of UC applications;
a virtual video mixer in communication with a physical imaging device, wherein the virtual video mixer receives the decoded video data stream from each of the plurality of UC applications via the at least one virtual imaging device; and
a virtual audio mixer in communication with a physical audio device, wherein the virtual audio mixer receives the decoded audio data stream from each of the plurality of UC applications via the at least one virtual audio device.
10. The method according to claim 9, wherein the plurality of UC applications comprises at least one of Microsoft Lync, IBM Sametime, Cisco IP Communicator, and Skype.
US14/341,818 2013-07-29 2014-07-27 System for a Virtual Multipoint Control Unit for Unified Communications Abandoned US20150077509A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/341,818 US20150077509A1 (en) 2013-07-29 2014-07-27 System for a Virtual Multipoint Control Unit for Unified Communications
US15/062,066 US9781386B2 (en) 2013-07-29 2016-03-05 Virtual multipoint control unit for unified communications

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361859358P 2013-07-29 2013-07-29
US14/341,818 US20150077509A1 (en) 2013-07-29 2014-07-27 System for a Virtual Multipoint Control Unit for Unified Communications

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/062,066 Continuation US9781386B2 (en) 2013-07-29 2016-03-05 Virtual multipoint control unit for unified communications

Publications (1)

Publication Number Publication Date
US20150077509A1 true US20150077509A1 (en) 2015-03-19

Family

ID=52667581

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/341,818 Abandoned US20150077509A1 (en) 2013-07-29 2014-07-27 System for a Virtual Multipoint Control Unit for Unified Communications
US15/062,066 Active US9781386B2 (en) 2013-07-29 2016-03-05 Virtual multipoint control unit for unified communications

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/062,066 Active US9781386B2 (en) 2013-07-29 2016-03-05 Virtual multipoint control unit for unified communications

Country Status (1)

Country Link
US (2) US20150077509A1 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150264505A1 (en) * 2014-03-13 2015-09-17 Accusonus S.A. Wireless exchange of data between devices in live events
WO2017019911A1 (en) 2015-07-28 2017-02-02 Mersive Technologies, Inc. Virtual video driver bridge system for multi-source collaboration within a web conferencing system
US20170098453A1 (en) * 2015-06-24 2017-04-06 Microsoft Technology Licensing, Llc Filtering sounds for conferencing applications
US20170127022A1 (en) * 2014-07-14 2017-05-04 Shenzhen Grandstream Networks Technologies Co. Ltd. Conference processing method of third-party application and communication device thereof
US9781386B2 (en) * 2013-07-29 2017-10-03 Clearone Communications Hong Kong Ltd. Virtual multipoint control unit for unified communications
US9812150B2 (en) 2013-08-28 2017-11-07 Accusonus, Inc. Methods and systems for improved signal decomposition
US9935915B2 (en) 2011-09-30 2018-04-03 Clearone, Inc. System and method that bridges communications between multiple unified communication (UC) clients
CN110070878A (en) * 2019-03-26 2019-07-30 苏州科达科技股份有限公司 The coding/decoding method and electronic equipment of audio code stream
CN110324565A (en) * 2019-06-06 2019-10-11 浙江华创视讯科技有限公司 Audio-frequency inputting method, device, conference host, storage medium and electronic device
US10468036B2 (en) 2014-04-30 2019-11-05 Accusonus, Inc. Methods and systems for processing and mixing signals using signal decomposition
US10931959B2 (en) * 2018-05-09 2021-02-23 Forcepoint Llc Systems and methods for real-time video transcoding of streaming image data
US20220217377A1 (en) * 2016-02-17 2022-07-07 V-Nova International Limited Physical adapter, signal processing equipment, methods and computer programs
US11522936B2 (en) * 2021-04-30 2022-12-06 Salesforce, Inc. Synchronization of live streams from web-based clients
US20230095692A1 (en) * 2021-09-30 2023-03-30 Samsung Electronics Co., Ltd. Parallel metadata generation based on a window of overlapped frames

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106534808B (en) * 2016-12-28 2019-10-18 浙江宇视科技有限公司 A kind of video monitoring method and device based on virtual camera
CA3087509A1 (en) 2018-01-16 2019-07-25 Qsc, Llc Audio, video and control system implementing virtual machines
EP3740868A1 (en) 2018-01-16 2020-11-25 Qsc, Llc Server support for multiple audio/video operating systems
EP3808067A1 (en) 2018-06-15 2021-04-21 Shure Acquisition Holdings, Inc. Systems and methods for integrated conferencing platform
CN109862305B (en) * 2019-01-02 2021-09-21 视联动力信息技术股份有限公司 Method and device for adjusting stream during meeting of video network

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7113992B1 (en) * 1999-11-08 2006-09-26 Polycom, Inc. Decomposition architecture for an MCU
US20070166007A1 (en) * 2006-01-19 2007-07-19 Sony Corporation Recording apparatus, recording method, program, encoding apparatus, and encoding method
US7701926B2 (en) * 2002-06-14 2010-04-20 Polycom, Inc. Multipoint multimedia/audio conference using IP trunking
US8035679B2 (en) * 2006-12-12 2011-10-11 Polycom, Inc. Method for creating a videoconferencing displayed image
US8446451B2 (en) * 2006-03-01 2013-05-21 Polycom, Inc. Method and system for providing continuous presence video in a cascading conference
US8456510B2 (en) * 2009-03-04 2013-06-04 Lifesize Communications, Inc. Virtual distributed multipoint control unit
US8692864B2 (en) * 2012-05-31 2014-04-08 Ronald Angelo Dynamic virtual multipoint video conference control unit
US8745256B2 (en) * 2009-02-02 2014-06-03 Wistron Corp. Method and system for multimedia audio video transfer
US20150124863A1 (en) * 2013-05-29 2015-05-07 ClearOne Inc. Chroma-based video converter
US20150135207A1 (en) * 2012-04-19 2015-05-14 Sony Corporation Reception device, reception method, broadcasting device, broadcasting method, program, and link application control system

Family Cites Families (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5689553A (en) 1993-04-22 1997-11-18 At&T Corp. Multimedia telecommunications network and service
US5689641A (en) 1993-10-01 1997-11-18 Vicor, Inc. Multimedia collaboration system arrangement for routing compressed AV signal through a participant site without decompressing the AV signal
US6075571A (en) 1997-07-29 2000-06-13 Kuthyar; Ashok K. Composite image display device and service for video conferencing
GB2349055B (en) 1999-04-16 2004-03-24 Mitel Corp Virtual meeting rooms with spatial audio
US7085243B2 (en) 2000-03-01 2006-08-01 Polycom Israel Ltd. System and method for providing reservationless conferencing
US20040054747A1 (en) 2002-09-12 2004-03-18 International Business Machines Corporation Pervasive home network appliance
DE10358846A1 (en) 2003-12-16 2005-07-21 Nekritch, Iakov, Dr. Video data method for compressing and transmitting video data has prioritized protection for different parts of a data stream divided into classes according to importance
US8027335B2 (en) * 2004-05-05 2011-09-27 Prodea Systems, Inc. Multimedia access device and system employing the same
US8316104B2 (en) 2005-11-15 2012-11-20 California Institute Of Technology Method and apparatus for collaborative system
US20070127668A1 (en) 2005-12-02 2007-06-07 Ahya Deepak P Method and system for performing a conference call
US7643436B2 (en) 2006-02-01 2010-01-05 Sun Microsystems, Inc. Apparatus and method for combining network conferences that are not co-located
US20080010482A1 (en) 2006-06-13 2008-01-10 Microsoft Corporation Remote control of a media computing device
US20080120675A1 (en) 2006-11-22 2008-05-22 Horizon Semiconductors Ltd. Home gateway for multiple units
US20090210789A1 (en) 2008-02-14 2009-08-20 Microsoft Corporation Techniques to generate a visual composition for a multimedia conference event
TWI435589B (en) 2008-03-18 2014-04-21 Wistron Corp Voip integrating system and method thereof
US7739333B2 (en) 2008-06-27 2010-06-15 Microsoft Corporation Management of organizational boundaries in unified communications systems
US20100005497A1 (en) 2008-07-01 2010-01-07 Michael Maresca Duplex enhanced quality video transmission over internet
US20100008419A1 (en) 2008-07-10 2010-01-14 Apple Inc. Hierarchical Bi-Directional P Frames
US20100180224A1 (en) * 2009-01-15 2010-07-15 Open Labs Universal music production system with added user functionality
CN101902536A (en) * 2009-06-01 2010-12-01 讯动科技股份有限公司 Network communication system for supporting network communication protocols and method thereof
US9143729B2 (en) 2010-05-12 2015-09-22 Blue Jeans Networks, Inc. Systems and methods for real-time virtual-reality immersive multimedia communications
US8848028B2 (en) * 2010-10-25 2014-09-30 Dell Products L.P. Audio cues for multi-party videoconferencing on an information handling system
US20120196614A1 (en) * 2011-02-02 2012-08-02 Vonage Network Llc. Method and system for unified management of communication events
US8972984B2 (en) 2011-05-20 2015-03-03 Citrix Systems, Inc. Methods and systems for virtualizing audio hardware for one or more virtual machines
US20130097333A1 (en) * 2011-06-12 2013-04-18 Clearone Communications, Inc. Methods and apparatuses for unified streaming communication
WO2012172310A2 (en) * 2011-06-16 2012-12-20 Blinkpipe Limited Video conferencing systems
US8903922B2 (en) * 2011-07-27 2014-12-02 Cisco Technology, Inc. Exporting an email thread to a persistent chat room
US20130097244A1 (en) 2011-09-30 2013-04-18 Clearone Communications, Inc. Unified communications bridging architecture
US8872880B1 (en) * 2011-12-30 2014-10-28 Juniper Networks, Inc. Video conference service with multiple service tiers
US9241129B2 (en) * 2012-01-31 2016-01-19 Mitel Networks Corporation Video calls for external networks
US8428228B1 (en) * 2012-09-18 2013-04-23 Weerawan Wongmanee Unified communication system
US8671149B1 (en) * 2012-09-18 2014-03-11 Weerawan Wongmanee Unified messaging platform with intelligent voice recognition (IVR)
US8706912B2 (en) * 2012-09-18 2014-04-22 Weerawan Wongmanee Unified LTE cloud system
US20140148934A1 (en) * 2012-11-20 2014-05-29 ClearOne Communication, Inc. Unified communications bridging architecture
US10135823B2 (en) * 2013-01-07 2018-11-20 Dell Products L.P. Input redirection with a cloud client device
US9571529B2 (en) * 2013-03-15 2017-02-14 Avaya Inc. Browser-based communications enhanced with enterprise communication features
US20140280595A1 (en) * 2013-03-15 2014-09-18 Polycom, Inc. Cloud Based Elastic Load Allocation for Multi-media Conferencing
US20140359457A1 (en) * 2013-05-30 2014-12-04 NextPlane, Inc. User portal to a hub-based system federating disparate unified communications systems
WO2015000118A1 (en) * 2013-07-01 2015-01-08 华为技术有限公司 Unified communication-based video conference call method, device and system
US20150077509A1 (en) * 2013-07-29 2015-03-19 ClearOne Inc. System for a Virtual Multipoint Control Unit for Unified Communications


Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9935915B2 (en) 2011-09-30 2018-04-03 Clearone, Inc. System and method that bridges communications between multiple unified communication (UC) clients
US9781386B2 (en) * 2013-07-29 2017-10-03 Clearone Communications Hong Kong Ltd. Virtual multipoint control unit for unified communications
US10366705B2 (en) 2013-08-28 2019-07-30 Accusonus, Inc. Method and system of signal decomposition using extended time-frequency transformations
US11581005B2 (en) 2013-08-28 2023-02-14 Meta Platforms Technologies, Llc Methods and systems for improved signal decomposition
US11238881B2 (en) 2013-08-28 2022-02-01 Accusonus, Inc. Weight matrix initialization method to improve signal decomposition
US9812150B2 (en) 2013-08-28 2017-11-07 Accusonus, Inc. Methods and systems for improved signal decomposition
US9584940B2 (en) 2014-03-13 2017-02-28 Accusonus, Inc. Wireless exchange of data between devices in live events
US20150264505A1 (en) * 2014-03-13 2015-09-17 Accusonus S.A. Wireless exchange of data between devices in live events
US9918174B2 (en) 2014-03-13 2018-03-13 Accusonus, Inc. Wireless exchange of data between devices in live events
US11610593B2 (en) 2014-04-30 2023-03-21 Meta Platforms Technologies, Llc Methods and systems for processing and mixing signals using signal decomposition
US10468036B2 (en) 2014-04-30 2019-11-05 Accusonus, Inc. Methods and systems for processing and mixing signals using signal decomposition
US9900552B2 (en) * 2014-07-14 2018-02-20 Shenzhen Grandstream Networks Technologies Co., Ltd Conference processing method of third-party application and communication device thereof
US20170127022A1 (en) * 2014-07-14 2017-05-04 Shenzhen Grandstream Networks Technologies Co. Ltd. Conference processing method of third-party application and communication device thereof
US10127917B2 (en) * 2015-06-24 2018-11-13 Microsoft Technology Licensing, Llc Filtering sounds for conferencing applications
US20170098453A1 (en) * 2015-06-24 2017-04-06 Microsoft Technology Licensing, Llc Filtering sounds for conferencing applications
EP3329670A4 (en) * 2015-07-28 2019-03-06 Mersive Technologies, Inc. Virtual video driver bridge system for multi-source collaboration within a web conferencing system
JP2018529251A (en) * 2015-07-28 2018-10-04 マーシブ テクノロジーズ,インコーポレイティド Virtual video driver bridge system for multi-source collaboration in web conferencing system
WO2017019911A1 (en) 2015-07-28 2017-02-02 Mersive Technologies, Inc. Virtual video driver bridge system for multi-source collaboration within a web conferencing system
CN108028905A (en) * 2015-07-28 2018-05-11 Mersive技术有限公司 Virtual video driver bridge system for the multi-source cooperation in netmeeting
US11489891B2 (en) 2015-07-28 2022-11-01 Mersive Technologies, Inc. Virtual video driver bridge system for multi-source collaboration within a web conferencing system
US20220217377A1 (en) * 2016-02-17 2022-07-07 V-Nova International Limited Physical adapter, signal processing equipment, methods and computer programs
US11924450B2 (en) * 2016-02-17 2024-03-05 V-Nova International Limited Physical adapter, signal processing equipment, methods and computer programs
US10931959B2 (en) * 2018-05-09 2021-02-23 Forcepoint Llc Systems and methods for real-time video transcoding of streaming image data
CN110070878A (en) * 2019-03-26 2019-07-30 苏州科达科技股份有限公司 The coding/decoding method and electronic equipment of audio code stream
CN110324565A (en) * 2019-06-06 2019-10-11 浙江华创视讯科技有限公司 Audio-frequency inputting method, device, conference host, storage medium and electronic device
US11522936B2 (en) * 2021-04-30 2022-12-06 Salesforce, Inc. Synchronization of live streams from web-based clients
US20230095692A1 (en) * 2021-09-30 2023-03-30 Samsung Electronics Co., Ltd. Parallel metadata generation based on a window of overlapped frames
US11930189B2 (en) * 2021-09-30 2024-03-12 Samsung Electronics Co., Ltd. Parallel metadata generation based on a window of overlapped frames

Also Published As

Publication number Publication date
US20160373696A1 (en) 2016-12-22
US9781386B2 (en) 2017-10-03

Similar Documents

Publication Publication Date Title
US9781386B2 (en) Virtual multipoint control unit for unified communications
US10015440B2 (en) Multiple channel communication using multiple cameras
US9300705B2 (en) Methods and systems for interfacing heterogeneous endpoints and web-based media sources in a video conference
US9021062B2 (en) Sharing audio and video device on a client endpoint device between local use and hosted virtual desktop use
US8994778B2 (en) Systems and methods for providing video conferencing services via an ethernet adapter
US9035991B2 (en) Collaboration system and method
US20130151623A1 (en) Systems and methods for translating multiple client protocols via a conference bridge
US9363472B2 (en) Video injection for video communication
US9398257B2 (en) Methods and systems for sharing a plurality of encoders between a plurality of endpoints
US20140028778A1 (en) Systems and methods for ad-hoc integration of tablets and phones in video communication systems
US11489891B2 (en) Virtual video driver bridge system for multi-source collaboration within a web conferencing system
US20060215630A1 (en) Feature scalability in a multimedia communication system
US20130147906A1 (en) Systems and methods for offloading video processing of a video conference
US8717408B2 (en) Conducting a private videoconference within a videoconference via an MCU
US20130147903A1 (en) Systems and methods for including video traffic from external sources into a video conferencing
US20130147902A1 (en) Systems and methods for mapping a uri to a plurality of endpoints for a sip communication
WO2012170913A1 (en) Systems and methods for improved interactive content sharing in video communication systems
US20130147901A1 (en) Systems and methods for video enabling pbx systems without a sip stack
US9432624B2 (en) Method for improving an MCU's performance using common properties of the H.264 codec standard
US8717409B2 (en) Conducting a direct private videoconference within a videoconference
US8558862B2 (en) Videoconferencing using a precoded bitstream
US9503812B2 (en) Systems and methods for split echo cancellation
US20130198399A1 (en) Input/output communication
US9288436B2 (en) Systems and methods for using split endpoints in video communication systems
Sorokin et al. IP Video Conferencing: A Tutorial

Legal Events

Date Code Title Description
AS Assignment

Owner name: CLEARONE INC., UTAH

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GRAHAM, DEREK;REEL/FRAME:034427/0374

Effective date: 20141031

AS Assignment

Owner name: CLEARONE COMMUNICATIONS HONG KONG LTD., HONG KONG

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CLEARONE LTD.;REEL/FRAME:036348/0810

Effective date: 20141111

Owner name: CLEARONE LTD. FKA (FORMERLY KNOWN AS) C-V PRIVATE (ISRAEL) LTD.

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BEN-NATAN, AVISHAY;REEL/FRAME:036348/0475

Effective date: 20120301

AS Assignment

Owner name: CLEARONE COMMUNICATIONS HONG KONG LTD., HONG KONG

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CLEARONE INC.;REEL/FRAME:036890/0528

Effective date: 20151026

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION