WO2019118890A1 - Method and system for cloud video stitching - Google Patents


Info

Publication number
WO2019118890A1
Authority
WO
WIPO (PCT)
Prior art keywords
video
video stream
stream
wireless mobile
streams
Application number
PCT/US2018/065776
Other languages
French (fr)
Inventor
Paul Lemley
Ryan DAULTON
Original Assignee
Hivecast, LLC
Application filed by Hivecast, LLC
Publication of WO2019118890A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41407Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/78Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/23424Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving splicing one content stream with another content stream, e.g. for inserting or substituting an advertisement
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47205End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/658Transmission by the client directed to the server
    • H04N21/6587Control parameters, e.g. trick play commands, viewpoint selection

Definitions

  • the present invention relates to controlling, by a wireless mobile device, production of a video stream that is generated by a remote server.
  • FIG. 1 is a block diagram of a cloud based video stitching system including video contributor and video producer mobile devices.
  • FIG. 2 is a data flow diagram for producing, at a wireless mobile device, a video clip generated by a remote device, based on multiple video streams received via a network.
  • FIG. 3 is a block diagram of a producer wireless mobile device for designing an output video stream based on multiple input video streams.
  • FIG. 4 is a block diagram of a computer system for generating an output video stream by stitching portions of multiple video streams based on commands received from a wireless mobile device.
  • FIG. 5 is a flow chart for stitching video streams at a server based on commands received from a wireless mobile device to generate an output video stream.
  • FIG. 6 is a flow chart for designing an output video stream based on multiple video streams at a wireless mobile device, and generating video stitching commands to a server for generating the output video stream.
  • engines, controllers, generators, processors and the like, that are described in the specification can include standard processing components, such as one or more electronic processors, one or more computer-readable medium modules, one or more input/output interfaces, and various connections (e.g., a system bus) connecting the components.
  • the engines, controllers, generators, and processors described in the specification may be implemented in one of or a combination of a general processor, an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), or the like.
  • ASIC application specific integrated circuit
  • DSP digital signal processor
  • FPGA field programmable gate array
  • An embodiment of the invention allows a user to produce or design a video stream from a remote mobile wireless device based on multiple input video streams.
  • the multiple input video streams are contributed by multiple wireless devices and sent to a server.
  • the server sends the multiple video streams to the remote mobile wireless device.
  • the server stitches portions of the contributed video streams according to commands comprising stream metadata received from the remote mobile wireless device, and streams the resulting stitched video stream to an audience of wireless mobile device users.
  • the present invention solves the technical problem of designing or producing a video stream from a remote mobile wireless device using inputs from multiple devices for generation and distribution of the video stream by a server device.
  • FIG. 1 is a block diagram of a cloud based video stitching system including video contributor and video producer mobile devices.
  • a cloud based video stitching system 100 includes a producer client device 110 and a plurality of contributor client devices 112a, 112b, 112c and 112d.
  • the contributor client devices 112a, 112b, 112c and 112d may be referred to as the contributor client devices 112.
  • Also shown in FIG. 1 are a stitch process server 120, a splice handler 122, an input/output (I/O) server 124, a compute engine 126, and a logging manager 128.
  • I/O input/output
  • the system 100 includes four separate input video streams 140, a video stream 144, and a stitch event command 142. Although four contributor client devices 112 and four separate video streams 140 are shown in FIG. 1, the disclosure is not limited to any specific number of contributor devices 112 and respective video streams 140, and any suitable number N of contributor client devices 112 and respective video streams 140 may be utilized.
  • Each of the plurality of contributor client devices 112 is a wireless mobile communication device, for example, a wireless mobile phone or a handheld communication device (e.g., a tablet or other computer).
  • Each of the contributor devices 112 comprises a video camera operable to capture video images and sound, and each includes an application that enables the respective contributor client device to wirelessly stream its captured video to the I/O server 124 via a respective one of the four input video streams 140.
  • the four input video streams 140 may include multiplexed audio and video packets.
  • multiple respective users of the multiple contributor client devices 112 may be concurrently shooting and streaming video at a particular event.
  • the multiple users may each capture video comprising a different aspect or different view of the event and may stream video of their particular view to the I/O server 124.
  • the disclosure is not limited in this regard, and the different contributor client devices 112 may capture any images of interest and stream the captured images to the I/O server 124.
  • the producer client device 110 is a wireless communication device, for example, a wireless phone or handheld communication device that is operable to receive and display a plurality of video streams.
  • the producer client device 110 may receive the four input video streams 140 from the I/O server 124 within the video stream 144 that may also comprise metadata regarding the four input video streams 140.
  • the producer client device 110 includes a software application that generates a video stream viewer in a graphical user interface (GUI).
  • GUI graphical user interface
  • the video stream viewer allows a user to view the multiple input video streams 140 and define a “final cut” version video stream derived from the multiple input video streams 140.
  • The “final cut” video stream is stitched and distributed by the I/O server 124 to audience devices.
  • the producer client device 110 user may make selections in the viewer GUI to generate editing and stitching or splicing commands to the stitch process server 120 for defining the “final cut” version video stream.
  • the user may use the video stream viewer GUI to select one of the multiple video input streams 140, which will in turn make the selected stream a focused view in the “final cut” version video stream that is output by the server 124.
  • the focused view in the “final cut” version changes according to the repeated selection action.
  • the selections made in the producer client device GUI are ultimately cut as an on-screen portion in the final rendering of the production by the server 124.
  • video stream selections are received in the video stream viewer GUI.
  • the timing of a selection within a video stream triggers a Coordinated Universal Time (UTC) time stamp.
  • the time stamp indicates at what time or frame to splice and stitch the selected stream into the “final cut” version video stream that is output by the server 124.
  • a double tap gesture by the user on a specific video stream as the streams 140 are displayed in the viewer GUI identifies a UTC time stamp.
  • the UTC time stamp is communicated to the stitch process server 120.
  • the time stamp indicates at what time or frame the currently rendered “final cut” video stream and the newly selected stream are spliced and stitched together for distribution to audience devices.
  • the editing, stitching, and splicing commands are communicated via a wireless network and/or the Internet to the stitch process server 120 within the stitch event command 142.
  • the stitch event command 142 may include video stream metadata, for example, the UTC time stamp that indicates where to splice or stitch two of the video streams 140 for the “final cut” video stream.
  • The “final cut” video stream may be referred to as a video clip or an audience video stream.
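The stitch event command 142 described above is essentially a small metadata payload. As a rough sketch (the field names and the JSON encoding are illustrative assumptions, not taken from the specification), it might look like:

```python
import json
import time
from dataclasses import asdict, dataclass

@dataclass
class StitchEventCommand:
    """Hypothetical shape of the stitch event command 142."""
    clip_id: str             # identifier of the "final cut" video stream
    selected_stream_id: str  # which of the N input streams 140 to splice in
    splice_time_utc: float   # UTC time stamp marking the splice point

    def to_json(self) -> str:
        # Serialize for transmission to the stitch process server.
        return json.dumps(asdict(self))

# Example: the producer selects one contributed stream at the current time.
cmd = StitchEventCommand(clip_id="final-cut-1",
                         selected_stream_id="stream-3",
                         splice_time_utc=time.time())
payload = cmd.to_json()
```

Any wire format would do; a compact, self-describing payload like this keeps the command small enough to send with negligible latency over the wireless link.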
  • one or more of the stitch process server 120, the splice handler 122, the I/O server 124, the compute engine 126, and the logging manager 128 may comprise cloud computing resources that may include shared configurable resources accessible by the contributor client devices 112 and the producer client device 110 over the Internet.
  • the I/O server 124 may comprise communication interfaces for receiving the four separate video input streams 140 from the contributor client devices 112 via a wireless network and/or the Internet.
  • the I/O server 124 comprises the compute engine 126, and the logging manager 128.
  • the logging manager 128 may log metadata for each of the separate video input streams 140 and for the video stream 144 that is sent to the producer client device 110.
  • the metadata may include, for example, stream identifiers, client identifiers, transmission rates, and stream positions.
  • the logging manager 128 may also store metadata for the outgoing “final cut” version video stream including a current stream identifier that indicates which of the four separate video input streams 140 is included in the outgoing “final cut” video stream, a transmission rate, and a stream position for the outgoing “final cut” video stream.
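The per-stream records kept by the logging manager can be pictured as follows. This is a minimal sketch under assumed field and class names; the specification lists stream identifiers, client identifiers, transmission rates, and stream positions but does not fix a representation:

```python
from dataclasses import dataclass

@dataclass
class StreamLogEntry:
    """Per-stream metadata of the kind the logging manager 128 might keep."""
    stream_id: str
    client_id: str
    transmission_rate_kbps: int
    stream_position: int  # e.g., current frame index within the stream

class LoggingManager:
    """Tracks the N input streams plus the current outgoing stream."""
    def __init__(self):
        self.entries = {}            # stream_id -> StreamLogEntry
        self.current_outgoing = None  # stream_id of the outgoing stream

    def log_stream(self, entry: StreamLogEntry) -> None:
        self.entries[entry.stream_id] = entry

    def set_outgoing(self, stream_id: str) -> None:
        # Only a logged input stream may become the outgoing stream.
        if stream_id not in self.entries:
            raise KeyError(f"unknown stream {stream_id}")
        self.current_outgoing = stream_id

mgr = LoggingManager()
mgr.log_stream(StreamLogEntry("s1", "contributor-a", 2500, 0))
mgr.set_outgoing("s1")
```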
  • the compute engine 126 may manage receipt and buffering of the four separate video input streams 140, and buffering and transmission of the “final cut” video stream sent to a plurality of audience client devices (see Figure 2, audience client devices 210).
  • the splice handler 122 may receive the stitch event command from the producer client device 110 and may determine which of the four video input streams 140 to stitch into the outgoing “final cut” video stream.
  • the splice handler 122 may also determine the specified frame position or a splice time in the outgoing video stream for the stitching event based on the command.
  • the splice handler 122 may communicate the determined information in a second stitch event command and send it to the I/O server 124.
  • the compute engine 126 may then stitch the newly identified video input stream into the outgoing “final cut” video stream, and the logging manager 128 may log the newly identified video stream as the current outgoing video stream.
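The hand-off in the preceding bullets, where the splice handler validates the producer's command and forwards a second stitch event command that the compute engine acts on, can be sketched as follows (class, method, and field names are assumptions for illustration):

```python
class SpliceHandler:
    """Turns a producer's stitch event command into a concrete splice
    instruction for the I/O server's compute engine."""
    def __init__(self, known_streams):
        self.known_streams = set(known_streams)

    def handle(self, command: dict) -> dict:
        stream_id = command["selected_stream_id"]
        if stream_id not in self.known_streams:
            raise ValueError(f"unknown stream {stream_id}")
        # The "second stitch event command" forwarded to the I/O server.
        return {"stitch_in": stream_id,
                "splice_time_utc": command["splice_time_utc"]}

class ComputeEngine:
    """Holds the identifier of the stream currently being stitched out."""
    def __init__(self, default_stream: str):
        self.current_stream = default_stream

    def apply(self, splice: dict) -> None:
        # A real engine would cut at the given frame/time in the buffers;
        # this sketch only records the switch of the outgoing stream.
        self.current_stream = splice["stitch_in"]

handler = SpliceHandler(["s1", "s2", "s3", "s4"])
engine = ComputeEngine(default_stream="s1")
engine.apply(handler.handle({"selected_stream_id": "s3",
                             "splice_time_utc": 1700000000.0}))
```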
  • FIG. 2 is a data flow diagram for producing, at a wireless mobile device, a video clip generated by a remote device, based on multiple video streams received via a network.
  • the system 200 comprises the wireless mobile producer client device 110, N wireless mobile contributor devices 112, the I/O server 124 and the stitch process server 120, and a plurality of audience client devices 210.
  • the plurality of audience client devices may include wireless communication devices, such as mobile phones or handheld communication devices, that include communication interfaces and user interfaces for wirelessly receiving and displaying video streams from the I/O server 124.
  • the logging manager 128 of the I/O server 124 may store a current selected or default video stream identifier n_x in memory.
  • the stored video stream identifier may indicate which video stream is a current outgoing video stream from the server 124 for distribution to the audience client devices 210.
  • Each of the N contributor client devices 112 captures and transmits a respective video stream to the I/O server 124.
  • the I/O server 124 decodes the N video streams, stores the N video streams in a buffer and logs position information for the N video streams.
  • the I/O server 124 encodes the N video streams for transmission to the producer client device 110 and streams the N video streams to the producer client device, at a first latency time, along with metadata for each of the N video streams.
  • the producer client device 110 displays the N video streams in a graphical user interface.
  • the I/O server 124 encodes the current selected or default n_x video stream for transmission to one or more of the audience client devices 210 and transmits the outgoing video stream having the identifier n_x to the one or more audience client devices 210, at a second latency time, for display of the n_x video stream at the one or more audience client devices 210.
  • the graphical user interface of the producer client device 110 that displays the N video streams allows a user to design, in real time, the “final cut” version video stream for output from the server to the audience client devices.
  • the graphical user interface receives a user selection n_y of one of the N video streams, for example, via a touch screen selection in a window displaying the n_y video stream.
  • the producer client device 110 generates a stitch event command that includes metadata for the selected n_y video stream including, at least, the n_y video stream ID and a splice time, and transmits the stitch command to the stitch process server 120.
  • the splice handler 122 of the stitch process server 120 receives the stitch event command and determines which video input stream to stitch in, and the frame position or splice time at which to stitch the n_y video stream into the outgoing “final cut” video stream.
  • the compute engine 126 of the I/O server 124 may then stitch, in real time, the n_y video input stream into the outgoing “final cut” video stream in place of the n_x video stream, and seamlessly continue transmitting the outgoing “final cut” video stream to the audience client devices 210.
  • the logging manager 128 may log the n_y video stream identifier as the current outgoing video stream.
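The seamless switch from the n_x stream to the n_y stream amounts to selecting frames from the buffered inputs around the splice time: frames before the splice time come from the current stream, frames at or after it come from the newly selected stream. A toy sketch with an assumed (timestamp, frame) buffer representation:

```python
def stitch_frames(buffers, current_id, new_id, splice_time):
    """Merge two buffered streams into one output sequence.

    buffers maps stream_id -> list of (timestamp, frame) tuples,
    assumed sorted by timestamp.  This models the cut only; real
    encoded video would also need a clean splice point (e.g., a
    keyframe) at or near splice_time.
    """
    out = []
    for ts, frame in buffers[current_id]:
        if ts < splice_time:          # keep the current stream up to the cut
            out.append((ts, frame))
    for ts, frame in buffers[new_id]:
        if ts >= splice_time:         # continue with the selected stream
            out.append((ts, frame))
    return out

buffers = {
    "n_x": [(0, "x0"), (1, "x1"), (2, "x2"), (3, "x3")],
    "n_y": [(0, "y0"), (1, "y1"), (2, "y2"), (3, "y3")],
}
clip = stitch_frames(buffers, "n_x", "n_y", splice_time=2)
# clip holds x-frames before t=2 and y-frames from t=2 onward
```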
  • FIG. 3 is a block diagram of a producer wireless mobile device for designing an output video stream based on multiple input video streams.
  • a producer wireless mobile communication device 300 includes suitable logic, circuitry, interfaces and code, which enables the producer wireless mobile communication device 300 to perform operations similar to or substantially the same as operations of the producer client device 110.
  • the producer wireless mobile communication device 300 is a mobile phone.
  • the producer wireless mobile communication device 300 may be referred to as the producer client device 300.
  • the producer client device 300 includes suitable logic, circuitry, interfaces and code, which enables it to perform operations similar to or substantially the same as operations of the contributor client device 112 or the audience client device 210.
  • the producer client device 300 may be, for example, a wireless mobile phone, and may include, among other things, communication interfaces 320, an audio/video demultiplexing system 322, video and audio buffers 324, a video decoder and a video encoder 326, an audio decoder and an audio encoder 328, a video controller 354, a display generator 355, a telephony processing system 330, one or more user interfaces 332, a video stream design graphical user interface (GUI) 334, a stitch command generator 336, one or more electronic processors 338, a memory 340, video stream metadata memory 342, software program instruction memory 344, a display system 346, a speaker system 348, a camera 350 and a GNSS receiver 352.
  • the communication interfaces 320 may comprise a plurality of interfaces for wireless and/or wireline communication that are operable to communicate with the servers 120 and/or 124 via a network.
  • the communication interfaces 320 may support wireless communication technologies for transmitting and/or receiving signals including video streaming signals via wide area or cellular communication networks, wireless local area networks (e.g., 802.11 communications), and/or via personal area communication (e.g., Bluetooth communications).
  • One or more of the communication interfaces 320 are operable to support wireless video streaming input received from the servers 120 and/or 124.
  • the audio/video demultiplexing system 322 is operable to (i) receive one or more multiplexed audio/video streams, for example, the video stream 144 that includes the four different video streams 140, (ii) demultiplex the audio/video streams into separate audio and video streams, and (iii) store the audio and video streams to the audio and video buffers 324.
  • the video decoder 326 and audio decoder 328 are operable to decode the video and audio streams respectively, for display and audio playback by the display 346 and speaker 348 respectively.
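The demultiplexing step can be modeled as routing tagged packets into per-stream, per-media buffers before they reach the decoders. The packet tuple format below is an assumption for illustration only:

```python
from collections import defaultdict

def demultiplex(packets):
    """Split (kind, stream_id, payload) packets into separate buffers.

    Returns a mapping from (kind, stream_id) -> ordered list of payloads,
    modeling the audio and video buffers 324.
    """
    buffers = defaultdict(list)
    for kind, stream_id, payload in packets:
        buffers[(kind, stream_id)].append(payload)
    return buffers

# A multiplexed feed interleaving audio and video from two streams.
muxed = [("video", "s1", b"v0"), ("audio", "s1", b"a0"),
         ("video", "s1", b"v1"), ("video", "s2", b"v0")]
bufs = demultiplex(muxed)
```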
  • the GNSS receiver 352 may provide metadata for the video streams with respect to time and/or location of the producer client device 300 when video streams are received, or when video is captured by the camera and/or microphone 350 in a contributor client device 112, for example.
  • the video controller 354 may manage the flow of video data in the producer client device 300, and the display generator 355 may generate video signals that are transmitted to the display device 346.
  • the video stream design GUI 334 is operable to generate a web browser page or a GUI screen with a plurality of windows or display ports for displaying one or more of the N video streams in the display 346, and playing corresponding audio via the speaker 348.
  • the video stream design GUI 334 allows a user to view one or more of the N video streams concurrently, and it provides interactive elements that allow the user to remotely design, edit, or in general, produce a single video stream that is generated by the remote servers 120 and 124, and comprises one or more of the N video streams 140 stitched into a single “final cut” video stream to be transmitted to the audience client devices 210.
  • a user may tap on a window displaying one of the N video streams at a particular time or frame position to indicate which video stream should be inserted into the “final cut” version video stream.
  • the stitch command generator 336 may generate a command based on the frame time/location and the selected video stream ID.
  • more than one stream may be selected to cause a split screen effect in the final cut version including the more than one selected video streams.
  • Other interactive GUI elements may allow the user to design the “final cut” by user selection or user input via a touch screen, voice input, a mechanical button, or keying, for example.
  • the video stream design GUI 334 may slow the playback speed or allow rewind of the one or more displayed video streams, in response to a user selection, to enable the user to make editing or splicing selections over a longer period of time.
  • a user may select in the GUI 334, which audio from the N received video streams to playback while viewing the one or more concurrent video streams.
  • Other video motion, graphics, or sound effects to be included in the “final cut” video stream may also be selected or input by the user in the video stream design GUI 334.
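On the producer side, a gesture such as a double tap reduces to capturing the tapped stream's identifier and a UTC time stamp, and packaging them as a stitch event command. A hedged sketch of the stitch command generator's role (method and field names are illustrative, not from the specification):

```python
import time

class StitchCommandGenerator:
    """Client-side sketch: turn a GUI selection into a stitch command."""
    def __init__(self, clip_id: str):
        self.clip_id = clip_id  # identifier of the "final cut" being produced

    def on_double_tap(self, window_stream_id: str, tap_time_utc=None) -> dict:
        # A double tap on a stream's window captures the stream ID and a
        # UTC time stamp marking where to splice it into the final cut.
        ts = tap_time_utc if tap_time_utc is not None else time.time()
        return {"clip_id": self.clip_id,
                "selected_stream_id": window_stream_id,
                "splice_time_utc": ts}

gen = StitchCommandGenerator(clip_id="final-cut-1")
command = gen.on_double_tap("stream-2", tap_time_utc=1700000000.0)
```

The returned dictionary would then be serialized and sent over the wireless network to the stitch process server as the stitch event command.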
  • the servers 120 and 124 perform the actual stitching, switching, or modifying of the video streams 140 to create the “final cut” version video stream, and the transmission of the “final cut” version video stream to the audience client devices 210.
  • the “final cut” version video stream may be designed or produced in real time as the N video streams 140 are processed at the producer client device 110.
  • the stitch command generator 336 is operable to receive the user’s selections made in the video stream design GUI 334 for designing or modifying the “final cut” version video stream, and generate a command to be sent to the servers 120 and/or 124.
  • the command is transmitted via the communication interface 320 and a wireless or wireline network such as the Internet.
  • the command may include metadata 342 corresponding to one or more of the N video streams.
  • the command may include an identifier of the “final cut” video stream or clip, a splice time, a video stream ID for a stream to be removed at the splice time, and/or a video stream ID to be replaced or stitched at the splice time.
  • Other information for video motion or graphics effects to be displayed in the final cut version video stream may also be included in the command sent to the servers 120 and/or 124.
  • the memory 340 stores, among other things, the video stream metadata that may be utilized by the stitch command generator 336 when generating the command to be sent to the servers 120 and/or 124.
  • the metadata may pertain to the N received video streams and may include metadata received from the servers 120 and/or 124, data generated by the producer client device 300, or data generated by user input.
  • the memory 340 further stores, among other things, program instructions 344 that when executed by the one or more electronic processors 338, cause the electronic processors 338 to perform all or a portion of the functions described herein of the producer client device 300.
  • the program instructions 344 may cause the electronic processors 338 to perform all or a portion of the functions described herein of the contributor client devices 112 and/or the audience client devices 210.
  • FIG. 4 is a block diagram of a computer system 400 for generating an output video stream by stitching portions of multiple video streams together, based on commands received from a wireless mobile device.
  • the computer system 400 may stream the output video to a plurality of remote devices, for example, via a wireless network and/or the Internet.
  • a computer system 400 includes suitable logic, circuitry, interfaces and code, which enables the computer system 400 to perform operations similar to or the same as operations of the server 120 and/or the server 124.
  • the computer system 400 may comprise multiple physical resources, for example, physical systems that may be communicatively coupled via a network and support cloud computing services.
  • the computer system 400 may be any of various types of devices, including, but not limited to, a personal computer system, desktop computer, laptop, notebook, or netbook computer, mainframe computer system, handheld computer, mobile telephone, workstation, network computer, a camera, a set top box, a mobile device, a consumer device, video game console, handheld video game device, application server, storage device, a peripheral device such as a switch, modem, router, or another type of computing or electronic device.
  • the computer system 400 is an example of a computer system that may be configured to implement the server 120 and/or the server 124.
  • computer system 400 includes one or more processors 410 coupled to a system memory 420 via an input/output (I/O) interface 430.
  • Computer system 400 further includes a network interface 442 coupled to I/O interface 430, that is operable to transmit and receive video streams, for example, the N video streams 140, the video stream 144, and the output “final cut” video stream.
  • the computer system 400 may include one or more input/output devices 450, such as cursor control device, a keyboard, and display(s).
  • it is contemplated that embodiments may be implemented using a single instance of computer system 400, while in other embodiments multiple such systems, or multiple nodes making up computer system 400, may be configured to host different portions or instances of embodiments.
  • some elements may be implemented via one or more nodes of computer system 400 that are distinct from those nodes implementing other elements.
  • computer system 400 may be a uniprocessor system including one processor 410, or a multiprocessor system including several processors 410 (e.g., two, four, eight, or another suitable number).
  • Processors 410 may be any suitable processor capable of executing instructions.
  • processors 410 may implement any of a variety of instruction set architectures (ISAs), such as the x86, PowerPC, SPARC, or MIPS ISAs, or any other suitable ISA.
  • ISAs instruction set architectures
  • processors 410 may commonly, but not necessarily, implement the same ISA.
  • At least one processor 410 may be a graphics processing unit.
  • a graphics processing unit or GPU may be considered a dedicated graphics-rendering device for a personal computer, workstation, game console or other computing or electronic device.
  • Modern GPUs may be very efficient at manipulating and displaying computer graphics, and their highly parallel structure may make them more effective than typical CPUs for a range of complex graphical algorithms.
  • a graphics processor may implement a number of graphics primitive operations in a way that makes executing them much faster than drawing directly to the screen with a host central processing unit (CPU).
  • the image or video processing methods disclosed herein may, at least in part, be implemented by program instructions configured for execution on one of, or parallel execution on two or more of, such GPUs.
  • the GPU(s) may implement one or more application programming interfaces (APIs) that permit programmers to invoke the functionality of the GPU(s). Suitable GPUs may be commercially available from vendors such as NVIDIA Corporation, ATI Technologies (AMD), and others.
  • APIs application programming interfaces
  • System memory 420 may be configured to store program instructions 422 accessible by the one or more processors 410.
  • system memory 420 may be implemented using any suitable memory technology, such as static random access memory (SRAM), synchronous dynamic RAM (SDRAM), nonvolatile/Flash-type memory, or any other type of memory.
  • SRAM static random access memory
  • SDRAM synchronous dynamic RAM
  • program instructions and data implementing desired functions, such as those described above for various embodiments, are shown stored within system memory 420 as program instructions 422, video stream metadata 424, and logging data 128.
  • program instructions and/or data may be received, sent or stored upon different types of computer-accessible media or on similar media separate from system memory 420 or computer system 400.
  • the database 454 that is accessible via a network may store, among other things, video stream data, video stream metadata and data for use in an interactive graphical interface that may be implemented on the computer system 400.
  • a computer-accessible medium may include storage media or memory media such as magnetic or optical media, e.g., disk or CD/DVD-ROM coupled to computer system 400 via I/O interface 430.
  • Program instructions and data stored via a computer-accessible medium may be transmitted by transmission media or signals such as electrical, electromagnetic, or digital signals, which may be conveyed via a communication medium such as a network and/or a wireless link, such as may be implemented via network interface 442.
  • I/O interface 430 may be configured to coordinate I/O traffic between processor 410, system memory 420, and any peripheral devices in the device, including network interfaces 442 or other peripheral interfaces, such as input/output devices 450.
  • I/O interface 430 may perform any necessary protocol, timing or other data transformations to convert data signals from one component (e.g., system memory 420, or buffer 440) into a format suitable for use by another component (e.g., processor 410).
  • I/O interface 430 may include support for devices attached through various types of peripheral buses, such as a variant of the Peripheral Component Interconnect (PCI) bus standard or the Universal Serial Bus (USB) standard, for example.
  • I/O interface 430 may be split into two or more separate components, such as a north bridge and a south bridge, for example.
  • some or all of the functionality of I/O interface 430 such as an interface to system memory 420, may be incorporated directly into processor 410.
  • Network interface 442 may be configured to allow data to be exchanged between computer system 400 and other devices attached to a network, such as other computer systems, the database 454, the producer client device 300, the contributor client devices 112, the audience client devices 210, or between nodes of computer system 400.
  • the network interface 442 may support communication via wired or wireless general data networks, such as any suitable type of Ethernet network, for example; via telecommunications/telephony networks such as analog voice networks or digital fiber communications networks; via storage area networks such as Fiber Channel SANs, or via any other suitable type of network and/or protocol.
  • Input/output devices 450 may, in some embodiments, include one or more display terminals, keyboards, keypads, touchpads, scanning devices, voice or optical recognition devices, or any other devices suitable for entering or retrieving data by one or more computer systems 400. Multiple input/output devices 450 may be present in computer system 400 or may be distributed on various nodes of computer system 400. In some embodiments, similar input/output devices may be separate from computer system 400 and may interact with one or more nodes of computer system 400 through a wired or wireless connection, such as over network interface 1240.
  • instructions stored on a computer-accessible medium separate from computer system 400 may be transmitted to computer system 400 via transmission media or signals such as electrical, electromagnetic, or digital signals, conveyed via a communication medium such as a network and/or a wireless link.
  • Various embodiments may further include receiving, sending or storing instructions and/or data implemented in accordance with the foregoing description upon a computer-accessible medium. Accordingly, the present embodiments may be practiced with other computer system configurations.
  • a computer-accessible medium may include storage media or memory media such as magnetic or optical media, e.g., disk or DVD/CD-ROM, volatile or non-volatile media such as RAM (e.g. SDRAM, DDR, RDRAM, SRAM, etc.), ROM, etc., as well as transmission media or signals such as electrical, electromagnetic, or digital signals, conveyed via a communication medium such as network and/or a wireless link.
  • Video stream dataflow between computer system 400 and one of the client devices 300, 112, or 210 is described with respect to FIGS. 1-3.
  • the computer system 400 may concurrently receive a plurality of input video streams from a plurality of remote wireless devices via the network interfaces 442 and a network (e.g., a LAN and/or WAN).
  • Each of the plurality of concurrent input video streams may comprise multiplexed video/audio data.
  • Some of the plurality of multiplexed video/audio data streams may include video data compressed at a different data rate and the plurality of streams may be received with different latency.
  • the plurality of multiplexed video/audio data streams may be stored in an incoming buffer of the buffers 440.
  • the buffers 440 may relay the plurality of multiplexed video/audio data streams to the A/V de-multiplexing system 432, which may concurrently de-multiplex each stream into two separate sets of data comprising video information and audio information. Integrity of each of the two sets of data may be checked.
  • the transcoders 426 may concurrently transcode the plurality of video streams to reduce the size of each of the plurality of video streams such that each of the plurality of video streams consumes a transmission bandwidth appropriate for concurrently transmitting the plurality of video streams in one video stream via a wireless network to a mobile wireless device.
  • the reduction in size of each of the plurality of video streams may be appropriate for concurrently viewing the plurality of video streams in a display screen such that the display screen appears as a composite image comprising the plurality of video streams.
  • Transmission of the plurality of transcoded video streams via the network interfaces 442 to the producer client device may be synchronized by the stream synchronizing system 434.
  • the stream design GUI generator 456 is operable to generate a display screen GUI or web page having a plurality of windows or display ports for concurrently displaying one or more of the N video streams at the producer client device 300 in the display 346, and playing corresponding audio via the speaker 348.
  • the GUI generator 456 may define a display screen for the video stream design GUI 334.
  • the stream design GUI generator 456 generates the interactive display screen that allows a user to view one or more of the N video streams concurrently, and it defines interactive elements that allow the user to remotely design, edit, or, in general, “produce” a single video stream for an audience.
  • the audience version video stream comprises one or more of the N video streams 140 that are stitched into a “final cut” video stream that is processed and transmitted to the audience client devices 210.
  • the generated interactive display screen may be transmitted to the producer client device 300 with the N video streams, transcoded by the transcoders 426 and synchronized by the stream synchronizing system 434.
  • the N video streams may be viewed at the producer client device 300 in the interactive display screen.
  • a user may tap on a window displaying one of the N video streams at a particular time or frame position to indicate which video stream should be inserted into the “final cut” version video stream.
  • the stitch command generator 336 may generate a command based on the frame time/location of the user’s tap, and a video stream ID corresponding to the tapped video stream.
  • more than one stream may be selected to cause a split-screen effect in the final cut version that includes the selected video streams.
  • Other interactive GUI elements may allow the user to design the audience “final cut” stream by user selection or user input via a touch screen, voice input, a mechanical button, or keying, for example.
  • the video stream design GUI 334 may slow the playback speed or allow rewind of the one or more displayed video streams, in response to a user selection, to enable the user to make editing or splicing selections over a longer period of time.
  • a user may select, in the GUI 334, which audio from the N received video streams to play back while viewing the one or more concurrent video streams.
  • Other video motion, graphics, or sound effects to be included in the “final cut” video stream may also be selected or input by the user in the video stream design GUI 334.
  • the servers 120 and 124 in the computer system 400 perform the actual stitching, switching, or modifying of the video streams 140 to create the “final cut” version video stream.
  • the computer system 400 also transmits the “final cut” version video stream to the audience client devices 210.
  • The “final cut” version video stream may be designed or produced in real time at the producer client device 110, as the N video streams 140 are processed at the server 400 to transmit the “final cut” version video stream to the audience client devices 210.
  • One or more of the plurality of input video streams may be processed by the transcoders 426 for streaming to one or more audience client devices 210 as the audience version video stream. Based on a command received from the remote producer wireless device 300, a user selected one of the input video streams may be stitched into the audience version video stream at a specified time or position by the compute engine 126 and splice handler 122.
  • the audience version video stream data may be encoded by the transcoder 426, multiplexed by the multiplexor 430 and buffered in an output buffer of the buffer 440 for transmission to one or more audience client devices 210.
  • FIG. 5 is a flow chart for stitching video streams at a server based on commands received from a wireless mobile device to generate an output video stream.
  • a plurality of video streams 140 are received by the server computer system 400 from a plurality of contributor wireless mobile devices 112.
  • each of the contributor wireless mobile devices may capture video and transmit video streams to the server computer 400 via a wireless cellular network.
  • the disclosure is not limited in this regard, and any suitable communication networks may be utilized for transmission of the video streams 140 to the server computer 400.
  • In step 520, the plurality of video streams 140 is stored in the buffer 440 and transcoded by the transcoder 426. Positions of the plurality of video streams are tracked in the logging module 128.
  • the plurality of video streams 140 are transmitted to the producer wireless mobile device 110 at a first delay time.
  • the plurality of video streams 140 may be transmitted to the producer wireless mobile device 110 via a wireless cellular network.
  • the disclosure is not limited in this regard, and any suitable communication networks may be utilized to transmit the video streams 140 to the producer wireless mobile device 110.
  • a first stream ID selection is received from the producer wireless mobile device 110, and the selected first video stream is transmitted to a plurality of audience wireless mobile devices at a second delay, as a current audience video stream.
  • the first stream ID selection is transmitted by the producer wireless mobile device 110 via a wireless cellular network to the server computer 400.
  • the disclosure is not limited in this regard, and any suitable communication networks may be utilized for transmission of the first stream ID selection to the server computer 400.
  • a video stitch event command including a second stream ID selection and a splice time is received from the producer wireless mobile device 110, and the selected second video stream is stitched into the transmission of the current audience video stream at the splice time.
  • the selected second video stream may replace the selected first video stream in the current audience video stream, or be added to the current audience video stream.
  • FIG. 6 is a flow chart for designing an output video stream based on multiple video streams at a wireless mobile device, and generating video stitching commands to a server for generating the output video stream.
  • a plurality of video streams 144 are received at a producer wireless mobile device 110 from a server computer system 400, where the plurality of video streams 144 comprise the plurality of video streams 140.
  • the producer wireless mobile device 110 and the server computer system 400 may communicate via a wireless cellular network.
  • the disclosure is not limited in this regard, and any suitable communication networks may be utilized for communication between the producer wireless mobile device 110 and the server computer 400.
  • In step 620, the plurality of video streams 144 are displayed at the producer wireless mobile device 110 in a video stream design GUI 334.
  • In step 630, a user selection of a first one of the video streams 144 is received via the video stream design GUI 334 for use as a current audience view video stream at the server computer system 400.
  • a video stream stitch command is generated by the video stitch command generator 336 and is transmitted to the server computer system 400.
  • the video stream stitch command includes a stream ID of the first selected video stream for transmission as the current audience view video stream by the server computer system 400.
  • the current audience view video stream may be transmitted by the computer system 400 to the audience devices via a wireless cellular network.
  • any suitable communication networks may be utilized for communication of the current audience view video stream to the audience devices.
  • In step 650, a user selection of a second video stream from the plurality of video streams is received via the video stream design GUI 334, for a user selected splice time or frame position.
  • the second selected video stream is associated with a second stream ID.
  • a second video stream stitch command is generated by the video stitch command generator 336 and is transmitted to the server computer system 400.
  • the video stream stitch command includes the first video stream ID, the second video stream ID, and a splice time for stitching the second selected video stream into the current audience view video stream.
  • the second selected video stream may replace the first selected video stream in the current audience view video stream at the splice time, or it may be added to the current audience view video stream at the splice time.
  • the embodiments provide, among other things, systems and methods for producing, from a wireless mobile device, a video stream that is stitched by a remote server from a plurality of contributor video streams.
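By way of illustration only, the stitch-command handling described in steps 510 through 660 above may be sketched as follows. The names `StitchCommand` and `StreamRouter` are hypothetical and do not appear in the disclosure; this is a simplified model of queuing a stitch event and switching the current audience stream ID at the splice time:

```python
from dataclasses import dataclass


@dataclass
class StitchCommand:
    # Metadata sent by the producer device (field names are illustrative).
    current_stream_id: str   # first video stream ID
    new_stream_id: str       # second video stream ID
    splice_time: float       # UTC time at which to switch streams


class StreamRouter:
    """Tracks which contributor stream feeds the audience output."""

    def __init__(self, default_stream_id):
        self.current_stream_id = default_stream_id
        self.pending = []  # stitch events not yet applied

    def handle_command(self, cmd: StitchCommand):
        # Queue the stitch event; it takes effect at cmd.splice_time.
        self.pending.append(cmd)
        self.pending.sort(key=lambda c: c.splice_time)

    def stream_id_at(self, now: float) -> str:
        # Apply every queued stitch event whose splice time has passed.
        while self.pending and self.pending[0].splice_time <= now:
            self.current_stream_id = self.pending[0].new_stream_id
            self.pending.pop(0)
        return self.current_stream_id
```

In this sketch, selecting a second stream simply replaces the first in the outgoing stream at the splice time, matching the replacement case of step 660; the split-screen case would instead track a set of current stream IDs.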

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Library & Information Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

A system and method are provided for controlling, by a wireless mobile device, the production of a video stream that is generated by a remote server. The remote server receives multiple video streams from multiple contributor wireless clients, and streams the multiple contributor videos to a producer wireless client device. The multiple streams are displayed in a GUI at the producer wireless device, and in response to a user selection of one of the multiple displayed video streams in the GUI, the producer wireless mobile device transmits a stream ID and splice time in a command to the remote server. The remote server selects one of the contributor video streams that corresponds to the stream ID received in the command, and stitches the corresponding contributor video stream into the video stream generated by the remote server. The video stream generated by the remote server may be broadcast to an audience.

Description

METHOD AND SYSTEM FOR CLOUD VIDEO STITCHING
[0001] The present invention relates to controlling, by a wireless mobile device, production of a video stream that is generated by a remote server.
[0002] Existing systems use cloud switchers that include software suites built in the cloud to allow producers to live-switch between streams. Often, these existing systems experience discontinuity between the streams, and it is difficult to execute switching between the streams without losing integrity in the quality of the streams.
[0003] Other aspects and embodiments will become apparent by consideration of the detailed description and accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] FIG. 1 is a block diagram of a cloud based video stitching system including video contributor and video producer mobile devices.
[0005] FIG. 2 is a data flow diagram for producing, at a wireless mobile device, a video clip generated by a remote device, based on multiple video streams received via a network.
[0006] FIG. 3 is a block diagram of a producer wireless mobile device for designing an output video stream based on multiple input video streams.
[0007] FIG. 4 is a block diagram of a computer system for generating an output video stream by stitching portions of multiple video streams based on commands received from a wireless mobile device.
[0008] FIG. 5 is a flow chart for stitching video streams at a server based on commands received from a wireless mobile device to generate an output video stream.
[0009] FIG. 6 is a flow chart for designing an output video stream based on multiple video streams at a wireless mobile device, and generating video stitching commands to a server for generating the output video stream.
[0010] Before any embodiments are explained in detail, it is to be understood that the embodiments are not limited in their application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. Other embodiments are possible and embodiments described are capable of being practiced or of being carried out in various ways.
[0011] It should be noted that a plurality of hardware and software based devices, as well as a plurality of different structural components, may be used to implement various embodiments. In addition, it should be understood that embodiments may include hardware, software, and electronic components or modules that, for purposes of discussion, may be illustrated and described as if the majority of the components were implemented solely in hardware. However, one of ordinary skill in the art, and based on a reading of this detailed description, would recognize that, in at least one embodiment, the electronic based aspects may be implemented in software (e.g., stored on non-transitory computer-readable medium) executable by one or more processors. Furthermore, and as described in subsequent paragraphs, the specific configurations illustrated in the drawings are intended to exemplify embodiments, and other alternative configurations are possible. For example, engines, controllers, generators, processors and the like, that are described in the specification can include standard processing components, such as one or more electronic processors, one or more computer-readable medium modules, one or more input/output interfaces, and various connections (e.g., a system bus) connecting the components. In some instances, the engines, controllers, generators, and processors described in the specification may be implemented in one of or a combination of a general processor, an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), or the like.
[0012] An embodiment of the invention allows a user to produce or design a video stream from a remote mobile wireless device based on multiple input video streams. The multiple input video streams are contributed by multiple wireless devices and sent to a server. The server sends the multiple video streams to the remote mobile wireless device. The server stitches portions of the contributed video streams according to commands comprising stream metadata received from the remote mobile wireless device, and streams the resulting stitched video stream to an audience of wireless mobile device users.
[0013] The present invention solves the technical problem of designing or producing a video stream from a remote mobile wireless device using inputs from multiple devices for generation and distribution of the video stream by a server device.
[0014] FIG. 1 is a block diagram of a cloud based video stitching system including video contributor and video producer mobile devices. Referring to FIG. 1, a cloud based video stitching system 100 includes a producer client device 110 and a plurality of contributor client devices 112a, 112b, 112c and 112d. The contributor client devices 112a, 112b, 112c and 112d may be referred to as the contributor client devices 112. Also shown in FIG. 1 are a stitch process server 120, a splice handler 122, an input/output (I/O) server 124, a compute engine 126, and a logging manager 128. Moreover, the system 100 includes four separate input video streams 140, a video stream 144, and a stitch event command 142. Although four contributor client devices 112 and four separate video streams 140 are shown in FIG. 1, the disclosure is not limited to any specific number of contributor devices 112 and respective video streams 140, and any suitable number N of contributor client devices 112 and respective video streams 140 may be utilized.
[0015] Each of the plurality of contributor client devices 112 is a wireless mobile communication device, for example, a wireless mobile phone or a handheld communication device (e.g., a tablet or other computer). Each of the contributor devices 112 comprises a video camera operable to capture video images and sound, and each includes an application that enables the respective contributor client device to wirelessly stream its captured video to the I/O server 124 via a respective one of the four input video streams 140. In some embodiments, the four input video streams 140 may include multiplexed audio and video packets. In some instances, multiple respective users of the multiple contributor client devices 112 may be concurrently shooting and streaming video at a particular event. The multiple users may each capture video comprising a different aspect or different view of the event and may stream video of their particular view to the I/O server 124. However, the disclosure is not limited in this regard, and the different contributor client devices 112 may capture any images of interest and stream the captured images to the I/O server 124.
[0016] The producer client device 110 is a wireless communication device, for example, a wireless phone or handheld communication device that is operable to receive and display a plurality of video streams. The producer client device 110 may receive the four input video streams 140 from the I/O server 124 within the video stream 144 that may also comprise metadata regarding the four input video streams 140. The producer client device 110 includes a software application that generates a video stream viewer in a graphical user interface (GUI). The video stream viewer allows a user to view the multiple input video streams 140 and define a “final cut” version video stream derived from the multiple input video streams 140. The “final cut” video stream is stitched and distributed by the I/O server 124 to audience devices. The producer client device 110 user may make selections in the viewer GUI to generate editing and stitching or splicing commands to the stitch process server 120 for defining the “final cut” version video stream. For example, the user may use the video stream viewer GUI to select one of the multiple video input streams 140, which will in turn make the selected stream a focused view in the “final cut” version video stream that is output by the server 124. As the user repeats the selection action between the input streams 140, the focused view in the “final cut” version changes according to the repeated selection action. The selections made in the producer client device GUI are ultimately cut as an on-screen portion in the final rendering of the production by the server 124.
[0017] In one embodiment, during the broadcasting of the multi-stream videos 140 to the producer client device 110, video stream selections are received in the video stream viewer GUI. The timing of a selection within a video stream triggers a Coordinated Universal Time (UTC) time stamp. The time stamp indicates at what time or frame to splice and stitch the selected stream into the “final cut” version video stream that is output by the server 124. For example, a double tap gesture by the user on a specific video stream, as the streams 140 are displayed in the viewer GUI, identifies a UTC time stamp. The UTC time stamp is communicated to the stitch process server 120. The time stamp indicates at what time or frame the currently rendered “final cut” video stream and the newly selected stream are spliced and stitched together for distribution to audience devices.
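By way of illustration only, the tap handling of paragraph [0017] might be sketched as follows on the producer device; the function and field names are hypothetical and not part of the disclosure:

```python
import time


def make_stitch_event(selected_stream_id, current_stream_id):
    """Build the stitch event payload sent to the stitch process server.

    Field names are illustrative, not taken from the specification.
    """
    return {
        "current_stream_id": current_stream_id,
        "new_stream_id": selected_stream_id,
        # UTC time stamp captured at the moment of the tap gesture;
        # it tells the server at what time/frame to splice the streams.
        "splice_time_utc": time.time(),
    }
```

A GUI layer would call such a function from its tap-gesture callback and transmit the returned payload within the stitch event command 142.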
[0018] The editing, stitching, and splicing commands are communicated via a wireless network and/or the Internet to the stitch process server 120 within the stitch event command 142. The stitch event command 142 may include video stream metadata, for example, the UTC time stamp that indicates where to splice or stitch two of the video streams 140 for the “final cut” video stream. The “final cut” video stream may be referred to as a video clip or an audience video stream.
[0019] In some embodiments, one or more of the stitch process server 120, the splice handler 122, the I/O server 124, the compute engine 126, and the logging manager 128 may comprise cloud computing resources that may include shared configurable resources accessible by the contributor client devices 112 and the producer client device 110 over the Internet.
[0020] The I/O server 124 may comprise communication interfaces for receiving the four separate video input streams 140 from the contributor client devices 112 via a wireless network and/or the Internet. The I/O server 124 comprises the compute engine 126, and the logging manager 128. The logging manager 128 may log metadata for each of the separate video input streams 140 and for the video stream 144 that is sent to the producer client device 110. The metadata may include, for example, stream identifiers, client identifiers, transmission rates, and stream positions. The logging manager 128 may also store metadata for the outgoing “final cut” version video stream including a current stream identifier that indicates which of the four separate video input streams 140 is included in the outgoing “final cut” video stream, a transmission rate, and a stream position for the outgoing “final cut” video stream.
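The metadata logging of paragraph [0020] may be sketched, by way of illustration, as follows; the class and field names are hypothetical and chosen only to mirror the metadata categories named above (stream identifiers, client identifiers, transmission rates, and stream positions):

```python
class LoggingManager:
    """Illustrative sketch of per-stream metadata logging."""

    def __init__(self):
        self.streams = {}          # stream ID -> metadata dict
        self.current_output = None  # stream ID feeding the "final cut"

    def log_stream(self, stream_id, client_id, rate_kbps, position):
        # Record metadata for one incoming contributor stream.
        self.streams[stream_id] = {
            "client_id": client_id,
            "rate_kbps": rate_kbps,
            "position": position,  # current frame/time position
        }

    def set_current_output(self, stream_id):
        # Record which input stream is currently in the outgoing
        # "final cut" video stream.
        self.current_output = stream_id
```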
[0021] The compute engine 126 may manage receipt and buffering of the four separate video input streams 140, and buffering and transmission of the “final cut” video stream sent to a plurality of audience client devices (see Figure 2, audience client devices 210). The splice handler 122 may receive the stitch event command from the producer client device 110 and may determine which of the four video input streams 140 to stitch into the outgoing “final cut” video stream. The splice handler 122 may also determine the specified frame position or a splice time in the outgoing video stream for the stitching event based on the command. The splice handler 122 may communicate the determined information in a second stitch event command and send it to the I/O server 124. The compute engine 126 may then stitch the newly identified video input stream into the outgoing “final cut” video stream, and the logging manager 128 may log the newly identified video stream as the current outgoing video stream.
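One way the splice handler of paragraph [0021] might resolve a UTC splice time into a frame position for the second stitch event is sketched below. This is an assumption-laden illustration: the field names, and the simplification that each stream has a known UTC start time and constant frame rate, are not taken from the disclosure:

```python
def handle_stitch_event(cmd, logged_positions):
    """Splice-handler sketch: resolve the frame position for a stitch.

    `cmd` carries the new stream ID and a UTC splice time;
    `logged_positions` maps stream IDs to (start_utc, fps), as might be
    tracked by a logging manager. All names are illustrative.
    """
    start_utc, fps = logged_positions[cmd["new_stream_id"]]
    # Convert the UTC splice time into a frame index within the stream.
    frame = int((cmd["splice_time_utc"] - start_utc) * fps)
    # Second stitch event, forwarded to the I/O server.
    return {"stream_id": cmd["new_stream_id"], "frame": frame}
```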
[0022] FIG. 2 is a data flow diagram for producing, at a wireless mobile device, a video clip generated by a remote device, based on multiple video streams received via a network. Referring to FIG. 2, the system 200 comprises the wireless mobile producer client device 110, N wireless mobile contributor devices 112, the I/O server 124 and the stitch process server 120, and a plurality of audience client devices 210. The plurality of audience client devices may include wireless communication devices, such as a mobile phone or handheld communication device, that include communication interfaces and user interfaces for wirelessly receiving and displaying video streams from the I/O server 124.
[0023] The logging manager 128 of the I/O server 124 may store a current selected or default video stream identifier nx in memory. The stored video stream identifier may indicate which video stream is a current outgoing video stream from the server 124 for distribution to the audience client devices 210.
[0024] Each of N contributor client devices 112 capture and transmit a respective video stream to the I/O server 124. The I/O server 124 decodes the N video streams, stores the N video streams in a buffer and logs position information for the N video streams.
[0025] The I/O server 124 encodes the N video streams for transmission to the producer client device 110 and streams the N video streams to the producer client device, at a first latency time, along with metadata for each of the N video streams. The producer client device 110 displays the N video streams in a graphical user interface.
[0026] The I/O server 124 encodes the current selected or default nx video stream for transmission to one or more of the audience client devices 210 and transmits the outgoing video stream having the identifier nx to the one or more audience client devices 210, at a second latency time, for display of the nx video stream at the one or more audience client devices 210.
[0027] The graphical user interface of producer client device 110 that displays the N video streams allows a user to design, in real time, the “final cut” version video stream for output from the server to the audience client devices. The graphical user interface receives a user selection ny of one of the N video streams, for example, via a touch screen selection in a window displaying the ny video stream. The producer client device 110 generates a stitch event command that includes metadata for the selected ny video stream including, at least, the ny video stream ID and a splice time, and transmits the stitch command to the stitch process server 120.
[0028] The splice handler 122 of the stitch process server 120 receives the stitch event command and determines the video input stream and the frame position or splice time at which to stitch the ny video stream into the outgoing “final cut” video stream. The compute engine 126 of the I/O server 124 may then stitch, in real time, the ny video input stream into the outgoing “final cut” video stream in place of the nx video stream, and seamlessly continue transmitting the outgoing “final cut” video stream to the audience client devices 210. The logging manager 128 may log the ny video stream identifier as the current outgoing video stream.
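The real-time substitution of stream ny for stream nx described in paragraph [0028] may be sketched, under simplifying assumptions, as follows. The function name and the representation of a stitch schedule as (splice time, stream ID) pairs are illustrative only; in practice the compute engine would operate on encoded frames, not strings:

```python
def output_frames(frames_by_stream, schedule, timestamps):
    """Compute-engine sketch: stitch one stream into the output in place
    of another at a splice time, without interrupting the frame cadence.

    `frames_by_stream` maps stream IDs to frame lists; `schedule` is a
    time-sorted list of (splice_time, stream_id) pairs, the first entry
    being the initial stream; `timestamps` gives the output tick times.
    """
    out = []
    for i, t in enumerate(timestamps):
        # Pick the last scheduled stream whose splice time has passed.
        sid = [s for ts, s in schedule if ts <= t][-1]
        out.append((sid, frames_by_stream[sid][i]))
    return out
```

The seamless continuation follows from the output loop never pausing: only the source of the next frame changes at the splice time.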
[0029] FIG. 3 is a block diagram of a producer wireless mobile device for designing an output video stream based on multiple input video streams. Referring to FIG. 3, a producer wireless mobile communication device 300 includes suitable logic, circuitry, interfaces and code, which enables the producer wireless mobile communication device 300 to perform operations similar to or substantially the same as operations of the producer client device 110. In some embodiments, the producer wireless mobile communication device 300 is a mobile phone. The producer wireless mobile communication device 300 may be referred to as the producer client device 300. In some embodiments, the producer client device 300 includes suitable logic, circuitry, interfaces and code, which enables it to perform operations similar to or substantially the same as operations of the contributor client device 112 or the audience client device 210.
[0030] The producer client device 300 may be, for example, a wireless mobile phone, and may include, among other things, communication interfaces 320, an audio/video demultiplexing system 322, video and audio buffers 324, a video decoder and a video encoder 326, an audio decoder and an audio encoder 328, a video controller 354, a display generator 355, a telephony processing system 330, one or more user interfaces 332, a video stream design graphical user interface (GUI) 334, a stitch command generator 336, one or more electronic processors 338, a memory 340, video stream metadata memory 342, software program instruction memory 344, a display system 346, a speaker system 348, a camera 350 and a GNSS receiver 352.
[0031] The communication interfaces 320 may comprise a plurality of interfaces for wireless and/or wireline communication that are operable to communicate with the servers 120 and/or 124 via a network. The communication interfaces 320 may support wireless communication technologies for transmitting and/or receiving signals including video streaming signals via wide area or cellular communication networks, wireless local area networks (e.g., 802.11 communications), and/or via personal area communication (e.g., Bluetooth communications). One or more of the communication interfaces 320 are operable to support wireless video streaming input received from the servers 120 and/or 124.

[0032] The audio/video demultiplexing system 322 is operable to (i) receive one or more multiplexed audio/video streams, for example, the video stream 144 that includes the four different video streams from the contributor client devices 112, (ii) demultiplex the audio/video streams into separate audio and video streams, and (iii) store the audio and video streams to the audio and video buffers 324. The video decoder 326 and audio decoder 328 are operable to decode the video and audio streams, respectively, for display and audio playback by the display 346 and speaker 348, respectively. In some embodiments, the GNSS receiver 352 may provide metadata for the video streams with respect to time and/or location of the producer client device 300 when video streams are received, or when video is captured by the camera and/or microphone 350 in a contributor client device 112, for example.
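A minimal sketch of the demultiplexing steps (i)-(iii) above, assuming illustrative packet fields `stream_id`, `kind`, and `payload` (the real multiplex format is not specified here):

```python
def demultiplex(av_packets):
    """Sketch: split interleaved audio/video packets into separate
    per-stream audio and video buffers."""
    video_buffers, audio_buffers = {}, {}
    for pkt in av_packets:
        target = video_buffers if pkt["kind"] == "video" else audio_buffers
        # Append the payload to the buffer for this stream, creating it on first use.
        target.setdefault(pkt["stream_id"], []).append(pkt["payload"])
    return video_buffers, audio_buffers

v, a = demultiplex([
    {"stream_id": "s1", "kind": "video", "payload": b"v0"},
    {"stream_id": "s1", "kind": "audio", "payload": b"a0"},
])
```

The separated buffers would then feed the video decoder 326 and audio decoder 328 respectively.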
[0033] The video controller 354 may manage the flow of video data in the producer client device 300, and the display generator 355 may generate video signals that are transmitted to the display device 346.
[0034] In some embodiments, the video stream design GUI 334 is operable to generate a web browser page or a GUI screen with a plurality of windows or display ports for displaying one or more of the N video streams in the display 346, and playing corresponding audio via the speaker 348. The video stream design GUI 334 allows a user to view one or more of the N video streams concurrently, and it provides interactive elements that allow the user to remotely design, edit, or in general, produce a single video stream that is generated by the remote servers 120 and 124, and comprises one or more of the N video streams 140 stitched into a single “final cut” video stream to be transmitted to the audience client devices 210. For example, a user may tap on a window displaying one of the N video streams at a particular time or frame position to indicate which video stream should be inserted into the “final cut” version video stream. As a result, the stitch command generator 336 may generate a command based on the frame time/location and the selected video stream ID. In some embodiments, more than one stream may be selected to cause a split screen effect in the final cut version including the more than one selected video streams. Other interactive GUI elements may allow the user to design the “final cut” by user selection or user input via a touch screen, voice input, a mechanical button, or keying, for example. Moreover, in some embodiments, the video stream design GUI 334 may slow the playback speed or allow rewind of the one or more displayed video streams, in response to a user selection, to enable the user to make editing or splicing selections over a longer period of time. In addition, a user may select in the GUI 334 which audio from the N received video streams to play back while viewing the one or more concurrent video streams.
Other video motion, graphics, or sound effects to be included in the “final cut” video stream may also be selected or input by the user in the video stream design GUI 334. Although the design and editing of the “final cut” video stream is performed on the producer client device 110, the servers 120 and 124 perform the actual stitching, switching, or modifying of the video streams 140 to create the “final cut” version video stream, and the transmission of the “final cut” version video stream to the audience client devices 210. In this manner, the “final cut” version video stream may be designed or produced in real time as the N video streams 140 are processed at the producer client device 110.
[0035] The stitch command generator 336 is operable to receive the user’s selections made in the video stream design GUI 334 for designing or modifying the “final cut” version video stream, and to generate a command to be sent to the servers 120 and/or 124. The command is transmitted via the communication interface 320 and a wireless or wireline network such as the Internet. The command may include metadata 342 corresponding to one or more of the N video streams. For example, the command may include an identifier of the “final cut” video stream or clip, a splice time, a video stream ID for a stream to be removed at the splice time, and/or a video stream ID to be replaced or stitched at the splice time. Other information for video motion or graphics effects to be displayed in the final cut version video stream may also be included in the command sent to the servers 120 and/or 124.
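The command metadata enumerated above might be assembled as in this sketch; the `build_stitch_command` helper and the field names are hypothetical:

```python
def build_stitch_command(clip_id, splice_time, remove_stream_id, insert_stream_ids):
    """Sketch of the stitch command generator: package the user's GUI
    selections as metadata for the servers."""
    return {
        "clip_id": clip_id,                     # identifier of the "final cut" stream/clip
        "splice_time": splice_time,             # when to splice
        "remove_id": remove_stream_id,          # stream removed at the splice time
        "insert_ids": list(insert_stream_ids),  # stream(s) stitched in; >1 => split screen
        "split_screen": len(insert_stream_ids) > 1,
    }

cmd = build_stitch_command("final-cut-7", 42.0, "stream-1", ["stream-3", "stream-4"])
```

Selecting two insert IDs, as here, corresponds to the split screen effect described in paragraph [0034].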
[0036] The memory 340 stores, among other things, the video stream metadata that may be utilized by the stitch command generator 336 when generating the command to be sent to the servers 120 and/or 124. The metadata may pertain to the N received video streams and may include metadata received from the servers 120 and/or 124, data generated by the producer client device 300, or data generated by user input. The memory 340 further stores, among other things, program instructions 344 that when executed by the one or more electronic processors 338, cause the electronic processors 338 to perform all or a portion of the functions described herein of the producer client device 300. In some embodiments, the program instructions 344 may cause the electronic processors 338 to perform all or a portion of the functions described herein of the contributor client devices 112 and/or the audience client devices 210.

[0037] FIG. 4 is a block diagram of a computer system 400 for generating an output video stream by stitching portions of multiple video streams together, based on commands received from a wireless mobile device. The computer system 400 may stream the output video to a plurality of remote devices, for example, via a wireless network and/or the Internet. Referring to FIG. 4, the computer system 400 includes suitable logic, circuitry, interfaces and code, which enables the computer system 400 to perform operations similar to or the same as operations of the server 120 and/or the server 124. Although a single physical computer system 400 is illustrated in FIG. 4, the computer system 400 may comprise multiple physical resources, for example, physical systems that may be communicatively coupled via a network and support cloud computing services.
[0038] In various embodiments, the computer system 400 may be any of various types of devices, including, but not limited to, a personal computer system, desktop computer, laptop, notebook, or netbook computer, mainframe computer system, handheld computer, mobile telephone, workstation, network computer, a camera, a set top box, a mobile device, a consumer device, video game console, handheld video game device, application server, storage device, a peripheral device such as a switch, modem, router, or another type of computing or electronic device. The computer system 400 is an example of a computer system that may be configured to implement the server 120 and/or the server 124.
[0039] In the illustrated embodiment, computer system 400 includes one or more processors 410 coupled to a system memory 420 via an input/output (I/O) interface 430. Computer system 400 further includes a network interface 442 coupled to I/O interface 430, that is operable to transmit and receive video streams, for example, the N video streams 140, the video stream 144, and the output “final cut” video stream. The computer system 400 may include one or more input/output devices 450, such as a cursor control device, a keyboard, and display(s). It is contemplated that some embodiments may be implemented using a single instance of computer system 400, while in other embodiments multiple such systems, or multiple nodes making up computer system 400, may be configured to host different portions or instances of embodiments. For example, in one embodiment some elements may be implemented via one or more nodes of computer system 400 that are distinct from those nodes implementing other elements.
[0040] In various embodiments, computer system 400 may be a uniprocessor system including one processor 410, or a multiprocessor system including several processors 410 (e.g., two, four, eight, or another suitable number). Processors 410 may be any suitable processor capable of executing instructions. For example, in various embodiments, processors 410 may implement any of a variety of instruction set architectures (ISAs), such as the x86, PowerPC, SPARC, or MIPS ISAs, or any other suitable ISA. In multiprocessor systems, each of processors 410 may commonly, but not necessarily, implement the same ISA.
[0041] In some embodiments, at least one processor 410 may be a graphics processing unit. A graphics processing unit or GPU may be considered a dedicated graphics-rendering device for a personal computer, workstation, game console or other computing or electronic device. Modern GPUs may be very efficient at manipulating and displaying computer graphics, and their highly parallel structure may make them more effective than typical CPUs for a range of complex graphical algorithms. For example, a graphics processor may implement a number of graphics primitive operations in a way that makes executing them much faster than drawing directly to the screen with a host central processing unit (CPU). In various embodiments, the image or video processing methods disclosed herein may, at least in part, be implemented by program instructions configured for execution on one of, or parallel execution on two or more of, such GPUs. The GPU(s) may implement one or more application programmer interfaces (APIs) that permit programmers to invoke the functionality of the GPU(s). Suitable GPUs may be commercially available from vendors such as NVIDIA Corporation, ATI Technologies (AMD), and others.
[0042] System memory 420 may be configured to store program instructions 422 accessible by the one or more processors 410. In various embodiments, system memory 420 may be implemented using any suitable memory technology, such as static random access memory (SRAM), synchronous dynamic RAM (SDRAM), nonvolatile/Flash-type memory, or any other type of memory. In the illustrated embodiment, program instructions and data implementing desired functions, such as those described above for various embodiments, are shown stored within system memory 420 as program instructions 422, video stream metadata 424, and logging data 128. In other embodiments, program instructions and/or data may be received, sent or stored upon different types of computer-accessible media or on similar media separate from system memory 420 or computer system 400. Moreover, in some embodiments, the database 454 that is accessible via a network (not shown) may store, among other things, video stream data, video stream metadata and data for use in an interactive graphical interface that may be implemented on the computer system 400. Generally speaking, a computer-accessible medium may include storage media or memory media such as magnetic or optical media, e.g., disk or CD/DVD-ROM coupled to computer system 400 via I/O interface 430. Program instructions and data stored via a computer-accessible medium may be transmitted by transmission media or signals such as electrical, electromagnetic, or digital signals, which may be conveyed via a communication medium such as a network and/or a wireless link, such as may be implemented via network interface 442.
[0043] In one embodiment, I/O interface 430 may be configured to coordinate I/O traffic between processor 410, system memory 420, and any peripheral devices in the device, including network interfaces 442 or other peripheral interfaces, such as input/output devices 450. In some embodiments, I/O interface 430 may perform any necessary protocol, timing or other data transformations to convert data signals from one component (e.g., system memory 420, or buffer 440) into a format suitable for use by another component (e.g., processor 410). In some embodiments, I/O interface 430 may include support for devices attached through various types of peripheral buses, such as a variant of the Peripheral Component Interconnect (PCI) bus standard or the Universal Serial Bus (USB) standard, for example. In some embodiments, the function of I/O interface 430 may be split into two or more separate components, such as a north bridge and a south bridge, for example. In addition, in some embodiments some or all of the functionality of I/O interface 430, such as an interface to system memory 420, may be incorporated directly into processor 410.
[0044] Network interface 442 may be configured to allow data to be exchanged between computer system 400 and other devices attached to a network, such as other computer systems, the database 454, the producer client device 300, the contributor client devices 112, the audience client devices 210, or between nodes of computer system 400. In various embodiments, the network interface 442 may support communication via wired or wireless general data networks, such as any suitable type of Ethernet network, for example; via telecommunications/telephony networks such as analog voice networks or digital fiber communications networks; via storage area networks such as Fibre Channel SANs; or via any other suitable type of network and/or protocol.
[0045] Input/output devices 450 may, in some embodiments, include one or more display terminals, keyboards, keypads, touchpads, scanning devices, voice or optical recognition devices, or any other devices suitable for entering or retrieving data by one or more computer systems 400. Multiple input/output devices 450 may be present in computer system 400 or may be distributed on various nodes of computer system 400. In some embodiments, similar input/output devices may be separate from computer system 400 and may interact with one or more nodes of computer system 400 through a wired or wireless connection, such as over network interface 442.
[0046] Those skilled in the art will also appreciate that, while various items are illustrated as being stored in memory or on storage while being used, these items or portions of them may be transferred between memory and other storage devices for purposes of memory management and data integrity. Alternatively, in other embodiments some or all of the software components may execute in memory on another device and communicate with the illustrated computer system via inter-computer communication. Some or all of the system components or data structures may also be stored (e.g., as instructions or structured data) on a computer-accessible medium or a portable article to be read by an appropriate drive, various examples of which are described above. In some embodiments, instructions stored on a computer-accessible medium separate from computer system 400 may be transmitted to computer system 400 via transmission media or signals such as electrical, electromagnetic, or digital signals, conveyed via a communication medium such as a network and/or a wireless link. Various embodiments may further include receiving, sending or storing instructions and/or data implemented in accordance with the foregoing description upon a computer- accessible medium. Accordingly, the present embodiments may be practiced with other computer system configurations.
[0047] Various embodiments may further include receiving, sending or storing instructions and/or data implemented in accordance with the foregoing description upon a computer-accessible medium. Generally speaking, a computer-accessible medium may include storage media or memory media such as magnetic or optical media, e.g., disk or DVD/CD-ROM, volatile or non-volatile media such as RAM (e.g. SDRAM, DDR, RDRAM, SRAM, etc.), ROM, etc., as well as transmission media or signals such as electrical, electromagnetic, or digital signals, conveyed via a communication medium such as network and/or a wireless link.
[0048] Video stream dataflow between computer system 400 and one of the client devices 300, 112, or 210 is described with respect to FIGS. 1-3. The computer system 400 may concurrently receive a plurality of input video streams from a plurality of remote wireless devices via the network interfaces 442 and a network (e.g., a LAN and/or WAN). Each of the plurality of concurrent input video streams may comprise multiplexed video/audio data. Some of the plurality of multiplexed video/audio data streams may include video data compressed at a different data rate, and the plurality of streams may be received with different latency. The plurality of multiplexed video/audio data streams may be stored in an incoming buffer of the buffers 440. The buffers 440 may relay the plurality of multiplexed video/audio data streams to the A/V demultiplexing system 432, which may concurrently demultiplex each stream into two separate sets of data including video information and audio information. Integrity of each of the two sets of data may be checked. The transcoders 426 may concurrently transcode the plurality of video streams to reduce the size of each of the plurality of video streams such that each of the plurality of video streams consumes a transmission bandwidth appropriate for concurrently transmitting the plurality of video streams in one video stream via a wireless network to a mobile wireless device. The reduction in size of each of the plurality of video streams may be appropriate for concurrently viewing the plurality of video streams in a display screen such that the display screen appears as a composite image comprising the plurality of video streams. Transmission of the plurality of transcoded video streams via the network interfaces 442 to the producer client device may be synchronized by the stream synchronizing system 434.
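As a rough sketch of the composite-image layout described above, each transcoded stream can be assigned a downscaled tile in the composite frame sent to the producer device; the tile dimensions, function name, and field names are assumptions:

```python
def layout_composite(stream_ids, tiles_per_row=2, tile_w=640, tile_h=360):
    """Sketch: compute a downscale target and tile position for each of the
    N transcoded streams so they fit one composite image in a single stream."""
    tiles = []
    for i, sid in enumerate(stream_ids):
        tiles.append({
            "id": sid,
            "width": tile_w, "height": tile_h,   # downscaled size per stream
            "x": (i % tiles_per_row) * tile_w,   # tile position in the composite
            "y": (i // tiles_per_row) * tile_h,
        })
    return tiles

tiles = layout_composite(["s1", "s2", "s3", "s4"])
```

With four streams and two tiles per row, this yields a 2x2 mosaic of the kind a producer might view in the display 346.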
[0049] In some embodiments, the stream design GUI generator 456 is operable to generate a display screen GUI or web page having a plurality of windows or display ports for concurrently displaying one or more of the N video streams at the producer client device 300 in the display 346, and playing corresponding audio via the speaker 348. For example, the GUI generator 456 may define a display screen for the video stream design GUI 334. The stream design GUI generator 456 generates the interactive display screen that allows a user to view one or more of the N video streams concurrently, and it defines interactive elements that allow the user to remotely design, edit, or in general, “produce” a single video stream for an audience. The audience version video stream comprises one or more of the N video streams 140 that are stitched into a “final cut” video stream that is processed and transmitted to the audience client devices 210.
[0050] In one embodiment, the generated interactive display screen may be transmitted to the producer client device 300 with the N video streams, transcoded by the transcoders 426 and synchronized by the stream synchronizing system 434. The N video streams may be viewed at the producer client device 300 in the interactive display screen. A user may tap on a window displaying one of the N video streams at a particular time or frame position to indicate which video stream should be inserted into the “final cut” version video stream. As a result, the stitch command generator 336 may generate a command based on the frame time/location of the user’s tap, and a video stream ID corresponding to the tapped video stream. In some embodiments, more than one stream may be selected to cause a split screen effect in the final cut version including the more than one selected video streams. Other interactive GUI elements may allow the user to design the audience “final cut” stream by user selection or user input via a touch screen, voice input, a mechanical button, or keying, for example. Moreover, in some embodiments, the video stream design GUI 334 may slow the playback speed or allow rewind of the one or more displayed video streams, in response to a user selection, to enable the user to make editing or splicing selections over a longer period of time. In addition, a user may select in the GUI 334 which audio from the N received video streams to play back while viewing the one or more concurrent video streams. Other video motion, graphics, or sound effects to be included in the “final cut” video stream may also be selected or input by the user in the video stream design GUI 334. Although the design and editing of the “final cut” video stream is performed on the producer client device 110, the servers 120 and 124 in the computer system 400 perform the actual stitching, switching, or modifying of the video streams 140 to create the “final cut” version video stream.
The computer system 400 also transmits the “final cut” version video stream to the audience client devices 210. In this manner, the “final cut” version video stream may be designed or produced in real time at the producer client device 110, as the N video streams 140 are processed at the server 400 to transmit the “final cut” version video stream to the audience client devices 210.
[0051] One or more of the plurality of input video streams may be processed by the transcoders 426 for streaming to one or more audience client devices 210 as the audience version video stream. Based on a command received from the remote producer wireless device 300, a user-selected one of the input video streams may be stitched into the audience version video stream at a specified time or position by the compute engine 126 and splice handler 122. The audience version video stream data may be encoded by the transcoder 426, multiplexed by the multiplexor 430 and buffered in an output buffer of the buffers 440 for transmission to one or more audience client devices 210.

[0052] FIG. 5 is a flow chart for stitching video streams at a server based on commands received from a wireless mobile device to generate an output video stream.
[0053] In step 510, a plurality of video streams 140 are received by the server computer system 400 from a plurality of contributor wireless mobile devices 112. In one embodiment, each of the contributor wireless mobile devices may capture video and transmit video streams to the server computer 400 via a wireless cellular network. However, the disclosure is not limited in this regard, and any suitable communication networks may be utilized for transmission of the video streams 140 to the server computer 400.
[0054] In step 520, the plurality of video streams 140 are stored in the buffers 440 and transcoded by the transcoders 426. Positions of the plurality of video streams are tracked by the logging manager 128.
[0055] In step 530, the plurality of video streams 140 are transmitted to the producer wireless mobile device 110 at a first delay time. In one embodiment, the plurality of video streams 140 may be transmitted to the producer wireless mobile device 110 via a wireless cellular network. However, the disclosure is not limited in this regard, and any suitable communication networks may be utilized to transmit the video streams 140 to the producer wireless mobile device 110.
[0056] In step 540, a first stream ID selection is received from the producer wireless mobile device 110, and the selected first video stream is transmitted to a plurality of audience wireless mobile devices at a second delay, as a current audience video stream. In one embodiment, the first stream ID selection is transmitted by the producer wireless mobile device 110 via a wireless cellular network to the server computer 400. However, the disclosure is not limited in this regard, and any suitable communication networks may be utilized for transmission of the first stream ID selection to the server computer 400.
[0057] In step 550, a video stitch event command including a second stream ID selection and a splice time is received from the producer wireless mobile device 110, and the selected second video stream is stitched into the transmission of the current audience video stream at the splice time. The selected second video stream may replace the selected first video stream in the current audience video stream, or be added to the current audience video stream.

[0058] FIG. 6 is a flow chart for designing an output video stream based on multiple video streams at a wireless mobile device, and generating video stitching commands to a server for generating the output video stream.
[0059] In step 610, a plurality of video streams 144 are received at a producer wireless mobile device 110 from a server computer system 400, where the plurality of video streams 144 comprise the plurality of video streams 140. In one embodiment, the producer wireless mobile device 110 and the server computer system 400 may communicate via a wireless cellular network. However, the disclosure is not limited in this regard, and any suitable communication networks may be utilized for communication between the producer wireless mobile device 110 and the server computer 400.
[0060] In step 620, the plurality of video streams 144 are displayed at the producer wireless mobile device 110 in a video stream design GUI 334.
[0061] In step 630, a user selection of a first one of the video streams 144 is received via the video stream design GUI 334 for use as a current audience view video stream at the server computer system 400.
[0062] In step 640, a video stream stitch command is generated by the video stitch command generator 336 and is transmitted to the server computer system 400. The video stream stitch command includes a stream ID of the first selected video stream for transmission as the current audience view video stream by the server computer system 400. In one embodiment, the current audience view video stream may be transmitted by the computer system 400 to the audience devices via a wireless cellular network. However, the disclosure is not limited in this regard, and any suitable communication networks may be utilized for communication of the current audience view video stream to the audience devices.
[0063] In step 650, a user selection of a second video stream from the plurality of video streams is received via the video stream design GUI 334, for a user selected splice time or frame position. The second selected video stream is associated with a second stream ID.
[0064] In step 660, a second video stream stitch command is generated by the video stitch command generator 336 and is transmitted to the server computer system 400. The video stream stitch command includes the first video stream ID, the second video stream ID, and a splice time for stitching the second selected video stream into the current audience view video stream. The second selected video stream may replace the first selected video stream in the current audience view video stream at the splice time, or it may be added to the current audience view video stream at the splice time.
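The effect of the commands generated in steps 640 and 660 on the audience view can be sketched as a segment timeline; `apply_stitch_commands` and the `mode` field are illustrative names for this sketch:

```python
def apply_stitch_commands(first_stream_id, commands):
    """Sketch: build the audience-view timeline from the first selection
    (step 640) and subsequent stitch commands (step 660), each with a
    stream ID, splice time, and replace-or-add mode."""
    timeline = [(0.0, [first_stream_id])]  # (start_time, visible stream IDs)
    for cmd in sorted(commands, key=lambda c: c["splice_time"]):
        current = timeline[-1][1]
        if cmd["mode"] == "replace":
            visible = [cmd["stream_id"]]       # second stream replaces the first
        else:  # "add": split screen with the existing stream(s)
            visible = current + [cmd["stream_id"]]
        timeline.append((cmd["splice_time"], visible))
    return timeline

tl = apply_stitch_commands("s1", [{"stream_id": "s2", "splice_time": 30.0, "mode": "replace"}])
```

In the "replace" case the second selected stream takes over at the splice time; "add" models the split-screen alternative described in step 660.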
[0065] The various methods as illustrated in the Figures and described herein represent example embodiments of methods. The methods may be implemented in software, hardware, or a combination thereof. The order of the methods may be changed, and various elements may be added, reordered, combined, omitted, modified, etc.
[0066] Various modifications and changes may be made as would be obvious to a person skilled in the art having the benefit of this disclosure. It is intended that the present embodiments embrace all such modifications and changes and, accordingly, the above description is to be regarded in an illustrative rather than a restrictive sense.
[0067] Thus, the embodiments provide, among other things, systems and methods for generating an output video stream at a server by stitching portions of multiple input video streams together based on commands received from a wireless mobile device. Various features and advantages are set forth in the following claims.

Claims

1. A wireless mobile device for controlling the production of a remote video stream, the wireless mobile device comprising:
processor electronic circuitry;
a communication interface coupled to a wireless network;
video and audio processing electronic circuitry; and
a memory communicatively coupled to the processor electronic circuitry, the memory storing program instructions that when executed by the processor electronic circuitry, cause the processor electronic circuitry to:
receive via the communication interface and the wireless network, a video stream comprising a plurality of constituent video streams from a remote server computer system;
process the video stream by the video and audio processing electronic circuitry of the wireless mobile device to generate the plurality of constituent video streams for display;
generate a graphical user interface and display the plurality of constituent video streams via the graphical user interface, wherein the graphical user interface includes user selection elements for selecting from the plurality of constituent video streams;
receive a user selection of one of the constituent video streams via the user selection elements of the graphical user interface;
generate a video stitch command comprising video stream metadata based on the user selection in the graphical user interface; and
transmit the video stitch command to the remote server computer system.
2. The wireless mobile device of claim 1, wherein the remote server computer system includes a logging manager, wherein the logging manager is configured to log the video stream metadata for each of the plurality of constituent video streams.
3. The wireless mobile device of claim 2, wherein the remote server computer system includes a splice handler configured to receive the video stitch command, wherein the splice handler is configured to determine a splice time in a final cut video stream based on the video stitch command.
4. The wireless mobile device of claim 3, wherein the remote server computer system includes a compute engine, the compute engine configured to transmit the final cut video stream to a plurality of audience client devices.
5. The wireless mobile device of claim 4, wherein the remote server computer system receives, via the splice handler, a second video stitch command corresponding to a newly identified video input stream.
6. The wireless mobile device of claim 5, wherein the compute engine is configured to stitch the newly identified video input stream into the final cut video stream, and wherein the logging manager is configured to log the newly identified video input stream as the final cut video stream.
7. The wireless mobile device of claim 1, wherein at least one of the processor electronic circuitry or the video and audio processing electronic circuitry includes a graphics processing unit configured to implement one or more application programmer interfaces on the graphical user interface.
8. The wireless mobile device of claim 1, wherein the video and audio processing electronic circuitry includes a plurality of transcoders configured to concurrently process the plurality of constituent video streams to reduce the size of each of the plurality of constituent video streams and to synchronize concurrent streaming of the plurality of constituent video streams for display.
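Claims 1-8 describe a producer device that turns a selection in the graphical user interface into a video stitch command carrying video stream metadata. A minimal sketch of how such a command might be assembled on the device, assuming a JSON transport and illustrative field names (`stream_id`, `selected_at`, `metadata`) that the claims do not specify:

```python
import json
import time

def build_stitch_command(stream_id, metadata):
    """Assemble a video stitch command for the remote server.

    `stream_id` names the user-selected constituent stream and
    `metadata` carries its attributes; the JSON shape here is an
    illustrative assumption, not the claimed format.
    """
    return json.dumps({
        "type": "video_stitch",
        "stream_id": stream_id,
        "selected_at": time.time(),  # client-side selection timestamp
        "metadata": metadata,
    })

# Example: the producer taps the stream labeled "cam-2" in the GUI.
payload = build_stitch_command("cam-2", {"codec": "h264", "bitrate_kbps": 1500})
command = json.loads(payload)
```

The device only describes the selection; the heavy lifting (splice-time resolution, stitching) stays on the server side per the later claims.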
9. A method for controlling the production of a remote video stream by a wireless mobile device, the method comprising:
in a wireless mobile device:
receiving, via a communication interface coupled to a wireless network, a video stream comprising a plurality of constituent video streams from a remote server computer system;
processing the video stream by video and audio processing electronic circuitry of the wireless mobile device to generate the plurality of constituent video streams for display by the wireless mobile device;
generating a graphical user interface and displaying the plurality of constituent video streams via the graphical user interface, wherein the graphical user interface includes user selection elements for selecting from the plurality of constituent video streams in the wireless mobile device;
receiving a user selection of one of the constituent video streams via the user selection elements of the graphical user interface;
generating, by electronic processing circuitry, a video stitch command comprising video stream metadata based on the user selection in the graphical user interface; and
transmitting the video stitch command to the remote server computer system.
10. The method of claim 9, further comprising, in the remote server computer system, logging the video stream metadata for each of the plurality of constituent video streams via a logging manager.
11. The method of claim 10, further comprising receiving the video stitch command via a splice handler, determining a splice time in a final cut video stream based on the video stitch command, and transmitting the final cut video stream to a plurality of audience client devices via a compute engine.
12. The method of claim 11, further comprising receiving a second video stitch command from the splice handler corresponding to a newly identified video input stream.
13. The method of claim 12, further comprising stitching the newly identified video input stream into the final cut video stream via the compute engine, and logging the newly identified video input stream as the final cut video stream via the logging manager.
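Claims 9-13 recite a server-side logging manager and splice handler. One way those two roles could be sketched, assuming a dictionary-backed log and a fixed output delay for mapping a selection onto the final cut timeline (both illustrative choices, not drawn from the claims):

```python
class LoggingManager:
    """Records video stream metadata per stream ID (illustrative)."""
    def __init__(self):
        self.entries = {}

    def log_stream(self, stream_id, metadata):
        # Keep every metadata record received for this stream.
        self.entries.setdefault(stream_id, []).append(metadata)


class SpliceHandler:
    """Maps a received video stitch command to a splice time in the
    final cut video stream. Here the splice time is simply the
    selection timestamp shifted by a fixed output delay; the real
    mapping is not specified by the claims."""
    def __init__(self, output_delay_s=2.0):
        self.output_delay_s = output_delay_s

    def determine_splice_time(self, command):
        return command["selected_at"] + self.output_delay_s
```

Under this model, a stream selected at t=10.0 with a 2-second output delay would be spliced into the final cut at t=12.0.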
14. A server computer system for generating an outgoing video stream designed by a remote mobile wireless device, the server computer system comprising:
processor electronic circuitry;
a communication interface communicatively coupled to a wireless network;
video and audio processing electronic circuitry;
a display page generator; and
a memory communicatively coupled to the processor electronic circuitry, the memory storing program instructions that when executed by the processor electronic circuitry, cause the processor electronic circuitry to:
receive a plurality of input video streams, the plurality of input video streams concurrently received via the communication interface, each of the plurality of input video streams corresponding to a stream identifier;
cause the video and audio processing electronic circuitry to concurrently process the plurality of input video streams to reduce the size of each of the plurality of video streams for synchronized concurrent streaming of the processed plurality of video streams to a wireless device;
generate a display page comprising an interactive graphical user interface for concurrently viewing each of the plurality of processed video streams and receiving user selections;
transmit, to the remote mobile wireless device via the communication interface, the display page and a video stream comprising constituent video streams including the processed plurality of video streams;
receive a stitch command from the remote mobile wireless device comprising a stream identifier corresponding to a user selected one of the constituent processed plurality of video streams and a splice time; and
stitch to the outgoing video stream, at the splice time, one of the received plurality of input video streams that corresponds to the stream identifier in the stitch command received from the remote mobile wireless device.
15. The server computer system of claim 14, wherein the plurality of input video streams are transmitted to the remote mobile wireless device at a first delay time, and wherein the user selected one of the constituent processed plurality of video streams, defined as a first selected video stream, is transmitted to a plurality of audience wireless mobile devices at a second delay time as the outgoing video stream.
16. The server computer system of claim 15, wherein the remote mobile wireless device generates a second stitch command including a second video stream identifier based on a second selected video stream from one of the constituent processed plurality of video streams and a second splice time.
17. The server computer system of claim 16, wherein the server computer system stitches the second selected video stream into the outgoing video stream, in place of or in addition to the first selected video stream, based on the second stitch command.
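Claims 14-17 describe the server keeping concurrently received input streams keyed by stream identifier and stitching the user-selected one into the outgoing stream at a splice time. A toy model of that bookkeeping, treating frames as opaque values with timestamps in seconds (both assumptions for illustration):

```python
class StitchServer:
    def __init__(self):
        self.input_streams = {}  # stream identifier -> [(timestamp, frame), ...]
        self.outgoing = []       # frames of the outgoing (final cut) video stream

    def ingest(self, stream_id, timestamp, frame):
        """Buffer a frame from one of the concurrently received input streams."""
        self.input_streams.setdefault(stream_id, []).append((timestamp, frame))

    def handle_stitch_command(self, stream_id, splice_time):
        """Stitch the stream named in the command into the outgoing stream:
        append every buffered frame at or after the splice time, and report
        how many frames were stitched."""
        frames = [f for t, f in self.input_streams.get(stream_id, [])
                  if t >= splice_time]
        self.outgoing.extend(frames)
        return len(frames)

server = StitchServer()
server.ingest("cam-1", 0.0, "frame-a")
server.ingest("cam-1", 1.0, "frame-b")
server.ingest("cam-2", 1.0, "frame-c")
stitched = server.handle_stitch_command("cam-1", 0.5)
```

In a real deployment the buffers would be bounded and frames would flow continuously; the point here is only the identifier-keyed lookup plus splice-time cut the claims describe.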
18. A system for controlling the production of a remote video stream by a wireless mobile device, the system comprising:
a wireless mobile producer client device;
a plurality of wireless mobile contributor client devices; and
a server computer system communicatively coupled to the wireless mobile producer client device and the plurality of wireless mobile contributor client devices via a wireless network; wherein:
each of the plurality of wireless mobile contributor devices transmits, via the wireless network, a video stream to the server computer system;
the server computer system transmits to the wireless mobile producer client device, via the wireless network, a video stream comprising a plurality of constituent video streams that include the video streams transmitted by the plurality of wireless mobile contributor client devices;
the wireless mobile producer client device receives the video stream comprising the plurality of constituent video streams and displays each of the constituent video streams in a graphical user interface;
the wireless mobile producer client device generates a stitching command to the server computer system via the wireless network, the stitching command comprising a video stream ID and a splicing time that is determined based on a user selection of one of the constituent video streams in the graphical user interface; and
the server computer system selects one of the video streams received from the plurality of wireless mobile contributor client devices based on the video stream ID received from the producer wireless mobile device in the stitching command, and stitches the selected video stream from the plurality of wireless mobile contributor devices into an output video stream based on the splicing time received from the producer wireless mobile device in the stitching command.
19. The system of claim 18, wherein the wireless mobile producer client device generates a second stitching command including the first video stream ID, a second video stream ID based on a second selected video stream, and a second splice time for stitching the second selected video stream into the output video stream.
20. The system of claim 19, wherein the server computer system stitches the second selected video stream into the output video stream, in place of or in addition to the first selected video stream, based on the second stitching command.
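Claims 19-20 let a second stitching command place the second selected stream into the output either in place of or in addition to the first. A sketch of that timeline edit, modeling the output as a list of (splice time, active stream IDs) entries; this schedule model and the `replace` flag are illustrative assumptions, not the claimed mechanism:

```python
def apply_second_stitch(schedule, stream_id, splice_time, replace=True):
    """Apply a second stitching command to an output-stream schedule.

    `schedule` is a time-ordered list of (splice_time, [stream IDs]).
    With replace=True the second stream takes over at its splice time
    (in place of the first); otherwise it joins the currently active
    streams (in addition to the first, e.g. a split-screen view).
    """
    if replace:
        # Cut over: discard entries at or after the new splice point.
        schedule = [(t, ids) for t, ids in schedule if t < splice_time]
        schedule.append((splice_time, [stream_id]))
    else:
        active = schedule[-1][1] if schedule else []
        schedule.append((splice_time, active + [stream_id]))
    return schedule

# A first stitch put cam-1 on air at t=0; a second command cuts to cam-2 at t=10.
timeline = apply_second_stitch([(0.0, ["cam-1"])], "cam-2", 10.0, replace=True)
```

With `replace=False` the same call would instead leave cam-1 active and add cam-2 alongside it from t=10 onward.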
PCT/US2018/065776 2017-12-14 2018-12-14 Method and system for cloud video stitching WO2019118890A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762598502P 2017-12-14 2017-12-14
US62/598,502 2017-12-14

Publications (1)

Publication Number Publication Date
WO2019118890A1 true WO2019118890A1 (en) 2019-06-20

Family

ID=66820719

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/065776 WO2019118890A1 (en) 2017-12-14 2018-12-14 Method and system for cloud video stitching

Country Status (1)

Country Link
WO (1) WO2019118890A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100304731A1 (en) * 2009-05-26 2010-12-02 Bratton R Alex Apparatus and method for video display and control for portable device
US20160191961A1 (en) * 2014-12-31 2016-06-30 Imagine Communications Corp. Fragmented video transcoding systems and methods
US20160247537A1 (en) * 2015-02-24 2016-08-25 Plaay, Llc System and method for creating a sports video
US20170287200A1 (en) * 2016-04-05 2017-10-05 Qualcomm Incorporated Dual fisheye image stitching for spherical image content

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113038287A (en) * 2019-12-09 2021-06-25 上海幻电信息科技有限公司 Method and device for realizing multi-user video live broadcast service and computer equipment
CN113038287B (en) * 2019-12-09 2022-04-01 上海幻电信息科技有限公司 Method and device for realizing multi-user video live broadcast service and computer equipment
US11889132B2 (en) 2019-12-09 2024-01-30 Shanghai Hode Information Technology Co., Ltd. Method and apparatus for implementing multi-person video live-streaming service, and computer device
WO2024082561A1 (en) * 2022-10-20 2024-04-25 腾讯科技(深圳)有限公司 Video processing method and apparatus, computer, readable storage medium, and program product

Similar Documents

Publication Publication Date Title
CN111386708B (en) System and method for broadcasting live media streams
CN106792092B (en) Live video stream split-mirror display control method and corresponding device thereof
US10728586B2 (en) System and method for controlling media content capture for live video broadcast production
EP3562163B1 (en) Audio-video synthesis method and system
US10930318B2 (en) Gapless video looping
JP6610555B2 (en) Reception device, transmission device, and data processing method
CN110351493B (en) Remote cloud-based video production system in an environment with network delay
US11516518B2 (en) Live streaming with live video production and commentary
AU2008101244A4 (en) Device and method for synchronisation of digital video and audio streams to media presentation devices
US10193944B2 (en) Systems and methods for multi-device media broadcasting or recording with active control
CN103190092A (en) A system and method for synchronized playback of streaming digital content
JP2005051703A (en) Live streaming broadcasting method, live streaming broadcasting apparatus, live streaming broadcasting system, program, recording medium, broadcasting method, and broadcasting apparatus
KR20080082759A (en) System and method for realizing vertual studio via network
EP3748983B1 (en) Video playback method, terminal apparatus, and storage medium
EP3908006A2 (en) Systems and methods for real time control of a remote video production with multiple streams
CN113490007A (en) Live broadcast processing system, method, storage medium and electronic device
WO2019118890A1 (en) Method and system for cloud video stitching
CN114845136A (en) Video synthesis method, device, equipment and storage medium
JP2010157906A (en) Video display device
US11463747B2 (en) Systems and methods for real time control of a remote video production with multiple streams
CN114339405A (en) AR video data stream remote manufacturing method and device, equipment and storage medium
JP2014220572A (en) Content distribution system, distribution device, reception terminal, distribution program, and reception program
KR102403263B1 (en) Method, system, and computer readable record medium to implement fast switching mode between channels in multiple live transmission environment
KR102376348B1 (en) Method, system, and computer readable record medium to implement seamless switching mode between channels in multiple live transmission environment
KR101067952B1 (en) Managing System for less traffic in video communication and Method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18889470

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18889470

Country of ref document: EP

Kind code of ref document: A1