US20170251231A1 - System and Method for Media Synchronization and Collaboration - Google Patents

System and Method for Media Synchronization and Collaboration

Info

Publication number
US20170251231A1
Authority
US
United States
Prior art keywords: media, event, views, metadata, data
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/596,916
Inventor
Larry W. Fullerton
Mark D. Roberts
Dennis M. Weldy
Eric Fullerton
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gitcirrus LLC
Original Assignee
Gitcirrus LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Priority claimed from US14/988,568 external-priority patent/US20160197837A1/en
Application filed by Gitcirrus LLC filed Critical Gitcirrus LLC
Priority to US15/596,916 priority Critical patent/US20170251231A1/en
Priority to US15/688,519 priority patent/US10694219B2/en
Publication of US20170251231A1 publication Critical patent/US20170251231A1/en
Assigned to GITCIRRUS LLC reassignment GITCIRRUS LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ROBERTS, MARK D., WELDY, DENNIS M., FULLERTON, ERIC
Assigned to GITCIRRUS LLC reassignment GITCIRRUS LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FULLERTON, LARRY W., L&M IP, ROBERTS, MARK D.

Classifications

    • H04N 21/21805: Source of audio or video content, e.g. local disk arrays, enabling multiple viewpoints, e.g. using a plurality of cameras
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment
    • G06F 3/04817: Interaction techniques based on graphical user interfaces [GUI] using icons
    • G11B 27/028: Electronic editing of analogue information signals, e.g. audio or video signals, with computer assistance
    • G11B 27/102: Programmed access in sequence to addressed parts of tracks of operating record carriers
    • G11B 27/34: Indicating arrangements
    • H04L 65/762: Media network packet handling at the source
    • H04L 65/80: Responding to QoS
    • H04N 21/242: Synchronization processes, e.g. processing of PCR [Program Clock References]
    • H04N 21/25841: Management of client data involving the geographical location of the client
    • H04N 21/2743: Video hosting of uploaded data from client
    • H04N 21/4223: Cameras
    • H04N 21/4756: End-user interface for rating content, e.g. scoring a recommended movie
    • H04N 21/8547: Content authoring involving timestamps for synchronizing content
    • G06F 2203/04803: Split screen, i.e. subdividing the display area or the window area into separate subareas
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • H04L 47/283: Flow control or congestion control in relation to timing considerations, in response to processing delays, e.g. caused by jitter or round trip time [RTT]
    • H04L 67/06: Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • H04L 67/561: Provisioning of proxy services, adding application-functional data or data for application control, e.g. adding metadata

Definitions

  • the present invention relates generally to a system and method for media synchronization and collaboration. More particularly, the present invention relates to a system and method for media synchronization and collaboration, where metadata is used to synchronize media allowing multiple views of an event recorded independently by multiple media recording devices to be synchronized and combined into a collaborative media file.
  • one aspect of the present invention involves an improved system for media synchronization and collaboration that includes a data storage, a plurality of media recording devices used by a plurality of users to independently record an event from multiple locations thereby producing a plurality of recorded media data corresponding to a plurality of views of the event, each of the plurality of media recording devices conveying to the data storage media data and metadata corresponding to their respective view of the event, the metadata including time samples in accordance with a common time reference, and a media player comprising a processor and a graphical user interface, the media player using the metadata to synchronize and play the plurality of views of the event.
  • the plurality of media recording devices can be a plurality of smart phones.
  • the graphical user interface may include a plurality of viewing windows used to display the plurality of views of the event.
  • Each of the plurality of viewing windows can have a corresponding selection icon of a plurality of selection icons.
  • the plurality of selection icons can be color-coded.
  • the plurality of selection icons can be used to select views of the plurality of views of the event to be played during periods of time as part of an overall timeline of a composite video that consists of a sequence of selected views.
  • the graphical user interface can play the composite video.
  • the timeline can be color-coded to identify the view of the plurality of views that is playing during a given period of time of the timeline.
  • the graphical user interface can include a multi-view combination control that causes a sequence of different views of the same time period or overlapping time periods to be played in the composite video.
  • the metadata can include location and view angle data and the graphical user interface can provide the ability to display a location/view angle indicator window that indicates the location and view angle of each view of the plurality of views.
  • the location/view angle indicator window can have a selection icon that can be selected to cause it to be displayed in the composite video during a given period of time of the timeline.
  • the metadata can enable at least one of a still camera image, animation, digital score board, digital scorebook, or live video to be synchronized so that it can be displayed in the composite video during a given period of time of the timeline.
  • the media player can be configured to create a multiple view video data package comprising a plurality of media files and metadata files corresponding to multiple views of a recorded event.
  • the multiple view video data package may include a composite video produced using the plurality of media files and metadata files.
  • the media player can be configured to post the multiple view video data package to an internet media-sharing website thereby enabling users of the internet media-sharing website to at least one of comment on the composite video, rate the composite video, or download the plurality of media files and the plurality of metadata files.
  • Another aspect of the present invention involves a method of media synchronization and collaboration, comprising recording an event using a plurality of media recording devices that independently record the event to produce a plurality of recorded media data corresponding to a plurality of views of the event, storing media data and metadata corresponding to the plurality of views of the event to a data storage, the metadata including time samples in accordance with a common time reference, and providing a media player comprising a processor and a graphical user interface that uses the metadata to synchronize and play the plurality of views of the event.
  • the method may also include producing a composite video by selecting, using the graphical user interface, a sequence of selected views of the plurality of views of the event that play during periods of time as part of an overall timeline of the composite video.
  • The method may also include creating a multiple view video data package comprising a plurality of media files and a plurality of metadata files used to produce the composite video.
  • the multiple view video data package may include the composite video.
  • the method may also include posting the multiple view video data package to an internet media-sharing website thereby enabling users of the internet media-sharing website to at least one of comment on the composite video, rate the composite video, or download the plurality of media files and the plurality of metadata files.
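  • By way of illustration only, a minimal sketch of the common-time-reference alignment summarized above follows; the Clip structure and function names are hypothetical, not taken from the patent:

```python
# Hypothetical sketch: align independently recorded views using start-time
# samples taken against a common time reference (e.g., GPS or NIST time).
from dataclasses import dataclass

@dataclass
class Clip:
    view_id: str
    start: float     # start time on the common reference, in seconds
    duration: float  # clip length, in seconds

def playback_offsets(clips):
    """Offset of each clip from the earliest start, so a media player can
    begin each view at its offset and render the same instant of the event
    across all views simultaneously."""
    t0 = min(c.start for c in clips)
    return {c.view_id: c.start - t0 for c in clips}

clips = [Clip("phone_A", 1000.0, 120.0),
         Clip("phone_B", 1003.5, 90.0),
         Clip("phone_C", 998.2, 150.0)]
print(playback_offsets(clips))  # phone_C starts first; the others lag it
```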
  • FIG. 1 depicts a user of a phone recording an event that is conveyed to the cloud.
  • FIG. 2A depicts a first exemplary method in accordance with the invention.
  • FIG. 2B depicts a second exemplary method in accordance with the invention.
  • FIG. 2C depicts a third exemplary method in accordance with the invention.
  • FIG. 2D depicts a fourth exemplary method in accordance with the invention.
  • FIG. 3A depicts a fifth exemplary method in accordance with the invention.
  • FIG. 3B depicts a sixth exemplary method in accordance with the invention.
  • FIG. 3C depicts a seventh exemplary method in accordance with the invention.
  • FIG. 3D depicts an eighth exemplary method in accordance with the invention.
  • FIG. 4A depicts a ninth exemplary method in accordance with the invention.
  • FIG. 4B depicts a tenth exemplary method in accordance with the invention.
  • FIG. 4C depicts an eleventh exemplary method in accordance with the invention.
  • FIG. 5A depicts a twelfth exemplary method in accordance with the invention.
  • FIG. 5B depicts a thirteenth exemplary method in accordance with the invention.
  • FIG. 5C depicts a fourteenth exemplary method in accordance with the invention.
  • FIG. 6 depicts a fifteenth exemplary method in accordance with the invention.
  • FIG. 7 depicts an exemplary computing architecture in accordance with the invention.
  • FIG. 8 depicts an exemplary graphical user interface of a media player for playing a single view of an event using media and metadata corresponding to the event in accordance with the invention.
  • FIG. 9 depicts an exemplary graphical user interface of a multiple view media player for producing a composite video from media and metadata corresponding to multiple views of an event in accordance with the invention.
  • FIG. 10 depicts an exemplary graphical user interface of a multiple view media player for producing a composite video from media and metadata corresponding to multiple views of a baseball game in accordance with the invention.
  • a media recording device interfaced with a cloud computing environment (i.e., the “cloud”) conveys a recorded media to the cloud in near real-time while the media is being recorded by the media recording device, where the conveying of the media to the cloud may be based upon the occurrence of a user-defined event and/or a user command, where the media may or may not be encrypted and storage of a local copy of the media is optional.
  • the media recording device may also upload to the cloud a previously recorded media stored on the media recording device.
  • the uploading of the previously recorded media may be in accordance with a schedule and/or upon the occurrence of a user-defined event, where the media may be encrypted when stored on the media recording device and subsequently uploaded to the cloud or stored on the media recording device as unencrypted data that is then encrypted as it is uploaded to the cloud or after it has been uploaded to the cloud (i.e., by a cloud service provider as opposed to encryption by the media recording device). Rules may be established and enforced for forbidding access, modification and/or erasure from the recording device and/or the cloud.
  • a first recording device forwards a media to a second recording device that conveys the media to the cloud, where the first recording device could be a service interfacing with multiple second recording devices and which could be configured to have substantial network bandwidth significantly greater than that required and typically used by the individual second recording devices.
  • data latency rules can be established to control whether various optional functions pertaining to processing of a recorded media (e.g., encryption, local storage, adding of metadata, etc.) are performed, or to control an amount or rate corresponding to a processing function being performed (e.g., data compression level, data sampling rate, data buffering amount, resolution, etc.) for a given recorded media.
  • Data latency rules may be based on one or more established data latency thresholds corresponding to one or more user-defined conditions. For example, a data compression rate and/or data sampling rate may be controlled in real-time as required to keep data latency below an established data latency threshold.
  • a data latency threshold may be conditional in that it might be modified or overridden, for example, based on an event such as a sensed event. For example, given a sensed fire condition, all unnecessary processing may be avoided regardless of an established data latency threshold.
  • a user may establish a set of parameters relating to which media data processing functions are to be used or not used and the extent to which they may be used (e.g., sampling rate, amount of data compression, data resolution) so as to control data latency while meeting certain user requirements (e.g., user always wants encryption), where media data processing functions are either turned on or off, or an extent (e.g., rate, resolution) is changed, based on criteria or rules established by a user.
  • Such criteria may determine a sampling rate, an amount of data compression, or the extent to which metadata is added.
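  • A minimal sketch of such a latency rule follows; the threshold, compression levels, and function are hypothetical:

```python
# Hypothetical sketch of a data-latency rule: raise the compression level
# (trading quality for throughput) when measured latency exceeds a
# user-defined threshold, and relax it once latency recovers.
LATENCY_THRESHOLD_S = 2.0          # assumed user-defined threshold
MIN_LEVEL, MAX_LEVEL = 0, 3        # 0 = no compression ... 3 = maximum

def adjust_compression(measured_latency_s, level):
    if measured_latency_s > LATENCY_THRESHOLD_S:
        return min(level + 1, MAX_LEVEL)   # compress harder to cut latency
    if measured_latency_s < 0.5 * LATENCY_THRESHOLD_S:
        return max(level - 1, MIN_LEVEL)   # headroom: restore quality
    return level

level = 0
for latency in (0.4, 2.6, 3.1, 0.7):       # simulated measurements
    level = adjust_compression(latency, level)
    print(f"latency={latency}s -> compression level {level}")
```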
  • a user may establish a parameter where video is to be captured at a 1080i (i.e., 1920×1080) resolution but, when a certain criterion is met, the resolution is to change to a different resolution, which may be, for example, a lower resolution (e.g., 720p) or a higher resolution (e.g., 8640p).
  • a command may be provided by the user of a media recording device to change a mode of operation.
  • an “upload button” might be pressed or an upload voice command may be spoken to cause an upload to the cloud function to be started immediately.
  • a “fast upload button” might be pressed or a fast upload function otherwise initiated to cause an upload function to be initiated under conditions that provide for minimal data latency.
  • a user may establish one or more events that correspond to ‘fast load triggers’, whereby the occurrence of such an event causes the fast upload function to be initiated.
  • a data latency indicator may be provided which might be a number and/or a color or some other indicator.
  • an event may result in other applications running on a device (e.g., cell phone) being turned off in order to speed up performance of the device or otherwise reduce data latency.
  • access controls can be employed to prevent unauthorized access to or deletion of a recorded media stored on the media recording device and/or on the cloud.
  • One or more deletion events can also be defined by a user, where a local copy of recorded media will be automatically deleted from the media recording device based on the occurrence of a deletion event.
  • a media recording device can include a video recording device and/or an audio recording device, for example, a camera and a microphone of a mobile phone or a Bluetooth or Wi-Fi connected device, and the media can be, for example, video (still picture or movie) and/or audio data recorded by the video and audio recording devices of the phone, where the recorded media is in the form of digital data.
  • media recording devices include a media recording device located in a home or business, a media recording device (e.g., dash cam) located in a vehicle (e.g., car, truck, emergency vehicle, plane, helicopter, drone, etc.), a media recording device (e.g., body cam) associated with a person or animal, or a media recording device associated with a fixed object (e.g., bridge, tree, light post, gas pump, etc.).
  • Recorded media may be in the form of text, for example, where an audio recording device converts voice to text.
  • Video data may correspond to a picture or video taken from the front (display side) of a cell phone or from the back of a cell phone, or both, which may be taken simultaneously. As such, an event being filmed may be captured at the same time the user of a media recording device is captured (e.g., a selfie).
  • Encryption of media data may, for example, involve use of a symmetric key encryption scheme or a public key encryption scheme. Encryption may be performed by the media recording device such that a local copy can be stored in encrypted form and/or media is conveyed to the cloud in encrypted form. Alternatively, media may be conveyed in unencrypted form and then encrypted by a cloud service provider as it is being received and stored on the cloud or sometime after it has been received and stored on the cloud.
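  • A minimal sketch of the symmetric-key case, assuming Python's cryptography package; the key provisioning and upload call are placeholders rather than the patent's mechanism:

```python
# Sketch: encrypt a media chunk with a symmetric key before conveying it to
# the cloud. Requires `pip install cryptography`; upload() is a placeholder.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, provisioned and stored securely
cipher = Fernet(key)

def encrypt_and_upload(chunk: bytes, upload) -> None:
    token = cipher.encrypt(chunk)   # authenticated symmetric encryption
    upload(token)                   # only ciphertext leaves the device

encrypt_and_upload(b"raw media bytes", upload=lambda data: print(len(data)))
```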
  • Various data access and user authentication methods can be employed to control access and/or deletion to data stored on the media recording device or on the cloud. Such methods may include a password, a signature, or a biometric such as an eye scan, a fingerprint scan, a facial image (i.e., individual's photo, a selfie), a recognized voice, or the like. At least one physical key (e.g., a dongle) may be required to access data, where multiple keys (or dongles) distributed to multiple persons may be required to access or delete data.
  • a third party authentication service provider might be used such as VeriSign.
  • access and/or data deletion may be limited to a certain time period, require a certain aging of data (i.e., an elapsed period of time), require an event to have occurred (such as described below), require the media recording device to be in a certain location, etc.
  • the concepts described below relating to user-defined events being used to determine the starting and stopping of recording of a media recording device and corresponding conveying of recorded media to the cloud or the uploading of previously recorded media data to the cloud can also be applied for controlling access to and deletion of media stored on the cloud or on the media recording device.
  • a media can be identified that cannot be deleted from the media recording device and/or from the cloud under any circumstance or without participation by a third party given control over such access/deletion such as an attorney, an editor of a publication, or some third party service.
  • a user 100 of a media recording device 102 runs an application (or ‘app’) or otherwise selects a mode of operation of the media recording device 102 that causes a media 104 to be conveyed to the cloud 106 while the media 104 is being recorded by the media recording device 102 upon the occurrence of a user command.
  • the media may be encrypted by the media recording device prior to the media being conveyed to the cloud. Alternatively, the media may be conveyed to the cloud in unencrypted form, where it then may or may not be encrypted.
  • storage of a local copy of the media on the media recording device is optional, where the media may be conveyed directly to the cloud and may never actually be stored locally on the media recording device.
  • a user of a mobile phone such as an HTC® phone may record an event with their phone and the media data would be conveyed directly to the cloud as it was being recorded with very low latency between the time a given data packet is recorded until it is stored on the cloud, where the data corresponding to the event would not be stored or otherwise be present on the phone unless it is necessary to temporarily buffer data for some required reason, for example, due to a poor or non-existent data connection between a first and second media recording device or between a media recording device and the cloud.
  • the user may choose to store a local copy of the media data on the phone while also conveying the media data directly to the cloud as the media is being recorded, where the local copy of the media data may be encrypted or unencrypted and where the data conveyed to the cloud may be encrypted prior to being conveyed to the cloud or the data may be encrypted after it has been conveyed to the cloud.
  • unencrypted data might be conveyed to a cloud service provider that encrypts the data it receives from the media recording device prior to storing it, where the time required to encrypt the media would not add to the data latency of the media recording device, but the data would be vulnerable while being conveyed to the cloud because it is in an unencrypted form.
  • a user can establish rules used to control whether such processing is performed or not depending on the conditions of a given situation. For example, a user might set up a rule whereby video footage of a business security system would be automatically encrypted and conveyed to the cloud upon the occurrence of a user-defined security alarm event but encryption should not be performed should a fire condition (e.g., fire alarm, a sensor detecting smoke or heat, etc.) also be detected.
  • all sorts of rules can be employed to control the processing performed prior to conveyance of media data to the cloud based on one or more conditions so as to control data latency.
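  • A minimal sketch of such a rule, using illustrative condition names:

```python
# Sketch of the rule in the example above: encrypt security footage on a
# security-alarm event, unless a fire condition is also detected (skip
# encryption to minimize latency). Condition names are illustrative.
def should_encrypt(conditions: set) -> bool:
    if "fire" in conditions:            # fire overrides the encryption rule
        return False
    return "security_alarm" in conditions

assert should_encrypt({"security_alarm"})
assert not should_encrypt({"security_alarm", "fire"})
```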
  • FIGS. 2A-2D depict exemplary methods corresponding to a user-activated near real-time mode, where the amount of data latency between the time a given data packet is recorded until it is stored on the cloud depends on whether or not media is encrypted and whether or not a local copy of the media (or encrypted media) is stored on the media recording device.
  • Regarding FIG. 2A, one skilled in the art will recognize that once media has been received by the cloud it may be encrypted by a cloud service provider prior to storage or at any time thereafter.
  • a media recording device is configured to automatically begin and/or stop recording media to be encrypted and conveyed to the cloud while the media is being recorded by a media recording device upon the occurrence of a defined event or events.
  • storage of a local copy of the media on the media recording device is optional, where the local copy of the media data may be encrypted or unencrypted while stored on the media recording device.
  • An event can be generally described as an occurrence that meets an established criterion, condition or rule that can be recognized by a control system (e.g., an application running on a cell phone).
  • an event may be based on a position of an object/person/animal/vehicle, for example, a cell phone's specific location (i.e., latitude, longitude) as might be determined by a location system such as a global positioning system (GPS).
  • An event might correspond to a status of a media recording device, for example, a battery status or a signal strength status.
  • an event might be based on a position of an object relative to a location of another object, where both objects might be fixed or mobile.
  • Such location based events are commonly known as geolocation events where, generally, an event can be defined based on the location of one or more objects relative to one or more defined areas (e.g., a perimeter of a property or a building, or a room within a building) corresponding to one or more locations.
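  • A minimal sketch of a circular geofence test of this kind; the coordinates and radius are illustrative:

```python
# Sketch of a geolocation event: fire when a device's GPS fix falls inside a
# circular region around a defined location. Coordinates are illustrative.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude fixes."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def geofence_event(fix, center, radius_m):
    return haversine_m(fix[0], fix[1], center[0], center[1]) <= radius_m

print(geofence_event((34.7304, -86.5861), (34.7300, -86.5860), radius_m=100))
```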
  • An event may be based on movement, lack of movement, or a change in movement (e.g., speed or direction) of an object, which might be detected using a compass.
  • an ‘impact threshold’ may be established corresponding to an abrupt movement change indicating an impact associated with a media recording device (e.g., hitting the ground, being in a vehicle crash, etc.).
  • An event may be based on a position, for example, a position of the phone within a coordinate system.
  • An event may relate to a movement of the phone or the non-movement of the phone, which might be detected using an accelerometer.
  • An event may relate to a detected movement, which might be detected by a motion or proximity detector/radar.
  • An event may be based on an orientation of an object, which might be measured using a 6-DOF measurement device. In one arrangement, the orientation of a phone may be determined using a magnetometer contained in a media recording device.
  • An event may be based upon an emergency or alarm situation, which might involve a severe weather advisory or warning relating to a thunderstorm, tornado, hurricane, snowstorm, high wind, etc. or any other sort of emergency situation such as a vehicle accident or crash, a break-in, a fire, a flood, a landslide, a prisoner escape, a riot, a hazardous materials spill, a runaway train, an airplane experiencing an emergency situation, etc.
  • a media recording device may be set to automatically begin recording if a nearby nuclear reactor alarm were to sound or if a person presses a medical alert button.
  • An event might involve a government controlled security level, for example, a Transportation Security Administration (TSA) security level, Home Security level, or DEFCON level.
  • An event might be a sensed environmental condition such as a sensed temperature, humidity, light, smoke, carbon dioxide, seismic event, sound intensity and/or frequency, pressure, altitude, water depth, or the like, which might be measured using one or more sensors.
  • a sensor might detect an earthquake, an explosion, thunder, a gunshot, or a scream.
  • an event might be a sensed physical condition of a person or animal such as a heart rate, breathing rate, skin resistivity, blood pressure, body temperature, blood sugar level, etc.
  • the media recording device can also perform various other processing beyond uploading media to the cloud relating to the sensed information.
  • seismic information sensed by the media recording device might be used to identify the location, timing, and magnitude of a seismic event, which might even be used to determine an amount of time before a catastrophic event will occur at the location of the media recording device for providing warning, instructions, or other relevant information.
  • An event may involve the recognition of a command such as a voice command, a hand gesture, or an RF signal command.
  • FIGS. 3A-3D depict exemplary methods corresponding to an event-activated near real-time mode, where the amount of data latency between the time a given data packet is recorded until it is stored on the cloud depends on whether or not media is encrypted and whether or not a local copy of the media (or encrypted media) is stored on the media recording device.
  • Regarding FIG. 3A, one skilled in the art will recognize that once media has been received by the cloud it may be encrypted by a cloud service provider prior to storage or at any time thereafter.
  • a local copy of a recorded media is stored on the media recording device and all or part of the stored local copy of the media is encrypted and conveyed to the cloud in accordance with a defined schedule.
  • the scheduled uploading of the media to the cloud may be referred to as a scheduled upload mode, where the local copy of the media data may be encrypted or unencrypted while stored on the media recording device.
  • the media data can be automatically deleted from the media recording device once it has been conveyed to the cloud.
  • a user of a phone may choose to have media data moved from a phone to the cloud weekly, daily, hourly, or at specific scheduled times and/or in response to an event, for example, a voice command, a location, an emergency condition, etc.
  • the media may be stored and uploaded using a rolling period of time, for example, at the end of each day, the stored media data from the same day one week prior may be automatically deleted such that, at any given time, there is stored media data for the most recent seven days, where the rolling periodic upload mode can be overridden, the rolling period of time can be changed (increased or decreased), and specific subsets of stored media data can be identified as not to be deleted.
  • a rolling period of time mode might be configured to only delete media data from 12 pm to 6 am or to never delete data recorded on a Saturday.
  • all sorts of options for controlling one or more periods of time where stored media data would be uploaded to the cloud and automatically deleted from local storage, or not, are possible.
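  • A minimal sketch of such a rolling retention pass, under assumed file-layout and pinning conventions:

```python
# Sketch of the rolling seven-day mode described above: after the daily
# upload, delete local media older than the window unless it is pinned.
# The directory layout and pin set are assumptions.
import os
import time

ROLLING_WINDOW_S = 7 * 24 * 3600

def prune_local_media(media_dir, pinned=frozenset()):
    now = time.time()
    for name in os.listdir(media_dir):
        if name in pinned:
            continue                      # user marked as never-delete
        path = os.path.join(media_dir, name)
        if now - os.path.getmtime(path) > ROLLING_WINDOW_S:
            os.remove(path)               # aged out of the rolling window
```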
  • FIGS. 4A-4C depict exemplary methods corresponding to a scheduled upload mode, where the amount of data latency between the time uploading of a stored media begins until it is stored on the cloud depends on whether or not the media is encrypted by the media recording device.
  • Regarding FIG. 4A, one skilled in the art will recognize that once media has been received by the cloud it may be encrypted by a cloud service provider prior to storage or at any time thereafter.
  • a local copy of a recorded media is stored on a media recording device and then later automatically conveyed to a cloud computing environment upon the occurrence of an event, such as described above, where the media may be encrypted prior to being conveyed to the cloud and where the local copy of the media data may be encrypted or unencrypted while stored on the media recording device.
  • the media data can be automatically deleted from the media recording device once it has been conveyed to the cloud.
  • FIGS. 5A-5C depict exemplary methods corresponding to an event-activated update mode, where the amount of data latency between the time uploading of a stored media begins until it is stored on the cloud depends on whether or not the media is encrypted by the media recording device.
  • Regarding FIG. 5A, one skilled in the art will recognize that once media has been received by the cloud it may be encrypted by a cloud service provider prior to storage or at any time thereafter.
  • FIG. 6 depicts an exemplary method corresponding to an event-activated delete mode.
  • Metadata can be conveyed to the cloud along with the media data such as the media author, media title, date and time of the media recording, location and/or orientation of the media recording device, velocity, acceleration, temperature, barometric pressure, biometric data, light levels, etc. Metadata might include the person or persons in a video, or a short description or keyword(s) such as wedding, pet's name, flowers, waterfall, food, or the like. Generally, one skilled in the art will understand that such metadata can be used to enable processing of the media data from the cloud to include a user retrieving a subset or subsets of such media data based upon a query of the metadata stored along with the media data.
  • Whether or not metadata is added to media prior to it being conveyed to the cloud can also be controlled in accordance with a data latency limit in a manner similar to how encryption can be controlled. Similarly, whether or not metadata is added to media prior to it being conveyed to the cloud can also be controlled in accordance with an established rule and one or more conditions of a situation.
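  • A minimal sketch of metadata-based retrieval of this kind, with an in-memory list standing in for cloud storage:

```python
# Sketch: media records tagged with metadata at upload time, and a simple
# query over that metadata standing in for a cloud-side index.
records = [
    {"file": "clip1.mp4", "keywords": {"wedding", "flowers"}, "author": "A"},
    {"file": "clip2.mp4", "keywords": {"waterfall"}, "author": "B"},
]

def query(records, keyword=None, author=None):
    hits = []
    for r in records:
        if keyword is not None and keyword not in r["keywords"]:
            continue
        if author is not None and r["author"] != author:
            continue
        hits.append(r["file"])
    return hits

print(query(records, keyword="wedding"))   # -> ['clip1.mp4']
```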
  • Metadata (e.g., timing and location) corresponding to one or more recorded media of one or more media recording devices can be processed collaboratively, for example, to locate the source of a recorded sound: one recorded sound might provide a range of the source relative to a recording device, two recordings of the sound may determine a plane relative to the locations of the recording devices, and three recordings of the sound may identify a coordinate of the source relative to the three recording devices.
  • a source of a sound could be, for example, a gun or a tornado.
  • All sorts of data processing involving multiple recorded media data by one or more media recording devices are possible.
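  • A minimal sketch of the three-recording case, assuming each recording yields a range to the source; the positions and ranges are illustrative:

```python
# Sketch: estimate a 2-D source coordinate from ranges measured at three
# recording devices by linearizing the range equations and solving with
# least squares. Positions and ranges are illustrative numbers.
import numpy as np

def locate_source(positions, ranges):
    """positions: (3, 2) device coordinates; ranges: (3,) distances to source."""
    p = np.asarray(positions, dtype=float)
    r = np.asarray(ranges, dtype=float)
    # Subtracting the first range equation cancels the x^2 + y^2 terms.
    A = 2.0 * (p[1:] - p[0])
    b = (p[1:] ** 2).sum(axis=1) - (p[0] ** 2).sum() - (r[1:] ** 2 - r[0] ** 2)
    xy, *_ = np.linalg.lstsq(A, b, rcond=None)
    return xy

print(locate_source([(0, 0), (100, 0), (0, 100)], ranges=(70.7, 70.7, 70.7)))
# -> approximately [50. 50.]
```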
  • a user interface could be used to define the events, rules, and conditions required to support event-activated approaches described herein.
  • a user interface can be used to define limits such as latency limits, to manage encryption, and to enter metadata.
  • a user interface can be used to produce queries used to retrieve media data from the cloud.
  • the present invention can be practiced using publicly available computing devices, communications networks, and related software or can be practiced using proprietary computing devices, communications networks, and/or software.
  • Rules and thresholds and the like can be established for one or more media recording devices using one or more computing devices (e.g., a desktop computer) other than a recording device.
  • an interface can be provided to access media data stored on the cloud via computing devices other than a media recording device.
  • a product is provided that includes a software application resident on a media recording device and a software application resident on a computing device other than a recording device.
  • an application running on a cell phone may store media data to the cloud that is later accessed via a desktop computer via an internet connection.
  • an application (e.g., a dashboard) running on a desktop computer may be used to configure parameters (e.g., rules, thresholds, etc.) relating to a user account that are then loaded by a cell phone application and used to manage the conveyance of recorded media data to the cloud by the cell phone.
  • one or more other applications used to manage events such as a calendar management application (e.g., Microsoft Outlook®) can be used to establish and manage events that are used to manage the conveyance of media data to the cloud.
  • a meeting request received via an email may establish a location and a time used in a rule used to manage the conveyance of media data to the cloud.
  • an alert condition established in a weather alert application might be inherited by another application managing the conveyance of media data to the cloud.
  • the application managing the conveyance of media data to the cloud may interface with one or more publicly available data sources (e.g., National Weather Service, USGS Earthquake Early Warning system, an RSS news blog) and/or private data sources (e.g., a Social Directory API), where data provided by the one or more publicly available data sources and/or private data sources may be used, for example, to determine the occurrence of an event.
  • the present invention may be used as part of a monitoring service where the control of media recording functions can be at least partially managed by the monitoring service.
  • one or more media recording devices within a home or business may be activated based on a detected condition, a schedule, or as part of a random status check, where certain parameters are controllable by a user (e.g., home owner, business owner).
  • the present invention may take advantage of artificial intelligence algorithms that enable a media recording device to establish its own rules and make its own decisions regarding which functions should be employed and to what extent, as determined based on one or more events.
  • a media recording device may be configured to operate without displaying images being recorded on the display of the device.
  • a phone may be filming and streaming a video to the cloud without displaying the video on the display of the phone.
  • This ‘non-display recording mode’ might be activated by a user selecting a button, providing a voice command, or automatically due to the occurrence of an event (e.g., a detected abrupt movement), etc. in the same manner as media recording can be activated.
  • multiple media recording devices may be configured to collaborate.
  • data samples from different devices can have relative timings that are coordinated. For example, if three media recording devices are recording the same event from three different locations, their data samples may have staggered timing such that data samples from the first media recording device may be provided every 3 seconds beginning at time t0, data samples from the second media recording device may be provided every 3 seconds beginning at time t0+1 second, and data samples from the third media recording device may be provided every 3 seconds beginning at time t0+2 seconds.
  • two or more media recording devices may coordinate their times such that they take data samples at substantially the same times, t0, t1, t2, etc.
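  • A minimal sketch of the staggered-timing example above; the function name and parameters are hypothetical:

```python
# Sketch of the staggered-timing example above: three devices sampling every
# 3 seconds, offset by 0, 1, and 2 seconds so their samples interleave.
def sample_times(device_index, n_devices, period_s=3.0, t0=0.0, count=4):
    offset = device_index * period_s / n_devices
    return [t0 + offset + k * period_s for k in range(count)]

for i in range(3):
    print(f"device {i}:", sample_times(i, n_devices=3))
# device 0: [0.0, 3.0, 6.0, 9.0]
# device 1: [1.0, 4.0, 7.0, 10.0]
# device 2: [2.0, 5.0, 8.0, 11.0]
```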
  • Other examples of collaboration include: 1) sharing of rules, thresholds, sensor information, and the like such that a set of parameters can be established that is used to manage conveyance of media data produced by multiple media recording devices; 2) sharing of recorded data among media recording devices, allowing the display of information from multiple devices as an event is being recorded by the devices; and 3) sharing of warnings and messages between devices upon occurrences of events (e.g., Fred's phone is code red and located at x,y,z coordinates).
  • FIG. 7 depicts an exemplary computing architecture 700 in accordance with the invention.
  • the computing architecture 700 includes a plug-in application 702 (e.g., a cell phone 'app') and a PC application 704.
  • the plug-in application 702 includes a front end 706 and an application program interface (API) 708.
  • the front end 706 provides a user interface that enables a user to log into the application, which would involve use of a user management service 712, which provides user login and authentication capabilities, and a corresponding payment management service 714, which provides for various means of payment for the application's services, products, etc.
  • the user management service 712 provides notification to the API 708 that the user is authorized (i.e., authenticated and appropriate payments have been made) to use the application.
  • the API 708 interfaces with a cloud service 720 and interfaces with a media server service 716, which streams media data to one or more media file databases 718, which may be provided by a cloud service or by another data storage service 720.
  • the API 708 also provides metadata associated with the media data to one or more metadata file databases 719.
  • the API 708 also interfaces with a web server service 722, which interfaces with a website 724 and a dashboard 726.
  • the website 724 provides unregistered users a product description, pricing, and a product registration form. Once registered, a user is able to access the dashboard 726, which can be used to configure parameters relating to the user's account and provides access to the user's media files.
  • a plurality of media recording devices interfaced with a cloud computing environment convey a corresponding plurality of recorded media and metadata to the cloud computing environment, where the plurality of recorded media and metadata may or may not correspond to multiple views of the same recorded event, and where the metadata enables the plurality of recorded media to be synchronized.
  • the metadata includes time samples in accordance with a common time reference (i.e., a universal clock, atomic clock, NIST, US Naval Observatory clock, etc.) and may include magnetometer data that can be used to determine the view angle of a media recording device (e.g., a smartphone) and/or location information from a location system such as GPS location information.
  • the plurality of media recording devices may record media and store metadata without using the cloud.
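  • A minimal sketch of what such a metadata sample might contain; the field names are assumptions, not the patent's schema:

```python
# Sketch of a per-sample metadata record as described: a time sample against
# a common reference plus optional GPS location and magnetometer heading.
from dataclasses import dataclass
from typing import Optional

@dataclass
class MetadataSample:
    t_common: float                       # seconds on the common time reference
    latitude: Optional[float] = None      # GPS location, if available
    longitude: Optional[float] = None
    heading_deg: Optional[float] = None   # magnetometer-derived view angle

sample = MetadataSample(t_common=1234.5, latitude=34.73,
                        longitude=-86.59, heading_deg=270.0)
print(sample)
```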
  • the present invention enables multiple individuals to use multiple media recording devices, for example, smartphones, to independently and asynchronously record an event (e.g., a baseball game) from different ad hoc locations, where the beginning and ending recording times of the corresponding multiple recorded views of the event are determined by the individuals recording the different views (or media clips) of the event.
  • the multiple individuals recording an event are not required to collaborate prior to or during their recording of the event, and a given individual may not have any a priori knowledge of the recording by other individuals or even be aware of the recording of the event by any other individuals.
  • a recorded event can be something involving some number of people that are collocated such as those attending a wedding where multiple persons might use their smartphones to record all or part of the wedding from different locations.
  • a recorded event might involve multitudes of people at many different locations, for example, persons recorded celebrating the occurrence of a new year.
  • An event might also be some sort of tragedy such as a natural disaster (e.g., an earthquake), a terrorist attack (e.g., 9/11), a volcanic eruption, or the like.
  • the metadata stored during the recording of the multiple recorded views by the multiple media recording devices allows the various recorded media clips to then be synchronized such that multiple views of an event can be used collaboratively to produce one or more composite videos that are a combination of a sequence of different views of an event taken during periods of time.
  • many different types of recording devices (e.g., video cameras, audio recorders, etc.) can be used as long as the devices store metadata along with their recordings in accordance with the invention, and the recorded output of the devices can be later synchronized and used together in a collaborative manner, for example, to produce a composite media output.
  • the plug-in application 702 running on the multiple smartphones of multiple individuals is used to record multiple views of an event where the media and metadata corresponding to the multiple views is stored in a media database 718 and metadata database 719 provided by a cloud service or another data storage service 720 .
  • a PC application 704 (a.k.a. a dashboard application) of any one of the multiple individuals can be provided the media and metadata of the multiple views of the event, where the views can be synchronized and combined in various ways to produce various composite videos made up of different views of the event recorded by the multiple smartphones.
  • FIG. 8 depicts an exemplary graphical user interface (GUI) 800 of a media player of a PC application 704 for playing a single view of an event using media and metadata corresponding to the event.
  • the GUI includes a media player viewing window 802 and a location/view angle indicator window 804 , which would typically display a coordinate system, a floor plan, a map or any other of various types of locational information that would provide an indication of the location 806 of a media recording device and also the view angle 808 of the media recording device which is indicated by a cone.
  • the location/view angle indicator window 804 may also include various other types of information such as weather radar, satellite imagery, and the like.
  • the GUI includes an information window 810 , which can be used for displaying the product name, logos, a banner, advertisements, and other information.
  • the media player viewing window 802 is controlled using media player controls 812 , which typically includes a current frame control 814 shown on a timeline 816 as well as various other control buttons for controlling basic media player functions (e.g., pause/play, fast forward, time, closed caption, full screen, sound control, zoom, etc.).
  • the location/view angle indicator window is shown as a plan view but could alternatively be presented using three dimensional viewing techniques so as to indicate view elevation. View elevation might also be indicated using a numerical indication.
  • FIG. 9 depicts an exemplary GUI 900 of a multiple view media player of a PC application 704 for producing a composite video from media and metadata corresponding to multiple views of an event.
  • the GUI 900 includes a media player viewing window 802, a location/view angle indicator window 804, an information window 810, and media player controls 812 like the GUI 800 of FIG. 8.
  • the locations 806a-806d of multiple media recording devices are indicated along with their respective view angles as indicated by cones 808a-808d.
  • multi-view media player viewing windows 902a-902d are used to display the multiple views of the event as recorded by the multiple media recording devices having the locations 806a-806d and viewing angles 808a-808d.
  • the multi-view media player viewing windows 902a-902d each have corresponding selection icons 904a-904d, which in a preferred embodiment are color-coded, and view mode control buttons 906a-906d for controlling the viewing mode of each respective viewing window 902a-902d.
  • the view mode control buttons 906a-906d are used to select which view is being played during a given period of time as part of the overall timeline of a composite video, where essentially the composite video consists of a sequence of selected views.
  • An additional selection icon 904e is shown in the location/view angle indicator window 804, which allows the window 804 to be selected and included in a composite video.
  • the timeline 816 of the composite video is color-coded consistent with the color coding of the selection icons 904a-904d corresponding to the views selected for the different portions of the composite video.
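  • A minimal sketch of such a color-coded timeline's underlying data: an ordered sequence of view segments; the segment values are illustrative:

```python
# Sketch: a composite video represented as ordered (start, end, view_id)
# segments along the overall timeline, with a check that the selected views
# tile the timeline without gaps or overlaps.
def validate_timeline(segments):
    """segments: list of (start_s, end_s, view_id), sorted by start time."""
    for (s1, e1, _), (s2, e2, _) in zip(segments, segments[1:]):
        if e1 != s2:
            raise ValueError(f"gap or overlap between t={e1} and t={s2}")
    return True

composite = [(0.0, 12.0, "view_A"),
             (12.0, 20.5, "view_C"),
             (20.5, 30.0, "view_B")]
print(validate_timeline(composite))   # True: views tile the timeline
```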
  • Viewing modes may include, for example, OFF, PLAY, and PERIODIC FRAME modes, where the PLAY mode plays the media normally and the PERIODIC FRAME mode samples frames from the media data periodically (e.g., once per second) and displays the same frame for a period of time.
  • the PERIODIC FRAME mode reduces bandwidth requirements and also reduces overall movement in the GUI, which might be preferable to certain users when selecting among views. Views may be played simultaneously or may be played one at a time. Non-selected views may update frames periodically (e.g., once per second) whereas the selected view plays normally.
  • a view control window 908 may include various functions specific to the multiple views or to the composite video.
  • a speed control 910 could be used to vary the speed (e.g., slow motion) at which one or more views is playing or a multi-view combination control 912 might be used, which causes a sequence of different views of the same or otherwise overlapping periods of time (e.g., four view angles of a bat hitting the winning home run).
  • different types of view editing functions can be included in the view control window 908 .
  • metadata files can be used by a plurality of media service providers to enable their media products to be synchronized, where such media products may involve videos, still camera images, animations, GUIs, etc.
  • media products may involve videos, still camera images, animations, GUIs, etc.
  • different vendor products can be used to produce composite products.
  • a provider of digital score board displays for sporting events could provide metadata files with its display data files enabling the displays to be integrated with other products that could benefit from the displays.
  • FIG. 10 depicts an exemplary GUI 1000 of a multiple view media player of a PC application 704 for producing a composite video from media and metadata corresponding to multiple views of a baseball game, which includes products that might be provided by other service providers.
  • the information window 804 of the GUI 1000 might depict a baseball park 1002 , which might be a live video image or images, a digital scoreboard 1004 , and a digital scorebook 1006 , which might be provided by other service providers but integrated into the GUI and synchronized with the multiple views of the baseball game using the metadata provided by the various service providers.
  • selection icons 904 e - 904 g can be associated with the output of these third party products, which causes their images to be included in a composite video.
  • the present invention enables various types of specialized theme-based PC applications 704 having GUIs that are tailored for specific types of events such as sporting events (e.g., tennis, basketball, soccer, baseball, football, volleyball, etc.), weddings, concerts, plays, recitals, political events, and any other type of event where specialized products can be integrated as a result of product vendors complying with metadata requirements that enable synchronization of products.
  • Such specialized PC applications 704 can have advertising focused on specific groups that would be interested in the specific type of events. For example, advertisements targeted at baseball fans might be displayed on a GUI having a baseball theme.
  • a multiple view video data package comprising a plurality of media files and metadata files corresponding to multiple views of a recorded event can be posted to an internet media-sharing website (e.g., YouTube®) of FaceBook®), where the package may or may not include one or more composite videos produced using the plurality of media files and metadata files.
  • an internet media-sharing website e.g., YouTube®
  • FaceBook® e.g., Facebook®
  • other users of the media-sharing website can comment on and rate (e.g., thumbs up or down) a composite video and/or can download the media files and metadata files corresponding to the recorded event to their own PC applications 704 allowing them to produce their own composite video(s).
  • the internet media-sharing website might provide a real-time dashboard application for producing a composite video using the media files and metadata files.
  • the present invention enables a new collaborative media sharing environment whereby users can combine views taken from their smartphones with other views or otherwise participate in the creation of composite videos from whatever views of an event are available. Users can post their multiple view video data packages and/or media and metadata for a single view using an identifier such as a hashtag (#).
  • applications such as Twitter® might be used to organize views in real time (e.g., “someone needs to move to the right of the stage to get a view . . . ”).
  • applications such as LinkedIn® might be used to organize the views recorded by different groups or organizations.
  • information may be displayed at an event, captured with a media recording device, and used to coordinate a collaborative recording of media relating to the event.
  • a barcode (or QR code) can be displayed on a placard, on a display, or in a program for an event; a user of a smartphone can use the camera of the smartphone to take a picture of the barcode, and a software application running on the phone would decipher the barcode information and use it to coordinate the collaborative recording of media relating to the event.
  • a person organizing the collaborative recording of media relating to an event can provide information to a graphical user interface of a software application that creates a barcode (or QR code), which can then be displayed at the event so that it can be captured by the media recording devices of persons wanting to participate in the collaborative recording.
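As an illustration of this coordination flow, the following is a minimal sketch of both sides, assuming the third-party Python packages `qrcode` and `opencv-python`; the payload fields (event identifier, upload URL, time reference) are hypothetical, chosen only to show the idea, not a format specified by this application.

```python
# Sketch: encoding and decoding event-coordination data in a QR code.
# Assumes the third-party `qrcode` and `opencv-python` packages; the payload
# fields below are illustrative only.
import json
import qrcode  # pip install qrcode[pil]
import cv2     # pip install opencv-python

def make_event_qr(event_id: str, upload_url: str, time_ref: str, path: str) -> None:
    """Organizer side: pack coordination info into a QR image for display."""
    payload = json.dumps({
        "event_id": event_id,      # identifier participants record under
        "upload_url": upload_url,  # where media/metadata should be conveyed
        "time_ref": time_ref,      # common time reference to sample against
    })
    qrcode.make(payload).save(path)

def read_event_qr(path: str) -> dict:
    """Participant side: decode the QR code from a captured photo."""
    image = cv2.imread(path)
    data, _, _ = cv2.QRCodeDetector().detectAndDecode(image)
    if not data:
        raise ValueError("no QR code found in image")
    return json.loads(data)

if __name__ == "__main__":
    make_event_qr("wedding-2017-05-20", "https://example.invalid/upload",
                  "NIST", "event_qr.png")
    print(read_event_qr("event_qr.png"))
```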
  • digital watermarks might be placed on certain frames during the recording of an event and a dashboard application might vary relative time bases of different views using the digital watermarks.
  • the media and metadata corresponding to a given recorded event can be made publicly available or access to such media and metadata can be controlled such that it is only provided to one or more authorized persons.
  • a service provider may convey media and metadata to authorized users in accordance with access control permissions, which may be administered using one or more layers of administration that control membership in groups, where access to media and metadata may involve an access control list, a password, and/or a role.
  • access to media and metadata can be controlled in the same or a similar way that access to location based information is controlled as described in U.S. Pat. No. 7,525,425, which is incorporated by reference herein in its entirety.
  • a person who records an event can be considered the owner of the corresponding recorded media and metadata and can determine which other person or persons have access to such data.
  • an owner of certain media and metadata may restrict access to persons included on an access control list, persons that know a password, or persons provided access because of a role (e.g., a detective, an administrator, etc.).
  • a person may request an owner to provide access to certain media and metadata, which might be granted electronically, for example, by an owner responding to an automated permission message in a manner similar to how a LinkedIn® connection can be requested and accepted.
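The following is a minimal sketch of such owner-controlled access, combining an access control list, a password, and roles as described above; the class, method, and field names are illustrative, not part of this application.

```python
# Sketch: owner-controlled access to recorded media and metadata, combining
# an access control list, a password, and roles. All names are illustrative.
import hashlib
from typing import Optional

class MediaAccessPolicy:
    def __init__(self, owner: str, password: Optional[str] = None):
        self.owner = owner
        self.acl = set()            # persons explicitly granted access
        self.allowed_roles = set()  # e.g., "detective", "administrator"
        self._pw_hash = (hashlib.sha256(password.encode()).hexdigest()
                         if password else None)

    def grant(self, user: str) -> None:
        """Owner accepts an automated permission request."""
        self.acl.add(user)

    def is_authorized(self, user: str, role: Optional[str] = None,
                      password: Optional[str] = None) -> bool:
        if user == self.owner or user in self.acl:
            return True
        if role is not None and role in self.allowed_roles:
            return True
        if password is not None and self._pw_hash is not None:
            return hashlib.sha256(password.encode()).hexdigest() == self._pw_hash
        return False

# Usage: the owner grants access after an automated permission request.
policy = MediaAccessPolicy(owner="alice", password="s3cret")
policy.allowed_roles.add("detective")
policy.grant("bob")                                     # request accepted by owner
assert policy.is_authorized("bob")                      # on the access control list
assert policy.is_authorized("carol", role="detective")  # role-based access
assert not policy.is_authorized("mallory")              # no list entry, role, or password
```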
  • a media recording device may include or be interfaced to a local media server or a data buffering device, which might be a memory card, thumb drive, or the like, that enables media data and metadata to be stored temporarily, for example, during a loss of a data link to the cloud.
  • data may be automatically buffered to a local media server or a data buffering device until successfully conveyed to the cloud in what can be described as store-and-forward streaming to the cloud.
  • buffering to a local media server or a buffering device only occurs when a link to the cloud is unavailable.
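A small sketch of this store-and-forward behavior follows, assuming stand-in callables for the uplink transport and link status; a real implementation would persist the buffer to the memory card or local media server rather than hold it in memory.

```python
# Sketch: store-and-forward streaming to the cloud. Chunks are held locally
# while the uplink is unavailable and forwarded, oldest first, once it returns.
# `send_to_cloud` and `link_up` are stand-ins for real transport code.
from collections import deque

class StoreAndForwardStreamer:
    def __init__(self, send_to_cloud, link_up):
        self._send = send_to_cloud  # callable(bytes) -> None; may raise ConnectionError
        self._link_up = link_up     # callable() -> bool
        self._buffer = deque()      # stand-in for a memory card / local media server

    def push(self, chunk: bytes) -> None:
        """Called for each recorded media/metadata chunk."""
        self._buffer.append(chunk)  # held only until successfully conveyed
        self.flush()

    def flush(self) -> None:
        """Forward buffered chunks in order while the link is up."""
        while self._buffer and self._link_up():
            try:
                self._send(self._buffer[0])
            except ConnectionError:
                break               # link dropped mid-send; keep the chunk
            self._buffer.popleft()  # discard only after a successful send
```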

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Graphics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

An improved system and method for media synchronization and collaboration involves a data storage, a plurality of media recording devices used by a plurality of users to independently record an event from multiple locations thereby producing a plurality of recorded media data corresponding to a plurality of views of the event, and a media player comprising a processor and a graphical user interface. Each of the plurality of media recording devices convey to the data storage media data and metadata corresponding to their respective view of the event, where the metadata includes time samples in accordance with a common time reference. The media player uses the metadata to synchronize and play the plurality of views of the event. The graphical user interface can be used to select views of the plurality of views of the event to be playing during periods of time as part of an overall timeline of a composite video that consists of a sequence of selected views. The media player is configured to create a multiple view video data package comprising a plurality of media files and metadata files corresponding to multiple views of a recorded event. The multiple view video data package may include a composite video produced using the plurality of media files and metadata files. The media player can be configured to post the multiple view video data package to an internet media-sharing website thereby enabling users of the internet media-sharing website to at least one of comment on the composite video, rate the composite video, or download the plurality of media files and the plurality of metadata files.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This United States Non-provisional application is a continuation-in-part of U.S. patent application Ser. No. 14/988,568, filed Jan. 5, 2016, which claims the benefit of U.S. Provisional Patent Application No. 62/099,937, filed Jan. 5, 2015, titled “A System and Method for Cloud-based Media Streaming”.
  • This U.S. Non-provisional patent application also claims the benefit of U.S. Provisional Patent Application No. 62/340,013, filed May 23, 2016, titled “System and Method for Cloud-based Media Synchronization and Collaboration”.
  • These patent applications are each incorporated by reference herein in their entirety.
  • FIELD OF THE INVENTION
  • The present invention relates generally to a system and method for media synchronization and collaboration. More particularly, the present invention relates to a system and method for media synchronization and collaboration, where metadata is used to synchronize media allowing multiple views of an event recorded independently by multiple media recording devices to be synchronized and combined into a collaborative media file.
  • SUMMARY OF THE INVENTION
  • Briefly, one aspect of the present invention involves an improved system for media synchronization and collaboration that includes a data storage, a plurality of media recording devices used by a plurality of users to independently record an event from multiple locations thereby producing a plurality of recorded media data corresponding to a plurality of views of the event, each of the plurality of media recording devices conveying to the data storage media data and metadata corresponding to their respective view of the event, the metadata including time samples in accordance with a common time reference, and a media player comprising a processor and a graphical user interface, the media player using the metadata to synchronize and play the plurality of views of the event.
  • The plurality of media recording devices can be a plurality of smart phones.
  • The graphical user interface may include a plurality of viewing windows used to display the plurality of views of the event.
  • Each of the plurality of viewing windows can have a corresponding selection icon of a plurality of selection icons.
  • The plurality of selection icons can be color-coded.
  • The plurality of selection icons can be used to select views of the plurality of views of the event to be playing during periods of time as part of an overall timeline of a composite video that consists of a sequence of selected views.
  • The graphical user interface can play the composite video.
  • The timeline can be color-coded to identify the view of the plurality of views that is playing during a given period of time of the timeline.
  • The graphical user interface can include a multi-view combination control that causes a sequence of different views of the same time period or overlapping time periods to be played in the composite video.
  • The metadata can include location and view angle data and the graphical user interface can provide the ability to display a location/view angle indicator window that indicates the location and view angle of each view of the plurality of views.
  • The location/view angle indicator window can have a selection icon that can be selected to cause it to be displayed in the composite video during a given period of time of the timeline.
  • The metadata can enable at least one of a still camera image, animation, digital score board, digital scorebook, or live video to be synchronized so that it can be displayed in the composite video during a given period of time of the timeline.
  • The media player can be configured to create a multiple view video data package comprising a plurality of media files and metadata files corresponding to multiple views of a recorded event.
  • The multiple view video data package may include a composite video produced using the plurality of media files and metadata files.
  • The media player can be configured to post the multiple view video data package to an internet media-sharing website thereby enabling users of the internet media-sharing website to at least one of comment on the composite video, rate the composite video, or download the plurality of media files and the plurality of metadata files.
  • Another aspect of the present invention involves a method of media synchronization and collaboration, comprising recording an event using a plurality of media recording devices that independently record the event to produce a plurality of recorded media data corresponding to a plurality of views of the event, storing media data and metadata corresponding to the plurality of views of the event to a data storage, the metadata including time samples in accordance with a common time reference, and providing a media player comprising a processor and a graphical user interface that uses the metadata to synchronize and play the plurality of views of the event.
  • The method may also include producing a composite video by selecting, using the graphical user interface, a sequence of selected views of the plurality of views of the event that play during periods of time as part of an overall timeline of the composite video.
  • The method may also include creating a multiple view video data package comprising a plurality of media files and a plurality of metadata files used to produce the composite video.
  • The multiple view video data package may include the composite video.
  • The method may also include posting the multiple view video data package to an internet media-sharing website thereby enabling users of the internet media-sharing website to at least one of comment on the composite video, rate the composite video, or download the plurality of media files and the plurality of metadata files.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention is described with reference to the accompanying drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.
  • FIG. 1 depicts a user of a phone recording an event that is conveyed to the cloud;
  • FIG. 2A depicts a first exemplary method in accordance with the invention;
  • FIG. 2B depicts a second exemplary method in accordance with the invention;
  • FIG. 2C depicts a third exemplary method in accordance with the invention;
  • FIG. 2D depicts a fourth exemplary method in accordance with the invention;
  • FIG. 3A depicts a fifth exemplary method in accordance with the invention;
  • FIG. 3B depicts a sixth exemplary method in accordance with the invention;
  • FIG. 3C depicts a seventh exemplary method in accordance with the invention;
  • FIG. 3D depicts an eighth exemplary method in accordance with the invention;
  • FIG. 4A depicts a ninth exemplary method in accordance with the invention;
  • FIG. 4B depicts a tenth exemplary method in accordance with the invention;
  • FIG. 4C depicts an eleventh exemplary method in accordance with the invention;
  • FIG. 5A depicts a twelfth exemplary method in accordance with the invention;
  • FIG. 5B depicts a thirteenth exemplary method in accordance with the invention;
  • FIG. 5C depicts a fourteenth exemplary method in accordance with the invention;
  • FIG. 6 depicts a fifteenth exemplary method in accordance with the invention;
  • FIG. 7 depicts an exemplary computing architecture in accordance with the invention;
  • FIG. 8 depicts an exemplary graphical user interface of a media player for playing a single view of an event using media and metadata corresponding to the event in accordance with the invention;
  • FIG. 9 depicts an exemplary graphical user interface of a multiple view media player for producing a composite video from media and metadata corresponding to multiple views of an event in accordance with the invention; and
  • FIG. 10 depicts an exemplary graphical user interface of a multiple view media player for producing a composite video from media and metadata corresponding to multiple views of a baseball game in accordance with the invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • In accordance with one aspect of the present invention, a media recording device interfaced with a cloud computing environment (i.e., the “cloud”) conveys a recorded media to the cloud in near real-time while the media is being recorded by the media recording device, where the conveying of the media to the cloud may be based upon the occurrence of a user-defined event and/or a user command, where the media may or may not be encrypted and storage of a local copy of the media is optional. The media recording device may also upload to the cloud a previously recorded media stored on the media recording device. The uploading of the previously recorded media may be in accordance with a schedule and/or upon the occurrence of a user-defined event, where the media may be encrypted when stored on the media recording device and subsequently uploaded to the cloud or stored on the media recording device as unencrypted data that is then encrypted as it is uploaded to the cloud or after it has been uploaded to the cloud (i.e., by a cloud service provider as opposed to encryption by the media recording device). Rules may be established and enforced for forbidding access, modification and/or erasure from the recording device and/or the cloud.
  • Under one arrangement a first recording device forwards a media to a second recording device that conveys the media to the cloud, where the second recording device could be a service interfacing with multiple first recording devices and which could be configured to have network bandwidth significantly greater than that required and typically used by the individual first recording devices.
  • In accordance with another aspect of the invention, data latency rules can be established to control whether various optional functions pertaining to processing of a recorded media (e.g., encryption, local storage, adding of metadata, etc.) are performed or not or an amount or rate corresponding to a processing function being performed (e.g., data compression level, data sampling rate, data buffering amount, resolution, etc.) for a given recorded media. Data latency rules may be based on one or more established data latency thresholds corresponding to one or more user-defined conditions. For example, a data compression rate and/or data sampling rate may be controlled in real-time as required to keep data latency below an established data latency threshold. Moreover, a data latency threshold may be conditional in that it might be modified or overridden, for example, based on an event such as a sensed event. For example, given a sensed fire condition, all unnecessary processing may be avoided regardless of an established data latency threshold.
  • Under one arrangement, a user may establish a set of parameters relating to which media data processing functions are to be used or not used and the extent to which they may be used (e.g., sampling rate, amount of data compression, data resolution) so as to control data latency while meeting certain user requirements (e.g., the user always wants encryption), where media data processing functions are either turned on or off or an extent (e.g., rate, resolution) is changed based on criteria or rules established by a user. Such criteria may determine a sampling rate, an amount of data compression, or the extent to which metadata is added. For example, a user may establish a parameter where video is to be captured at a 1080i (i.e., 1920×1080) resolution but, when a certain criterion is met, the resolution is to change to a different resolution, which may be, for example, a lower resolution (e.g., 720p) or a higher resolution (e.g., 8640p).
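The following is a minimal sketch of how such latency rules might be evaluated; the threshold value, the resolution ladder, and the fire-condition override are illustrative assumptions, not values specified by this application.

```python
# Sketch of a data latency rule engine (all thresholds/values are illustrative).
LATENCY_THRESHOLD_S = 2.0
RESOLUTIONS = ["8640p", "1080i", "720p", "480p"]  # ordered high to low

class LatencyRules:
    def __init__(self, always_encrypt: bool = True):
        self.always_encrypt = always_encrypt  # user requirement that never yields
        self.level = 1                        # start at 1080i, per the example above

    def apply(self, measured_latency_s: float, fire_detected: bool = False) -> dict:
        if fire_detected:
            # A sensed fire condition overrides the threshold: skip all
            # unnecessary processing regardless of the configured rules.
            return {"resolution": RESOLUTIONS[self.level], "encrypt": False,
                    "store_local_copy": False, "add_metadata": False}
        if measured_latency_s > LATENCY_THRESHOLD_S and self.level < len(RESOLUTIONS) - 1:
            self.level += 1                   # degrade resolution to cut latency
        elif measured_latency_s < LATENCY_THRESHOLD_S / 2 and self.level > 0:
            self.level -= 1                   # headroom available; restore quality
        return {"resolution": RESOLUTIONS[self.level],
                "encrypt": self.always_encrypt,
                "store_local_copy": measured_latency_s < LATENCY_THRESHOLD_S,
                "add_metadata": measured_latency_s < LATENCY_THRESHOLD_S}

rules = LatencyRules()
print(rules.apply(3.1))  # over threshold: steps down from 1080i to 720p
```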
  • Under another arrangement a command may be provided by the user of a media recording device to change a mode of operation. For example, an “upload button” might be pressed or an upload voice command may be spoken to cause an upload-to-the-cloud function to be started immediately. Similarly, a “fast upload button” might be pressed or a fast upload function otherwise initiated to cause an upload function to be initiated under conditions that provide for minimal data latency. Generally, a user may establish one or more events that correspond to ‘fast upload triggers’, whereby the occurrence of such an event causes the fast upload function to be initiated.
  • In accordance with one feature of the invention, a data latency indicator may be provided which might be a number and/or a color or some other indicator.
  • In accordance with another feature of the invention, an event may result in other applications running on a device (e.g., cell phone) being turned off in order to speed up performance of the device or otherwise reduce data latency.
  • In accordance with yet another aspect of the invention, access controls can be employed to prevent unauthorized access to or deletion of a recorded media stored on the media recording device and/or on the cloud. One or more deletion events can also be defined by a user, where a local copy of recorded media will be automatically deleted from the media recording device based on the occurrence of a deletion event.
  • A media recording device can include a video recording device and/or an audio recording device, for example, a camera and a microphone of a mobile phone or a Bluetooth or Wi-Fi connected device, and the media can be, for example, video (still picture or movie) and/or audio data recorded by the video and audio recording devices of the phone, where the recorded media is in the form of digital data. Other examples of media recording devices include a media recording device located in a home or business, a media recording device (e.g., dash cam) located in a vehicle (e.g., car, truck, emergency vehicle, plane, helicopter, drone, etc.), a media recording device (e.g., body cam) associated with a person or animal, or a media recording device associated with a fixed object (e.g., bridge, tree, light post, gas pump, etc.). Recorded media may be in the form of text, for example, where an audio recording device converts voice to text. Video data may correspond to a picture or video taken from the front (display side) of a cell phone or from the back of a cell phone, or both, which may be taken simultaneously. As such, an event being filmed may be captured at the same time the user of a media recording device is captured (e.g., a selfie).
  • Encryption of media data, whether stored on the media recording device or on the cloud, may for example involve use of a symmetric key encryption scheme or a public key encryption scheme. Encryption may be performed by the media recording device such that a local copy can be stored in encrypted form and/or media is conveyed to the cloud in encrypted form. Alternatively, media may be conveyed in unencrypted form and then encrypted by a cloud service provider as it is being received and stored on the cloud or sometime after it has been received and stored on the cloud.
  • Various data access and user authentication methods can be employed to control access to and/or deletion of data stored on the media recording device or on the cloud. Such methods may include a password, a signature, or a biometric such as an eye scan, a fingerprint scan, a facial image (i.e., an individual's photo, a selfie), a recognized voice, or the like. At least one physical key (e.g., a dongle) may be required to access data, where multiple keys (or dongles) distributed to multiple persons may be required to access or delete data. A third party authentication service provider might be used such as VeriSign. Generally, one skilled in the art of protecting data stored on an electronic device or across a network will understand that all sorts of methods can be employed to control access to data and to authenticate a user, where such controls can also be used to prevent unauthorized data deletion. Moreover, rules can be employed in conjunction with such access control methods, for example, access and/or data deletion may be limited to a certain time period, require a certain aging of data (i.e., an elapsed period of time), require an event to have occurred (such as described below), require the media recording device to be in a certain location, etc. Generally, the concepts described below relating to user-defined events being used to determine the starting and stopping of recording of a media recording device and corresponding conveying of recorded media to the cloud or the uploading of previously recorded media data to the cloud can also be applied for controlling access to and deletion of media stored on the cloud or on the media recording device. Alternatively, it may be desirable that a media can be identified that cannot be deleted from the media recording device and/or from the cloud under any circumstance or without participation by a third party given control over such access/deletion, such as an attorney, an editor of a publication, or some third party service.
  • Under a first arrangement, which is depicted in FIG. 1, a user 100 of a media recording device 102 runs an application (or ‘app’) or otherwise selects a mode of operation of the media recording device 102 that causes a media 104 to be conveyed to the cloud 106 while the media 104 is being recorded by the media recording device 102 upon the occurrence of a user command. The media may be encrypted by the media recording device prior to the media being conveyed to the cloud. Alternatively, the media may be conveyed to the cloud in unencrypted form where it then may or may not be encrypted. Under this arrangement, which may be referred to as a user-activated near real-time mode, storage of a local copy of the media on the media recording device is optional, where the media may be conveyed directly to the cloud and may never actually be stored locally on the media recording device. For example, a user of a mobile phone such as an HTC® phone may record an event with their phone and the media data would be conveyed directly to the cloud as it was being recorded with very low latency between the time a given data packet is recorded until it is stored on the cloud, where the data corresponding to the event would not be stored or otherwise be present on the phone unless it is necessary to temporarily buffer data for some required reason, for example, due to a poor or non-existent data connection between a first and second media recording device or between a media recording device and the cloud. Alternatively, the user may choose to store a local copy of the media data on the phone while also conveying the media data directly to the cloud as the media is being recorded, where the local copy of the media data may be encrypted or unencrypted and where the data conveyed to the cloud may be encrypted prior to being conveyed to the cloud or the data may be encrypted after it has been conveyed to the cloud. For example, unencrypted data might be conveyed to a cloud service provider that encrypts the data it receives from the media recording device prior to storing it, where the time required to encrypt the media would not add to the data latency of the media recording device, but the data would be vulnerable while being conveyed to the cloud because it is in an unencrypted form. Depending on the conditions of a given situation, it may be preferable to reduce data latency of the media recording device by not requiring the media recording device to encrypt a recorded media prior to conveying it to the cloud. As such, a user can establish rules used to control whether such processing is performed or not depending on the conditions of a given situation. For example, a user might set up a rule whereby video footage of a business security system would be automatically encrypted and conveyed to the cloud upon the occurrence of a user-defined security alarm event but encryption should not be performed should a fire condition (e.g., fire alarm, a sensor detecting smoke or heat, etc.) also be detected. Generally, all sorts of rules can be employed to control the processing performed prior to conveyance of media data to the cloud based on one or more conditions so as to control data latency.
  • FIGS. 2A-2D depict exemplary methods corresponding to a user-activated near real-time mode, where the amount of data latency between the time a given data packet is recorded until it is stored on the cloud depends on whether or not media is encrypted and whether or not a local copy of the media (or encrypted media) is stored on the media recording device. In reference to FIG. 2A, one skilled in the art will recognize that once media has been received by the cloud it may be encrypted by a cloud service provider prior to storage or at any time thereafter.
  • Under a second arrangement, a media recording device is configured to automatically begin and/or stop recording media to be encrypted and conveyed to the cloud while the media is being recorded by a media recording device upon the occurrence of a defined event or events. Under this arrangement, which may be referred to as an event-activated near real-time mode, storage of a local copy of the media on the media recording device is optional, where the local copy of the media data may be encrypted or unencrypted while stored on the media recording device.
  • An event can be generally described as an occurrence that meets an established criterion, condition or rule that can be recognized by a control system (e.g., an application running on a cell phone). For example, an event may be based on a position of an object/person/animal/vehicle, for example, a cell phone's specific location (i.e., latitude, longitude) as might be determined by a location system such as a global positioning system (GPS). An event might correspond to a status of a media recording device, for example, a battery status or a signal strength status. Similarly, an event might be based on a position of an object relative to a location of another object, where both objects might be fixed or mobile. Such location based events are commonly known as geolocation events where, generally, an event can be defined based on the location of one or more objects relative to one or more defined areas (e.g., a perimeter of a property or a building, or a room within a building) corresponding to one or more locations. An event may be based on movement, lack of movement, or a change in movement (e.g., speed or direction) of an object, which might be detected using a compass. For example, an ‘impact threshold’ may be established corresponding to an abrupt movement change indicating an impact associated with a media recording device (e.g., hitting the ground, being in a vehicle crash, etc.).
  • An event may be based on a position, for example, a position of the phone within a coordinate system. An event may relate to a movement of the phone or the non-movement of the phone, which might be detected using an accelerometer. An event may relate to a detected movement, which might be detected by a motion or proximity detector/radar. An event may be based on an orientation of an object, which might be measured using a 6-DOF measurement device. In one arrangement, the orientation of a phone may be determined using a magnetometer contained in a media recording device.
  • An event may be based upon an emergency or alarm situation, which might involve a severe weather advisory or warning relating to a thunderstorm, tornado, hurricane, snowstorm, high wind, etc. or any other sort of emergency situation such as a vehicle accident or crash, a break-in, a fire, a flood, a landslide, a prisoner escape, a riot, a hazardous materials spill, a runaway train, an airplane experiencing an emergency situation, etc. For example, a media recording device may be set to automatically begin recording if a nearby nuclear reactor alarm were to sound or if a person presses a medical alert button.
  • An event might involve a government controlled security level, for example, a Transportation Security Administration (TSA) security level, a Homeland Security level, or a DEFCON level.
  • An event might be a sensed environmental condition such as a sensed temperature, humidity, light, smoke, carbon dioxide, seismic event, sound intensity and/or frequency, pressure, altitude, water depth, or the like, which might be measured using one or more sensors. For example, a sensor might detect an earthquake, an explosion, thunder, a gunshot, or a scream. Similarly, an event might be a sensed physical condition of a person or animal such as a heart rate, breathing rate, skin resistivity, blood pressure, body temperature, blood sugar level, etc. One skilled in the art will also recognize that if the media recording device is configured to receive sensed information then the media recording device can also perform various other processing beyond uploading media to the cloud relating to the sensed information. For example, seismic information sensed by the media recording device might be used to identify the location, timing, and magnitude of a seismic event, which might even be used to determine an amount of time before a catastrophic event will occur at the location of the media recording device for providing warning, instructions, or other relevant information.
  • An event may involve the recognition of a command such as a voice command, a hand gesture, or an RF signal command.
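As a concrete illustration, the sketch below evaluates two of the event types described above: an impact event based on an ‘impact threshold’ and a geolocation event based on a defined area. The threshold, coordinates, and flat-earth distance approximation are assumptions for illustration only.

```python
# Sketch: recognizing an impact event and a geolocation event.
# Threshold, coordinates, and the flat-earth approximation are illustrative.
import math

IMPACT_THRESHOLD_G = 4.0               # abrupt-movement ('impact') threshold, in g
GEOFENCE_CENTER = (34.7304, -86.5861)  # hypothetical defined area (lat, lon)
GEOFENCE_RADIUS_M = 100.0

def impact_event(accel_g) -> bool:
    """True when the acceleration magnitude crosses the impact threshold."""
    return math.sqrt(sum(a * a for a in accel_g)) >= IMPACT_THRESHOLD_G

def geolocation_event(lat: float, lon: float) -> bool:
    """True when the device is inside the defined area (small-area approximation)."""
    dlat = (lat - GEOFENCE_CENTER[0]) * 111_320.0  # meters per degree of latitude
    dlon = (lon - GEOFENCE_CENTER[1]) * 111_320.0 * math.cos(
        math.radians(GEOFENCE_CENTER[0]))
    return math.hypot(dlat, dlon) <= GEOFENCE_RADIUS_M

def should_start_recording(accel_g, lat, lon) -> bool:
    return impact_event(accel_g) or geolocation_event(lat, lon)

print(should_start_recording((0.1, 1.0, 0.2), 34.7305, -86.5860))  # True (in area)
print(should_start_recording((3.0, 2.5, 1.0), 0.0, 0.0))           # True (impact)
```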
  • FIGS. 3A-3D depict exemplary methods corresponding to an event-activated near real-time mode, where the amount of data latency between the time a given data packet is recorded until it is stored on the cloud depends on whether or not media is encrypted and whether or not a local copy of the media (or encrypted media) is stored on the media recording device. In reference to FIG. 3A, one skilled in the art will recognize that once media has been received by the cloud it may be encrypted by a cloud service provider prior to storage or at any time thereafter.
  • Under the first or second arrangements, efforts can be made to limit latency between recording and conveyance to the cloud to an amount less than or equal to a defined latency limit, where to meet an established latency limit, storing of a local copy of the media and/or encryption of the local copy may not occur.
  • Under a third arrangement, a local copy of a recorded media is stored on the media recording device and all or part of the stored local copy of the media is encrypted and conveyed to the cloud in accordance with a defined schedule. The scheduled uploading of the media to the cloud may be referred to as a scheduled upload mode, where the local copy of the media data may be encrypted or unencrypted while stored on the media recording device. Optionally, the media data can be automatically deleted from the media recording device once it has been conveyed to the cloud. For example, a user of a phone may choose to have media data moved from a phone to the cloud weekly, daily, hourly, or at specific scheduled times and/or in response to an event, for example, a voice command, a location, an emergency condition, etc. The media may be stored and uploaded using a rolling period of time, for example, at the end of each day, the stored media data from the same day one week prior may be automatically deleted such that, at any given time, there is stored media data for the most recent seven days, where the rolling periodic upload mode can be overridden, the rolling period of time can be changed (increased or decreased), and specific subsets of stored media data can be identified as not to be deleted. For example, a rolling period of time mode might be configured to only delete media data from 12 pm to 6 am or to never delete data recorded on a Saturday. Generally, one skilled in the art will recognize that all sorts of options for controlling one or more periods of time where stored media data would be uploaded to the cloud and automatically deleted from local storage, or not, are possible.
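A minimal sketch of the rolling-period deletion logic described above follows, including the two example overrides (only media recorded from 12 pm to 6 am may be deleted, and Saturday recordings are never deleted); the function signature and fields are illustrative.

```python
# Sketch: rolling seven-day retention with the example overrides above.
from datetime import datetime, timedelta

ROLLING_DAYS = 7

def eligible_for_deletion(recorded_at: datetime, uploaded: bool,
                          protected: bool, now: datetime) -> bool:
    if not uploaded or protected:
        return False  # delete only media already conveyed to the cloud
    if recorded_at.weekday() == 5:
        return False  # never delete data recorded on a Saturday
    # Example rule: only media recorded from 12 pm to 6 am may be deleted.
    if not (recorded_at.hour >= 12 or recorded_at.hour < 6):
        return False
    return now - recorded_at > timedelta(days=ROLLING_DAYS)

now = datetime(2016, 1, 12, 23, 59)
print(eligible_for_deletion(datetime(2016, 1, 4, 14, 0), True, False, now))  # True
print(eligible_for_deletion(datetime(2016, 1, 4, 9, 0), True, False, now))   # False (morning)
```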
  • FIGS. 4A-4C depict exemplary methods corresponding to a scheduled upload mode, where the amount of data latency between the time uploading of a stored media begins until it is stored on the cloud depends on whether or not the media is encrypted by the media recording device. In reference to FIG. 4A, one skilled in the art will recognize that once media has been received by the cloud it may be encrypted by a cloud service provider prior to storage or at any time thereafter.
  • Under a fourth arrangement, a local copy of a recorded media is stored on a media recording device and then later automatically conveyed to a cloud computing environment upon the occurrence of an event, such as described above, where the media may be encrypted prior to being conveyed to the cloud and where the local copy of the media data may be encrypted or unencrypted while stored on the media recording device. Optionally, the media data can be automatically deleted from the media recording device once it has been conveyed to the cloud.
  • FIGS. 5A-5C depict exemplary methods corresponding to an event-activated update mode, where the amount of data latency between the time uploading of a stored media begins until it is stored on the cloud depends on whether or not the media is encrypted by the media recording device. In reference to FIG. 5A, one skilled in the art will recognize that once media has been received by the cloud it may be encrypted by a cloud service provider prior to storage or at any time thereafter.
  • Under a fifth arrangement, a local copy of a recorded media is deleted upon the occurrence of an event, such as described above. FIG. 6 depicts an exemplary method corresponding to an event-activated delete mode.
  • Various metadata can be conveyed to the cloud along with the media data such as the media author, media title, date and time of the media recording, location and/or orientation of the media recording device, velocity, acceleration, temperature, barometric pressure, biometric data, light levels, etc. Metadata might include the person or persons in a video, or a short description or keyword(s) such as wedding, pet's name, flowers, waterfall, food, or the like. Generally, one skilled in the art will understand that such metadata can be used to enable processing of the media data from the cloud, including a user retrieving a subset or subsets of such media data based upon a query of the metadata stored along with the media data. Whether or not metadata is added to media prior to it being conveyed to the cloud can also be controlled in accordance with a data latency limit in a manner similar to how encryption can be controlled. Similarly, whether or not metadata is added to media prior to it being conveyed to the cloud can be controlled in accordance with an established rule and one or more conditions of a situation. Under one arrangement, metadata (e.g., timing and location) corresponding to one or more recorded media of one or more media recording devices may be used to locate the source of a sound as recorded by the one or more media recording devices, where one skilled in the art will recognize that one recorded sound might provide a range of the source relative to a recording device, two recordings of the sound may determine a plane relative to the locations of the recording devices, and three recordings of the sound may identify a coordinate of the source relative to the three recording devices. A source of a sound could be, for example, a gun or a tornado. One skilled in the art will recognize that all sorts of data processing involving multiple recorded media data by one or more media recording devices are possible.
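A small sketch of metadata-based retrieval follows; the record fields and query parameters are illustrative stand-ins for a cloud-side metadata database.

```python
# Sketch: retrieving a subset of recorded media by querying the metadata
# stored alongside it. The record layout is illustrative; a real deployment
# would query the cloud-side metadata database rather than an in-memory list.
from datetime import datetime

records = [
    {"title": "clip1", "author": "alice", "keywords": {"wedding", "flowers"},
     "recorded_at": datetime(2016, 6, 4, 15, 2), "location": (34.73, -86.59)},
    {"title": "clip2", "author": "bob", "keywords": {"waterfall"},
     "recorded_at": datetime(2016, 6, 5, 9, 40), "location": (35.05, -85.31)},
]

def query(records, keyword=None, author=None, after=None):
    """Return records matching every supplied criterion."""
    hits = records
    if keyword is not None:
        hits = [r for r in hits if keyword in r["keywords"]]
    if author is not None:
        hits = [r for r in hits if r["author"] == author]
    if after is not None:
        hits = [r for r in hits if r["recorded_at"] >= after]
    return hits

print([r["title"] for r in query(records, keyword="wedding")])  # ['clip1']
```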
  • A user interface could be used to define the events, rules, and conditions required to support event-activated approaches described herein. Similarly, a user interface can be used to define limits such as latency limits, to manage encryption, and to enter metadata. Similarly, a user interface can be used to produce queries used to retrieve media data from the cloud. Generally, one skilled in the art of data management will understand that such interfaces can be employed to practice the invention.
  • The following are examples of the use of the present invention:
      • a baby monitoring system whereby a sensed condition such as a temperature, irregular heartbeat, or the like causes the baby monitoring system to convey sound, video, and sensor information to the cloud, where it may then be forwarded to medical personnel either manually or automatically as part of a service.
      • a business security system whereby a detected forced entry automatically conveys security footage to the cloud, where it may then be forwarded to security personnel or police either manually or automatically as part of a service.
      • a structure health monitoring system whereby, upon the occurrence of an earthquake, sensor information pertaining to the health of the structure (e.g., bridge, building, dam, etc.) is automatically conveyed to the cloud, where video footage may also be automatically recorded and conveyed.
      • a personal video monitoring system worn on a person automatically beginning recording and conveying video and/or sensor information to the cloud upon recognition of an irregular heartbeat or other sensed characteristic of a person, which might result from fear, an accident, a medical condition (e.g., a diabetic seizure), excitement, or the like.
      • a vehicle monitoring system that begins recording occurrences outside the vehicle and/or inside the vehicle given a sensed condition such as a break-in of the vehicle, the vehicle being in an accident, the vehicle being driven recklessly, etc.
      • a drone-based video surveillance system reacts to an occurrence on the ground (e.g., a detected explosion) by directing the drone to reduce altitude or alter course so as to achieve a different surveillance location or to cause a zoom function in a camera to zoom in so as to better view the occurrence on the ground.
  • The present invention can be practiced using publicly available computing devices, communications networks, and related software or can be practiced using proprietary computing devices, communications networks, and/or software. Rules and thresholds and the like can be established for one or more media recording devices using one or more computing devices (e.g., a desktop computer) other than a recording device. Similarly, an interface can be provided to access media data stored on the cloud via computing devices other than a media recording device. Under one arrangement, a product is provided that includes a software application resident on a media recording device and a software application resident on a computing device other than a recording device. For example, an application running on a cell phone may store media data to the cloud that is later accessed via a desktop computer via an internet connection. Similarly, an application (e.g., a dashboard) executing on a desktop computer may be used to configure parameters (e.g., rules, thresholds, etc.) relating to a user account that are then loaded by a cell phone application and used to manage the conveyance of recorded media data to the cloud by the cell phone. Under another arrangement, one or more other applications used to manage events such as a calendar management application (e.g., Microsoft Outlook®) can be used to establish and manage events that are used to manage the conveyance of media data to the cloud. For example, a meeting request received via an email may establish a location and a time used in a rule used to manage the conveyance of media data to the cloud. Similarly, an alert condition established in a weather alert application might be inherited by another application managing the conveyance of media data to the cloud. Under yet another arrangement, the application managing the conveyance of media data to the cloud may interface with one or more publicly available data sources (e.g., National Weather Service, USGS Earthquake Early Warning system, an RSS news blog) and/or private data sources (e.g., a Social Directory API), where data provided by the one or more publicly available data sources and/or private data sources may be used, for example, to determine the occurrence of an event.
  • The present invention may be used as part of a monitoring service where the control of media recording functions can be at least partially managed by the monitoring service. For example, one or more media recording devices within a home or business may be activated based on a detected condition, a schedule, or as part of a random status check, where certain parameters are controllable by a user (e.g., home owner, business owner).
  • The present invention may take advantage of artificial intelligence algorithms that enable a media recording device to establish its own rules and make its own decisions regarding which functions should be employed and to what extent, as determined based on one or more events.
  • Under one arrangement, a media recording device may be configured to operate without displaying images being recorded on the display of the device. For example, a phone may be filming and streaming a video to the cloud without displaying the video on the display of the phone. This ‘non-display recording mode’ might be activated by a user selecting a button, providing a voice command, or automatically due to the occurrence of an event (e.g., a detected abrupt movement), etc. in the same manner as media recording can be activated.
  • In accordance with an embodiment of the invention, multiple media recording devices may be configured to collaborate. In one arrangement, data samples from different devices can have relative timings that are coordinated. For example, if three media recording devices are recording the same event from three different locations, their data samples may have staggered timing such that data samples from the first media recording device may be provided every 3 seconds beginning at time t0, data samples from the second media recording device may be provided every 3 seconds beginning at time t0+1 second, and data samples from the third media recording device may be provided every 3 seconds beginning at time t0+2 seconds. Alternatively, two or more media recording devices may coordinate their times such that they take data samples at substantially the same times, t0, t1, t2, etc. Other forms of collaboration include: 1) sharing of rules, thresholds, sensor information, and the like such that a set of parameters can be established that is used to manage conveyance of media data produced by multiple media recording devices, 2) sharing of recorded data among media recording devices allowing the display of information from multiple devices as an event is being recorded by the devices, and 3) sharing of warnings and messages between devices upon occurrences of events (e.g., Fred's phone is code red and located at x,y,z coordinates).
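A minimal sketch of the staggered-timing example above (three devices, a 3-second period, 1-second offsets from a shared t0); the function and parameter names are illustrative.

```python
# Sketch: staggered sample timing for three collaborating devices, per the
# example above (3-second period, devices offset by 0, 1, and 2 seconds).
def staggered_sample_times(t0: float, device_index: int, n_devices: int = 3,
                           period_s: float = 3.0, count: int = 4):
    """Sample times for one device: t0 plus its stagger offset, then every period."""
    offset = device_index * (period_s / n_devices)  # 1-second steps for 3 devices
    return [t0 + offset + k * period_s for k in range(count)]

for i in range(3):
    print(f"device {i}:", staggered_sample_times(t0=0.0, device_index=i))
# device 0: [0.0, 3.0, 6.0, 9.0]
# device 1: [1.0, 4.0, 7.0, 10.0]
# device 2: [2.0, 5.0, 8.0, 11.0]
```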
  • FIG. 7 depicts an exemplary computing architecture 700 in accordance with the invention. Referring to FIG. 7, the computing architecture 700 includes a plug-in application 702 (e.g., a cell phone ‘app’) and a PC application 704. The plug-in application 702 includes a front end 706 and an application program interface (API) 708. The front end 706 provides a user interface that enables a user to log into the application, which involves use of a user management service 712, which provides user login and authentication capabilities, and a corresponding payment management service 714, which provides for various means of payment for the application's services, products, etc. The user management service 712 provides notification to the API 708 that the user is authorized (i.e., authenticated and appropriate payments have been made) to use the application. The API 708 interfaces with a cloud service 720 and with a media server service 716, which streams media data to one or more media file databases 718, which may be provided by a cloud service or by another data storage service 720. The API 708 also provides metadata associated with the media data to one or more metadata file databases 719. The API 708 also interfaces with a web server service 722, which interfaces with a website 724 and a dashboard 726. The website 724 provides unregistered users a product description, pricing, and a product registration form. Once registered, a user is able to access the dashboard 726, which can be used to configure parameters relating to the user's account and provides access to the user's media files.
  • In accordance with another aspect of the present invention, a plurality of media recording devices interfaced with a cloud computing environment convey a corresponding plurality of recorded media and metadata to the cloud computing environment, where the plurality of recorded media and metadata may or may not correspond to multiple views of the same recorded event, and where the metadata enables the plurality of recorded media to be synchronized. The metadata includes time samples in accordance with a common time reference (i.e., a universal clock, atomic clock, NIST, US Naval Observatory clock, etc.) and may include magnetometer data that can be used to determine the view angle of a media recording device (e.g., a smartphone) and/or location information from a location system such as GPS location information. One skilled in the art will recognize that the plurality of media recording devices may record media and store metadata without using the cloud.
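The sketch below shows what one such per-view metadata sample might look like; the field names, and the use of system UTC time as a stand-in for a common time reference, are illustrative assumptions.

```python
# Sketch: one per-view metadata sample of the kind conveyed alongside the
# media. Field names are illustrative; time.time() stands in for a common
# time reference such as an NIST-disciplined clock.
import json
import time

def metadata_sample(view_id: str, heading_deg: float, lat: float, lon: float) -> str:
    return json.dumps({
        "view_id": view_id,                    # identifies this recording/view
        "time": time.time(),                   # time sample on the common reference
        "view_angle_deg": heading_deg,         # derived from the magnetometer
        "location": {"lat": lat, "lon": lon},  # from a location system such as GPS
    })

print(metadata_sample("phone-1", heading_deg=212.5, lat=34.7304, lon=-86.5861))
```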
  • The present invention enables multiple individuals to use multiple media recording devices, for example, smartphones, to independently and asynchronously record an event (e.g., a baseball game) from different ad hoc locations, where the beginning and ending recording times of the corresponding multiple recorded views of the event are determined by the individuals recording the different views (or media clips) of the event. Generally, the multiple individuals recording an event are not required to collaborate prior to or during their recording of the event, and a given individual may not have any a priori knowledge of the recording by other individuals or even be aware of the recording of the event by any other individuals. A recorded event can be something involving some number of people that are collocated, such as those attending a wedding where multiple persons might use their smartphones to record all or part of the wedding from different locations. Or, a recorded event might involve multitudes of people at many different locations, for example, persons recorded celebrating the occurrence of a new year. An event might also be some sort of tragedy such as a natural disaster (e.g., an earthquake), a terrorist attack (e.g., 9/11), a volcanic eruption, or the like.
  • The metadata stored during the recording of the multiple recorded views by the multiple media recording devices allows the various recorded media clips to then be synchronized such that multiple views of an event can be used collaboratively to produce one or more composite videos that are a combination of a sequence of different views of an event taken during periods of time. Generally, many different types of recording devices (e.g., video cameras, still cameras, audio recorders, etc.) can independently and asynchronously record an event, where the devices store metadata along with their recordings in accordance with the invention, and the recorded output of the devices can be later synchronized and used together in a collaborative manner, for example, to produce a composite media output.
  • Referring again to the architecture of FIG. 7, the plug-in application 702 running on the multiple smartphones of multiple individuals is used to record multiple views of an event, where the media and metadata corresponding to the multiple views are stored in a media database 718 and metadata database 719 provided by a cloud service or another data storage service 720. A PC application 704 (a.k.a. a dashboard application) of any one of the multiple individuals can be provided the media and metadata of the multiple views of the event, where the views can be synchronized and combined in various ways to produce various composite videos made up of different views of the event recorded by the multiple smartphones.
  • FIG. 8 depicts an exemplary graphical user interface (GUI) 800 of a media player of a PC application 704 for playing a single view of an event using media and metadata corresponding to the event. Referring to FIG. 8, the GUI includes a media player viewing window 802 and a location/view angle indicator window 804, which would typically display a coordinate system, a floor plan, a map, or any other of various types of locational information that would provide an indication of the location 806 of a media recording device and also the view angle 808 of the media recording device, which is indicated by a cone. The location/view angle indicator window 804 may also include various other types of information such as weather radar, satellite imagery, and the like. The GUI includes an information window 810, which can be used for displaying the product name, logos, a banner, advertisements, and other information. The media player viewing window 802 is controlled using media player controls 812, which typically include a current frame control 814 shown on a timeline 816 as well as various other control buttons for controlling basic media player functions (e.g., pause/play, fast forward, time, closed caption, full screen, sound control, zoom, etc.).
  • The location/view angle indicator window is shown as a plan view but could alternatively be presented using three-dimensional viewing techniques so as to indicate view elevation. View elevation might also be indicated using a numerical indication.
  • FIG. 9 depicts an exemplary GUI 900 of a multiple view media player of a PC application 704 for producing a composite video from media and metadata corresponding to multiple views of an event. Referring to FIG. 9, the GUI 900 includes a media player viewing window 802, a location/view angle indicator window 804, an information window 810, and media player controls 812 like the GUI 800 of FIG. 8. As seen in the location/view angle indicator window 804, the locations 806a-806d of multiple media recording devices are indicated along with their respective view angles as indicated by cones 808a-808d. Four multi-view media player viewing windows 902a-902d are used to display the multiple views of the event as recorded by the multiple media recording devices having the locations 806a-806d and viewing angles 808a-808d. The multi-view media player viewing windows 902a-902d each have corresponding selection icons 904a-904d, which in a preferred embodiment are color-coded, and view mode control buttons 906a-906d for controlling the viewing mode of each respective viewing window 902a-902d. The view mode control buttons 906a-906d are used to select which view is being played during a given period of time as part of the overall timeline of a composite video, where essentially the composite video consists of a sequence of selected views. An additional selection icon 904e is shown in the location/view angle indicator window 804, which allows the window 804 to be selected and included in a composite video. In a preferred embodiment the timeline 816 of the composite video is color-coded consistent with the color coding of the selection icons 904a-904d corresponding to the views selected for the different portions of the composite video. Viewing modes may include, for example, OFF, PLAY, and PERIODIC FRAME modes, where the PLAY mode plays the media normally and the PERIODIC FRAME mode samples frames from the media data periodically (e.g., once per second) and displays the same frame for a period of time. Generally, the PERIODIC FRAME mode reduces bandwidth requirements but also reduces overall movement in the GUI, which might be preferable to certain users when selecting among views. Views may be played simultaneously or may be played one at a time. Non-selected views may update frames periodically (e.g., once per second) whereas the selected view plays normally.
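To make the timeline mechanics concrete, here is a minimal sketch of how a composite video could be represented as a sequence of selected views and mapped back to frames in each source clip via the common-time metadata; the clip identifiers, start times, and fixed frame rate are illustrative assumptions.

```python
# Sketch: a composite video as an ordered sequence of selected views. Each
# clip's metadata gives its start time on the common time reference, so a
# composite segment (view, t_start, t_end) maps to a frame range in that clip.
clips = {            # view id -> clip start time on the common time reference
    "view_a": 100.0,
    "view_b": 102.5,
}
FPS = 30.0           # assumed constant frame rate, for illustration

# The composite is just an ordered list of (view, start, end) on the timeline.
composite = [("view_a", 100.0, 104.0), ("view_b", 104.0, 107.0)]

def segment_frames(view: str, t_start: float, t_end: float):
    """Map a composite segment to frame indices within the selected clip."""
    clip_t0 = clips[view]
    first = round((t_start - clip_t0) * FPS)
    last = round((t_end - clip_t0) * FPS)
    return view, first, last

for seg in composite:
    print(segment_frames(*seg))
# ('view_a', 0, 120) then ('view_b', 45, 135): aligned by the metadata offsets
```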
  • To the right of the media player viewing window 802 is a view control window 908, which may include various functions specific to the multiple views or to the composite video. For example, a speed control 910 could be used to vary the speed (e.g., slow motion) at which one or more views plays, or a multi-view combination control 912 might be used, which causes a sequence of different views of the same or otherwise overlapping periods of time to be played (e.g., four view angles of a batter hitting the winning home run). Generally, different types of view editing functions can be included in the view control window 908.
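Since a composite video essentially consists of a sequence of selected views, its timeline can be modeled as ordered segments on the common time reference. The following sketch is one assumed representation; the segment fields, view identifiers, and speed attribute are illustrative only:

```python
# Assumed model of a composite-video timeline as a sequence of selected views.
from dataclasses import dataclass

@dataclass
class Segment:
    start: float        # segment start on the common time reference (seconds)
    end: float          # segment end (seconds)
    view_id: str        # selected view, e.g., "902a".."902d" or the window "804"
    speed: float = 1.0  # playback speed; 0.5 models the slow-motion control 910

composite = [
    Segment(0.0, 12.0, "902a"),
    Segment(12.0, 15.0, "902c", speed=0.5),  # slow-motion replay
    Segment(12.0, 15.0, "902d", speed=0.5),  # multi-view combination: same span, second angle
    Segment(15.0, 30.0, "902b"),
]

def views_at(t):
    """Return the view(s) scheduled to play at common time t."""
    return [s.view_id for s in composite if s.start <= t < s.end]

print(views_at(13.0))  # ['902c', '902d']
```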
  • In accordance with another aspect of the invention, metadata files can be used by a plurality of media service providers to enable their media products to be synchronized, where such media products may involve videos, still camera images, animations, GUIs, etc. As such, different vendor products can be used to produce composite products. For example, a provider of digital scoreboard displays for sporting events could provide metadata files with its display data files, enabling the displays to be integrated with other products that could benefit from them.
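A vendor wishing to make its media product synchronizable could ship a metadata file mapping its product's internal media time to the common time reference. The JSON layout below is purely an assumed example of such a file; none of the field names come from the disclosure:

```python
# Assumed vendor-neutral metadata file pairing a display data file with
# common-time samples so other products can synchronize to it.
import json

scoreboard_metadata = {
    "provider": "example-scoreboard-vendor",          # hypothetical vendor id
    "media_file": "scoreboard_display.dat",           # the vendor's display data file
    "time_reference": "GPS",                          # the shared/common time reference
    "time_samples": [
        {"common_time": 0.0,   "media_time": 0.0},
        {"common_time": 600.0, "media_time": 599.8},  # sample pair for drift correction
    ],
}

with open("scoreboard_metadata.json", "w") as f:
    json.dump(scoreboard_metadata, f, indent=2)
```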
  • FIG. 10 depicts an exemplary GUI 1000 of a multiple view media player of a PC application 704 for producing a composite video from media and metadata corresponding to multiple views of a baseball game, which includes products that might be provided by other service providers. For example, the location/view angle indicator window 804 of the GUI 1000 might depict a baseball park 1002, which might be a live video image or images, a digital scoreboard 1004, and a digital scorebook 1006, which might be provided by other service providers but integrated into the GUI and synchronized with the multiple views of the baseball game using the metadata provided by the various service providers. As shown, selection icons 904e-904g can be associated with the output of these third-party products, which causes their images to be included in a composite video.
  • Generally, the present invention enables various types of specialized theme-based PC applications 704 having GUIs that are tailored for specific types of events such as sporting events (e.g., tennis, basketball, soccer, baseball, football, volleyball, etc.), weddings, concerts, plays, recitals, political events, and any other type of event where specialized products can be integrated as a result of product vendors complying with metadata requirements that enable synchronization of products. Such specialized PC applications 704 can carry advertising focused on specific groups that would be interested in the specific type of event. For example, advertisements targeted at baseball fans might be displayed on a GUI having a baseball theme.
  • In accordance with a still further aspect of the invention, a multiple view video data package comprising a plurality of media files and metadata files corresponding to multiple views of a recorded event can be posted to an internet media-sharing website (e.g., YouTube® or Facebook®), where the package may or may not include one or more composite videos produced using the plurality of media files and metadata files. As such, other users of the media-sharing website can comment on and rate (e.g., thumbs up or down) a composite video and/or can download the media files and metadata files corresponding to the recorded event to their own PC applications 704, allowing them to produce their own composite video(s). Alternatively, the internet media-sharing website might provide a real-time dashboard application for producing a composite video using the media files and metadata files.
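One plausible concrete form for such a multiple view video data package is a single archive holding the per-view media files, their metadata files, and any composite video. The sketch below assumes a zip container and a directory layout of the author's own choosing; the disclosure does not prescribe a package format:

```python
# Assumed packaging of a multiple view video data package as a zip archive;
# the directory layout is illustrative, not specified by the disclosure.
import zipfile

def build_package(package_path, media_files, metadata_files, composite=None):
    """Bundle per-view media and metadata files, plus an optional composite video."""
    with zipfile.ZipFile(package_path, "w") as z:
        for path in media_files:
            z.write(path, f"media/{path}")        # one media file per view
        for path in metadata_files:
            z.write(path, f"metadata/{path}")     # matching metadata per view
        if composite is not None:
            z.write(composite, f"composite/{composite}")
```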
  • As such, the present invention enables a new collaborative media sharing environment whereby users can combine views taken from their smartphones with other views or otherwise participate in the creation of composite videos from whatever views of an event are available. Users can post their multiple view video data packages and/or media and metadata for a single view under an identifier such as a hashtag.
  • Moreover, various other social media products can be leveraged, including Twitter®, which might be used to organize views in real time (e.g., someone needs to move to the right of the stage to get a view . . . ). Similarly, applications such as LinkedIn® might be used to organize the views recorded by different groups or organizations.
  • In accordance with an aspect of the invention, information may be displayed at an event that can be captured with a media recording device and used to coordinate a collaborative recording of media relating to the event. For example, a barcode (or QR code) can be displayed on a placard, on a display, or in a program for an event. A user of a smartphone can use the camera of the smartphone to take a picture of the barcode, and a software application running on the phone would decipher the barcode information and use it to coordinate the collaborative recording of media relating to the event. Under one arrangement, a person organizing the collaborative recording of media relating to an event can provide information to a graphical user interface of a software application that subsequently creates a barcode (or QR code), which can then be displayed at the event such that it can be captured by the media recording devices of persons wanting to participate in the collaborative recording of media relating to the event.
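The organizer-side step might look like the following sketch, which encodes coordination parameters into a QR code image for display at the venue. It assumes the third-party Python package qrcode (installable as qrcode[pil]); the payload fields and URL are hypothetical:

```python
# Assumed organizer-side QR generation; the payload fields and URL are hypothetical.
import json
import qrcode  # third-party package: pip install "qrcode[pil]"

coordination_info = {
    "event_id": "spring-recital-2017",                               # hypothetical
    "upload_url": "https://example.com/events/spring-recital-2017",  # hypothetical
    "time_reference": "GPS",
}

img = qrcode.make(json.dumps(coordination_info))
img.save("event_coordination_qr.png")  # print on a placard or show on a display
```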
  • To support synchronization, digital watermarks might be placed on certain frames during the recording of an event, and a dashboard application might vary the relative time bases of different views using the digital watermarks.
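As a sketch of that idea, suppose each watermark carries an identifier and is detected at some local timestamp in each view; the dashboard can then estimate the offset between two views' time bases from their shared watermarks. Watermark embedding and detection themselves are assumed away here:

```python
# Assumed time-base alignment from shared digital watermarks; each dict maps a
# watermark id to the local timestamp (seconds) at which it was detected.
def relative_offset(watermarks_a, watermarks_b):
    """Average offset of view B's time base relative to view A's."""
    shared = set(watermarks_a) & set(watermarks_b)
    if not shared:
        raise ValueError("no watermark is common to both views")
    return sum(watermarks_a[w] - watermarks_b[w] for w in shared) / len(shared)

# Watermark "wm42" seen at 10.0 s in view A and 9.2 s in view B suggests
# shifting view B's time base by about 0.8 s to align the two views.
print(relative_offset({"wm42": 10.0}, {"wm42": 9.2}))  # 0.8
```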
  • In accordance with a further aspect of the invention, the media and metadata corresponding to a given recorded event can be made publicly available, or access to such media and metadata can be controlled such that it is only provided to one or more authorized persons. Under one arrangement, a service provider may convey media and metadata to authorized users in accordance with access control permissions, which may be administered using one or more layers of administration that control membership in groups, where access to media and metadata may involve an access control list, a password, and/or a role. Generally, access to media and metadata can be controlled in the same or a similar way as access to location-based information is controlled in U.S. Pat. No. 7,525,425, which is incorporated by reference herein in its entirety. More particularly, a person who records an event can be considered the owner of the corresponding recorded media and metadata and can determine which other person or persons have access to such data. For example, an owner of certain media and metadata may restrict access to persons included on an access control list, persons that know a password, or persons provided access because of a role (e.g., a detective, an administrator, etc.). Alternatively, a person may request that an owner provide access to certain media and metadata, which might be granted electronically, for example, by the owner responding to an automated permission message in a manner similar to how a LinkedIn® connection can be requested and accepted.
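The layered permission check described above might reduce to something like the following sketch, where a request is granted if the user satisfies any enabled control on the media; the policy and user structures are assumptions:

```python
# Assumed access check combining an access control list, a password, and roles.
def may_access(user, policy):
    """Grant access if the user satisfies any control enabled for the media."""
    if user.get("name") in policy.get("acl", []):
        return True                                   # listed on the ACL
    if policy.get("password") and user.get("password") == policy["password"]:
        return True                                   # knows the password
    if user.get("role") in policy.get("roles", []):
        return True                                   # authorized role
    return False

policy = {"acl": ["alice"], "roles": ["detective", "administrator"]}
print(may_access({"name": "alice"}, policy))      # True, via the ACL
print(may_access({"role": "detective"}, policy))  # True, via the role
print(may_access({"name": "mallory"}, policy))    # False
```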
  • In accordance with yet another aspect of the invention, a media recording device may include or be interfaced to a local media server or a data buffering device, which might be a memory card, thumb drive, or the like, that enables media data and metadata to be stored periodically, for example, during a loss of a data link to the cloud. Under one arrangement, data may be automatically buffered to a local media server or a data buffering device until successfully conveyed to the cloud in what can be described as store and forward streaming to the cloud. Under another arrangement, buffering to a local media server or a buffering device occurs only when a link to the cloud is unavailable.
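The store and forward behavior can be sketched as a small drain loop: every chunk is appended to a local buffer, and the buffer is emptied oldest-first whenever the cloud link reports as available. The link_up and upload callables are hypothetical stand-ins for the device's connectivity check and uploader:

```python
# Assumed store-and-forward spooling to the cloud; link_up/upload are
# hypothetical stand-ins for the device's link check and upload routine.
from collections import deque

def store_and_forward(chunks, link_up, upload):
    """Buffer media/metadata chunks locally and forward them in order."""
    buffer = deque()                 # e.g., backed by a memory card or thumb drive
    for chunk in chunks:
        buffer.append(chunk)         # always spool locally first
        while buffer and link_up():  # drain oldest-first while the link is up
            upload(buffer.popleft())
    return buffer                    # chunks still awaiting a cloud link
```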
  • While particular embodiments of the invention have been described, it will be understood, however, that the invention is not limited thereto, since modifications may be made by those skilled in the art, particularly in light of the foregoing teachings.

Claims (20)

1. A system for media synchronization and collaboration, comprising:
a data storage;
a plurality of media recording devices used by a plurality of users to independently and asynchronously record an event from multiple locations thereby producing a plurality of recorded media data corresponding to a plurality of views of said event, each of said plurality of media recording devices conveying to said data storage media data and metadata corresponding to their respective view of said event, said metadata including time samples in accordance with a common time reference; and
a media player comprising a processor and a graphical user interface, said media player using said metadata to synchronize and play said plurality of views of said event.
2. The system of claim 1, wherein said plurality of media recording devices is a plurality of smart phones.
3. The system of claim 1, wherein said graphical user interface comprises a plurality of viewing windows used to display the plurality of views of the event.
4. The system of claim 3, wherein each of said plurality of viewing windows has a corresponding selection icon of a plurality of selection icons.
5. The system of claim 4, wherein said plurality of selection icons are color-coded.
6. The system of claim 5, wherein said plurality of selection icons can be used to select views of said plurality of views of said event to be played during periods of time as part of an overall timeline of a composite video that consists of a sequence of selected views.
7. The system of claim 6, wherein said graphical user interface plays said composite video.
8. The system of claim 6, wherein said timeline is color-coded to identify the view of said plurality of views that is playing during a given period of time of said timeline.
9. The system of claim 6, wherein said graphical user interface includes a multi-view combination control that causes a sequence of different views of the same time period or overlapping time periods to be played in said composite video.
10. The system of claim 6, wherein said metadata includes location and view angle data and said graphical user interface provides the ability to display a location/view angle indicator window that indicates the location and view angle of each view of said plurality of views.
11. The system of claim 10, wherein said location/view angle indicator window has a selection icon that can be selected to cause it to be displayed in said composite video during a given period of time of said timeline.
12. The system of claim 6, wherein said metadata enables at least one of a still camera image, animation, digital score board, digital scorebook, or live video to be synchronized so that it can be displayed in said composite video during a given period of time of said timeline.
13. The system of claim 1, wherein said media player is configured to create a multiple view video data package comprising a plurality of media files and metadata files corresponding to multiple views of a recorded event.
14. The system of claim 13, wherein said multiple view video data package further comprises a composite video produced using said plurality of media files and metadata files.
15. The system of claim 14, wherein said media player is configured to post said multiple view video data package to an internet media-sharing website thereby enabling users of said internet media-sharing website to at least one of comment on said composite video, rate said composite video, or download said plurality of media files and said plurality of metadata files.
16. A method of media synchronization and collaboration, comprising:
recording an event using a plurality of media recording devices that independently and asynchronously record said event to produce a plurality of recorded media clips corresponding to a plurality of views of said event;
storing media data and metadata corresponding to said plurality of views of said event to a data storage, said metadata including time samples in accordance with a common time reference; and
providing a media player comprising a processor and a graphical user interface that uses said metadata to synchronize and play said plurality of views of said event.
17. The method of claim 16, further comprising:
producing a composite video by selecting, using said graphical user interface, a sequence of selected views of said plurality of views of said event that play during periods of time as part of an overall timeline of said composite video.
18. The method of claim 17, further comprising:
creating a multiple view video data package comprising a plurality of media files and a plurality of metadata files used to produce said composite video.
19. The method of claim 18, wherein said multiple view video data package further comprises said composite video.
20. The method of claim 19, further comprising:
posting said multiple view video data package to an internet media-sharing website thereby enabling users of said internet media-sharing website to at least one of comment on said composite video, rate said composite video, or download said plurality of media files and said plurality of metadata files.
US15/596,916 2015-01-05 2017-05-16 System and Method for Media Synchronization and Collaboration Abandoned US20170251231A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/596,916 US20170251231A1 (en) 2015-01-05 2017-05-16 System and Method for Media Synchronization and Collaboration
US15/688,519 US10694219B2 (en) 2015-01-05 2017-08-28 System and method for media synchronization and collaboration

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201562099937P 2015-01-05 2015-01-05
US14/988,568 US20160197837A1 (en) 2015-01-05 2016-01-05 Method for Conveying Media to a Cloud Computing Environment
US201662340013P 2016-05-23 2016-05-23
US15/596,916 US20170251231A1 (en) 2015-01-05 2017-05-16 System and Method for Media Synchronization and Collaboration

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/988,568 Continuation-In-Part US20160197837A1 (en) 2015-01-05 2016-01-05 Method for Conveying Media to a Cloud Computing Environment

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/688,519 Continuation US10694219B2 (en) 2015-01-05 2017-08-28 System and method for media synchronization and collaboration

Publications (1)

Publication Number Publication Date
US20170251231A1 true US20170251231A1 (en) 2017-08-31

Family

ID=59679990

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/596,916 Abandoned US20170251231A1 (en) 2015-01-05 2017-05-16 System and Method for Media Synchronization and Collaboration
US15/688,519 Active US10694219B2 (en) 2015-01-05 2017-08-28 System and method for media synchronization and collaboration

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/688,519 Active US10694219B2 (en) 2015-01-05 2017-08-28 System and method for media synchronization and collaboration

Country Status (1)

Country Link
US (2) US20170251231A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7689720B2 (en) * 2003-11-05 2010-03-30 Microsoft Corporation Method for establishing and maintaining a shared view of time in a peer-to-peer network
US20080092047A1 (en) * 2006-10-12 2008-04-17 Rideo, Inc. Interactive multimedia system and method for audio dubbing of video
US9117483B2 (en) * 2011-06-03 2015-08-25 Michael Edward Zaletel Method and apparatus for dynamically recording, editing and combining multiple live video clips and still photographs into a finished composition
US20130259447A1 (en) * 2012-03-28 2013-10-03 Nokia Corporation Method and apparatus for user directed video editing
EP3207433A4 (en) * 2014-10-15 2018-06-27 Nowak, Benjamin Multiple view-point content capture and composition
US20160197837A1 (en) * 2015-01-05 2016-07-07 L&M Ip Method for Conveying Media to a Cloud Computing Environment
WO2017069670A1 (en) * 2015-10-23 2017-04-27 Telefonaktiebolaget Lm Ericsson (Publ) Providing camera settings from at least one image/video hosting service

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090109294A1 (en) * 2007-10-26 2009-04-30 Greg Allen Cummings User interface for a portable digital video camera
US20100128121A1 (en) * 2008-11-25 2010-05-27 Stuart Leslie Wilkinson Method and apparatus for generating and viewing combined images
US20120162436A1 (en) * 2009-07-01 2012-06-28 Ustar Limited Video acquisition and compilation system and method of assembling and distributing a composite video
US20110280540A1 (en) * 2010-05-12 2011-11-17 Woodman Nicholas D Broadcast management system
US20120263439A1 (en) * 2011-04-13 2012-10-18 David King Lassman Method and apparatus for creating a composite video from multiple sources
US20160337718A1 (en) * 2014-09-23 2016-11-17 Joshua Allen Talbott Automated video production from a plurality of electronic devices

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10499016B1 (en) * 2016-04-21 2019-12-03 Probable Cause Solutions LLC System and method for synchronizing camera footage from a plurality of cameras in canvassing a scene
US10771746B2 (en) * 2016-04-21 2020-09-08 Probable Cause Solutions LLC System and method for synchronizing camera footage from a plurality of cameras in canvassing a scene
US11729189B1 (en) * 2017-01-27 2023-08-15 Rapid7, Inc. Virtual security appliances for eliciting attacks
WO2019083544A1 (en) * 2017-10-27 2019-05-02 Szeklinski Deel Charles Interactive sports fan experience
US11375976B2 (en) * 2017-11-03 2022-07-05 John Jun Cai Wireless stethoscope for transmitting, recording, storing and diagnostic capabilities including an earpiece
US20190149833A1 (en) * 2017-11-15 2019-05-16 Sony Interactive Entertainment, LLC Synchronizing session content to external content
US10425654B2 (en) * 2017-11-15 2019-09-24 Sony Interactive Entertainment LLC Synchronizing session content to external content
US20200014949A1 (en) * 2017-11-15 2020-01-09 Sony Interactive Entertainment LLC Synchronizing session content to external content
US11270736B2 (en) * 2017-12-29 2022-03-08 SZ DJI Technology Co., Ltd. Video data processing method, device, system, and storage medium
US10938794B2 (en) * 2018-03-16 2021-03-02 At&T Mobility Ii Llc Latency sensitive tactile network security interfaces
US10623385B2 (en) * 2018-03-16 2020-04-14 At&T Mobility Ii Llc Latency sensitive tactile network security interfaces
US11615624B2 (en) 2018-06-04 2023-03-28 Genetec Inc. Automated association of media with occurrence records
US11036997B2 (en) * 2018-06-04 2021-06-15 Genetec Inc. Automated association of media with occurrence records
US20190370562A1 (en) * 2018-06-04 2019-12-05 Genetec Inc. Automated association of media with occurrence records
US10897637B1 (en) * 2018-09-20 2021-01-19 Amazon Technologies, Inc. Synchronize and present multiple live content streams
US10863230B1 (en) 2018-09-21 2020-12-08 Amazon Technologies, Inc. Content stream overlay positioning
US11176269B2 (en) 2019-03-08 2021-11-16 International Business Machines Corporation Access control of specific encrypted data segment
CN110022495A (en) * 2019-03-28 2019-07-16 青岛海信电器股份有限公司 A kind of mobile terminal pushes the method and display equipment of media file to display equipment
CN110418201A (en) * 2019-07-16 2019-11-05 咪咕视讯科技有限公司 A kind of the sharing processing method and equipment of multi-channel video
CN110417885A (en) * 2019-07-26 2019-11-05 四川新东盛科技发展有限公司 Flattening video directing and scheduling system and method based on wisdom police service
US11528678B2 (en) * 2019-12-20 2022-12-13 EMC IP Holding Company LLC Crowdsourcing and organizing multiple devices to perform an activity
US20220086163A1 (en) * 2020-09-14 2022-03-17 Box, Inc. Establishing user device trust levels

Also Published As

Publication number Publication date
US20190014355A1 (en) 2019-01-10
US10694219B2 (en) 2020-06-23

Similar Documents

Publication Publication Date Title
US10694219B2 (en) System and method for media synchronization and collaboration
US11734723B1 (en) System for providing context-sensitive display overlays to a mobile device via a network
US10965723B2 (en) Instantaneous call sessions over a communications application
US10162999B2 (en) Face recognition based on spatial and temporal proximity
US8982220B2 (en) Broadcasting content
US9407881B2 (en) Systems and methods for automated cloud-based analytics for surveillance systems with unmanned aerial devices
US8774452B2 (en) Preferred images from captured video sequence
KR20190134813A (en) Location privacy management on map-based social media platforms
US20110111786A1 (en) Mobile based neighborhood watch system capable of group interactions, anonymous messages and observation reports
CN115552403B (en) Inviting media overlays for private collections of media content items
US10567844B2 (en) Camera with reaction integration
US10554715B2 (en) Video icons
US20160197837A1 (en) Method for Conveying Media to a Cloud Computing Environment
CN116438788A (en) Media content playback and comment management
US20160274737A1 (en) Video-based social interaction system
US11310623B2 (en) Network based video surveillance and logistics for multiple users
CN111226262A (en) Composite animation
US8892538B2 (en) System and method for location based event management
US20240135701A1 (en) High accuracy people identification over time by leveraging re-identification
AU2012238085B2 (en) Face recognition based on spatial and temporal proximity

Legal Events

Date Code Title Description
AS Assignment

Owner name: GITCIRRUS LLC, ALABAMA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FULLERTON, LARRY W.;ROBERTS, MARK D.;L&M IP;REEL/FRAME:045814/0190

Effective date: 20160501

Owner name: GITCIRRUS LLC, ALABAMA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ROBERTS, MARK D.;WELDY, DENNIS M.;FULLERTON, ERIC;SIGNING DATES FROM 20180510 TO 20180511;REEL/FRAME:045814/0186

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION