US20100306671A1 - Avatar Integrated Shared Media Selection - Google Patents
- Publication number
- US20100306671A1 (U.S. application Ser. No. 12/551,403)
- Authority
- US
- United States
- Prior art keywords
- group
- users
- media
- user
- avatar
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H04N21/4312—Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- G06Q50/40—Business processes related to the transportation industry
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
- G06Q10/10—Office automation; Time management
- H04L65/403—Arrangements for multi-party communication, e.g. for conferences
- H04L9/40—Network security protocols
- H04N21/2387—Stream processing in response to a playback request from an end-user, e.g. for trick-play
- H04N21/2541—Rights management at an additional data server, e.g. shopping server, rights management server
- H04N21/4627—Rights management associated to the content
- H04N21/4788—Supplemental services communicating with other users, e.g. chatting
- H04N21/6587—Control parameters, e.g. trick play commands, viewpoint selection
Definitions
- Avatars may be used to enhance a group or online experience.
- An avatar can represent a user in a variety of contexts, including computer or video games, applications, chats, forums, communities, and instant messaging services.
- An avatar may be an object representing the embodiment of a user and may represent various actions and aspects of the user's persona, beliefs, interests, or social status.
- The use of avatars, however, has not overcome the deficiencies of existing group and online systems and applications.
- One popular social activity is the viewing of movies and television shows.
- A method and system are disclosed herein by which a group of users in different physical locations may replicate, in a virtual world, the physical-world experience of going with a group of friends to select a movie or show to watch together.
- Users of a group may nominate movie or television selections for group viewing.
- A display at a user's location may render avatars representing the users of the group.
- Users may also select an emotion that they wish to express to the other users of the group.
- FIG. 1 is a block diagram of an example network configuration.
- FIG. 2 depicts an example user interface that may be provided during a networked, social multimedia experience.
- FIGS. 3A-3C are flowcharts of example methods for synchronizing control commands in a networked, social multimedia environment.
- FIG. 4 is a block diagram of an example computing environment.
- FIGS. 5A-5G are screen shots from the perspective of one user of a group of users illustrating a process flow of an avatar integrated shared media nomination and watching experience.
- FIG. 6 depicts an exemplary user interface incorporating some of the embodiments disclosed herein.
- FIG. 7 depicts an exemplary process incorporating some of the embodiments disclosed herein.
- FIG. 8 depicts an example system for providing a shared media experience.
- FIG. 9 illustrates a computer-readable medium bearing the computer-executable instructions discussed with respect to FIGS. 1-8.
- The following example embodiments describe the media nomination and selection process in the context of viewing movies and television shows.
- The selection of movies and television shows is exemplary, and those skilled in the art will recognize that the principles are readily applicable to the nomination and selection of other media types that may be shared among a group of users.
- Such media types may include any media file or application such as music files and video games. All such media types and applications are contemplated as within the scope of the present disclosure.
- A group of users may replicate the physical world experience of meeting with a group of friends to select a movie or television show to watch together.
- The experience may be replicated in a virtual world in which the users are in different physical locations and in communication via a network.
- Users of the group may nominate movie or television selections for group viewing.
- A display at a user's location may render avatars representing the users of the group.
- Users may also select an emotion that they wish to express to the other users of the group.
- A user in a group of users may be provided an opportunity to browse content made available by a system and/or service, such as Microsoft's XBOX 360 console and XBOX LIVE service, and to nominate specific content, such as movies or television shows, that they would like to watch.
- The users may then discuss with each other, for example via their respective headset devices, which movie or show they would like to watch together.
- Each user may also have an avatar, a virtual representation of himself or herself, that may act out different "pick my movie" animations to attempt to convey the user's excitement about the particular movie or television show he or she has chosen.
- Many applications, such as video games, feature a user-created, system-wide avatar as a user-controlled character.
- Avatars can be graphical images that represent real persons in virtual or game space.
- A user may customize the avatar in a variety of ways dealing with appearance, such as facial features and clothing. This allows the user a more personalized and involved video gaming experience.
- The Nintendo Corporation has a user-created, system-wide avatar, the MII®, which a user may then use as his or her user-controlled character in video games that support this feature, such as WII SPORTS®.
- The chosen content may be watched by the group of users, and their respective avatars may appear in a virtual "destination" to watch the movie together.
- The virtual destination and a representation of the group watching the content may appear on the display of each user's respective console, thus simulating a physical gathering of those users.
- The group of users may talk during the movie on their headsets and have their avatars express the emotions and perform the gestures that each user is feeling in the physical world based on the content being played.
- The system may provide themed destinations that may be chosen by the system or by one or more of the users of the group viewing the content.
- A content service provider, such as an online provider of movies that can be rented for viewing, may provide a themed destination that resembles a home theater environment.
- Another service provider may provide a themed destination that resembles a full theater.
- Other themed destinations may include ones that resemble a beach, an aquarium, outer space, the mountains, a drive-in theater, or any other destination.
- The themed destination may be chosen by a leader of the group or by the collective agreement of the users.
- Hidden/unlockable themes may also be provided based on events occurring around a user. For example, if one of the users in the group is having a birthday that day, the system may provide a special destination where a birthday cake and balloons appear as the users are watching a movie together.
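The event-based theme unlock described above can be sketched as a simple check run when the group forms. The function name, the "birthday" theme identifier, and the month/day matching rule below are illustrative assumptions, not details from the disclosure.

```python
import datetime

def pick_destination(default_theme: str,
                     user_birthdays: list,
                     today: datetime.date) -> str:
    """Return a hypothetical special 'birthday' destination when any group
    member's birthday falls on today's date; otherwise the default theme."""
    for bday in user_birthdays:
        if (bday.month, bday.day) == (today.month, today.day):
            return "birthday"
    return default_theme
```

A client could run such a check before rendering and swap in the unlocked destination for the whole group.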
- FIG. 1 illustrates an example network environment.
- Actual network and database environments may be arranged in a variety of configurations; however, the example environment shown here provides a framework for understanding the type of environment in which an embodiment may operate.
- The example network may include one or more client computers 200 a, a server computer 200 b, data source computers 200 c, and/or databases 270, 272 a, and 272 b.
- The client computers 200 a and the data source computers 200 c may be in electronic communication with the server computer 200 b by way of the communications network 280 (e.g., an intranet, the Internet, or the like).
- The client computers 200 a and data source computers 200 c may be connected to the communications network by way of communications interfaces 282.
- The communications interfaces 282 can be any type of communications interface, such as Ethernet connections, modem connections, wireless connections, and so on.
- The server computer 200 b may provide management of the database 270 by way of database server system software, such as MICROSOFT®'s SQL SERVER or the like. As such, server 200 b may act as a storehouse of data from a variety of data sources and provide that data to a variety of data consumers.
- A data source may be provided by data source computer 200 c.
- Data source computer 200 c may communicate data to server computer 200 b via communications network 280, which may be a LAN, WAN, intranet, the Internet, or the like.
- Data source computer 200 c may store data locally in database 272 a, which may be a database server or the like.
- The data provided by data source 200 c can be combined and stored in a large database, such as a data warehouse maintained by server 200 b.
- Client computers 200 a that desire to use the data stored by server computer 200 b can access the database 270 via communications network 280 .
- Client computers 200 a access the data by way of, for example, a query, a form, etc. It will be appreciated that any configuration of computers may be employed.
- the client computers 200 a depicted in FIG. 1 may be PCs or game consoles, for example. Two or more clients 200 a may form a “party.”
- A "social video application" 220 running on the server 200 b may designate one of the clients 200 a as the "remote holder."
- The remote holder may be the first member of the party to request a network session. Such a request may be, for example, a request for streaming video.
- The remote holder may then invite other clients to establish a networked, social multimedia experience, i.e., to join the party.
- The remote holder may have control over a shared "remote control" 210 that controls content playback.
- The remote holder's "state" may be sent to all connected users in a group, who see it and synchronize to it, causing the same action to occur on their clients.
- The other users may have the ability to play, pause, and request remote holder status by sending their own state to the remote holder. Such actions may need approval from the current remote holder to take effect. Users may also have the ability to leave the playback session.
- The video may be kept synchronized by keeping all users updated on the remote holder's state.
- The remote holder's state may be a structure 235 that contains information on playback status (e.g., playing, paused, initializing, etc.), an identifier associated with the content being viewed, and a current time code associated with the content.
- The remote holder may maintain its state (i.e., keep it up to date) and send it to all the other users when it changes. The other users may then see the new state, compare their own time code and playback state to the remote holder's, and take action accordingly.
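The state structure 235 and the compare-and-conform step might be modeled as follows. This is a sketch under assumptions: the field names (`status`, `content_id`, `time_code`), the drift tolerance, and the `conform` helper are illustrative, not specified by the disclosure.

```python
from dataclasses import dataclass

@dataclass
class PlaybackState:
    """Hypothetical remote-holder state: playback status, content
    identifier, and current time code (in seconds)."""
    status: str       # e.g. "playing", "paused", "initializing"
    content_id: str   # identifier of the content being viewed
    time_code: float  # current position in the content

def conform(local: PlaybackState, holder: PlaybackState,
            drift_tolerance: float = 2.0) -> PlaybackState:
    """Return the state a non-holder client should adopt after receiving
    the remote holder's updated state: copy the playback status, and seek
    only if the local position has drifted too far from the holder's."""
    new = PlaybackState(holder.status, holder.content_id, local.time_code)
    if abs(local.time_code - holder.time_code) > drift_tolerance:
        new.time_code = holder.time_code  # too far out of sync: seek
    return new
```

When the holder's update arrives, each client would replace its own state with `conform(local, holder)` and issue the corresponding playback commands.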
- Each client may have its own respective social video application 230 , and may maintain its own respective state structure 235 .
- If a user's state is different from that of the remote holder, it may be updated (playing may become paused, for example). If a user's time code is too different from the remote holder's, then a "seek" operation may be performed to the remote holder's reported time code. The user may be responsible for predicting, based on "pre-buffering times," how long it will take the seek call to complete, and for compensating by adjusting the targeted time code.
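The pre-buffering compensation could look like the following sketch. The names are mine, and the assumption that the holder keeps advancing only while it is playing (so a paused holder needs no adjustment) is an illustration, not a stated rule.

```python
def compensated_seek_target(holder_time_code: float,
                            predicted_buffer_delay: float,
                            holder_playing: bool) -> float:
    """Adjust the seek target by the predicted time the seek itself will
    take, so playback lands in sync once buffering completes."""
    if holder_playing:
        # While this client buffers, the holder keeps advancing;
        # aim ahead by the predicted pre-buffering time.
        return holder_time_code + predicted_buffer_delay
    return holder_time_code  # holder paused: no drift accumulates
```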
- Users may also be enabled to make requests of the remote holder by sending the remote holder and all other users an updated state that differs from the remote holder's state. When the remote holder sees this state, it may be taken as a request. The remote holder may update its state to reflect the requested changes. Only then do the other users (including the user that made the request) change their state. The same process can be used to request remote holder status.
- Any user can be the remote holder, but only one user can be the remote holder at any time. Any member may be promoted to remote holder, demoting the current remote holder to a normal user. The "current" remote holder is the only user who can "pass the remote" to another user. The server may keep track of the identity of the current remote holder.
- Multiparty voice chat may be integrated into the experience, allowing members to comment on the video.
- A group of people may be enabled to share the experience of watching a video together as if they were in the same room, without being physically present together. All users may have the same access to voice chat. That is, any user may speak whenever he chooses.
- Multiparty voice chat may require a certain level of synchronization among the clients that form the party. If any client were allowed to be even a few seconds out of sync with the rest of the party, comments made over the chat might not make sense. Additionally, feedback from the audio of one client sent over voice chat could be very disruptive if it is not closely in sync with what other users are hearing from their own video.
- Fast-forward and reverse may be treated differently from play, pause, and seek commands.
- When the remote holder fast-forwards or reverses, the other clients may simply pause playback.
- When the remote holder resumes, the other clients may receive the remote holder's updated state and issue a "seek" command telling them to resume playback from the time index the remote holder has selected. This may eliminate potential synchronization issues that could be caused by fast-forward or reverse speeds being slightly different on different users' client computers.
- A fully social experience may be created in which people are not only watching the same video, but also using graphical user avatars to create a virtual viewing environment such as a virtual entertainment room or movie theater.
- The users may be represented graphically in front of the video and may be enabled to use animations, text chat, and voice chat to interact with each other.
- The introduction of graphical avatars into the shared video experience may add another dimension to the experience by giving users a sense of identity within the virtual viewing environment.
- Each user watching the video may be represented by their own customized avatar.
- The avatars of every person in the session may be rendered on everyone else's television or monitor, resulting in a group of avatars that appear to be watching the video in a virtual environment.
- Each user may be enabled to trigger animations and text messages (in the form of "speech balloons," for example) for their avatar.
- Such animations and text messages may be rendered on every other user's television or monitor.
- FIG. 2 depicts an example user interface 400 that may be provided during a networked, social multimedia experience.
- The user interface 400 may be presented on respective video monitors provided at each client location. The same interface may be presented at each location.
- The user interface 400 may depict an area for displaying a movie.
- The area may be a virtual viewing environment such as a virtual living room or a virtual movie theater.
- The scene providing the area for rendering the media may be referred to as the "destination" or "themed destination."
- The user interface 400 may include a video presentation portion 410, via which the video 412 is presented to the users.
- The user interface 400 may also include a respective avatar 420 A-D corresponding to each of the users.
- The user interface 400 may also include a text chat area.
- Text chat may be presented in the form of speech balloons 430 A-D.
- Text chat may alternatively be presented as scrolling text in a chat box portion of the user interface 400. Audio may be presented via one or more speakers (not shown) provided at the client locations.
- Each client may render its own themed destination.
- Software may be provided on each client to enable the client to render its own themed destination.
- The themed destinations rendered on the several clients may, but need not, be identical.
- When a user causes his or her avatar to gesticulate, the gesture may be presented at all the client locations in synchronicity.
- When a user speaks or otherwise produces an audio event (e.g., through voice chat) or a textual event (e.g., through text chat), the audio or text may be presented at all the client locations in synchronicity.
- FIG. 3A is a flowchart of an example method 300 for synchronizing play, pause, stop, and seek commands from the remote holder.
- The remote holder may select a "play," "pause," "stop," or "seek" operation, e.g., by pressing the play, pause, stop, or seek button on their game controller or remote control.
- The remote holder client may update its state structure to reflect the change in time code and playback status.
- The remote holder client then communicates the remote holder's state structure to the other clients in the party. To maintain the highest level of synchronization among the several clients in the party, such updates should be communicated as frequently as possible.
- The other clients receive the remote holder's updated state.
- Each client responds to the state change by updating its own state structure to conform to that of the remote holder.
- The state structure from each client may be sent to every other client, so that every client always knows the current state of every other client in the party. Because the state structure contains information on playback status, an identifier associated with the content being viewed, and a current time code associated with the content, each client will then be performing the same operation, at the same place in the same content, at the same time.
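The broadcast-and-conform loop of method 300 can be simulated in a few lines. The `Client` and `Party` classes below are hypothetical stand-ins for the social video applications 230 and their network connections; the dictionary state mirrors the fields described above.

```python
class Client:
    """Hypothetical party member holding its own copy of the state structure."""
    def __init__(self, name: str):
        self.name = name
        self.state = {"status": "paused", "content_id": None, "time_code": 0.0}

    def receive(self, holder_state: dict) -> None:
        # Conform the local state structure to the remote holder's.
        self.state = dict(holder_state)

class Party:
    """The remote holder plus the other clients in the party."""
    def __init__(self, holder: Client, others: list):
        self.holder = holder
        self.others = others

    def holder_command(self, status: str, content_id: str, time_code: float) -> None:
        # The holder updates its own state first, then broadcasts it,
        # so every client ends up performing the same operation at the
        # same place in the same content.
        self.holder.state = {"status": status, "content_id": content_id,
                             "time_code": time_code}
        for client in self.others:
            client.receive(self.holder.state)
```

Pressing play on the holder's controller would map to one `holder_command("playing", ...)` call, after which every client's state is identical.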
- FIG. 3B is a flowchart of an example method 310 for synchronizing play or pause commands from a user who is not the remote holder.
- A user who is not the remote holder is not enabled to exercise a stop, seek, fast-forward, or reverse command.
- A non-remote-holder user may select a "play" or "pause" operation, e.g., by pressing the play or pause button on their game controller or remote control.
- The selecting user's client may update its state structure to reflect that a play or pause state has been requested.
- The selecting user's client may send the selecting user's state to the remote holder client, as well as to all other members of the party.
- The remote holder client may receive the selecting user's state, from which it can determine that another member of the party has made a playback state change request.
- The remote holder client may change its own state to reflect the new state.
- The remote holder client then communicates the remote holder's state structure to the other clients in the party. To maintain the highest level of synchronization among the several clients in the party, such updates should be communicated as frequently as possible.
- The other clients, including the user who made the original request, receive the remote holder's updated state and respond to the state change by updating their own state structures to conform to that of the remote holder.
- Only then does the selected action occur on the requesting user's client.
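The request round-trip of method 310 reduces to this rule: a requested state takes effect only if the remote holder adopts it and re-broadcasts it. A minimal sketch, with illustrative names and an explicit approval flag standing in for the holder's decision:

```python
def request_flow(holder_state: dict, requested_status: str,
                 holder_approves: bool) -> dict:
    """Model a non-holder's play/pause request: the state that is finally
    broadcast to the party changes only if the holder adopts the request.
    No client (including the requester) acts before that broadcast."""
    if holder_approves:
        new_state = dict(holder_state)
        new_state["status"] = requested_status
        return new_state          # the holder's re-broadcast state
    return dict(holder_state)     # request ignored; state unchanged
```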
- FIG. 3C is a flowchart of an example method 320 for synchronizing fast-forward and reverse commands from the remote holder.
- the remote holder may select a “fast-forward” or “reverse” operation, e.g., by pressing the fast-forward or reverse button on their game controller or remote control.
- the remote holder client may update its state to reflect that it is currently fast-forwarding or reversing.
- the remote holder client communicates the remote holder's state structure to the other clients in the party.
- the other users receive the new state and pause until the fast-forward/reverse state changes again.
- the remote holder video starts to fast-forward or reverse.
- the remote holder may select a “play” operation, e.g., by pressing the play button on their game controller or remote control.
- the remote holder video begins playback at the time code associated with the point in the video at which the remote holder selected the play operation.
- the remote holder may update its state to reflect that it is currently playing and has a new time code, and communicate its state structure to the other clients in the party.
- the other users receive the new state structure and perform a seek and play operation to get back synchronized with the remote holder.
- the remote holder may be allowed full control over the virtual remote control, while the other users have only the ability to exit the video experience, play, pause, and make requests of the remote holder.
- no playback changes are made until the remote holder has changed its own state.
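A minimal sketch of method 320, under the assumption that non-holder clients simply pause during a fast-forward or reverse and then seek to the remote holder's new time code on resume:

```python
# Sketch of method 320: while the remote holder fast-forwards or
# reverses, the other clients pause; when the holder resumes play,
# they seek to the holder's time code and resume. Names are
# illustrative assumptions.
class Viewer:
    def __init__(self):
        self.status = "playing"
        self.time_code = 0.0

    def on_holder_state(self, status, time_code):
        if status in ("fast_forward", "reverse"):
            self.status = "paused"          # wait for the holder to finish
        elif status == "playing":
            self.time_code = time_code      # seek to the holder's position
            self.status = "playing"         # ...and resume playback

viewers = [Viewer(), Viewer()]

# The remote holder fast-forwards, then presses play at 240.0 seconds.
for v in viewers:
    v.on_holder_state("fast_forward", 0.0)
ff_states = [v.status for v in viewers]      # all paused at this point
for v in viewers:
    v.on_holder_state("playing", 240.0)
```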
- Synchronization of avatars may be implemented in much the same way as described above in connection with synchronization of play and pause commands.
- Each user would construct his or her own avatar, or retrieve a saved avatar if the user already constructed one.
- Each client could then communicate information about its respective avatar to the other clients.
- each client may retrieve the avatars from a common server (e.g., based on gamer tags associated with the avatars). For example, avatars may be retrieved via the internet.
- Avatar placement and emotion information may be contained in the state structure that is passed around the several users. Placement information may indicate where each avatar is to be presented in the user interface, either in absolute or relative terms. Emotion information may convey an emotional state.
- Each client may animate a certain avatar based on emotion information received for that avatar.
- each client can determine from the state structure what the virtual destination is supposed to look like, avatar placement therein, which avatar is speaking, gesturing, leaving, etc.
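The avatar-related state described above might be carried in the shared state structure as follows; the field names, and the use of a couch seat index for relative placement, are assumptions made for illustration.

```python
import json

# Sketch: avatar placement and emotion information carried in the
# state structure that is passed around the party. Field names are
# assumptions, not taken from the disclosure.
def make_avatar_state(gamer_tag, seat_index, emotion):
    """Placement is expressed here in relative terms (a couch seat index)."""
    return {"gamer_tag": gamer_tag, "seat": seat_index, "emotion": emotion}

def render_party(states):
    """Each client can determine placement and which avatar is emoting."""
    scene = {}
    for raw in states:
        s = json.loads(raw)
        scene[s["seat"]] = (s["gamer_tag"], s["emotion"])
    return scene

states = [
    json.dumps(make_avatar_state("PlayerOne", 0, "happy")),
    json.dumps(make_avatar_state("PlayerTwo", 1, "bored")),
]
scene = render_party(states)
```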
- Synchronized text chat may also be implemented in much the same way as described above in connection with synchronization of play and pause commands. Text provided by one user may be included in the state structure that is passed around the several users.
- Voice chat can be implemented via the so-called “party” system, which connects up to eight (or more) users together.
- the party system employs a respective gamer tag associated with each of the several users.
- synchronized voice chat may be built into the system, eliminating any need to convey voice information in the state structure.
- FIG. 4 shows an exemplary computing environment in which example embodiments and aspects may be implemented.
- the computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality. Neither should the computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment 100 .
- Examples of well known computing systems, environments, and/or configurations that may be suitable for use include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, embedded systems, distributed computing environments that include any of the above systems or devices, and the like.
- Computer-executable instructions such as program modules, being executed by a computer may be used.
- program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
- Distributed computing environments may be used where tasks are performed by remote processing devices that are linked through a communications network or other data transmission medium.
- program modules and other data may be located in both local and remote computer storage media including memory storage devices.
- an exemplary system includes a general purpose computing device in the form of a computer 110 .
- Components of computer 110 may include, but are not limited to, a processing unit 120 , a system memory 130 , and a system bus 121 that couples various system components including the system memory to the processing unit 120 .
- the processing unit 120 may represent multiple logical processing units such as those supported on a multi-threaded processor.
- the system bus 121 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
- By way of example, and not limitation, such architectures include the Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
- the system bus 121 may also be implemented as a point-to-point connection, switching fabric, or the like, among the communicating devices.
- Computer 110 typically includes a variety of computer readable media.
- Computer readable media can be any available media that can be accessed by computer 110 and includes both volatile and nonvolatile media, removable and non-removable media.
- Computer readable media may comprise computer storage media and communication media.
- Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
- Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CDROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can accessed by computer 110 .
- Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
- the term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
- the system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132 .
- RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120 .
- FIG. 4 illustrates operating system 134 , application programs 135 , other program modules 136 , and program data 137 .
- the computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media.
- FIG. 4 illustrates a hard disk drive 141 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152 , and an optical disk drive 155 that reads from or writes to a removable, nonvolatile optical disk 156 , such as a CD ROM or other optical media.
- removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like.
- the hard disk drive 141 is typically connected to the system bus 121 through a non-removable memory interface such as interface 140 .
- magnetic disk drive 151 and optical disk drive 155 are typically connected to the system bus 121 by a removable memory interface, such as interface 150 .
- hard disk drive 141 is illustrated as storing operating system 144 , application programs 145 , other program modules 146 , and program data 147 . Note that these components can either be the same as or different from operating system 134 , application programs 135 , other program modules 136 , and program data 137 . Operating system 144 , application programs 145 , other program modules 146 , and program data 147 are given different numbers here to illustrate that, at a minimum, they are different copies.
- a user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and pointing device 161 , commonly referred to as a mouse, trackball or touch pad.
- Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, or the like.
- These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB).
- a monitor 191 or other type of display device is also connected to the system bus 121 via an interface, such as a video interface 190 .
- computers may also include other peripheral output devices such as speakers 197 and printer 196 , which may be connected through an output peripheral interface 195 .
- the computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180 .
- the remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 110 , although only a memory storage device 181 has been illustrated in FIG. 4 .
- the logical connections depicted in FIG. 4 include a local area network (LAN) 171 and a wide area network (WAN) 173 , but may also include other networks.
- Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
- When used in a LAN networking environment, the computer 110 is connected to the LAN 171 through a network interface or adapter 170 .
- When used in a WAN networking environment, the computer 110 typically includes a modem 172 or other means for establishing communications over the WAN 173 , such as the Internet.
- the modem 172 , which may be internal or external, may be connected to the system bus 121 via the user input interface 160 , or other appropriate mechanism.
- program modules depicted relative to the computer 110 may be stored in the remote memory storage device.
- FIG. 4 illustrates remote application programs 185 as residing on memory device 181 . It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
- a user may, at the user's location, use a multipurpose console that has access to an online network and network services.
- One service that may be available is an online media service that can provide streaming media services so that the user can experience near instant streaming of content.
- the user may desire to utilize a party mode on their console.
- a party may be a collection of users who may or may not all be interacting within the same experience at a given point in time.
- a party session may be established when party members interact with one another within the same application or experience.
- Any user may host a shared video experience.
- the user who initiates the party session may be deemed to be the party host.
- a host may invite other users to become members of a party and share a party experience.
- a host may create a shared party by either launching an application provided on the console that may provide or access an infrastructure for inviting party members to a shared media experience. Alternatively and optionally, the console may provide a wizard or guide for inviting other users to the party.
- Such applications may provide, for example, menus to select or enter the identities of other users who the host wishes to invite to the party.
- the application may transmit identifiers of the requested users to an online service that may in turn forward the requests to the identified users.
- the identified users may be notified via applications executing on their respective consoles that they have been invited to the party.
- the applications may then provide an interface for accepting or rejecting the invitation, upon which the application may return the acceptance or rejection to the online service.
- the online service may notify the host of the rejection or acceptance.
- a movie or television show may be selected by the party for shared viewing.
- One or more of the users may select one or more movies or shows to be listed in a watch queue.
- Such a queue may, for example, include a list of the movies and television shows that a user may want to watch via the online service.
- users may add titles to the queue using various means such as browsing on the online service website.
- the host of the party may be the only member of the party that is provided the capability to share out their watch queue to the other party members.
- the online service may make the information available so that the other party members may view the host's watch queue on their own consoles.
- any party member may be provided the capability to nominate content from within their own watch queues.
- the content nominations may be included and displayed in a party queue available to each party member.
- each nominated movie or show may be represented in a two dimensional grid, for example a 3×3 tiled wall.
- each party member may be provided the option to share their own personal queue with the entire party. Furthermore, even if a party member does not share their personal queue, they may still nominate content from that queue to the party queue. However, no other party member will see that party member's queue.
- Each party member's individual watch queue may be represented as a different channel or data stream on each user's display.
- users may also browse the other users' watch queues and nominate the pieces of content that they wish to watch from the party member's queue.
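The queue-sharing and nomination rules above can be sketched with simple in-memory structures: a member may nominate from a private queue without that queue ever becoming visible to the other party members. The class and method names are illustrative assumptions, not from the disclosure.

```python
# Sketch of personal watch queues and the shared party queue.
class PartyMember:
    def __init__(self, name, shared=False):
        self.name = name
        self.queue = []       # personal watch queue
        self.shared = shared  # whether the queue is visible to the party

class Party:
    def __init__(self):
        self.members = []
        self.party_queue = []  # nominations visible to every member

    def visible_queues(self, viewer):
        """Another member's queue is visible only if that member shared it."""
        return {m.name: m.queue for m in self.members
                if m is viewer or m.shared}

    def nominate(self, member, title):
        """Nominating from a private queue does not expose the queue."""
        if title in member.queue and title not in self.party_queue:
            self.party_queue.append(title)

party = Party()
host = PartyMember("host", shared=True)
guest = PartyMember("guest", shared=False)
party.members = [host, guest]
guest.queue.append("Movie X")
party.nominate(guest, "Movie X")   # nominated without sharing the queue
```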
- Users may also request and view details of each movie or show. For example, by selecting a particular movie in a user's queue, the details for the movie may be displayed, indicating for example the synopsis, run time, and other details of the movie.
- the host of the session may select a specific movie or show for the entire party to watch.
- the group or online system may determine the selected movie or show by determining the selection with the highest number of votes or other formulaic means.
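A minimal sketch of vote-based selection, assuming a simple highest-count rule; the disclosure leaves tie-breaking and other “formulaic means” unspecified, so the tie behavior below is an arbitrary choice.

```python
from collections import Counter

def select_by_votes(votes):
    """Return the nominated title with the highest number of votes.

    Ties are broken here by first nomination encountered; the
    disclosure does not specify a tie-breaking rule.
    """
    if not votes:
        return None
    return Counter(votes).most_common(1)[0][0]

votes = ["Movie A", "Movie B", "Movie A", "Movie C", "Movie A"]
winner = select_by_votes(votes)
```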
- one or more members of the party may not have audio capability and thus a visual confirmation may be provided when the host selects the movie.
- the avatar of the party member who nominated the selected content may be displayed when the selected content item is highlighted in the party queue. This may provide a visual confirmation of the movie or show that was selected for viewing.
- the party members may be made aware that the content has been selected and is ready to be viewed without relying on the need for voice communications within the party.
- the avatars may be used to convey emotions and feelings during the nomination and selection process. For example, a user may react to a movie nomination and desire to express the response to the other users. A user may wish to, for example, let others know what the user is thinking, provoke a response from other users, make a statement about the movie, or respond to another user's conveyed emotion.
- the input of emotions may be implemented using a fly out wheel displayed on the user's screen and activated using an appropriate control on the user's console, console accessory, or other input means.
- the wheel can be activated and controlled using a gamepad or a remote controller.
- Any number of emotions or responses may be provided on the fly out wheel. By way of example and not limitation, eight categories of emotions may be used as shown in FIG. 6 .
- each category may be represented by one static avatar action or animation, or a series of animations selectable by the user.
- a random predefined animation may be rendered once the user selects the emotion they want to convey.
- the categories of emotions may further be based on typical responses that users may have watching major film genres such as action, adventure, comedy, crime/gangster, drama, historical, horror, musicals, science fiction, war, westerns, and the like.
- each category may further provide at least three random animations.
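The emote categories and random-animation behavior described above might be organized as below. The eight category names follow the example emotions given later in this disclosure (love, hate, funny, bored, happy, sad, mischievous, and scared); the animation names themselves are assumptions.

```python
import random

# Sketch of emote categories, each providing at least three
# predefined animations, one of which is chosen at random when the
# user selects the emotion. Animation names are assumptions.
EMOTE_ANIMATIONS = {
    "love":        ["sigh_hearts", "blow_kiss", "hands_on_heart"],
    "hate":        ["shake_fist", "turn_away", "thumbs_down"],
    "funny":       ["laugh", "slap_knee", "point_and_laugh"],
    "bored":       ["yawn", "check_watch", "slump"],
    "happy":       ["cheer", "clap", "jump"],
    "sad":         ["cry", "hang_head", "wipe_tear"],
    "mischievous": ["snicker", "rub_hands", "wink"],
    "scared":      ["cower", "hide_eyes", "tremble"],
}

def pick_animation(emotion, rng=random):
    """Render a random predefined animation for the selected emotion."""
    return rng.choice(EMOTE_ANIMATIONS[emotion])

anim = pick_animation("happy")
```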
- Each avatar may also perform idle animations to make the experience more interesting for the users, even when an emotion has not been actively selected by the users.
- audio effects may also be provided.
- the audio effects may be combined with the on screen animations to further enhance the party experience.
- a clapping audio clip may be rendered along with one of the selected emotions.
- additional features may be added to enhance the party experience. For example, the consecutive selection of three emotions may perform/unlock a special emote. Additionally, on certain days such as holidays, some emotes may be configured to behave differently than on other days/times. Finally, avatars may perform different animations depending on the user's profile. For example, female avatars may use different animations than male avatars.
- users may be given the option to turn off the avatar experience and not use emotes. Users may continue to communicate to other party members using a voice chat mode. In an embodiment, the user may still be able to communicate via the user's headset, but the user will no longer see the other users' avatars. In one embodiment, however, pop-up notifications (e.g., “User X is Happy”) may still be presented during the nomination experience.
- FIGS. 5A-5F depict exemplary screen shots from the perspective of one user of a group of users illustrating the avatar integrated shared media nomination experience disclosed herein.
- each user sits in his or her own physical location and views the display on his or her own computing device, such as an Xbox console.
- Each user's console may be connected via a network, such as the Internet.
- each user is logged into a respective Xbox Live account.
- FIG. 5A depicts a starting screen that may be presented to one user showing a “Start Party” option that may be selected by the user to start a shared media nomination and watching experience.
- FIG. 5B depicts one exemplary screen in which a user may suggest a movie to watch, wait for a suggestion, or invite users to join the party.
- FIG. 5C depicts a user screen for nominating a selection.
- the user may be provided the option to browse a list of available content.
- FIG. 5D depicts a user-interface screen presented to the user that allows the user to suggest to the other users participating in the “party” that the group view a selected piece of content (“The Neverending Story” in this example).
- FIG. 5E depicts the user's interface after making the nomination.
- the figure also illustrates animation sequences of another user's avatar that represent how strongly that user feels about watching the suggested content.
- Animations may include the avatar pointing, waving hands, jumping, spinning or any other movement or expression.
- FIG. 5F shows one embodiment of a user-interface screen that is presented to each user.
- the screen resembles a home theater, and the avatars for each user are shown as if they are sitting together on a couch toward the bottom of the screen.
- the backs of the avatars' heads are visible.
- the avatars may be rendered as silhouettes as viewed from behind the avatars.
- FIG. 6 shows one example of a selection mechanism (a wheel or circle in this example) that is presented to a user to allow the user to select one or more “emotions” to be reflected through that user's avatar in order to display emotions to the other users participating in the group watching experience. Because the users are not physically at the same location, they cannot see each other, only each other's avatars. And although the users may be able to “chat” during the presentation of the content via their respective headsets, the visual element of emoting would otherwise be missing. With this mechanism, a user can cause the user's avatar to perform an animation that expresses to the other users how the user is feeling during the watching experience.
- an avatar may make a sighing gesture and animated hearts may appear above the avatar's head to indicate to the other users that this user “loves” the particular content or scene being viewed. Another selection may make the avatar appear to be laughing to indicate that the user finds the content funny.
- an example emote wheel 500 is depicted that illustrates one implementation of an emote wheel using icons instead of text.
- the emote wheel 500 may be continuously present on the display, or may be rendered on the display when requested by the user, for example when a control button on the console is pressed.
- the leader can take his or her group of users to a variety of destinations around the world to watch a movie or show. As shown in FIG. 5F , for example, the leader may “take” the other users to a traditional movie theater. In the example shown in previous figures, the theme is that of a home theater. In addition to user selectable themes, other themes may appear based on events that are occurring for the user. For example, when the group of users is viewing a movie during the winter months, a “Winter Wonderland” theme may be available (as shown in FIG. 5G ). As another example, a special theme (not shown) may be available when it is a user's birthday.
- a button or other selection mechanism may be available on the user interface screen that signifies themes that may be selectable.
- only the leader may have the selection feature mechanism enabled.
- other users may also have the mechanism enabled.
- all members of the party may receive indication of the themes and render the themes unless the user has switched to full screen mode.
- the user's interface may render the theme chosen by the leader.
- some themes may be made available based on certain conditions, such as a time of the year or a date of importance to a user such as a user's birthday.
- that theme may be chosen by default until the condition is no longer met and the theme may then no longer be available (until the condition occurs again).
- When a special theme is unhidden, the leader may still have the ability to change to a different theme.
- the unhidden theme may just be a new option in the list.
- the condition may be that if any member of a watching party has a birthday today, within three days in the future, or within three days in the past, the default theme may be the Birthday Theme, and the theme may then be unhidden in the list of themes.
- the condition for a Winter Wonderland Theme may be that during the month of December every year, the Winter Wonderland Theme may become the default and be unhidden in the list of themes.
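The conditional theme defaults described above can be sketched as follows; the function name and fallback theme are assumptions, and the birthday check ignores year-boundary edge cases for brevity.

```python
from datetime import date

# Sketch of conditional theme defaults: a Birthday Theme when any
# party member's birthday falls within three days (past or future),
# and a Winter Wonderland Theme during December. The fallback theme
# name is an assumption.
def default_theme(today, member_birthdays, fallback="Home Theater"):
    for bday in member_birthdays:
        this_year = bday.replace(year=today.year)
        if abs((this_year - today).days) <= 3:
            return "Birthday Theme"
    if today.month == 12:
        return "Winter Wonderland Theme"
    return fallback

theme = default_theme(date(2009, 7, 10), [date(1980, 7, 12)])
```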
- Process 700 illustrates receiving, on one of the computing devices, a request to join the group.
- Process 710 illustrates sending the request to a shared media service provider and receiving an indication of acceptance into the group.
- another input may be received comprising an invitation for a specified user to join the group, and said another input may be sent to the shared media service provider.
- Process 720 illustrates receiving data describing media entities selected by other members of the group and rendering representations of the received media entities on a user interface device coupled to the respective computing device.
- Process 730 illustrates receiving a nomination identifying one of the media entities and sending the nomination to the other members of the group.
- the nomination may be selected from one or more queues comprising one or more media entities selected by each user. Each of the one or more queues may be received on a separate channel.
- Process 740 illustrates displaying on said user interface device, along with the representations of the received media entities, avatars representing the users of the group.
- the process may further comprise receiving and displaying indications of votes for other ones of the media entities selected by other users of the group.
- Process 750 illustrates receiving an indication of an emotion from one of the users and, in response, causing the avatar corresponding to said one of the users to perform an animation on said user interface device that conveys the indicated emotion.
- the emotion may be selected from an emote wheel comprising a plurality of representative emotions.
- the emote wheel may be divided into pie shaped segments radiating from a center of the emote wheel and each of the representative emotions may be represented as one of the pie shaped segments.
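One way to map a controller direction to a pie-shaped segment of such an emote wheel is sketched below; the ordering of the segments around the wheel is an assumption.

```python
import math

# Sketch: mapping a thumbstick/pointer direction from the wheel's
# center to one of eight pie-shaped segments. Segment ordering is an
# assumption, not taken from the disclosure.
SEGMENTS = ["love", "hate", "funny", "bored",
            "happy", "sad", "mischievous", "scared"]

def segment_for(dx, dy):
    """Map a direction vector to a segment.

    Angle 0 points right; segments are laid out counter-clockwise,
    each centered on its axis.
    """
    angle = math.atan2(dy, dx) % (2 * math.pi)
    width = 2 * math.pi / len(SEGMENTS)
    # Offset by half a segment so each segment is centered on its axis.
    index = int(((angle + width / 2) % (2 * math.pi)) // width)
    return SEGMENTS[index]

choice = segment_for(0.0, 1.0)  # pointing straight up
```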
- a random animation may be displayed based on the selected emotion.
- the random animation may comprise, for example, one of: at ease, cheer, disappointed, dance, looking, nervous, neutral, surprised, taunt, thinking, and walk.
- the representative emotions may be selected based on film genres.
- the emotions may comprise, for example, love, hate, funny, bored, happy, sad, mischievous, and scared.
- Process 760 illustrates receiving an indication of a selected media entity to be shared by the group and displaying a representation of the selected media entity on said user interface device, wherein the selected media object is selected by one of the users of the group deemed to be a host of the group.
- the user is deemed the host of the group when the user initiated the formation of the group.
- FIG. 8 depicts an exemplary system for establishing a shared media experience in a group comprising two or more users, each user operating a respective computing device and communicatively coupled via a network.
- system 800 comprises a processor 810 and memory 820 .
- Memory 820 further comprises computer instructions for establishing a shared media experience in a group comprising two or more users, each user operating a respective computing device and communicatively coupled via a network.
- Block 822 illustrates instructions for communicating with the two or more users to form the group.
- Block 824 illustrates instructions for receiving a nomination identifying a selected media entity and sending data describing the nomination to each of the users in the group.
- Block 826 illustrates instructions for displaying on said user interface device, along with a representation of the selected media entity, avatars representing the users of the group and associated avatar emotions, the avatars representing the one or more users and the avatar emotions comprising indications of emotions selected by the one or more users, the avatar emotion information sufficient to allow the computing device to render an animation that conveys the avatar emotion.
- Block 828 illustrates instructions for sending an indication of a selected media entity to be shared by the group and rendered on the computing devices of the users.
- a computer readable medium can store thereon computer executable instructions for establishing a shared media experience in a group comprising two or more users, each user operating a respective computing device and communicatively coupled via a network.
- Such media can comprise a first subset of instructions for receiving, on one of the computing devices, an input indicating a request to join the group 910 ; a second subset of instructions for sending the input to a shared media service provider and receiving an indication of acceptance into the group 912 ; a third subset of instructions for receiving data describing media entities selected by other members of the group and rendering representations of the received media entities on a user interface device coupled to said one of the computing devices 914 ; a fourth subset of instructions for receiving a nomination identifying one of the media entities and sending the nomination to the shared media service provider 916 ; a fifth subset of instructions for displaying on said user interface device, along with the representations of the received media entities, avatars representing the users of the group 918 ; a sixth subset of instructions for receiving an indication of an emotion from one of the users and, in response, causing the avatar corresponding to said one of the users to perform an animation on said user interface device that conveys the selected emotion 920 ; and a seventh subset of instructions for receiving an indication of a selected media entity to be shared by the group and displaying a representation of the selected media entity on said user interface device.
- the various techniques described herein may be implemented in connection with hardware or software or, where appropriate, with a combination of both.
- the methods and apparatus of the disclosure may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the disclosure.
- In the case of program code execution on programmable computers, the computing device generally includes a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
- One or more programs may implement or utilize the processes described in connection with the disclosure, e.g., through the use of an application programming interface (API), reusable controls, or the like.
- Such programs are preferably implemented in a high level procedural or object oriented programming language to communicate with a computer system.
- the program(s) can be implemented in assembly or machine language, if desired.
- the language may be a compiled or interpreted language, and combined with hardware implementations.
Abstract
Description
- This application claims the benefit of U.S. Provisional Patent Application No. 61/217,268, filed May 29, 2009, which is hereby incorporated by reference in its entirety. This application is also related to co-pending U.S. application Ser. No. ______, filed on even date herewith, entitled “Avatar Integrated Shared Media Experience”, which is hereby incorporated by reference in its entirety.
- The wide availability of data networks has enabled computing device users to remain connected to their provider networks and thus all of the data and services available via the Internet and other networks. The growth of such networks has also fueled the growth of community and social applications using computing devices such as mobile phones and personal computers. For example, networked multiplayer gaming is generally available on both personal computers and game consoles. Such networked applications allow users to remain connected and to share online experiences without the need to be physically present in the same location.
- However, many social activities remain out of reach of online networks. For example, networked social multimedia experiences, such as streaming video, are not generally available or convenient to use.
- In some cases, avatars may be used to enhance a group or online experience. An avatar can represent a user in a variety of contexts, including computer or video games, applications, chats, forums, communities, and instant messaging services. An avatar may be an object representing the embodiment of a user and may represent various actions and aspects of the user's personality, beliefs, interests, or social status. However, the use of avatars has not overcome the deficiencies of existing group and online systems and applications.
- One popular social activity is the viewing of movies and television shows. However, it is difficult for users who live in separate physical locations to conveniently decide on a movie or television show to watch together. For example, it may be difficult and cumbersome for a participant in an online group to suggest a movie and listen to suggestions from other members of the group. Furthermore, it may be difficult to communicate thoughts and feelings about the various suggested movie or television titles to the members of the group while at the same time maintaining a sense of fun and interaction that is typical of an in-person movie/television show selection experience.
- A method and system are disclosed herein in which a group of people, while in different physical locations, may replicate in a virtual world the physical world experience of going with a group of friends to select a movie or show to watch together. Users of a group may nominate movie or television selections for group viewing. In one embodiment, a display at a user's location may render avatars representing the users of the group. Users may also select an emotion that they wish to express to the other users of the group.
- FIG. 1 is a block diagram of an example network configuration.
- FIG. 2 depicts an example user interface that may be provided during a networked, social multimedia experience.
- FIGS. 3A-3C are flowcharts of example methods for synchronizing control commands in a networked, social multimedia environment.
- FIG. 4 is a block diagram of an example computing environment.
- FIGS. 5A-5G are screen shots, from the perspective of one user of a group of users, illustrating a process flow of an avatar integrated shared media nomination and watching experience.
- FIG. 6 depicts an exemplary user interface incorporating some of the embodiments disclosed herein.
- FIG. 7 depicts an exemplary process incorporating some of the embodiments disclosed herein.
- FIG. 8 depicts an example system for providing a shared media experience.
- FIG. 9 illustrates a computer readable medium bearing computer executable instructions discussed with respect to FIGS. 1-8.
- Certain specific details are set forth in the following description and figures to provide a thorough understanding of various embodiments of the disclosure. Certain well-known details often associated with computing and software technology are not set forth in the following disclosure to avoid unnecessarily obscuring the various embodiments of the disclosure. Further, those of ordinary skill in the relevant art will understand that they can practice other embodiments of the disclosure without one or more of the details described below. Finally, while various methods are described with reference to steps and sequences in the following disclosure, the description as such is for providing a clear implementation of embodiments of the disclosure, and the steps and sequences of steps should not be taken as required to practice this disclosure.
- Many social activities have been replicated or simulated by networked or online activities. For example, group discussions have been simulated using on-line messaging or chat services. However, some social activities have been more difficult to replicate. For example, one popular social activity is the group viewing of movies and television shows. However, it is difficult for users who are in separate physical locations to conveniently decide on a movie or television show to watch together. For example, it may be difficult and cumbersome for a participant in an online group to suggest a movie and listen to suggestions from other members of the group. Furthermore, it may be difficult to communicate thoughts and feelings about the various suggested movie or television titles to the members of the group while at the same time maintaining a sense of fun and interaction that is typical of an in-person movie/television show selection experience.
- The following example embodiments describe the media nomination and selection process in the context of viewing movies and television shows. The selection of movies and television shows is exemplary, and those skilled in the art will recognize that the principles are readily applicable to the nomination and selection of other media types that may be shared among a group of users. Such media types may include any media file or application such as music files and video games. All such media types and applications are contemplated as within the scope of the present disclosure.
- In various embodiments disclosed herein, a group of users may replicate the physical world experience of meeting with a group of friends to select a movie or television show to watch together. The experience may be replicated in a virtual world in which the users are in different physical locations and in communication via a network. Users of the group may nominate movie or television selections for group viewing. In one embodiment, a display at a user's location may render avatars representing the users of the group. Users may also select an emotion that the user wishes to express to the other users of the group.
- According to the methods and systems described herein, a user in a group of users may be provided an opportunity to browse content made available by a system and/or service, such as for example Microsoft's XBOX 360 console and XBOX LIVE service, and to nominate specific content, such as movies or television shows, that the user would like to watch. Once the group of users has nominated movies or television shows that they would like to watch, the users may discuss with each other, via their respective headset devices for example, which movie or show they would like to watch together.
- Each user may also have an avatar, a virtual representation of himself or herself, that may act out different “pick my movie” animations to attempt to convey the user's excitement about the particular movie or television show he or she has chosen. Many applications, such as video games, feature a user-created, system-wide avatar as a user-controlled character. Avatars can be graphical images that represent real persons in virtual or game space. Typically, a user may customize the avatar in a variety of ways dealing with appearance, such as facial features and clothing. This allows the user a more personalized and involved video gaming experience. For instance, the Nintendo Corporation has a user-created, system-wide avatar, the MII®, which a user may then use as his or her user-controlled character in video games that support this feature, such as WII SPORTS®.
- Once a “leader” ultimately picks a particular piece of content, such as a particular movie, the chosen content may be watched by the group of users and their respective avatars may appear in a virtual “destination” to watch the movie together. The virtual destination and a representation of the group watching the content may appear on the display of each user's respective console, thus simulating a physical gathering of those users. The group of users may talk during the movie on their headsets and have their avatars perform emotions and gestures that the user in the physical world is feeling based on the content being played.
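The nomination-and-pick flow described above can be sketched as a small data model. This is an illustrative sketch only; the names (`Party`, `nominate`, `leader_pick`) are hypothetical and not from the disclosure:

```python
# Sketch of the group nomination flow: members nominate titles, and the
# leader's final pick is constrained to titles the group actually nominated.
class Party:
    def __init__(self, leader, members):
        self.leader = leader
        self.members = set(members) | {leader}
        self.nominations = {}           # user -> nominated media title

    def nominate(self, user, title):
        if user not in self.members:
            raise ValueError("only party members may nominate")
        self.nominations[user] = title  # one active nomination per user

    def leader_pick(self, user, title):
        # Only the leader makes the final selection, and only from
        # titles that were nominated by the group.
        if user != self.leader:
            raise PermissionError("only the leader picks the content")
        if title not in self.nominations.values():
            raise ValueError("title was not nominated")
        return title

party = Party("alice", ["bob", "carol"])
party.nominate("bob", "Movie A")
party.nominate("carol", "Movie B")
chosen = party.leader_pick("alice", "Movie B")
```

Constraining the pick to nominated titles mirrors the group-discussion-then-leader-pick sequence described above; a real service would of course mediate this over the network.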
- Additionally, the system may provide themed destinations that may be chosen by the system or by one or more of the users of the group viewing the content. For example, a content service provider, such as an online provider of movies that can be rented for view, may provide a themed destination that resembles a home theater environment. Another service provider may provide a themed destination that resembles a full theater. Other themed destinations may include ones that resemble a beach, aquarium, outer space, mountains, drive-in theater or any other destination. The themed destination may be chosen by a leader of the group or by the collective agreement of the users. Hidden/unlockable themes may also be provided based on events occurring around a user. For example, if one of the users in the group is having a birthday on that day, the system may provide a special destination where a birthday cake and balloons may appear as the users are watching a movie together.
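The themed-destination selection, including an event-unlocked hidden theme, might look like the sketch below. The destination names and the birthday rule are illustrative assumptions, not part of the disclosure:

```python
from datetime import date

# Assumed catalog of selectable themed destinations.
DEFAULT_THEMES = ["home theater", "full theater", "beach", "aquarium",
                  "outer space", "mountains", "drive-in theater"]

def choose_destination(requested, birthdays, today):
    """Return the themed destination for a viewing session.

    requested -- theme chosen by the leader (or by group agreement)
    birthdays -- mapping of user -> (month, day)
    today     -- a datetime.date
    """
    # A hidden theme unlocks when any group member has a birthday today.
    if any((m, d) == (today.month, today.day) for m, d in birthdays.values()):
        return "birthday party"
    # Fall back to a default when the request is not a known theme.
    return requested if requested in DEFAULT_THEMES else "home theater"

theme = choose_destination("beach", {"bob": (6, 1)}, date(2009, 6, 1))
```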
- FIG. 1 illustrates an example network environment. Of course, actual network and database environments may be arranged in a variety of configurations; however, the example environment shown here provides a framework for understanding the type of environment in which an embodiment may operate.
- The example network may include one or more client computers 200 a, a server computer 200 b, data source computers 200 c, and/or one or more databases. The client computers 200 a and the data source computers 200 c may be in electronic communication with the server computer 200 b by way of the communications network 280 (e.g., an intranet, the Internet or the like). The client computers 200 a and data source computers 200 c may be connected to the communications network by way of communications interfaces 282. The communications interfaces 282 can be any type of communications interfaces such as Ethernet connections, modem connections, wireless connections and so on.
- The server computer 200 b may provide management of the database 270 by way of database server system software such as MICROSOFT®'s SQL SERVER or the like. As such, server 200 b may act as a storehouse of data from a variety of data sources and provide that data to a variety of data consumers.
- In the example network environment of FIG. 1, a data source may be provided by data source computer 200 c. Data source computer 200 c may communicate data to server computer 200 b via communications network 280, which may be a LAN, WAN, intranet, Internet, or the like. Data source computer 200 c may store data locally in database 272 a, which may be a database server or the like. The data provided by data source 200 c can be combined and stored in a large database such as a data warehouse maintained by server 200 b.
- Client computers 200 a that desire to use the data stored by server computer 200 b can access the database 270 via communications network 280. Client computers 200 a access the data by way of, for example, a query, a form, etc. It will be appreciated that any configuration of computers may be employed.
- The client computers 200 a depicted in FIG. 1 may be PCs or game consoles, for example. Two or more clients 200 a may form a “party.” A “social video application” 220 running on the server 200 b may designate one of the clients 200 a as the “remote holder.” The remote holder may be the first member of the party to request a network session. Such a request may be, for example, a request for streaming video. The remote holder may then invite other clients to establish a networked, social multimedia experience, i.e., to join the party.
- The remote holder may have control over a shared “remote control” 210 that controls content playback. When the remote holder presses play, pause, reverse, or fast-forward, for example, the remote holder's “state” may be sent to all connected users in a group, who see it and synchronize to it, causing the same action to occur on their clients. The other users may have the ability to play, pause, and request remote holder status by sending their own state to the remote holder. Such actions may need approval from the current remote holder to take effect. Users may also have the ability to leave the playback session.
- The video may be kept synchronized by keeping all users updated on the remote holder's state. The remote holder's state may be a structure 235 that contains information on playback status (e.g., playing, paused, initializing, etc.), an identifier associated with the content being viewed, and a current time code associated with the content. The remote holder may maintain its state (i.e., keep it up to date) and send it to all the other users when it changes. The other users may then see the new state, compare their own time code and playback state to the remote holder's, and then take action accordingly. Each client may have its own respective social video application 230, and may maintain its own respective state structure 235.
- If a user's state is different from that of the remote holder, it may be updated (playing may become paused, for example). If a user's time code is too different from the remote holder's, then a “seek” operation may be performed to the remote holder's reported time code. The user may be responsible for predicting, based on “pre-buffering times,” how long it will take the seek call to complete, and compensate by adjusting the targeted time code.
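One way to picture the state structure 235 and the catch-up rule above is the sketch below. The field names, drift tolerance, and pre-buffering estimate are assumptions for illustration, not values from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class PlaybackState:
    status: str        # e.g., "playing", "paused", "initializing"
    content_id: str    # identifier of the content being viewed
    time_code: float   # seconds into the content

MAX_DRIFT = 2.0        # assumed tolerance before a "seek" is issued
SEEK_LATENCY = 0.5     # assumed pre-buffering estimate, in seconds

def actions_to_sync(local, holder):
    """Return the operations a client should perform to match the holder."""
    ops = []
    if local.status != holder.status:
        ops.append(("set_status", holder.status))
    if abs(local.time_code - holder.time_code) > MAX_DRIFT:
        # Compensate for the time the seek call itself will take,
        # since the holder keeps playing while the client seeks.
        target = holder.time_code
        if holder.status == "playing":
            target += SEEK_LATENCY
        ops.append(("seek", target))
    return ops

local = PlaybackState("paused", "movie-1", 100.0)
holder = PlaybackState("playing", "movie-1", 110.0)
ops = actions_to_sync(local, holder)
```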
- Users may also be enabled to make requests of the remote holder by sending the remote holder and all other users an updated state that differs from the remote holder's state. When the remote holder sees this state, it may be taken as a request. The remote holder may update its state to reflect the requested changes. Only then do the other users (including the user that made the request) change their state. The same process can be used to request remote holder status.
- In an example embodiment, any user can be the remote holder, but only one user can be the remote holder at any time. Any member may be promoted to remote holder, demoting the current remote holder to a normal user. The “current” remote holder is the only user who can “pass the remote” to another user. The server may keep track of the identity of the current remote holder.
- Multiparty voice chat may be integrated into the experience, allowing members to comment on the video. Thus, a group of people may be enabled to share the experience of watching a video together as if they were in the same room, without being physically present together. All users may have the same access to voice chat. That is, any user may speak whenever he chooses.
- Multiparty voice chat may require a certain level of synchronization among the clients that form the party. If any client were allowed to be even a few seconds out of sync with the rest of the party, comments made over the chat may not make sense. Additionally, feedback from the audio of one client sent over voice chat could be very disruptive if it is not closely in sync with what other users are hearing from their own video.
- Fast-forward and reverse may be treated differently from play, pause, and seek commands. When the remote holder elects to fast-forward or reverse, the other clients may simply pause playback. When the remote holder finds the time in the video from which playback should resume, the other clients may receive the remote holder's updated state, and issue a “seek” command telling them to resume playback from the time index the remote holder has selected. This may eliminate potential synchronization issues that may be caused by fast-forward or reverse speeds being slightly different on different users' client computers.
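The pause-then-seek treatment of fast-forward and reverse might reduce to a decision function like this (the status names are illustrative assumptions):

```python
def react_to_holder(holder_status, holder_time_code):
    """Decide a non-holder client's response to the holder's new state."""
    # While the holder scrubs, other clients simply pause playback rather
    # than try to match a fast-forward/reverse speed exactly.
    if holder_status in ("fast_forward", "reverse"):
        return ("pause", None)
    # When the holder resumes playing, seek to its reported time code
    # and resume from there.
    if holder_status == "playing":
        return ("seek_and_play", holder_time_code)
    # Otherwise mirror the holder's state directly.
    return (holder_status, holder_time_code)

response = react_to_holder("fast_forward", 75.0)
```

Because the other clients never scrub themselves, small differences in fast-forward or reverse speed between machines cannot accumulate into drift, which is the point made above.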
- A fully social experience may be created where people are not only watching the same video, but also using graphical user avatars to create a virtual viewing environment such as a virtual entertainment room or movie theater. The users may be represented graphically in front of the video, and may be enabled to use animations, text chat, and voice chat to interact with each other.
- For example, the introduction of graphical avatars into the shared video experience may add another dimension to the experience by giving users a sense of identity within the virtual viewing environment. Each user watching the video may be represented by their own customized avatar. The avatars of every person in the session may be rendered on everyone else's television or monitor, resulting in a group of avatars that appear to be watching the video in a virtual environment. Each user may be enabled to trigger animations and text messages (in the form of “speech balloons,” for example) for their avatar. Such animations and text messages may be rendered on every other users' television or monitor.
- FIG. 2 depicts an example user interface 400 that may be provided during a networked, social multimedia experience. The user interface 400 may be presented on respective video monitors provided at each client location. The same interface may be presented at each location.
- In general, the user interface 400 may depict an area for displaying a movie. The area may be a virtual viewing environment such as a virtual living room or a virtual movie theater. As noted above, the scene providing the area for rendering the media may be referred to as the “destination” or “themed destination.” Specifically, as shown in FIG. 2, the user interface 400 may include a video presentation portion 410, via which the video 412 is presented to the users. The user interface 400 may also include a respective avatar 420A-D corresponding to each of the users. The user interface 400 may also include a text chat area. As shown, text chat may be presented in the form of speech balloons 430A-D. Alternatively or additionally, text chat may be presented as scrolling text in a chat box portion of the user interface 400. Audio may be presented via one or more speakers (not shown) provided at the client locations.
- Each client may render its own themed destination. Thus, software may be provided on each client to enable the client to render its own themed destination. The themed destinations rendered on the several clients may be identical, or not.
- When a user causes his or her avatar to gesticulate, the gesture may be presented at all the client locations in synchronicity. Similarly, when a user speaks or otherwise produces an audio event, e.g., through voice chat, or textual event, e.g., through text chat, the audio or text may be presented at all the client locations in synchronicity.
- FIG. 3A is a flowchart of an example method 300 for synchronizing play, pause, stop, and seek commands from the remote holder. At 301, the remote holder may select a “play,” “pause,” “stop,” or “seek” operation, e.g., by pressing the play, pause, stop, or seek button on their game controller or remote control. At 302, in response to the remote holder's selection of the play, pause, stop, or seek operation, the remote holder client may update its state structure to reflect the change in time code and playback status.
- At 303, the remote holder client communicates the remote holder's state structure to the other clients in the party. To maintain the highest level of synchronization among the several clients in the party, such updates should be communicated as frequently as possible. At 304, the other clients receive the remote holder's updated state. At 305, each client responds to the state change by updating its own state structure to conform to that of the remote holder.
- The state structure from each client may be sent to every other client, so that every client always knows the current state of every other client in the party. Because the state structure contains information on playback status, an identifier associated with the content being viewed, and a current time code associated with the content, each client will then be performing the same operation, at the same place in the same content, at the same time.
- FIG. 3B is a flowchart of an example method 310 for synchronizing play or pause commands from a user who is not the remote holder. In an example embodiment, a user who is not the remote holder is not enabled to exercise a stop, seek, fast-forward, or reverse command. At 311, a non-remote holder user may select a “play” or “pause” operation, e.g., by pressing the play or pause button on their game controller or remote control. At 312, in response to the user's selection of the play or pause operation, the selecting user's client may update its state structure to reflect that a play or pause state has been requested.
- At 313, the selecting user's client may send the selecting user's state to the remote holder client, as well as to all other members of the party. At 314, the remote holder client may receive the selecting user's state, from which it can determine that another member of the party has made a playback state change request. The remote holder client may change its own state to reflect the new state.
- At 315, the remote holder client communicates the remote holder's state structure to the other clients in the party. To maintain the highest level of synchronization among the several clients in the party, such updates should be communicated as frequently as possible. At 316, the other clients receive the remote holder's updated state.
- At 317, the other clients, including the user who made the original request, receive the remote holder's updated state, and respond to the state change by updating their own state structures to conform to that of the remote holder. At 318, the selected action occurs on the requesting user's client.
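Method 310's request flow, in which a non-holder's proposed state takes effect only after the remote holder adopts and rebroadcasts it, can be sketched as follows (class and method names are hypothetical, and the holder is assumed to approve the request):

```python
class SyncSession:
    """Toy model of method 310: requests funnel through the remote holder."""

    def __init__(self, holder, members):
        self.holder = holder
        self.states = {m: "paused" for m in members}   # member -> status

    def request(self, user, desired_status):
        if user == self.holder:
            self._holder_adopts(desired_status)   # holder changes apply directly
        else:
            # The holder sees the differing state as a request (314); here we
            # assume the holder approves and adopts it.
            self._holder_adopts(desired_status)

    def _holder_adopts(self, status):
        # The holder updates its own state first (314), then broadcasts it;
        # every client, including the requester, conforms (315-318).
        self.states[self.holder] = status
        for member in self.states:
            self.states[member] = status

session = SyncSession("alice", ["alice", "bob", "carol"])
session.request("bob", "playing")   # bob asks to play; alice adopts it
```

Note that the requester's playback does not change at the moment of the request; it changes only when the holder's rebroadcast state comes back, exactly as steps 317-318 describe.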
- FIG. 3C is a flowchart of an example method 320 for synchronizing fast-forward and reverse commands from the remote holder. At 321, the remote holder may select a “fast-forward” or “reverse” operation, e.g., by pressing the fast-forward or reverse button on their game controller or remote control.
- At 322, in response to the remote holder's selection of the fast-forward or reverse operation, the remote holder client may update its state to reflect that it is currently fast-forwarding or reversing. At 323, the remote holder client communicates the remote holder's state structure to the other clients in the party. At 324, the other users receive the new state, and pause until the fast-forward/reverse state changes again.
- At 325, the remote holder video starts to fast-forward or reverse. Eventually, the remote holder may select a “play” operation, e.g., by pressing the play button on their game controller or remote control. At 326, the remote holder video begins playback at the time code associated with the point in the video at which the remote holder selected the play operation.
- At 327, the remote holder may update its state to reflect that it is currently playing and has a new time code, and communicate its state structure to the other clients in the party. At 328, the other users receive the new state structure and perform a seek and play operation to get back synchronized with the remote holder.
- Thus, the remote holder may be allowed full control over the virtual remote control, while the other users have only the ability to exit the video experience, play, pause, and make requests of the remote holder. In an example embodiment, no playback changes are made until the remote holder has changed its own state.
- Synchronization of avatars may be implemented in much the same way as described above in connection with synchronization of play and pause commands. Each user would construct his or her own avatar, or retrieve a saved avatar if the user already constructed one. Each client could then communicate information about its respective avatar to the other clients.
- As each client renders its respective destination, it may retrieve the avatars from a common server (e.g., based on gamer tags associated with the avatars). For example, avatars may be retrieved via the internet. Avatar placement and emotion information may be contained in the state structure that is passed around the several users. Placement information may indicate where each avatar is to be presented in the user interface, either in absolute or relative terms. Emotion information may convey an emotional state. Each client may animate a certain avatar based on emotion information received for that avatar. Thus, when rendering its virtual destination, each client can determine from the state structure what the virtual destination is supposed to look like, avatar placement therein, which avatar is speaking, gesturing, leaving, etc.
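The per-avatar placement and emotion information carried in the state structure might be consumed as in this sketch; the emotion-to-animation mapping is an illustrative assumption:

```python
# Assumed mapping from a conveyed emotional state to a canned animation.
ANIMATIONS = {
    "happy": "clap",
    "sad": "slump",
    "excited": "jump",
    "neutral": "idle",
}

def render_instructions(avatar_states):
    """Turn per-user avatar state into (gamer_tag, seat, animation) tuples.

    avatar_states maps a gamer tag to a dict carrying placement ("seat")
    and, optionally, the emotion the user selected.
    """
    out = []
    for tag, info in avatar_states.items():
        emotion = info.get("emotion", "neutral")   # default to neutral
        out.append((tag, info["seat"], ANIMATIONS.get(emotion, "idle")))
    return out

frames = render_instructions({
    "alice": {"seat": 0, "emotion": "excited"},
    "bob":   {"seat": 1},
})
```

Since every client receives the same state structure, running the same mapping on each client yields identical animations in every rendered destination.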
- Synchronized text chat may also be implemented in much the same way as described above in connection with synchronization of play and pause commands. Text provided by one user may be included in the state structure that is passed around the several users.
- Voice chat can be implemented via the so-called “party” system, which connects up to eight or more users together. In essence, the party system employs a respective gamer tag associated with each of the several users. Thus, synchronized voice chat may be built into the system, eliminating any need to convey voice information in the state structure.
- FIG. 4 shows an exemplary computing environment in which example embodiments and aspects may be implemented. The computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality. Neither should the computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment 100.
- Numerous other general purpose or special purpose computing system environments or configurations may be used. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, embedded systems, distributed computing environments that include any of the above systems or devices, and the like.
- Computer-executable instructions, such as program modules, being executed by a computer may be used. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Distributed computing environments may be used where tasks are performed by remote processing devices that are linked through a communications network or other data transmission medium. In a distributed computing environment, program modules and other data may be located in both local and remote computer storage media including memory storage devices.
- With reference to
FIG. 4 , an exemplary system includes a general purpose computing device in the form of acomputer 110. Components ofcomputer 110 may include, but are not limited to, aprocessing unit 120, asystem memory 130, and a system bus 121 that couples various system components including the system memory to theprocessing unit 120. Theprocessing unit 120 may represent multiple logical processing units such as those supported on a multi-threaded processor. The system bus 121 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus (also known as Mezzanine bus). The system bus 121 may also be implemented as a point-to-point connection, switching fabric, or the like, among the communicating devices. -
Computer 110 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed bycomputer 110 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CDROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can accessed bycomputer 110. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media. - The
system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements withincomputer 110, such as during start-up, is typically stored inROM 131.RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processingunit 120. By way of example, and not limitation,FIG. 4 illustratesoperating system 134,application programs 135,other program modules 136, andprogram data 137. - The
computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, FIG. 4 illustrates a hard disk drive 141 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152, and an optical disk drive 155 that reads from or writes to a removable, nonvolatile optical disk 156, such as a CD ROM or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 141 is typically connected to the system bus 121 through a non-removable memory interface such as interface 140, and magnetic disk drive 151 and optical disk drive 155 are typically connected to the system bus 121 by a removable memory interface, such as interface 150. - The drives and their associated computer storage media discussed above and illustrated in
FIG. 4, provide storage of computer readable instructions, data structures, program modules and other data for the computer 110. In FIG. 4, for example, hard disk drive 141 is illustrated as storing operating system 144, application programs 145, other program modules 146, and program data 147. Note that these components can either be the same as or different from operating system 134, application programs 135, other program modules 136, and program data 137. Operating system 144, application programs 145, other program modules 146, and program data 147 are given different numbers here to illustrate that, at a minimum, they are different copies. A user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and pointing device 161, commonly referred to as a mouse, trackball or touch pad. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). A monitor 191 or other type of display device is also connected to the system bus 121 via an interface, such as a video interface 190. In addition to the monitor, computers may also include other peripheral output devices such as speakers 197 and printer 196, which may be connected through an output peripheral interface 195. - The
computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in FIG. 4. The logical connections depicted in FIG. 4 include a local area network (LAN) 171 and a wide area network (WAN) 173, but may also include other networks. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet. - When used in a LAN networking environment, the
computer 110 is connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 typically includes a modem 172 or other means for establishing communications over the WAN 173, such as the Internet. The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 110, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, FIG. 4 illustrates remote application programs 185 as residing on memory device 181. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used. - Described now is an exemplary embodiment illustrating some of the methods and systems disclosed herein for establishing a shared media experience in a group comprising two or more users, each user operating a respective computing device and communicatively coupled via a network. A user may, at the user's location, use a multipurpose console that has access to an online network and network services. One service that may be available is an online media service that can provide streaming media services so that the user can experience near instant streaming of content.
- In an embodiment, the user may desire to utilize a party mode on their console. A party may be a collection of users who may or may not all be interacting within the same experience at a given point in time. A party session may be established when party members interact with one another within the same application or experience.
- Any user may host a shared video experience. In one embodiment, the user who initiates the party session may be deemed to be the party host. A host may invite other users to become members of a party and share a party experience. A host may create a shared party by launching an application provided on the console that may provide or access an infrastructure for inviting party members to a shared media experience. Alternatively and optionally, the console may provide a wizard or guide for inviting other users to the party. Such applications may provide, for example, menus to select or enter the identities of other users whom the host wishes to invite to the party. The application may transmit identifiers of the requested users to an online service that may in turn forward the requests to the identified users. The identified users may be notified via applications executing on their respective consoles that they have been invited to the party. The applications may then provide an interface for accepting or rejecting the invitation, upon which the application may return the acceptance or rejection to the online service. Finally, the online service may notify the host of the acceptance or rejection.
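The invite/accept round trip described above can be sketched as a toy message flow. This is only an illustrative sketch: the `OnlineService` class and all of its method names are hypothetical, not part of any actual console or service API.

```python
class OnlineService:
    """Hypothetical stand-in for the online service that relays party invites."""

    def __init__(self):
        self.pending = {}    # invitee -> host who invited them
        self.responses = {}  # invitee -> True (accepted) / False (rejected)

    def send_invites(self, host, invitees):
        # The host's console transmits invitee identifiers to the service,
        # which forwards a notification to each identified user.
        for user in invitees:
            self.pending[user] = host

    def respond(self, invitee, accept):
        # An invitee's console returns acceptance or rejection to the service,
        # which then notifies the returned host.
        host = self.pending.pop(invitee)
        self.responses[invitee] = accept
        return host

    def party_members(self, host):
        # The party is the host plus everyone who accepted.
        return [host] + [u for u, ok in self.responses.items() if ok]

service = OnlineService()
service.send_invites("host", ["alice", "bob"])
service.respond("alice", accept=True)
service.respond("bob", accept=False)
print(service.party_members("host"))  # ['host', 'alice']
```

A real implementation would of course deliver these messages asynchronously over the network rather than through shared memory.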
- Once a party has been formed, a movie or television show may be selected by the party for shared viewing. One or more of the users may select one or more movies or shows to be listed in a watch queue. Such a queue may, for example, include a list of the movies and television shows that a user may want to watch via the online service. Typically, users may add titles to the queue using various means such as browsing on the online service website.
- In one embodiment, the host of the party may be the only member of the party that is provided the capability to share out their watch queue to the other party members. The online service may make the information available so that the other party members may view the host's watch queue on their own consoles.
- In another embodiment, any party member may be provided the capability to nominate content from within their own watch queues. The content nominations may be included and displayed in a party queue available to each party member. In one embodiment, each nominated movie or show may be represented in a two dimensional grid, for example a 3×3 tiled wall.
- In addition, each party member may be provided the option to share their own personal queues with the entire party. Furthermore, even if a party member does not share their personal queues, they may still nominate content from their queues to the party queue. However, no other party members will see that party member's queue.
- Each party member's individual watch queue may be represented as a different channel or data stream on each user's display. When permitted, users may also browse the other users' watch queues and nominate the pieces of content that they wish to watch from the party member's queue. Users may also request and view details of each movie or show. For example, by selecting a particular movie in a user's queue, the details for the movie may be displayed, indicating for example the synopsis, run time, and other details of the movie.
- After the users have nominated their choices, in a preferred embodiment the host of the session may select a specific movie or show for the entire party to watch. In other embodiments, the group or online system may determine the selected movie or show by identifying the nomination with the highest number of votes or by other formulaic means.
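The vote-based selection rule above ("highest number of votes") could be sketched as follows; the tie-breaking behavior and the host-override parameter are assumptions added for illustration, not details from the disclosure.

```python
from collections import Counter

def select_content(votes, host_choice=None):
    """Pick the content to watch: the host's explicit pick wins if given;
    otherwise the nomination with the most votes. Equal counts fall back to
    first-encountered order (Counter.most_common is stable on ties)."""
    if host_choice is not None:
        return host_choice
    tally = Counter(votes)
    return tally.most_common(1)[0][0]

votes = ["Movie A", "Movie B", "Movie A", "Movie C"]
print(select_content(votes))                         # Movie A
print(select_content(votes, host_choice="Movie C"))  # Movie C
```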
- In some cases one or more members of the party may not have audio capability and thus a visual confirmation may be provided when the host selects the movie. In an embodiment, the avatar of the party member who nominated the selected content may be displayed when the selected content item is highlighted in the party queue. This may provide a visual confirmation of the movie or show that was selected for viewing. Thus the party members may be made aware that the content has been selected and is ready to be viewed without relying on the need for voice communications within the party.
- In an embodiment, the avatars may be used to convey emotions and feelings during the nomination and selection process. For example, a user may react to a movie nomination and desire to express the response to the other users. A user may wish to, for example, let others know what the user is thinking, provoke a response from other users, make a statement about the movie, or respond to another user's conveyed emotion.
- Referring to
FIG. 6, in one embodiment the input of emotions may be implemented using a fly-out wheel displayed on the user's screen and activated using an appropriate control on the user's console, console accessory, or other input means. For example, the wheel can be activated and controlled using a gamepad or a remote controller. - Any number of emotions or responses may be provided on the fly-out wheel. By way of example and not limitation, eight categories of emotions may be used as shown in
FIG. 6 . - In an embodiment, each category may be represented by one static avatar action or animation, or a series of animations selectable by the user. In other embodiments, rather than giving the user the ability to choose the details of the animation, a random predefined animation may be rendered once the user selects the emotion they want to convey. The categories of emotions may further be based on typical responses that users may have watching major film genres such as action, adventure, comedy, crime/gangster, drama, historical, horror, musicals, science fiction, war, westerns, and the like.
- Based on the above listed genres and typical response associated with the genres, in one embodiment depicted in
FIG. 6 the following categories may be used to populate an avatar emotion response function: - 1. Love
- 2. Hate
- 3. Funny
- 4. Bored
- 5. Happy
- 6. Sad
- 7. Mischievous
- 8. Scared
- These specific emotions may also be referred to as emotes. In an embodiment, each category may further provide at least three random animations. Each avatar may also perform idle animations to make the experience more interesting for the users, even when an emotion has not been actively selected by the users.
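The "at least three random animations per category" behavior described above could be sketched as a simple lookup plus a random choice. Every animation name below is hypothetical, since the disclosure does not enumerate them:

```python
import random

# Hypothetical animation sets; the text only requires at least three per emote.
EMOTE_ANIMATIONS = {
    "love":        ["sigh_hearts", "blow_kiss", "hands_on_heart"],
    "hate":        ["shake_fist", "turn_away", "thumbs_down"],
    "funny":       ["laugh", "slap_knee", "point_and_laugh"],
    "bored":       ["yawn", "check_watch", "slump"],
    "happy":       ["cheer", "clap", "jump"],
    "sad":         ["cry", "hang_head", "wipe_tear"],
    "mischievous": ["snicker", "rub_hands", "sneak"],
    "scared":      ["cower", "hide_eyes", "tremble"],
}

def play_emote(emote, rng=random):
    # A random predefined animation is rendered once the user picks an emote,
    # rather than letting the user choose the animation details.
    return rng.choice(EMOTE_ANIMATIONS[emote])

anim = play_emote("happy")
print(anim in EMOTE_ANIMATIONS["happy"])  # True
```

Idle animations could be driven the same way from a separate table on a timer when no emote is active.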
- Additionally and optionally, audio effects may also be provided. The audio effects may be combined with the on screen animations to further enhance the party experience. For example, a clapping audio clip may be rendered along with one of the selected emotions.
- In various embodiments, additional features may be added to enhance the party experience. For example, the consecutive selection of three emotions may perform/unlock a special emote. Additionally, on certain days such as holidays, some emotes may be configured to behave differently than on other days/times. Finally, avatars may perform different animations depending on the user's profile. For example, female avatars may use different animations than male avatars.
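A sketch of the "additional features" above, assuming that "consecutive selection of three emotions" means the same emote chosen three times in a row, and using December 25 as a stand-in holiday. Both readings are assumptions; the disclosure leaves these details open:

```python
from collections import deque
from datetime import date

class EmoteTracker:
    """Sketch of the special-emote unlock; 'three consecutive' is assumed
    to mean the same emote selected three times in a row."""

    def __init__(self):
        self.history = deque(maxlen=3)  # keeps only the last three picks

    def select(self, emote, today=None):
        self.history.append(emote)
        if len(self.history) == 3 and len(set(self.history)) == 1:
            return "special_" + emote  # unlocked special emote
        today = today or date.today()
        if (today.month, today.day) == (12, 25):  # hypothetical holiday variant
            return emote + "_holiday"
        return emote

t = EmoteTracker()
t.select("funny", today=date(2009, 6, 1))
t.select("funny", today=date(2009, 6, 1))
print(t.select("funny", today=date(2009, 6, 1)))  # special_funny
```

Profile-dependent animations (for example, different animation sets per avatar gender) would just add another key into the animation lookup.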
- In an embodiment, users may be given the option to turn off the avatar experience and not use emotes. Users may continue to communicate with other party members using a voice chat mode. In an embodiment, the user may still be able to communicate via the user's headset, but the user will no longer see the other users' avatars. In one embodiment, however, pop-up notifications (e.g., “User X is Happy”) may still be presented during the nomination experience.
-
FIGS. 5A-5F depict exemplary screen shots from the perspective of one user of a group of users illustrating the avatar integrated shared media nomination experience disclosed herein. In this example, each user sits in his or her own physical location and views the display on his or her own computing device, such as an Xbox console. Each user's console may be connected via a network, such as the Internet. In one embodiment, each user is logged into a respective Xbox Live account. - In one illustrative example of a nomination process,
FIG. 5A depicts a starting screen that may be presented to one user showing a “Start Party” option that may be selected by the user to start a shared media nomination and watching experience. FIG. 5B depicts one exemplary screen in which a user may suggest a movie to watch, wait for a suggestion, or invite users to join the party. -
FIG. 5C depicts a user screen for nominating a selection. The user may be provided the option to browse a list of available content. -
FIG. 5D depicts a user-interface screen presented to the user that allows the user to suggest to the other users participating in the “party” that the group view a selected piece of content (“The Neverending Story” in this example). -
FIG. 5E depicts the user's interface after making the nomination. The figure also illustrates animation sequences of another user's avatar that represent how strongly that user feels about watching his or her own suggestion. With this feature, a user can lobby for his or her selection, reflecting to the other users through the user's avatar how strongly the user feels about watching it. Animations may include the avatar pointing, waving hands, jumping, spinning or any other movement or expression. -
FIG. 5F shows one embodiment of a user-interface screen that is presented to each user. As shown, in this example, the screen resembles a home theater, and the avatars for each user are shown as if they are sitting together on a couch toward the bottom of the screen. In this example, the backs of the avatars' heads are visible. In some embodiments the avatars may be rendered as silhouettes as viewed from behind the avatars. -
FIG. 6 shows one example of a selection mechanism (a wheel or circle in this example) that is presented to a user to allow the user to select one or more “emotions” to be reflected through that user's avatar in order to display emotions to the other users participating in the group watching experience. Because the users are not physically at the same location, they cannot see each other, only each other's avatars. And although the users may be able to “chat” during the presentation of the content via their respective headsets, the visual element of emoting would otherwise be missing. With this mechanism, a user can cause the user's avatar to perform an animation that expresses to the other users how the user is feeling during the watching experience. For example, if a user selects the “love” emote 610, an avatar may make a sighing gesture and animated hearts may appear above the avatar's head to indicate to the other users that this user “loves” the particular content or scene being viewed. Another selection may make the avatar appear to be laughing to indicate that the user finds the content funny. Referring back to FIG. 5F, an example emote wheel 500 is depicted that illustrates one implementation of an emote wheel using icons instead of text. The emote wheel 500 may be continuously present on the display, or may be rendered on the display when requested by the user, for example when a control button on the console is pressed. - As mentioned above, different themes, brands, or destinations can be applied to the watching experience. Conceptually, the leader can take his or her group of users to a variety of destinations around the world to watch a movie or show. As shown in
FIG. 5F, for example, the leader may “take” the other users to a traditional movie theater. In the example shown in previous figures, the theme is that of a home theater. In addition to user selectable themes, other themes may appear based on events that are occurring for the user. For example, when a group of users is viewing a movie during the winter months, a “Winter Wonderland” theme may be available (as shown in FIG. 5G). As another example, a special theme (not shown) may be available on a user's birthday. - In various embodiments, a button or other selection mechanism may be available on the user interface screen that signifies themes that may be selectable. In one embodiment, only the leader may have the selection feature mechanism enabled. In other embodiments, other users may also have the mechanism enabled. When the leader cycles through different themes, all members of the party may receive an indication of the themes and render the themes unless the user has switched to full screen mode. When a user is in full screen mode and then switches back to avatar rendering mode, the user's interface may render the theme chosen by the leader.
- As mentioned above, some themes may be made available based on certain conditions, such as a time of the year or a date of importance to a user, such as the user's birthday. When the parameters of such a condition are satisfied, that theme may be chosen by default until the condition is no longer met, after which the theme may no longer be available (until the condition occurs again). When a special theme is unhidden, the leader may still have the ability to change to a different theme; the unhidden theme may simply be a new option in the list. As an example, for a Birthday Theme, the condition may be that if any member of a watching party has a birthday today, within three days in the future, or within three days in the past, the default theme may be the Birthday Theme and the theme may then be unhidden in the list of themes. As another example, the condition for a Winter Wonderland Theme may be that during the month of December every year, the Winter Wonderland Theme becomes the default and is unhidden in the list of themes.
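The two conditional-theme examples above (a member birthday within three days, and December's Winter Wonderland) can be sketched as a default-theme function. The fallback "Home Theater" name is an assumption, and the sketch ignores edge cases such as year-boundary birthdays:

```python
from datetime import date

def default_theme(today, member_birthdays):
    """Pick the default theme: a member birthday within three days (past or
    future) wins, then December's Winter Wonderland, then the plain default."""
    for bday in member_birthdays:
        # Compare month/day in the current year, ignoring the birth year.
        # (Dec/Jan wrap-around and Feb 29 are not handled in this sketch.)
        this_year = bday.replace(year=today.year)
        if abs((this_year - today).days) <= 3:
            return "Birthday Theme"
    if today.month == 12:
        return "Winter Wonderland"
    return "Home Theater"  # hypothetical fallback name

print(default_theme(date(2009, 12, 10), []))                # Winter Wonderland
print(default_theme(date(2009, 6, 1), [date(1980, 6, 3)]))  # Birthday Theme
print(default_theme(date(2009, 6, 1), [date(1980, 9, 3)]))  # Home Theater
```

The leader could still override the returned default, since an unhidden theme is just another option in the list.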
- Referring now to
FIG. 7, illustrated is an exemplary process for establishing a shared media experience in a group comprising two or more users, each user operating a respective computing device and communicatively coupled via a network, including operations 700 through 760. Process 700 illustrates receiving, on one of the computing devices, a request to join the group.
-
Process 710 illustrates sending the request to a shared media service provider and receiving an indication of acceptance into the group. In an embodiment, another input may be received comprising an invitation for a specified user to join the group, and said another input may be sent to the shared media service provider.
-
Process 720 illustrates receiving data describing media entities selected by other members of the group and rendering representations of the received media entities on a user interface device coupled to the respective computing device. -
Process 730 illustrates receiving a nomination identifying one of the media entities and sending the nomination to the other members of the group. In an embodiment, the nomination may be selected from one or more queues comprising one or more media entities selected by each user. Each of the one or more queues may be received on a separate channel.
-
Process 740 illustrates displaying on said user interface device, along with the representations of the received media entities, avatars representing the users of the group. In an embodiment, the process may further comprise receiving and displaying indications of votes for other ones of the media entities selected by other users of the group.
-
Process 750 illustrates receiving an indication of an emotion from one of the users and, in response, causing the avatar corresponding to said one of the users to perform an animation on said user interface device that conveys the indicated emotion. In an embodiment, the emotion may be selected from an emote wheel comprising a plurality of representative emotions. The emote wheel may be divided into pie-shaped segments radiating from the center of the emote wheel, and each of the representative emotions may be represented as one of the pie-shaped segments. - In one embodiment, a random animation may be displayed based on the selected emotion. The random animation may comprise, for example, one of: at ease, cheer, disappointed, dance, looking, nervous, neutral, surprised, taunt, thinking, and walk. As discussed above, the representative emotions may be selected based on film genres. The emotions may comprise, for example, love, hate, funny, bored, happy, sad, mischievous, and scared.
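With eight emotes, the pie-segment geometry described in process 750 maps naturally to 45-degree slices. A sketch, assuming segment 0 starts at 0 degrees and the emotes are laid out in the order listed above (the actual layout in FIG. 6 may differ):

```python
# Emote order is taken from the list in the text; the on-screen layout
# starting angle is an assumption for illustration.
EMOTES = ["love", "hate", "funny", "bored",
          "happy", "sad", "mischievous", "scared"]

def emote_at_angle(degrees):
    """Map a controller-stick angle to one of eight 45-degree pie segments
    radiating from the center of the wheel."""
    segment = int(degrees % 360 // 45)
    return EMOTES[segment]

print(emote_at_angle(0))    # love
print(emote_at_angle(100))  # funny  (100 // 45 == 2)
print(emote_at_angle(350))  # scared (350 // 45 == 7)
```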
-
Process 760 illustrates receiving an indication of a selected media entity to be shared by the group and displaying a representation of the selected media entity on said user interface device, wherein the selected media object is selected by one of the users of the group deemed to be a host of the group. In an embodiment, the user is deemed the host of the group when the user initiated the formation of the group. -
FIG. 8 depicts an exemplary system for establishing a shared media experience in a group comprising two or more users, each user operating a respective computing device and communicatively coupled via a network. Referring to FIG. 8, system 800 comprises a processor 810 and memory 820. Memory 820 further comprises computer instructions for establishing a shared media experience in a group comprising two or more users, each user operating a respective computing device and communicatively coupled via a network. Block 822 illustrates instructions for communicating with the two or more users to form the group. Block 824 illustrates instructions for receiving a nomination identifying a selected media entity and sending data describing the nomination to each of the users in the group. - Block 826 illustrates displaying on said user interface device, along with a representation of the selected media entity, avatars representing the users of the group and associated avatar emotions, the avatars representing the one or more users and the avatar emotions comprising indications of emotions selected by the one or more users, the avatar emotion information sufficient to allow the computing device to render an animation that conveys the avatar emotion. Block 828 illustrates instructions for sending an indication of a selected media entity to be shared by the group and rendered on the computing devices of the users.
- Any of the above mentioned aspects can be implemented in methods, systems, computer readable media, or any type of manufacture. For example, per
FIG. 9 , a computer readable medium can store thereon computer executable instructions for establishing a shared media experience in a group comprising two or more users, each user operating a respective computing device and communicatively coupled via a network. Such media can comprise a first subset of instructions for receiving, on one of the computing devices, an input indicating a request to join the group 910; a second subset of instructions for sending the input to a shared media service provider and receiving an indication of acceptance into the group 912; a third subset of instructions for receiving data describing media entities selected by other members of the group and rendering representations of the received media entities on a user interface device coupled to said one of the computing devices 914; a fourth subset of instructions for receiving a nomination identifying one of the media entities and sending the nomination to the shared media service provider 916; a fifth subset of instructions for displaying on said user interface device, along with the representations of the received media entities, avatars representing the users of the group 918; a sixth subset of instructions for receiving an indication of an emotion from one of the users and, in response, causing the avatar corresponding to said one of the users to perform an animation on said user interface device that conveys the selected emotion 920; and a seventh subset of instructions for receiving an indication of a selected media entity to be shared by the group and displaying a representation of the selected media entity on said user interface device, wherein the selected media object is selected by one of the users of the group deemed to be a host of the group 922. 
It will be appreciated by those skilled in the art that additional sets of instructions can be used to capture the various other aspects disclosed herein, and that the presently disclosed subsets of instructions can vary in detail per the present disclosure. - It should be understood that the various techniques described herein may be implemented in connection with hardware or software or, where appropriate, with a combination of both. Thus, the methods and apparatus of the disclosure, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the disclosure. In the case of program code execution on programmable computers, the computing device generally includes a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. One or more programs may implement or utilize the processes described in connection with the disclosure, e.g., through the use of an application programming interface (API), reusable controls, or the like. Such programs are preferably implemented in a high level procedural or object oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language, and combined with hardware implementations.
- While the invention has been particularly shown and described with reference to a preferred embodiment thereof, it will be understood by those skilled in the art that various changes in form and detail may be made without departing from the scope of the present invention as set forth in the following claims. Furthermore, although elements of the invention may be described or claimed in the singular, the plural is contemplated unless limitation to the singular is explicitly stated.
Claims (20)
Priority Applications (11)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/551,403 US20100306671A1 (en) | 2009-05-29 | 2009-08-31 | Avatar Integrated Shared Media Selection |
CA2760238A CA2760238A1 (en) | 2009-05-29 | 2010-05-28 | Avatar integrated shared media selection |
KR1020117028447A KR20120030396A (en) | 2009-05-29 | 2010-05-28 | Avatar integrated shared media selection |
PCT/US2010/036539 WO2010138798A2 (en) | 2009-05-29 | 2010-05-28 | Avatar integrated shared media selection |
CN2010800246656A CN102450032B (en) | 2009-05-29 | 2010-05-28 | Avatar integrated shared media selection |
RU2011148384/08A RU2527199C2 (en) | 2009-05-29 | 2010-05-28 | Avatar integrated shared media selection |
EP10781264A EP2435977A4 (en) | 2009-05-29 | 2010-05-28 | Avatar integrated shared media selection |
BRPI1012087A BRPI1012087A2 (en) | 2009-05-29 | 2010-05-28 | shared media selection with integrated avatar |
JP2012513285A JP5603417B2 (en) | 2009-05-29 | 2010-05-28 | Shared media selection method and system integrated with avatar |
US14/188,334 US9118737B2 (en) | 2009-05-29 | 2014-02-24 | Avatar integrated shared media experience |
US14/833,713 US9423945B2 (en) | 2009-05-29 | 2015-08-24 | Avatar integrated shared media experience |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US21726809P | 2009-05-29 | 2009-05-29 | |
US12/551,403 US20100306671A1 (en) | 2009-05-29 | 2009-08-31 | Avatar Integrated Shared Media Selection |
Publications (1)
Publication Number | Publication Date |
---|---|
US20100306671A1 true US20100306671A1 (en) | 2010-12-02 |
Family
ID=43221691
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/551,339 Active 2030-08-17 US8661353B2 (en) | 2009-05-29 | 2009-08-31 | Avatar integrated shared media experience |
US12/551,403 Abandoned US20100306671A1 (en) | 2009-05-29 | 2009-08-31 | Avatar Integrated Shared Media Selection |
US14/188,334 Active US9118737B2 (en) | 2009-05-29 | 2014-02-24 | Avatar integrated shared media experience |
US14/833,713 Active US9423945B2 (en) | 2009-05-29 | 2015-08-24 | Avatar integrated shared media experience |
US15/234,812 Active US10368120B2 (en) | 2009-05-29 | 2016-08-11 | Avatar integrated shared media experience |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/551,339 Active 2030-08-17 US8661353B2 (en) | 2009-05-29 | 2009-08-31 | Avatar integrated shared media experience |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/188,334 Active US9118737B2 (en) | 2009-05-29 | 2014-02-24 | Avatar integrated shared media experience |
US14/833,713 Active US9423945B2 (en) | 2009-05-29 | 2015-08-24 | Avatar integrated shared media experience |
US15/234,812 Active US10368120B2 (en) | 2009-05-29 | 2016-08-11 | Avatar integrated shared media experience |
Country Status (9)
Country | Link |
---|---|
US (5) | US8661353B2 (en) |
EP (2) | EP2435976A4 (en) |
JP (2) | JP5701865B2 (en) |
KR (2) | KR101683936B1 (en) |
CN (2) | CN102450031B (en) |
BR (2) | BRPI1010562A2 (en) |
CA (2) | CA2760236A1 (en) |
RU (2) | RU2527746C2 (en) |
WO (2) | WO2010138734A2 (en) |
Cited By (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110265041A1 (en) * | 2010-04-23 | 2011-10-27 | Ganz | Radial user interface and system for a virtual world game |
CN102611925A (en) * | 2011-01-20 | 2012-07-25 | 华为终端有限公司 | Method and device for sharing information |
US20120226736A1 (en) * | 2011-03-04 | 2012-09-06 | Kabushiki Kaisha Toshiba | Method and system supporting mobile coalitions |
US20120233633A1 (en) * | 2011-03-09 | 2012-09-13 | Sony Corporation | Using image of video viewer to establish emotion rank of viewed video |
USD667416S1 (en) * | 2010-06-11 | 2012-09-18 | Microsoft Corporation | Display screen with graphical user interface |
US8306977B1 (en) * | 2011-10-31 | 2012-11-06 | Google Inc. | Method and system for tagging of content |
CN103517095A (en) * | 2012-06-18 | 2014-01-15 | 鸿富锦精密工业(深圳)有限公司 | A set top box which enables video synchronization sharing to be carried out and a video signal synchronization sharing method |
US8700643B1 (en) | 2010-11-03 | 2014-04-15 | Google Inc. | Managing electronic media collections |
US20140178043A1 (en) * | 2012-12-20 | 2014-06-26 | International Business Machines Corporation | Visual summarization of video for quick understanding |
WO2014120803A1 (en) * | 2013-01-31 | 2014-08-07 | Paramount Pictures Corporation | System and method for interactive remote movie watching, scheduling, and social connection |
US8909667B2 (en) | 2011-11-01 | 2014-12-09 | Lemi Technology, Llc | Systems, methods, and computer readable media for generating recommendations in a media recommendation system |
US9021370B1 (en) * | 2010-03-17 | 2015-04-28 | Amazon Technologies, Inc. | Collaborative chat room media player with recommendations |
WO2015073368A1 (en) | 2013-11-12 | 2015-05-21 | Highland Instruments, Inc. | Analysis suite |
US9047690B2 (en) | 2012-04-11 | 2015-06-02 | Myriata, Inc. | System and method for facilitating creation of a rich virtual environment |
US9292164B2 (en) | 2010-03-10 | 2016-03-22 | Onset Vi, L.P. | Virtual social supervenue for sharing multiple video streams |
US9310955B2 (en) | 2012-04-11 | 2016-04-12 | Myriata, Inc. | System and method for generating a virtual tour within a virtual environment |
US20160144278A1 (en) * | 2010-06-07 | 2016-05-26 | Affectiva, Inc. | Affect usage within a gaming context |
US9563902B2 (en) | 2012-04-11 | 2017-02-07 | Myriata, Inc. | System and method for transporting a virtual avatar within multiple virtual environments |
USD778922S1 (en) | 2012-08-07 | 2017-02-14 | Microsoft Corporation | Display screen with animated graphical user interface |
US9584851B2 (en) | 2012-08-29 | 2017-02-28 | Zte Corporation | Social television state synchronization method, system and terminal |
US20170289608A1 (en) * | 2015-06-16 | 2017-10-05 | Tencent Technology (Shenzhen) Company Limited | Message sharing method, client, and computer storage medium |
WO2018022977A1 (en) * | 2016-07-29 | 2018-02-01 | Everyscape, Inc. | Systems and methods for providing individual and/or synchronized virtual tours through a realm for a group of users |
US10091460B2 (en) * | 2008-03-31 | 2018-10-02 | Disney Enterprises, Inc. | Asynchronous online viewing party |
WO2018191720A1 (en) * | 2017-04-14 | 2018-10-18 | Penrose Studios, Inc. | System and method for spatial and immersive computing |
USD844648S1 (en) * | 2017-06-01 | 2019-04-02 | Sony Mobile Communications Inc. | Display screen with graphical user interface |
US10599304B2 (en) * | 2011-05-25 | 2020-03-24 | Sony Interactive Entertainment Inc. | Content player |
US10956113B2 (en) | 2012-06-25 | 2021-03-23 | Intel Corporation | Facilitation of concurrent consumption of media content by multiple users using superimposed animation |
US11153355B2 (en) | 2016-07-29 | 2021-10-19 | Smarter Systems, Inc. | Systems and methods for providing individual and/or synchronized virtual tours through a realm for a group of users |
US20220103873A1 (en) * | 2020-09-28 | 2022-03-31 | Gree, Inc. | Computer program, method, and server apparatus |
US20220132200A1 (en) * | 2020-10-13 | 2022-04-28 | Andrew Flessas | Method and system for displaying overlay graphics on television programs in response to viewer input and tracking and analyzing viewer inputs |
US20220150582A1 (en) * | 2020-11-11 | 2022-05-12 | Rovi Guides, Inc. | Systems and methods for providing media recommendations |
US11536796B2 (en) * | 2018-05-29 | 2022-12-27 | Tencent Technology (Shenzhen) Company Limited | Sound source determining method and apparatus, and storage medium |
US11552812B2 (en) * | 2020-06-19 | 2023-01-10 | Airbnb, Inc. | Outputting emotes based on audience member expressions in large-scale electronic presentation |
US20230029382A1 (en) * | 2018-04-13 | 2023-01-26 | Koji Yoden | Services over wireless communication with high flexibility and efficiency |
US11582269B2 (en) * | 2016-01-19 | 2023-02-14 | Nadejda Sarmova | Systems and methods for establishing a virtual shared experience for media playback |
USD984457S1 (en) | 2020-06-19 | 2023-04-25 | Airbnb, Inc. | Display screen of a programmed computer system with graphical user interface |
USD985005S1 (en) | 2020-06-19 | 2023-05-02 | Airbnb, Inc. | Display screen of a programmed computer system with graphical user interface |
US11641506B2 (en) | 2020-11-11 | 2023-05-02 | Rovi Guides, Inc. | Systems and methods for providing media recommendations |
US11758245B2 (en) | 2021-07-15 | 2023-09-12 | Dish Network L.L.C. | Interactive media events |
US11838450B2 (en) | 2020-02-26 | 2023-12-05 | Dish Network L.L.C. | Devices, systems and processes for facilitating watch parties |
US11849171B2 (en) | 2021-12-07 | 2023-12-19 | Dish Network L.L.C. | Deepfake content watch parties |
US20240064355A1 (en) * | 2022-08-19 | 2024-02-22 | Dish Network L.L.C. | User chosen watch parties |
US11974006B2 (en) | 2020-09-03 | 2024-04-30 | Dish Network Technologies India Private Limited | Live and recorded content watch parties |
US11974005B2 (en) | 2021-12-07 | 2024-04-30 | Dish Network L.L.C. | Cell phone content watch parties |
Families Citing this family (149)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006041885A (en) * | 2004-07-27 | 2006-02-09 | Sony Corp | Information processing apparatus and method therefor, recording medium and program |
US7459624B2 (en) | 2006-03-29 | 2008-12-02 | Harmonix Music Systems, Inc. | Game controller simulating a musical instrument |
EP2173444A2 (en) | 2007-06-14 | 2010-04-14 | Harmonix Music Systems, Inc. | Systems and methods for simulating a rock band experience |
US8678896B2 (en) | 2007-06-14 | 2014-03-25 | Harmonix Music Systems, Inc. | Systems and methods for asynchronous band interaction in a rhythm action game |
WO2010006054A1 (en) | 2008-07-08 | 2010-01-14 | Harmonix Music Systems, Inc. | Systems and methods for simulating a rock band experience |
US20100070858A1 (en) * | 2008-09-12 | 2010-03-18 | At&T Intellectual Property I, L.P. | Interactive Media System and Method Using Context-Based Avatar Configuration |
US8661353B2 (en) | 2009-05-29 | 2014-02-25 | Microsoft Corporation | Avatar integrated shared media experience |
US8465366B2 (en) | 2009-05-29 | 2013-06-18 | Harmonix Music Systems, Inc. | Biasing a musical performance input to a part |
US10410222B2 (en) | 2009-07-23 | 2019-09-10 | DISH Technologies L.L.C. | Messaging service for providing updates for multimedia content of a live event delivered over the internet |
US20110035683A1 (en) * | 2009-08-07 | 2011-02-10 | Larry Stead | Method and apparatus for synchronous, collaborative media consumption |
US10357714B2 (en) | 2009-10-27 | 2019-07-23 | Harmonix Music Systems, Inc. | Gesture-based user interface for navigating a menu |
US9981193B2 (en) | 2009-10-27 | 2018-05-29 | Harmonix Music Systems, Inc. | Movement based recognition and evaluation |
US8760469B2 (en) | 2009-11-06 | 2014-06-24 | At&T Intellectual Property I, L.P. | Apparatus and method for managing marketing |
US8387088B2 (en) * | 2009-11-13 | 2013-02-26 | At&T Intellectual Property I, Lp | Method and apparatus for presenting media programs |
US20110154210A1 (en) * | 2009-12-18 | 2011-06-23 | Sung Jung-Sik | Multiple user terminal device which multiple users watch simultaneously, server for managing multiple users' usage of contents and method for managing multiple users and contents which multiple users watch simultaneously |
US20120327091A1 (en) * | 2010-03-08 | 2012-12-27 | Nokia Corporation | Gestural Messages in Social Phonebook |
US8667402B2 (en) * | 2010-03-10 | 2014-03-04 | Onset Vi, L.P. | Visualizing communications within a social setting |
US20110225515A1 (en) * | 2010-03-10 | 2011-09-15 | Oddmobb, Inc. | Sharing emotional reactions to social media |
US20110225039A1 (en) * | 2010-03-10 | 2011-09-15 | Oddmobb, Inc. | Virtual social venue feeding multiple video streams |
US20110225519A1 (en) * | 2010-03-10 | 2011-09-15 | Oddmobb, Inc. | Social media platform for simulating a live experience |
US20110225518A1 (en) * | 2010-03-10 | 2011-09-15 | Oddmobb, Inc. | Friends toolbar for a virtual social venue |
US8874243B2 (en) | 2010-03-16 | 2014-10-28 | Harmonix Music Systems, Inc. | Simulating musical instruments |
US8694553B2 (en) | 2010-06-07 | 2014-04-08 | Gary Stephen Shuster | Creation and use of virtual places |
US8562403B2 (en) | 2010-06-11 | 2013-10-22 | Harmonix Music Systems, Inc. | Prompting a player of a dance game |
US9358456B1 (en) | 2010-06-11 | 2016-06-07 | Harmonix Music Systems, Inc. | Dance competition game |
EP2579955B1 (en) | 2010-06-11 | 2020-07-08 | Harmonix Music Systems, Inc. | Dance game and tutorial |
US8583091B1 (en) | 2010-09-06 | 2013-11-12 | Sprint Communications Company L.P. | Dynamic loading, unloading, and caching of alternate complete interfaces |
US9024166B2 (en) | 2010-09-09 | 2015-05-05 | Harmonix Music Systems, Inc. | Preventing subtractive track separation |
TW201232280A (en) * | 2011-01-20 | 2012-08-01 | Hon Hai Prec Ind Co Ltd | System and method for sharing desktop information |
US8559933B1 (en) | 2011-02-08 | 2013-10-15 | Sprint Communications Company L.P. | System and method for ID platform |
US9123062B1 (en) | 2011-02-18 | 2015-09-01 | Sprint Communications Company L.P. | Ad sponsored interface pack |
US8848024B2 (en) | 2011-03-08 | 2014-09-30 | CSC Holdings, LLC | Virtual communal television viewing |
US9043446B1 (en) * | 2011-03-10 | 2015-05-26 | Sprint Communications Company L.P. | Mirroring device interface components for content sharing |
US8949333B2 (en) * | 2011-05-20 | 2015-02-03 | Alejandro Backer | Systems and methods for virtual interactions |
US8972592B1 (en) | 2011-05-27 | 2015-03-03 | Sprint Communications Company L.P. | Extending an interface pack to a computer system |
JP5783863B2 (en) * | 2011-09-20 | 2015-09-24 | 株式会社ジョイプロモーション | Server device, terminal and program |
US9619810B1 (en) | 2011-10-11 | 2017-04-11 | Sprint Communications Company L.P. | Zone architecture for dynamic targeted content creation |
US9473809B2 (en) * | 2011-11-29 | 2016-10-18 | At&T Intellectual Property I, L.P. | Method and apparatus for providing personalized content |
EP2792123B1 (en) | 2011-12-06 | 2017-09-27 | Echostar Technologies L.L.C. | Remote storage digital video recorder and related operating methods |
US9782680B2 (en) * | 2011-12-09 | 2017-10-10 | Futurewei Technologies, Inc. | Persistent customized social media environment |
US9245020B2 (en) * | 2011-12-14 | 2016-01-26 | Microsoft Technology Licensing, Llc | Collaborative media sharing |
US20130159126A1 (en) * | 2011-12-16 | 2013-06-20 | Amr Elkady | With-me social interactivity platform |
WO2013095512A1 (en) * | 2011-12-22 | 2013-06-27 | Intel Corporation | Collaborative entertainment platform |
CN103200430B (en) * | 2012-01-04 | 2017-05-31 | 华为终端有限公司 | personal content sharing method, system, server and terminal device |
KR101951761B1 (en) * | 2012-01-27 | 2019-02-25 | 라인 가부시키가이샤 | System and method for providing avatar in service provided in mobile environment |
US9348430B2 (en) | 2012-02-06 | 2016-05-24 | Steelseries Aps | Method and apparatus for transitioning in-process applications to remote devices |
US10913003B2 (en) | 2012-03-13 | 2021-02-09 | Sony Interactive Entertainment LLC | Mini-games accessed through a sharing interface |
US9345966B2 (en) | 2012-03-13 | 2016-05-24 | Sony Interactive Entertainment America Llc | Sharing recorded gameplay to a social graph |
US11406906B2 (en) | 2012-03-13 | 2022-08-09 | Sony Interactive Entertainment LLC | Network connected controller for direct to cloud gaming |
US8843122B1 (en) | 2012-06-29 | 2014-09-23 | Sprint Communications Company L.P. | Mobile phone controls preprocessor |
US9413839B2 (en) | 2012-07-31 | 2016-08-09 | Sprint Communications Company L.P. | Traffic management of third party applications |
US9183412B2 (en) | 2012-08-10 | 2015-11-10 | Sprint Communications Company L.P. | Systems and methods for provisioning and using multiple trusted security zones on an electronic device |
CN102831309A (en) * | 2012-08-17 | 2012-12-19 | 广州多益网络科技有限公司 | Virtual cinema interaction system and method |
US9442709B1 (en) | 2012-10-24 | 2016-09-13 | Sprint Communications Company L.P. | Transition experience during loading and updating an interface and applications pack |
KR20140061620A (en) * | 2012-11-13 | 2014-05-22 | 삼성전자주식회사 | System and method for providing social network service using augmented reality, and devices |
EP2745893B1 (en) * | 2012-12-21 | 2019-03-20 | Sony Computer Entertainment America LLC | Automatic generation of suggested mini-games for cloud-gaming based on recorded gameplay |
US9352226B2 (en) | 2012-12-21 | 2016-05-31 | Sony Interactive Entertainment America Llc | Automatic generation of suggested mini-games for cloud-gaming based on recorded gameplay |
US10104141B2 (en) | 2012-12-31 | 2018-10-16 | DISH Technologies L.L.C. | Methods and apparatus for proactive multi-path routing |
US10708319B2 (en) * | 2012-12-31 | 2020-07-07 | Dish Technologies Llc | Methods and apparatus for providing social viewing of media content |
US10051025B2 (en) | 2012-12-31 | 2018-08-14 | DISH Technologies L.L.C. | Method and apparatus for estimating packet loss |
US9510055B2 (en) | 2013-01-23 | 2016-11-29 | Sonos, Inc. | System and method for a media experience social interface |
US20140214504A1 (en) * | 2013-01-31 | 2014-07-31 | Sony Corporation | Virtual meeting lobby for waiting for online event |
US10220303B1 (en) | 2013-03-15 | 2019-03-05 | Harmonix Music Systems, Inc. | Gesture-based music game |
US9987552B2 (en) * | 2013-06-26 | 2018-06-05 | Smilegate, Inc. | Method and system for expressing emotion during game play |
US11130055B2 (en) | 2013-09-04 | 2021-09-28 | Nvidia Corporation | System and method for granting remote access to a video game executed on a video game console or network client |
KR20150055528A (en) * | 2013-11-13 | 2015-05-21 | 삼성전자주식회사 | display apparatus and user interface screen providing method thereof |
US9299194B2 (en) | 2014-02-14 | 2016-03-29 | Osterhout Group, Inc. | Secure sharing in head worn computing |
US9753288B2 (en) | 2014-01-21 | 2017-09-05 | Osterhout Group, Inc. | See-through computer display systems |
CA156714S (en) | 2014-01-28 | 2019-08-02 | Jvl Ventures Llc | Handheld electronic device |
US9513888B1 (en) | 2014-01-30 | 2016-12-06 | Sprint Communications Company L.P. | Virtual preloads |
US20150221112A1 (en) * | 2014-02-04 | 2015-08-06 | Microsoft Corporation | Emotion Indicators in Content |
US20150220498A1 (en) | 2014-02-05 | 2015-08-06 | Sonos, Inc. | Remote Creation of a Playback Queue for a Future Event |
US9679054B2 (en) | 2014-03-05 | 2017-06-13 | Sonos, Inc. | Webpage media playback |
USD748669S1 (en) * | 2014-03-17 | 2016-02-02 | Lg Electronics Inc. | Display panel with transitional graphical user interface |
USD748671S1 (en) * | 2014-03-17 | 2016-02-02 | Lg Electronics Inc. | Display panel with transitional graphical user interface |
USD748670S1 (en) * | 2014-03-17 | 2016-02-02 | Lg Electronics Inc. | Display panel with transitional graphical user interface |
USD757093S1 (en) * | 2014-03-17 | 2016-05-24 | Lg Electronics Inc. | Display panel with transitional graphical user interface |
USD748134S1 (en) * | 2014-03-17 | 2016-01-26 | Lg Electronics Inc. | Display panel with transitional graphical user interface |
US20150296033A1 (en) * | 2014-04-15 | 2015-10-15 | Edward K. Y. Jung | Life Experience Enhancement Via Temporally Appropriate Communique |
US10600245B1 (en) | 2014-05-28 | 2020-03-24 | Lucasfilm Entertainment Company Ltd. | Navigating a virtual environment of a media content item |
US20150356084A1 (en) | 2014-06-05 | 2015-12-10 | Sonos, Inc. | Social Queue |
US20150373171A1 (en) * | 2014-06-18 | 2015-12-24 | Verizon Patent And Licensing Inc. | Systems and Methods for Providing Access to Hosted Services by way of a User-Wearable Headset Computing Device |
USD759666S1 (en) | 2014-06-23 | 2016-06-21 | Google Inc. | Display screen or portion thereof with an animated graphical user interface |
USD807898S1 (en) | 2014-07-15 | 2018-01-16 | Google Llc | Display screen or portion thereof with an animated graphical user interface |
US9874997B2 (en) | 2014-08-08 | 2018-01-23 | Sonos, Inc. | Social playback queues |
CN104219785B (en) | 2014-08-20 | 2018-07-24 | 小米科技有限责任公司 | Real-time video providing method, device and server, terminal device |
EP3114625A1 (en) | 2014-09-24 | 2017-01-11 | Sonos, Inc. | Social media connection recommendations based on playback information |
US9667679B2 (en) | 2014-09-24 | 2017-05-30 | Sonos, Inc. | Indicating an association between a social-media account and a media playback system |
USD761300S1 (en) * | 2014-11-26 | 2016-07-12 | Amazon Technologies, Inc. | Display screen or portion thereof with an animated graphical user interface |
USD760274S1 (en) * | 2014-11-26 | 2016-06-28 | Amazon Technologies, Inc. | Display screen or portion thereof with an animated graphical user interface |
USD761845S1 (en) * | 2014-11-26 | 2016-07-19 | Amazon Technologies, Inc. | Display screen or portion thereof with an animated graphical user interface |
US20160212180A1 (en) * | 2015-01-21 | 2016-07-21 | Ryan S. Menezes | Shared Scene Object Synchronization |
WO2016158075A1 (en) * | 2015-04-02 | 2016-10-06 | 株式会社ディー・エヌ・エー | System, method, and program for distributing realtime motion video |
US10252171B2 (en) * | 2015-04-02 | 2019-04-09 | Nvidia Corporation | System and method for cooperative game control |
CN107430429B (en) * | 2015-04-07 | 2022-02-18 | 英特尔公司 | Avatar keyboard |
US9483253B1 (en) | 2015-04-30 | 2016-11-01 | Sprint Communications Company L.P. | Methods for customization of default applications on a mobile communication device |
CN104918124B (en) * | 2015-05-11 | 2017-12-08 | 腾讯科技(北京)有限公司 | Living broadcast interactive system, method for sending information, message receiving method and device |
US9631932B2 (en) | 2015-06-05 | 2017-04-25 | Nokia Technologies Oy | Crowd sourced interaction of browsing behavior in a 3D map |
JP6718169B2 (en) * | 2015-07-07 | 2020-07-08 | 学校法人幾徳学園 | Information presentation system, information presentation device and program |
USD789983S1 (en) * | 2015-08-12 | 2017-06-20 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with animated graphical user interface |
CN105228013B (en) * | 2015-09-28 | 2018-09-07 | 百度在线网络技术(北京)有限公司 | Barrage information processing method, device and barrage video player |
EP3398337A1 (en) | 2015-12-29 | 2018-11-07 | Dish Technologies L.L.C. | Remote storage digital video recorder streaming and related methods |
KR102532071B1 (en) | 2015-12-30 | 2023-05-15 | 삼성전자주식회사 | Display apparatus, user terminal, control method thereof, computer-readable medium, system thereof |
JP6220917B2 (en) * | 2016-04-19 | 2017-10-25 | 株式会社 ディー・エヌ・エー | System, method, and program for delivering real-time video |
CN105933790A (en) * | 2016-04-29 | 2016-09-07 | 乐视控股(北京)有限公司 | Video play method, device and system based on virtual movie theater |
CN105898522A (en) * | 2016-05-11 | 2016-08-24 | 乐视控股(北京)有限公司 | Method, device and system for processing barrage information |
US10762429B2 (en) * | 2016-05-18 | 2020-09-01 | Microsoft Technology Licensing, Llc | Emotional/cognitive state presentation |
US10154191B2 (en) | 2016-05-18 | 2018-12-11 | Microsoft Technology Licensing, Llc | Emotional/cognitive state-triggered recording |
CN109688909B (en) * | 2016-05-27 | 2022-07-01 | 詹森药业有限公司 | System and method for assessing cognitive and emotional states of real-world users based on virtual world activity |
US10848899B2 (en) * | 2016-10-13 | 2020-11-24 | Philip Scott Lyren | Binaural sound in visual entertainment media |
US10499178B2 (en) * | 2016-10-14 | 2019-12-03 | Disney Enterprises, Inc. | Systems and methods for achieving multi-dimensional audio fidelity |
US10432559B2 (en) | 2016-10-24 | 2019-10-01 | Snap Inc. | Generating and displaying customized avatars in electronic messages |
US20180157388A1 (en) * | 2016-12-02 | 2018-06-07 | Google Inc. | Emotion expression in virtual environment |
US11455549B2 (en) | 2016-12-08 | 2022-09-27 | Disney Enterprises, Inc. | Modeling characters that interact with users as part of a character-as-a-service implementation |
US10334283B2 (en) * | 2017-02-09 | 2019-06-25 | Nanning Fugui Precision Industrial Co., Ltd. | Interactive system for virtual cinema and method |
US10326809B2 (en) * | 2017-02-09 | 2019-06-18 | Nanning Fugui Precision Industrial Co., Ltd. | Interactive system for virtual cinema and method |
US10418813B1 (en) * | 2017-04-01 | 2019-09-17 | Smart Power Partners LLC | Modular power adapters and methods of implementing modular power adapters |
US11893647B2 (en) | 2017-04-27 | 2024-02-06 | Snap Inc. | Location-based virtual avatars |
US10542300B2 (en) * | 2017-05-31 | 2020-01-21 | Verizon Patent And Licensing Inc. | Methods and systems for customizing virtual reality data |
CN108173742B (en) * | 2017-12-08 | 2021-08-24 | 腾讯科技(深圳)有限公司 | Image data processing method and device |
US10506287B2 (en) * | 2018-01-04 | 2019-12-10 | Facebook, Inc. | Integration of live streaming content with television programming |
US11102255B2 (en) * | 2018-04-27 | 2021-08-24 | Filmio, Inc. | Project creation and distribution system |
US11113884B2 (en) * | 2018-07-30 | 2021-09-07 | Disney Enterprises, Inc. | Techniques for immersive virtual reality experiences |
US10628115B2 (en) | 2018-08-21 | 2020-04-21 | Facebook Technologies, Llc | Synchronization of digital content consumption |
US11245658B2 (en) * | 2018-09-28 | 2022-02-08 | Snap Inc. | System and method of generating private notifications between users in a communication session |
US10631052B1 (en) * | 2018-10-02 | 2020-04-21 | International Business Machines Corporation | Streaming content based on rules for watching as a group |
CN109525902A (en) * | 2018-11-15 | 2019-03-26 | 贵阳语玩科技有限公司 | Method and device for real-time video sharing among multiple users |
US10902661B1 (en) * | 2018-11-28 | 2021-01-26 | Snap Inc. | Dynamic composite user identifier |
JP7277710B2 (en) * | 2019-02-04 | 2023-05-19 | 株式会社Mixi | Shared information processing device and control program |
EP3745832B1 (en) * | 2019-05-27 | 2023-05-03 | AT & S Austria Technologie & Systemtechnik Aktiengesellschaft | Anisotropic etching using photopolymerizable compound |
US11956231B1 (en) * | 2019-05-30 | 2024-04-09 | Apple Inc. | Authority transfer for virtual objects in shared computer-generated reality environments |
JP7001645B2 (en) * | 2019-07-26 | 2022-01-19 | 株式会社リンクコーポレイトコミュニケーションズ | Information processing equipment, terminal equipment, information processing methods, and programs |
CN110662083B (en) * | 2019-09-30 | 2022-04-22 | 北京达佳互联信息技术有限公司 | Data processing method and device, electronic equipment and storage medium |
US11991419B2 (en) * | 2020-01-30 | 2024-05-21 | Snap Inc. | Selecting avatars to be included in the video being generated on demand |
US11356720B2 (en) | 2020-01-30 | 2022-06-07 | Snap Inc. | Video generation system to render frames on demand |
US11036781B1 (en) | 2020-01-30 | 2021-06-15 | Snap Inc. | Video generation system to render frames on demand using a fleet of servers |
WO2021155249A1 (en) | 2020-01-30 | 2021-08-05 | Snap Inc. | System for generating media content items on demand |
US11284144B2 (en) | 2020-01-30 | 2022-03-22 | Snap Inc. | Video generation system to render frames on demand using a fleet of GPUs |
CN111459355B (en) * | 2020-03-30 | 2022-03-04 | 维沃移动通信有限公司 | Content sharing method and electronic equipment |
US11812090B2 (en) * | 2020-04-09 | 2023-11-07 | Caavo Inc | System and method for social multi-platform media playback synchronization |
CN113630371A (en) * | 2020-05-08 | 2021-11-09 | 珠海金山办公软件有限公司 | Method, device, computer storage medium and terminal for realizing information processing |
US11533531B1 (en) * | 2020-06-18 | 2022-12-20 | Nema Link | Streaming and synchronization of media |
WO2022046664A1 (en) * | 2020-08-23 | 2022-03-03 | Evasyst, Inc. | Electronic file presentation in a network environment |
GB2598577A (en) | 2020-09-02 | 2022-03-09 | Sony Interactive Entertainment Inc | User input method and apparatus |
US20220116227A1 (en) * | 2020-10-09 | 2022-04-14 | Unho Choi | Chain of authentication using public key infrastructure |
US11223800B1 (en) | 2020-11-03 | 2022-01-11 | International Business Machines Corporation | Selective reaction obfuscation |
WO2022246376A1 (en) * | 2021-05-15 | 2022-11-24 | Apple Inc. | User interfaces for media sharing and communication sessions |
US11671657B2 (en) * | 2021-06-30 | 2023-06-06 | Rovi Guides, Inc. | Method and apparatus for shared viewing of media content |
WO2023182667A1 (en) * | 2022-03-21 | 2023-09-28 | 삼성전자주식회사 | Display device and control method thereof |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6064383A (en) * | 1996-10-04 | 2000-05-16 | Microsoft Corporation | Method and system for selecting an emotional appearance and prosody for a graphical character |
US20050132288A1 (en) * | 2003-12-12 | 2005-06-16 | Kirn Kevin N. | System and method for realtime messaging having image sharing feature |
US20060288074A1 (en) * | 2005-09-09 | 2006-12-21 | Outland Research, Llc | System, Method and Computer Program Product for Collaborative Broadcast Media |
US20070260984A1 (en) * | 2006-05-07 | 2007-11-08 | Sony Computer Entertainment Inc. | Methods for interactive communications with real time effects and avatar environment interaction |
US20070271338A1 (en) * | 2006-05-18 | 2007-11-22 | Thomas Anschutz | Methods, systems, and products for synchronizing media experiences |
US20080059570A1 (en) * | 2006-09-05 | 2008-03-06 | Aol Llc | Enabling an im user to navigate a virtual world |
US20080215972A1 (en) * | 2007-03-01 | 2008-09-04 | Sony Computer Entertainment America Inc. | Mapping user emotional state to avatar in a virtual world |
US20080215975A1 (en) * | 2007-03-01 | 2008-09-04 | Phil Harrison | Virtual world user opinion & response monitoring |
US20080275769A1 (en) * | 2007-05-04 | 2008-11-06 | Shao Billy Jye-En | Network-based interactive entertainment center |
US20090063995A1 (en) * | 2007-08-27 | 2009-03-05 | Samuel Pierce Baron | Real Time Online Interaction Platform |
US20090094656A1 (en) * | 2007-10-03 | 2009-04-09 | Carlucci John B | System, method, and apparatus for connecting non-co-located video content viewers in virtual TV rooms for a shared participatory viewing experience |
US20090109213A1 (en) * | 2007-10-24 | 2009-04-30 | Hamilton Ii Rick A | Arrangements for enhancing multimedia features in a virtual universe |
US20090328122A1 (en) * | 2008-06-25 | 2009-12-31 | At&T Corp. | Method and apparatus for presenting media programs |
US20100205628A1 (en) * | 2009-02-12 | 2010-08-12 | Davis Bruce L | Media processing methods and arrangements |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5808662A (en) | 1995-11-08 | 1998-09-15 | Silicon Graphics, Inc. | Synchronized, interactive playback of digital movies across a network |
JPH10208240A (en) | 1997-01-23 | 1998-08-07 | Mitsubishi Paper Mills Ltd | Magnetic recording sheet and its production |
JP2000040088A (en) * | 1998-07-23 | 2000-02-08 | Nippon Telegr & Teleph Corp <Ntt> | Method and system for providing information in three-dimensionally shared virtual space and storage medium storing information provision program in three-dimensionally shared virtual space |
US6409599B1 (en) | 1999-07-19 | 2002-06-25 | Ham On Rye Technologies, Inc. | Interactive virtual reality performance theater entertainment system |
JP2003514307A (en) * | 1999-11-11 | 2003-04-15 | ユナイテッド バーチャリティーズ インク. | Computer advertising method and system |
US20020042834A1 (en) | 2000-10-10 | 2002-04-11 | Reelscore, Llc | Network music and video distribution and synchronization system |
US6657339B2 (en) | 2000-11-09 | 2003-12-02 | Seagate Technology Llc | Method for setting gaps in hydrodynamic bearings |
JP2003337776A (en) * | 2002-05-17 | 2003-11-28 | Nippon Telegraph & Telephone West Corp | Content delivery device, content sharing method in the device, and content delivery program |
JP4281306B2 (en) * | 2002-07-31 | 2009-06-17 | ソニー株式会社 | Information providing system, information providing method, information processing apparatus, information processing method, and computer program |
CN100524330C (en) | 2003-02-06 | 2009-08-05 | 诺基亚有限公司 | System and method for locally sharing subscription of multimedia content |
EP1629359A4 (en) | 2003-04-07 | 2008-01-09 | Sevenecho Llc | Method, system and software for digital media narrative personalization |
GB0318290D0 (en) | 2003-08-05 | 2003-09-10 | Koninkl Philips Electronics Nv | Shared experience of media content |
US7512882B2 (en) * | 2004-01-05 | 2009-03-31 | Microsoft Corporation | Systems and methods for providing alternate views when rendering audio/video content in a computing system |
JP4236606B2 (en) * | 2004-03-22 | 2009-03-11 | シャープ株式会社 | COMMUNICATION TERMINAL, TELEVISION RECEIVER, INFORMATION OUTPUT METHOD, INFORMATION OUTPUT PROGRAM, AND RECORDING MEDIUM CONTAINING INFORMATION OUTPUT PROGRAM |
US7370189B2 (en) | 2004-09-30 | 2008-05-06 | Intel Corporation | Method and apparatus for establishing safe processor operating points in connection with a secure boot |
US7669219B2 (en) * | 2005-04-15 | 2010-02-23 | Microsoft Corporation | Synchronized media experience |
JP2007115117A (en) * | 2005-10-21 | 2007-05-10 | Pc Net Kk | Content providing server and content viewing system |
US20090132284A1 (en) | 2005-12-16 | 2009-05-21 | Fey Christopher T | Customizable Prevention Plan Platform, Expert System and Method |
KR20070083269A (en) | 2006-02-06 | 2007-08-24 | 주식회사 케이티 | System and method for providing movie contents service by using virtual reality |
US20070196802A1 (en) | 2006-02-21 | 2007-08-23 | Nokia Corporation | Visually Enhanced Personal Music Broadcast |
NZ571345A (en) * | 2006-03-17 | 2011-10-28 | Sony Corp | Organising group content presentations by downloading content to participants' systems |
KR20080001073A (en) | 2006-06-29 | 2008-01-03 | 주식회사 케이티 | System for sharing multilateral media, method for sharing and method for sharing control thereof |
WO2008109299A2 (en) * | 2007-03-01 | 2008-09-12 | Sony Computer Entertainment America Inc. | System and method for communicating with a virtual world |
GB2450473A (en) * | 2007-06-04 | 2008-12-31 | Sony Comp Entertainment Europe | A Server in a Peer to Peer system selecting and notifying a device that it is to become a member of a peer group |
US20090044216A1 (en) * | 2007-08-08 | 2009-02-12 | Mcnicoll Marcel | Internet-Based System for Interactive Synchronized Shared Viewing of Video Content |
US8141115B2 (en) * | 2008-12-17 | 2012-03-20 | At&T Labs, Inc. | Systems and methods for multiple media coordination |
US8661353B2 (en) | 2009-05-29 | 2014-02-25 | Microsoft Corporation | Avatar integrated shared media experience |
- 2009
- 2009-08-31 US US12/551,339 patent/US8661353B2/en active Active
- 2009-08-31 US US12/551,403 patent/US20100306671A1/en not_active Abandoned
- 2010
- 2010-05-27 RU RU2011148387/08A patent/RU2527746C2/en active
- 2010-05-27 JP JP2012513259A patent/JP5701865B2/en active Active
- 2010-05-27 KR KR1020117028427A patent/KR101683936B1/en active IP Right Grant
- 2010-05-27 WO PCT/US2010/036428 patent/WO2010138734A2/en active Application Filing
- 2010-05-27 EP EP10781225.7A patent/EP2435976A4/en not_active Withdrawn
- 2010-05-27 CN CN201080024658.6A patent/CN102450031B/en active Active
- 2010-05-27 CA CA2760236A patent/CA2760236A1/en not_active Abandoned
- 2010-05-27 BR BRPI1010562A patent/BRPI1010562A2/en not_active Application Discontinuation
- 2010-05-28 RU RU2011148384/08A patent/RU2527199C2/en not_active IP Right Cessation
- 2010-05-28 WO PCT/US2010/036539 patent/WO2010138798A2/en active Application Filing
- 2010-05-28 CN CN2010800246656A patent/CN102450032B/en active Active
- 2010-05-28 EP EP10781264A patent/EP2435977A4/en not_active Withdrawn
- 2010-05-28 BR BRPI1012087A patent/BRPI1012087A2/en not_active IP Right Cessation
- 2010-05-28 CA CA2760238A patent/CA2760238A1/en not_active Abandoned
- 2010-05-28 KR KR1020117028447A patent/KR20120030396A/en not_active Application Discontinuation
- 2010-05-28 JP JP2012513285A patent/JP5603417B2/en active Active
2014
- 2014-02-24 US US14/188,334 patent/US9118737B2/en active Active
2015
- 2015-08-24 US US14/833,713 patent/US9423945B2/en active Active
2016
- 2016-08-11 US US15/234,812 patent/US10368120B2/en active Active
Patent Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6064383A (en) * | 1996-10-04 | 2000-05-16 | Microsoft Corporation | Method and system for selecting an emotional appearance and prosody for a graphical character |
US20050132288A1 (en) * | 2003-12-12 | 2005-06-16 | Kirn Kevin N. | System and method for realtime messaging having image sharing feature |
US7562117B2 (en) * | 2005-09-09 | 2009-07-14 | Outland Research, Llc | System, method and computer program product for collaborative broadcast media |
US20060288074A1 (en) * | 2005-09-09 | 2006-12-21 | Outland Research, Llc | System, Method and Computer Program Product for Collaborative Broadcast Media |
US20070260984A1 (en) * | 2006-05-07 | 2007-11-08 | Sony Computer Entertainment Inc. | Methods for interactive communications with real time effects and avatar environment interaction |
US20070271338A1 (en) * | 2006-05-18 | 2007-11-22 | Thomas Anschutz | Methods, systems, and products for synchronizing media experiences |
US20080059570A1 (en) * | 2006-09-05 | 2008-03-06 | Aol Llc | Enabling an im user to navigate a virtual world |
US20080215972A1 (en) * | 2007-03-01 | 2008-09-04 | Sony Computer Entertainment America Inc. | Mapping user emotional state to avatar in a virtual world |
US20080215975A1 (en) * | 2007-03-01 | 2008-09-04 | Phil Harrison | Virtual world user opinion & response monitoring |
US20080275769A1 (en) * | 2007-05-04 | 2008-11-06 | Shao Billy Jye-En | Network-based interactive entertainment center |
US20090063995A1 (en) * | 2007-08-27 | 2009-03-05 | Samuel Pierce Baron | Real Time Online Interaction Platform |
US20090094656A1 (en) * | 2007-10-03 | 2009-04-09 | Carlucci John B | System, method, and apparatus for connecting non-co-located video content viewers in virtual TV rooms for a shared participatory viewing experience |
US20090109213A1 (en) * | 2007-10-24 | 2009-04-30 | Hamilton Ii Rick A | Arrangements for enhancing multimedia features in a virtual universe |
US20090328122A1 (en) * | 2008-06-25 | 2009-12-31 | At&T Corp. | Method and apparatus for presenting media programs |
US20100205628A1 (en) * | 2009-02-12 | 2010-08-12 | Davis Bruce L | Media processing methods and arrangements |
Cited By (71)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180367759A1 (en) * | 2008-03-31 | 2018-12-20 | Disney Enterprises, Inc. | Asynchronous Online Viewing Party |
US11233972B2 (en) * | 2008-03-31 | 2022-01-25 | Disney Enterprises, Inc. | Asynchronous online viewing party |
US10091460B2 (en) * | 2008-03-31 | 2018-10-02 | Disney Enterprises, Inc. | Asynchronous online viewing party |
US9292163B2 (en) | 2010-03-10 | 2016-03-22 | Onset Vi, L.P. | Personalized 3D avatars in a virtual social venue |
US9292164B2 (en) | 2010-03-10 | 2016-03-22 | Onset Vi, L.P. | Virtual social supervenue for sharing multiple video streams |
US9021370B1 (en) * | 2010-03-17 | 2015-04-28 | Amazon Technologies, Inc. | Collaborative chat room media player with recommendations |
US20110265041A1 (en) * | 2010-04-23 | 2011-10-27 | Ganz | Radial user interface and system for a virtual world game |
US9050534B2 (en) | 2010-04-23 | 2015-06-09 | Ganz | Achievements for a virtual world game |
US8719730B2 (en) * | 2010-04-23 | 2014-05-06 | Ganz | Radial user interface and system for a virtual world game |
US20160144278A1 (en) * | 2010-06-07 | 2016-05-26 | Affectiva, Inc. | Affect usage within a gaming context |
US10843078B2 (en) * | 2010-06-07 | 2020-11-24 | Affectiva, Inc. | Affect usage within a gaming context |
USD667416S1 (en) * | 2010-06-11 | 2012-09-18 | Microsoft Corporation | Display screen with graphical user interface |
US8700643B1 (en) | 2010-11-03 | 2014-04-15 | Google Inc. | Managing electronic media collections |
CN102611925B (en) * | 2011-01-20 | 2014-08-13 | 华为终端有限公司 | Method and device for sharing information |
CN102611925A (en) * | 2011-01-20 | 2012-07-25 | 华为终端有限公司 | Method and device for sharing information |
US8849900B2 (en) * | 2011-03-04 | 2014-09-30 | Telcordia Technologies, Inc. | Method and system supporting mobile coalitions |
US20120226736A1 (en) * | 2011-03-04 | 2012-09-06 | Kabushiki Kaisha Toshiba | Method and system supporting mobile coalitions |
US20120233633A1 (en) * | 2011-03-09 | 2012-09-13 | Sony Corporation | Using image of video viewer to establish emotion rank of viewed video |
US10599304B2 (en) * | 2011-05-25 | 2020-03-24 | Sony Interactive Entertainment Inc. | Content player |
US8306977B1 (en) * | 2011-10-31 | 2012-11-06 | Google Inc. | Method and system for tagging of content |
US10163090B1 (en) | 2011-10-31 | 2018-12-25 | Google Llc | Method and system for tagging of content |
US9015109B2 (en) | 2011-11-01 | 2015-04-21 | Lemi Technology, Llc | Systems, methods, and computer readable media for maintaining recommendations in a media recommendation system |
US8909667B2 (en) | 2011-11-01 | 2014-12-09 | Lemi Technology, Llc | Systems, methods, and computer readable media for generating recommendations in a media recommendation system |
US9047690B2 (en) | 2012-04-11 | 2015-06-02 | Myriata, Inc. | System and method for facilitating creation of a rich virtual environment |
US9563902B2 (en) | 2012-04-11 | 2017-02-07 | Myriata, Inc. | System and method for transporting a virtual avatar within multiple virtual environments |
US9310955B2 (en) | 2012-04-11 | 2016-04-12 | Myriata, Inc. | System and method for generating a virtual tour within a virtual environment |
CN103517095A (en) * | 2012-06-18 | 2014-01-15 | 鸿富锦精密工业(深圳)有限公司 | A set top box which enables video synchronization sharing to be carried out and a video signal synchronization sharing method |
US11789686B2 (en) | 2012-06-25 | 2023-10-17 | Intel Corporation | Facilitation of concurrent consumption of media content by multiple users using superimposed animation |
US11526323B2 (en) | 2012-06-25 | 2022-12-13 | Intel Corporation | Facilitation of concurrent consumption of media content by multiple users using superimposed animation |
US10956113B2 (en) | 2012-06-25 | 2021-03-23 | Intel Corporation | Facilitation of concurrent consumption of media content by multiple users using superimposed animation |
USD778922S1 (en) | 2012-08-07 | 2017-02-14 | Microsoft Corporation | Display screen with animated graphical user interface |
US9584851B2 (en) | 2012-08-29 | 2017-02-28 | Zte Corporation | Social television state synchronization method, system and terminal |
US20140178043A1 (en) * | 2012-12-20 | 2014-06-26 | International Business Machines Corporation | Visual summarization of video for quick understanding |
US9961403B2 (en) * | 2012-12-20 | 2018-05-01 | Lenovo Enterprise Solutions (Singapore) PTE., LTD. | Visual summarization of video for quick understanding by determining emotion objects for semantic segments of video |
US11418845B2 (en) | 2013-01-31 | 2022-08-16 | Paramount Pictures Corporation | System and method for interactive remote movie watching, scheduling, and social connection |
US8990303B2 (en) | 2013-01-31 | 2015-03-24 | Paramount Pictures Corporation | System and method for interactive remote movie watching, scheduling, and social connection |
US11818417B1 (en) | 2013-01-31 | 2023-11-14 | Paramount Pictures Corporation | Computing network for synchronized streaming of audiovisual content |
WO2014120803A1 (en) * | 2013-01-31 | 2014-08-07 | Paramount Pictures Corporation | System and method for interactive remote movie watching, scheduling, and social connection |
US9674239B2 (en) | 2013-01-31 | 2017-06-06 | Paramount Pictures Corporation | System and method for interactive remote movie watching, scheduling, and social connection |
CN105051778A (en) * | 2013-01-31 | 2015-11-11 | 派拉蒙电影公司 | System and method for interactive remote movie watching, scheduling, and social connection |
WO2015073368A1 (en) | 2013-11-12 | 2015-05-21 | Highland Instruments, Inc. | Analysis suite |
US10856035B2 (en) * | 2015-06-16 | 2020-12-01 | Tencent Technology (Shenzhen) Company Limited | Message sharing method, client, and computer storage medium |
US20170289608A1 (en) * | 2015-06-16 | 2017-10-05 | Tencent Technology (Shenzhen) Company Limited | Message sharing method, client, and computer storage medium |
US11582269B2 (en) * | 2016-01-19 | 2023-02-14 | Nadejda Sarmova | Systems and methods for establishing a virtual shared experience for media playback |
US11153355B2 (en) | 2016-07-29 | 2021-10-19 | Smarter Systems, Inc. | Systems and methods for providing individual and/or synchronized virtual tours through a realm for a group of users |
WO2018022977A1 (en) * | 2016-07-29 | 2018-02-01 | Everyscape, Inc. | Systems and methods for providing individual and/or synchronized virtual tours through a realm for a group of users |
US11575722B2 (en) | 2016-07-29 | 2023-02-07 | Smarter Systems, Inc. | Systems and methods for providing individual and/or synchronized virtual tours through a realm for a group of users |
WO2018191720A1 (en) * | 2017-04-14 | 2018-10-18 | Penrose Studios, Inc. | System and method for spatial and immersive computing |
USD844648S1 (en) * | 2017-06-01 | 2019-04-02 | Sony Mobile Communications Inc. | Display screen with graphical user interface |
US20230029382A1 (en) * | 2018-04-13 | 2023-01-26 | Koji Yoden | Services over wireless communication with high flexibility and efficiency |
US11962840B2 (en) * | 2018-04-13 | 2024-04-16 | Koji Yoden | Services over wireless communication with high flexibility and efficiency |
US11971494B2 (en) * | 2018-05-29 | 2024-04-30 | Tencent Technology (Shenzhen) Company Limited | Sound source determining method and apparatus, and storage medium |
US11536796B2 (en) * | 2018-05-29 | 2022-12-27 | Tencent Technology (Shenzhen) Company Limited | Sound source determining method and apparatus, and storage medium |
US11838450B2 (en) | 2020-02-26 | 2023-12-05 | Dish Network L.L.C. | Devices, systems and processes for facilitating watch parties |
US11552812B2 (en) * | 2020-06-19 | 2023-01-10 | Airbnb, Inc. | Outputting emotes based on audience member expressions in large-scale electronic presentation |
US11979245B2 (en) | 2020-06-19 | 2024-05-07 | Airbnb, Inc. | Augmenting audience member emotes in large-scale electronic presentation |
US11646905B2 (en) | 2020-06-19 | 2023-05-09 | Airbnb, Inc. | Aggregating audience member emotes in large-scale electronic presentation |
USD985005S1 (en) | 2020-06-19 | 2023-05-02 | Airbnb, Inc. | Display screen of a programmed computer system with graphical user interface |
USD984457S1 (en) | 2020-06-19 | 2023-04-25 | Airbnb, Inc. | Display screen of a programmed computer system with graphical user interface |
US11991013B2 (en) | 2020-06-19 | 2024-05-21 | Airbnb, Inc. | Incorporating individual audience member participation and feedback in large-scale electronic presentation |
US11974006B2 (en) | 2020-09-03 | 2024-04-30 | Dish Network Technologies India Private Limited | Live and recorded content watch parties |
US20220103873A1 (en) * | 2020-09-28 | 2022-03-31 | Gree, Inc. | Computer program, method, and server apparatus |
US20220132200A1 (en) * | 2020-10-13 | 2022-04-28 | Andrew Flessas | Method and system for displaying overlay graphics on television programs in response to viewer input and tracking and analyzing viewer inputs |
US11849177B2 (en) * | 2020-11-11 | 2023-12-19 | Rovi Guides, Inc. | Systems and methods for providing media recommendations |
US11641506B2 (en) | 2020-11-11 | 2023-05-02 | Rovi Guides, Inc. | Systems and methods for providing media recommendations |
US20220150582A1 (en) * | 2020-11-11 | 2022-05-12 | Rovi Guides, Inc. | Systems and methods for providing media recommendations |
US11758245B2 (en) | 2021-07-15 | 2023-09-12 | Dish Network L.L.C. | Interactive media events |
US11849171B2 (en) | 2021-12-07 | 2023-12-19 | Dish Network L.L.C. | Deepfake content watch parties |
US11974005B2 (en) | 2021-12-07 | 2024-04-30 | Dish Network L.L.C. | Cell phone content watch parties |
US20240064355A1 (en) * | 2022-08-19 | 2024-02-22 | Dish Network L.L.C. | User chosen watch parties |
US11973999B2 (en) * | 2022-08-19 | 2024-04-30 | Dish Network L.L.C. | User chosen watch parties |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10368120B2 (en) | Avatar integrated shared media experience | |
CN105430455B (en) | information presentation method and system | |
US20100083324A1 (en) | Synchronized Video Playback Among Multiple Users Across A Network | |
US8893022B2 (en) | Interactive and shared viewing experience | |
US20110244954A1 (en) | Online social media game | |
US20090063995A1 (en) | Real Time Online Interaction Platform | |
US20110225519A1 (en) | Social media platform for simulating a live experience | |
US20110225515A1 (en) | Sharing emotional reactions to social media | |
US20140040783A1 (en) | Virtual social supervenue for sharing multiple video streams | |
US20110239136A1 (en) | Instantiating widgets into a virtual social venue | |
US20110225039A1 (en) | Virtual social venue feeding multiple video streams | |
WO2008157671A1 (en) | Method and apparatus for selecting events to be displayed at virtual venues and social networking | |
US20110225514A1 (en) | Visualizing communications within a social setting | |
US20110225516A1 (en) | Instantiating browser media into a virtual social venue | |
US20110225518A1 (en) | Friends toolbar for a virtual social venue | |
US20110225517A1 (en) | Pointer tools for a virtual social venue | |
WO2011112296A1 (en) | Incorporating media content into a 3d platform | |
KR102669170B1 (en) | Methods, systems, and media for coordinating multiplayer game sessions | |
O’Brien | Digital love: Love through the screen/of the screen |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT CORPORATION, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: MATTINGLY, ANDREW LAWRENCE; KRAMP, BRIAN CHARLES; SOEMO, THOMAS M.; AND OTHERS; REEL/FRAME: 023198/0925
Effective date: 20090831
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: MICROSOFT CORPORATION; REEL/FRAME: 034564/0001
Effective date: 20141014