CN117378206A - User interface and related connection method for shared playback of content items - Google Patents

Info

Publication number
CN117378206A
CN117378206A
Authority
CN
China
Prior art keywords
electronic device
content item
playback
content
communication session
Prior art date
Legal status
Pending
Application number
CN202280035200.3A
Other languages
Chinese (zh)
Inventor
A·元
T·S·里斯
D·R·多姆
G·V·本维尼斯特
C·L·托梅兹科
张宰祐
金秀姸
V·K·哈灵顿
L·E·塔帕纳
L·莫雷诺鲁福
Current Assignee
Apple Inc
Original Assignee
Apple Inc
Application filed by Apple Inc.
Priority claimed from PCT/US2022/072331 (published as WO2022246377A1)
Publication of CN117378206A

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

In some implementations, an electronic device plays a content item in synchronization with playback of the content item at one or more other electronic devices of other users. For example, the electronic device and the one or more other electronic devices are engaged in a communication session that includes synchronized playback of a content item, including synchronized modification of that playback based on an input, received at any one of the electronic devices, that modifies playback of the content item.

Description

User interface and related connection method for shared playback of content items
Cross Reference to Related Applications
This patent application claims the benefit of U.S. provisional application No. 63/189,106, filed May 15, 2021, and U.S. provisional application No. 63/197,493, filed June 6, 2021, the contents of both of which are incorporated herein by reference in their entirety for all purposes.
Technical Field
The present description relates generally to electronic devices that perform shared playback of content items, and user interactions with such devices.
Background
In recent years, user interaction with electronic devices has been significantly enhanced. These devices may be devices such as computers, tablets, televisions, multimedia devices, mobile devices, smartwatches, etc. In some cases, a user may wish to use an electronic device to play a content item in a synchronized manner with other users' electronic devices.
Disclosure of Invention
Some embodiments described in this disclosure relate to ways of playing a content item on an electronic device in synchronization with playback of the content item at other electronic devices of other users. Enhancing a user's interactions with an electronic device while performing these operations improves the user's experience with the device and decreases user interaction time, which is particularly important where input devices are battery-operated.
It is well known that the use of personally identifiable information should follow privacy policies and practices that are recognized as meeting or exceeding industry or government requirements for maintaining user privacy. In particular, personally identifiable information data should be managed and processed to minimize the risk of inadvertent or unauthorized access or use, and the nature of authorized use should be specified to the user.
Drawings
For a better understanding of the various described embodiments, reference should be made to the following detailed description taken in conjunction with the accompanying drawings in which like reference numerals designate corresponding parts throughout the figures thereof.
Fig. 1A is a block diagram illustrating a portable multifunction device with a touch-sensitive display in accordance with some embodiments.
FIG. 1B is a block diagram illustrating exemplary components for event processing according to some embodiments.
Fig. 2 illustrates a portable multifunction device with a touch screen in accordance with some embodiments.
FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments.
FIG. 4A illustrates an exemplary user interface for an application menu on a portable multifunction device in accordance with some embodiments.
Fig. 4B illustrates an exemplary user interface of a multifunction device with a touch-sensitive surface separate from a display in accordance with some embodiments.
Fig. 5A illustrates a personal electronic device according to some embodiments.
Fig. 5B is a block diagram illustrating a personal electronic device, according to some embodiments.
Fig. 5C-5D illustrate exemplary components of a personal electronic device having a touch sensitive display and an intensity sensor, according to some embodiments.
Fig. 5E-5H illustrate exemplary components and user interfaces of a personal electronic device according to some embodiments.
Figs. 6A-6CC illustrate exemplary ways in which an electronic device plays a content item in synchronization with playback of the content item at other electronic devices of other users, according to some embodiments.
Fig. 7 is a flow chart illustrating a method of playing a content item at an electronic device in synchronization with playback of the content item at other electronic devices of other users, according to some embodiments.
Detailed Description
Description of the embodiments
The following description sets forth exemplary methods, parameters, and the like. However, it should be recognized that such description is not intended as a limitation on the scope of the present disclosure, but is instead provided as a description of exemplary embodiments.
In some implementations, an exemplary electronic device in a communication session with one or more second electronic devices plays a content item in synchronization with playback of the content item at the one or more second electronic devices. For example, in a shared playback mode of a communication session, the electronic device presents the same content item in a synchronized manner, including synchronizing modifications to playback of the content item in response to one or more user inputs. Such techniques may reduce the cognitive burden on a user using an exemplary electronic device and simplify the process for playing content items at multiple electronic devices in a communication session.
Although the following description uses the terms "first," "second," etc. to describe various elements, these elements should not be limited by the terms. These terms are only used to distinguish one element from another element. For example, a first touch may be named a second touch and similarly a second touch may be named a first touch without departing from the scope of the various described embodiments. Both the first touch and the second touch are touches, but they are not the same touch.
The terminology used in the description of the various illustrated embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and in the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Depending on the context, the term "if" is, optionally, construed to mean "when" or "upon" or "in response to determining" or "in response to detecting." Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" is, optionally, construed to mean "upon determining" or "in response to determining" or "upon detecting [the stated condition or event]" or "in response to detecting [the stated condition or event]," depending on the context.
Embodiments of electronic devices, user interfaces for such devices, and associated processes for using such devices are described herein. In some embodiments, the device is a portable communication device, such as a mobile phone, that also includes other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, but are not limited to, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. (Cupertino, California). Other portable electronic devices, such as laptops or tablet computers with touch-sensitive surfaces (e.g., touch screen displays and/or touchpads), are optionally used. It should also be understood that, in some embodiments, the device is not a portable communication device, but is a desktop computer with a touch-sensitive surface (e.g., a touch screen display and/or a touchpad). In some embodiments, the electronic device is a computer system that is in communication (e.g., via wireless communication, via wired communication) with a display generation component. The display generation component is configured to provide visual output, such as display via a CRT display, display via an LED display, or display via image projection. In some embodiments, the display generation component is integrated with the computer system. In some embodiments, the display generation component is separate from the computer system. As used herein, "displaying" content includes causing display of the content (e.g., video data rendered or decoded by display controller 156) by transmitting, via a wired or wireless connection, data (e.g., image data or video data) to an integrated or external display generation component to visually produce the content.
In the following discussion, an electronic device including a display and a touch-sensitive surface is described. However, it should be understood that the electronic device optionally includes one or more other physical user interface devices, such as a physical keyboard, mouse, and/or joystick.
The device typically supports various applications such as one or more of the following: drawing applications, presentation applications, word processing applications, website creation applications, disk editing applications, spreadsheet applications, gaming applications, telephony applications, video conferencing applications, email applications, instant messaging applications, fitness support applications, photo management applications, digital camera applications, digital video camera applications, web browsing applications, digital music player applications, and/or digital video player applications.
The various applications executing on the device optionally use at least one generic physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the device are optionally adjusted and/or changed for different applications and/or within the respective applications. In this way, the common physical architecture of the devices (such as the touch-sensitive surface) optionally supports various applications with a user interface that is intuitive and transparent to the user.
Attention is now directed toward embodiments of portable devices with touch-sensitive displays. Fig. 1A is a block diagram illustrating portable multifunction device 100 with touch-sensitive display system 112 in accordance with some embodiments. Touch-sensitive display 112 is sometimes called a "touch screen" for convenience and is sometimes known as or called a "touch-sensitive display system." Device 100 includes memory 102 (which optionally includes one or more computer-readable storage media), memory controller 122, one or more processing units (CPUs) 120, peripheral interface 118, RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, input/output (I/O) subsystem 106, other input control devices 116, and external port 124. Device 100 optionally includes one or more optical sensors 164. Device 100 optionally includes one or more contact intensity sensors 165 for detecting the intensity of contacts on device 100 (e.g., on a touch-sensitive surface, such as touch-sensitive display system 112 of device 100). Device 100 optionally includes one or more tactile output generators 167 for generating tactile outputs on device 100 (e.g., generating tactile outputs on a touch-sensitive surface, such as touch-sensitive display system 112 of device 100 or touchpad 355 of device 300). These components optionally communicate over one or more communication buses or signal lines 103.
As used in this specification and the claims, the term "intensity" of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of the contact on the touch-sensitive surface (e.g., finger contact), or to an alternative to the force or pressure of the contact on the touch-sensitive surface (surrogate). The intensity of the contact has a range of values that includes at least four different values and more typically includes hundreds of different values (e.g., at least 256). The intensity of the contact is optionally determined (or measured) using various methods and various sensors or combinations of sensors. For example, one or more force sensors below or adjacent to the touch-sensitive surface are optionally used to measure forces at different points on the touch-sensitive surface. In some implementations, force measurements from multiple force sensors are combined (e.g., weighted average) to determine an estimated contact force. Similarly, the pressure sensitive tip of the stylus is optionally used to determine the pressure of the stylus on the touch sensitive surface. Alternatively, the size of the contact area and/or its variation detected on the touch-sensitive surface, the capacitance of the touch-sensitive surface and/or its variation in the vicinity of the contact and/or the resistance of the touch-sensitive surface and/or its variation in the vicinity of the contact are optionally used as a substitute for the force or pressure of the contact on the touch-sensitive surface. In some implementations, surrogate measurements of contact force or pressure are directly used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is described in units corresponding to surrogate measurements). 
In some implementations, surrogate measurements of contact force or pressure are converted to an estimated force or pressure, and the estimated force or pressure is used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is a pressure threshold measured in units of pressure). The intensity of the contact is used as an attribute of the user input, allowing the user to access additional device functions that are not otherwise accessible to the user on a smaller sized device of limited real estate for displaying affordances and/or receiving user input (e.g., via a touch-sensitive display, touch-sensitive surface, or physical/mechanical control, such as a knob or button).
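The two preceding paragraphs describe combining per-sensor force readings (e.g., a weighted average) into one estimated contact force and comparing it against an intensity threshold. A minimal sketch of that arithmetic follows; it is an illustration under assumed units and weights, not the device's actual signal-processing pipeline, and the function names are invented for the example.

```python
def estimated_contact_force(readings: list[float], weights: list[float]) -> float:
    """Combine force measurements from multiple sensors into a single
    estimated contact force using a weighted average."""
    total_weight = sum(weights)
    return sum(r * w for r, w in zip(readings, weights)) / total_weight

def exceeds_intensity_threshold(readings: list[float],
                                weights: list[float],
                                threshold: float) -> bool:
    # The estimated force (a surrogate-derived value) is compared
    # against an intensity threshold to decide whether it is exceeded.
    return estimated_contact_force(readings, weights) > threshold

# Hypothetical readings; sensors nearer the contact get larger weights.
readings = [0.2, 0.9, 0.4]   # arbitrary force units
weights = [0.5, 2.0, 1.0]
print(exceeds_intensity_threshold(readings, weights, 0.5))
```

Here the weighted average is (0.2·0.5 + 0.9·2.0 + 0.4·1.0) / 3.5 ≈ 0.657, which exceeds the 0.5 threshold.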
As used in this specification and in the claims, the term "haptic output" refers to physical displacement of a device relative to a previous position of the device, physical displacement of a component of a device (e.g., a touch-sensitive surface) relative to another component of the device (e.g., the housing), or displacement of a component relative to a center of mass of the device that will be detected by a user with the user's sense of touch. For example, in situations where the device or a component of the device is in contact with a surface of the user that is sensitive to touch (e.g., a finger, palm, or other part of the user's hand), the haptic output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in a physical characteristic of the device or component of the device. For example, movement of a touch-sensitive surface (e.g., a touch-sensitive display or trackpad) is optionally interpreted by the user as a "down click" or "up click" of a physical actuator button. In some cases, the user will feel a tactile sensation such as a "down click" or "up click" even when the physical actuator button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movements does not move. As another example, movement of the touch-sensitive surface is optionally interpreted or sensed by the user as "roughness" of the touch-sensitive surface, even when there is no change in the smoothness of the touch-sensitive surface. While such interpretations of touch by a user will be subject to the user's individualized sensory perceptions, many sensory perceptions of touch are common to a large majority of users.
Thus, when a haptic output is described as corresponding to a particular sensory perception of a user (e.g., a "down click," an "up click," or "roughness"), unless otherwise stated, the generated haptic output corresponds to physical displacement of the device or a component thereof that will generate the described sensory perception for a typical (or average) user.
It should be understood that the device 100 is merely one example of a portable multifunction device, and that the device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown in fig. 1A are implemented in hardware, software, or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.
Memory 102 optionally includes high-speed random access memory, and also optionally includes non-volatile memory, such as one or more disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Memory controller 122 optionally controls access to memory 102 by other components of device 100.
Peripheral interface 118 may be used to couple input and output peripherals of the device to CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in the memory 102 to perform various functions of the device 100 and process data. In some embodiments, peripheral interface 118, CPU 120, and memory controller 122 are optionally implemented on a single chip, such as chip 104. In some other embodiments, they are optionally implemented on separate chips.
RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a codec chipset, a subscriber identity module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates via wireless communication with networks, such as the Internet (also referred to as the World Wide Web (WWW)), an intranet, and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN), and/or a metropolitan area network (MAN), and with other devices. RF circuitry 108 optionally includes well-known circuitry for detecting near field communication (NFC) fields, such as by a short-range communication radio.
Wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), Evolution, Data-Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSPDA), long term evolution (LTE), near field communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth Low Energy (BTLE), Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, IEEE 802.11n, and/or IEEE 802.11ac), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between the user and device 100. Audio circuitry 110 receives audio data from peripheral interface 118, converts the audio data to electrical signals, and transmits the electrical signals to speaker 111. The speaker 111 converts electrical signals into sound waves that are audible to humans. The audio circuit 110 also receives electrical signals converted from sound waves by the microphone 113. The audio circuitry 110 converts the electrical signals into audio data and transmits the audio data to the peripheral interface 118 for processing. The audio data is optionally retrieved from and/or transmitted to the memory 102 and/or the RF circuitry 108 by the peripheral interface 118. In some embodiments, the audio circuit 110 also includes a headset jack (e.g., 212 in fig. 2). The headset jack provides an interface between the audio circuit 110 and removable audio input/output peripherals such as output-only headphones or a headset having both an output (e.g., a monaural or binaural) and an input (e.g., a microphone).
I/O subsystem 106 couples input/output peripheral devices on device 100, such as touch screen 112 and other input control devices 116, to peripheral interface 118. The I/O subsystem 106 optionally includes a display controller 156, an optical sensor controller 158, an intensity sensor controller 159, a haptic feedback controller 161, and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive electrical signals from/transmit electrical signals to other input control devices 116. The other input control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click-type dials, and the like. In some implementations, the input controller 160 is optionally coupled to (or not coupled to) any of the following: a keyboard, an infrared port, a USB port, and a pointing device such as a mouse. One or more buttons (e.g., 208 in fig. 2) optionally include an up/down button for volume control of speaker 111 and/or microphone 113. The one or more buttons optionally include a push button (e.g., 206 in fig. 2). In some embodiments, the electronic device is a computer system that communicates (e.g., via wireless communication, via wired communication) with one or more input devices. In some implementations, the one or more input devices include a touch-sensitive surface (e.g., a touch pad as part of a touch-sensitive display). In some embodiments, the one or more input devices include one or more camera sensors (e.g., one or more optical sensors 164 and/or one or more depth camera sensors 175), such as for tracking gestures (e.g., hand gestures) of a user as input. In some embodiments, one or more input devices are integrated with the computer system. In some embodiments, one or more input devices are separate from the computer system.
A quick press of the push button optionally disengages a lock of touch screen 112 or optionally begins a process that uses gestures on the touch screen to unlock the device, as described in U.S. patent application Ser. No. 11/322,549 (i.e., U.S. Pat. No. 7,657,849), "Unlocking a Device by Performing Gestures on an Unlock Image," filed December 23, 2005, which is hereby incorporated by reference in its entirety. A longer press of the push button (e.g., 206) optionally turns power to device 100 on or off. The functionality of one or more of the buttons is optionally user-customizable. Touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards.
The touch sensitive display 112 provides an input interface and an output interface between the device and the user. The display controller 156 receives electrical signals from and/or transmits electrical signals to the touch screen 112. Touch screen 112 displays visual output to a user. Visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively, "graphics"). In some embodiments, some or all of the visual output optionally corresponds to a user interface object.
Touch screen 112 has a touch-sensitive surface, sensor, or set of sensors that receives input from a user based on haptic and/or tactile contact. Touch screen 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or interruption of the contact) on touch screen 112 and translate the detected contact into interactions with user interface objects (e.g., one or more soft keys, icons, web pages, or images) displayed on touch screen 112. In an exemplary embodiment, the point of contact between touch screen 112 and the user corresponds to a user's finger.
Touch screen 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. Touch screen 112 and display controller 156 optionally detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 112. In an exemplary embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone® and iPod Touch® from Apple Inc. (Cupertino, California).
The touch sensitive display in some implementations of touch screen 112 is optionally similar to the multi-touch sensitive touch pad described in the following U.S. patents: 6,323,846 (Westerman et al), 6,570,557 (Westerman et al) and/or 6,677,932 (Westerman et al) and/or U.S. patent publication 2002/0015024A1, each of which is hereby incorporated by reference in its entirety. However, touch screen 112 displays visual output from device 100, while touch sensitive touchpads do not provide visual output.
Touch-sensitive displays in some embodiments of touch screen 112 are described in the following applications: (1) U.S. patent application Ser. No. 11/381,313, "Multipoint Touch Surface Controller," filed May 2, 2006; (2) U.S. patent application Ser. No. 10/840,862, "Multipoint Touchscreen," filed May 6, 2004; (3) U.S. patent application Ser. No. 10/903,964, "Gestures For Touch Sensitive Input Devices," filed July 30, 2004; (4) U.S. patent application Ser. No. 11/048,264, "Gestures For Touch Sensitive Input Devices," filed January 31, 2005; (5) U.S. patent application Ser. No. 11/038,590, "Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices," filed January 18, 2005; (6) U.S. patent application Ser. No. 11/228,758, "Virtual Input Device Placement On A Touch Screen User Interface," filed September 16, 2005; (7) U.S. patent application Ser. No. 11/228,700, "Operation Of A Computer With A Touch Screen Interface," filed September 16, 2005; (8) U.S. patent application Ser. No. 11/228,737, "Activating Virtual Keys Of A Touch-Screen Virtual Keyboard," filed September 16, 2005; and (9) U.S. patent application Ser. No. 11/367,749, "Multi-Functional Hand-Held Device," filed in March 2006. All of these applications are incorporated by reference herein in their entirety.
Touch screen 112 optionally has a video resolution in excess of 100 dpi. In some implementations, the touch screen has a video resolution of about 160 dpi. The user optionally uses any suitable object or appendage, such as a stylus, finger, or the like, to make contact with touch screen 112. In some embodiments, the user interface is designed to work primarily through finger-based contact and gestures, which may not be as accurate as stylus-based input due to the large contact area of the finger on the touch screen. In some embodiments, the device translates the finger-based coarse input into a precise pointer/cursor position or command for performing the action desired by the user.
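One common way to translate a coarse finger contact into a precise pointer position, as described above, is to take the signal-weighted centroid of the activated sensor cells. The sketch below is a hypothetical illustration, not the device's actual algorithm; the `contact_centroid` name and the cell format `(x, y, signal)` are invented for the example.

```python
def contact_centroid(touched_cells: list[tuple[float, float, float]]) -> tuple[float, float]:
    """Reduce a coarse finger contact, given as (x, y, signal) tuples for
    each activated sensor cell, to a single precise cursor position by
    computing the signal-weighted centroid of the contact patch."""
    total_signal = sum(s for _, _, s in touched_cells)
    cx = sum(x * s for x, _, s in touched_cells) / total_signal
    cy = sum(y * s for _, y, s in touched_cells) / total_signal
    return (cx, cy)

# Hypothetical 3-cell contact patch; the stronger middle signal pulls
# the computed cursor position toward that cell.
cells = [(10, 10, 1.0), (11, 10, 3.0), (10, 11, 1.0)]
print(contact_centroid(cells))
```

The result lies between the cells, biased toward the strongest reading, which is how a fingertip spanning several sensor cells can still drive a single sub-cell-accurate cursor point.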
In some embodiments, the device 100 optionally includes a touch pad (not shown) for activating or deactivating particular functions in addition to the touch screen. In some embodiments, the touch pad is a touch sensitive area of the device that, unlike the touch screen, does not display visual output. The touch pad is optionally a touch sensitive surface separate from the touch screen 112 or an extension of the touch sensitive surface formed by the touch screen.
The apparatus 100 also includes a power system 162 for powering the various components. The power system 162 optionally includes a power management system, one or more power sources (e.g., battery, alternating Current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., light Emitting Diode (LED)), and any other components associated with the generation, management, and distribution of power in the portable device.
The apparatus 100 optionally further comprises one or more optical sensors 164. FIG. 1A shows an optical sensor coupled to an optical sensor controller 158 in the I/O subsystem 106. The optical sensor 164 optionally includes a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The optical sensor 164 receives light projected through one or more lenses from the environment and converts the light into data representing an image. In conjunction with imaging module 143 (also called a camera module), optical sensor 164 optionally captures still images or video. In some embodiments, the optical sensor is located on the rear of the device 100, opposite the touch screen display 112 on the front of the device, so that the touch screen display can be used as a viewfinder for still image and/or video image acquisition. In some embodiments, the optical sensor is located on the front of the device such that the user's image is optionally acquired for video conferencing while viewing other video conference participants on the touch screen display. In some implementations, the position of the optical sensor 164 may be changed by the user (e.g., by rotating a lens and sensor in the device housing) such that a single optical sensor 164 is used with the touch screen display for both video conferencing and still image and/or video image acquisition.
The device 100 optionally further includes one or more contact intensity sensors 165. FIG. 1A shows a contact intensity sensor coupled to an intensity sensor controller 159 in the I/O subsystem 106. The contact intensity sensor 165 optionally includes one or more piezoresistive strain gauges, capacitive force sensors, electric force sensors, piezoelectric force sensors, optical force sensors, capacitive touch-sensitive surfaces, or other intensity sensors (e.g., sensors used to measure the force (or pressure) of a contact on a touch-sensitive surface). The contact intensity sensor 165 receives contact intensity information (e.g., pressure information or a surrogate for pressure information) from the environment. In some implementations, at least one contact intensity sensor is collocated with or adjacent to a touch-sensitive surface (e.g., touch-sensitive display system 112). In some embodiments, at least one contact intensity sensor is located on the rear of the device 100, opposite the touch screen display 112 located on the front of the device 100.
The device 100 optionally further includes one or more proximity sensors 166. Fig. 1A shows a proximity sensor 166 coupled to the peripheral interface 118. Alternatively, the proximity sensor 166 is optionally coupled to the input controller 160 in the I/O subsystem 106. The proximity sensor 166 optionally performs as described in the following U.S. patent application nos.: 11/241,839, entitled "Proximity Detector In Handheld Device"; 11/240,788, entitled "Proximity Detector In Handheld Device"; 11/620,702, entitled "Using Ambient Light Sensor To Augment Proximity Sensor Output"; 11/586,862, entitled "Automated Response To And Sensing Of User Activity In Portable Devices"; and 11/638,251, entitled "Methods And Systems For Automatic Configuration Of Peripherals," which are hereby incorporated by reference in their entirety. In some embodiments, the proximity sensor is turned off and the touch screen 112 is disabled when the multifunction device is placed near the user's ear (e.g., when the user is making a telephone call).
The device 100 optionally further includes one or more tactile output generators 167. FIG. 1A shows a tactile output generator coupled to a haptic feedback controller 161 in the I/O subsystem 106. The tactile output generator 167 optionally includes one or more electroacoustic devices such as speakers or other audio components, and/or electromechanical devices that convert energy into linear motion such as motors, solenoids, electroactive polymers, piezoelectric actuators, electrostatic actuators, or other tactile output generating components (e.g., components that convert electrical signals into tactile outputs on the device). The tactile output generator 167 receives haptic feedback generation instructions from the haptic feedback module 133 and generates tactile outputs on the device 100 that can be perceived by a user of the device 100. In some embodiments, at least one tactile output generator is collocated with or adjacent to a touch-sensitive surface (e.g., touch-sensitive display system 112), and optionally generates a tactile output by moving the touch-sensitive surface vertically (e.g., in/out of the surface of device 100) or laterally (e.g., back and forth in the same plane as the surface of device 100). In some embodiments, at least one tactile output generator sensor is located on the rear of the device 100, opposite the touch screen display 112 located on the front of the device 100.
The device 100 optionally further includes one or more accelerometers 168. Fig. 1A shows accelerometer 168 coupled to peripheral interface 118. Alternatively, accelerometer 168 is optionally coupled to input controller 160 in I/O subsystem 106. Accelerometer 168 optionally performs as described in the following U.S. patent publication nos.: 20050190059, entitled "Acceleration-based Theft Detection System for Portable Electronic Devices," and 20060017692, entitled "Methods And Apparatuses For Operating A Portable Device Based On An Accelerometer," both of which are incorporated herein by reference in their entirety. In some implementations, information is displayed in a portrait view or a landscape view on the touch screen display based on analysis of data received from the one or more accelerometers. The device 100 optionally includes a magnetometer (not shown) and a GPS (or GLONASS or other global navigation system) receiver (not shown) in addition to the accelerometer 168 for obtaining information about the position and orientation (e.g., portrait or landscape) of the device 100.
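As a loose illustration of the portrait/landscape decision described above, the sketch below (all names hypothetical; not the device's actual implementation) selects a display orientation by comparing the gravity components reported along the device's x and y axes:

```python
# Hypothetical sketch: choosing a portrait vs. landscape view from
# accelerometer data, by comparing gravity along the device's x and y axes.
def orientation(ax: float, ay: float) -> str:
    """ax, ay: gravity components (in g) along the device's x and y axes."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

# Device held upright: gravity mostly along the y axis.
print(orientation(0.05, -0.99))  # portrait
# Device turned on its side: gravity mostly along the x axis.
print(orientation(0.98, 0.05))   # landscape
```

A real implementation would also filter out transient accelerations (e.g., shaking) before switching views.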
In some embodiments, the software components stored in memory 102 include an operating system 126, a communication module (or instruction set) 128, a contact/motion module (or instruction set) 130, a graphics module (or instruction set) 132, a text input module (or instruction set) 134, a Global Positioning System (GPS) module (or instruction set) 135, and an application program (or instruction set) 136. Furthermore, in some embodiments, memory 102 (fig. 1A) or 370 (fig. 3) stores device/global internal state 157, as shown in fig. 1A and 3. The device/global internal state 157 includes one or more of the following: an active application state indicating which applications (if any) are currently active; display status, indicating what applications, views, or other information occupy various areas of the touch screen display 112; sensor status, including information obtained from the various sensors of the device and the input control device 116; and location information relating to the device location and/or pose.
Operating system 126 (e.g., darwin, RTXC, LINUX, UNIX, OS X, iOS, WINDOWS, or embedded operating systems such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.), and facilitates communication between the various hardware components and software components.
The communication module 128 facilitates communication with other devices through one or more external ports 124 and also includes various software components for processing data received by the RF circuitry 108 and/or the external ports 124. External port 124 (e.g., Universal Serial Bus (USB), FireWire, etc.) is adapted to be coupled directly to other devices or indirectly via a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, similar to, and/or compatible with the 30-pin connector used on iPod (trademark of Apple Inc.) devices.
The contact/motion module 130 optionally detects contact with the touch screen 112 (in conjunction with the display controller 156) and other touch-sensitive devices (e.g., a touchpad or physical click wheel). The contact/motion module 130 includes various software components for performing various operations related to contact detection, such as determining whether a contact has occurred (e.g., detecting a finger-down event), determining the intensity of the contact (e.g., the force or pressure of the contact, or a substitute for the force or pressure of the contact), determining whether there is movement of the contact and tracking movement across the touch-sensitive surface (e.g., detecting one or more finger-drag events), and determining whether the contact has ceased (e.g., detecting a finger-up event or a break in contact). The contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or acceleration (a change in magnitude and/or direction) of the point of contact. These operations are optionally applied to single contacts (e.g., one-finger contacts) or to multiple simultaneous contacts (e.g., "multi-touch"/multiple-finger contacts). In some embodiments, the contact/motion module 130 and the display controller 156 detect contact on the touchpad.
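To make the speed/velocity distinction concrete, here is a minimal sketch (hypothetical names; not the module's actual code) that derives a velocity vector and a scalar speed from two timestamped contact samples:

```python
# Hypothetical sketch: deriving velocity (magnitude and direction) and
# speed (magnitude only) from a series of timestamped contact samples.
from dataclasses import dataclass
import math

@dataclass
class ContactSample:
    x: float
    y: float
    t: float  # timestamp in seconds

def velocity(a: ContactSample, b: ContactSample) -> tuple[float, float]:
    """Velocity vector of the point of contact between two samples."""
    dt = b.t - a.t
    return ((b.x - a.x) / dt, (b.y - a.y) / dt)

def speed(a: ContactSample, b: ContactSample) -> float:
    """Scalar speed (magnitude of the velocity vector)."""
    vx, vy = velocity(a, b)
    return math.hypot(vx, vy)

start = ContactSample(0, 0, 0.0)
end = ContactSample(30, 40, 0.1)
print(velocity(start, end))  # (300.0, 400.0) points/second
print(speed(start, end))     # 500.0
```

Acceleration would follow the same pattern, differencing successive velocity vectors over time.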
In some implementations, the contact/motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by a user (e.g., to determine whether the user has "clicked" on an icon). In some implementations, at least a subset of the intensity thresholds are determined according to software parameters (e.g., the intensity thresholds are not determined by activation thresholds of particular physical actuators and may be adjusted without changing the physical hardware of the device 100). For example, without changing the touchpad or touch screen display hardware, the mouse "click" threshold of the touchpad or touch screen may be set to any of a wide range of predefined thresholds. Additionally, in some implementations, a user of the device is provided with software settings for adjusting one or more intensity thresholds in a set of intensity thresholds (e.g., by adjusting individual intensity thresholds and/or by adjusting multiple intensity thresholds at once with a system-level click on an "intensity" parameter).
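The software-defined threshold described above can be illustrated with a small sketch (all names hypothetical): the "click" threshold is an ordinary parameter that can be adjusted at runtime without any change to the physical hardware:

```python
# Hypothetical sketch: an intensity threshold defined as a software
# parameter, so a "click" can be re-tuned without changing the hardware.
class IntensityClassifier:
    def __init__(self, click_threshold: float = 0.5):
        # Software parameter, not a property of any physical actuator.
        self.click_threshold = click_threshold

    def is_click(self, contact_intensity: float) -> bool:
        return contact_intensity >= self.click_threshold

clf = IntensityClassifier()
print(clf.is_click(0.7))   # True with the default threshold
clf.click_threshold = 0.9  # e.g., the user raises the threshold in settings
print(clf.is_click(0.7))   # False after the adjustment
```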
The contact/motion module 130 optionally detects gesture input by the user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different movements, timings, and/or intensities of the detected contacts). Thus, gestures are optionally detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger press event, and then detecting a finger lift (lift off) event at the same location (or substantially the same location) as the finger press event (e.g., at the location of an icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event, then detecting one or more finger-dragging events, and then detecting a finger-up (lift-off) event.
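The tap-versus-swipe distinction above can be sketched as a simple classifier over a sequence of sub-events (a hypothetical illustration, not the actual contact/motion module):

```python
# Hypothetical sketch: classifying a gesture from its contact pattern.
# A tap is finger-down then finger-up at (nearly) the same location;
# a swipe includes drag events or significant movement in between.
def classify_gesture(events, tolerance=10.0):
    """events: list of (kind, x, y) tuples, kind in {"down", "drag", "up"}."""
    if not events or events[0][0] != "down" or events[-1][0] != "up":
        return "unknown"
    has_drag = any(kind == "drag" for kind, _, _ in events)
    _, x0, y0 = events[0]
    _, x1, y1 = events[-1]
    moved = abs(x1 - x0) > tolerance or abs(y1 - y0) > tolerance
    return "swipe" if has_drag or moved else "tap"

print(classify_gesture([("down", 5, 5), ("up", 6, 5)]))                     # tap
print(classify_gesture([("down", 5, 5), ("drag", 60, 5), ("up", 120, 5)]))  # swipe
```

A real recognizer would also consult timing (e.g., to separate taps from long presses), which is omitted here.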
Graphics module 132 includes various known software components for rendering and displaying graphics on touch screen 112 or other displays, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast, or other visual attribute) of the displayed graphics. As used herein, the term "graphic" includes any object that may be displayed to a user, including but not limited to text, web pages, icons (such as user interface objects including soft keys), digital images, video, animation, and the like.
In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is optionally assigned a corresponding code. The graphics module 132 receives, from applications or the like, one or more codes specifying the graphics to be displayed, along with coordinate data and other graphic property data if necessary, and then generates screen image data to output to the display controller 156.
Haptic feedback module 133 includes various software components for generating instructions used by haptic output generator 167 to generate haptic output at one or more locations on device 100 in response to user interaction with device 100.
Text input module 134, which is optionally a component of graphics module 132, provides a soft keyboard for entering text in various applications (e.g., contacts 137, email 140, IM 141, browser 147, and any other application requiring text input).
The GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to the phone 138 for use in location-based dialing, to the camera 143 as picture/video metadata, and to applications that provide location-based services, such as weather gadgets, local yellow pages gadgets, and map/navigation gadgets).
The application 136 optionally includes the following modules (or sets of instructions) or a subset or superset thereof:
contact module 137 (sometimes referred to as an address book or contact list);
a telephone module 138;
video conferencing module 139;
email client module 140;
an Instant Messaging (IM) module 141;
a fitness support module 142;
a camera module 143 for still and/or video images;
an image management module 144;
a video player module;
a music player module;
browser module 147;
Calendar module 148;
a gadget module 149, optionally comprising one or more of: weather gadget 149-1, stock gadget 149-2, calculator gadget 149-3, alarm gadget 149-4, dictionary gadget 149-5, other gadgets obtained by the user, and user-created gadget 149-6;
a gadget creator module 150 for forming a user-created gadget 149-6;
search module 151;
a video and music player module 152 that incorporates the video player module and the music player module;
a note module 153;
map module 154; and/or
An online video module 155.
Examples of other applications 136 optionally stored in memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, contacts module 137 is optionally used to manage an address book or contact list (e.g., in application internal state 192 of contacts module 137 stored in memory 102 or memory 370), including: adding one or more names to the address book; deleting names from the address book; associating a telephone number, email address, physical address, or other information with a name; associating an image with a name; sorting and ordering names; providing a telephone number or email address to initiate and/or facilitate communications through telephone 138, video conferencing module 139, email 140, or IM 141; and so forth.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, telephone module 138 is optionally used to input a sequence of characters corresponding to a telephone number, access one or more telephone numbers in contact module 137, modify the entered telephone number, dial the corresponding telephone number, conduct a conversation, and disconnect or hang up when the conversation is completed. As described above, wireless communication optionally uses any of a variety of communication standards, protocols, and technologies.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact/motion module 130, graphics module 132, text input module 134, contacts module 137, and telephony module 138, videoconferencing module 139 includes executable instructions to initiate, conduct, and terminate a videoconference between a user and one or more other participants according to user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, email client module 140 includes executable instructions for creating, sending, receiving, and managing emails in response to user instructions. In conjunction with the image management module 144, the email client module 140 makes it very easy to create and send emails with still or video images captured by the camera module 143.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, instant message module 141 includes executable instructions for: inputting a character sequence corresponding to an instant message, modifying previously inputted characters, transmitting a corresponding instant message (e.g., using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for phone-based instant messages or using XMPP, SIMPLE, or IMPS for internet-based instant messages), receiving an instant message, and viewing the received instant message. In some embodiments, the transmitted and/or received instant message optionally includes graphics, photographs, audio files, video files, and/or other attachments supported in an MMS and/or Enhanced Messaging Service (EMS). As used herein, "instant message" refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and music player module, workout support module 142 includes executable instructions for creating a workout (e.g., with time, distance, and/or calorie burn targets); communicate with a fitness sensor (exercise device); receiving fitness sensor data; calibrating a sensor for monitoring fitness; selecting and playing music for exercise; and displaying, storing and transmitting the fitness data.
In conjunction with touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact/motion module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions for: capturing still images or videos (including video streams) and storing them in the memory 102, modifying features of still images or videos, or deleting still images or videos from the memory 102.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions for arranging, modifying (e.g., editing), or otherwise manipulating, tagging, deleting, presenting (e.g., in a digital slide or album), and storing still and/or video images.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions for browsing the internet according to user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, email client module 140, and browser module 147, calendar module 148 includes executable instructions for creating, displaying, modifying, and storing calendars and data associated with calendars (e.g., calendar entries, to-do items, etc.) according to user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, gadget module 149 provides mini-applications that are optionally downloaded and used by a user (e.g., weather gadget 149-1, stock gadget 149-2, calculator gadget 149-3, alarm gadget 149-4, and dictionary gadget 149-5) or created by a user (e.g., user-created gadget 149-6). In some embodiments, gadgets include HTML (Hypertext Markup Language) files, CSS (Cascading Style Sheets) files, and JavaScript files. In some embodiments, gadgets include XML (Extensible Markup Language) files and JavaScript files (e.g., Yahoo! Widgets).
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, gadget creator module 150 is optionally used by a user to create gadgets (e.g., to transform user-specified portions of a web page into gadgets).
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions for searching memory 102 for text, music, sound, images, video, and/or other files that match one or more search criteria (e.g., one or more user-specified search terms) according to user instructions.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuit 110, speaker 111, RF circuit 108, and browser module 147, video and music player module 152 includes executable instructions that allow a user to download and playback recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, as well as executable instructions for displaying, rendering, or otherwise playing back video (e.g., on touch screen 112 or on an external display connected via external port 124). In some embodiments, the device 100 optionally includes the functionality of an MP3 player such as an iPod (trademark of Apple inc.).
In conjunction with the touch screen 112, the display controller 156, the contact/movement module 130, the graphics module 132, and the text input module 134, the notes module 153 includes executable instructions for creating and managing notes, backlog, and the like according to user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 is optionally configured to receive, display, modify, and store maps and data associated with maps (e.g., driving directions, data related to shops and other points of interest at or near a particular location, and other location-based data) according to user instructions.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuit 110, speaker 111, RF circuit 108, text input module 134, email client module 140, and browser module 147, online video module 155 includes instructions that allow a user to access, browse, receive (e.g., by streaming and/or downloading), play back (e.g., on the touch screen or on an external display connected via external port 124), send an email with a link to a particular online video, and otherwise manage online videos in one or more file formats such as H.264. In some embodiments, the instant messaging module 141, rather than the email client module 140, is used to send a link to a particular online video. Additional description of online video applications can be found in U.S. Provisional Patent Application No. 60/936,562, entitled "Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos," filed June 20, 2007, and U.S. Patent Application No. 11/968,067, entitled "Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos," filed December 31, 2007, the contents of both of which are hereby incorporated by reference in their entirety.
Each of the modules and applications described above corresponds to a set of executable instructions for performing one or more of the functions described above, as well as the methods described in this patent application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (e.g., sets of instructions) need not be implemented in separate software programs, procedures or modules, and thus various subsets of these modules are optionally combined or otherwise rearranged in various embodiments. For example, the video player module is optionally combined with the music player module into a single module (e.g., video and music player module 152 in fig. 1A). In some embodiments, memory 102 optionally stores a subset of the modules and data structures described above. Further, memory 102 optionally stores additional modules and data structures not described above.
In some embodiments, device 100 is a device in which the operation of a predefined set of functions on the device is performed exclusively through a touch screen and/or touch pad. By using a touch screen and/or a touch pad as the primary input control device for operating the device 100, the number of physical input control devices (e.g., push buttons, dials, etc.) on the device 100 is optionally reduced.
A predefined set of functions performed solely through the touch screen and/or touchpad optionally includes navigation between user interfaces. In some embodiments, the touchpad, when touched by a user, navigates the device 100 from any user interface displayed on the device 100 to a main menu, home screen, or root menu. In such implementations, the touchpad is used to implement a "menu button." In some other embodiments, the menu button is a physical push button or other physical input control device, rather than a touchpad.
FIG. 1B is a block diagram illustrating exemplary components for event processing according to some embodiments. In some embodiments, memory 102 (FIG. 1A) or memory 370 (FIG. 3) includes event sorter 170 (e.g., in operating system 126) and corresponding applications 136-1 (e.g., any of the aforementioned applications 137-151, 155, 380-390).
The event classifier 170 receives event information and determines the application 136-1 to which the event information is to be delivered, and the application view 191 of the application 136-1. The event classifier 170 includes an event monitor 171 and an event dispatcher module 174. In some embodiments, the application 136-1 includes an application internal state 192 that indicates one or more current application views that are displayed on the touch-sensitive display 112 when the application is active or executing. In some embodiments, the device/global internal state 157 is used by the event classifier 170 to determine which application(s) are currently active, and the application internal state 192 is used by the event classifier 170 to determine the application view 191 to which to deliver event information.
In some implementations, the application internal state 192 includes additional information, such as one or more of the following: restoration information to be used when the application 136-1 resumes execution, user interface state information indicating that the information is being displayed or ready for display by the application 136-1, a state queue for enabling the user to return to a previous state or view of the application 136-1, and a repeat/undo queue of previous actions taken by the user.
Event monitor 171 receives event information from peripheral interface 118. The event information includes information about sub-events (e.g., user touches on the touch sensitive display 112 as part of a multi-touch gesture). The peripheral interface 118 transmits information it receives from the I/O subsystem 106 or sensors, such as a proximity sensor 166, one or more accelerometers 168, and/or microphone 113 (via audio circuitry 110). The information received by the peripheral interface 118 from the I/O subsystem 106 includes information from the touch-sensitive display 112 or touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to peripheral interface 118 at predetermined intervals. In response, the peripheral interface 118 transmits event information. In other embodiments, the peripheral interface 118 transmits event information only if there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or receiving an input exceeding a predetermined duration).
In some implementations, the event classifier 170 also includes a hit view determination module 172 and/or an active event identifier determination module 173.
When the touch sensitive display 112 displays more than one view, the hit view determination module 172 provides a software process for determining where within one or more views a sub-event has occurred. The view is made up of controls and other elements that the user can see on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes referred to herein as application views or user interface windows, in which information is displayed and touch-based gestures occur. The application view (of the respective application) in which the touch is detected optionally corresponds to a level of programming within the application's programming or view hierarchy. For example, the lowest horizontal view in which a touch is detected is optionally referred to as a hit view, and the set of events that are recognized as correct inputs is optionally determined based at least in part on the hit view of the initial touch that begins a touch-based gesture.
Hit view determination module 172 receives information related to sub-events of the touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies the hit view as the lowest view in the hierarchy that should process sub-events. In most cases, the hit view is the lowest level view in which the initiating sub-event (e.g., the first sub-event in a sequence of sub-events that form an event or potential event) occurs. Once the hit view is identified by the hit view determination module 172, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as a hit view.
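The hit-view search described above can be sketched as a recursive walk of a view hierarchy that returns the lowest (deepest) view whose bounds contain the touch point (a hypothetical illustration; names and frame conventions are assumptions):

```python
# Hypothetical sketch of hit-view determination: find the lowest view in
# the hierarchy whose bounds contain the location of the initial sub-event.
class View:
    def __init__(self, name, frame, subviews=()):
        self.name = name
        self.frame = frame  # (x, y, width, height)
        self.subviews = list(subviews)

    def contains(self, px, py):
        x, y, w, h = self.frame
        return x <= px < x + w and y <= py < y + h

def hit_view(view, px, py):
    if not view.contains(px, py):
        return None
    for sub in view.subviews:  # front-to-back order assumed
        hit = hit_view(sub, px, py)
        if hit is not None:
            return hit  # a deeper view contains the point
    return view  # no subview contains it, so this view is the hit view

button = View("button", (10, 10, 50, 20))
panel = View("panel", (0, 0, 200, 100), [button])
root = View("root", (0, 0, 400, 300), [panel])
print(hit_view(root, 20, 15).name)    # "button": the lowest containing view
print(hit_view(root, 300, 200).name)  # "root": only the root contains it
```

Once identified this way, the hit view would then receive all sub-events of the same touch sequence.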
The activity event recognizer determination module 173 determines which view or views within the view hierarchy should receive a particular sequence of sub-events. In some implementations, the active event identifier determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, the activity event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively engaged views, and thus determines that all actively engaged views should receive a particular sequence of sub-events. In other embodiments, even if the touch sub-event is completely localized to an area associated with one particular view, the higher view in the hierarchy will remain the actively engaged view.
The event dispatcher module 174 dispatches event information to an event recognizer (e.g., event recognizer 180). In embodiments that include an active event recognizer determination module 173, the event dispatcher module 174 delivers event information to the event recognizers determined by the active event recognizer determination module 173. In some embodiments, the event dispatcher module 174 stores event information in an event queue that is retrieved by the corresponding event receiver 182.
In some embodiments, the operating system 126 includes an event classifier 170. Alternatively, the application 136-1 includes an event classifier 170. In yet another embodiment, the event classifier 170 is a stand-alone module or part of another module stored in the memory 102, such as the contact/motion module 130.
In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for processing touch events that occur within a respective view of the application's user interface. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of the event recognizers 180 are part of a separate module, such as a user interface toolkit (not shown) or a higher-level object from which the application 136-1 inherits methods and other properties. In some implementations, a respective event handler 190 includes one or more of the following: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 optionally utilizes or invokes data updater 176, object updater 177, or GUI updater 178 to update the application internal state 192. Alternatively, one or more of the application views 191 include one or more respective event handlers 190. Additionally, in some implementations, one or more of the data updater 176, the object updater 177, and the GUI updater 178 are included in a respective application view 191.
A respective event recognizer 180 receives event information (e.g., event data 179) from the event sorter 170 and identifies an event based on the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 also includes at least a subset of metadata 183 and event delivery instructions 188 (which optionally include sub-event delivery instructions).
Event receiver 182 receives event information from event sorter 170. The event information includes information about a sub-event, such as a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as the location of the sub-event. When a sub-event concerns movement of a touch, the event information optionally also includes the speed and direction of the sub-event. In some embodiments, an event includes rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation of the device (also referred to as the device pose).
The event comparator 184 compares the event information with predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), for example, event 1 (187-1), event 2 (187-2), and others. In some implementations, sub-events in an event (187) include, for example, touch begin, touch end, touch move, touch cancel, and multiple touch. In one example, the definition of event 1 (187-1) is a double tap on a displayed object. The double tap, for example, comprises a first touch on the displayed object for a predetermined length of time (touch begin), a first lift-off for a predetermined length of time (touch end), a second touch on the displayed object for a predetermined length of time (touch begin), and a second lift-off for a predetermined length of time (touch end). In another example, the definition of event 2 (187-2) is a drag on a displayed object. The drag, for example, comprises a touch (or contact) on the displayed object for a predetermined length of time, a movement of the touch across the touch-sensitive display 112, and a lift-off of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
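The comparison of a sub-event sequence against a predefined event definition can be illustrated with a short sketch. The sub-event names and the exact-prefix matching rule below are illustrative simplifications, not the patent's actual definitions 186:

```python
# Illustrative event definitions as ordered sub-event sequences
# (hypothetical names; the patent's event 1 / event 2 are analogous).
DOUBLE_TAP = ["touch-begin", "touch-end", "touch-begin", "touch-end"]
DRAG = ["touch-begin", "touch-move", "touch-end"]

def compare(definition, observed):
    """Sketch of an event comparator: classify the observed sub-events as a
    fully recognized event, a still-possible prefix, or a failed match."""
    if observed == definition:
        return "recognized"
    if definition[:len(observed)] == observed:
        return "possible"
    return "failed"
```

For example, a lone touch-begin is still a possible prefix of either definition, while a leading touch-move matches neither.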
In some implementations, the event definitions 187 include definitions of events for respective user interface objects. In some implementations, the event comparator 184 performs a hit test to determine which user interface object is associated with a sub-event. For example, in an application view that displays three user interface objects on touch-sensitive display 112, when a touch is detected on touch-sensitive display 112, event comparator 184 performs a hit test to determine which of the three user interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the results of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object that triggered the hit test.
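The hit-test step can be sketched as a point-in-rectangle search. The view dictionaries, handler names, and rectangle convention here are invented for illustration; the patent's view hierarchy and handler association are only analogous:

```python
def hit_test(views, point):
    """Sketch of hit testing: return the first view whose bounds contain the
    touch point, so its associated handler can be activated. Bounds are
    (x, y, width, height) tuples in display coordinates."""
    px, py = point
    for view in views:
        x, y, w, h = view["bounds"]
        if x <= px < x + w and y <= py < y + h:
            return view
    return None

# Hypothetical application view with two displayed objects.
views = [
    {"name": "button-a", "bounds": (0, 0, 50, 50), "handler": "handler-a"},
    {"name": "button-b", "bounds": (50, 0, 50, 50), "handler": "handler-b"},
]
```

A touch at (60, 10) falls inside button-b, so the comparator would select handler-b for activation.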
In some embodiments, the definition of the respective event (187) further includes a delay action that delays delivery of the event information until it has been determined that the sequence of sub-events does or does not correspond to an event type of the event recognizer.
When a respective event recognizer 180 determines that the series of sub-events does not match any of the events in event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers (if any) that remain active for the hit view continue to track and process sub-events of the ongoing touch-based gesture.
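The state behavior described above — a recognizer that stops matching enters a terminal state and ignores the rest of the gesture — can be sketched as a small state machine. This is a minimal illustrative sketch; the patent's recognizers track more states and metadata:

```python
class GestureRecognizer:
    """Sketch of recognizer state handling: once the sub-event sequence stops
    matching the definition, the recognizer enters a 'failed' state and
    ignores all subsequent sub-events of the gesture."""
    def __init__(self, definition):
        self.definition = definition
        self.seen = []
        self.state = "possible"

    def feed(self, sub_event):
        if self.state in ("failed", "recognized"):
            return self.state  # subsequent sub-events are ignored
        self.seen.append(sub_event)
        if self.seen == self.definition:
            self.state = "recognized"
        elif self.definition[:len(self.seen)] != self.seen:
            self.state = "failed"
        return self.state
```

Two recognizers fed the same sub-events can thus diverge: a tap recognizer fails on a touch-move while a drag recognizer continues tracking it.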
In some embodiments, the respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to the actively engaged event recognizer. In some embodiments, metadata 183 includes configurable attributes, flags, and/or lists that indicate how event recognizers interact or are able to interact with each other. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to different levels in a view or programmatic hierarchy.
In some embodiments, when one or more particular sub-events of an event are recognized, the respective event recognizer 180 activates the event handler 190 associated with the event. In some implementations, the respective event recognizer 180 delivers event information associated with the event to the event handler 190. Activating an event handler 190 is distinct from sending (and deferred sending) sub-events to a respective hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event, and the event handler 190 associated with the flag catches the flag and performs a predefined process.
In some implementations, the event delivery instructions 188 include sub-event delivery instructions that deliver event information about the sub-event without activating the event handler. Instead, the sub-event delivery instructions deliver the event information to an event handler associated with the sub-event sequence or to an actively engaged view. Event handlers associated with the sequence of sub-events or with the actively engaged views receive the event information and perform a predetermined process.
In some embodiments, the data updater 176 creates and updates data used in the application 136-1. For example, the data updater 176 updates a telephone number used in the contact module 137 or stores a video file used in the video player module. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, the object updater 177 creates a new user interface object or updates the location of the user interface object. GUI updater 178 updates the GUI. For example, the GUI updater 178 prepares the display information and sends the display information to the graphics module 132 for display on a touch-sensitive display.
In some embodiments, event handler 190 includes or has access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, the data updater 176, the object updater 177, and the GUI updater 178 are included in a single module of the respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
It should be understood that the foregoing discussion regarding event handling of user touches on touch-sensitive displays also applies to other forms of user inputs for operating multifunction device 100 with input devices, not all of which are initiated on touch screens. For example, mouse movements and mouse button presses, optionally coordinated with single or multiple keyboard presses or holds; contact movements such as taps, drags, scrolls, etc. on touchpads; stylus inputs; movement of the device; verbal instructions; detected eye movements; biometric inputs; and/or any combination thereof are optionally utilized as inputs corresponding to sub-events that define an event to be recognized.
Fig. 2 illustrates a portable multifunction device 100 having a touch screen 112, in accordance with some embodiments. The touch screen optionally displays one or more graphics within a user interface (UI) 200. In this embodiment, as well as others described below, a user can select one or more of the graphics by making a gesture on the graphics, for example, with one or more fingers 202 (not drawn to scale in the figure) or one or more styluses 203 (not drawn to scale in the figure). In some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the gesture optionally includes one or more taps, one or more swipes (from left to right, right to left, upward and/or downward), and/or a rolling of a finger (from right to left, left to right, upward and/or downward) that has made contact with device 100. In some implementations or circumstances, inadvertent contact with a graphic does not select the graphic. For example, when the gesture corresponding to selection is a tap, a swipe gesture that sweeps over an application icon optionally does not select the corresponding application.
The device 100 optionally also includes one or more physical buttons, such as a "home" or menu button 204. As described previously, menu button 204 is optionally used to navigate to any application 136 in a set of applications that are optionally executed on device 100. Alternatively, in some embodiments, the menu button is implemented as a soft key in a GUI displayed on touch screen 112.
In some embodiments, the device 100 includes a touch screen 112, menu button 204, push button 206 for powering the device on/off and for locking the device, one or more volume adjustment buttons 208, a Subscriber Identity Module (SIM) card slot 210, a headset jack 212, and a docking/charging external port 124. Push button 206 is optionally used to turn power to the device on/off by depressing the button and holding it in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing it before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In an alternative embodiment, the device 100 also accepts verbal input through the microphone 113 for activating or deactivating some functions. The device 100 also optionally includes one or more contact intensity sensors 165 for detecting the intensity of contacts on the touch screen 112, and/or one or more tactile output generators 167 for generating tactile outputs for a user of the device 100.
FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments. The device 300 need not be portable. In some embodiments, the device 300 is a laptop computer, a desktop computer, a tablet computer, a multimedia player device, a navigation device, an educational device (such as a child's learning toy), a gaming system, or a control device (e.g., a home controller or an industrial controller). The device 300 typically includes one or more processing units (CPUs) 310, one or more network or other communication interfaces 360, memory 370, and one or more communication buses 320 for interconnecting these components. Communication buses 320 optionally include circuitry (sometimes referred to as a chipset) that interconnects and controls communications between system components. The device 300 includes an input/output (I/O) interface 330 with a display 340, which is typically a touch screen display. The I/O interface 330 also optionally includes a keyboard and/or mouse (or other pointing device) 350 and a touchpad 355, a tactile output generator 357 for generating tactile outputs on the device 300 (e.g., similar to the tactile output generator 167 described above with reference to fig. 1A), and sensors 359 (e.g., optical, acceleration, proximity, touch-sensitive, and/or contact intensity sensors similar to the contact intensity sensor 165 described above with reference to fig. 1A). Memory 370 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and optionally includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 370 optionally includes one or more storage devices located remotely from CPU 310.
In some embodiments, memory 370 stores programs, modules, and data structures, or a subset thereof, similar to those stored in memory 102 of portable multifunction device 100 (fig. 1A). Furthermore, memory 370 optionally stores additional programs, modules, and data structures not present in memory 102 of portable multifunction device 100. For example, memory 370 of device 300 optionally stores drawing module 380, presentation module 382, word processing module 384, website creation module 386, disk editing module 388, and/or spreadsheet module 390, while memory 102 of portable multifunction device 100 (fig. 1A) optionally does not store these modules.
Each of the above elements in fig. 3 is optionally stored in one or more of the previously mentioned memory devices. Each of the above-described modules corresponds to a set of instructions for performing the above-described functions. The above-described modules or programs (e.g., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules are optionally combined or otherwise rearranged in various embodiments. In some embodiments, memory 370 optionally stores a subset of the modules and data structures described above. Further, memory 370 optionally stores additional modules and data structures not described above.
Attention is now directed to embodiments of user interfaces optionally implemented on, for example, portable multifunction device 100.
Fig. 4A illustrates an exemplary user interface of an application menu on the portable multifunction device 100 in accordance with some embodiments. A similar user interface is optionally implemented on device 300. In some embodiments, the user interface 400 includes the following elements, or a subset or superset thereof:
● A signal strength indicator 402 for wireless communications, such as cellular signals and Wi-Fi signals;
● Time 404;
● A bluetooth indicator 405;
● A battery status indicator 406;
● A tray 408 with icons for commonly used applications such as:
○ Icon 416 of phone module 138, labeled "phone," which optionally includes an indicator 414 of the number of missed calls or voicemail messages;
○ Icon 418 of email client module 140, labeled "mail," which optionally includes an indicator 410 of the number of unread emails;
○ Icon 420 of browser module 147, labeled "browser"; and
○ Icon 422 of video and music player module 152 (also referred to as iPod (trademark of Apple Inc.) module 152), labeled "iPod"; and
● Icons of other applications, such as:
○ Icon 424 of IM module 141, labeled "messages";
○ Icon 426 of calendar module 148, labeled "calendar";
○ Icon 428 of image management module 144, labeled "photos";
○ Icon 430 of camera module 143, labeled "camera";
○ Icon 432 of online video module 155, labeled "online video";
○ Icon 434 of stocks widget 149-2, labeled "stocks";
○ Icon 436 of map module 154, labeled "maps";
○ Icon 438 of weather widget 149-1, labeled "weather";
○ Icon 440 of alarm clock widget 149-4, labeled "clock";
○ Icon 442 of fitness support module 142, labeled "fitness support";
○ Icon 444 of notes module 153, labeled "notes"; and
○ Icon 446 of a settings application or module, labeled "settings," which provides access to settings for device 100 and its various applications 136.
It should be noted that the icon labels illustrated in fig. 4A are merely exemplary. For example, the icon 422 of the video and music player module 152 is optionally labeled "music" or "music player." Other labels are optionally used for various application icons. In some embodiments, the label of a respective application icon includes the name of the application corresponding to that icon. In some embodiments, the label of a particular application icon is distinct from the name of the application corresponding to that icon.
Fig. 4B illustrates an exemplary user interface on a device (e.g., device 300 of fig. 3) having a touch-sensitive surface 451 (e.g., tablet or touchpad 355 of fig. 3) separate from a display 450 (e.g., touch screen display 112). The device 300 also optionally includes one or more contact intensity sensors (e.g., one or more of the sensors 359) for detecting the intensity of the contact on the touch-sensitive surface 451 and/or one or more tactile output generators 357 for generating tactile outputs for a user of the device 300.
While some of the examples below will be given with reference to inputs on touch screen display 112 (where the touch sensitive surface and the display are combined), in some embodiments the device detects inputs on a touch sensitive surface separate from the display, as shown in fig. 4B. In some implementations, the touch-sensitive surface (e.g., 451 in fig. 4B) has a primary axis (e.g., 452 in fig. 4B) that corresponds to the primary axis (e.g., 453 in fig. 4B) on the display (e.g., 450). According to these embodiments, the device detects contact (e.g., 460 and 462 in fig. 4B) with the touch-sensitive surface 451 at a location corresponding to a respective location on the display (e.g., 460 corresponds to 468 and 462 corresponds to 470 in fig. 4B). In this way, when the touch-sensitive surface (e.g., 451 in FIG. 4B) is separated from the display (e.g., 450 in FIG. 4B) of the multifunction device, user inputs (e.g., contacts 460 and 462 and movement thereof) detected by the device on the touch-sensitive surface are used by the device to manipulate the user interface on the display. It should be appreciated that similar approaches are optionally used for other user interfaces described herein.
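The correspondence between locations on a separate touch-sensitive surface and locations on the display can be sketched as a per-axis scaling. The function and the coordinate convention below are illustrative assumptions, not the patent's actual mapping:

```python
def map_to_display(touch_pos, surface_size, display_size):
    """Sketch: map a contact location on a separate touch-sensitive surface
    (e.g., 451 in fig. 4B) to the corresponding location on the display
    (e.g., 450) by scaling along each primary axis (452 -> 453)."""
    (tx, ty), (sw, sh), (dw, dh) = touch_pos, surface_size, display_size
    return (tx * dw / sw, ty * dh / sh)
```

For instance, a contact at the midpoint of a 100x50 surface maps to the midpoint of a 200x100 display, so inputs on the surface manipulate the corresponding part of the on-screen user interface.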
Additionally, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures), it should be understood that, in some embodiments, one or more of the finger inputs are replaced with input from another input device (e.g., mouse-based input or stylus input). For example, a swipe gesture is optionally replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is optionally replaced with a mouse click while the cursor is located over the position of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact). Similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are optionally used simultaneously, or a mouse and finger contacts are optionally used simultaneously.
Fig. 5A illustrates an exemplary personal electronic device 500. The device 500 includes a body 502. In some embodiments, device 500 may include some or all of the features described with respect to devices 100 and 300 (e.g., fig. 1A-4B). In some implementations, the device 500 has a touch sensitive display 504, hereinafter referred to as a touch screen 504. In addition to or in lieu of touch screen 504, device 500 has a display and a touch-sensitive surface. As with devices 100 and 300, in some implementations, touch screen 504 (or touch-sensitive surface) optionally includes one or more intensity sensors for detecting the intensity of an applied contact (e.g., touch). One or more intensity sensors of the touch screen 504 (or touch sensitive surface) may provide output data representative of the intensity of the touch. The user interface of the device 500 may respond to touches based on the intensity of the touches, meaning that touches of different intensities may invoke different user interface operations on the device 500.
Exemplary techniques for detecting and processing touch intensity are found, for example, in the following related patent applications: International Patent Application Serial No. PCT/US2013/040061, filed May 8, 2013, entitled "Device, Method, and Graphical User Interface for Displaying User Interface Objects Corresponding to an Application," published as WIPO Patent Publication No. WO/2013/169849; and International Patent Application Serial No. PCT/US2013/069483, filed November 11, 2013, entitled "Device, Method, and Graphical User Interface for Transitioning Between Touch Input to Display Output Relationships," published as WIPO Patent Publication No. WO/2014/105276, each of which is hereby incorporated by reference in its entirety.
In some embodiments, the device 500 has one or more input mechanisms 506 and 508. The input mechanisms 506 and 508, if included, can be physical. Examples of physical input mechanisms include push buttons and rotatable mechanisms. In some embodiments, the device 500 has one or more attachment mechanisms. Such attachment mechanisms, if included, can permit attachment of the device 500 with, for example, hats, eyewear, earrings, necklaces, shirts, jackets, bracelets, watch straps, chains, trousers, belts, shoes, purses, backpacks, and so forth. These attachment mechanisms permit the device 500 to be worn by a user.
Fig. 5B depicts an exemplary personal electronic device 500. In some embodiments, the apparatus 500 may include some or all of the components described with reference to fig. 1A, 1B, and 3. The device 500 has a bus 512 that operatively couples an I/O section 514 with one or more computer processors 516 and memory 518. The I/O portion 514 may be connected to a display 504, which may have a touch sensitive component 522 and optionally an intensity sensor 524 (e.g., a contact intensity sensor). In addition, the I/O portion 514 may be connected to a communication unit 530 for receiving application and operating system data using Wi-Fi, bluetooth, near Field Communication (NFC), cellular, and/or other wireless communication technologies. The device 500 may include input mechanisms 506 and/or 508. For example, the input mechanism 506 is optionally a rotatable input device or a depressible input device and a rotatable input device. In some examples, the input mechanism 508 is optionally a button.
In some examples, the input mechanism 508 is optionally a microphone. Personal electronic device 500 optionally includes various sensors, such as a GPS sensor 532, an accelerometer 534, an orientation sensor 540 (e.g., compass), a gyroscope 536, a motion sensor 538, and/or combinations thereof, all of which are operatively connected to I/O section 514.
The memory 518 of the personal electronic device 500 may include one or more non-transitory computer-readable storage media for storing computer-executable instructions that, when executed by the one or more computer processors 516, for example, may cause the computer processors to perform techniques described below, including process 700 (fig. 7). A computer-readable storage medium may be any medium that can tangibly contain or store computer-executable instructions for use by or in connection with an instruction execution system, apparatus, and device. In some examples, the storage medium is a transitory computer-readable storage medium. In some examples, the storage medium is a non-transitory computer-readable storage medium. The non-transitory computer readable storage medium may include, but is not limited to, magnetic storage devices, optical storage devices, and/or semiconductor storage devices. Examples of such storage devices include magnetic disks, optical disks based on CD, DVD, or blu-ray technology, and persistent solid state memories such as flash memory, solid state drives, etc. The personal electronic device 500 is not limited to the components and configuration of fig. 5B, but may include other components or additional components in a variety of configurations.
Furthermore, in methods described herein in which one or more steps are contingent on one or more conditions having been met, it should be understood that the described method can be repeated in multiple repetitions so that, over the course of the repetitions, all of the conditions on which steps of the method are contingent have been met in different repetitions of the method. For example, if a method requires performing a first step if a condition is satisfied, and a second step if the condition is not satisfied, a person of ordinary skill will appreciate that the claimed steps are repeated until the condition has been both satisfied and not satisfied, in no particular order. Thus, a method described with one or more steps that are contingent on one or more conditions having been met could be rewritten as a method that is repeated until each of the conditions described in the method has been met. This, however, is not required of system or computer-readable-medium claims in which the system or computer-readable medium contains instructions for performing the contingent operations based on the satisfaction of the corresponding one or more conditions, and is thus capable of determining whether the contingency has or has not been satisfied without explicitly repeating the steps of the method until all of the conditions on which steps of the method are contingent have been met. A person of ordinary skill in the art will also understand that, similar to a method with contingent steps, a system or computer-readable storage medium can repeat the steps of a method as many times as needed to ensure that all of the contingent steps have been performed.
As used herein, the term "affordance" refers to a user-interactive graphical user interface object that is optionally displayed on a display screen of device 100, 300, and/or 500 (fig. 1A, 3, and 5A-5B). For example, an image (e.g., an icon), a button, and text (e.g., a hyperlink) optionally each constitute an affordance.
As used herein, the term "focus selector" refers to an input element that indicates the current part of a user interface with which a user is interacting. In some implementations that include a cursor or other location marker, the cursor acts as a "focus selector" so that when an input (e.g., a press input) is detected on a touch-sensitive surface (e.g., touchpad 355 in fig. 3 or touch-sensitive surface 451 in fig. 4B) while the cursor is over a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input. In some implementations that include a touch screen display (e.g., touch-sensitive display system 112 in fig. 1A or touch screen 112 in fig. 4A) enabling direct interaction with user interface elements on the touch screen display, a contact detected on the touch screen acts as a "focus selector" so that when an input (e.g., a press input by the contact) is detected on the touch screen display at the location of a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input. In some implementations, focus is moved from one region of a user interface to another region of the user interface without corresponding movement of a cursor or movement of a contact on the touch screen display (e.g., by using a tab key or arrow keys to move focus from one button to another button); in these implementations, the focus selector moves in accordance with movement of focus between different regions of the user interface.
Regardless of the specific form taken by the focus selector, the focus selector is generally the user interface element (or contact on a touch screen display) that is controlled by the user so as to communicate the user's intended interaction with the user interface (e.g., by indicating, to the device, the element of the user interface with which the user intends to interact). For example, the location of a focus selector (e.g., a cursor, a contact, or a selection box) over a respective button while a press input is detected on the touch-sensitive surface (e.g., a touchpad or touch screen) will indicate that the user intends to activate the respective button (as opposed to other user interface elements shown on a display of the device).
As used in the specification and claims, the term "characteristic intensity" of a contact refers to a characteristic of the contact based on one or more intensities of the contact. In some embodiments, the characteristic intensity is based on multiple intensity samples. The characteristic intensity is optionally based on a predefined number of intensity samples, or a set of intensity samples collected during a predetermined period of time (e.g., 0.05, 0.1, 0.2, 0.5, 1, 2, 5, or 10 seconds) relative to a predefined event (e.g., after detection of the contact, before or after detection of lift-off of the contact, before or after detection of a start of movement of the contact, before or after detection of an end of the contact, and/or before or after detection of a decrease in intensity of the contact). The characteristic intensity of a contact is optionally based on one or more of: a maximum value of the intensities of the contact, a mean value of the intensities of the contact, an average value of the intensities of the contact, a top 10 percentile value of the intensities of the contact, a value at the half maximum of the intensities of the contact, a value at the 90 percent maximum of the intensities of the contact, or the like. In some embodiments, the duration of the contact is used in determining the characteristic intensity (e.g., when the characteristic intensity is an average of the intensity of the contact over time). In some embodiments, the characteristic intensity is compared to a set of one or more intensity thresholds to determine whether an operation has been performed by the user. For example, the set of one or more intensity thresholds optionally includes a first intensity threshold and a second intensity threshold.
In this example, a contact with a characteristic intensity that does not exceed the first threshold results in a first operation, a contact with a characteristic intensity that exceeds the first intensity threshold but does not exceed the second intensity threshold results in a second operation, and a contact with a characteristic intensity that exceeds the second threshold results in a third operation. In some implementations, the comparison between the characteristic intensity and the one or more thresholds is used to determine whether to perform one or more operations (e.g., whether to perform a respective operation or forgo performing the respective operation), rather than to determine whether to perform a first operation or a second operation.
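The two-threshold comparison can be sketched as follows. The numeric threshold values, the mean-based characteristic intensity, and the operation names are illustrative assumptions; the patent leaves all of these open:

```python
LIGHT_PRESS, DEEP_PRESS = 1.0, 4.0  # hypothetical first and second thresholds

def characteristic_intensity(samples):
    """One plausible characteristic intensity: the mean of the intensity
    samples collected during the predefined window."""
    return sum(samples) / len(samples)

def select_operation(intensity):
    """Sketch of the threshold comparison: intensity at or below the first
    threshold -> first operation; above the first but not the second ->
    second operation; above the second -> third operation."""
    if intensity > DEEP_PRESS:
        return "third-operation"
    if intensity > LIGHT_PRESS:
        return "second-operation"
    return "first-operation"
```

A contact averaging 0.6 units thus triggers the first operation, while one averaging 5 units triggers the third.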
FIG. 5C illustrates detecting a plurality of contacts 552A-552E on the touch-sensitive display screen 504 using a plurality of intensity sensors 524A-524D. FIG. 5C also includes an intensity graph showing the current intensity measurements of the intensity sensors 524A-524D relative to intensity units. In this example, the intensity measurements of intensity sensors 524A and 524D are each 9 intensity units, and the intensity measurements of intensity sensors 524B and 524C are each 7 intensity units. In some implementations, the cumulative intensity is the sum of the intensity measurements of the plurality of intensity sensors 524A-524D, which in this example is 32 intensity units. In some embodiments, each contact is assigned a respective intensity that is a portion of the cumulative intensity. FIG. 5D illustrates the assignment of the cumulative intensity to contacts 552A-552E based on their distance from the center of force 554. In this example, each of the contacts 552A, 552B, and 552E is assigned an intensity of 8 intensity units of the cumulative intensity, and each of the contacts 552C and 552D is assigned an intensity of 4 intensity units of the cumulative intensity. More generally, in some implementations, each contact j is assigned a respective intensity Ij that is a portion of the cumulative intensity A in accordance with a predefined mathematical function, Ij = A·(Dj/ΣDi), where Dj is the distance of the respective contact j from the center of force, and ΣDi is the sum of the distances of all the respective contacts (e.g., i = 1 to last) from the center of force. The operations described with reference to figs. 5C-5D may be performed using an electronic device similar or identical to device 100, 300, or 500. In some embodiments, the characteristic intensity of the contact is based on one or more intensities of the contact.
In some embodiments, an intensity sensor is used to determine a single characteristic intensity (e.g., a single characteristic intensity of a single contact). It should be noted that the intensity map is not part of the displayed user interface, but is included in fig. 5C-5D to assist the reader.
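The distance-based intensity assignment described above can be sketched as a small function implementing Ij = A·(Dj/ΣDi) as stated. The function name and the example distances are illustrative assumptions (the figures do not give numeric distances); the distances below are simply chosen so the result matches the 8/8/4/4/8 split of FIGS. 5C-5D.

```python
def assign_intensities(cumulative_intensity, distances):
    """Split a cumulative intensity A among contacts according to
    Ij = A * (Dj / sum(Di)), where Dj is the distance of contact j
    from the center of force, per the predefined function above."""
    total = sum(distances)
    return [cumulative_intensity * d / total for d in distances]

# Hypothetical distances for contacts 552A-552E chosen to reproduce the
# example: cumulative intensity of 32 units, with 552A, 552B, 552E
# receiving 8 units each and 552C, 552D receiving 4 units each.
print(assign_intensities(32, [2, 2, 1, 1, 2]))  # → [8.0, 8.0, 4.0, 4.0, 8.0]
```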
In some implementations, a portion of a gesture is identified for purposes of determining a characteristic intensity. For example, the touch-sensitive surface optionally receives a continuous swipe contact that transitions from a start position to an end position, at which point the intensity of the contact increases. In this example, the characteristic intensity of the contact at the end position is optionally based on only a portion of the continuous swipe contact, rather than the entire swipe contact (e.g., only the portion of the swipe contact at the end position). In some embodiments, a smoothing algorithm is optionally applied to the intensities of the swipe contact before determining the characteristic intensity of the contact. For example, the smoothing algorithm optionally includes one or more of: an unweighted moving-average smoothing algorithm, a triangular smoothing algorithm, a median filter smoothing algorithm, and/or an exponential smoothing algorithm. In some circumstances, these smoothing algorithms eliminate narrow spikes or dips in the intensities of the swipe contact for purposes of determining the characteristic intensity.
The intensity of a contact on the touch-sensitive surface is optionally characterized relative to one or more intensity thresholds, such as a contact detection intensity threshold, a light press intensity threshold, a deep press intensity threshold, and/or one or more other intensity thresholds. In some embodiments, the light press intensity threshold corresponds to an intensity at which the device will perform the operations typically associated with clicking a button of a physical mouse or a trackpad. In some embodiments, the deep press intensity threshold corresponds to an intensity at which the device will perform operations that are different from the operations typically associated with clicking a button of a physical mouse or a trackpad. In some implementations, when a contact is detected with a characteristic intensity below the light press intensity threshold (e.g., and above a nominal contact detection intensity threshold, below which the contact is no longer detected), the device will move a focus selector in accordance with movement of the contact across the touch-sensitive surface without performing an operation associated with the light press intensity threshold or the deep press intensity threshold. Generally, unless otherwise stated, these intensity thresholds are consistent across different sets of user interface figures.
The increase in contact characteristic intensity from an intensity below the light press intensity threshold to an intensity between the light press intensity threshold and the deep press intensity threshold is sometimes referred to as a "light press" input. The increase in contact characteristic intensity from an intensity below the deep-press intensity threshold to an intensity above the deep-press intensity threshold is sometimes referred to as a "deep-press" input. The increase in the contact characteristic intensity from an intensity below the contact detection intensity threshold to an intensity between the contact detection intensity threshold and the light press intensity threshold is sometimes referred to as detecting a contact on the touch surface. The decrease in the contact characteristic intensity from an intensity above the contact detection intensity threshold to an intensity below the contact detection intensity threshold is sometimes referred to as detecting a lift-off of contact from the touch surface. In some embodiments, the contact detection intensity threshold is zero. In some embodiments, the contact detection intensity threshold is greater than zero.
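The threshold bands described in the two paragraphs above can be sketched as a simple classifier. The numeric threshold values are illustrative assumptions; the disclosure does not fix specific values, only the ordering (contact detection < light press < deep press).

```python
# Illustrative threshold values in arbitrary intensity units.
IT_0 = 0.5   # contact detection intensity threshold (may also be zero)
IT_L = 3.0   # light press intensity threshold
IT_D = 6.0   # deep press intensity threshold

def classify(characteristic_intensity):
    """Map a characteristic intensity to the bands described above."""
    if characteristic_intensity < IT_0:
        return "no contact"
    if characteristic_intensity < IT_L:
        return "contact detected"   # focus selector moves; no press operation
    if characteristic_intensity < IT_D:
        return "light press"
    return "deep press"
```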
In some implementations described herein, one or more operations are performed in response to detecting a gesture that includes a respective press input or in response to detecting a respective press input performed with a respective contact (or contacts), wherein a respective press input is detected based at least in part on detecting an increase in intensity of the contact (or contacts) above a press input intensity threshold. In some implementations, the respective operation is performed in response to detecting that the intensity of the respective contact increases above a press input intensity threshold (e.g., a "downstroke" of the respective press input). In some embodiments, the press input includes an increase in intensity of the respective contact above a press input intensity threshold and a subsequent decrease in intensity of the contact below the press input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the press input threshold (e.g., an "upstroke" of the respective press input).
FIGS. 5E-5H illustrate detection of a gesture that includes a press input corresponding to an increase in the intensity of contact 562 from an intensity below a light press intensity threshold (e.g., "IT_L") in FIG. 5E to an intensity above a deep press intensity threshold (e.g., "IT_D") in FIG. 5H. The gesture performed with contact 562 is detected on touch-sensitive surface 560 while a user interface 570 that includes application icons 572A-572D displayed in a predefined area 574 is displayed, and while cursor 576 is displayed over application icon 572B corresponding to application 2. In some implementations, the gesture is detected on touch-sensitive display 504. The intensity sensors detect the intensity of contacts on touch-sensitive surface 560. The device determines that the intensity of contact 562 peaked above the deep press intensity threshold (e.g., "IT_D"). Contact 562 is maintained on touch-sensitive surface 560. In response to detecting the gesture, and in accordance with contact 562 having an intensity that rose above the deep press intensity threshold (e.g., "IT_D") during the gesture, reduced-scale representations 578A-578C (e.g., thumbnails) of recently opened documents for application 2 are displayed, as shown in FIGS. 5F-5H. In some embodiments, the intensity that is compared to the one or more intensity thresholds is the characteristic intensity of the contact. It should be noted that the intensity diagram for contact 562 is not part of the displayed user interface, but is included in FIGS. 5E-5H to assist the reader.
In some embodiments, the display of representations 578A-578C includes an animation. For example, representation 578A is initially displayed adjacent to application icon 572B, as shown in FIG. 5F. As the animation proceeds, representation 578A moves upward and representation 578B is displayed adjacent to application icon 572B, as shown in FIG. 5G. Representation 578A then moves upward, 578B moves upward toward representation 578A, and representation 578C is displayed adjacent to application icon 572B, as shown in FIG. 5H. Representations 578A-578C form an array above icon 572B. In some embodiments, the animation progresses in accordance with the intensity of contact 562, as shown in FIGS. 5F-5G, with representations 578A-578C appearing and moving upward as the intensity of contact 562 increases toward the deep press intensity threshold (e.g., "IT_D"). In some embodiments, the intensity on which the progress of the animation is based is the characteristic intensity of the contact. The operations described with reference to FIGS. 5E-5H may be performed using an electronic device similar or identical to device 100, 300, or 500.
In some implementations, the device employs intensity hysteresis to avoid accidental inputs, sometimes referred to as "jitter," wherein the device defines or selects a hysteresis intensity threshold that has a predefined relationship to the press input intensity threshold (e.g., the hysteresis intensity threshold is X intensity units lower than the press input intensity threshold, or the hysteresis intensity threshold is 75%, 90%, or some reasonable proportion of the press input intensity threshold). Thus, in some embodiments, the press input includes an increase in the intensity of the respective contact above the press input intensity threshold and a subsequent decrease in the intensity of the contact below the hysteresis intensity threshold corresponding to the press input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in the intensity of the respective contact below the hysteresis intensity threshold (e.g., the "upstroke" of the respective press input). Similarly, in some embodiments, a press input is detected only when the device detects an increase in contact intensity from an intensity at or below the hysteresis intensity threshold to an intensity at or above the press input intensity threshold and, optionally, a subsequent decrease in contact intensity to an intensity at or below the hysteresis intensity threshold, and the respective operation is performed in response to detecting the press input (e.g., the increase in contact intensity or the decrease in contact intensity, depending on the circumstances).
For ease of explanation, descriptions of operations performed in response to a press input associated with a press input intensity threshold, or in response to a gesture that includes a press input, are optionally triggered in response to detecting any of the following: an increase in contact intensity above the press input intensity threshold, an increase in contact intensity from an intensity below the hysteresis intensity threshold to an intensity above the press input intensity threshold, a decrease in contact intensity below the press input intensity threshold, and/or a decrease in contact intensity below the hysteresis intensity threshold corresponding to the press input intensity threshold. Additionally, in examples where an operation is described as being performed in response to detecting a decrease in the intensity of a contact below the press input intensity threshold, the operation is optionally performed in response to detecting a decrease in the intensity of the contact below a hysteresis intensity threshold that corresponds to, and is lower than, the press input intensity threshold.
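The hysteresis scheme above amounts to a small two-state machine: a "downstroke" is recognized when intensity rises above the press input threshold, and an "upstroke" only when it then falls below the lower hysteresis threshold, so small dips between the two thresholds do not register as releases. This sketch assumes the X-units/proportion relationship described above; class and event names are illustrative.

```python
class PressDetector:
    """Press-input detection with intensity hysteresis to suppress
    'jitter', per the scheme described above (illustrative sketch)."""

    def __init__(self, press_threshold=6.0, hysteresis_ratio=0.75):
        self.press_threshold = press_threshold
        # Hysteresis threshold as a proportion of the press input threshold.
        self.release_threshold = press_threshold * hysteresis_ratio
        self.pressed = False

    def update(self, intensity):
        """Feed one intensity sample; return 'downstroke', 'upstroke', or None."""
        if not self.pressed and intensity >= self.press_threshold:
            self.pressed = True
            return "downstroke"
        if self.pressed and intensity < self.release_threshold:
            self.pressed = False
            return "upstroke"
        return None

# A dip to 5 units (between the 4.5-unit hysteresis threshold and the
# 6-unit press threshold) does not end the press; only the drop to 4 does.
d = PressDetector()
print([d.update(x) for x in [1, 7, 5, 6.5, 4, 1]])
# → [None, 'downstroke', None, None, 'upstroke', None]
```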
As used herein, an "installed application" refers to a software application that has been downloaded onto an electronic device (e.g., device 100, 300, and/or 500) and is ready to be started (e.g., turned on) on the device. In some embodiments, the downloaded application becomes an installed application using an installer that extracts program portions from the downloaded software package and integrates the extracted portions with the operating system of the computer system.
As used herein, the term "open application" or "executing application" refers to a software application having retention state information (e.g., as part of device/global internal state 157 and/or application internal state 192). The open or executing application is optionally any of the following types of applications:
an active application currently displayed on the display screen of the device that is using the application;
a background application (or background process) that is not currently shown but for which one or more processes are being processed by one or more processors; and
a suspended or dormant application that is not running but has state information stored in memory (volatile and non-volatile memory, respectively) that can be used to resume execution of the application.
As used herein, the term "closed application" refers to a software application that does not have maintained state information (e.g., the state information of the closed application is not stored in the memory of the device). Thus, closing an application includes stopping and/or removing application processes of the application and removing state information of the application from memory of the device. Generally, when in a first application, opening a second application does not close the first application. The first application becomes a background application when the second application is displayed and the first application stops being displayed.
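The application lifecycle vocabulary defined above can be summarized as a small enumeration. This is an illustrative model only; the names and the helper predicate are not part of the disclosure.

```python
from enum import Enum

class AppState(Enum):
    """Application lifecycle states as described above (illustrative)."""
    ACTIVE = "active"          # currently displayed on the device's screen
    BACKGROUND = "background"  # not shown, but processes still executing
    SUSPENDED = "suspended"    # not running; state retained in volatile memory
    DORMANT = "dormant"        # not running; state retained in non-volatile memory
    CLOSED = "closed"          # no retained state information

def is_open(state):
    """An 'open' or 'executing' application is any application that
    retains state information, i.e., every state except CLOSED."""
    return state is not AppState.CLOSED
```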
Attention is now directed to embodiments of a user interface ("UI") and associated processes implemented on an electronic device, such as portable multifunction device 100, device 300, or device 500.
User interface and associated process
User interface for synchronized playback of content items
Users interact with electronic devices in many different ways, including using an electronic device to communicate with other electronic devices and to play content items. The embodiments described below provide ways for an electronic device in a communication session with one or more second electronic devices to play a content item in synchronization with playback of the content item at the one or more second electronic devices. Providing an efficient way of playing the content item in synchronization with other electronic devices enhances interactions with the device, thereby reducing the amount of time a user needs to ensure synchronized playback of the content item and reducing the power usage of the device, which increases battery life for battery-powered devices. It should be understood that people use devices. When a person uses a device, that person is optionally referred to as a user of the device.
Figs. 6A-6CC illustrate exemplary ways in which an electronic device 500a plays a content item in synchronization with playback of the content item at other electronic devices 500b and 500c of other users, according to some embodiments. The embodiments in these figures are used to illustrate the processes described below, including the processes described with reference to fig. 7. While figs. 6A-6CC illustrate various examples of the ways in which an electronic device may be able to perform the processes described below with reference to fig. 7, it should be understood that these examples are not meant to be limiting, and the electronic device may be able to perform one or more of the processes described below with reference to fig. 7 in ways not expressly described with reference to figs. 6A-6CC.
Figs. 6A-6CC illustrate various examples of playing a content item in synchronization with playback of the content item at other electronic devices participating in a communication session. For example, the electronic device is in a communication session, such as a group video call, group phone call, or group message chat, that includes synchronized playback of content items, such as audio content (e.g., music, audiobooks, podcasts, etc.) and/or video content (e.g., movies, episodic content series, etc.).
Fig. 6A-6C illustrate examples of the first electronic device 500a playing content in a private content playback mode of a communication session. For example, the first electronic device 500a, the second electronic device 500b, and one or more additional electronic devices not shown in fig. 6A-6C are participating in a communication session.
In fig. 6A, a first electronic device 500a presents a user interface of a content application while engaged in a communication session with a second electronic device 500b and optionally with one or more other electronic devices not shown in fig. 6A. The second electronic device 500b presents a home screen user interface that includes a plurality of icons 432-450 associated with applications accessible to (e.g., installed on) the second electronic device 500b and an expanded indication 616a of the communication session.
In some embodiments, when the second electronic device 500b is participating in the communication session, the second electronic device 500b displays the expanded indication 616a of the communication session. In some embodiments, the expanded indication 616a of the communication session includes an indication 618f of the group name of the user group included in the communication session, an option 618g for leaving the communication session, an option 618a for viewing a chat associated with the communication session, an option 618b for modifying audio settings for audio output of the communication session, an option 618c for muting or unmuting the second electronic device 500b in the communication session, an option 618d for viewing a video chat of the communication session, and an option 618e for joining or leaving shared media playback associated with the communication session. In some embodiments, in response to detecting an input to cease display of the expanded indication 616a of the communication session (e.g., a swipe input from the expanded indication 616a toward the upper edge of the display generation component 504b of the second electronic device 500b), the second electronic device 500b ceases display of the expanded indication 616a of the communication session and displays an indication of the communication session similar to the indication 602a displayed by the first electronic device 500a in fig. 6A (e.g., at a corresponding location in a user interface presented by the second electronic device 500b).
In some implementations, the content application user interface displayed by the first electronic device 500a is an album page that includes representations 610a-610c of content items included in respective albums available through a content service associated with the content application. It should be appreciated that one or more of the examples shown herein with respect to the album page presented by the first electronic device 500a are also applicable to other types of user interfaces presented in the content application, such as playlist pages, library pages, and other user interfaces that include one or more representations of content items and/or options for initiating playback of one or more content items.
Since the first electronic device 500a is in the communication session when presenting the album page, the first electronic device 500a displays an indication 602a of the communication session. In some embodiments, the first electronic device 500a displays the indication 602a (e.g., at the location shown in fig. 6A) while displaying user interfaces of other applications and/or other user interfaces of the content application. In some embodiments, when the first electronic device 500a displays an expanded indication of the communication session (e.g., similar to the expanded indication 616a of the communication session presented by the second electronic device 500b), the electronic device 500a does not display the indication 602a. In some implementations, in response to detecting selection of the indication 602a, the first electronic device 500a displays an expanded indication of the communication session (e.g., similar to the expanded indication 616a of the communication session presented by the second electronic device 500b). In some implementations, the first electronic device 500a displays the indication 602a shown in fig. 6A when the first electronic device 500a is participating in a communication session that does not include shared playback of content items. In some implementations, as will be described in greater detail below with reference to fig. 6E, when the electronic device 500a is participating in a communication session that includes shared playback of one or more content items, the electronic device 500a displays a different indication 602c of the communication session.
The album page optionally includes an option 604a for navigating to a previously presented user interface of the content application (e.g., an artist page associated with an artist of an album of the album page), an option 604b for adding the album to a library of users of the first electronic device 500a, an option 604c for viewing a menu of selectable options that cause the first electronic device 500a to perform one or more actions with respect to the album, an image 606a (e.g., "album cover") associated with the album, an indication 606b of an album name, an indication 606c of an artist of the album, an option 608a that, when selected, causes the electronic device 500a to initiate playback of content items in the album (e.g., all content items) in an order of appearance of the content items on the album, and an option 608b for initiating playback of content items in the album (e.g., all content items) in a random play order that is different from the order of appearance of the content items on the album. In some implementations, the album page also includes a playback control element 612a that includes an image 614a (e.g., album art) associated with the content item currently being played on the electronic device 500a, an indication 614c of the name of the content item currently being played on the electronic device 500a, an indication 614b of a speaker in communication with the first electronic device 500a being used to playback the content item, an option 614d for pausing playback of the content, and an option 614e for skipping the rest of the content item and playing the next content item to be played.
In some implementations, the album page also includes representations 610a-610c of the content items included in the album. In response to detecting a selection of one of the representations 610a-610c of content items included in the album, the electronic device 500a optionally initiates playback of the content item corresponding to the selected representation. In some implementations, the electronic device 500a includes settings that, when activated (e.g., turned on), cause the electronic device 500a to automatically play a content item (e.g., in a music application) in a shared content playback mode of a communication session in response to receiving an input (such as the input shown in fig. 6A) corresponding to a request to play the content item (e.g., in the music application) while the electronic device 500a is in the communication session that includes content sharing. As shown in fig. 6A, the first electronic device 500a detects selection of a representation 610b of one of the content items included in the album (e.g., via contact 603a). In some implementations, if the settings for playing the content item in the shared content playback mode (e.g., without further input) are active, the electronic device 500a initiates playback of the content item in the shared content playback mode in response to the input shown in fig. 6A, as shown in fig. 6E, without presenting the menu 620 to select between playing the content item in the shared content playback mode or playing the content item in the private content playback mode. In some implementations, in response to the input shown in fig. 6A (e.g., in accordance with a determination that the settings for automatically playing the content item in the shared content playback mode in response to a request to play the content item in a communication session that includes content sharing are in an inactive state (e.g., turned off)), the first electronic device 500a displays the menu 620 shown in fig. 6B (e.g., as part of a process for initiating playback of the content item).
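The setting-dependent behavior just described (automatically entering shared playback versus presenting menu 620) reduces to a small decision function. The function name and the string return values are illustrative assumptions, not an actual API of the disclosed devices.

```python
def handle_play_request(in_session_with_sharing, auto_share_setting):
    """Sketch of the decision for a play input received at a device,
    per the behavior described above (illustrative only)."""
    if not in_session_with_sharing:
        return "play_private"   # not in a content-sharing session
    if auto_share_setting:
        return "play_shared"    # setting active: shared mode, no menu (fig. 6E)
    return "show_menu"          # setting inactive: present menu 620 (fig. 6B)
```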
Fig. 6B illustrates a menu 620 displayed, for example, in response to the user input illustrated in fig. 6A (e.g., in accordance with a determination that a setting for automatically playing a content item in a shared content playback mode in response to a request to play the content item in a communication session that includes content sharing is in an inactive state). Menu 620 optionally includes an option 622a for initiating playback of a content item corresponding to representation 610b selected by the user in fig. 6A in the private content playback mode of the communication session at first electronic device 500a, an option 622b for initiating playback of a content item corresponding to representation 610b in the shared content playback mode of the communication session, and an option 622c for stopping display of menu 620 (e.g., not initiating playback of a content item). In some implementations, the private content playback mode of the communication session is a mode in which the first electronic device 500a plays a content item (such as the content item corresponding to representation 610b) at the first electronic device 500a without causing other electronic devices in the shared content playback mode of the communication session to initiate playback of the content item. In some implementations, the shared content playback mode of the communication session is a mode in which the first electronic device 500a plays a content item (such as the content item corresponding to representation 610b) at the first electronic device and causes one or more electronic devices in the shared content playback mode of the communication session to also play the content item. As shown in fig. 6B, the first electronic device 500a detects selection of option 622a (e.g., via contact 603b) to play the content item corresponding to representation 610b in the private content playback mode of the communication session.
In some implementations, in response to the input shown in fig. 6B, the first electronic device 500a initiates playback of the content item corresponding to the representation 610B selected in fig. 6A in the private content playback mode, as shown in fig. 6C. For example, playing the content item in the private content playback mode causes the first electronic device 500a to play the content item without causing other electronic devices in the communication session to initiate playback of the content item.
For example, fig. 6C shows the first electronic device 500a playing a content item in response to the input sequence shown in fig. 6A-6B. In some implementations, in response to initiating playback of the content item, the first electronic device 500a updates the playback control element 612a to include an indication 614g of the title of the content item currently being played and an image 614f of the content item currently being played. In some implementations, because the first electronic device 500a is playing the content item in the private content playback mode, the first electronic device 500a continues to display an indication 602a of the communication session indicating that the first electronic device 500a is not engaged in shared media playback in the communication session. In some implementations, when the first electronic device 500a initiates playback of the content item, the second electronic device 500b does not initiate playback of the content item because the first electronic device 500a is playing the content item in the private content playback mode. Conversely, if the first electronic device 500a initiates playback of the content item in the shared content playback mode of the communication session, the second electronic device 500b will also initiate playback of the content item if it is in the shared content playback mode of the communication session, as will now be described with reference to fig. 6D-6G.
In some embodiments, in fig. 6D, the first electronic device 500a detects selection of option 622b in menu 620 (e.g., with contact 603d). Menu 620 is described in more detail above with reference to fig. 6B. In some implementations, the menu 620 in fig. 6D is displayed in response to the user input described above with reference to fig. 6A (e.g., in accordance with a determination that the settings for automatically playing content items in the shared content playback mode in response to a request to play content items in a communication session that includes content sharing are inactive). In some implementations, the option 622b selected in fig. 6D corresponds to a request to initiate playback of the content item selected in fig. 6A in the shared content playback mode of the communication session. In response to detecting selection of option 622b, the first electronic device 500a initiates playback of the content item selected in fig. 6A at the first electronic device 500a and transmits, to the other electronic devices in the communication session, an indication to also initiate playback of the content item selected in fig. 6A, as shown in fig. 6E. In some implementations, in accordance with a determination that the setting for automatically playing a content item in the shared content playback mode in response to a request to play the content item in a communication session that includes content sharing is active, the electronic device 500a initiates playback of the content item in the shared content playback mode as shown in fig. 6E in response to the input shown in fig. 6A, without displaying the menu 620 shown in figs. 6B and 6D and without receiving the input shown in fig. 6D.
Fig. 6E illustrates the electronic devices 500a and 500b initiating playback of the content item selected in fig. 6A in the shared content playback mode of the communication session, either in response to the input shown in fig. 6D (e.g., in accordance with a determination that the setting for automatically playing the content item in the shared content playback mode in response to a request to play the content item in a communication session that includes content sharing is in an inactive state) or in response to the input shown in fig. 6A (e.g., in accordance with a determination that the setting is in an active state). In some implementations, the first electronic device 500a displays an indication 602c of the communication session indicating that content is being shared in the communication session, and updates the playback control element 612a to include an indication 614g of the title of the content item being played in the shared content playback mode of the communication session, an image 614f (e.g., album art) associated with the content item being played in the shared content playback mode of the communication session, and an indication 614h of the number of users listening to the content item in the communication session. For example, the playback control element 612a in fig. 6E includes an indication 614h that two users are listening to the content item in the shared content playback mode, such as at the first electronic device 500a and at another electronic device in the communication session that is not shown in fig. 6E.
In some embodiments, the first electronic device 500a transmits an indication of the input shown in fig. 6D to the other electronic devices in the communication session, including the electronic device 500b (e.g., via one or more servers). In response to receiving the indication of the input, the second electronic device 500b presents an indication 624 of the input received at the first electronic device 500a, e.g., as shown in fig. 6E. The indication 624 may include an image 626a associated with the user of the first electronic device 500a (e.g., because the first electronic device 500a detected the input), an indication 626b that the content application is being used to play the content item in the shared content playback mode of the communication session, an indication 626c of the user who provided the input and the title of the content item being played in the shared content playback mode of the communication session, and a selectable option 626d that, when selected, causes the second electronic device 500b to initiate playback of the content item in the shared content playback mode of the communication session. The second electronic device 500b may also present an indication 602d of the communication session indicating that content is being shared in the communication session, even though the second electronic device 500b is not currently playing the content item in the shared content playback mode. In some embodiments, if the second electronic device 500b was already playing content in the shared content playback mode of the communication session upon receiving the indication of the input in fig. 6D, the second electronic device 500b forgoes presenting the indication 624 shown in fig. 6E and instead presents an indication similar to the indication 630 shown in fig. 6H, the indication 660a shown in fig. 6Q, the indications 660b and 660c in fig. 6R, or the indication 660d in fig. 6V, as will be described in more detail below.
In some implementations, as shown in fig. 6E, the second electronic device 500b detects selection (e.g., via contact 603E) of the selectable option 626d to play the content item in the shared content playback mode of the communication session. In some implementations, in response to the input shown in fig. 6E, if the second electronic device 500b is authorized to access content via a content (e.g., streaming, playback, browsing, library, etc.) service associated with the communication session, the second electronic device 500b initiates playback of the content item in the shared content playback mode from the playback position at which the other electronic devices in the communication session are currently playing the content item in the shared content playback mode, which may be the same as or different from the beginning of the content item. As shown in fig. 6F, if the second electronic device 500b is not authorized to access content through the content service associated with the communication session, the second electronic device 500b presents a user interface from which a process of obtaining authorization to access content via the content service may be initiated (e.g., without playing the content item in the shared content playback mode). For example, the second electronic device 500b is authorized to access content through the content service if the user account associated with the second electronic device 500b has an active subscription to the content service.
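The join-in-progress behavior described above (a newly joining device starting from the position at which the other devices are currently playing, rather than from the beginning of the content item) can be sketched as follows. This is an illustrative Python model; the class and attribute names are hypothetical and not part of the described implementation:

```python
import time

class SharedPlaybackSession:
    """Hypothetical model of a shared content playback session.

    started_at: wall-clock time at which shared playback of the current
    content item began; paused_at: wall-clock time of a pause, if paused.
    """

    def __init__(self, content_id, started_at, paused_at=None):
        self.content_id = content_id
        self.started_at = started_at
        self.paused_at = paused_at

    def current_position(self, now=None):
        """Playback position a newly joining device should start from,
        so it stays synchronized with devices already playing."""
        reference = self.paused_at if self.paused_at is not None else (now or time.time())
        return max(0.0, reference - self.started_at)

# A device joining 42 seconds after shared playback began starts at
# position 42, not at the beginning of the content item.
session = SharedPlaybackSession("song-1", started_at=1000.0)
print(session.current_position(now=1042.0))  # 42.0
```

If the session is paused when the device joins, the joining device starts at the paused position instead, so that a subsequent resume keeps all devices aligned.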
Fig. 6F illustrates an exemplary user interface with which the second electronic device 500b may obtain authorization to access content through the content service associated with the communication session. In some embodiments, the second electronic device 500b displays the user interface shown in fig. 6F in response to detecting the input shown in fig. 6E while not being authorized to access content through the content service associated with the communication session. For example, the user interface includes a selectable option 638a for ceasing display of the user interface without initiating the process of obtaining authorization for the content service, an indication 638b that users authorized for the content service are able to use the content service to play the content item in the shared content playback mode of the communication session, information 638c about authorization terms for the content service (e.g., subscription information for a subscription-based content service), a selectable option 638d that, when selected, causes the second electronic device 500b to initiate the process of obtaining authorization for the content service, and an additional selectable option that, when selected, causes the second electronic device 500b to present information about other subscription plans that authorize the user of the second electronic device 500b to access content through the content service associated with the communication session. In some embodiments, the content service is a subscription-based content service, and the second electronic device 500b is authorized to access content in the communication session as long as the user account associated with the second electronic device 500b has an active subscription to the content service. For example, selecting the selectable option 638d may cause the second electronic device 500b to initiate a process of starting a free trial of a subscription to the content service.
In some implementations, the second electronic device 500b displays the indication 602d of the communication session with shared content while presenting a user interface related to obtaining authorization for the content service associated with the communication session.
As shown in fig. 6F, in some embodiments, the second electronic device 500b detects selection (e.g., via contact 603F) of the selectable option 638d to initiate the process of obtaining authorization for the content service associated with the communication session. In some embodiments, in response to the input, the second electronic device 500b initiates that process. In some embodiments, the process of obtaining authorization for the content service includes starting a free trial and collecting payment information to maintain the subscription after the free trial period ends.
Fig. 6G illustrates an example of the first electronic device 500a and the second electronic device 500b playing a content item in the shared content playback mode of the communication session. In some implementations, the second electronic device 500b displays the user interface shown in fig. 6G and automatically initiates playback of the content item in the shared content playback mode of the communication session after completing the process of obtaining authorization for the content service associated with the communication session (e.g., after performing the process initiated in response to the input shown in fig. 6F). In some implementations, if the second electronic device 500b is already authorized to access content through the content service associated with the communication session when the indication of the input shown in fig. 6D is received, the second electronic device 500b initiates playback of the content item in the shared content playback mode and displays the user interface shown in fig. 6G in response to detecting the input shown in fig. 6E, without performing the process of gaining access to the content service associated with the communication session. In some embodiments, the second electronic device 500b does not present the user interface shown in fig. 6G in response to the input shown in fig. 6E (e.g., although it still performs the process of gaining access to the content service associated with the communication session), but rather presents the user interface shown in fig. 6G in response to detecting one or more inputs corresponding to a request to display a user interface of a content application associated with the communication session (e.g., associated with the content service associated with the communication session).
In some implementations, as shown in fig. 6G, in response to the second electronic device 500b initiating playback of the content item in the shared content playback mode of the communication session (e.g., in response to receiving an indication that the second electronic device initiated playback of the content item in the shared content playback mode of the communication session), the first electronic device 500a updates the indication 614i of the number of users listening to the content in the shared content playback mode of the communication session. For example, if two devices were playing the content item in the shared content playback mode before the second electronic device 500b initiated playback, and now the second electronic device 500b is also playing the content in the shared content playback mode, the first electronic device 500a may present an indication 614i that three devices are playing the content in the shared content playback mode of the communication session.
Fig. 6G illustrates an exemplary user interface of a content application, associated with the content service associated with the communication session, presented by the second electronic device 500b while playing a content item in the shared content playback mode of the communication session. In some embodiments, while presenting the user interface, the second electronic device 500b presents the indication 602d of the communication session including shared content playback. In some embodiments, the user interface includes an image 628a associated with the content item currently being played, an indication 628b of the title of the content item, an indication 628d of the artist of the content item, a selectable option 628c that, when selected, causes the electronic device 500b to present an action menu associated with the content item, a selectable option 628f for jumping to the beginning of the content item (or, if selected again, to the previously played content item), a selectable option 628g for pausing the content item, a selectable option 628h for jumping to the next content item in the shared content item queue, a slider 628i for controlling the playback volume of the content item, a selectable option 628j for viewing time-synchronized lyrics of the content item, a selectable option 628k for viewing audio output options for playback of the content item, and a selectable option 628m for viewing the shared playback queue. In some embodiments, as will be described in greater detail with reference to fig. 6O-6X, when one or more electronic devices are playing content in the shared content playback mode of the communication session, the electronic devices can access a shared content item queue of content items to be played in the shared content playback mode.
As shown in fig. 6G, in some embodiments, the second electronic device 500b detects selection of the selectable option 628G (e.g., via contact 603G) to pause the content item currently being played. In response to the input shown in fig. 6G, in some embodiments, all electronic devices playing the content item in the shared content playback mode of the communication session pause playback of the content item, as shown in fig. 6H. In some implementations, the second electronic device 500b transmits an indication of the input shown in fig. 6G to other electronic devices in the communication session (e.g., via one or more servers), and the other electronic devices pause the content item in response to receiving the indication.
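The propagation of a pause input to every device in the session, as described above, can be sketched as a simple broadcast relayed through a server. The classes below are hypothetical stand-ins for the one or more servers and electronic devices in the communication session, not the disclosed implementation:

```python
class Server:
    """Hypothetical relay standing in for the one or more servers."""

    def __init__(self):
        self.devices = []

    def broadcast(self, event, sender):
        # Relay the indication of the input to every other device.
        for device in self.devices:
            if device is not sender:
                device.receive(event)

class Device:
    """Hypothetical electronic device playing in the shared mode."""

    def __init__(self, name, server):
        self.name = name
        self.playing = True
        self.server = server
        server.devices.append(self)

    def pause_shared(self):
        # Pause locally, then transmit an indication of the input so
        # all other devices in the session pause as well.
        self.playing = False
        self.server.broadcast({"type": "pause", "by": self.name}, sender=self)

    def receive(self, event):
        if event["type"] == "pause":
            self.playing = False  # pause in response to the indication

session_server = Server()
device_a = Device("first device", session_server)
device_b = Device("second device", session_server)

device_b.pause_shared()   # input detected at the second device...
print(device_a.playing)   # False -- the first device paused too
```

The same relay pattern applies to resume, skip, and seek inputs, so any participant's playback-modifying input is reflected at every device playing in the shared content playback mode.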
For example, fig. 6H illustrates the first electronic device 500a and the second electronic device 500b after pausing playback of the content item in the shared content playback mode of the communication session in response to the input (e.g., an indication of the input) illustrated in fig. 6G. In some implementations, in response to receiving the indication of the input, the first electronic device 500a pauses playback of the content item, updates the playback control element 612a to include an option 614j for resuming playback of the content item (e.g., in lieu of the option 614d for pausing the content item that is displayed while the content item is playing, such as shown in fig. 6G), and displays an indication 630 of the input that paused the content. In some implementations, the indication 630 of the input that paused the content includes an image 632a associated with the user of the second electronic device 500b and text 632b indicating the name of the user who paused the content and that the user paused the content. In some embodiments, the first electronic device 500a continues to display the indication 602c of the communication session including the shared content playback. In some embodiments, because the first electronic device 500a receives the indication of the input while displaying a user interface associated with the content service associated with the communication session (e.g., a content application associated with the content service), the indication 630 does not include an indication of the content application and is not selectable to display the user interface of the content application. In some embodiments, as will be described in greater detail below with reference to fig. 6I, if the first electronic device 500a had received the indication of the input pausing the content while presenting a user interface other than the user interface of the content application associated with the communication session, the indication would include an indication of the content application and would be selectable to display the user interface of the content application associated with the communication session.
As shown in fig. 6H, in some embodiments, in response to detecting the input to pause the content, the second electronic device 500b pauses the content and updates the user interface to include a selectable option 628n for resuming playback of the content item in the shared content playback mode. In some implementations, the second electronic device 500b displays the selectable option 628n for resuming playback of the content item shown in fig. 6H at the same location at which the second electronic device 500b displayed the selectable option 628g for pausing the content item shown in fig. 6G. In some embodiments, the second electronic device 500b continues to display the indication 602d of the communication session including the shared content playback.
In some implementations, the second electronic device 500b receives an indication of the input, shown in fig. 6D, received at the first electronic device 500a to initiate playback of a content item in the shared content playback mode of the communication session while the second electronic device 500b does not have access to a content application associated with the communication session. For example, the content application is not downloaded and/or installed on the second electronic device 500b when the second electronic device 500b receives the indication of the input to initiate playback of the content item in the shared content playback mode of the communication session. In some implementations, as shown in fig. 6I, in response to receiving the indication of the input received at the first electronic device 500a to initiate playback of the content item in the shared content playback mode while the second electronic device 500b does not have access to the content application associated with the communication session, the second electronic device 500b presents an indication 634 of the input to initiate playback of the content item in the shared content playback mode.
As shown in fig. 6I, in some embodiments, the second electronic device 500b presents the indication 634 in response to receiving the indication of the input, received at the first electronic device 500a, to initiate playback of the content item in the shared content playback mode while the second electronic device 500b does not have access to the content application associated with the communication session. In some embodiments, the indication 634 shown in fig. 6I may be the same as the indication 624 shown in fig. 6E that the second electronic device 500b presents in response to receiving an indication of the input to initiate playback of a content item in the shared content playback mode while the second electronic device 500b does have access to the content application associated with the communication session, except that the indication 624 includes a selectable option 626d to initiate playback of the content item in the shared content playback mode (e.g., after access to the content service is obtained), whereas the indication 634 includes a selectable option 636d to initiate a process of obtaining access to the content application associated with the communication session. In some embodiments, in response to detecting selection of the selectable option 636d, the second electronic device 500b presents an application store user interface that includes a selectable option that, when selected, causes the second electronic device 500b to initiate the process of obtaining access to the content application associated with the communication session, such as a process of downloading and/or installing the content application. In some implementations, once the second electronic device 500b has access to the content application associated with the communication session, the second electronic device 500b can play the content item in the shared content playback mode of the communication session.
In some embodiments, the content service associated with the communication session provides access to a variety of different types of content. For example, a music content service provides access to songs, albums, playlists, and live radio stations. In some embodiments, not all types of content can be played in the shared content playback mode of the communication session. For example, songs, albums, and playlists can be played in the shared content playback mode, but live radio stations cannot.
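A minimal sketch of the content-type eligibility check implied above, assuming (as in the example) that songs, albums, and playlists are shareable while live radio stations are not; the set contents and function name are illustrative, not part of the disclosure:

```python
# Content types that may be played in the shared content playback
# mode, per the example in the text (an assumption, not a spec).
SHAREABLE_TYPES = {"song", "album", "playlist"}

def can_play_shared(content_type: str) -> bool:
    """Return True if this type of content item can be played in the
    shared content playback mode of the communication session."""
    return content_type in SHAREABLE_TYPES

print(can_play_shared("song"))        # True
print(can_play_shared("live_radio"))  # False
```

A device in the shared content playback mode would consult such a check before honoring a play request, and fall back to offering private playback (as in figs. 6J-6K) when the check fails.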
Fig. 6J illustrates user interfaces in which the first electronic device 500a and the second electronic device 500b present the content application associated with the communication session for playing live radio stations, in accordance with some embodiments. In some embodiments, the user interface includes an indication 640j of the name of the radio station, an option 640k for viewing a live program schedule of the radio station, images 640m and 640a associated with the live radio program currently being streamed, indications 640n and 640d of the start time of the live radio program, indications 640p and 640e of the title of the radio program, selectable options 640q and 640f for playing the radio program, indications 640r and 640g of a second radio station, selectable options 640s and 640h for viewing a live broadcast schedule of the second radio station, and images 640t and 640i associated with the radio program currently being played on the second radio station.
In some embodiments, in the example of fig. 6J, the first electronic device 500a is in the private content playback mode of the communication session and the second electronic device 500b is in the shared content playback mode of the communication session. For example, because the first electronic device 500a is in the private content playback mode of the communication session, the extended indication 616b of the communication session includes an indication 618m that the first electronic device 500a is not participating in playback of the shared content in the communication session, and the content playback control element 612a does not include an indication of the number of users listening in the shared content playback mode (e.g., indication 614h in fig. 6I). As another example, because the second electronic device 500b is in the shared content playback mode, the content playback control element 612b displayed by the second electronic device 500b includes an indication 614t of the number of users listening to the content in the shared content playback mode of the communication session. In fig. 6J, for example, the second electronic device 500b displays the indication 602d of the communication session indicating that the communication session includes shared content playback. In some embodiments, the second electronic device 500b does not display an extended indication of the communication session because the second electronic device 500b displays the indication 602d. If the second electronic device 500b were displaying an extended indication of the communication session instead of the indication 602d, the extended indication would include an indication that the second electronic device 500b is participating in playback of the shared content, rather than an indication 618m that it is not participating in playback of the shared content in the communication session.
As shown in fig. 6J, the first electronic device 500a may detect (e.g., via contact 603i) selection of option 640f to initiate playback of the live radio broadcast, and the second electronic device 500b may detect (e.g., via contact 603j) selection of option 640q to initiate playback of the live radio broadcast. In some embodiments, as shown in fig. 6K, the first electronic device 500a initiates playback of the live radio broadcast in the private content playback mode because the first electronic device 500a is in the private content playback mode when the input is received. In some embodiments, as shown in fig. 6K, the second electronic device 500b forgoes initiating playback of the live radio broadcast in response to the input because it is not possible to play the live radio broadcast in the shared content playback mode and the second electronic device 500b is in the shared content playback mode when the input is received.
Fig. 6K illustrates the first electronic device 500a and the second electronic device 500b responding to the inputs illustrated in fig. 6J, in accordance with some embodiments of the present disclosure. In some embodiments, the first electronic device 500a initiates playback of the live radio broadcast because the first electronic device 500a is in the private content playback mode of the communication session when the first electronic device 500a detects the input shown in fig. 6J. For example, the electronic device 500a updates the content playback control element 612a to include an indication 614m of the title of the live radio broadcast and an image 614k associated with the live radio broadcast. In some embodiments, the first electronic device 500a continues to display the extended indication 616b of the communication session, including the indication 618m that the first electronic device 500a is not participating in playback of the shared content in the communication session.
In some embodiments, the second electronic device 500b does not initiate playback of the live radio broadcast in fig. 6K because the second electronic device 500b is in the shared content playback mode of the communication session when the second electronic device 500b receives the input shown in fig. 6J. For example, in response to the input shown in fig. 6J, the second electronic device 500b presents an indication 642 that it is not possible to play the live radio broadcast in the shared content playback mode of the communication session while continuing to play the content being played in the shared content playback mode. In some implementations, the indication 642 includes a selectable option 644a that, when selected, causes the second electronic device 500b to stop playback of the content in the shared content playback mode and initiate playback of the live radio broadcast in the private content playback mode without modifying playback of the content item in the shared content playback mode by the other electronic devices in the communication session. In some embodiments, the indication 642 further includes a selectable option 644b that, when selected, causes the second electronic device 500b to cease displaying the indication 642 without initiating playback of the live radio broadcast. In some implementations, in response to detecting selection of the option 644b, the second electronic device 500b continues to play the content item in the shared content playback mode of the communication session without modifying playback of the content item in the shared content playback mode at the second electronic device 500b or at any other electronic device playing the content in the shared content playback mode of the communication session.
In some implementations, when the electronic device is playing content in the shared content playback mode of the communication session, the electronic device can temporarily transition to the private content playback mode to play the preview of the content item in response to detecting an input requesting to play the preview of the content item, as shown in fig. 6L-6N. In this way, a user of the electronic device can privately preview a song, for example, before playing the song in the shared content playback mode of the communication session.
For example, in fig. 6L, the first electronic device 500a displays the album page described above with reference to fig. 6A while playing content in the shared content playback mode of the communication session, and the second electronic device 500b presents the playback user interface described above with reference to fig. 6G while playing content in the shared content playback mode of the communication session. In some implementations, because both the first electronic device 500a and the second electronic device 500b play content in the shared content playback mode, the first electronic device 500a and the second electronic device 500b play the same content item with synchronized playback (e.g., the first electronic device 500a and the second electronic device 500b are playing the content item at the same playback position). For example, when playing a content item in the shared content playback mode, the first electronic device 500a detects a secondary selection of one of the representations 610c of the content item in the album (e.g., via contact 603L). In some embodiments, the secondary selection may be a long press, a hard tap, a right click, or similar input. In some embodiments, in response to detecting the secondary selection in fig. 6L, the first electronic device 500a may display the menu shown in fig. 6M.
In some implementations, as shown in fig. 6M, in response to the input shown in fig. 6L, the first electronic device 500a displays a menu of selectable options 646a-646i related to the content item corresponding to the representation 610c. For example, the menu includes an option 646a for adding the content item to a library of the user account associated with the first electronic device 500a, an option 646b for initiating a process of adding the content item to a playlist, an option 646c for adding the content item to the beginning of a content item playback queue (e.g., the shared content item playback queue), an option 646d for adding the content item to the end of a content item playback queue (e.g., the shared content item playback queue), an option 646e for playing a preview of the content item, an option 646f for initiating a process of sharing the content item with another user account or electronic device, an option 646g for viewing the complete lyrics of the content item, an option 646h for creating a radio station that includes content similar to the content item (e.g., the same or a similar genre, artist, etc.), and an option 646i for designating the content item as "liked" by the user of the first electronic device 500a. As shown in fig. 6M, in some embodiments, the first electronic device 500a detects selection of the option 646e to play a preview of the content item. In some implementations, if a secondary selection of the representation 610c of the content item is detected while the first electronic device 500a is not in the shared content playback mode of the communication session, the first electronic device 500a forgoes displaying the option 646e, e.g., because the first electronic device 500a is in the private content playback mode of the communication session or because the first electronic device 500a is not engaged in the communication session.
In some implementations, in response to detecting selection of the option 646e, the first electronic device 500a plays a preview of the content item in the private content playback mode without causing the other electronic devices playing content in the shared content playback mode of the communication session to play the preview of the content item. In some implementations, the preview of the content item is a predetermined portion (e.g., subset) of the content item. For example, a preview of a content item is a thirty-second sample of the content item. In some implementations, the user can request to preview a song (e.g., while the electronic device 500a is in the shared content playback mode of the communication session) by selecting an icon or image, associated with the preview of the song, included in the representation 610c of the song. In some implementations, the user need not cause the electronic device 500a to display the options 646a-646i to preview the song, but rather may select the icon or image associated with the preview of the song. In some implementations, the electronic device 500a initiates playback of the preview of the song in response to selection of the icon or image, associated with the preview of the song, included in the representation 610c of the song. In some implementations, in response to selection of a portion of the representation 610c of the song other than the icon or image associated with the preview of the song, the electronic device 500a presents the user interface in fig. 6D, 6M, or 6P, or initiates playback of the song.
For example, fig. 6N shows the first electronic device 500a playing a preview of a content item in private content playback mode, while the second electronic device 500b continues to play another content item in shared content playback mode. In some implementations, as the preview of the content item is played, the first electronic device 500a updates the playback control element 612a to include an indication 614p of the title of the content item and an image 614n corresponding to the content item. In some implementations, the second electronic device 500b continues to display the playback user interface shown in fig. 6M as the second electronic device 500b continues to play the content item in the shared playback mode of the communication session. In some implementations, after the first electronic device 500a finishes playing the preview of the content item, the first electronic device 500a automatically resumes playback of the content in the shared content playback mode from a playback position that is later than the playback position at which the first electronic device stopped playing the content in the shared content playback mode (e.g., remains synchronized with playback of the content at other electronic devices in the communication session, which continue playing the content in the shared content playback mode while the first electronic device 500a is playing a preview of the other content item in the private content playback mode).
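Resuming shared playback after a private preview, as described above, amounts to rejoining at a position advanced by however long the device was away, since the other devices kept playing in the meantime. A minimal sketch, assuming shared playback continued at normal rate for the duration of the preview (function name and parameters are illustrative):

```python
def resume_position(position_at_switch: float, time_away: float) -> float:
    """Position at which a device rejoins shared playback after
    privately playing a preview: the shared playback kept advancing
    while the device was away, so the resume position is later than
    the position at which it left."""
    return position_at_switch + time_away

# A device that left shared playback at 90 s and spent 30 s playing a
# private preview resumes at 120 s, in sync with the other devices.
print(resume_position(90.0, 30.0))  # 120.0
```

In practice the resume position would be derived from the session's shared clock (as in the join-in-progress case) rather than from a locally measured duration, so that pauses or skips by other participants during the preview are also reflected.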
In some implementations, when playing a content item in a shared content playback mode, an electronic device in a communication session plays content from a shared content playback queue. In some embodiments, the shared content playback queue includes a plurality of content items in an ordered list, and the electronic device plays the content items in the order of the queue, automatically playing the next content item after each content item has completed playing. In some embodiments, all users in the communication session (e.g., users playing content in a shared content playback mode) are able to edit the queue, including adding content items to the queue, removing content items from the queue, and reordering content items in the queue. Fig. 6O-6X illustrate examples of an electronic device in a communication session interacting with a shared content item queue according to some embodiments.
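The shared content item queue described above can be modeled as an ordered list that any participant may edit, with automatic advancement to the next item when the current one finishes. The class below is an illustrative sketch under that assumption, not the disclosed implementation:

```python
class SharedQueue:
    """Minimal sketch of a shared content item queue; any user in the
    communication session may add, remove, or reorder items."""

    def __init__(self):
        self.items = []  # each entry: (content_id, added_by)

    def add_to_end(self, content_id, user):
        self.items.append((content_id, user))

    def add_to_front(self, content_id, user):
        # "Play next": inserted at the head of the queue.
        self.items.insert(0, (content_id, user))

    def remove(self, content_id):
        self.items = [item for item in self.items if item[0] != content_id]

    def move(self, content_id, new_index):
        # Reorder: any participant may drag an item to a new position.
        item = next(item for item in self.items if item[0] == content_id)
        self.items.remove(item)
        self.items.insert(new_index, item)

    def pop_next(self):
        # Called when the current item finishes; automatic advance.
        return self.items.pop(0) if self.items else None

queue = SharedQueue()
queue.add_to_end("song-a", "user-1")
queue.add_to_end("song-b", "user-2")
queue.add_to_front("song-c", "user-1")   # "play next"
print([item[0] for item in queue.items])  # ['song-c', 'song-a', 'song-b']
```

Edits made at one device would be relayed to the other devices in the session (e.g., via one or more servers) so that every participant sees and plays from the same queue state.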
In some implementations, as shown in fig. 6O, the second electronic device 500b displays the shared content item queue for the communication session. For example, the second electronic device 500b displays an indication 648e of the queue that includes the name of the group in the communication session, an option 648f for playing the content in a random order, an option 648g for repeating the sequence of content, an option 648h for automatically playing additional content after all of the content items in the queue have been played, multiple representations 650a-650c of content items in the queue, an indication 654 of content items to be played after all of the content items in the queue have been played, and representations 650d-650e of content items to be played after all of the content items in the queue have been played. In some implementations, the representation 650a of a content item in the shared content item queue includes an image 652a associated with the content item, a representation 652b of the user who added the content item to the queue, an indication 652c of the title of the content item, and a user interface element 652d for dragging the content item to a different location in the queue. In some embodiments, representations 650b and 650c include elements similar to those of representation 650a. In some implementations, the representation 650d of a content item to be played after all of the content items in the queue have been played includes an image 652e associated with the content item, an indication 652f of the title of the content item, and a user interface element 652g that can be dragged to change the playback position of the content item relative to other content items in the queue or in the list of items to be played after all of the items in the queue.
In some embodiments, content items to be played after playing all items in the shared content playback queue are selected based on content items in the shared content item queue and/or content consumption history of users in the communication session (e.g., by one or more of the electronic devices in the communication session and/or one or more servers in communication with the electronic devices in the communication session).
In some embodiments, when displaying representations 650a-650c of content items in the queue and representations 650d-650e of content items to be played after all of the content items in the queue have been played, the second electronic device 500b also displays an image 648a associated with the content item currently being played, an indication 648b of the content item currently being played, an indication 648c of the artist of the content item currently being played, an option 648d that, when selected, causes the second electronic device 500b to present a plurality of options for actions to be performed with respect to the content item currently being played on the second electronic device 500b, and an indication 602d of the communication session indicating that the communication session includes shared playback of content.
In some embodiments, as shown in fig. 6O, the first electronic device 500a displays the album page user interface described in more detail above with reference to fig. 6A while playing content in the shared content playback mode of the communication session. As shown in fig. 6O, in some implementations, the first electronic device 500a detects selection of the representation 610c of the content item (e.g., via contact 603 n). In some implementations, in response to the input shown in fig. 6O, the first electronic device 500a displays a plurality of options for initiating playback of the content item in the shared content playback mode.
Fig. 6P illustrates an example of a user interface presented by the first electronic device 500a in response to the input shown in fig. 6O, in accordance with some embodiments. In some embodiments, in response to the input shown in fig. 6O, the first electronic device 500a presents a menu 656 that includes options 658a-658d. In some implementations, the menu 656 includes an option 658a for initiating playback of the content item now in the shared content playback mode (e.g., and stopping playback of the content item currently playing in the shared content playback mode), an option 658b for adding the content item to the beginning of the shared content item queue, an option 658c for adding the content item to the end of the shared content item queue, and an option 658d for ceasing display of the menu 656 without adding the content item to the queue or initiating playback of the content item. In some implementations, initiating playback of a content item in the shared content playback mode causes all electronic devices playing content in the shared content playback mode of the communication session to initiate playback of the content item. As shown in fig. 6P, in some embodiments, the first electronic device 500a detects selection of option 658b (e.g., via contact 603p) to add the content item to the beginning of the shared content item queue. In some implementations, in response to the input shown in fig. 6P, the electronic devices in the communication session add the content item to the shared content playback queue, as shown in fig. 6Q.
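A minimal sketch of the three queue actions that menu 656 offers (options 658a-658c). The function names and the dictionary-based session model are assumptions for illustration only.

```python
# Hedged sketch of the three queue actions offered by menu 656
# (options 658a-658c). All names here are illustrative assumptions.

def play_now(session, item):
    """Option 658a: stop the current item and play the selection on
    every device in the shared content playback mode."""
    session["now_playing"] = item

def play_next(session, item):
    """Option 658b: add the selection to the beginning of the shared queue."""
    session["queue"].insert(0, item)

def add_to_end(session, item):
    """Option 658c: add the selection to the end of the shared queue."""
    session["queue"].append(item)

session = {"now_playing": "Groovy Tune", "queue": ["Rock Song"]}
play_next(session, "Upbeat Jam")
add_to_end(session, "Mellow Mix")
print(session["queue"])  # ['Upbeat Jam', 'Rock Song', 'Mellow Mix']
```

Option 658d (dismissing the menu) simply leaves the session state untouched, which is why no function is sketched for it.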
For example, fig. 6Q illustrates the second electronic device 500b presenting the shared content playback queue after the content item has been added to the queue in response to the input detected at the first electronic device 500a in fig. 6P, in accordance with some embodiments. As shown in fig. 6Q, for example, the second electronic device 500b presents a representation 650f of the content item selected at the first electronic device 500a at the top of the shared content item queue. In some embodiments, in response to receiving an indication of the input received at the first electronic device 500a, the second electronic device 500b presents an indication 660a of the input. The indication 660a may include, for example, an indication 662a of the user of the first electronic device 500a and an indication 662b of the title of the content item added to the queue by the user of the first electronic device 500a. In some implementations, the second electronic device 500b displays the indication 660a because the second electronic device 500b is displaying a user interface of a content application associated with the communication session (e.g., of a content service associated with the communication session). In some embodiments, the third electronic device 500c, which is also participating in the communication session, does not display an indication that the first electronic device 500a added the content item to the queue because the third electronic device 500c is not displaying the user interface of the content application when the third electronic device 500c receives the indication of the input received at the first electronic device 500a. In some implementations, the electronic devices in the communication session display indications of a first subset of actions of the shared content playback mode (e.g., changes to the shared content item playback queue) only when displaying the user interface of the content application.
In some embodiments, when an indication of an input is received, the electronic devices in the communication session display indications of a second subset of actions related to the shared content playback mode (e.g., changes to playback of the item currently being played, such as playing, pausing, and/or skipping a content item, or skipping within a content item), regardless of the user interface that the electronic device is displaying when the indication of the input is received.
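The two notification subsets described in the surrounding paragraphs (queue edits surfaced only while the content application is displayed; playback changes surfaced regardless) might be modeled as follows. The action names and function signature are hypothetical, not from the disclosure.

```python
# Illustrative sketch of the two notification subsets described above.
# The action vocabulary and function name are assumptions.

QUEUE_EDITS = {"add", "remove", "reorder"}    # first subset of actions
PLAYBACK_CHANGES = {"play", "pause", "skip"}  # second subset of actions

def should_show_indication(action, content_app_displayed):
    """Return whether a device should display a visual indication of
    another device's input, given what it is currently displaying."""
    if action in PLAYBACK_CHANGES:
        return True                   # shown on any user interface
    if action in QUEUE_EDITS:
        return content_app_displayed  # shown only in the content app
    return False

print(should_show_indication("add", content_app_displayed=False))    # False
print(should_show_indication("pause", content_app_displayed=False))  # True
```

This matches the figures: device 500c, on its home screen, still sees the pause indication 660c in fig. 6R but never sees the queue-edit indications that device 500b sees in figs. 6Q and 6V.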
For example, as shown in fig. 6Q, the first electronic device 500a detects selection of option 614d (e.g., via contact 603Q) to pause the content item currently playing in the shared content playback mode. In some implementations, in response to an input (e.g., an indication of the input), an electronic device in a communication session pauses playback of a content item in a shared content playback mode of the communication session. In some embodiments, the electronic devices 500b and 500c present a visual indication of the input received by the first electronic device 500a in response to receiving an indication of the input received by the first electronic device 500a, as shown in fig. 6R.
In fig. 6R, in response to receiving an indication of the input detected by the first electronic device 500a in fig. 6Q, the second electronic device 500b presents an indication 660b of the input and the third electronic device 500c presents an indication 660c of the input. In some implementations, the second electronic device 500b presents the indication 660b while displaying a user interface of the content application associated with the communication session. For example, the indication 660b includes an indication 662a of the user of the first electronic device 500a and an indication 662f that the user of the first electronic device 500a paused playback of the content in the shared content playback mode. In some implementations, the third electronic device 500c presents the indication 660c while displaying a home screen user interface that is not a user interface of the content application associated with the communication session. For example, similar to the indication 660b, the indication 660c includes an indication 662a of the user of the first electronic device 500a and an indication 662f that the user of the first electronic device 500a paused playback of the content in the shared content playback mode. In some implementations, the indication 660c also includes an indication 662c of the content application associated with the communication session, because the indication 660c is displayed while the third electronic device 500c is displaying a user interface that is not a user interface of the content application associated with the communication session. In some embodiments, the indication 660c is selectable to display a user interface of the content application associated with the communication session because the indication 660c is displayed while the third electronic device 500c is not displaying a user interface of the content application associated with the communication session.
In some implementations, the second electronic device 500b does not perform an action in response to detecting the selection of the indication 660b because the second electronic device 500b is already displaying a user interface of a content application associated with the communication session.
Fig. 6S-6U illustrate examples of the first electronic device 500a receiving a sequence of inputs for removing a content item from the shared content playback queue, according to some embodiments. For example, in fig. 6S, the first electronic device 500a detects a swipe input (e.g., via contact 603s) on the representation 650i of a content item in the shared content item playback queue. As shown in fig. 6T, in response to the input shown in fig. 6S, the first electronic device 500a displays a selectable option 664 that, when selected, causes the first electronic device 500a to initiate a process for removing the content item from the shared content playback queue. In some implementations, as shown in fig. 6T, the first electronic device 500a detects selection of the option 664 (e.g., via contact 603t) to initiate the process of removing the content item from the content playback queue. In response to the input shown in fig. 6T, in some embodiments, the first electronic device 500a displays the menu 666 shown in fig. 6U. As shown in fig. 6U, in some embodiments, the menu includes text indicating that removing the content item from the queue will remove the content item from the queue for all users in the communication session, an option 668a that, when selected, causes the first electronic device 500a to remove the content item from the content playback queue, and an option 668b for ceasing display of the menu 666 without removing the content item from the content playback queue. For example, in fig. 6U, the first electronic device 500a detects selection of option 668a to remove the content item from the content item playback queue. In response to the input shown in fig. 6U, the first electronic device 500a removes the content item from the queue and the second electronic device 500b presents a visual indication of the input shown in fig. 6U, as shown in fig. 6V.
Fig. 6V illustrates the content item playback queue after removal of the content item in response to the input illustrated in fig. 6U, and the second electronic device 500b presenting an indication 660d of the input received by the first electronic device 500a in fig. 6U, according to some embodiments. As shown in fig. 6V, the first electronic device 500a and the second electronic device 500b no longer display the representations 650i and 650b of the content item removed by the user of the first electronic device 500a in fig. 6U, as the content item has been removed from the content playback queue. Fig. 6V also illustrates the second electronic device 500b displaying an indication 660d of the input received by the first electronic device 500a, according to some embodiments. In some implementations, the indication 660d includes an indication 662a of the user of the first electronic device 500a and text 662d indicating that the user of the first electronic device 500a removed the content item from the queue, including an indication of the title of the removed content item. In some embodiments, the second electronic device 500b displays the indication 660d because, when the second electronic device 500b receives an indication of the input received at the first electronic device 500a in fig. 6U, the second electronic device 500b is displaying a user interface of the content application associated with the communication session.
In some embodiments, a plurality of electronic devices in the communication session receive inputs to update the shared content playback queue before the queue has been updated based on any one of the inputs. For example, one of the electronic devices receives an input to update the queue after another electronic device receives an input to update the queue, but before the queue has been updated based on the input that was received first. In this case, in some embodiments, the queue is updated according to the input that was received first (e.g., by a server in communication with the electronic devices in the communication session), rather than according to the input received after the first-received input (but before the change to the queue according to the first-received input was made).
Fig. 6W illustrates an example in which the first electronic device 500a and the second electronic device 500b receive inputs to reorder the content items in the shared content item playback queue, according to some embodiments. For example, the first electronic device 500a detects a swipe input (e.g., via contact 603v) starting from the user interface element 652h for changing the position of the content item associated with representation 650h within the shared playback queue, and the second electronic device 500b detects a swipe input (e.g., via contact 603w) starting from the user interface element 652i for changing the position of the content item associated with representation 650c within the queue. In some embodiments, the first electronic device 500a detects the input of contact 603v after the second electronic device 500b detects the input of contact 603w but before the queue is reordered in response to the input detected at the second electronic device 500b. In some embodiments, in response to the inputs shown in fig. 6W, the electronic devices 500a and 500b (and any other electronic devices playing content in the shared content playback mode of the communication session) reorder the content items in the queue according to the input detected at the second electronic device 500b, as shown in fig. 6X.
Fig. 6X illustrates an example of the shared content playback queue updated in response to the inputs shown in fig. 6W, according to some embodiments. In some embodiments, the content playback queue is reordered according to the input received by the second electronic device 500b and not according to the input received by the first electronic device 500a, because the first electronic device 500a received its input to reorder the content playback queue after the second electronic device 500b received its input but before the content playback queue was reordered in response to the input received by the second electronic device 500b. For example, as shown in fig. 6X, the representation 650c of "Groovy Tune" is located above the representation 650a of "Rock Song," as requested by the user of the second electronic device 500b in fig. 6W. In some embodiments, if the second electronic device 500b had received its input shown in fig. 6W after the first electronic device 500a had received its input but before the queue had been updated according to the input received by the first electronic device 500a, the queue would have been updated according to the input received by the first electronic device 500a and not according to the input received by the second electronic device 500b. For example, the representation 650h of "Rock Song" would be located above the representation 650g of "Upbeat Jam".
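The first-input-wins behavior illustrated in figs. 6W-6X could be reconciled server-side roughly as follows. The revision-counter scheme and all names are assumptions for illustration, not the patented mechanism.

```python
# A minimal sketch of the first-input-wins behavior: the server applies
# the earliest edit made against the current queue revision and drops
# edits made against a now-stale revision. All names are assumptions.

def apply_edits(queue, edits):
    """edits: list of (timestamp, base_revision, new_order) tuples."""
    revision = 0
    for _, base_revision, new_order in sorted(edits):  # earliest first
        if base_revision == revision:  # edit targets the current queue
            queue = new_order          # revision, so it is accepted;
            revision += 1              # later edits against the stale
                                       # revision are discarded
    return queue

# Device 500b reorders first; device 500a's later edit targets the
# stale revision 0 and is dropped, matching figs. 6W-6X.
edits = [
    (1, 0, ["Groovy Tune", "Rock Song", "Upbeat Jam"]),  # from 500b
    (2, 0, ["Rock Song", "Upbeat Jam", "Groovy Tune"]),  # from 500a
]
print(apply_edits(["Upbeat Jam", "Groovy Tune", "Rock Song"], edits))
# ['Groovy Tune', 'Rock Song', 'Upbeat Jam']
```

A production system would more likely use per-item operations with conflict resolution rather than whole-queue snapshots, but the ordering guarantee sketched here is the one the text describes.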
In some implementations, one or more electronic devices (e.g., with associated user accounts) engaged in the communication session are subject to content playback restrictions, such as parental control restrictions that limit playback of explicit content or content having one or more specified parental control ratings. In some embodiments, if the shared content playback of the communication session has fewer restrictions than the restrictions implemented on the electronic device, the electronic device that is subject to the content playback restrictions is not capable of playing content in the shared content playback mode of the communication session. In some embodiments, the shared content playback of the communication session is subject to the content playback restrictions of the electronic device that initiated the communication session. In some implementations, the shared content playback of the communication session is subject to the content playback restrictions of the electronic device that initiated the shared content playback in the communication session. For example, if a communication session (e.g., the shared content playback of the communication session) is initiated by an electronic device that is not subject to content playback restrictions, an electronic device that is subject to content playback restrictions cannot participate in the shared content playback in the communication session. As another example, if a communication session (e.g., the shared content playback of the communication session) is initiated by an electronic device that is subject to content playback restrictions, only content permitted by those content playback restrictions can be played in the shared content playback mode of the communication session, even by electronic devices that are not subject to content playback restrictions. In some embodiments, an electronic device that is subject to content playback restrictions is not able to join a communication session initiated by an electronic device with looser content playback restrictions (or no content playback restrictions at all).
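One way to model the restriction rules above is with a numeric restriction level inherited from the initiating device. This is a hedged sketch; the integer encoding and function names are purely illustrative assumptions.

```python
# Hedged sketch of the restriction rules above, modeling a restriction
# level as an integer where a higher number allows more content
# (e.g., 0 = no explicit content, 1 = explicit content allowed).
# All names and the encoding are illustrative assumptions.

def session_limit(initiator_limit):
    """The session inherits the limit of the device that initiated the
    shared content playback."""
    return initiator_limit

def can_play_shared(device_limit, session_limit):
    """A device can play in the shared mode only if the session is not
    more permissive than the device's own restrictions allow."""
    return session_limit <= device_limit

# Session started by an unrestricted device (limit 1): a restricted
# device (limit 0) cannot participate in shared playback (fig. 6Y).
print(can_play_shared(device_limit=0, session_limit=session_limit(1)))  # False
# Session started by a restricted device: an unrestricted device may
# participate, but only non-explicit content plays in the session (fig. 6Z).
print(can_play_shared(device_limit=1, session_limit=session_limit(0)))  # True
```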
Fig. 6Y-6Z illustrate examples of enforcing content playback restrictions when playing content in the shared content playback mode of a communication session, according to some embodiments. In these examples, the first electronic device 500a is subject to fewer content playback restrictions than are implemented on the second electronic device 500b. For example, the content playback limit settings of the first electronic device 500a (if any) allow the first electronic device 500a to play explicit music, and the content playback limit settings of the second electronic device 500b prevent the second electronic device 500b from playing explicit music.
For example, in fig. 6Y, the first electronic device 500a and the second electronic device 500b are participating in a communication session that has the same content playback limit settings as the content playback limit settings active on the first electronic device 500a, which are more relaxed than the content playback limit settings active on the second electronic device 500b. In some embodiments, the content playback limit settings of the shared content playback mode of the communication session match the content playback limit settings of the first electronic device 500a because the first electronic device 500a initiated the communication session (e.g., initiated the shared content playback in the communication session).
In some implementations, in fig. 6Y, the first electronic device 500a presents a playlist page of a content application associated with the communication session (e.g., of a content service associated with the communication session). The playlist page may include an option 670a for navigating to a previously displayed user interface of the content application, an option 670c for displaying a menu including a plurality of selectable options for performing actions with respect to the playlist, an image 672a associated with the playlist, an indication 672b of the title of the playlist, an indication 672c of the user who created the playlist, an option 674a for initiating playback of the playlist in the predetermined order of the playlist, an option 674b for initiating playback of the playlist in a random play order different from the predetermined order of the playlist, and a plurality of representations 676a-676c of content items in the playlist that are selectable to initiate playback of the content items corresponding to the selected representations. In some implementations, the first electronic device 500a presents the indication 602c of the communication session including shared content playback and the content playback control element 612a while displaying the playlist page. In some implementations, the content item corresponding to representation 676b includes an indication 682 that the content item is an explicit content item. In some implementations, the content playback limit settings active on the first electronic device 500a allow the first electronic device 500a to play explicit content, and the shared playback mode of the communication session has the same content playback limit settings as the first electronic device 500a, so the representation 676b is selectable to play the corresponding content item in the shared content playback mode of the communication session.
In some embodiments, the second electronic device 500b is not capable of playing content in the shared content playback mode of the communication session because the content playback limit settings of the second electronic device 500b do not allow the second electronic device 500b to play explicit content, while the content playback limit settings of the shared playback mode of the communication session allow playback of explicit content. In some embodiments, the second electronic device 500b presents an indication 684 that the second electronic device 500b is not capable of playing content in the shared content playback mode of the communication session because the content playback limit settings of the second electronic device 500b are more restrictive than the content playback limit settings of the shared playback mode of the communication session. In some embodiments, the indication 684 includes a selectable option 686 that, when selected, causes the electronic device 500b to cease displaying the indication 684 without initiating playback of content in the shared content playback mode of the communication session and without modifying the content playback limit settings of the second electronic device 500b.
In some embodiments, the second electronic device 500b is capable of joining the communication session but is not capable of playing content in the shared content playback mode of the communication session because the content playback limit setting of the second electronic device 500b is more limited than the content playback limit setting of the shared playback mode of the communication session. In some embodiments, because the second electronic device 500b is participating in the communication session, the second electronic device 500b presents an indication 602d of the communication session. In some implementations, the indication 684 is presented in response to initiation of playback of the content in the shared content playback mode of the communication session (e.g., in response to detecting an indication of input received at the first electronic device 500a in fig. 6D). In some embodiments, indication 684 is presented in response to receiving an input at second electronic device 500b requesting to join the shared content playback in the communication session (such as the input received at the second electronic device in fig. 6E).
In some embodiments, the second electronic device 500b is not capable of joining the communication session because the content playback limit setting of the second electronic device 500b is more limited than the content playback limit setting of the shared playback mode of the communication session. In some embodiments, if the second electronic device 500b is not capable of joining the communication session, the second electronic device 500b will not display the indication 602d of the communication session while displaying the indication 684. In some implementations, the indication 684 is presented in response to detecting an input at the second electronic device 500b corresponding to a request to join the communication session.
In some embodiments, in fig. 6Z, the first electronic device 500a and the second electronic device 500b are participating in a communication session that has the same content playback limit settings as the content playback limit settings active on the second electronic device 500b, which are more restrictive than the content playback limit settings active on the first electronic device 500a. In some embodiments, the content playback limit settings of the shared content playback mode of the communication session match the content playback limit settings of the second electronic device 500b because the second electronic device 500b initiated the communication session (e.g., the shared content playback in the communication session).
In some embodiments, in fig. 6Z, the first electronic device 500a displays the playlist page described above with reference to fig. 6Y. However, in fig. 6Z, the representation 676d of the explicit content item is not selectable because the shared content playback mode of the communication session does not allow playback of explicit content, even though the first electronic device 500a is not restricted from playing explicit content (e.g., when not in the communication session or when in the private content playback mode). In some implementations, because the content playback limit settings of the first electronic device 500a allow the first electronic device 500a to play explicit content, if the first electronic device 500a were in the private content playback mode, the representation 676d would be selectable to initiate playback of the content item in the private content playback mode.
In some embodiments, in fig. 6Z, the second electronic device 500b is capable of playing content in the shared content playback mode of the communication session because the content playback limit settings of the communication session are not more relaxed than the content playback limit settings of the second electronic device 500b. In some embodiments, the second electronic device 500b does not display the indication shown in fig. 6Y because the second electronic device 500b is not prevented from joining the communication session (e.g., the shared content playback of the communication session). Because the second electronic device 500b is playing content in the shared content playback mode (e.g., with the first electronic device 500a and another electronic device in the communication session), the first electronic device 500a presents an indication 614i in fig. 6Z that three users are playing content in the shared content playback mode. In some embodiments, in fig. 6Y, because only the first electronic device 500a and another electronic device not shown in fig. 6Y are playing content in the shared content playback mode (the second electronic device 500b being unable to play content in the shared content playback mode), the first electronic device 500a presents an indication 614h that two users are playing content in the shared content playback mode.
In some implementations, when the electronic device is not engaged in the communication session, initiating playback of the content item on the electronic device does not result in initiating playback on other electronic devices engaged in the communication session. For example, in fig. 6AA, the first electronic device 500a is not engaged in a communication session, while the second electronic device 500b is engaged in a communication session. In some embodiments, as shown in fig. 6AA, the first electronic device 500a displays the album page described above with reference to fig. 6A. Because the first electronic device 500a is not in a communication session, the first electronic device 500a does not display an indication of the communication session in fig. 6AA, such as, for example, indication 602a shown in fig. 6A. In some implementations, in fig. 6AA, the second electronic device 500b is currently playing the content item in a shared content playback mode of the communication session in which the second electronic device 500b is participating. For example, because the second electronic device 500b is in a communication session that includes shared content playback, the second electronic device 500b displays an indication 602d of the communication session that includes shared content playback. In some implementations, the first electronic device 500a and the second electronic device 500b are playing different content items. For example, when the first electronic device 500a is not engaged in a communication session, the first electronic device 500a presents an indication 614c of a content item being played at the first electronic device 500a, and the second electronic device 500b displays an indication 628b of a content item of the shared content playback mode in which the second electronic device 500b is playing the communication session.
As shown in fig. 6AA, the first electronic device 500a detects selection of the representation 610a of the content item (e.g., via contact 603aa). In some implementations, in response to the input shown in fig. 6AA, the first electronic device 500a initiates playback of the content item corresponding to the selected representation 610a without causing any other electronic device to initiate playback of the content item (e.g., because the first electronic device 500a is not in a communication session, the first electronic device 500a cannot be in a shared content playback mode of a communication session), as shown in fig. 6BB. In some embodiments, the first electronic device 500a forgoes presenting a menu similar to the menu 620 shown in fig. 6D that is displayed in response to the user input shown in fig. 6A, because the first electronic device 500a is not in a communication session when the selection of representation 610a is received in fig. 6AA. In some embodiments, the first electronic device 500a forgoes presenting a menu similar to the menu 656 shown in fig. 6P that is displayed in response to the user input shown in fig. 6O, because the first electronic device 500a is not in a communication session (and thus is not in a shared playback mode of a communication session) when the selection of representation 610a is received in fig. 6AA.
In some implementations, as shown in fig. 6BB, in response to the input shown in fig. 6AA, the first electronic device 500a initiates playback of the content item corresponding to representation 610 a. As shown in fig. 6BB, when a content item is played in response to the input shown in fig. 6AA, the first electronic device 500a updates the playback control element 612a to include an image 614q (e.g., album art) corresponding to the content item and an indication 614r of the title of the content item. In some implementations, the second electronic device 500b continues to play content in the shared content playback mode of the communication session and does not play the content item selected at the first electronic device 500a in fig. 6AA because the first electronic device 500a is not in the communication session with the second electronic device 500b when the input in fig. 6AA is received.
In some embodiments, when the electronic device is not in a communication session or playing content in a private content playback mode of a communication session, the electronic device plays content from a private content queue that is different from a shared content queue from which the electronic device in a shared playback mode of a communication session plays content. For example, in fig. 6CC, the first electronic device 500a is not in a communication session and displays a private content playback queue, and the second electronic device 500b is in a shared content playback mode of the communication session and displays a shared content playback queue.
In some embodiments, as shown in fig. 6CC, the second electronic device 500b presents a shared content playback queue, which may be similar to the shared content playback queue described above with reference to fig. 6O. In fig. 6CC, in some embodiments, the private content playback queue displayed by the first electronic device 500a may include an indication 688e of the queue, an option 688f for playing the content in random play order, an option 688g for repeating playback of the content items, an option 688h for playing additional content items after all of the content items in the queue have been played, and representations 690a-690f of the content items in the private content playback queue. In some embodiments, the indication 688e of the private content playback queue does not include an indication of the name of the communication session, such as the indication 648e displayed by the second electronic device 500b, because the first electronic device 500a is not in the communication session and/or the first electronic device 500a is not sharing the queue with other electronic devices in the communication session (e.g., if the first electronic device is in the private content playback mode of the communication session when the private content playback queue is displayed). In some embodiments, the representations 690a-690f of content items in the private content playback queue do not include an indication of the user who added the content item to the queue, unlike the representations 650a, 650c, and 650f in the shared content playback queue displayed by the second electronic device 500b.
In some implementations, when a mode of playing a content item after all content items in the private content item queue have been played is active on the first electronic device 500a, the first electronic device 500a selects a content item to play after the content item in the queue based on the content items in the private content item queue and/or based on the content consumption history of the first electronic device 500a but not based on the content consumption histories of other electronic devices. It should be appreciated that in some embodiments, if the first electronic device 500a is displaying a private queue while in the private content playback mode of the communication session, the first electronic device 500a presents a visual indication of the communication session, such as the indication 602a in fig. 6A if no shared content playback occurs in the communication session, or the indication 602c in fig. 6E if shared content playback occurs in the communication session (e.g., between other electronic devices in the communication session).
Fig. 7 is a flow chart illustrating a method of playing a content item at an electronic device in synchronization with playback of the content item at other electronic devices of other users, according to some embodiments. Method 700 is optionally performed at an electronic device, such as device 100, device 300, device 500a, device 500b, and device 500c, as described above with reference to figs. 1A-1B, 2-3, 4A-4B, and 5A-5H. Some operations in method 700 are optionally combined, and/or the order of some operations is optionally changed.
As described below, the method 700 provides a way for an electronic device to play a content item at the electronic device in synchronization with playback of the content item at other electronic devices of other users. The method reduces the cognitive burden on the user when interacting with the device user interface of the present disclosure, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, improving the efficiency of user interaction with the user interface saves power and increases the time between battery charges.
In some embodiments, the method 700 is performed at an electronic device in communication with one or more input devices. For example, the electronic device is a mobile device (e.g., a tablet, a smart phone, a media player, or a wearable device), a computer (e.g., a desktop computer or a laptop computer), a wearable device (e.g., a watch or a headset), or a set top box in communication with one or more input devices (e.g., a remote control), and is optionally in communication with a mouse (e.g., external), a touch pad (optionally integrated or external), a remote control device (e.g., external), another mobile device (e.g., separate from the electronic device), a handheld device (e.g., external), and/or a controller (e.g., external).
In some embodiments, such as shown in fig. 6G, when an electronic device (e.g., 500 b) is in a communication session (e.g., a group phone call, group video conference session, or group media consumption session) with one or more second electronic devices (e.g., 500 a) associated with one or more second users other than the user of the electronic device, and while playback of a content item (e.g., audio content such as music, podcasts, or audio books, or video content such as movies, video clips, or episodes of a content series) is occurring (e.g., the electronic device is playing the content item at the electronic device and/or causes the content item to be played by another device in the communication session, or the electronic device is playing the content item in response to initiation of playback of the content item by another device associated with another user in the communication session), the electronic device (e.g., 500 b) receives (704), via the one or more input devices, an input (e.g., 603g) corresponding to a request to modify playback of the content item, such as shown in fig. 6G. In some implementations, the request to modify playback of the content item is a request to: play or pause a content item, initiate playback of a content item, skip forward or backward within a content item, add a content item to or remove a content item from a queue of content items to be played next, or skip forward or backward in a queue to a content item different from the content item currently being played.
In some implementations, the communication session is a synchronized media and communication session in which an electronic device in the communication session is capable of playing content items in a synchronized manner with other electronic devices in the shared media and communication session. In some embodiments, some or all of the electronic devices in the communication session communicate with some or all of the other electronic devices in the communication session. For example, multiple smartphones in a communication session together present audio and/or video captured by other smartphones in the communication session to facilitate a group video conference between two or more users. In some embodiments, the communication session facilitates communication between users. In some embodiments, the communication session is different from the communication between two electronic devices for rendering audio purposes, such as a bluetooth connection between an electronic device (e.g., smart phone, tablet, laptop, set top box) and a speaker. In some implementations, the communication session includes synchronized playback of the content item at all electronic devices in the communication session. For example, when electronic devices are in a communication session that does not include shared playback of content, one of the electronic devices initiates playback of the content item, which causes the content item to be played at some or all of the electronic devices in the communication session in a synchronized manner.
In some embodiments, such as shown in fig. 6H, when an electronic device (e.g., 500 b) is in a communication session (e.g., a group phone call, group video conference session, or group media consumption session) with one or more second electronic devices (e.g., 500 a) associated with one or more second users other than the user of the electronic device (e.g., 500 b) (702), in response to receiving the input (706), in accordance with a determination that the input is received while the electronic device (e.g., 500 b) is in a shared content playback mode of the communication session, the electronic device (e.g., 500 b) modifies (708) playback of the content item in accordance with the input at the electronic device (e.g., 500 b) and at the one or more second electronic devices (e.g., 500 a), such as shown in fig. 6H. In some embodiments, the electronic device plays content shared in the communication session when the electronic device is in a shared content playback mode of the communication session. In some embodiments, some or all of the electronic devices in a communication session in a shared content playback mode play the same content in a time-synchronized manner with each other. For example, if the input is an input for pausing the content item, the electronic device and the one or more second electronic devices pause the content item being played at their respective devices in response to the input. In some embodiments, when one or more electronic devices in a communication session are in a shared content playback mode, one or more respective electronic devices in the communication session are in a private content playback mode in which content playback at the respective electronic devices is not synchronized with other electronic devices (e.g., any electronic devices) in the communication session.
In some embodiments, such as shown in fig. 6B, when an electronic device (e.g., 500 a) is in a communication session (e.g., a group phone call, group video conference session, or group media consumption session) with one or more second electronic devices (e.g., 500 b) associated with one or more second users other than the user of the electronic device (e.g., 500 a) (702), in response to receiving the input (706), in accordance with a determination that the input is received while the electronic device (e.g., 500 a) is in a private content playback mode of the communication session (different from the shared content playback mode of the communication session), such as shown in fig. 6B, the electronic device (e.g., 500 a) modifies (710) playback of the content item in accordance with the input at the electronic device (e.g., 500 a) without modifying playback of the content item at the one or more second electronic devices (e.g., 500 b), such as shown in fig. 6C. In some implementations, the electronic device can present content items that are not shared in the communication session while in the communication session and in the private content playback mode. In some embodiments, the one or more second electronic devices in the communication session are in the shared content playback mode when the electronic device is in the private content playback mode. For example, while in a communication session, the one or more second electronic devices play a first content item in a synchronized manner, and the electronic device plays a second content item independent of content playback at the one or more second electronic devices. For example, in response to receiving an input to pause content, the electronic device pauses playback of the content at the electronic device without pausing the content being played by other electronic devices (e.g., any other electronic devices) in the communication session.
In some embodiments, one or more electronic devices in a communication session may be in neither a shared content playback mode nor a private content playback mode. For example, one or more electronic devices in a communication session forgo playing content while in the communication session.
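The shared-versus-private dispatch described in the preceding paragraphs can be sketched as follows. This is a simplified illustration, not the patent's method: the names (`PlaybackMode`, `Session.pause`) are hypothetical, and a single `pause` command stands in for the full set of playback-modifying inputs. It shows the core rule: an input on a shared-mode device modifies playback on every shared-mode device in the session, while an input on a private-mode device modifies playback only locally.

```python
from enum import Enum, auto


class PlaybackMode(Enum):
    SHARED = auto()   # playback synchronized with the session
    PRIVATE = auto()  # playback independent of the session


class Device:
    def __init__(self, name: str, mode: PlaybackMode):
        self.name = name
        self.mode = mode
        self.paused = False


class Session:
    def __init__(self, devices: list[Device]):
        self.devices = devices

    def pause(self, source: Device) -> None:
        # Shared-mode input: apply to all shared-mode devices in the session.
        if source.mode is PlaybackMode.SHARED:
            for d in self.devices:
                if d.mode is PlaybackMode.SHARED:
                    d.paused = True
        # Private-mode input: apply only to the device that received it.
        else:
            source.paused = True
```

Under this sketch, pausing on a shared-mode device pauses its shared-mode peers but leaves a private-mode participant playing, and vice versa.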
The above-described manner of providing a shared content playback mode, in which inputs modifying playback of a content item control playback of content on a plurality of devices in the communication session, and a private content playback mode, in which such inputs control playback of content only on the electronic device, provides an efficient way of maintaining content synchronization in the shared content playback mode and of preventing content playback on one or more electronic devices from interfering with content playback on other electronic devices in the private content playback mode, which additionally reduces power usage and extends battery life of the electronic device by enabling users to use the electronic device more quickly and efficiently.
In some implementations, such as shown in fig. 6E, when an electronic device (e.g., 500 b) is in a communication session, the electronic device (e.g., 500 b) receives an indication of a selection of a respective content item at a respective electronic device (e.g., 500 a) of the one or more second electronic devices. In some implementations, selection of a representation of a respective content item is detected via one or more input devices in communication with an electronic device. In some implementations, a respective one of the one or more second electronic devices detects a selection of a representation of the respective content item, and the electronic device detects an indication of the selection (e.g., a selection received from the respective one of the one or more second electronic devices). In some implementations, representations of respective content items are displayed in a user interface of a content (e.g., browsing, streaming, playback, library) application. In some embodiments, if the electronic device in the communication session was playing content in the shared content playback mode of the communication session previously, but the content is currently paused or otherwise not playing, one of the electronic devices in the communication session detects a selection of an option (e.g., a play button or option) for resuming playback of the previously played content item, rather than a selection of the content item (e.g., a representation of the content item).
In some implementations, such as shown in fig. 6G, when an electronic device (e.g., 500 b) is in a communication session, the electronic device (e.g., 500 b) initiates playback of a respective content item at the electronic device (e.g., 500 b) (e.g., and one or more second electronic devices) in response to receiving an indication of selection of the respective content item (e.g., representation of the respective content item). In some implementations, if the electronic device is not already in the shared content playback mode of the communication session when an indication of selection of the respective content item is received, the electronic device initiates a process of entering the shared content playback mode in response to receiving the indication of selection of the respective content item. For example, the electronic device enters a shared content playback mode of the communication session and then initiates playback of the content item. In some implementations, playback of the content item at the electronic device is synchronized with playback of the content item at the one or more second electronic devices. For example, playback of the content at the electronic device and the one or more second electronic devices begins simultaneously. In some implementations, when the electronic device is in a communication session and the communication session does not include shared playback of the content, the electronic device receives a selection of a respective content item and, in response, presents an option to initiate shared playback (e.g., shared playback of the respective content item) in the communication session.
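The synchronized-start behavior described above (joining playback "not necessarily at the beginning of the content item, but at a current playback position") can be sketched with a minimal state object. The class and field names are illustrative assumptions, and real systems would additionally compensate for clock skew and network latency, which this sketch omits.

```python
class SharedPlaybackState:
    """Minimal sketch: a device joining shared playback mid-item
    derives its start position from the session's shared start time."""

    def __init__(self, item: str, started_at: float):
        self.item = item
        self.started_at = started_at  # shared wall-clock start time (seconds)

    def join_position(self, now: float) -> float:
        # Position (in seconds) at which a joining device should begin,
        # so its playback lines up with devices already playing.
        return max(0.0, now - self.started_at)
```

For example, a device joining 30 seconds after the session started the item would begin playback at the 30-second mark rather than at the beginning.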
The above-described manner of initiating playback of a content item in response to an indication of an input for playing the content item received at one of the second electronic devices provides an efficient manner of presenting content in a synchronized manner with other electronic devices in a communication session, which additionally reduces power usage and extends battery life of the electronic devices by enabling users to more quickly and efficiently use the electronic devices.
In some implementations, such as shown in fig. 6A, when an electronic device (e.g., 500 a) is in a communication session (e.g., and when playback of a content item does not occur in the communication session), the electronic device (e.g., 500 a) receives an indication of a request (e.g., 603 a) to initiate playback of a corresponding content item in the communication session via one or more input devices (e.g., from one of the one or more second electronic devices, e.g., via one or more servers in communication with the electronic device, with one of the one or more second electronic devices, and/or with each other). In some implementations, initiating a request for playback of a respective content item in a communication session is a request to play the respective content item in a shared playback mode of the communication session. In some embodiments, the input is detected by one of the one or more second electronic devices, and an indication of the input is transmitted to the electronic device (e.g., from the one of the one or more second electronic devices, optionally via the one or more servers). In some embodiments, the electronic device is playing the corresponding content in a private content playback mode.
In some embodiments, such as shown in fig. 6B, when an electronic device (e.g., 500 a) is in a communication session, in response to receiving an indication to initiate playback of a respective content item in the communication session, in accordance with a determination that a user account associated with the electronic device is authorized to access the respective content item through a content (e.g., streaming, playback, library, browsing, etc.) service associated with the communication session (e.g., a content (e.g., streaming, playback, library, browsing, etc.) application associated with the communication session), the electronic device (e.g., 500 a) presents a selectable option (e.g., 622B) that, when selected, causes the electronic device (e.g., 500 a) to initiate playback of the respective content item in a shared content playback mode of the communication session. In some implementations, in response to detecting a selection of the selectable option, the electronic device initiates playback of the respective content item at a playback position of the respective content item in the communication session at which the respective content item is being played (e.g., not necessarily at the beginning of the content item, but at a current playback position of the content item in the communication session). Thus, in some implementations, playback of respective content items is synchronized between electronic devices in a shared playback mode of a communication session. In some implementations, an electronic device in a communication session initiates playback of a next content item in a shared content item queue associated with the communication session once the respective content item completes playing. 
In some embodiments, the electronic device is authorized to access content via the content service because a user account associated with the electronic device has an active subscription to the content service and/or the subscription includes access to the respective content item. In some embodiments, all electronic devices playing content in a communication session have an active subscription to the content service. In some implementations, the selectable option is presented with an indication of the user initiating playback of the respective content item in the shared playback mode of the communication session and an indication of the respective content item (e.g., title, artist, etc.).
In some implementations, when the electronic device (e.g., 500 a) is in the communication session, in response to receiving an indication of a request to initiate playback of a respective content item in the communication session, in accordance with a determination that a user account associated with the electronic device is not authorized to access the respective content item through a content (e.g., streaming, playback, library, browsing, etc.) service associated with the communication session (e.g., a content (e.g., streaming, playback, library, browsing, etc.) application associated with the communication session), the electronic device (e.g., 500 a) forgoes presenting the selectable option, such as forgoing presentation of option 622B shown in fig. 6B. In some embodiments, in accordance with a determination that a user account associated with the electronic device is not authorized to access content through the content service associated with the communication session, the electronic device presents a selectable option that, when selected, causes the electronic device to initiate a process of subscribing to the content (e.g., streaming, playback, library, browsing, etc.) service associated with the communication session. For example, the electronic device is not authorized to access the respective content item because the user account of the electronic device does not have a valid subscription to the content service associated with the communication session, or the user account of the electronic device has a subscription that does not include access to the respective content item.
In some embodiments, one or more of the other electronic devices in the communication session that are not authorized to access the content through the content service associated with the communication session cannot play the content in the shared playback mode of the communication session unless and until the electronic device completes a process of obtaining authorization for the content through the content service (e.g., through a subscription content service). In some implementations, the electronic device presents a visual indication of a user initiating playback of the respective content item in a shared playback mode of the communication session and an indication of the respective content item (e.g., title, artist, etc.).
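The authorization gate described in the last few paragraphs reduces to a simple predicate. This is an illustrative sketch only: the function name, parameters, and the two-part check (active subscription to the session's service, and the item being included in that subscription's tier) are assumptions distilled from the text, not an API of any real content service.

```python
def should_present_join_option(
    active_subscriptions: set[str],
    session_service: str,
    item_included_in_tier: bool,
) -> bool:
    # The selectable option to join shared playback is shown only when the
    # user account has an active subscription to the session's content
    # service AND that subscription includes access to the respective item;
    # otherwise the device forgoes presenting the option.
    return session_service in active_subscriptions and item_included_in_tier
```

When the predicate is false, the device would instead surface a path to subscribe (as described above) rather than the join option.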
The above-described manner of presenting an option for initiating playback of a respective content item in accordance with a determination that the user is authorized to access content through a content service associated with the communication session, and forgoing presentation of the selectable option if the electronic device is not authorized to access content through the content service associated with the communication session, provides an efficient manner of joining the shared content playback mode of the communication session and of indicating to the user when joining the shared content playback mode of the communication session is not possible, which additionally reduces power usage and extends battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some implementations, such as shown in fig. 6A, when an electronic device (e.g., 500 a) is in a communication session (e.g., and before playback of a content item occurs), the electronic device (e.g., 500 a) receives input corresponding to selection of a respective content item (e.g., corresponding to representation 610 b) via one or more input devices. In some implementations, the electronic device detects a selection of a representation of a respective content item. In some implementations, the electronic device detects selection of an option (e.g., a play button) for initiating playback of a corresponding content item. For example, the electronic device was previously playing the respective content item, paused the respective content item, and then received a selection of a play option to resume playback of the respective content item. In some implementations, the input is a voice input requesting playback of the respective content item.
In some implementations, such as shown in fig. 6B, when an electronic device (e.g., 500 a) is in a communication session (e.g., and before playback of a content item occurs), in response to receiving an input corresponding to a selection of a respective content item (e.g., corresponding to representation 610B), the electronic device (e.g., 500 a) displays a first selectable option (e.g., 622B) via a display generation component in communication with the electronic device that, when selected, causes the electronic device to initiate playback of the respective content item in a shared content playback mode on the electronic device (e.g., 500 a) and one or more second electronic devices (e.g., 500B), such as shown in fig. 6B. In some embodiments, the electronic device enters the shared content playback mode of the communication session in response to detecting the selection of the first selectable option if the electronic device is not already in the shared content playback mode (e.g., the electronic device is in a private content playback mode or is not in a playback mode associated with the communication session (e.g., the electronic device is not playing content when receiving the input)). In some implementations, one or more second electronic devices in a shared content playback mode of a communication session initiate playback of a respective content item in response to receiving an indication of an input corresponding to a selection of the respective content item. In some implementations, one or more electronic devices in a private content playback mode of a communication session in the communication session forgo initiating playback of a respective content item. 
In some implementations, one or more electronic devices in a private content playback mode in a communication session present a visual indication of initiation of playback of a respective content item and/or selectable options for entering a shared content playback mode of the communication session and initiating playback of the respective content item.
In some implementations, such as described in fig. 6B, when the electronic device (e.g., 500 a) is in a communication session (e.g., and before playback of the content item occurs), in response to receiving an input corresponding to selection of the respective content item (e.g., corresponding to representation 610B in fig. 6A), the electronic device (e.g., 500 a) displays a second selectable option (e.g., 622 a) via a display generation component in communication with the electronic device (e.g., 500 a) that, when selected, causes the electronic device (e.g., 500 a) to initiate playback of the respective content item in the private content playback mode on the electronic device (e.g., 500 a) without initiating playback of the respective content item on one or more second electronic devices (e.g., 500B), such as shown in fig. 6B. In some embodiments, if the electronic device is not already in the private content playback mode (e.g., the electronic device is in the shared content playback mode or is not in a playback mode associated with the communication session (e.g., the electronic device is not playing content when receiving input)), the electronic device enters the private content playback mode of the communication session in response to detecting the selection of the second selectable option. In some implementations, one or more second electronic devices in the communication session do not receive an indication of an input corresponding to a selection of a respective content item and forego initiating playback of the respective content item because the electronic device is playing the respective content item in the private content playback mode. In some implementations, one or more second electronic devices in a shared content playback mode of a communication session play a second content item, while the electronic devices play a corresponding content item in a private content playback mode.
The above-described manner of presenting options for playing respective content items in a shared content playback mode or in a private content playback mode provides an efficient way of confirming a playback mode to be used when a selection of a content item is received, which additionally reduces power usage and extends battery life of the electronic device by enabling a user to more quickly and efficiently use the electronic device, such as by reducing user errors that initiate playback in a playback mode that is not desired by the user, which would then be reversed.
In some embodiments, the electronic device has settings that cause the electronic device to remain in the private content playback mode or in the shared content playback mode while in the communication session. For example, when one of the settings is activated, the electronic device, in response to detecting a selection of a content item, forgoes presenting an option for playing the corresponding content item in the private content playback mode or the shared content playback mode, and instead initiates playback in the content playback mode of the electronic device without additional input confirming the content playback mode. In some implementations, the electronic device automatically initiates playback of the selected content item selected at the electronic device and at the one or more second electronic devices in the communication session when the setting for maintaining the shared content playback mode is in an active state (e.g., without confirming that the input of the content item should be played for the one or more second electronic devices in the communication session). In some implementations, the electronic device automatically initiates playback of the content item selected at the electronic device when the setting for maintaining the private content playback mode is in an active state without initiating playback of the content item at the one or more second electronic devices in the communication session (e.g., without confirming that the input of the content item should not be played for the one or more second electronic devices in the communication session).
In some implementations, the input corresponding to the request to modify playback of the content item is the following: play/pause a content item, erase a content item, skip to a next content item in a content item queue, skip back to a beginning of a content item or a previous content item in a content queue, or change a playback position within a content item in response to detecting selection of a portion of lyrics of the content item. In some implementations, if the input is received while the electronic device is in a shared content playback mode of the communication session, the electronic device and the one or more second electronic devices modify playback of the content item according to the input. For example, while in the shared content item playback mode of the communication session, the electronic device and the one or more second electronic devices stop rendering the content item and initiate playback of the next content item in the content item queue in response to detecting an input to skip forward to the next content item in the content item queue.
In some embodiments, such as shown in fig. 6O, when the electronic device (e.g., 500 b) is in a communication session and when the electronic device (e.g., 500 b) is in a shared content playback mode of the communication session, the electronic device (e.g., 500 b) plays respective content items from a shared content item queue of the communication session that includes a plurality of content items (e.g., corresponding to representations 650a-650c), wherein the electronic device (e.g., 500 b) and the one or more second electronic devices (e.g., 500 a) are capable of editing the shared content item queue. In some implementations, after playback of the respective content item is completed, the electronic device and the one or more second electronic devices initiate playback of a next content item in the shared content item queue. In some implementations, playback of the content items from the content item queue continues until all of the content items in the content item queue have been played. In some implementations, after playing all of the content items in the queue (e.g., when there are no more content items in the queue), the electronic device continues to play content selected based on the content items in the queue and the content consumption history of the users in the communication session, and/or stops playback of the content items. In some implementations, editing the shared content item queue includes adding content items to or removing content items from the queue and/or reordering content items in the queue.
The above-described manner of playing content items from a shared content item queue in a shared content playback mode of a communication session provides an efficient manner of automatically playing multiple content items one after another in a shared content playback mode of a communication session, which additionally reduces power usage and extends battery life of the electronic device by enabling a user to more quickly and efficiently use the electronic device.
In some implementations, when the electronic device (e.g., 500 b) is in a communication session (and when the electronic device is in a shared content playback mode of the communication session), the electronic device (e.g., 500 b) receives input via one or more input devices corresponding to a request to display a shared content item queue via a display generation component in communication with the electronic device, such as detecting selection of option 628m in fig. 6N. In some implementations, the electronic device displays a user interface of a content (e.g., streaming, playback, browsing, etc.) application associated with the communication session, the user interface including selectable options that, when selected, cause the electronic device to display a content item queue of the content application. In some implementations, when the electronic device is in a communication session (e.g., in a shared playback mode), the electronic device displays a shared content item queue of the communication session in response to detecting the selection of the selectable option. In some implementations, the electronic device displays a private content item queue that is different from the shared content item queue in response to detecting selection of the selectable option when the electronic device is not engaged in the communication session and/or when the electronic device is in a private content playback mode in the communication session. In some implementations, the content item queue includes an ordered list of content items that the electronic device is configured to play in the order of the queue. For example, if the queue includes a first content item followed by a second content item, the electronic device will play the first content item followed by the second content item.
In some embodiments, such as shown in fig. 6O, when an electronic device (e.g., 500 b) is in a communication session (and when the electronic device is in a shared content playback mode of the communication session), in response to receiving an input corresponding to a request to display a shared content item queue, the electronic device (e.g., 500 b) displays a first representation (e.g., 650 a) of a first content item included in the shared content item queue via a display generation component, the first representation (e.g., 650 a) of the first content item being displayed in association with a representation (e.g., 652 b) of a user (e.g., associated with one of the electronic devices participating in the communication session) that added the first content item to the shared content item queue. In some implementations, the first representation of the first content item included in the shared content item queue includes an indication of a title of the first content item, an artist of the first content item, and/or an image associated with the first content item (e.g., album art of the first content item). In some implementations, the electronic device displays an image (e.g., an avatar) corresponding to a user adding the first content item to the shared content item queue within the first representation of the first content item. For example, an image associated with a user adding the first content item to the shared content item queue is displayed (e.g., at least partially) overlaid on the image associated with the first content item. In some implementations, the image associated with the user adding the first content item to the shared content item queue is an image associated with or included in a contact card of the user accessible by or stored on the electronic device.
In some implementations, as shown in fig. 6O, when the electronic device (e.g., 500 b) is in a communication session (and when the electronic device is in a shared content playback mode of the communication session), in response to receiving an input corresponding to a request to display the shared content item queue, the electronic device (e.g., 500 b) displays, via the display generation component, a second representation (e.g., 650 b) of a second content item included in the shared content item queue, the second representation (e.g., 650 b) of the second content item being displayed in association with a representation of a user adding the second content item to the shared content item queue. In some implementations, the second representation of the second content item included in the shared content item queue includes an indication of a title of the second content item, an artist of the second content item, and/or an image associated with the second content item (e.g., album art of the second content item). In some implementations, the electronic device displays an image (e.g., an avatar) corresponding to a user adding the second content item to the shared content item queue within the second representation of the second content item. For example, an image associated with a user adding the second content item to the shared content item queue is displayed (e.g., at least partially) overlaid on the image associated with the second content item. In some implementations, the image associated with the user adding the second content item to the shared content item queue is an image associated with or included in a contact card of the user accessible by or stored on the electronic device. In some implementations, the first representation and the second representation are displayed in an order that reflects an order of the first content item and the second content item in the shared content item queue. 
For example, if the first content item is located before the second content item in the shared content item queue, the electronic device displays the first representation over or to the right of the second representation.
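The display of queue entries described above can be sketched as a simple rendering step. This Python sketch is illustrative only; the dictionary keys and the avatar lookup are assumptions, not part of the disclosure. Each row pairs a content item's metadata with the avatar of the user who added it, preserving the queue's playback order.

```python
def render_shared_queue(queue, avatars):
    """Build display rows for a shared queue: each row pairs the item's
    title and artwork with the avatar of the user who added it, in
    playback order (the first row is the item that plays first)."""
    rows = []
    for item in queue:
        rows.append({
            "title": item["title"],
            "artwork": item.get("artwork"),
            # avatar displayed (e.g., at least partially) overlaid on the artwork
            "added_by_avatar": avatars.get(item["added_by"]),
        })
    return rows
```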
The above-described manner of displaying representations of the users who added the first content item and the second content item to the shared content item queue provides an efficient way of informing the user, while displaying the queue, who added each content item to the queue, and of indicating to the user whether the queue is a private queue or a shared queue, which additionally reduces power usage and extends battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some implementations, such as shown in fig. 6T, an electronic device (e.g., 500 a) receives, via one or more input devices, an input (e.g., via contact 603T) corresponding to a request to remove a respective content item included in the shared content item queue (e.g., corresponding to representation 650 i) from the shared content item queue. In some implementations, the input corresponding to the request to remove the respective content item included in the shared content item queue is a selection of a selectable option for removing the respective content item from the shared content item queue. In some implementations, selectable options are displayed within or near representations of respective content items in the shared content item queue. In some implementations, the selectable options are presented in response to a user performing a swipe gesture on a representation of a respective content item in the shared content item queue. For example, the electronic device detects a right-to-left swipe on the representation of the respective content item and, in response, displays an option to delete the respective content item from the shared content item queue. In some implementations, the input corresponding to the request to remove the respective content item from the shared content item queue is a swipe input having a duration, distance, and/or speed exceeding a predetermined threshold. In some implementations, if the swipe input has a duration, distance, and/or speed that is less than a predetermined threshold, the electronic device displays selectable options for removing the respective content item from the shared content item queue.
In some implementations, such as shown in fig. 6U, in response to receiving an input corresponding to a request to remove a respective content item from a shared content item queue, an electronic device (e.g., 500 a) displays a visual indication (e.g., 666) via a display generation component in communication with the electronic device that indicates that the respective content item is to be removed from the shared content item queue of the electronic device (e.g., 500 a) and one or more second electronic devices (e.g., 500 b). In some embodiments, the visual indication is a dialog box comprising: text indicating that the respective content item will be removed from the content item queues of all users in the communication session; an option selectable to confirm that the respective content item should be removed from the shared content item queue; and an option to dismiss the dialog box without removing the respective content item from the shared content item queue. In some implementations, in response to detecting a selection of the option to confirm that the respective content item should be removed from the shared content item queue, the electronic device removes the respective content item from the shared content item queue. In some implementations, after removing the respective content item from the shared content item queue, the electronic device and the one or more second electronic devices in the communication session will not play the respective content item during the communication session (e.g., unless it is added to the queue again). In some implementations, the visual indication is displayed in response to detecting a selection of a selectable option for deleting the respective content item from the shared content item queue. In some embodiments, the visual indication is displayed in response to detecting a swipe input with a speed, duration, or distance exceeding a predetermined threshold (e.g., without first detecting selection of a selectable option).
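The two-step removal flow described above can be sketched as follows. This Python sketch is illustrative; the class and method names are assumptions, not part of the disclosure. The request alone does not mutate the queue; it only raises the confirmation, because confirming removes the item from the single shared queue used by every participant.

```python
class SharedQueue:
    """Removal from a shared queue is two-step: the request only raises
    a confirmation dialog, since confirming removes the item for every
    participant in the communication session."""
    def __init__(self, items):
        self.items = list(items)

    def request_remove(self, item):
        # Surface a dialog warning that the item will be removed from
        # all users' queues; nothing is removed yet.
        return {"needs_confirmation": True, "item": item}

    def confirm_remove(self, item):
        # On confirmation, remove from the single shared queue; no
        # participant will play it unless it is added again.
        if item in self.items:
            self.items.remove(item)
        return self.items
```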
The above-described presentation of an indication that removing a respective content item from the shared content item queue removes the content item for all users in the session provides an efficient way of allowing the user to confirm the removal before the respective content item is removed for all users in the communication session, which additionally reduces power usage and extends battery life of the electronic device by enabling the user to more quickly and efficiently use the electronic device and by preventing the user from performing unintended operations that would subsequently need to be reversed.
In some implementations, such as shown in fig. 6W, an electronic device (e.g., 500 a) receives, via one or more input devices, an input corresponding to a request to change a playback order of a plurality of respective content items included in a shared content item queue (e.g., via contact 603 v) when the playback order of the plurality of respective content items included in the shared content item queue is a first playback order. In some implementations, the request to change the playback order of the plurality of respective content items included in the shared content item queue is a request to move a respective content item up or down in the shared playback queue. For example, if the first playback order includes playing the first content item, then the second content item, then the third content item, then the request is a request to move the first content item to a position after the second content item and/or the third content item. As another example, the request is a request to move the third content item to a position before the first content item and/or the second content item in the content item queue. As another example, the request is a request to move the second content item to play before the first content item or after the third content item.
In some implementations, in response to receiving an input corresponding to a request to change a playback order of a plurality of respective content items included in the shared content item queue, such as shown in fig. 6W, in accordance with a determination that a respective one of the one or more second electronic devices (e.g., 500 b) detected an input corresponding to a second request to change the playback order of the plurality of respective content items included in the shared content item queue from the first playback order before the electronic device (e.g., 500 a) received the input corresponding to the request to change the playback order, the electronic device (e.g., 500 a) updates the playback order of the respective content items included in the shared content item queue to a second playback order in accordance with the input detected at the respective one of the one or more second electronic devices (e.g., 500 b), such as shown in fig. 6X (e.g., without reordering the content items according to the first request). In some implementations, the input corresponding to the second request is an input to move the respective content item in the content item queue up or down in a playback order of the content item queue, such as one of the examples described above with respect to the first request. In some embodiments, a server receives the input corresponding to the first request and the input corresponding to the second request, and if the second request is received before the first request (and the first request is received before the content item queue is updated to the second playback order according to the second request), the server reorders the content item queue according to the second request.
In some implementations, if the first request is received after the content item queue is updated to the second playback order according to the second request, the electronic device updates the content item queue according to the first request.
In some embodiments, in response to receiving the input corresponding to the request to change the playback order of the plurality of respective content items included in the shared content item queue, such as shown in fig. 6W, in accordance with a determination that no input corresponding to a second request to change the playback order of the plurality of respective content items included in the shared content item queue from the first playback order was detected before the electronic device (e.g., 500 b) received the input corresponding to the request (e.g., none of the one or more second electronic devices detects an input corresponding to a request to reorder the content item queue, or the input corresponding to the second request is received after the input corresponding to the first request), the electronic device (e.g., 500 b) updates the order of the respective content items included in the shared content item queue to a second playback order in accordance with the input detected at the electronic device (e.g., 500 b), such as shown in fig. 6X (e.g., without updating the content item queue based on the second request). In some embodiments, the server receives the input corresponding to the first request and the input corresponding to the second request, and if the first request is received before the second request (and the second request is received before the content item queue is updated to the third playback order according to the first request), the server reorders the content item queue according to the first request. In some embodiments, if the second request is received after the content item queue is updated to the third playback order according to the first request, the electronic device updates the content item queue according to the second request.
In some implementations, if two or more requests to reorder the content item queue are received before the content item queue is updated in response to any of the requests, the electronic device updates the content item queue according to the first received request.
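The first-request-wins arbitration described above can be sketched server-side as follows. This Python sketch is illustrative; the class and method names are assumptions, not part of the disclosure. Of the reorder requests that arrive before the queue is actually updated, only the first received is applied; later ones are dropped, and a request arriving after the update is honored normally.

```python
class QueueServer:
    """Server-side arbitration for reorder requests: among requests
    received before the queue is updated, only the first one received
    is applied; the rest are dropped."""
    def __init__(self, queue):
        self.queue = list(queue)
        self._pending = None  # first reorder request awaiting application

    def receive_reorder(self, new_order):
        if self._pending is None:
            self._pending = list(new_order)
        # any further request arriving before the update is ignored

    def apply_update(self):
        if self._pending is not None:
            self.queue, self._pending = self._pending, None
        return self.queue
```

This makes reordering deterministic for all participants even when two devices submit conflicting reorders nearly simultaneously.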
The above-described manner of reordering the shared content item queue in accordance with the first-received request to reorder the content item queue provides an efficient manner of reordering the content item queue in a predictable manner, which additionally reduces power usage and extends battery life of the electronic device by enabling a user to more quickly and efficiently use the electronic device, by avoiding reordering the queue in a manner that is undesirable to the user and would then be reversed.
In some implementations, after playing a first content item in a shared content item queue (e.g., a content item queue displayed by electronic device 500b in fig. 6O), the electronic device (e.g., 500 b) initiates playback of a second content item (e.g., 650 a) in accordance with determining that the content item queue includes a second content item (e.g., 650 a) after the first content item in the content item queue. In some implementations, after playback of the respective content item is completed, the electronic device (e.g., and one or more second electronic devices) plays the next content item in the content item queue.
In some implementations, after playing a first content item (e.g., 650 c) in a shared content item queue (e.g., the content item queue displayed by electronic device 500b in fig. 6O), in accordance with a determination that the content item queue does not include a second content item (e.g., or any content item) after the first content item in the content item queue, the electronic device (e.g., 500 b) plays, during the communication session, content items selected based on the plurality of content items (e.g., 650a-650 c) previously included in the shared content item queue during the communication session and the content consumption history of a user account associated with the electronic device (e.g., 500 b) and of one or more user accounts associated with the one or more second electronic devices (e.g., 500 a). In some implementations, after playing the respective content item, if no content item remains in the content item queue, the electronic device (e.g., and the one or more second electronic devices) plays content items selected based on the content items that were included in the content item playback queue and the content consumption histories of the user accounts of the electronic devices in the communication session. In some implementations, one or more content items played after all of the content items in the play queue have the same or similar artists and/or genres as the content items in the shared content item queue and/or the content consumption histories of the users.
In some embodiments, when the electronic device is in a private content playback mode or when the electronic device is not in a communication session, after playing all of the content items in the private content item queue, the electronic device plays the content items selected based on the content items in the private content item queue and/or the content consumption history of the user account associated with the electronic device, but does not play the content items selected based on the content consumption history of the shared content item queue and/or the user account associated with the one or more second electronic devices. In some embodiments, when the electronic device is playing the content items from the private queue while the electronic device is not in the shared content playback mode of the communication session (e.g., the electronic device is in the private content playback mode of the communication session or is not in the communication session), the electronic device stops playing the content once the electronic device plays all of the content items in the private queue.
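The queue-exhaustion behavior described above can be sketched as follows. This Python sketch is illustrative; the function name, the data shapes, and the artist-frequency heuristic are assumptions for illustration, not part of the disclosure. In shared mode, the continuation is seeded by the played queue plus every participant's history; in private playback, playback simply stops.

```python
from collections import Counter

def after_queue_ends(shared_mode, played_items, participant_histories):
    """What plays once the queue is exhausted: in shared mode, a
    recommendation seeded by the played queue plus every participant's
    listening history; in private playback, nothing (return None)."""
    if not shared_mode:
        return None  # private playback stops at the end of the queue
    # Toy recommender: pick the most frequent artist across the queue
    # and all participants' histories (purely illustrative heuristic).
    seeds = [item["artist"] for item in played_items]
    for history in participant_histories:
        seeds.extend(item["artist"] for item in history)
    top_artist, _ = Counter(seeds).most_common(1)[0]
    return {"artist": top_artist, "source": "recommendation"}
```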
The above-described manner of playing content items based on content consumption histories of the user accounts of the electronic device and the one or more second electronic devices and the content items in the shared content item play queue after playing the content items in the shared content item play queue provides an efficient manner of continuing to play content after all of the content items in the play queue, which additionally reduces power usage and extends battery life of the electronic device by enabling a user to more quickly and efficiently use the electronic device.
In some implementations, such as shown in fig. 6L, an electronic device (e.g., 500 a) detects an input (e.g., via contact 603L) corresponding to a request to initiate playback of a respective content item via one or more input devices. In some implementations, the input includes a selection of a representation of a respective content item displayed in a user interface of a content (e.g., browsing, streaming, playback, library) application associated with the communication session. In some implementations, the selection of the representation is a secondary selection (e.g., right click, long tap, hard tap, etc.), and in response to detecting the secondary selection of the representation, the electronic device displays an options menu to perform a plurality of respective actions with respect to the respective content item.
In some implementations, in response to detecting an input (e.g., via contact 603L) corresponding to a request to initiate playback of a respective content item, such as shown in fig. 6L, in accordance with a determination that an electronic device (e.g., 500 a) is in a communication session and the electronic device (e.g., 500 a) is in a shared content playback mode of the communication session and playing a second respective content item (e.g., the second respective content item is also playing at one or more second electronic devices in the communication session) when receiving the input corresponding to the request to initiate playback of the respective content item, such as shown in fig. 6L, the electronic device (e.g., 500 a) displays, via a display generation component in communication with the electronic device, a selectable option (e.g., 646 e) that, when selected, causes the electronic device (e.g., 500 a) to stop playing the second respective content item at the electronic device (e.g., 500 a) without causing the one or more second electronic devices to stop playing the second respective content item, and to initiate playback of a predetermined respective portion of the respective content item at the electronic device (e.g., 500 a) without initiating playback of the respective portion of the respective content item at the one or more second electronic devices (e.g., 500 b). In some implementations, the predetermined respective portion of the respective content item is a preview of the respective content item. In some implementations, the predetermined respective portion of the respective content item is a portion other than the beginning of the respective content item.
In some implementations, the duration of the predetermined respective portion of the respective content item is less than the full duration of the respective content item, and begins after the beginning of the respective content item and ends before the end of the respective content item. In some implementations, the predetermined respective portions of the respective content items have a predetermined duration, and the predetermined respective portions of the second respective content items have the same predetermined duration. For example, a plurality of content items available through a content application associated with a communication session have an associated 30 second preview. In some implementations, the electronic device and the one or more second electronic devices initiate playback of the respective content items in response to selection of an option to play the respective content items for a user in the communication session. In some implementations, in response to selection of an option to play the respective content item at the electronic device without playing the respective content item at the one or more second electronic devices, the electronic device initiates playback of the respective content item without the one or more second electronic devices initiating playback of the respective content item. In some implementations, the electronic device resumes playback of the second corresponding content item once the electronic device has completed playing the predetermined corresponding portion of the corresponding content item. 
In some implementations, the electronic device resumes playback of the second respective content item at a playback position that is later than the playback position at which the electronic device stopped playback of the second respective content item, because the second respective content item continues to play on the one or more second electronic devices while the electronic device plays the predetermined respective portion of the respective content item. In some embodiments, each time the electronic device stops playing content in the shared content playback mode to play other content in the private content playback mode, when resuming playback in the shared content playback mode the electronic device resumes playback from the same playback position as the playback position of the content at the one or more second electronic devices, which is optionally different from (e.g., later than) the playback position at which the electronic device stopped playing the content in the shared content playback mode.
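The rejoin behavior described above can be sketched arithmetically. This Python sketch is illustrative; the function and parameter names are assumptions, not part of the disclosure. While the device plays a private preview, the shared item keeps advancing on the other devices, so the device rejoins at the current shared position rather than the position at which it paused.

```python
def preview_then_rejoin(shared_position, preview_duration):
    """While this device plays a private preview, shared playback keeps
    advancing on the other devices; on resume, the device rejoins at the
    current shared position, not where it paused."""
    paused_at = shared_position
    live_position = shared_position + preview_duration  # others kept playing
    return {"paused_at": paused_at, "resume_at": live_position}
```

For example, a device that pauses shared playback at 40 seconds to play a 30-second preview rejoins at 70 seconds.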
In some implementations, such as shown in fig. 6AA, in response to detecting an input corresponding to a request to initiate playback of a respective content item (e.g., via contact 603 AA), in accordance with a determination that the electronic device (e.g., 500 a) is not in a communication session when receiving the input corresponding to the request to initiate playback of the respective content item (or in accordance with a determination that the electronic device is in a communication session and in a private content playback mode when receiving the input corresponding to the request to initiate playback of the respective content item), the electronic device (e.g., 500 a) forgoes displaying the selectable option (e.g., selectable option 646e shown in fig. 6M), such as shown in fig. 6 BB. In some implementations, when the electronic device is not in the shared content playback mode, the electronic device does not display an option to present a preview of the content item in response to a (e.g., secondary) selection of a representation of the content item. In some embodiments, the electronic device presents the option for privately presenting a preview of the content item when the electronic device is in the shared content playback mode, to allow the user to check, before playing a content item for the group of users in the communication session, whether the content item is the one they intend to play for the group, and does not present the option when the electronic device is not in the shared content playback mode.
The above-described manner of presenting respective predefined portions of a content item for presentation at an electronic device without presenting respective predefined portions of the content item to one or more second electronic devices provides an efficient manner of allowing a user to preview the content item prior to playing the content item for the user in a communication session, which additionally reduces power usage and extends battery life of the electronic device by enabling the user to more quickly and efficiently use the electronic device.
In some embodiments, in order for an electronic device in a communication session to participate in synchronized playback of a content item, a user account of the electronic device in the communication session needs to have a subscription that authorizes the user account to access the content item through a content service associated with the communication session. In some embodiments, the subscriptions of the users in the communication session need to be associated with the same region. For example, a first user having a content subscription based in a first country and a second user having a content subscription based in a second country cannot participate in synchronized content playback during a communication session because their subscriptions are based in different countries.
In some implementations, such as shown in fig. 6Y, when the electronic device (e.g., 500 a) is in a communication session and when the electronic device (e.g., 500 a) is in a shared content playback mode of the communication session, in accordance with a determination that the communication session is initiated at a first respective electronic device (e.g., 500 a) associated with a first content playback limit (e.g., associated with a user account associated with the first content playback limit), the electronic device (e.g., 500 a) limits playback of one or more content items in accordance with the first content playback limit. In some embodiments, the content playback limit is a parental control that limits the first respective electronic device's access to explicit content. For example, if the communication session is initiated by an electronic device on which a parental control preventing access to explicit content is active, no electronic device in the communication session can initiate playback of explicit content in the shared content playback mode while participating in the communication session (e.g., at multiple or all electronic devices in the communication session).
In some implementations, such as shown in fig. 6Z, when the electronic device (e.g., 500 a) is in a communication session and when the electronic device (e.g., 500 a) is in a shared content playback mode of the communication session, in accordance with a determination that the communication session is initiated at a second respective electronic device (e.g., 500 b) associated with a second content playback limit (e.g., associated with a user account associated with the second content playback limit), the electronic device (e.g., 500 a) limits playback of one or more content items in accordance with the second content playback limit. In some embodiments, the content playback limit corresponds to parental controls not being enabled on the second respective electronic device, thus enabling the second respective electronic device to access explicit content. For example, if the communication session is initiated by an electronic device on which parental controls preventing access to explicit content are inactive, any electronic device in the communication session may initiate playback of explicit content in a shared content playback mode (e.g., at multiple or all electronic devices in the communication session). In some implementations, various content playback limit settings may be used for the first and second respective electronic devices. For example, the first setting allows access to the first subset of content items but restricts access to the second subset and the third subset of content items, and the second setting allows access to the first subset and the second subset of content items but restricts access to the third subset of content items.
In this example, if the electronic device initiating the communication session is configured with the first setting, the electronic device in the communication session is capable of initiating playback of the first subset of content items in the shared content playback mode, but is not capable of initiating playback of the second subset and the third subset of content items in the shared content playback mode. As another example, if the electronic device initiating the communication session is configured with the second setting, the electronic device in the communication session is capable of initiating playback of the first subset and the second subset of content items in the shared content playback mode, but is not capable of initiating playback of the third subset of content items in the shared content playback mode.
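The subset-based restriction inheritance described above can be sketched as follows. This Python sketch is illustrative; the setting names and subset labels mirror the example above but the data structure is an assumption, not part of the disclosure. Every device in the session inherits the restriction setting of the device that initiated the session.

```python
# Hypothetical restriction settings: each setting maps to the content
# subsets it allows; anything else is blocked for the whole session.
RESTRICTION_SETTINGS = {
    "first": {"subset_1"},               # blocks subsets 2 and 3
    "second": {"subset_1", "subset_2"},  # blocks subset 3
}

def session_allows(initiator_setting, content_subset):
    """Every device in the session inherits the restriction setting of
    the device that initiated the session."""
    return content_subset in RESTRICTION_SETTINGS[initiator_setting]
```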
The above-described manner of applying content restriction settings of an electronic device that initiates a communication session to content played in a shared content playback mode of the communication session provides an efficient way of maintaining content restriction settings and synchronized playback in the communication session, which additionally reduces power usage and extends battery life of the electronic device by enabling a user to more quickly and efficiently use the electronic device.
In some implementations, after playback of the first content item is completed, in accordance with a determination that the electronic device (e.g., 500 a) is in a communication session and the electronic device is in a shared content playback mode of the communication session, such as shown in fig. 6X, the electronic device (e.g., 500 a) initiates playback of a second content item (e.g., 650 g) included in a shared content item queue associated with the communication session. In some implementations, when the electronic device is in a shared playback mode of the communication session, the electronic device (e.g., and one or more second electronic devices in the communication session) automatically plays a next content item in a shared content item queue associated with the communication session after playback of the respective content item is completed.
In some implementations, after completing playback of the first content item, in accordance with a determination that the electronic device (e.g., 500 a) is not in a shared content playback mode of the communication session (e.g., the electronic device is not in the communication session or the electronic device is in a private content playback mode of the communication session), such as shown in fig. 6CC, the electronic device (e.g., 500 a) initiates playback of a third content item (e.g., 590 a) included in a private content item queue not associated with the communication session. In some implementations, when the electronic device is not in the shared playback mode of the communication session, the electronic device automatically plays a next content item in the private content item queue that is not associated with the communication session after playback of the respective content item is completed (e.g., and the one or more second electronic devices forgo playing the next content item in the private content item queue). In some implementations, the electronic device maintains a private content item queue from which the electronic device plays content items when the electronic device is not in a shared content playback mode of the communication session. For example, the electronic device plays content from the private content item queue before entering the shared content item playback mode and/or after exiting the shared content item playback mode.
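The queue-selection step on playback completion described above can be sketched as follows. This Python sketch is illustrative; the function name and queue representation are assumptions, not part of the disclosure. The next item comes from the shared queue in shared mode and from the device's private queue otherwise.

```python
def next_content_item(in_shared_mode, shared_queue, private_queue):
    """On playback completion, take the next item from the shared queue
    while in the session's shared playback mode, otherwise from the
    device's own private queue; return None if that queue is empty."""
    queue = shared_queue if in_shared_mode else private_queue
    return queue.pop(0) if queue else None
```

Note that private playback never consumes items from the shared queue, so the private queue is preserved across entering and exiting the shared playback mode.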
The above-described manner of playing content items from the private content item queue when not in the shared content item playback mode and playing content items from the shared content item queue when in the shared content item playback mode provides an efficient manner of maintaining the private content item queue for use when the electronic device is not in the shared content playback mode, which additionally reduces power use and extends battery life of the electronic device by enabling a user to more quickly and efficiently use the electronic device (e.g., by not requiring the user to establish the private queue after exiting the shared content playback mode).
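The queue behavior described above can be sketched as a simple selection function. This is an illustrative model only; the function name and the list-based queue representation are assumptions and are not taken from the disclosure.

```python
# Hypothetical sketch of the queue selection described above: when playback of
# a content item completes, the device advances within the shared content item
# queue only while it is in the shared content playback mode of a communication
# session; otherwise it advances within its private content item queue, which
# is maintained across entry into and exit from the shared mode.

def next_item(in_session, in_shared_mode, shared_queue, private_queue):
    """Return the next content item to play after playback completes."""
    if in_session and in_shared_mode and shared_queue:
        # Shared playback mode: all devices in the session advance together.
        return shared_queue.pop(0)
    if private_queue:
        # Private playback mode (or not in a session): use the private queue.
        return private_queue.pop(0)
    return None
```

For example, a device in the shared mode with shared queue `["s1", "s2"]` would play `"s1"` next, while the same device in a private playback mode would instead draw from its private queue.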
In some embodiments, such as shown in fig. 6A, when an electronic device (e.g., 500 a) is in a respective communication session with one or more third electronic devices (e.g., 500 b) (e.g., the same as or different from the one or more second electronic devices), the one or more third electronic devices (e.g., 500 b) being associated with one or more users other than the user of the electronic device, the electronic device (e.g., 500 a) receives, from the one or more third electronic devices (e.g., from one of the one or more third electronic devices, e.g., via one or more servers in communication with the electronic device, with the one or more third electronic devices, and/or with each other), an indication of a request (e.g., via contact 603 a) to initiate playback of a respective content item in the respective communication session. In some implementations, one of the one or more third electronic devices receives an input corresponding to a request to initiate playback of the respective content item in a shared content playback mode of the respective communication session, and an indication of the request is transmitted to the other electronic devices in the communication session.
In some implementations, in response to receiving (e.g., via contact 603 a) the indication of the request to initiate playback of the respective content item in the respective communication session, such as shown in fig. 6A, in accordance with a determination that the content playback limit setting of the electronic device (e.g., 500 a) is not more restrictive than the content playback limit setting associated with the respective communication session, the electronic device (e.g., 500 a) displays, via a display generation component in communication with the electronic device (e.g., 500 a), a selectable option (e.g., 622B) that, when selected, causes the electronic device to initiate playback of the respective content item in a shared content playback mode of the respective communication session, such as shown in fig. 6B. In some embodiments, the content playback limit setting is a parental control setting that restricts the electronic device from accessing explicit content. For example, if the respective communication session limits playback of explicit content and the electronic device does not have any content playback restrictions, the electronic device is able to play the content in the shared content playback mode of the respective communication session. In some embodiments, the content playback limit setting is set for a user account associated with the electronic device and is active on all electronic devices associated with the user account. In some embodiments, when the electronic device is in a communication session, the electronic device plays the content item in the shared content item playback mode of the communication session in accordance with a determination that the content restriction setting of the communication session is not less restrictive than the content restriction setting of the electronic device (e.g., of a user account of the electronic device). For example, the content restriction settings prevent the electronic device from playing explicit content items.
As another example, the content restriction settings prevent the electronic device from playing content items having one or more predetermined content ratings (e.g., PG-13, R, TV-17, TV-MA, etc.). In some implementations, if the content restriction settings of the communication session are more restrictive than the content restriction settings of the electronic device, the electronic device is not capable of playing content items restricted by the communication session in the shared content playback mode, even if the electronic device is capable of playing those content items in the private playback mode and/or when not in the communication session.
In some implementations, in response to receiving (e.g., via contact 603 a) the indication of the request to initiate playback of the respective content item in the respective communication session, such as shown in fig. 6A, in accordance with a determination that the content playback limit setting of the electronic device (e.g., 500 a) is more restrictive than the content playback limit setting associated with the respective communication session, the electronic device (e.g., 500 a) forgoes displaying the selectable option (e.g., 622B shown in fig. 6B), such as shown in fig. 6 BB. For example, if the respective communication session allows playback of explicit content and the electronic device restricts playback of explicit content, the electronic device is not able to play the content in the shared content playback mode of the respective communication session. In some implementations, the electronic device presents a visual indication that one of the one or more third electronic devices has initiated playback of the respective content item in the respective communication session, but the electronic device is unable to play content in the shared content playback mode of the respective communication session due to the content restriction settings on the electronic device. In some embodiments, the communication session has content playback limit settings that match the content playback limit settings of the electronic device that initiated the communication session.
The above-described manner of preventing playback of content in a shared content playback mode of a respective communication session provides an efficient way of maintaining content restriction settings and synchronized playback in the communication session when the content playback restriction settings of the electronic device are more restrictive than the content playback restriction settings of the respective communication session, which additionally reduces power usage and extends battery life of the electronic device by enabling a user to use the electronic device more quickly and efficiently.
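One way to model the determination described above is to compare restrictiveness levels. This is a hypothetical sketch; the disclosure does not specify how restriction settings are represented, so an integer ordering is assumed here, with higher values meaning more restrictive.

```python
def can_offer_shared_playback(device_restriction, session_restriction):
    """Show the option to play in the shared content playback mode only when
    the device's content playback restriction setting is not more restrictive
    than the setting associated with the communication session.

    Higher value = more restrictive (assumed representation).
    """
    return device_restriction <= session_restriction
```

Under this model, a device with no restrictions (level 0) can join a session that restricts explicit content (level 1), but a device at level 2 cannot join a level-1 session and would forgo displaying the selectable option.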
In some implementations, when a user in a communication session initiates playback of a content item in a shared content playback mode, if the electronic device is not already in the shared content playback mode, the electronic device presents a selectable option that, when selected, causes the electronic device to enter the shared content playback mode. In some embodiments, if the electronic device is not subscribed to a content service associated with the communication session when the selection of the selectable option is detected, the electronic device presents an option that, when selected, causes the electronic device to initiate a process of subscribing to the content service.
In some embodiments, when the electronic device is in a communication session, the electronic device has access to a first respective content (e.g., streaming, browsing, playback, library, etc.) application for playing content associated with the communication session. In some implementations, when the electronic device is in a shared content playback mode of the communication session, the electronic device plays (and the one or more second electronic devices play) the content item via the first respective content application.
In some embodiments, such as shown in fig. 6I, when an electronic device (e.g., 500 b) is in a respective communication session with one or more third electronic devices (e.g., 500 a) (e.g., the one or more third electronic devices are the same as or different from the one or more second electronic devices), the one or more third electronic devices (e.g., 500 a) being associated with one or more third users (e.g., the same as or different from the second user) that are different from the user of the electronic device (e.g., 500 b), the electronic device (e.g., 500 b) receives an indication of shared playback of a respective content item associated with the respective communication session from a respective one of the one or more third electronic devices (e.g., 500 a) (e.g., optionally via one or more servers in communication with the electronic device, with the respective one or more third electronic devices, and/or with each other), such as shown in fig. 6I. In some implementations, the indication is an indication of an input received at one of the one or more third electronic devices corresponding to a request to initiate playback of the respective content item in the shared content playback mode of the communication session. In some implementations, in response to the indication, one or more electronic devices in the shared playback mode of the communication session initiate playback of the respective content item, and one or more electronic devices in the communication session that are not in the shared playback mode present a visual indication of the shared playback of the respective content item.
In some embodiments, such as shown in fig. 6I, when an electronic device (e.g., 500 b) is in a respective communication session with one or more third electronic devices (e.g., 500 a) (e.g., the one or more third electronic devices are the same as or different from the one or more second electronic devices), the one or more third electronic devices (e.g., 500 a) being associated with one or more third users (e.g., the same as or different from the second user) other than the user of the electronic device (e.g., 500 b), in response to receiving the indication of shared playback of the respective content item, in accordance with a determination that the electronic device (e.g., 500 b) does not have access to a second respective content application (e.g., the second respective content application is the same as or different from the first respective content application) for playing content associated with the respective communication session, the electronic device (e.g., 500 b) displays, via a display generation component in communication with the electronic device (e.g., 500 b), a selectable option (e.g., 636 d) that, when selected, causes the electronic device (e.g., 500 b) to initiate a process to obtain access to the second respective content application for playing the respective content item. In some implementations, the selectable option is displayed at a location within or near a visual indication of shared playback of the respective content item in the shared playback mode of the communication session. In some embodiments, in response to detecting selection of the selectable option, the electronic device initiates a process of obtaining access to (e.g., downloading, installing) the second respective content application. In some implementations, the second respective content application is an application by which the electronic device plays content items in the shared content playback mode of the communication session.
In some implementations, in response to receiving an indication of shared playback of the respective content item, in accordance with a determination that the electronic device can access the second respective content application, the electronic device displays a selectable option that, when selected, causes the electronic device to initiate playback of the respective content item in the shared content playback mode without initiating a process of gaining access to the second respective content application.
The above-described manner of presenting an option for obtaining access to the second respective content application in response to receiving an indication of shared playback of a content item in the shared playback mode of the respective communication session provides an efficient way of obtaining access to an application needed to play content in the shared content playback mode of the communication session, which additionally reduces power usage and extends battery life of the electronic device by enabling a user to more quickly and efficiently use the electronic device.
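The app-availability check described above might be sketched as follows. This is illustrative only; the function and option names are hypothetical and not taken from the disclosure.

```python
def shared_playback_option(has_content_app):
    """Choose which selectable option to display in response to an indication
    of shared playback, depending on whether the device has access to the
    content application associated with the communication session."""
    # Without the content application, offer to obtain (e.g., download and
    # install) it first; otherwise offer to join shared playback directly.
    return "join_shared_playback" if has_content_app else "get_content_app"
```

In either case, selecting the displayed option moves the device toward synchronized playback: directly, or after the process of obtaining the application completes.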
In some embodiments, such as shown in fig. 6G, when the electronic device (e.g., 500 a) is in a communication session and when the electronic device (e.g., 500 a) is in a shared content playback mode of the communication session, the electronic device (e.g., 500 a) receives, from a respective one of the one or more second electronic devices (e.g., 500 b), an indication of a first input (e.g., via contact 603G) corresponding to a request to modify playback of the content item at the electronic device (e.g., 500 a) and the one or more second electronic devices (e.g., 500 b), such as shown in fig. 6G. In some implementations, the request corresponds to a request to play the content item, pause the content item, skip forward to a next content item (e.g., in a content item queue), skip back to the beginning of the content item, skip back to a previous content item (e.g., in a content item queue), change a playback position within the content item, or modify playback of the content item in a different manner. In some embodiments, the first input is detected at the respective one of the one or more second electronic devices, and the electronic device receives the indication of the first input from the respective one of the one or more second electronic devices (e.g., via one or more servers).
In some implementations, such as shown in fig. 6H, when the electronic device (e.g., 500 a) is in a communication session and when the electronic device (e.g., 500 a) is in a shared content playback mode of the communication session, in response to receiving an indication of a first input (such as the input shown in fig. 6G), the electronic device (e.g., 500 a) modifies playback of the content item at the electronic device (e.g., 500 a) in accordance with the first input, such as shown in fig. 6H. In some implementations, in response to the input, the electronic device and the one or more second electronic devices in a shared content playback mode of the communication session modify playback of the content item in accordance with the request.
In some embodiments, such as shown in fig. 6H, when the electronic device (e.g., 500 a) is in a communication session and when the electronic device (e.g., 500 a) is in a shared content playback mode of the communication session, in response to receiving an indication of a first input (such as the input shown in fig. 6G), the electronic device (e.g., 500 a) displays a visual indication (e.g., 630) of the first input (e.g., a modification to playback of a content item associated with the first input) via a display generation component in communication with the electronic device (e.g., 500 a), such as shown in fig. 6H. In some embodiments, the visual indication includes an indication of a user associated with the electronic device that received the indication of the first input. In some embodiments, the visual indication includes an indication of a particular request made by the user. For example, if the first user provided an input corresponding to a request to pause the first content item, the visual indication would include an indication of the first user and an indication that the first input corresponds to a request to pause the content item. The above-described manner of displaying the visual indication of the first input provides an efficient way of indicating to the user the reason why playback of the content item was modified, which additionally reduces power usage and extends battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some implementations, the visual indication does not include an indication of the content application if the indication of the first input is received when the electronic device presents a user interface of a content (e.g., browsing, streaming, playback, library, etc.) application associated with the communication session (e.g., an application by which the electronic device plays content in a shared content playback mode of the communication session). In some implementations, the visual indication includes an indication of the content application if the indication of the first input is received while the electronic device presents a user interface of an application other than the content (e.g., browsing, streaming, playback, library, etc.) application associated with the communication session. In some implementations, the indication of the first input presented when the electronic device presents a user interface of an application other than the content application associated with the communication session is selectable to present a user interface of the content application associated with the communication session. In some embodiments, the indication of the first input presented when the electronic device presents the user interface of the content application associated with the communication session is not selectable to perform that action (e.g., because the electronic device is already presenting the user interface of the content application associated with the communication session).
In some implementations, such as shown in fig. 6Q, displaying the visual indication of the first input (e.g., 660 a) is in accordance with a determination that the request to modify playback of the content item is a request to modify playback of the content item in a first manner (e.g., the request corresponds to a request to perform one of a first subset of actions with respect to playback of the content) and that the indication of the first input was received while the electronic device (e.g., 500 b) was presenting a user interface of a content (e.g., streaming, browsing, playback, library) application associated with the communication session. In some implementations, when the electronic device displays a user interface of the content (e.g., streaming, browsing, playback, library) application associated with the communication session, the electronic device displays visual indications of a first plurality of actions related to modifying playback of a respective content item played in the shared content playback mode of the communication session.
In some implementations, such as shown in fig. 6Q, in response to receiving the indication of the first input, in accordance with a determination that the request to modify playback of the content item is a request to modify playback of the content item in the first manner and that the indication of the first input was received while the electronic device (e.g., 500 c) was presenting a user interface of an application other than the content application associated with the communication session, the electronic device (e.g., 500 c) forgoes displaying the visual indication of the first input (e.g., visual indication 660a displayed by electronic device 500 b). In some embodiments, when the electronic device is displaying a user interface other than the user interface of the content application associated with the communication session, the electronic device presents visual indications of only a subset of the actions for modifying playback of content played in the shared content playback mode of the communication session. For example, if a user adds or removes a content item from a shared content playback item queue associated with the communication session, the electronic device presents a visual indication of the addition or removal of the content item only if the electronic device is presenting a user interface of the content application associated with the communication session, and forgoes presenting the visual indication if the electronic device is presenting a user interface of a different application when the input to perform the action is received (e.g., at a respective one of the second electronic devices). In some implementations, if the electronic device itself detects an input corresponding to a request to modify playback of the content item, the electronic device forgoes presenting a visual indication of the request, regardless of the user interface being displayed by the electronic device.
The above-described manner of forgoing display of visual indications when the electronic device presents a user interface other than the user interface of the content application associated with the communication session provides an efficient manner of reducing interference when the user interacts with an application other than the content application associated with the communication session, which additionally reduces power usage and extends battery life of the electronic device by enabling the user to more quickly and efficiently use the electronic device.
In some embodiments, such as shown in fig. 6R, in response to receiving the indication of the first input, in accordance with a determination that the request to modify playback of the content item is a request to modify playback of the content item in a second manner different from the first manner (e.g., the request corresponds to a request to perform one of a second subset of actions with respect to playback of the content), the electronic device (e.g., 500 c) displays a visual indication of the first input (e.g., 660 c) when the indication of the first input is received, regardless of whether the electronic device (e.g., 500 c) is presenting a user interface of a content application associated with the communication session or presenting a user interface of an application different from the content application associated with the communication session. For example, when a user initiates playback of a content item, pauses a content item, skips a content item, changes the playback position of a content item currently being played, or skips to a previous content item in a shared content item queue associated with a communication session, the electronic device presents a visual indication regardless of whether the electronic device is presenting a user interface of a content application associated with the communication session or presenting a user interface of a different application.
The above-described manner of presenting the visual indication of the first respective input corresponding to the request to modify playback of the content item in the second manner, regardless of whether the electronic device is presenting the user interface of the content application or the user interface of a different application, provides an efficient way of informing the user of the reason why playback of the content item has changed, which additionally reduces power usage and extends battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
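The two-tier indication behavior described above can be sketched as follows. The action names and the split between the two subsets are assumptions for illustration, not taken verbatim from the disclosure.

```python
# Modifications in the "second manner": indicated regardless of which
# application is in the foreground (e.g., play, pause, skip, scrub).
ALWAYS_INDICATED = {"play", "pause", "skip_forward", "skip_back", "scrub"}
# Modifications in the "first manner": indicated only while the content
# application associated with the communication session is in the foreground
# (e.g., adding to or removing from the shared content item queue).
APP_ONLY_INDICATED = {"queue_add", "queue_remove"}

def should_show_indication(action, content_app_in_foreground, input_was_local):
    """Decide whether a device shows a visual indication of another user's
    playback-modification input."""
    if input_was_local:
        # The device at which the input was detected forgoes the indication.
        return False
    if action in ALWAYS_INDICATED:
        return True
    return action in APP_ONLY_INDICATED and content_app_in_foreground
```

For instance, a remote pause is indicated on every device in the shared mode, while a remote queue addition is indicated only on devices currently presenting the content application's user interface.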
In some implementations, the electronic device displays a visual indication when a user adds a song to the shared content item queue of the communication session, initiates or pauses playback of a content item, skips forward or skips back to a content item in the shared content item queue, and/or transmits an indication of a reaction (e.g., favorite, like, dislike, laugh, emphasize, etc.) to the content item.
In some implementations, the electronic device presents a user interface of a content (e.g., streaming, playback, browsing, library, etc.) application associated with the communication session. In some implementations, the electronic device uses a content application associated with the communication session to present the content item in a shared content playback mode of the communication session. In some implementations, the user interface of the content application associated with the communication session includes an indication of a current content playback state of the content application, including an indication of a content item played on the electronic device (e.g., title, artist, etc.), an indication of an audio device (e.g., speaker system, headset, etc.) used to present the content, an image (e.g., album art) associated with the content item played on the electronic device, a selectable option to play/pause the content, a selectable option to skip to a next content item in the content item queue, and a visual indication of a number of users consuming the content in a shared content playback mode of the communication session.
In some implementations, content applications associated with a communication session can access multiple types of content items, such as live radio stations, albums, playlists, songs, podcasts, and the like. In some implementations, a subset of the types of content items are available for playback in a shared content playback mode of the communication session, and a subset of the types of content items are unavailable for playback in a shared content playback mode of the communication session. For example, live content (such as a live radio) is not available for playback in the shared playback mode of the communication session, and other content (such as albums, playlists, songs, and podcasts) is available for playback in the shared playback mode of the communication session. In some embodiments, all types of content are available when the electronic device is not in a shared content playback mode (e.g., when the electronic device is not in a communication session, when the electronic device is in a private content playback mode of a communication session).
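The availability rule in the example above can be sketched as follows. The content-type names are hypothetical; the disclosure only gives live radio as an example of content unavailable for shared playback.

```python
# Illustrative sketch: live content types are not available for playback in
# the shared content playback mode, while all content types remain available
# outside that mode (e.g., in a private content playback mode).
LIVE_TYPES = {"live_radio"}

def available_for_playback(content_type, in_shared_mode):
    """Return whether a content item of the given type can be played in the
    current playback mode."""
    return not (in_shared_mode and content_type in LIVE_TYPES)
```

So an album or podcast is playable in either mode, while a live radio station is playable only outside the shared content playback mode.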
In some embodiments, such as shown in fig. 6E, when the electronic device (e.g., 500 b) is associated with a first content playback limit and the electronic device (e.g., 500 b) is in a communication session associated with a second content playback limit, while the electronic device is not in a shared content playback mode of the communication session, the electronic device (e.g., 500 b) receives, via one or more input devices, input (e.g., via contact 603E) corresponding to a request to enter the shared content playback mode of the communication session, such as shown in fig. 6E. In some implementations, the input corresponding to the request to enter the shared content playback mode of the communication session is a selection of an option to initiate playback of a respective content item in the shared content playback mode of the communication session. In some implementations, the input corresponding to the request to enter the shared content playback mode of the communication session is a selection of a visual indication that one of the second electronic devices has initiated playback of a respective content item in the shared content playback mode of the communication session. In some embodiments, a content playback limit is a parental control that restricts the electronic device from accessing explicit content. For example, if the communication session is associated with a parental control that prevents access to explicit content, it is not possible for any electronic device in the communication session to initiate playback of explicit content in the shared content playback mode during participation in the communication session (e.g., at multiple or all electronic devices in the communication session).
As another example, if the communication session is associated with a content playback limit that allows playback of all content available from the content service (including explicit content and non-explicit content), then an electronic device in the shared content playback mode of the communication session may play explicit content and non-explicit content in the shared content playback mode of the communication session. In some embodiments, various content playback limit settings may be available for electronic devices and communication sessions. For example, a first setting allows access to a first subset of content items but restricts access to a second subset and a third subset of content items, and a second setting allows access to the first subset and the second subset of content items but restricts access to the third subset of content items. In this example, if the communication session is associated with the first setting, an electronic device in the communication session is able to initiate playback of the first subset of content items in the shared content playback mode, but is not able to initiate playback of the second subset and the third subset of content items in the shared content playback mode. As another example, if the communication session is associated with the second setting, an electronic device in the communication session is able to initiate playback of the first subset and the second subset of content items in the shared content playback mode, but is not able to initiate playback of the third subset of content items in the shared content playback mode. In some embodiments, the communication session is associated with the same content restriction settings as the respective electronic device that initiated the communication session.
In some implementations, the communication session is associated with the same content restriction settings of the respective electronic device that initiated playback of the content in the shared content playback mode of the communication session. In some embodiments, the content playback limit of the electronic device is associated with a user account associated with the electronic device and with other electronic devices that are also associated with the same user account.
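The tiered-settings example above can be sketched as a small lookup. The setting and subset names are illustrative placeholders for whatever representation an implementation might use.

```python
# Sketch of the tiered restriction-setting example above: each setting names
# the subsets of content items it allows; everything else is restricted.
SETTINGS = {
    "first": {"subset1"},                # allows only the first subset
    "second": {"subset1", "subset2"},    # allows the first and second subsets
}

def playable_in_shared_mode(session_setting, item_subset):
    """A content item may be played in the shared content playback mode only
    if its subset is allowed by the session's restriction setting."""
    return item_subset in SETTINGS[session_setting]
```

Under the first setting only the first subset is playable in the shared mode; under the second setting the first and second subsets are playable but the third is not.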
In some embodiments, when the electronic device (e.g., 500 b) is associated with a first content playback limit and the electronic device (e.g., 500 b) is in a communication session associated with a second content playback limit, in response to receiving the input (e.g., via contact 603E) corresponding to the request to enter the shared content playback mode of the communication session, such as shown in fig. 6E, in accordance with a determination that the first content playback limit is more restrictive than the second content playback limit, the electronic device (e.g., 500 b) forgoes entering the shared content playback mode of the communication session, such as shown in fig. 6Y. In some embodiments, if the content playback limit of the communication session allows playback of content that is limited (e.g., not allowed) by the content playback limit of the electronic device (e.g., the content playback limit of the electronic device is more restrictive than the content playback limit of the communication session), the electronic device is able to join the communication session, but is unable to join the shared content playback mode of the communication session.
In some embodiments, when the electronic device (e.g., 500 b) is associated with a first content playback limit and the electronic device (e.g., 500 b) is in a communication session associated with a second content playback limit, in response to receiving the input (e.g., via contact 603E) corresponding to the request to enter the shared content playback mode of the communication session, such as shown in fig. 6E, in accordance with a determination that the first content playback limit is not more restrictive than the second content playback limit, the electronic device (e.g., 500 b) enters the shared content playback mode of the communication session, such as shown in fig. 6G. In some embodiments, if the content playback limit of the communication session limits playback of content limited (e.g., not allowed) by the content playback limit of the electronic device (e.g., the content playback limits of the communication session and the electronic device are the same), the electronic device is able to join the communication session and is able to join the shared content playback mode of the communication session. In some embodiments, if the content playback limit of the communication session limits playback of content allowed by the content playback limit of the electronic device (e.g., the content playback limit of the communication session is more restrictive than the content playback limit of the electronic device), the electronic device is able to join the communication session and is able to join the shared content playback mode of the communication session.
The above-described manner of restricting access to the shared content playback mode of a communication session whose content playback restrictions are less restrictive than those of the electronic device provides an efficient way of enforcing content playback restrictions on the electronic device, which additionally reduces power usage and extends the battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
In some embodiments, while the electronic device (e.g., 500 b) is associated with a first content playback limit and the electronic device (e.g., 500 b) is not in a communication session that is associated with a second content playback limit, the electronic device (e.g., 500 b) receives, via one or more input devices, an input corresponding to a request to join the communication session. In some embodiments, the content playback limit is a parental control that restricts access to explicit content by the first respective electronic device. In some embodiments, the content playback limit of the electronic device is associated with a user account that is associated with the electronic device and with other electronic devices also associated with the same user account. For example, if the communication session is associated with a parental control that prevents access to explicit content, no electronic device in the communication session is able to initiate playback of explicit content in the shared content playback mode during participation in the communication session (e.g., at multiple or all of the electronic devices in the communication session). As another example, if the communication session is associated with a content playback limit that allows playback of all content available from the content service (including explicit content and non-explicit content), then an electronic device in the shared content playback mode of the communication session is able to play explicit content and non-explicit content in the shared content playback mode of the communication session. In some embodiments, various content playback limit settings may be available for electronic devices and communication sessions.
For example, a first setting allows access to a first subset of content items but restricts access to a second subset and a third subset of content items, and a second setting allows access to the first subset and the second subset of content items but restricts access to the third subset of content items. In this example, if the communication session is associated with the first setting, an electronic device in the communication session is able to initiate playback of the first subset of content items in the shared content playback mode, but is not able to initiate playback of the second subset and the third subset of content items in the shared content playback mode. As another example, if the communication session is associated with the second setting, an electronic device in the communication session is able to initiate playback of the first subset and the second subset of content items in the shared content playback mode, but is not able to initiate playback of the third subset of content items in the shared content playback mode. In some embodiments, the communication session is associated with the same content restriction settings as the respective electronic device that initiated the communication session. In some implementations, the communication session is associated with the same content restriction settings as the respective electronic device that initiated playback of content in the shared content playback mode of the communication session.
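The tiered settings in the example above reduce to an allow-list lookup per setting. The sketch below is illustrative only; the setting names, subset labels, and dictionary representation are assumptions for illustration, not part of the disclosure.

```python
# Hypothetical model of tiered content playback limit settings: each
# setting names the subsets of content items it allows. Both example
# settings restrict the third subset, matching the text above.
SETTINGS = {
    "first_setting": {"first_subset"},
    "second_setting": {"first_subset", "second_subset"},
}


def can_initiate_shared_playback(session_setting: str, content_subset: str) -> bool:
    """A device in the session can initiate shared playback of a content
    item only if the item's subset is allowed by the session's setting
    (which the session inherits from the originating device)."""
    return content_subset in SETTINGS[session_setting]
```

With the first setting, shared playback of the first subset is allowed but the second and third subsets are not; with the second setting, only the third subset remains restricted.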
In some embodiments, in response to the input corresponding to the request to join the communication session, in accordance with a determination that the first content playback limit is more restrictive than the second content playback limit, the electronic device (e.g., 500 b) forgoes joining the communication session. In some embodiments, the input corresponding to the request to join the communication session is a selection of an indication of initiation of the communication session (e.g., similar to an incoming call indication). In some embodiments, the electronic device is unable to join the communication session if the content playback limit of the communication session allows playback of content that is restricted (e.g., not allowed) by the content playback limit of the electronic device (e.g., the content playback limit of the electronic device is more restrictive than the content playback limit of the communication session).
In some embodiments, in response to the input corresponding to the request to join the communication session, in accordance with a determination that the first content playback limit is not more restrictive than the second content playback limit, the electronic device (e.g., 500 b) joins the communication session. In some embodiments, the electronic device is able to join the communication session if the content playback limit of the communication session restricts playback of content that is restricted (e.g., not allowed) by the content playback limit of the electronic device (e.g., the content playback limits of the communication session and the electronic device are the same). In some embodiments, the electronic device is able to join the communication session if the content playback limit of the communication session restricts playback of content that is allowed by the content playback limit of the electronic device (e.g., the content playback limit of the communication session is more restrictive than the content playback limit of the electronic device). The above-described manner of restricting access to communication sessions whose content playback restrictions are less restrictive than those of the electronic device provides an efficient way of enforcing content playback restrictions on the electronic device, which additionally reduces power usage and extends the battery life of the electronic device by enabling the user to use the electronic device more quickly and efficiently.
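The join check and the shared-content-playback-mode check described above both reduce to a single restrictiveness comparison between the device's limit and the session's limit. The sketch below is a hypothetical illustration, not the claimed implementation; modeling restrictiveness as an integer rank (higher rank = more restrictive) is an assumption, since the disclosure does not prescribe any particular representation.

```python
def is_more_restrictive(first_limit: int, second_limit: int) -> bool:
    # Higher rank means fewer kinds of content are allowed.
    return first_limit > second_limit


def may_join_session(device_limit: int, session_limit: int) -> bool:
    # The device forgoes joining only when its own limit is more
    # restrictive than the session's limit.
    return not is_more_restrictive(device_limit, session_limit)


def may_enter_shared_playback_mode(device_limit: int, session_limit: int) -> bool:
    # The same comparison gates entry into the shared content playback
    # mode of the communication session.
    return not is_more_restrictive(device_limit, session_limit)
```

Under this model, a device whose limit is equal to or less restrictive than the session's can both join the session and enter its shared content playback mode; a more restrictively limited device can do neither.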
It should be understood that the particular order in which the operations in fig. 7 are described is merely exemplary and is not intended to suggest that the described order is the only order in which the operations may be performed. Those of ordinary skill in the art will recognize a variety of ways to reorder the operations described herein.
The operations in the above-described information processing method are optionally implemented by running one or more functional modules in an information processing apparatus such as a general purpose processor (e.g., as described in connection with fig. 1A-1B, 3, 5A-5H) or an application specific chip. Furthermore, the operations described above with reference to fig. 7 are optionally implemented by the components depicted in fig. 1A-1B. For example, receive operation 704 is optionally implemented by event sorter 170, event recognizer 180, and event handler 190. The event monitor 171 in the event sorter 170 detects a contact on the touch screen 504 and the event dispatcher module 174 communicates the event information to the application 136-1. The respective event identifier 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch screen corresponds to a predefined event or sub-event, such as a selection of an object on the user interface. When a respective predefined event or sub-event is detected, the event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or invokes data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update what is displayed by the application. Similarly, it will be apparent to one of ordinary skill in the art how other processes may be implemented based on the components depicted in fig. 1A-1B.
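The dispatch flow described above (an event sorter detects a contact, an event recognizer compares the event information to an event definition, and a matching recognizer activates its associated event handler) can be sketched as follows. The class and method names loosely mirror the components referenced above (event sorter 170, event recognizer 180, event handler 190), but the code is an illustrative assumption only, not the patented implementation.

```python
class EventRecognizer:
    """Compares incoming event information to an event definition and,
    on a match, activates the associated event handler."""

    def __init__(self, event_definition, handler):
        self.event_definition = event_definition  # e.g., "object_selected"
        self.handler = handler                    # invoked on a match

    def recognize(self, event_info) -> bool:
        # Determine whether the event corresponds to a predefined event
        # or sub-event, such as a selection of an object on the UI.
        if event_info == self.event_definition:
            self.handler(event_info)
            return True
        return False


class EventSorter:
    """Detects contacts and delivers event information to the active
    application's event recognizers."""

    def __init__(self, recognizers):
        self.recognizers = recognizers

    def dispatch(self, event_info) -> bool:
        # Offer the event to each recognizer until one handles it.
        return any(r.recognize(event_info) for r in self.recognizers)


# Usage: a handler that updates application-internal state on selection.
app_state = {}
sorter = EventSorter([
    EventRecognizer("object_selected",
                    lambda e: app_state.update(last_event=e)),
])
handled = sorter.dispatch("object_selected")
```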
While the present disclosure and examples have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. It should be understood that such variations and modifications are considered to be included within the scope of the disclosure and examples as defined by the claims.
As described above, one aspect of the present technology is the gathering and use of data available from various sources to share content items in a communication session. The present disclosure contemplates that in some instances, such gathered data may include personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, Twitter IDs, home addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information), date of birth, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information data in the present technology may be used to benefit users. For example, personal information data may be used to recommend content items based on content consumption histories of one or more users in a communication session. Thus, the use of such personal information data enables users to consume content related to their interests. In addition, the present disclosure contemplates other uses for personal information data that are beneficial to the user. For example, health and fitness data may be used to provide insight into the overall health of a user, or may be used as positive feedback to individuals using technology to pursue health goals.
The present disclosure contemplates that entities responsible for collecting, analyzing, disclosing, transmitting, storing, or otherwise using such personal information data will adhere to established privacy policies and/or privacy practices. In particular, such entities should implement and consistently apply privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining the privacy and security of personal information data. Such policies should be easily accessible to users and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses by the entity and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur only after receiving the informed consent of users. Additionally, such entities should consider taking any steps needed to safeguard and secure access to such personal information data and to ensure that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be tailored to the particular types of personal information data being collected and/or accessed and adapted to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the United States, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA); whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence, different privacy practices should be maintained for different personal data types in each country.
Notwithstanding the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, in the case of a multi-participant real-time communication session, the present technology can be configured to allow users to select to "opt in" or "opt out" of participation in the collection of personal information data during registration for services or at any time thereafter. In another example, users can select not to share data associated with the user. In addition to providing "opt in" and "opt out" options, the present disclosure also contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an app that their personal information data will be accessed, and then reminded again just before personal information data is accessed by the app.
Further, it is the intent of the present disclosure that personal information data should be managed and handled in a way to minimize risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health-related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
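The de-identification techniques mentioned above (removing specific identifiers, storing location at the city level rather than the address level) might be sketched as follows; the record fields, identifier list, and helper name are hypothetical and chosen for illustration only.

```python
# Hypothetical set of direct identifiers to strip from a stored record.
SPECIFIC_IDENTIFIERS = {"name", "date_of_birth", "phone", "email"}


def de_identify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed and
    location coarsened from the address level to the city level."""
    cleaned = {k: v for k, v in record.items()
               if k not in SPECIFIC_IDENTIFIERS}
    if "address" in cleaned:
        # Keep only the city, not the full street address.
        cleaned["city"] = cleaned.pop("address").get("city")
    return cleaned
```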
Therefore, although the present disclosure broadly covers use of personal information data to implement one or more of the various disclosed embodiments, the present disclosure also contemplates that the various embodiments can be implemented without the need for accessing such personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data. For example, a representation of the gaze of a particular user can be corrected at another user's system by inferring preferences and/or gaze directions of the particular user and/or other users based on non-personal information data or a bare minimum amount of personal information, such as the content being requested by the device associated with the user, other non-personal information available to the multi-participant real-time communication service, or publicly available information.
The foregoing description, for purposes of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application to thereby enable others skilled in the art to best utilize the invention and various described embodiments with various modifications as are suited to the particular use contemplated.

Claims (27)

1. A method, comprising:
at an electronic device in communication with one or more input devices:
when the electronic device is in a communication session with one or more second electronic devices, the one or more second electronic devices being associated with one or more second users different from the user of the electronic device:
receiving, via the one or more input devices, input corresponding to a request to modify playback of a content item while the playback of the content item is occurring; and
in response to receiving the input:
in accordance with a determination that the input is received while the electronic device is in a shared content playback mode of the communication session, modifying the playback of the content item in accordance with the input at the electronic device and at the one or more second electronic devices; and
in accordance with a determination that the input is received while the electronic device is in a private content playback mode of the communication session, modifying the playback of the content item in accordance with the input at the electronic device without modifying the playback of the content item at the one or more second electronic devices.
2. The method of claim 1, further comprising:
When the electronic device is in the communication session:
receiving, at a respective electronic device of the one or more second electronic devices, an indication of a selection of a respective content item; and
in response to receiving the indication of the selection of the respective content item, initiating playback of the respective content item at the electronic device.
3. The method of any of claims 1-2, further comprising:
when the electronic device is in the communication session:
receiving, via the one or more input devices, an indication of a request to initiate playback of a respective content item in the communication session; and
in response to receiving the indication of the request to initiate playback of the respective content item in the communication session:
in accordance with a determination that a user account associated with the electronic device is authorized to access the respective content item through a content service associated with the communication session, presenting a selectable option that, when selected, causes the electronic device to initiate playback of the respective content item in the shared content playback mode of the communication session; and
in accordance with a determination that the user account associated with the electronic device is not authorized to access the respective content item through the content service associated with the communication session, forgoing presentation of the selectable option.
4. A method according to any one of claims 1 to 3, further comprising:
when the electronic device is in the communication session:
receiving, via the one or more input devices, input corresponding to selection of a respective content item; and
in response to receiving the input corresponding to the selection of the respective content item, displaying via a display generation component in communication with the electronic device:
a first selectable option that, when selected, causes the electronic device to initiate playback of the respective content item in the shared content playback mode on the electronic device and the one or more second electronic devices; and
a second selectable option, which when selected, causes the electronic device to initiate playback of the respective content item in the private content playback mode on the electronic device without initiating playback of the respective content item on the one or more second electronic devices.
5. The method of any one of claims 1 to 4, further comprising:
when the electronic device is in the communication session and when the electronic device is in the shared content playback mode of the communication session:
playing respective content items from a shared content item queue of the communication session comprising a plurality of content items, wherein the electronic device and the one or more second electronic devices are capable of editing the shared content item queue.
6. The method of claim 5, further comprising:
when the electronic device is in the communication session:
receiving, via the one or more input devices, an input corresponding to a request to display the shared content item queue via a display generation component in communication with the electronic device; and
in response to receiving the input corresponding to the request to display the shared content item queue, displaying via the display generating component:
a first representation of a first content item included in the shared content item queue, the first representation of the first content item being displayed in association with a representation of a user adding the first content item to the shared content item queue, and
A second representation of a second content item included in the shared content item queue, the second representation of the second content item being displayed in association with a representation of a user adding the second content item to the shared content item queue.
7. The method of any of claims 5 to 6, further comprising:
receiving, via the one or more input devices, input corresponding to a request to remove a respective content item included in the shared content item queue from the shared content item queue; and
in response to receiving the input corresponding to the request to remove the respective content item from the shared content item queue, displaying, via a display generation component in communication with the electronic device, a visual indication indicating that the respective content item will be removed from the shared content item queue at the electronic device and the one or more second electronic devices.
8. The method of any of claims 5 to 7, further comprising:
receiving, via the one or more input devices, an input corresponding to a request to change a playback order of the plurality of respective content items included in the shared content item queue when the playback order of the plurality of respective content items included in the shared content queue is a first playback order; and
In response to receiving the input corresponding to a request to change the playback order of the plurality of respective content items included in the shared content item queue:
in accordance with a determination that a respective one of the one or more second electronic devices detected an input corresponding to a second request to change the playback order of the plurality of respective content items included in the shared content item queue from the first playback order before the electronic device received the input corresponding to the request to change the playback order of the plurality of respective content items included in the shared content item queue, updating the playback order of the respective content items included in the shared content item queue to a second playback order in accordance with the input detected at the respective one of the one or more second electronic devices; and
in accordance with a determination that the respective one of the one or more second electronic devices did not detect an input corresponding to the second request to change the playback order of the plurality of respective content items included in the shared content item queue from the first playback order before the electronic device received the input corresponding to the request to change the playback order of the plurality of respective content items included in the shared content item queue, updating the playback order of the respective content items included in the shared content item queue to a third playback order, different from the second playback order, in accordance with the input detected at the electronic device.
9. The method of any of claims 5 to 8, further comprising:
after playing the first content item in the shared content item queue:
in accordance with a determination that the content item queue includes a second content item subsequent to the first content item in the content item queue, initiating playback of the second content item; and
in accordance with a determination that the content item queue does not include the second content item subsequent to the first content item in the content item queue, playing, during the communication session, one or more content items selected based on the plurality of content items previously included in the shared content item queue of the communication session and content consumption histories of a user account associated with the electronic device and one or more user accounts associated with the one or more second electronic devices.
10. The method of any one of claims 1 to 9, further comprising:
detecting, via the one or more input devices, an input corresponding to a request to initiate playback of a respective content item; and
in response to detecting the input corresponding to a request to initiate playback of the respective content item:
in accordance with a determination that the electronic device is in the communication session, the electronic device is in the shared content playback mode of the communication session, and the electronic device is playing a second respective content item when the input corresponding to the request to initiate playback of the respective content item is received:

displaying, via a display generation component in communication with the electronic device, a selectable option that, when selected, causes the electronic device to stop playing the second respective content item at the electronic device and initiate playback of a predetermined respective portion of the respective content item at the electronic device without initiating playback of the predetermined respective portion of the respective content item at the one or more second electronic devices; and

in accordance with a determination that the electronic device is not in the communication session when the input corresponding to the request to initiate playback of the respective content item is received, forgoing display of the selectable option.
11. The method of any one of claims 1 to 10, further comprising:
when the electronic device is in the communication session and when the electronic device is in the shared content playback mode of the communication session:
In accordance with a determination that the communication session is initiated at a first respective electronic device associated with a first content playback limit, restricting playback of one or more content items in accordance with the first content playback limit; and
in accordance with a determination that the communication session is initiated at a second respective electronic device associated with a second content playback limit, restricting playback of the one or more content items in accordance with the second content playback limit.
12. The method of any one of claims 1 to 11, further comprising:
after playback of the first content item is completed:
in accordance with a determination that the electronic device is in the communication session and the electronic device is in a shared content playback mode of the communication session, initiating playback of a second content item included in a shared content item queue associated with the communication session; and
in accordance with a determination that the electronic device is not in the shared content playback mode of the communication session, initiating playback of a third content item in a private content item queue that is not associated with the communication session.
13. The method of any one of claims 1 to 12, further comprising:
when the electronic device is in a respective communication session with one or more electronic devices, the one or more electronic devices associated with one or more users other than the user of the electronic device receive, via the one or more electronic devices, an indication of a request to initiate playback of a respective content item in the respective communication session; and
In response to receiving the indication of the request to initiate playback of the respective content item in the respective communication session:
in accordance with a determination that the content playback limit setting of the electronic device is not more restrictive than the content playback limit setting associated with the respective communication session, displaying, via a display generation component in communication with the electronic device, a selectable option that, when selected, causes the electronic device to initiate playback of the respective content item in a shared content playback mode of the respective communication session; and
in accordance with a determination that the content playback limit setting of the electronic device is more restrictive than the content playback limit setting associated with the respective communication session, forgoing display of the selectable option.
14. The method of any of claims 1-13, wherein the electronic device has access to a first respective content application for playing content associated with the communication session when the electronic device is in the communication session, and the method further comprises:
the one or more third electronic devices are associated with one or more third users different from the user of the electronic device when the electronic device is in a respective communication session with the one or more third electronic devices:
Receiving, from a respective one of the one or more third electronic devices, an indication of shared playback of a respective content item associated with the respective communication session; and
in response to receiving the indication of the shared playback of the respective content item, in accordance with a determination that the electronic device is not capable of accessing a second respective content application for playing content associated with the respective communication session, displaying, via a display generation component in communication with the electronic device, a selectable option that, when selected, causes the electronic device to initiate a process of obtaining access to the second respective content application.
15. The method of any one of claims 1 to 14, further comprising:
when the electronic device is in the communication session and when the electronic device is in the shared content playback mode of the communication session:
receiving, from a respective one of the one or more second electronic devices, an indication of a first input corresponding to a request to modify playback of the content item at the electronic device and the one or more second electronic devices; and
In response to receiving the indication of the first input:
modifying playback of the content item at the electronic device according to the first input; and
displaying, via a display generation component in communication with the electronic device, a visual indication of the first input.
16. The method of claim 15, wherein the visual indication of the first input is displayed in accordance with a determination that the request to modify playback of the content item is a request to modify playback of the content item in a first manner and that the indication of the first input is received while the electronic device is presenting a user interface of a content application associated with the communication session, and the method further comprises:
in response to receiving the indication of the first input:
in accordance with a determination that the request to modify playback of the content item is a request to modify playback of the content item in the first manner and that the indication of the first input is received while the electronic device is presenting a user interface of an application other than the content application associated with the communication session, forgoing displaying the visual indication of the first input.
17. The method of claim 16, further comprising:
in response to receiving the indication of the first input:
in accordance with a determination that the request to modify playback of the content item is a request to modify playback of the content item in a second manner different from the first manner, displaying the visual indication of the first input regardless of whether, when the indication of the first input is received, the electronic device is presenting a user interface of the content application associated with the communication session or a user interface of an application different from the content application associated with the communication session.
18. The method of any one of claims 1 to 17, further comprising:
while the electronic device is associated with a first content playback restriction and is in the communication session, the communication session being associated with a second content playback restriction:
receiving, via the one or more input devices while the electronic device is not in the shared content playback mode of the communication session, an input corresponding to a request to enter the shared content playback mode of the communication session; and
in response to the input corresponding to the request to enter the shared content playback mode of the communication session:
in accordance with a determination that the first content playback restriction is more restrictive than the second content playback restriction, forgoing entering the shared content playback mode of the communication session; and
in accordance with a determination that the first content playback restriction is not more restrictive than the second content playback restriction, entering the shared content playback mode of the communication session.
19. The method of any one of claims 1 to 18, further comprising:
while the electronic device is associated with a first content playback restriction:
receiving, via the one or more input devices while the electronic device is not in the communication session, an input corresponding to a request to join the communication session, the communication session being associated with a second content playback restriction; and
in response to the input corresponding to the request to join the communication session:
in accordance with a determination that the first content playback restriction is more restrictive than the second content playback restriction, forgoing joining the communication session; and
in accordance with a determination that the first content playback restriction is not more restrictive than the second content playback restriction, joining the communication session.
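The gating condition shared by claims 18 and 19 (enter the shared playback mode, or join the session, only if the device's own content playback restriction is not more restrictive than the session's) reduces to a single comparison. The function name and the encoding of restrictions as integers, where a larger value means a more restrictive setting, are assumptions for illustration only.

```python
def may_participate(device_restriction: int, session_restriction: int) -> bool:
    """Return True if the device may enter the session's shared content
    playback mode (claim 18) or join the session (claim 19).

    Participation is forgone only when the device's own content playback
    restriction is more restrictive than the session's; equal restrictions
    are permitted. Higher integers model more restrictive settings.
    """
    return device_restriction <= session_restriction
```

For example, a device restricted to all-ages content would decline to join a session whose restriction permits mature content, while the reverse pairing would be allowed.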
20. An electronic device, comprising:
one or more processors;
a memory; and
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for:
while the electronic device is in a communication session with one or more second electronic devices, the one or more second electronic devices being associated with one or more second users different from a user of the electronic device:
receiving, via one or more input devices, an input corresponding to a request to modify playback of a content item while the playback of the content item is occurring; and
in response to receiving the input:
in accordance with a determination that the input is received while the electronic device is in a shared content playback mode of the communication session, modifying the playback of the content item in accordance with the input at the electronic device and at the one or more second electronic devices; and
in accordance with a determination that the input is received while the electronic device is in a private content playback mode of the communication session, modifying the playback of the content item in accordance with the input at the electronic device without modifying the playback of the content item at the one or more second electronic devices.
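The mode-dependent dispatch in claim 20 (propagate a playback modification to the other participants' devices in the shared content playback mode, but keep it local in the private content playback mode) can be sketched with a seek operation as the modification. The class and attribute names here are illustrative, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    position_s: float = 0.0  # current playback position, in seconds

@dataclass
class Session:
    local: Device
    remotes: list          # the one or more second electronic devices
    shared_mode: bool = True  # False models the private content playback mode

    def seek(self, position_s: float) -> None:
        """Apply a seek at the local device, and propagate it to the
        remote devices only while in shared content playback mode."""
        self.local.position_s = position_s
        if self.shared_mode:
            for device in self.remotes:
                device.position_s = position_s
```

A private-mode seek thus lets one participant scrub ahead without disturbing synchronized playback for everyone else.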
21. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by one or more processors of an electronic device, cause the electronic device to perform a method comprising:
while the electronic device is in a communication session with one or more second electronic devices, the one or more second electronic devices being associated with one or more second users different from a user of the electronic device:
receiving, via one or more input devices, an input corresponding to a request to modify playback of a content item while the playback of the content item is occurring; and
in response to receiving the input:
in accordance with a determination that the input is received while the electronic device is in a shared content playback mode of the communication session, modifying the playback of the content item in accordance with the input at the electronic device and at the one or more second electronic devices; and
in accordance with a determination that the input is received while the electronic device is in a private content playback mode of the communication session, modifying the playback of the content item in accordance with the input at the electronic device without modifying the playback of the content item at the one or more second electronic devices.
22. An electronic device, comprising:
one or more processors;
a memory;
means for, when the electronic device is in a communication session with one or more second electronic devices, the one or more second electronic devices being associated with one or more second users different from the user of the electronic device:
receiving, via one or more input devices, an input corresponding to a request to modify playback of a content item while the playback of the content item is occurring; and
in response to receiving the input:
in accordance with a determination that the input is received while the electronic device is in a shared content playback mode of the communication session, modifying the playback of the content item in accordance with the input at the electronic device and at the one or more second electronic devices; and
in accordance with a determination that the input is received while the electronic device is in a private content playback mode of the communication session, modifying the playback of the content item in accordance with the input at the electronic device without modifying the playback of the content item at the one or more second electronic devices.
23. An information processing apparatus for use in an electronic device, the information processing apparatus comprising:
means for, when the electronic device is in a communication session with one or more second electronic devices, the one or more second electronic devices being associated with one or more second users different from the user of the electronic device:
receiving, via one or more input devices, an input corresponding to a request to modify playback of a content item while the playback of the content item is occurring; and
in response to receiving the input:
in accordance with a determination that the input is received while the electronic device is in a shared content playback mode of the communication session, modifying the playback of the content item in accordance with the input at the electronic device and at the one or more second electronic devices; and
in accordance with a determination that the input is received while the electronic device is in a private content playback mode of the communication session, modifying the playback of the content item in accordance with the input at the electronic device without modifying the playback of the content item at the one or more second electronic devices.
24. An electronic device, comprising:
one or more processors;
a memory; and
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for performing any of the methods of claims 1-19.
25. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by one or more processors of an electronic device, cause the electronic device to perform any of the methods of claims 1-19.
26. An electronic device, comprising:
one or more processors;
a memory; and
means for performing any one of the methods of claims 1 to 19.
27. An information processing apparatus for use in an electronic device, the information processing apparatus comprising:
means for performing any one of the methods of claims 1 to 19.
CN202280035200.3A 2021-05-15 2022-05-13 User interface and related connection method for shared playback of content items Pending CN117378206A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US63/189,106 2021-05-15
US202163197493P 2021-06-06 2021-06-06
US63/197,493 2021-06-06
PCT/US2022/072331 WO2022246377A1 (en) 2021-05-15 2022-05-13 User interfaces and associated systems and processes for shared playback of content items

Publications (1)

Publication Number Publication Date
CN117378206A 2024-01-09

Family

ID=89406357

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280035200.3A Pending CN117378206A (en) 2021-05-15 2022-05-13 User interface and related connection method for shared playback of content items

Country Status (1)

Country Link
CN (1) CN117378206A (en)

Similar Documents

Publication Publication Date Title
CN113162843B (en) User interface for multi-user communication session
US20200382845A1 (en) Notification of augmented reality content on an electronic device
US20220179526A1 (en) User interfaces for browsing and presenting content
US20230305799A1 (en) User interfaces for content applications
CN113950663A (en) Audio media user interface
DK201870353A1 (en) User interfaces for recommending and consuming content on an electronic device
CN113835593A (en) User interface for interacting with channels providing content played in a media browsing application
US20200236212A1 (en) User interfaces for presenting information about and facilitating application functions
CN116508021A (en) Method and user interface for processing user request
US11777881B2 (en) User interfaces and associated systems and processes for sharing portions of content items
CN116368805A (en) Media service configuration
US11481205B2 (en) User interfaces for managing subscriptions
EP4334871A1 (en) User interfaces for messaging conversations
CN111684403A (en) Media capture lock affordance for graphical user interface
CN110456948B (en) User interface for recommending and consuming content on electronic devices
CN117561494A (en) User interface for displaying content recommendations for a group of users
CN117546471A (en) User interface for indicating and/or controlling playback format of content items
US20220366077A1 (en) User interfaces and associated systems and processes for shared playback of content items
CN117378206A (en) User interface and related connection method for shared playback of content items
US20230082875A1 (en) User interfaces and associated systems and processes for accessing content items via content delivery services
US20240143553A1 (en) User interfaces for messages and shared documents
WO2022261606A1 (en) User interfaces for messaging conversations
CN117529906A (en) User interface for media sharing and communication sessions
CN116964989A (en) User interface for digital identification
CN117892264A (en) Method and user interface for voice-based user profile management

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination