US10528210B2 - Foreground/background assortment of hidden windows - Google Patents


Info

Publication number
US10528210B2
Authority
US
United States
Prior art keywords
application
computing device
operating system
mobile
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US14/834,305
Other versions
US20160054862A1 (en)
Inventor
Brian Reeves
Paul E. Reeves
Wuke Liu
Borys Sushchev
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Z124 Co
Original Assignee
Z124 Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Z124 Co
Priority to US14/834,305
Assigned to Z124. Assignors: SUSHCHEV, BORYS; LIU, WUKE; REEVES, BRIAN; REEVES, PAUL E.
Publication of US20160054862A1
Application granted
Publication of US10528210B2
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1423: Controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/1454: Involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/14: Display of multiple viewports
    • G09G 5/36: Characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G 5/363: Graphics controllers
    • G09G 2370/00: Aspects of data communication
    • G09G 2370/02: Networking aspects
    • G09G 2370/022: Centralised management of display operation, e.g. in a server instead of locally
    • G09G 2370/04: Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
    • G09G 2370/06: Consumer Electronics Control, i.e. control of another device by a display or vice versa
    • G09G 2370/10: Use of a protocol of communication by packets in interfaces along the display data pipeline
    • G09G 2370/12: Use of DVI or HDMI protocol in interfaces along the display data pipeline

Definitions

  • This application relates generally to the field of mobile computing environments, and more particularly to supporting application navigation in a mobile computing environment with multiple active user environments.
  • Mobile communications devices are becoming ubiquitous in today's society. For example, as of the end of 2008, 90 percent of Americans had a mobile wireless device.
  • Among these devices are smartphones, that is, mobile phones built on top of a mobile computing platform. Mobile providers have launched hundreds of new smartphones in the last three years based upon several different computing platforms (e.g., Apple iPhone, Android, BlackBerry, Palm, Windows Mobile, and the like).
  • Smartphone penetration reached almost 23% by the middle of 2010, and over 35% in some age groups.
  • The smartphone market grew by 41% from 2009 to 2010, with over 60 million smartphone subscribers as of July 2010 in the five largest European countries alone.
  • Smartphone computing platforms typically include a mobile operating system (“OS”) running on a mobile processor. While mobile processors and mobile OSs have increased the capabilities of these devices, smartphones have not tended to replace personal computer (“PC”) environments (i.e., Windows, Mac OS X, Linux, and the like) such as desktop or notebook computers at least because of the limited user experience provided. In particular, smartphones typically have different processing resources, user interface device(s), peripheral devices, and applications. For example, mobile processors may have a different processor architecture than PC processors that emphasizes features like low-power operation and communications capabilities over raw processing and/or graphics performance.
  • Smartphones also tend to have smaller amounts of other hardware resources, such as memory (e.g., SRAM, DRAM, etc.) and storage (e.g., hard disk, SSD, etc.).
  • Other considerations typically include a smaller display size that limits the amount of information that can be presented through a mobile OS graphical user interface (“GUI”) and different user input devices.
  • User interface input device(s) for smartphones typically include a small thumb-style QWERTY keyboard, touch-screen display, click-wheel, and/or scroll-wheel.
  • Laptop, notebook, and desktop computers that use a desktop OS typically have a full-size keyboard, pointing device(s), and/or a larger screen area.
  • Mobile OSs typically have a different architecture, where some capabilities and features such as communications, lower power consumption, touch-screen capability, and the like are emphasized over traditionally emphasized PC capabilities such as processing speed, graphics processing, and application multi-tasking.
  • For example, Android runs only applications that are specifically developed to run within a Java-based virtual machine runtime environment.
  • While Android is based on a modified Linux kernel, it uses different standard C libraries, system managers, and services than Linux. Accordingly, applications written for Linux do not run on Android without modification or porting.
  • Apple's iPhone uses the iOS mobile operating system. Again, while iOS is derived from Mac OS X, applications developed for OS X do not run on iOS. Therefore, while many applications are available for mobile OSs such as Android and iOS, many other common applications for desktop operating systems such as Linux and Mac OS X are either not available on the mobile platforms or have limited functionality. As such, these mobile OSs provide a user experience distinct from that of desktop OSs.
  • Smartphones are typically suited for a limited set of user experiences and provide applications designed primarily for the mobile environment.
  • Smartphones do not provide a suitable desktop user experience, nor do they run most common desktop applications.
  • The user interface components typically found on a smartphone tend to be more difficult to use than the full-size keyboard and large display that may be typically found on a PC platform.
  • each device has its own CPU, memory, file storage, and operating system.
  • Connectivity and file sharing between smartphones and other computing devices involves linking one device (e.g., smartphone, running a mobile OS) to a second, wholly disparate device (e.g., notebook, desktop, or tablet running a desktop OS), through a wireless or wired connection.
  • Information is shared across devices by synchronizing data between applications running separately on each device. This process, typically called “synching,” is cumbersome and generally requires active management by the user.
  • Embodiments of the present invention are directed to providing the mobile computing experience of a smartphone and the appropriate user experience of a secondary terminal environment in a single mobile computing device.
  • a secondary terminal environment may be some combination of visual rendering devices (e.g., monitor or display), input devices (e.g., mouse, touch pad, touch-screen, keyboard, etc.), and other computing peripherals (e.g., HDD, optical disc drive, memory stick, camera, printer, etc.) connected to the computing device by a wired (e.g., USB, Firewire, Thunderbolt, etc.) or wireless (e.g., Bluetooth, WiFi, etc.) connection.
  • a mobile operating system associated with the user experience of the mobile environment and a desktop operating system associated with the user experience of the secondary terminal environment are run concurrently and independently on a shared kernel.
  • a mobile computing device includes a first operating system.
  • a first application is running on the mobile operating system.
  • a first application screen, associated with the first application, is displayed on an active display device.
  • the application screen may be displayed on a display of the mobile computing device.
  • a process for managing application graphics associated with the application includes receiving an application interaction state change event indicating that a current interaction state of the first application is to be changed from a foreground state to a background state, generating a bitmap image corresponding to a graphical representation of the first application screen, changing the current interaction state of the first application from the foreground state to the background state, associating the bitmap image with a position within an application activity stack corresponding to the application, receiving a user input command related to the application activity stack, and displaying a representation of the bitmap image within a graphical representation of the application activity stack.
  • the process may include receiving a user command indicative of a selection of the bitmap image within the graphical representation of the application activity stack, and changing the current interaction state of the first application from the background state to the foreground state.
  • the mobile computing device may define a first user environment, and the graphical representation of the application activity stack may be presented on a display of a second user environment.
  • the second user environment may be associated with a second operating system running concurrently with the first operating system on a shared kernel of the mobile computing device.
  • the application activity stack may be maintained by an application model manager.
  • the application activity stack may include applications that have been started by the user and not actively closed by the user.
  • a process associated with the first application screen may be suspended in response to the change in the current interaction state of the first application from the foreground state to the background state.
  • a mobile computing device includes a first application and a second application running concurrently on a first operating system.
  • a process for managing application graphics may include displaying the first application on an active display device, receiving an application interaction state change event indicating that a current interaction state of the first application is to be changed from a foreground state to a background state, generating a bitmap image corresponding to a graphical representation of an application screen associated with the first application, changing the current interaction state of the first application from the foreground state to the background state, associating the bitmap image with a position within an application activity stack corresponding to the application, and displaying a graphical representation of the application activity stack on a display device associated with a secondary terminal environment, the secondary terminal environment connected to the mobile computing device via a communications interface, the graphical representation of the application activity stack including the bitmap image.
  • the secondary terminal environment may be associated with a second operating system running concurrently with the first operating system on a shared kernel of the mobile computing device.
  • Displaying of the graphical representation of the application activity stack may be in response to a user initiated event within the secondary terminal environment and/or a dock event.
  • the dock event may include connecting the mobile computing device with the secondary terminal environment via the communications interface.
  • the bitmap image may be generated from graphical information maintained by a bitmap server within a surface manager of the first operating system.
  • the bitmap server may provide an application level interface to the bitmap image data.
  • a mobile computing device includes a first operating system and a display device.
  • a first application screen associated with a first application running on the first operating system, is displayed on the display device.
  • the first application may be considered to be in a foreground state.
  • the first operating system includes an activity manager that maintains a list of currently running applications, an application model manager of the first operating system that receives application state information from the activity manager service, and a bitmap server module that maintains references to active surfaces of the first application.
  • the bitmap server module may store a copy of the surface information of the first application screen responsive to an application interaction state change event indicating that a current interaction state of the first application is to be changed from the foreground state to a background state.
  • the first operating system may display a transition animation based on the copy of the surface information of the first application screen.
  • the transition animation may be displayed by an application space component of the first operating system.
  • the first operating system may be a mobile operating system.
  • the mobile operating system may include a surface manager module, and the bitmap server module may be implemented as a class in the surface manager module.
  • the bitmap server module may provide references to the copy of the surface information of the first application screen to the framework layer of the mobile operating system.
  • the bitmap server module may create a bitmap image from the copy of the surface information of the first application screen.
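The bitmap server's role described above, holding references to active surfaces and snapshotting them when an app leaves the foreground, might be sketched as follows. All names here (`Surface`, `BitmapServer`, `on_state_change`) are illustrative assumptions, not the actual surface manager API.

```python
class Surface:
    """Minimal stand-in for a graphics surface owned by the surface manager."""
    def __init__(self, app: str, pixels: bytes) -> None:
        self.app = app
        self.pixels = pixels

class BitmapServer:
    """Keeps references to active surfaces and copies them on demand,
    analogous to a class implemented inside the OS surface manager."""
    def __init__(self) -> None:
        self._active: dict[str, Surface] = {}
        self._snapshots: dict[str, bytes] = {}

    def register(self, surface: Surface) -> None:
        self._active[surface.app] = surface

    def on_state_change(self, app: str) -> None:
        # Copy the surface data before the app's rendering is suspended;
        # the copy survives even if the live surface is later torn down.
        self._snapshots[app] = bytes(self._active[app].pixels)

    def bitmap_for(self, app: str) -> bytes:
        # Framework-layer components (e.g., a transition animation)
        # fetch the last-known screen image here.
        return self._snapshots[app]

server = BitmapServer()
server.register(Surface("browser", b"\x01\x02\x03"))
server.on_state_change("browser")
preview = server.bitmap_for("browser")
```

The point of the copy is decoupling: the framework layer can render previews and transition animations from the snapshot without depending on the (possibly suspended) application still owning a live surface.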
  • FIG. 1 illustrates a computing environment that provides multiple user computing experiences, according to various embodiments.
  • FIG. 2 illustrates an exemplary system architecture for a mobile computing device, according to various embodiments.
  • FIG. 3 illustrates an operating system architecture for a computing environment, according to various embodiments.
  • FIG. 4 illustrates an operating system architecture for a computing environment, according to various embodiments.
  • FIG. 5 illustrates aspects of a kernel for a computing environment, according to various embodiments.
  • FIG. 6 illustrates an operating system architecture for a computing environment, according to various embodiments.
  • FIG. 7 illustrates a computing environment with multiple active user environments, according to various embodiments.
  • FIG. 8 illustrates a computing environment including a mobile computing device, according to various embodiments.
  • FIG. 9 illustrates aspects of cross-environment assortment of application windows, according to various embodiments.
  • FIG. 10 illustrates aspects of graphical cross-environment application navigation, according to various embodiments.
  • FIG. 11 illustrates aspects of a computing architecture supporting cross-environment application navigation and assortment, according to various embodiments.
  • FIG. 12 illustrates a flow diagram for managing graphical navigation in a computing environment, according to various embodiments.
  • The present disclosure is generally directed to managing navigation of foreground and background applications in a computing environment with multiple active user environments. More particularly, applications or “Apps” may be running on a mobile operating system (“OS”) of a mobile computing device that generally defines a first active user environment.
  • the mobile OS typically presents a single active application (i.e., foreground application) at a time through a graphical user interface (“GUI”) of the mobile operating system.
  • GUI graphical user interface
  • Other applications may be running on the mobile operating system but not actively displayed (i.e., background applications).
  • processes of background applications related to displaying graphics information and accepting user input are suspended or paused by the mobile operating system.
  • Navigation among foreground and background applications running on the mobile OS on the mobile computing device typically consists of navigating away from the foreground application (e.g., back to a home screen, etc.) before selecting an icon associated with a background application.
  • In a computing environment with multiple active user environments, it may be desirable to present user interfaces for multiple concurrent applications within an active user environment of the computing environment. Additionally, it may be desirable to graphically browse and/or navigate through applications running on a first operating system through a second active user environment. For example, cross-environment application browsing and/or navigation using preview representations of application screens may allow faster cross-environment application navigation, as previews provide a visual representation of an application state.
  • a mobile computing device running a mobile operating system defines a first active user environment.
  • a second active user environment may be connected to the mobile computing device.
  • Mobile OS applications may be accessed from and actively displayed through the second active user environment.
  • the second active user environment may be associated with a second operating system (e.g., a desktop operating system) running on the mobile computing device.
  • Disclosed embodiments present a seamless computing experience in a computing environment with multiple active user environments by automatically presenting application screens across user environments in certain conditions.
  • Other disclosed embodiments support graphical navigation of foreground and background applications of the mobile operating system across multiple active user environments. For example, preview screens of foreground and background applications running on the mobile operating system may be displayed within the second active user environment to allow a user to navigate quickly between applications running on the mobile operating system.
  • Disclosed techniques allow graphical cross-environment application navigation even where the mobile operating system does not maintain graphical information for background applications of the mobile OS. For example, graphical and/or user input processes of background applications may be paused, stopped, suspended, and/or killed.
  • a last graphical representation of a mobile OS application is captured before the application is moved from a foreground state to a background state and graphical information and/or user input processes of the application are paused or suspended.
  • the last graphical representation for mobile OS applications may be maintained as bitmap images or graphics surface information. While the foreground/background application navigation techniques presented in the disclosure are discussed with reference to a mobile computing device and various docked terminal environments, the disclosure may, in various embodiments, be applied to other computing devices (e.g., laptop computers, tablet computers, desktop computers, etc.) and is not intended to be limited to handheld mobile computing devices unless otherwise explicitly specified.
  • FIG. 1 illustrates a computing environment 100 that provides multiple user computing experiences through multiple active user environments, according to various embodiments.
  • a first active user environment 115 of computing environment 100 is defined by display(s) 116 , touch screen sensor(s) 117 , and/or I/O devices 118 of mobile computing device 110 .
  • the display(s) 116 may be operative to display a displayed image or “screen.”
  • the term display is intended to connote device hardware, whereas screen is intended to connote the displayed image produced on the display.
  • a display is physical hardware that is operable to present a screen to the user.
  • a screen may encompass a majority of one or more displays.
  • a screen may occupy substantially all of the display area of one or more displays except for areas dedicated to other functions (e.g. menu bars, status bars, etc.).
  • a screen may be associated with an application and/or an operating system executing on the mobile computing device 110 .
  • applications may have various kinds of screens that are capable of being manipulated as will be described further below.
  • mobile computing device 110 When mobile computing device 110 is operated as a stand-alone mobile device, active user environment 115 presents a typical mobile computing user experience.
  • mobile computing device 110 typically includes mobile telephony capabilities and user interaction features suited to a mobile computing use model.
  • mobile computing device 110 may present a graphical user interface (“GUI”) suited to active user environment 115 including display(s) 116 , touch-screen sensor(s) 117 , and/or I/O device(s) 118 .
  • GUI graphical user interface
  • the user may interact with application programs (i.e., “Apps”) running on mobile computing device 110 through an application screen including various interactive features (e.g., buttons, text fields, toggle fields, etc.) presented on display(s) 116 .
  • Apps application programs
  • the user interacts with these interactive features by way of I/O device(s) 118 .
  • the user interacts with these features by way of touch-screen sensor(s) 117 using gestures and symbols that are input to touch screen sensor(s) 117 using the user's fingers or a stylus.
  • the user interacts with these features using a combination of I/O device(s) 118 and touch-screen sensor(s) 117 .
  • FIG. 2 illustrates an exemplary hardware system architecture for mobile computing device 110 , according to various embodiments.
  • Mobile computing device 110 includes mobile processor 114 with one or more CPU cores 204 and external display interface 220 .
  • mobile computing device 110 also includes memory 206 , storage devices 208 , touch-screen display controller 212 connected to touch-screen display(s) 116 and/or touch-screen sensor(s) 117 , I/O devices 118 , power management IC 214 connected to battery 216 , cellular modem 218 , communication devices 222 , and/or other devices 224 that are connected to processor 114 through various communication signals and interfaces.
  • I/O devices 118 generally include buttons and other user interface components that may be employed in mobile computing device 110.
  • I/O devices 118 may include a set of buttons (e.g., back, menu, home, search, etc.), off-screen gesture area, click-wheel, scroll-wheel, QWERTY keyboard, etc.
  • Other devices 224 may include, for example, GPS devices, LAN connectivity, microphones, speakers, cameras, accelerometers, gyroscopes, magnetometers, and/or MS/MMC/SD/SDIO card interfaces.
  • External display interface 220 may be any suitable display interface (e.g., VGA, DVI, HDMI, wireless, etc.).
  • One or more sensor devices of the mobile computing device 110 may be able to monitor the orientation of the mobile computing device with respect to gravity. For example, using an accelerometer, gyroscope, inclinometer, or magnetometer, or some combination of these sensors, mobile computing device 110 may be able to determine whether it is substantially in a portrait orientation (meaning that a long axis of the display(s) 116 is oriented vertically) or substantially in a landscape orientation with respect to gravity. These devices may further provide other control functionality by monitoring the orientation and/or movement of the mobile computing device 110.
  • The term orientation sensor is intended to mean some combination of sensors (e.g., accelerometer, gyroscope, inclinometer, magnetometer, etc.) that may be used to determine the orientation of a device with respect to gravity, and is not intended to be limited to any particular sensor type or technology.
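As a rough illustration of how an orientation sensor reading might be classified (a sketch under assumed axis conventions, not the patent's method), one can compare the gravity component along the display's long axis against the short axis, treating a mostly face-up device as indeterminate:

```python
import math

def classify_orientation(ax: float, ay: float, az: float,
                         flat_threshold_deg: float = 30.0) -> str:
    """Classify device orientation from accelerometer axes (m/s^2).

    Assumed convention: y runs along the display's long axis,
    x along the short axis, z out of the screen.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)
    # When gravity is mostly on z, the device lies flat and the
    # portrait/landscape distinction is indeterminate.
    if g == 0 or abs(az) / g > math.cos(math.radians(flat_threshold_deg)):
        return "flat"
    # Long axis closer to vertical than the short axis => portrait.
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```

Held upright, gravity falls mostly on the y axis and the function reports portrait; turned sideways, gravity shifts to x and it reports landscape. A real implementation would also smooth readings and apply hysteresis near the diagonal to avoid flicker.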
  • Processor 114 may be an ARM-based mobile processor.
  • mobile processor 114 is a mobile ARM-based processor such as Texas Instruments OMAP3430, Marvell PXA320, Freescale iMX51, or Qualcomm QSD8650/8250.
  • mobile processor 114 may be another suitable ARM-based mobile processor or processor based on other processor architectures such as, for example, x86-based processor architectures or other RISC-based processor architectures.
  • While FIG. 2 illustrates one exemplary hardware implementation 112 for mobile computing device 110, other architectures are contemplated as within the scope of the invention.
  • various components illustrated in FIG. 2 as external to mobile processor 114 may be integrated into mobile processor 114 .
  • external display interface 220 shown in FIG. 2 as integrated into mobile processor 114 , may be external to mobile processor 114 .
  • other computer architectures employing a system bus, discrete graphics processor, and/or other architectural variations are suitable for employing aspects of the present invention.
  • Secondary terminal environment 140 may be some combination of visual rendering devices (e.g., monitor or display) 140 , I/O devices (e.g., mouse, touch pad, touch-screen, keyboard, etc.) 146 , and other computing peripherals (e.g., HDD, optical disc drive, memory stick, camera, printer, GPS, accelerometer, etc.) 148 connected to mobile computing device 110 by connecting port 142 on secondary terminal environment 140 with port 120 on mobile computing device 110 through interface 122 .
  • Interface 122 may be some combination of wired (e.g., USB, Firewire, Thunderbolt, HDMI, VGA, etc.) or wireless (e.g., Bluetooth, WiFi, Wireless HDMI, etc.) interfaces. While secondary terminal environments may have some processing or logic elements such as microcontrollers or other application specific integrated circuits (“ASICs”), they typically do not have a processor that runs a separate instance of an operating system.
  • Secondary terminal environments that define a second user environment may be suited for one or more of various use models, depending on the components that make up the secondary terminal environment.
  • Some secondary terminal environments may be associated with a user computing experience that is similar to the user computing experience of the mobile computing device 110 , while others may provide a user computing experience more traditionally associated with desktop computing.
  • secondary terminal environment 140 may be a device that includes a display 144 with a corresponding touch-screen sensor 146 that serves as the primary user input for the device.
  • This type of secondary terminal environment may be called a tablet-style secondary terminal environment. While a tablet-style secondary terminal environment may have a larger touch-screen display than mobile computing device 110 , the user experience of this type of secondary terminal environment may be similar in some ways to the user experience of mobile computing device 110 .
  • a tablet-style secondary terminal environment includes a 10.1-inch diagonal (1280×800 resolution) touch-enabled display, a standard set of buttons (e.g., back, menu, home, search, etc.), one or more cameras, and an off-screen gesture area.
  • a tablet-style secondary terminal environment may include other peripheral devices 148 that may be used to influence the configuration of applications presented to the user on the tablet-style secondary terminal environment.
  • a tablet-style secondary terminal environment may include a GPS receiver, accelerometer, gyroscope, magnetometer, and/or other sensors for determining its location and/or orientation.
  • a notebook-style secondary terminal environment generally includes a display screen 144 , keyboard and pointing device(s) 146 , and/or other peripheral devices 148 in a clam-shell type enclosure.
  • a laptop or notebook-style secondary terminal environment may be known as a “Smart Display” or “LapDock.” Because this type of secondary terminal environment includes a larger display, keyboard, and pointing device(s), it typically has a user computing experience associated with a desktop computing experience. In this regard, this type of secondary terminal environment may not have a similar user experience profile to mobile computing device 110 .
  • a notebook-style secondary terminal environment may include other peripheral devices that may be used to influence the configuration of applications presented to the user on the secondary terminal environment.
  • a notebook-style secondary terminal environment may include a GPS receiver, accelerometer, gyroscope, magnetometer, and/or other sensors for determining its location and/or orientation.
  • the various secondary terminal environments may also include a variety of generic input/output device peripherals that make up a typical desktop computing environment.
  • the I/O devices may be connected through a docking hub (or “dock cradle”) that includes port 142 and one or more device I/O ports for connecting various commercially available display monitors 144 , I/O devices 146 , and/or other peripheral devices 148 .
  • a docking hub may include a display port (e.g., VGA, DVI, HDMI, Wireless HDMI, etc.), and generic device ports (e.g., USB, Firewire, etc.).
  • a user may connect a commercially available display, keyboard, and pointing device(s) to the docking hub.
  • this secondary terminal environment will be suited to a desktop computing experience.
  • this type of secondary terminal environment may be suited to a computing experience designed around the use of a pointing device(s) and physical keyboard to interact with a user interface on the display.
  • mobile computing device 110 includes multiple operating systems running concurrently and independently on a shared kernel. Concurrent execution of a mobile OS and a desktop OS on a shared kernel is described in more detail in U.S. patent application Ser. No. 13/217,108, filed Aug. 24, 2011, entitled “MULTI-OPERATING SYSTEM,” herein incorporated by reference. In this way, a single mobile computing device can concurrently provide a mobile computing experience through a first user environment associated with a mobile OS and a desktop computing experience through a second user environment associated with a full desktop OS.
  • FIG. 3 illustrates OS architecture 300 that may be employed to run mobile OS 130 and desktop OS 160 concurrently on mobile computing device 110 , according to various embodiments.
  • mobile OS 130 and desktop OS 160 are independent operating systems running concurrently on shared kernel 320 .
  • mobile OS 130 and desktop OS 160 are considered independent and concurrent because they are running on shared kernel 320 at the same time and may have separate and incompatible user libraries, graphics systems, and/or framework layers.
  • mobile OS 130 and desktop OS 160 may both interface to shared kernel 320 through the same kernel interface 322 (e.g., system calls, etc.).
  • shared kernel 320 manages task scheduling for processes of both mobile OS 130 and desktop OS 160 concurrently.
  • shared kernel 320 runs directly on mobile processor 114 of mobile computing device 110 , as illustrated in FIG. 3 .
  • shared kernel 320 directly manages the computing resources of processor 114 such as CPU scheduling, memory access, and I/O.
  • hardware resources are not virtualized, meaning that mobile OS 130 and desktop OS 160 make system calls through kernel interface 322 without virtualized memory or I/O access.
  • Functions and instructions for OS architecture 300 may be stored as computer program code on a tangible computer readable medium of mobile computing device 110 .
  • instructions for OS architecture 300 may be stored in storage device(s) 208 of mobile computing device 110 .
  • mobile OS 130 has libraries layer 330 , application framework layer 340 , and application layer 350 .
  • applications 352 and 354 run in application layer 350 supported by application framework layer 340 of mobile OS 130 .
  • Application framework layer 340 includes manager(s) 342 and service(s) 344 that are used by applications running on mobile OS 130 .
  • Libraries layer 330 includes user libraries 332 that implement common functions such as I/O and string manipulation, graphics functions, database capabilities, communication capabilities, and/or other functions and capabilities.
  • Application framework layer 340 may include a window manager, activity manager, package manager, resource manager, telephony manager, gesture controller, and/or other managers and services for the mobile environment.
  • Application framework layer 340 may include a mobile application runtime environment that executes applications developed for mobile OS 130 .
  • the mobile application runtime environment may be optimized for mobile computing resources such as lower processing power and/or limited memory space.
  • Applications running on mobile OS 130 may be composed of multiple application components that perform the functions associated with the application, where each component is a separate process.
  • a mobile OS application may be composed of processes for displaying graphical information, handling user input, managing data, communicating with other applications/processes, and/or other types of processes.
  • desktop OS 160 has libraries layer 360 , framework layer 370 , and application layer 380 .
  • applications 382 and 384 run in application layer 380 supported by application framework layer 370 of desktop OS 160 .
  • Application framework layer 370 includes manager(s) 372 and service(s) 374 that are used by applications running on desktop OS 160 .
  • application framework layer 370 may include a window manager, activity manager, package manager, resource manager, and/or other managers and services common to a desktop environment.
  • Libraries layer 360 may include user libraries 362 that implement common functions such as I/O and string manipulation, graphics functions, database capabilities, communication capabilities, and/or other functions and capabilities.
  • mobile operating systems typically do not use the same graphics environment as desktop operating systems.
  • graphics environments for mobile OSs are designed for efficiency and the specific user input devices of a mobile computing environment.
  • display devices of mobile computing devices are typically too small to present multiple active application screens at the same time.
  • most mobile OS GUIs present a single active application screen that consumes all or substantially all of the active area of the display of the mobile computing device.
  • presenting a single active application screen at a time allows the mobile OS to shut down or suspend graphical and/or user interaction processes of background applications. Shutting down or suspending background application processes conserves power, which is critical to providing long battery life in a mobile computing device.
  • desktop OSs typically provide a multi-tasking user interface where more than one application screen may be presented through the desktop OS GUI at the same time.
  • Graphics information for multiple applications may be displayed within windows of the GUI of the desktop operating system that are cascaded, tiled, and/or otherwise displayed concurrently in overlapping or non-overlapping fashion.
  • This type of graphical environment provides for greater flexibility because multiple applications may be presented through multiple active application screens. While only a single application may have the input focus (i.e., the application to which input such as keyboard entry is directed), switching back and forth between applications does not require resuming or restarting the application and rebuilding the application screen.
  • switching input focus back and forth between active applications may involve selecting an application window or switching the focus to an application window by placing the mouse pointer over the application window.
  • desktop OSs typically maintain graphics information (i.e., graphics and user input processes continue to run) for all running applications, whether the application screens associated with the applications are active or not (e.g., in the background, etc.).
  • maintaining multiple active application screens requires greater processing and system resources.
  • mobile OS 130 and desktop OS 160 may be independent operating systems with incompatible user libraries, graphics systems, and/or application frameworks. Therefore, applications developed for mobile OS 130 may not run directly on desktop OS 160, and applications developed for desktop OS 160 may not run directly on mobile OS 130.
  • application 352 running in application layer 350 of mobile OS 130 , may be incompatible with desktop OS 160 , meaning that application 352 could not run on desktop OS 160 .
  • application 352 may depend on manager(s) 342 , service(s) 344 , and/or libraries 332 of mobile OS 130 that are either not available or not compatible with manager(s) 372 , service(s) 374 , and/or libraries 362 of desktop OS 160 .
  • desktop OS 160 runs in a separate execution environment from mobile OS 130 .
  • mobile OS 130 may run in a root execution environment and desktop OS 160 may run in a secondary execution environment established under the root execution environment.
  • Processes and applications running on mobile OS 130 access user libraries 332 , manager(s) 342 and service(s) 344 in the root execution environment.
  • Processes and applications running on desktop OS 160 access user libraries 362 , manager(s) 372 and service(s) 374 in the secondary execution environment.
  • the most widely adopted mobile OS is Google's Android. While Android is based on Linux, it includes modifications to the kernel and other OS layers for the mobile environment and mobile processors. In particular, while the Linux kernel is designed for a PC (i.e., x86) CPU architecture, the Android kernel is modified for ARM-based mobile processors. Android device drivers are also particularly tailored for devices typically present in a mobile hardware architecture including touch-screens, mobile connectivity (GSM/EDGE, CDMA, Wi-Fi, etc.), battery management, GPS, accelerometers, and camera modules, among other devices. In addition, Android does not have a native X Window System nor does it support the full set of standard GNU libraries, and this makes it difficult to port existing GNU/Linux applications or libraries to Android.
  • Apple's iOS operating system (run on the iPhone) and Microsoft's Windows Phone 7 are similarly modified for the mobile environment and mobile hardware architecture.
  • although iOS is derived from the Mac OS X desktop OS, common Mac OS X applications do not run natively on iOS.
  • iOS applications are developed through a standard developer's kit (“SDK”) to run within the “Cocoa Touch” runtime environment of iOS, which provides basic application infrastructure and support for key iOS features such as touch-based input, push notifications, and system services. Therefore, applications written for Mac OS X do not run on iOS without porting.
  • an Android mobile OS and a full Linux OS run independently and concurrently on a modified Android kernel.
  • the Android OS may be a modified Android distribution while the Linux OS (“Hydroid”) may be a modified Debian Linux desktop OS.
  • FIGS. 4-6 illustrate Android mobile OS 430 , Android kernel 520 , and Hydroid OS 660 that may be employed in OS architecture 300 in more detail, according to various embodiments.
  • Android OS 430 includes a set of C/C++ libraries in libraries layer 432 that are accessed through application framework layer 440 .
  • Libraries layer 432 includes the “bionic” system C library 439 that was developed specifically for Android to be smaller and faster than the “glibc” Linux C-library.
  • Libraries layer 432 also includes inter-process communication (“IPC”) library 436 , which includes the base classes for the “Binder” IPC mechanism of the Android OS. Binder was developed specifically for Android to allow communication between processes and services.
  • Libraries layer 432 also includes libraries 435 that support recording and playback of media formats, surface manager 434 that manages access to the display subsystem and composites graphic layers from multiple applications, 2D and 3D graphics engines 438, and lightweight relational database engine 437.
  • Other libraries that may be included in libraries layer 432 but are not pictured in FIG. 4 include bitmap and vector font rendering libraries, utilities libraries, browser tools (e.g., WebKit, etc.), and/or secure communication libraries (e.g., SSL, etc.).
  • Application framework layer 440 of Android OS 430 provides a development platform that allows developers to use components of the device hardware, access location information, run background services, set alarms, add notifications to the status bar, etc. Framework layer 440 also allows applications to publish their capabilities and make use of the published capabilities of other applications.
  • Components of application framework layer 440 of Android mobile OS 430 include activity manager 441 , resource manager 442 , window manager 443 , dock manager 444 , hardware and system services 445 , desktop monitor service 446 , multi-display manager 447 , and remote communication service 448 .
  • Other components that may be included in framework layer 440 of Android mobile OS 430 include a view system, telephony manager, package manager, location manager, and/or notification manager, among other managers and services.
  • Applications running on Android OS 430 run within the Dalvik virtual machine 431 in the Android runtime environment 433 on top of the Android object-oriented application framework.
  • Dalvik virtual machine 431 is a register-based virtual machine, and runs a compact executable format that is designed to reduce memory usage and processing requirements.
  • Applications running on Android OS 430 include home screen 451 , email application 452 , phone application 453 , browser application 454 , and/or other application(s) (“App(s)”) 455 .
  • Each application may include one or more application components including activities which define application screens through which the user interfaces with the application. That is, activities are processes within the Android runtime environment that manage user interaction through application screens.
  • Other application components include services for performing long-running operations or non-user interface features, content providers for managing shared data, and broadcast receivers for responding to system broadcast messages.
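The application components described above (activities, services, content providers, broadcast receivers) can be sketched as a simple data model. This is a conceptual illustration only; the application name and component names are hypothetical, and this is not the Android application framework API.

```python
# Illustrative sketch of a mobile OS application composed of multiple
# components, each running as a separate process. All names here are
# assumptions for illustration, not real framework identifiers.
APP_COMPONENTS = {
    "email_app": [
        ("activity", "inbox_screen"),      # manages user interaction through an application screen
        ("service", "sync_service"),       # long-running or non-user-interface work
        ("content_provider", "mail_db"),   # manages shared data
        ("broadcast_receiver", "on_boot"), # responds to system broadcast messages
    ],
}

kinds = {kind for kind, _ in APP_COMPONENTS["email_app"]}
assert kinds == {"activity", "service", "content_provider", "broadcast_receiver"}
```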
  • the Android OS graphics system uses a client/server model.
  • a surface manager (“SurfaceFlinger”) is the graphics server and applications are the clients. SurfaceFlinger maintains a list of display ID's and keeps track of assigning applications to display ID's.
  • mobile computing device 110 has multiple touch screen displays 116 .
  • display ID 0 is associated with one of the touch screen displays 116 and display ID 1 is associated with the other touch screen display 116 .
  • Display ID 2 is associated with both touch screen displays 116 (i.e., the application is displayed on both displays at the same time).
  • For each display device associated with a display ID, Android maintains a graphics context and frame buffer associated with the display device.
  • display ID's greater than 2 are virtual displays, meaning that they are not associated with a display physically present on mobile computing device 110 .
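The display-ID scheme above (ID 0 and ID 1 for the two touch-screen displays, ID 2 for both, higher IDs virtual) can be sketched as a lookup function. The function name and return convention are assumptions for illustration, not the actual SurfaceFlinger interface.

```python
# Hypothetical sketch of the display-ID assignment described above;
# names are illustrative, not real SurfaceFlinger code.
def displays_for_id(display_id):
    """Map a display ID to the physical touch-screen display(s) it targets."""
    if display_id == 0:
        return [0]        # first touch-screen display 116
    if display_id == 1:
        return [1]        # second touch-screen display 116
    if display_id == 2:
        return [0, 1]     # displayed on both displays at the same time
    return []             # IDs greater than 2 are virtual displays

assert displays_for_id(2) == [0, 1]
assert displays_for_id(5) == []   # virtual: no physical display
```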
  • Graphics information for Android applications and/or activities includes windows, views, and canvasses. Each window, view, and/or canvas is implemented with an underlying surface object.
  • Surface objects are double-buffered (front and back buffers) and synchronized across processes for drawing.
  • SurfaceFlinger maintains all surfaces in a shared memory pool which allows all processes within Android to access and draw into them without expensive copy operations and without using a server-side drawing protocol such as X-Windows.
  • Applications always draw into the back buffer while SurfaceFlinger reads from the front buffer.
  • SurfaceFlinger creates each surface object, maintains all surface objects, and also maintains a list of surface objects for each application. When the application finishes drawing in the back buffer, it posts an event to SurfaceFlinger, which swaps the back buffer to the front and queues the task of rendering the surface information to the frame buffer.
  • SurfaceFlinger monitors all window change events. When one or more window change events occur, SurfaceFlinger renders the surface information to the frame buffer for one or more displays.
  • Rendering includes compositing the surfaces, i.e., composing the final image frame based on dimensions, transparency, z-order, and visibility of the surfaces. Rendering may also include hardware acceleration (e.g., OpenGL 2D and/or 3D interface for graphics processing hardware). SurfaceFlinger loops over all surface objects and renders their front buffers to the frame buffer in their Z order.
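The double-buffered surface and Z-order compositing behavior described above can be modeled with a small sketch. This is a conceptual toy (the class and pixel representation are assumptions), not SurfaceFlinger code: clients draw into the back buffer, posting swaps it to the front, and compositing loops over surfaces in Z order so higher-Z content ends up on top.

```python
# Conceptual sketch of double-buffered surfaces composited in Z order.
# Buffers are modeled as dicts of {(x, y): content}; all names assumed.
class Surface:
    def __init__(self, z, front=None):
        self.z = z
        self.front = front   # buffer the server (compositor) reads from
        self.back = None     # buffer the client application draws into

    def post(self):
        # client finished drawing: swap the back buffer to the front
        self.front, self.back = self.back, self.front

def composite(surfaces):
    """Render front buffers to the frame buffer in ascending Z order,
    so surfaces with higher Z overwrite those beneath them."""
    frame = {}
    for s in sorted(surfaces, key=lambda s: s.z):
        if s.front is not None:
            frame.update(s.front)   # higher-Z pixels overwrite lower-Z ones
    return frame

background = Surface(z=0, front={(0, 0): "bg", (1, 0): "bg"})
window = Surface(z=1)
window.back = {(0, 0): "win"}   # client draws into the back buffer
window.post()                   # swap: drawn content becomes visible
frame = composite([background, window])
assert frame[(0, 0)] == "win"   # higher-Z surface wins where they overlap
assert frame[(1, 0)] == "bg"
```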
  • FIG. 5 illustrates modified Android kernel 520 in more detail, according to various embodiments.
  • Modified Android kernel 520 includes touch-screen display driver 521 , camera driver(s) 522 , Bluetooth driver(s) 523 , shared memory allocator 524 , IPC driver(s) 525 , USB driver(s) 526 , WiFi driver(s) 527 , I/O device driver(s) 528 , and/or power management module 530 .
  • I/O device driver(s) 528 includes device drivers for external I/O devices, including devices that may be connected to mobile computing device 110 through port 120 .
  • Modified Android kernel 520 may include other drivers and functional blocks including a low memory killer, kernel debugger, logging capability, and/or other hardware device drivers.
  • FIG. 6 illustrates Hydroid OS 660 in more detail, according to various embodiments.
  • Hydroid is a full Linux OS that is capable of running almost any application developed for standard Linux distributions.
  • libraries layer 662 of Hydroid OS 660 includes Linux libraries that support networking, graphics processing, database management, and other common program functions.
  • user libraries 662 may include the “glibc” Linux C library 664 , Linux graphics libraries 662 (e.g., GTK, OpenGL, etc.), Linux utilities libraries 661 , Linux database libraries, and/or other Linux user libraries.
  • Applications run on Hydroid within an X-Windows Linux graphical environment using X-Server 674 , window manager 673 , and/or desktop environment 672 .
  • Illustrated applications include word processor 681 , email application 682 , spreadsheet application 683 , browser 684 , and other application(s) 685 .
  • the Linux OS graphics system is based on the X-windows (or "X11") graphics system.
  • X-windows is a platform-independent, networked graphics framework.
  • X-windows uses a client/server model where the X-server is the graphics server and applications are the clients.
  • the X-server controls input/output hardware associated with the Linux OS such as displays, touch-screen displays, keyboards, pointing device(s), etc.
  • X-windows provides a server-side drawing graphics architecture, i.e., the X-server maintains the content for drawables including windows and pixmaps.
  • X-clients communicate with the X-server by exchanging data packets that describe drawing operations over a communication channel.
  • X-clients access the X communication protocol through a library of standard routines (the “Xlib”). For example, an X-client may send a request to the X-server to draw a rectangle in the client window.
  • the X-server sends input events to the X-clients, for example, keyboard or pointing device input, and/or window movement or resizing. Input events are relative to client windows. For example, if the user clicks when the pointer is within a window, the X-server sends a packet that includes the input event to the X-client associated with the window that includes the action and positioning of the event relative to the window.
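The client/server exchange just described can be sketched as a toy model: clients send drawing requests, the server maintains the drawable's content (server-side drawing), and the server delivers input events with coordinates relative to the client window. All class, method, and window names here are assumptions for illustration; this is not Xlib or the real X protocol.

```python
# Toy model of the X-windows client/server exchange described above.
class XServer:
    def __init__(self):
        self.drawables = {}   # window id -> stored drawing operations
        self.windows = {}     # window id -> (x, y) origin on screen

    def create_window(self, wid, origin):
        self.windows[wid] = origin
        self.drawables[wid] = []

    def handle_request(self, wid, op):
        # server-side drawing: the server maintains the drawable's content
        self.drawables[wid].append(op)

    def pointer_event(self, screen_x, screen_y):
        # deliver the event to the window, with window-relative coordinates
        for wid, (ox, oy) in self.windows.items():
            return wid, (screen_x - ox, screen_y - oy)

server = XServer()
server.create_window("client1", origin=(100, 50))
# an X-client requests that a rectangle be drawn in its window:
server.handle_request("client1", ("draw_rect", 10, 10, 40, 20))
wid, rel = server.pointer_event(110, 60)
assert wid == "client1" and rel == (10, 10)   # positioning relative to the window
```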
  • Hydroid OS 660 includes components of a cross-environment communication framework that facilitates communication with Android OS 430 through shared kernel 520 . These components include IPC library 663 that includes the base classes for the Binder IPC mechanism of the Android OS and remote communications service 671 .
  • Hydroid OS 660 is run within a chrooted (created with the ‘chroot’ command) secondary execution environment created within the Android root environment. Processes and applications within Hydroid OS 660 are run within the secondary execution environment such that the apparent root directory seen by these processes and applications is the root directory of the secondary execution environment. In this way, Hydroid OS 660 can run programs written for standard Linux distributions without modification because Linux user libraries 662 are available to processes running on Hydroid OS 660 in the chrooted secondary execution environment.
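The effect of the chrooted secondary execution environment can be illustrated with a path-translation sketch: a path that appears absolute to a process inside the chroot actually resolves under the secondary environment's directory tree. The secondary-root location and function name below are hypothetical, chosen only to make the idea concrete.

```python
# Illustration of how a chrooted secondary execution environment changes
# the apparent root directory. The path "/data/hydroid" is an assumed,
# hypothetical location for the secondary environment.
import os.path

SECONDARY_ROOT = "/data/hydroid"

def resolve_in_chroot(apparent_path):
    """Translate a path as seen inside the chroot into the real path
    seen by the root (Android) execution environment."""
    return os.path.normpath(SECONDARY_ROOT + "/" + apparent_path.lstrip("/"))

# inside the chroot, "/lib/libc.so" really lives under the secondary root:
assert resolve_in_chroot("/lib/libc.so") == "/data/hydroid/lib/libc.so"
```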
  • mobile computing device 110 may associate a connected secondary terminal environment 140 with desktop OS 160 .
  • computing environment 100 presents a first computing experience through a first active user environment 115 associated with mobile OS 130 , and, concurrently, a second computing experience through the second active user environment 140 associated with desktop OS 160 .
  • FIG. 7 illustrates a computing environment 700 with multiple active user environments, according to various embodiments.
  • the mobile computing device 110 presents a first active user environment associated with the mobile OS 130 that includes touch-screen display(s) 116 and other I/O devices 118 of mobile computing device 110 .
  • the user interface 750 of the mobile OS 130 is displayed on touch-screen display 116 .
  • mobile computing device 110 is docked to a second active user environment 140 through dock interface 122 .
  • the second active user environment 140 includes display monitor 144, keyboard 146-1, and/or pointing device 146-2.
  • the second active user environment 140 provides a desktop-like computing experience.
  • the mobile computing device 110 may associate desktop OS 160 with the second active user environment 140 such that the user interface 780 of the desktop OS 160 is displayed on the display 144 of the second active user environment 140 .
  • mobile computing device 110 is connected to components of the second user environment 140 through a dock cradle 141 and/or dock cable 143 .
  • dock interface 122 may include different connectors including other wired or wireless connections to components of the second active user environment 140 .
  • the first active user environment defined by mobile computing device 110 and the second active user environment 140 may provide different computing experiences.
  • mobile OS 130 and desktop OS 160 may have different sets of available applications, meaning that at least some applications available on mobile OS 130 are not available on desktop OS 160 and vice-versa.
  • the configuration of computing environment 700 provides the advantages of two separate active user environments suited to different computing experiences.
  • the user may wish to access various Apps and/or capabilities of one operating system through the active user environment associated with a different operating system.
  • the user may wish to access mobile telephony, location awareness capabilities, and/or other applications and/or services of mobile OS 130 through the second active user environment 140 associated with desktop OS 160 .
  • because the graphics systems of the two operating systems may be incompatible, an application running on mobile OS 130 may not be displayed within the user environment associated with desktop OS 160 simply by re-directing the graphics information from the graphics server of the mobile OS 130 to the graphics server of the desktop OS 160.
  • various techniques may be used to display application screens of applications running on mobile OS 130 within a console window of secondary user environment 140 associated with desktop OS 160 . These techniques are described in more detail in U.S. patent application Ser. No. 13/246,665, filed Sep. 27, 2011, entitled “INSTANT REMOTE RENDERING,” the entire contents of which are incorporated herein for all purposes. Accordingly, one or more application screens displayed in windows 782 , 784 , and/or 786 of computing environment 700 may correspond to applications running on mobile OS 130 .
  • FIG. 8 shows a computing environment 800 which illustrates mobile computing device 110 in an undocked state with multiple applications running on mobile OS 130 , according to various embodiments.
  • desktop OS 160 may be in a suspended state.
  • a first application 832 may be running on mobile OS 130 and displayed through application screen 852 on display 116 of the first active user environment 115 defined by mobile computing device 110 .
  • Other applications may be running on mobile OS 130 but not actively displayed.
  • application 834 may represent an application that has been started by the user and not explicitly closed.
  • the first application 832 is in a foreground interaction state and application 834 is in a background interaction state.
  • the user may have started application 834 in the foreground and subsequently started application 832 , which then replaced application 834 as the foreground application.
  • when application 834 is moved to the background of the first active user environment 115 it may be paused, stopped, and/or suspended.
  • mobile OS 130 does not continue to process instructions or pass user input to processes associated with application 834 beyond instructions that may be needed to put the application into the paused, stopped, and/or suspended state.
  • some processes associated with application 834 may be stopped or killed.
  • mobile OS 130 may destroy an application screen process associated with a suspended or stopped application if the corresponding memory is in demand for other processes. Accordingly, application screen processes for applications such as application 834 may not be depended on for graphics information once the application is in a background state.
  • pausing, stopping, and/or suspending background applications is preferred because it reduces processing requirements related to applications that are not being actively interacted with by the user.
  • the user may not be able to interact with more than one application at a time through the first active user environment 115 defined by mobile computing device 110.
  • the user can switch between applications (i.e., change the interaction states of applications) using several techniques including returning to a home screen and selecting the desired application.
  • Processes associated with background applications, including application screen processes may be restarted or resumed when the application is returned to the foreground state.
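The foreground/background interaction states described above can be sketched as a small state machine: only one application holds the foreground at a time, launching a new application moves the current foreground application to a paused background state, and a background application is resumed when returned to the foreground. Class and state names are assumptions for illustration, not the mobile OS's actual lifecycle API.

```python
# Simplified sketch of the foreground/background interaction states.
class App:
    def __init__(self, name):
        self.name = name
        self.state = "stopped"

    def to_foreground(self):
        # restarted or resumed when returned to the foreground
        self.state = "foreground"

    def to_background(self):
        # paused/stopped: no further user input is passed to the app
        self.state = "paused"

class MobileOS:
    """Only a single application screen is active at a time."""
    def __init__(self):
        self.foreground = None

    def launch(self, app):
        if self.foreground is not None:
            self.foreground.to_background()
        app.to_foreground()
        self.foreground = app

os_ = MobileOS()
app834, app832 = App("834"), App("832")
os_.launch(app834)
os_.launch(app832)              # 832 replaces 834 as the foreground application
assert app832.state == "foreground"
assert app834.state == "paused"
os_.launch(app834)              # returning 834 to the foreground resumes it
assert app834.state == "foreground"
```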
  • Embodiments are directed to providing a seamless workflow when mobile computing device 110 is docked with a secondary terminal environment by providing configurable cross-environment application screen behavior.
  • FIG. 9 shows computing environment 800 a which illustrates automatic cross-environment assortment of application screens, according to various embodiments.
  • mobile computing device 110 is docked to secondary terminal environment 140 through interface 122 .
  • mobile computing device 110 may recognize that secondary terminal environment 140 has a user experience profile associated with desktop OS 160 .
  • desktop OS 160 may be unsuspended and associated with secondary terminal environment 140 to provide a second active user environment through which the user can interact with desktop OS 160 .
  • one or more applications running on mobile OS 130 may be configured to be automatically displayed on display 144 of the second active user environment 140 using the cross-environment display techniques described above when mobile computing device 110 is docked. That is, one or more applications may have a user preference setting that determines if the application should be displayed within a second user environment when the mobile computing device 110 detects a docked condition.
  • the user preference setting may include multiple settings according to different user experience profiles of various secondary terminal environments.
  • user settings may determine that an application should be automatically displayed across user environments for a first type of secondary terminal environment (e.g., desktop-like secondary terminal environment), and display of the application maintained on the first user environment for a second type of secondary terminal environment (e.g., tablet-like secondary terminal environment).
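The per-profile user preference settings described above can be sketched as a lookup keyed on the application and the secondary terminal environment's user experience profile. The application names, profile labels, and default below are hypothetical, chosen only to make the configuration idea concrete.

```python
# Hypothetical per-profile preference lookup showing how an application's
# cross-environment display setting can vary with the user experience
# profile of the docked secondary terminal environment.
PREFERENCES = {
    # (app, profile) -> where the app is displayed when docked
    ("browser", "desktop-like"): "secondary",  # auto-display across environments
    ("browser", "tablet-like"): "primary",     # remain on the first user environment
}

def display_target(app, profile, default="primary"):
    return PREFERENCES.get((app, profile), default)

assert display_target("browser", "desktop-like") == "secondary"
assert display_target("browser", "tablet-like") == "primary"
assert display_target("phone", "desktop-like") == "primary"   # falls back to default
```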
  • applications 832 and 834 are displayed on display 144 of the second active user environment 140 automatically when the mobile computing device 110 is docked.
  • a first console application 962 displays application screen 952 associated with application 832 within console window 972 on display 144 .
  • a second console application 964 displays application screen 954 associated with application 834 within console window 974 on display 144 .
  • Assortment of console windows within the second active user environment 140 may occur in a variety of ways according to user preferences.
  • console windows 972 and 974 are tiled horizontally on display 144 .
  • other window assortment configurations may be chosen by the user by selecting configuration settings.
  • assortment of console windows for cross-environment application display may be done by cascading console windows or other display layering and/or tiling techniques.
  • window assortment configurations may be independently set according to user experience profiles of various secondary terminal environments.
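The horizontal tiling and cascading assortments described above reduce to computing window rectangles from the display size. The following sketch shows both layouts; the class and method names are illustrative only.

```java
// Hypothetical sketch of console-window assortment: horizontal tiling and
// cascading, two of the configurations a user might select.
public class WindowAssortment {
    // Each rectangle is {x, y, width, height}.
    public static int[][] tileHorizontally(int displayW, int displayH, int count) {
        int[][] rects = new int[count][4];
        int w = displayW / count; // each console window gets an equal column
        for (int i = 0; i < count; i++) {
            rects[i] = new int[] { i * w, 0, w, displayH };
        }
        return rects;
    }

    // Cascade: each window offset diagonally from the previous one,
    // layered in increasing Z-order.
    public static int[][] cascade(int w, int h, int count, int offset) {
        int[][] rects = new int[count][4];
        for (int i = 0; i < count; i++) {
            rects[i] = new int[] { i * offset, i * offset, w, h };
        }
        return rects;
    }
}
```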
  • the multiple active user environments may be used in a variety of ways to provide a seamless computing experience.
  • the user may interact with mobile OS 130 including applications available on mobile OS 130 through a first active user environment 115 .
  • the user may interact with desktop OS 160 including applications available on desktop OS 160 through the second active user environment 140 .
  • the user may interact with applications available on mobile OS 130 through the active user environment associated with desktop OS 160 using the cross-environment application display techniques described above.
  • the user may want to browse other applications available on mobile OS 130 through the second active user environment 140 .
  • menu icons or other elements of the GUI of desktop OS 160 may be used to browse and access applications available on mobile OS 130 .
  • the user may want to browse all running applications on mobile OS 130 (i.e., applications that are in a foreground interaction state and other applications that the user may have opened on mobile OS 130 and not explicitly shut down).
  • the lack of maintained or updated graphics information for background applications in mobile OS 130 presents issues for graphical cross-environment navigation and access of applications running in mobile OS 130 from the active user environment associated with desktop OS 160 .
  • Embodiments are directed to supporting graphical browsing and navigation in a computing environment with multiple active user environments.
  • FIG. 10 shows computing environment 800 b which illustrates cross environment application preview and navigation of foreground and background applications, according to various embodiments.
  • mobile OS 130 and desktop OS 160 run concurrently on shared kernel 320 of mobile computing device 110 .
  • Mobile OS 130 is associated with the first active user environment 115 through which the user may interact with application screens of applications running on mobile OS 130 .
  • Desktop OS 160 is associated with the second active user environment 140 through which the user may interact with application screens of applications running on desktop OS 160 .
  • a first application screen 852 associated with a first application 832 running on mobile OS 130 , is displayed on a display 116 of the first active user environment 115 .
  • a second application screen 1054 associated with a second application 834 running on mobile OS 130 , is displayed on the display 144 of the second user environment 140 .
  • application screen 1054 is displayed within a console window 1062 associated with a console application 1060 running on desktop OS 160 , as described above.
  • applications 1036 , 1038 , and/or 1040 may represent applications that have been started by the user and not explicitly closed as described above.
  • applications 1036 , 1038 , and/or 1040 may be paused, stopped, and/or suspended by mobile OS 130 .
  • this means that application screen processes associated with these applications may be stopped, killed, or otherwise not maintained by mobile OS 130 .
  • graphical information for application screen processes associated with these applications may not be maintained while the applications are in the paused, stopped, and/or suspended state.
  • mobile OS 130 includes functionality for capturing a last graphical representation of applications before the application is swapped from a foreground interaction state to a background interaction state.
  • mobile OS 130 may capture a bitmap image and/or a copy of surface information for a current state of an application screen of a foreground application just before the application is transitioned to the background (i.e., before the interaction state change from the foreground state to the background state).
  • the last graphical representation is captured in a bitmap server of the mobile OS as described in more detail below.
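The capture-before-transition behavior can be sketched as follows: the snapshot is taken while the application is still in the foreground, because once it is backgrounded the application screen process (and its graphics) may no longer be maintained. Names here are hypothetical, not the patent's.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: capture a last graphical representation of an
// application screen just before the foreground-to-background transition.
public class LastRepresentationCache {
    public enum State { FOREGROUND, BACKGROUND }

    private final Map<String, State> states = new HashMap<>();
    private final Map<String, byte[]> snapshots = new HashMap<>();

    public void start(String app) { states.put(app, State.FOREGROUND); }

    // Capture the screen *before* flipping the interaction state; after the
    // change, the screen process may be paused, stopped, or killed.
    public void moveToBackground(String app, byte[] currentScreenBitmap) {
        snapshots.put(app, currentScreenBitmap.clone());
        states.put(app, State.BACKGROUND);
    }

    public byte[] lastRepresentation(String app) { return snapshots.get(app); }
    public State stateOf(String app) { return states.get(app); }
}
```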
  • mobile OS 130 maintains an application activity stack 1042 that includes a list of applications that have been started on mobile OS 130 and not explicitly closed.
  • application activity stack 1042 may include place-holders for applications 832 , 834 , 1036 , 1038 , and 1040 .
  • For place-holders in application activity stack 1042 corresponding to applications 832 and 834 (i.e., applications that are in the foreground), mobile OS 130 may use the current graphics and/or surface information from the application screen process associated with the application.
  • For place-holders corresponding to applications 1036, 1038, and/or 1040 (i.e., applications currently in the background of mobile OS 130), the last graphical representation may be used to provide a graphical browsing feature within desktop OS 160 of preview screens associated with background applications of mobile OS 130.
  • desktop OS 160 may present a preview window 1070 that shows preview screens associated with applications in application activity stack 1042 .
  • preview screen 1072 may show a preview representation of a currently active application screen 852 associated with application 832 .
  • Preview screen 1074 may also show a preview representation of a currently active application screen 1054 associated with a currently active application 834 that is displayed within a console window 1062 associated with a console application 1060 of desktop OS 160 as described above.
  • preview window 1070 may show preview screens for background applications of mobile OS 130 .
  • preview window 1070 may present preview screens 1076 , 1078 , and/or 1080 that represent last graphical representations of application screens associated with background applications 1036 , 1038 , and/or 1040 .
  • Mobile OS 130 may capture these last graphical representations and store them with a list of applications (e.g., via application activity stack 1042 , etc.) that are currently running (i.e., started by the user and not explicitly closed).
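The place-holder resolution just described (live surface information for foreground applications, stored last captures for background ones) might be modeled as below. This is a sketch under assumed names; the patent does not specify these types.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// Hypothetical sketch of an application activity stack whose place-holders
// resolve to live graphics for foreground apps and to a captured last
// representation for background apps.
public class ActivityStack {
    public static final class Entry {
        public final String app;
        public boolean foreground = true;
        public byte[] lastCapture; // only meaningful once backgrounded
        public Entry(String app) { this.app = app; }
    }

    private final List<Entry> stack = new ArrayList<>();

    public Entry push(String app) { Entry e = new Entry(app); stack.add(e); return e; }

    public void background(String app, byte[] capture) {
        for (Entry e : stack) {
            if (e.app.equals(app)) { e.lastCapture = capture; e.foreground = false; }
        }
    }

    // Foreground place-holders query the live surface; background ones use
    // the stored last graphical representation.
    public byte[] previewFor(String app, Function<String, byte[]> liveSurface) {
        for (Entry e : stack) {
            if (e.app.equals(app)) {
                return e.foreground ? liveSurface.apply(e.app) : e.lastCapture;
            }
        }
        return null;
    }
}
```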
  • Preview window 1070 may be activated within desktop OS 160 in a variety of ways. As illustrated in FIG. 10, preview window 1070 may be activated by selecting a menu icon 1066 with a mouse pointer 1064 within the GUI of desktop OS 160. Upon receiving a user input indicating that a preview window is to be displayed, desktop OS 160 queries mobile OS 130 for a list of applications on the application activity stack and the associated graphical preview representations. In response to the query, mobile OS 130 returns a list corresponding to the application activity stack and provides desktop OS 160 with the graphical preview representations associated with the applications in the list. For example, mobile OS 130 may return a bitmap image for the preview screens or a pointer to a shared memory location that includes the preview screen graphics information.
  • the user may select preview screens within preview window 1070 to reconfigure the active application screens.
  • the user may select preview screen 1072 to switch the currently displayed user environment for application 832 from the first active user environment 115 to the second active user environment 140 .
  • the user may select any of preview screens 1076 , 1078 , or 1080 to resume the corresponding application 1036 , 1038 , or 1040 within the second active user environment 140 . That is, selecting one of preview screens 1076 , 1078 , or 1080 will open a new console window on display 144 (and corresponding console application running on desktop OS 160 ) and resume the corresponding application, with a restarted or resumed application screen associated with the application displayed within the new console window on display 144 . Accordingly, the user can graphically browse and navigate running applications of mobile OS 130 through preview window 1070 of desktop OS 160 , even where those applications are in a background state and graphical information for associated preview screens is unavailable.
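The preview-selection behavior can be sketched as: selecting a background application's preview opens a new console window and resumes the application there, while selecting a foreground application's preview simply re-targets its existing screen to the second environment. All names below are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of cross-environment navigation via preview selection.
public class PreviewNavigator {
    public static final class ConsoleWindow {
        public final String app;
        public ConsoleWindow(String app) { this.app = app; }
    }

    private final List<ConsoleWindow> consoleWindows = new ArrayList<>();

    // Background app: open a new console window on the secondary display and
    // resume the app inside it. Foreground app: no new window is needed; its
    // screen is redirected to the second active user environment.
    public ConsoleWindow select(String app, boolean isBackground) {
        if (isBackground) {
            ConsoleWindow w = new ConsoleWindow(app);
            consoleWindows.add(w);
            return w;
        }
        return null;
    }

    public int openConsoleWindows() { return consoleWindows.size(); }
}
```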
  • embodiments include an Android mobile OS 430 and a full Linux desktop OS 660 (“Hydroid”) running on a modified Android kernel 520 .
  • These embodiments may include a window position control feature within the Android OS 430 that includes functionality for managing application preview screens and providing application preview screen data to components within Hydroid OS 660.
  • FIG. 11 illustrates components of the window position control system 1100 of Android OS 430 that implement window position control features allowing the user to view and navigate through applications using multiple displays and/or multiple active user environments.
  • the components of window position control system 1100 include services, managers, and/or application level components of Android OS 430 .
  • the offscreen gesture controller 1104 (“OSGC”) and the navigation manager 1110 are user interface components that manipulate a linear model of the Android OS application stack by way of an intermediary known as the application model manager 1112.
  • the application model manager 1112 provides methods to both examine and modify the linear model, using the Android activity manager service 441 to do so. In effect, the application model manager 1112 transforms the stack view of Android OS 430 into the linear view presented in the navigation manager 1110.
  • Both the OSGC 1104 (together with the transition controller 1106) and the navigation manager 1110 manipulate snapshot views of application screens.
  • the navigation manager 1110 may use bitmap images of application screens, while the OSGC 1104 works with live (or apparently live) application screens themselves.
  • the bitmap server 1114 provides mechanisms for rapid capture, restore, and manipulation of the application screen images via JNI calls to C++ code designed to minimize image data transfers.
  • the bitmap server 1114 is a class that resides in the Android surface manager (i.e., SurfaceFlinger) 434 .
  • the bitmap server 1114 provides a highly efficient mechanism to manipulate windows such that the window position control system 1100 can provide intuitive visual effects for application browsing and navigation.
  • the bitmap server 1114 provides an interface to the Java layers (e.g., application layer 450 , runtime environment 431 ) such that window manipulation may be controlled from within Java layers.
  • the bitmap server 1114 maintains a handle and reference to all active surfaces so that window animations can be applied to any active surfaces and created based on the data within the active surface. When a surface is destroyed by SurfaceFlinger, the bitmap server persists a copy of the surface data so that it can be used to provide window animations for those applications that are no longer active.
  • the bitmap server 1114 also provides bitmap images of surface data to the application model manager and destroys resources (references to surfaces, and persisted surface data) when invoked to do so by the application model manager.
  • the bitmap server 1114 also creates window transition graphics from surface data including arranging window surfaces in a particular Z-order and/or creating new surfaces from rectangle coordinates of existing surfaces for transitions.
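The bitmap server's lifecycle behavior described above (track active surfaces, persist a copy when a surface is destroyed, release resources on request) might be sketched as follows. This is a hypothetical stand-in: the real class resides in SurfaceFlinger and works through JNI, which is not reproduced here.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of bitmap-server behavior: keep references to active
// surfaces, persist surface data on destruction so previews and window
// animations still work, and free resources when told to.
public class BitmapServer {
    private final Map<String, byte[]> activeSurfaces = new HashMap<>();
    private final Map<String, byte[]> persisted = new HashMap<>();

    public void surfaceCreated(String app, byte[] surfaceData) {
        activeSurfaces.put(app, surfaceData);
    }

    // When the surface manager destroys a surface, keep a copy so that the
    // no-longer-active application can still be shown and animated.
    public void surfaceDestroyed(String app) {
        byte[] data = activeSurfaces.remove(app);
        if (data != null) persisted.put(app, data.clone());
    }

    // Prefer live surface data; fall back to the persisted copy.
    public byte[] bitmapFor(String app) {
        byte[] live = activeSurfaces.get(app);
        return live != null ? live : persisted.get(app);
    }

    // Invoked by the application model manager to release resources.
    public void destroyResources(String app) {
        activeSurfaces.remove(app);
        persisted.remove(app);
    }
}
```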
  • the offscreen gesture controller 1104 provides interpretations of user gestures, utilizing a factory class to create various types of transition objects that provide the on-screen visual effects corresponding to the type of gesture being performed. More particularly, the offscreen gesture controller may be a class in an existing system service that receives motion events from the window manager service 443 and processes them. Upon detecting the beginning of an offscreen gesture, the offscreen gesture controller may invoke the transition controller factory to create an instance of the appropriate transition controller type. The offscreen gesture controller may filter out motion events so that unintentional offscreen gestures are not processed, record statistics for all motion events within the scope of the gesture so that gesture velocity may be calculated, and pass “move” motion events to the current transition controller so it can process the events and direct the required visual animation.
  • the offscreen gesture controller may invoke the transition controller factory 1102 to create a new instance of a transition controller so that gesture processing can continue.
  • the offscreen gesture controller may invoke the completion method of the current transition controller to automatically complete the transition.
  • the window manager service 443 may freeze the display so that the configuration changes can be processed by applications without unwanted visual effects.
  • the transition controller factory 1102 may receive invocations from the off-screen gesture controller 1104 to create instances of transition controllers. Upon invocation to create a transition controller instance, the transition controller factory may retrieve state data and handles to application surface data from the application model manager 1112 . For example, the transition controller factory may retrieve state data and handles to graphical information for applications that are visible, for applications that will become visible due to the transition, and/or on which display the relevant applications are visible and the type of information for each application (e.g., application display properties).
  • Upon invocation from the offscreen gesture controller 1104, the transition controller factory creates a transition controller of the appropriate type based on the direction of the gesture, which applications are visible (at the top of the stack), which application screen will become visible because of window movement, and/or on which display the application screen is intended to be visible.
  • the transition controller factory 1102 may be a class residing in an existing mobile OS system service. In one embodiment, the transition controller factory 1102 is a class of the Android OS 430 system service.
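The factory's selection logic described above can be sketched as below. The concrete controller types are not named in the patent, so the ones here are illustrative assumptions.

```java
// Hypothetical sketch: a transition-controller factory that picks a
// controller type from the gesture direction and from whether the screen
// that will become visible targets the secondary display.
public class TransitionControllerFactory {
    public enum Direction { LEFT, RIGHT, UP, DOWN }

    public interface TransitionController {
        String name();
        void complete(); // finish the transition automatically
    }

    public TransitionController create(Direction dir, boolean nextScreenOnSecondary) {
        // State data and surface handles would be retrieved from the
        // application model manager before choosing the controller type.
        if (nextScreenOnSecondary) return new CrossDisplayTransition();
        return (dir == Direction.LEFT || dir == Direction.RIGHT)
                ? new HorizontalTransition() : new VerticalTransition();
    }

    static class HorizontalTransition implements TransitionController {
        public String name() { return "horizontal"; }
        public void complete() { /* snap to the adjacent application screen */ }
    }
    static class VerticalTransition implements TransitionController {
        public String name() { return "vertical"; }
        public void complete() { /* reveal the next screen in the stack */ }
    }
    static class CrossDisplayTransition implements TransitionController {
        public String name() { return "cross-display"; }
        public void complete() { /* move the screen to the other display */ }
    }
}
```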
  • FIG. 12 illustrates a process flow 1200 for cross-environment application navigation using a captured last graphical representation of an application screen, according to various embodiments.
  • a first application is run on a first operating system of a mobile computing device.
  • a first application screen associated with the application is displayed on an active display device.
  • the application screen may be displayed on a display device of the mobile computing device that defines a first active user environment.
  • an application interaction state change event is received that indicates the current interaction state for the first application is to be changed from a foreground to a background state.
  • another application running on the first operating system may be started or swapped to the foreground on the display device.
  • a transition controller may be invoked by the transition controller factory 1102 to handle the interaction state change.
  • the first operating system captures a graphical representation of the first application screen.
  • the first operating system may include a bit-map server in a surface manager that captures a bit-map representation of the active application screen surfaces.
  • the interaction state of the first application is changed from the foreground state to the background state.
  • processes associated with the first application screen may be paused, stopped, and/or suspended by the first operating system.
  • the first operating system may maintain the first application on an application activity stack, and, at block 1212, the first operating system associates a position of the application activity stack with the captured bit-map image.
  • an input command is received related to the application activity stack.
  • the input command may be received within a second active user environment.
  • the second active user environment may be associated with the first operating system or a second operating system running concurrently with the first operating system on a shared kernel of the mobile computing device.
  • a representation of the bitmap image is displayed within a graphical representation of the application activity stack.
  • the graphical representation of the application activity stack may be displayed across active user environments to allow cross-environment graphical navigation of foreground and background applications of the mobile operating system.
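Process flow 1200 can be condensed into a small sketch: on the foreground-to-background state change, capture the bitmap first, then associate it with the application's position on the activity stack; later, an input command from the second user environment retrieves the bitmaps for graphical navigation. Names below are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical end-to-end sketch of process flow 1200.
public class ProcessFlow1200 {
    static final class StackEntry {
        final String app;
        final byte[] bitmap;
        StackEntry(String app, byte[] bitmap) { this.app = app; this.bitmap = bitmap; }
    }

    private final List<StackEntry> activityStack = new ArrayList<>();

    // Capture the last graphical representation, change the interaction
    // state, and associate the bitmap with the stack position (block 1212).
    public int onForegroundToBackground(String app, byte[] screenBitmap) {
        activityStack.add(new StackEntry(app, screenBitmap.clone()));
        return activityStack.size() - 1; // position associated with the bitmap
    }

    // On an input command related to the activity stack, return the bitmaps
    // used to draw its graphical representation in either user environment.
    public List<byte[]> previewBitmaps() {
        List<byte[]> out = new ArrayList<>();
        for (StackEntry e : activityStack) out.add(e.bitmap);
        return out;
    }
}
```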
  • the various operations of methods described above may be performed by any suitable means capable of performing the corresponding functions.
  • the means may include various hardware and/or software component(s) and/or module(s), including, but not limited to a circuit, an application specific integrated circuit (ASIC), or processor.
  • The various illustrative logical blocks, modules, and circuits described in connection with the present disclosure may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device (PLD), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein.
  • a general purpose processor may be a microprocessor, but in the alternative, the processor may be any commercially available processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
  • a software module may reside in any form of tangible storage medium.
  • storage media include random access memory (RAM), read only memory (ROM), flash memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM and so forth.
  • a storage medium may be coupled to a processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor.
  • a software module may be a single instruction, or many instructions, and may be distributed over several different code segments, among different programs, and across multiple storage media.
  • the methods disclosed herein comprise one or more actions for achieving the described method.
  • the method and/or actions may be interchanged with one another without departing from the scope of the claims.
  • the order and/or use of specific actions may be modified without departing from the scope of the claims.
  • a storage medium may be any available tangible medium that can be accessed by a computer.
  • such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM, or other optical disk storage, magnetic disk storage, or other magnetic storage devices, or any other tangible medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • Disk and disc include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray® disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
  • a computer program product may perform operations presented herein.
  • a computer program product may be a computer readable tangible medium having instructions tangibly stored (and/or encoded) thereon, the instructions being executable by one or more processors to perform the operations described herein.
  • the computer program product may include packaging material.
  • Software or instructions may also be transmitted over a transmission medium.
  • software may be transmitted from a website, server, or other remote source using a transmission medium such as a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technology such as infrared, radio, or microwave.
  • modules and/or other appropriate means for performing the methods and techniques described herein can be downloaded and/or otherwise obtained by a user terminal and/or base station as applicable.
  • a user terminal and/or base station can be coupled to a server to facilitate the transfer of means for performing the methods described herein.
  • various methods described herein can be provided via storage means (e.g., RAM, ROM, a physical storage medium such as a CD or floppy disk, etc.), such that a user terminal and/or base station can obtain the various methods upon coupling or providing the storage means to the device.
  • any other suitable technique for providing the methods and techniques described herein to a device can be utilized.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Stored Programmes (AREA)
  • Navigation (AREA)

Abstract

Graphical navigation of foreground and background applications running on a mobile computing device across multiple active user environments, even when graphics information for background applications is not maintained by a mobile operating system of the mobile computing device. A last graphical representation of an application screen may be captured as the application state is transitioned from the foreground state to the background state. The last graphical representation may be associated with a position in an application activity stack representing foreground and background mobile operating system applications. The navigation techniques may be used in a computing environment with multiple active user environments. A first active user environment may be associated with the mobile operating system. A second active user environment may be associated with the mobile operating system or a desktop operating system running concurrently with the mobile operating system on the mobile computing device.

Description

CROSS REFERENCE TO RELATED APPLICATION
The present application is a continuation of and claims priority to U.S. patent application Ser. No. 13/399,901, filed on Feb. 17, 2012, which claims the benefits of and priority, under 35 U.S.C. § 119(e), to U.S. Provisional Application Nos. 61/389,117, filed on Oct. 1, 2010; 61/507,199, filed on Jul. 13, 2011; 61/507,201, filed on Jul. 13, 2011; 61/507,203, filed on Jul. 13, 2011; 61/507,206 filed on Jul. 13, 2011; and 61/507,209, filed on Jul. 13, 2011, each of which is incorporated herein by reference in its entirety for all that it teaches and for all purposes.
BACKGROUND
1. Field
This Application relates generally to the field of mobile computing environments, and more particularly to supporting application navigation in a mobile computing environment with multiple active user environments.
2. Relevant Background
Mobile communications devices are becoming ubiquitous in today's society. For example, as of the end of 2008, 90 percent of Americans had a mobile wireless device. Among the fastest growing mobile communications devices are smartphones, that is, mobile phones built on top of a mobile computing platform. Mobile providers have launched hundreds of new smartphones in the last three years based upon several different computing platforms (e.g., Apple iPhone, Android, BlackBerry, Palm, Windows Mobile, and the like). In the U.S., smartphone penetration reached almost 23% by the middle of 2010, and over 35% in some age-groups. In Europe, the smartphone market grew by 41% from 2009 to 2010, with over 60 million smartphone subscribers as of July 2010 in the five largest European countries alone.
Smartphone computing platforms typically include a mobile operating system (“OS”) running on a mobile processor. While mobile processors and mobile OSs have increased the capabilities of these devices, smartphones have not tended to replace personal computer (“PC”) environments (i.e., Windows, Mac OS X, Linux, and the like) such as desktop or notebook computers at least because of the limited user experience provided. In particular, smartphones typically have different processing resources, user interface device(s), peripheral devices, and applications. For example, mobile processors may have a different processor architecture than PC processors that emphasizes features like low-power operation and communications capabilities over raw processing and/or graphics performance. In addition, smartphones tend to have smaller amounts of other hardware resources such as memory (e.g., SRAM, DRAM, etc.) and storage (e.g., hard disk, SSD, etc.) resources. Other considerations typically include a smaller display size that limits the amount of information that can be presented through a mobile OS graphical user interface (“GUI”) and different user input devices. User interface input device(s) for smartphones typically include a small thumb-style QWERTY keyboard, touch-screen display, click-wheel, and/or scroll-wheel. In contrast, laptop, notebook, and desktop computers that use a desktop OS typically have a full-size keyboard, pointing device(s), and/or a larger screen area. As a result, mobile OSs typically have a different architecture where some capabilities and features such as communications, lower power consumption, touch-screen capability, and the like, are emphasized over traditionally emphasized PC capabilities such as processing speed, graphics processing, and application multi-tasking.
Because of the architecture differences, applications or “Apps” designed for mobile OSs tend to be designed for tasks and activities that are typical of a mobile computing experience (e.g., communications, gaming, navigation, and the like). For example, over a third of all Android App downloads have been targeted towards the gaming and entertainment categories while less than 20% of downloads fall under the tools and productivity categories. In addition, many applications that are common on PC platforms are either not available for mobile OSs or are available only with a limited feature set.
For example, many smartphones run Google's Android operating system. Android runs only applications that are specifically developed to run within a Java-based virtual machine runtime environment. In addition, while Android is based on a modified Linux kernel, it uses different standard C libraries, system managers, and services than Linux. Accordingly, applications written for Linux do not run on Android without modification or porting. Similarly, Apple's iPhone uses the iOS mobile operating system. Again, while iOS is derived from Mac OS X, applications developed for OS X do not run on iOS. Therefore, while many applications are available for mobile OSs such as Android and iOS, many other common applications for desktop operating systems such as Linux and Mac OS X are either not available on the mobile platforms or have limited functionality. As such, these mobile OSs provide only a portion of the applications and user experience available on desktop OSs.
Accordingly, smartphones are typically suited for a limited set of user experiences and provide applications designed primarily for the mobile environment. In particular, smartphones do not provide a suitable desktop user experience, nor do they run most common desktop applications. For some tasks such as typing or editing documents, the user interface components typically found on a smartphone tend to be more difficult to use than a full-size keyboard and large display that may be typically found on a PC platform.
As a result, many users carry and use multiple computing devices including a smartphone, laptop, and/or tablet computer. In this instance, each device has its own CPU, memory, file storage, and operating system. Connectivity and file sharing between smartphones and other computing devices involves linking one device (e.g., smartphone, running a mobile OS) to a second, wholly disparate device (e.g., notebook, desktop, or tablet running a desktop OS), through a wireless or wired connection. Information is shared across devices by synchronizing data between applications running separately on each device. This process, typically called “synching,” is cumbersome and generally requires active management by the user.
SUMMARY
Embodiments of the present invention are directed to providing the mobile computing experience of a smartphone and the appropriate user experience of a secondary terminal environment in a single mobile computing device. A secondary terminal environment may be some combination of visual rendering devices (e.g., monitor or display), input devices (e.g., mouse, touch pad, touch-screen, keyboard, etc.), and other computing peripherals (e.g., HDD, optical disc drive, memory stick, camera, printer, etc.) connected to the computing device by a wired (e.g., USB, Firewire, Thunderbolt, etc.) or wireless (e.g., Bluetooth, WiFi, etc.) connection. In embodiments, a mobile operating system associated with the user experience of the mobile environment and a desktop operating system associated with the user experience of the secondary terminal environment are run concurrently and independently on a shared kernel.
According to one aspect consistent with various embodiments, a mobile computing device includes a first operating system. A first application is running on the mobile operating system. A first application screen, associated with the first application, is displayed on an active display device. For example, the application screen may be displayed on a display of the mobile computing device. A process for managing application graphics associated with the application includes receiving an application interaction state change event indicating that a current interaction state of the first application is to be changed from a foreground state to a background state, generating a bitmap image corresponding to a graphical representation of the first application screen, changing the current interaction state of the first application from the foreground state to the background state, associating the bitmap image with a position within an application activity stack corresponding to the application, receiving a user input command related to the application activity stack, and displaying a representation of the bitmap image within a graphical representation of the application activity stack. The process may include receiving a user command indicative of a selection of the bitmap image within the graphical representation of the application activity stack, and changing the current interaction state of the first application from the background state to the foreground state.
According to other aspects consistent with various embodiments, the mobile computing device may define a first user environment, and the graphical representation of the application activity stack may be presented on a display of a second user environment. The second user environment may be associated with a second operating system running concurrently with the first operating system on a shared kernel of the mobile computing device. The application activity stack may be maintained by an application model manager. The application activity stack may include applications that have been started by the user and not actively closed by the user. A process associated with the first application screen may be suspended in response to the change in the current interaction state of the first application from the foreground state to the background state.
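The interplay between interaction state changes, bitmap capture, and the application activity stack described in these aspects can be sketched as follows. This is an illustrative model only: the names `ApplicationModelManager`, `StackEntry`, `start`, `to_background`, and `select` are hypothetical and are not the API of the disclosed system.

```python
# Illustrative sketch: an application activity stack that associates a
# captured bitmap with each application moved to the background, and
# returns an application to the foreground when its preview is selected.

FOREGROUND, BACKGROUND = "foreground", "background"

class StackEntry:
    def __init__(self, app_name, bitmap=None):
        self.app_name = app_name
        self.bitmap = bitmap          # last captured screen image, if any
        self.state = FOREGROUND

class ApplicationModelManager:
    def __init__(self):
        self.stack = []               # application activity stack

    def start(self, app_name):
        # A newly started application becomes the foreground application.
        entry = StackEntry(app_name)
        self.stack.append(entry)
        return entry

    def to_background(self, entry, capture_screen):
        # On an interaction state change event, capture the screen as a
        # bitmap *before* the application's graphics processes are paused,
        # then record the image at the application's stack position.
        entry.bitmap = capture_screen(entry.app_name)
        entry.state = BACKGROUND

    def select(self, index):
        # User selection of a bitmap in the graphical representation of
        # the stack returns that application to the foreground.
        entry = self.stack[index]
        entry.state = FOREGROUND
        return entry

manager = ApplicationModelManager()
browser = manager.start("browser")
manager.to_background(browser, lambda name: f"<bitmap of {name}>")
mail = manager.start("mail")
restored = manager.select(0)          # user taps the browser preview
```

The key ordering constraint the sketch illustrates is that the bitmap must be captured before the state change completes, since the mobile OS does not maintain graphical information for background applications.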
According to other aspects consistent with various embodiments, a mobile computing device includes a first application and a second application running concurrently on a first operating system. A process for managing application graphics may include displaying the first application on an active display device, receiving an application interaction state change event indicating that a current interaction state of the first application is to be changed from a foreground state to a background state, generating a bitmap image corresponding to a graphical representation of an application screen associated with the first application, changing the current interaction state of the first application from the foreground state to the background state, associating the bitmap image with a position within an application activity stack corresponding to the first application, and displaying a graphical representation of the application activity stack on a display device associated with a secondary terminal environment, the secondary terminal environment connected to the mobile computing device via a communications interface, the graphical representation of the application activity stack including the bitmap image.
According to other aspects consistent with various embodiments, the secondary terminal environment may be associated with a second operating system running concurrently with the first operating system on a shared kernel of the mobile computing device. Displaying of the graphical representation of the application activity stack may be in response to a user initiated event within the secondary terminal environment and/or a dock event. The dock event may include connecting the mobile computing device with the secondary terminal environment via the communications interface. The bitmap image may be generated from graphical information maintained by a bitmap server within a surface manager of the first operating system. The bitmap server may provide an application level interface to the bitmap image data.
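Triggering the stack display from a dock event, as described above, can be sketched briefly. The names `SecondaryDisplay`, `show_stack`, and `on_dock_event` are assumptions made for illustration; the disclosure does not specify this interface.

```python
# Hypothetical sketch of dock-event handling: connecting the mobile
# computing device to a secondary terminal environment triggers
# presentation of the application activity stack on the secondary
# display, without further user input.

class SecondaryDisplay:
    def __init__(self):
        self.shown = []

    def show_stack(self, stack):
        # Render one preview per stack entry using the stored bitmaps.
        self.shown = [entry["bitmap"] for entry in stack]

def on_dock_event(activity_stack, secondary_display):
    # The dock event itself (device connected via the communications
    # interface) is sufficient to display the stack.
    secondary_display.show_stack(activity_stack)

stack = [{"app": "browser", "bitmap": "<browser bitmap>"},
         {"app": "mail", "bitmap": "<mail bitmap>"}]
display = SecondaryDisplay()
on_dock_event(stack, display)
```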
According to other aspects consistent with various embodiments, a mobile computing device includes a first operating system and a display device. A first application screen, associated with a first application running on the first operating system, is displayed on the display device. The first application may be considered to be in a foreground state. The first operating system includes an activity manager that maintains a list of currently running applications, an application model manager of the first operating system that receives application state information from the activity manager, and a bitmap server module that maintains references to active surfaces of the first application. The bitmap server module may store a copy of the surface information of the first application screen responsive to an application interaction state change event indicating that a current interaction state of the first application is to be changed from the foreground state to a background state. The first operating system may display a transition animation based on the copy of the surface information of the first application screen. The transition animation may be displayed by an application space component of the first operating system. The first operating system may be a mobile operating system. The mobile operating system may include a surface manager module, and the bitmap server module may be implemented as a class in the surface manager module. The bitmap server module may provide references to the copy of the surface information of the first application screen to the framework layer of the mobile operating system. The bitmap server module may create a bitmap image from the copy of the surface information of the first application screen.
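A minimal sketch of the bitmap server's role may help clarify it: the module holds references to the live surfaces of the foreground application and copies the surface information when the application transitions to the background. The class name `BitmapServer` and its methods are assumptions for illustration, not the actual surface manager API.

```python
# Illustrative sketch of a bitmap server inside the surface manager.
# It tracks live surface references and, on a state change event,
# snapshots the surface data so that preview images and transition
# animations remain available after the app's graphics are suspended.

class BitmapServer:
    def __init__(self):
        self.active_surfaces = {}     # app -> live surface reference
        self.snapshots = {}           # app -> copied surface information

    def register_surface(self, app, surface):
        self.active_surfaces[app] = surface

    def on_state_change(self, app):
        # Copy the surface information before the mobile OS suspends the
        # application's graphics processes.
        surface = self.active_surfaces[app]
        self.snapshots[app] = dict(surface)   # independent copy

    def get_snapshot(self, app):
        # Application-level interface to the stored bitmap image data.
        return self.snapshots.get(app)

server = BitmapServer()
server.register_surface("browser", {"width": 480, "height": 800})
server.on_state_change("browser")
snap = server.get_snapshot("browser")
```

The copy (rather than a reference) matters: once the application is backgrounded, its live surface may be released or go stale, while the snapshot remains valid for the framework layer.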
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the present invention are illustrated in referenced figures of the drawings, in which like numbers refer to like elements throughout the description of the figures.
FIG. 1 illustrates a computing environment that provides multiple user computing experiences, according to various embodiments.
FIG. 2 illustrates an exemplary system architecture for a mobile computing device, according to various embodiments.
FIG. 3 illustrates an operating system architecture for a computing environment, according to various embodiments.
FIG. 4 illustrates an operating system architecture for a computing environment, according to various embodiments.
FIG. 5 illustrates aspects of a kernel for a computing environment, according to various embodiments.
FIG. 6 illustrates an operating system architecture for a computing environment, according to various embodiments.
FIG. 7 illustrates a computing environment with multiple active user environments, according to various embodiments.
FIG. 8 illustrates a computing environment including a mobile computing device, according to various embodiments.
FIG. 9 illustrates aspects of cross-environment assortment of application windows, according to various embodiments.
FIG. 10 illustrates aspects of graphical cross-environment application navigation, according to various embodiments.
FIG. 11 illustrates aspects of a computing architecture supporting cross-environment application navigation and assortment, according to various embodiments.
FIG. 12 illustrates a flow diagram for managing graphical navigation in a computing environment, according to various embodiments.
DETAILED DESCRIPTION
The present disclosure is generally directed to managing navigation of foreground and background applications in a computing environment with multiple active user environments. More particularly, applications or “Apps” may be running on a mobile operating system (“OS”) of a mobile computing device that generally defines a first active user environment. The mobile OS typically presents a single active application (i.e., foreground application) at a time through a graphical user interface (“GUI”) of the mobile operating system. Other applications may be running on the mobile operating system but not actively displayed (i.e., background applications). Commonly, processes of background applications related to displaying graphics information and accepting user input are suspended or paused by the mobile operating system. While some of these processes may save an instance state of application data before being paused or suspended, the mobile OS typically does not update or maintain graphical information for these processes while the application is in the background. Navigation among foreground and background applications running on the mobile OS on the mobile computing device typically consists of navigating away from the foreground application (e.g., back to a home screen, etc.) before selecting an icon associated with a background application.
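The single-foreground model just described can be sketched as a small state machine: switching applications moves the current foreground application into a suspended background set, and navigating back to a suspended application resumes it. All names here are hypothetical and merely illustrate the behavior described above.

```python
# Minimal sketch of the mobile-OS single-foreground model: one active
# application at a time; backgrounded applications have their graphics
# and input processes suspended until the user navigates back to them.

class MobileOS:
    def __init__(self):
        self.foreground = None
        self.suspended = []           # background apps, processes paused

    def switch_to(self, app):
        # Navigating to another app (e.g., via the home screen) moves
        # the current foreground app to the background.
        if self.foreground is not None:
            self.suspended.append(self.foreground)
        if app in self.suspended:
            self.suspended.remove(app)   # resuming a background app
        self.foreground = app

os_model = MobileOS()
os_model.switch_to("browser")   # browser in foreground
os_model.switch_to("mail")      # browser suspended, mail in foreground
os_model.switch_to("browser")   # browser resumed, mail suspended
```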
In a computing environment with multiple active user environments, it may be desirable to present user interfaces for multiple concurrent applications within an active user environment of the computing environment. Additionally, it may be desirable to graphically browse and/or navigate through applications running on a first operating system through a second active user environment. For example, cross-environment application browsing and/or navigation using preview representations of application screens may allow faster cross-environment application navigation because the previews provide a visual representation of an application state.
In disclosed embodiments with multiple active user environments, a mobile computing device running a mobile operating system defines a first active user environment. A second active user environment may be connected to the mobile computing device. Mobile OS applications may be accessed from and actively displayed through the second active user environment. In embodiments, the second active user environment may be associated with a second operating system (e.g., a desktop operating system) running on the mobile computing device.
Disclosed embodiments present a seamless computing experience in a computing environment with multiple active user environments by automatically presenting application screens across user environments in certain conditions. Other disclosed embodiments support graphical navigation of foreground and background applications of the mobile operating system across multiple active user environments. For example, preview screens of foreground and background applications running on the mobile operating system may be displayed within the second active user environment to allow a user to navigate quickly between applications running on the mobile operating system. Disclosed techniques allow graphical cross-environment application navigation even where the mobile operating system does not maintain graphical information for background applications of the mobile OS. For example, graphical and/or user input processes of background applications may be paused, stopped, suspended, and/or killed. In various embodiments, a last graphical representation of a mobile OS application is captured before the application is moved from a foreground state to a background state and graphical information and/or user input processes of the application are paused or suspended. The last graphical representation for mobile OS applications may be maintained as bitmap images or graphics surface information. While the foreground/background application navigation techniques presented in the disclosure are discussed with reference to a mobile computing device and various docked terminal environments, the disclosure may, in various embodiments, be applied to other computing devices (e.g., laptop computers, tablet computers, desktop computers, etc.) and is not intended to be limited to handheld mobile computing devices unless otherwise explicitly specified.
FIG. 1 illustrates a computing environment 100 that provides multiple user computing experiences through multiple active user environments, according to various embodiments. A first active user environment 115 of computing environment 100 is defined by display(s) 116, touch screen sensor(s) 117, and/or I/O devices 118 of mobile computing device 110. The display(s) 116 may be operative to display a displayed image or “screen.” As used herein, the term display is intended to connote device hardware, whereas screen is intended to connote the displayed image produced on the display. In this regard, a display is physical hardware that is operable to present a screen to the user. A screen may encompass a majority of one or more displays. For instance, a screen may occupy substantially all of the display area of one or more displays except for areas dedicated to other functions (e.g. menu bars, status bars, etc.). A screen may be associated with an application and/or an operating system executing on the mobile computing device 110. For instance, applications may have various kinds of screens that are capable of being manipulated as will be described further below.
When mobile computing device 110 is operated as a stand-alone mobile device, active user environment 115 presents a typical mobile computing user experience. In this regard, mobile computing device 110 typically includes mobile telephony capabilities and user interaction features suited to a mobile computing use model. For example, mobile computing device 110 may present a graphical user interface (“GUI”) suited to active user environment 115 including display(s) 116, touch-screen sensor(s) 117, and/or I/O device(s) 118. The user may interact with application programs (i.e., “Apps”) running on mobile computing device 110 through an application screen including various interactive features (e.g., buttons, text fields, toggle fields, etc.) presented on display(s) 116. In some instances, the user interacts with these interactive features by way of I/O device(s) 118. In other instances, the user interacts with these features by way of touch-screen sensor(s) 117 using gestures and symbols that are input to touch screen sensor(s) 117 using the user's fingers or a stylus. In yet other instances, the user interacts with these features using a combination of I/O device(s) 118 and touch-screen sensor(s) 117.
FIG. 2 illustrates an exemplary hardware system architecture for mobile computing device 110, according to various embodiments. Mobile computing device 110 includes mobile processor 114 with one or more CPU cores 204 and external display interface 220. Generally, mobile computing device 110 also includes memory 206, storage devices 208, touch-screen display controller 212 connected to touch-screen display(s) 116 and/or touch-screen sensor(s) 117, I/O devices 118, power management IC 214 connected to battery 216, cellular modem 218, communication devices 222, and/or other devices 224 that are connected to processor 114 through various communication signals and interfaces. I/O devices 118 generally include buttons and other user interface components that may be employed in mobile computing device 110. For example, I/O devices 118 may include a set of buttons (e.g., back, menu, home, search, etc.), off-screen gesture area, click-wheel, scroll-wheel, QWERTY keyboard, etc. Other devices 224 may include, for example, GPS devices, LAN connectivity, microphones, speakers, cameras, accelerometers, gyroscopes, magnetometers, and/or MS/MMC/SD/SDIO card interfaces. External display interface 220 may be any suitable display interface (e.g., VGA, DVI, HDMI, wireless, etc.).
One or more sensor devices of the mobile computing device 110 may be able to monitor the orientation of the mobile computing device with respect to gravity. For example, using an accelerometer, gyroscope, inclinometer, or magnetometer, or some combination of these sensors, mobile computing device 110 may be able to determine whether it is substantially in a portrait orientation (meaning that a long axis of the display(s) 116 are oriented vertically) or substantially in a landscape orientation with respect to gravity. These devices may further provide other control functionality by monitoring the orientation and/or movement of the mobile computing device 110. As used herein, the term orientation sensor is intended to mean some combination of sensors (e.g., accelerometer, gyroscope, inclinometer, magnetometer, etc.) that may be used to determine orientation of a device with respect to gravity and is not intended to be limited to any particular sensor type or technology.
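The portrait/landscape determination described above can be sketched by comparing the gravity components reported along the display axes. The axis convention and the simple dominant-axis test below are illustrative assumptions; a real implementation would typically filter the signal and apply hysteresis.

```python
def orientation_from_accelerometer(ax, ay):
    # ax, ay: acceleration (m/s^2) along the display's short (x) and
    # long (y) axes while the device is roughly stationary, so the
    # dominant component is gravity. Gravity mostly along the long
    # axis means the device is held in portrait; mostly along the
    # short axis means landscape.
    if abs(ay) >= abs(ax):
        return "portrait"
    return "landscape"

# Device held upright: gravity almost entirely along the long axis.
assert orientation_from_accelerometer(0.2, 9.7) == "portrait"
# Device turned on its side: gravity along the short axis.
assert orientation_from_accelerometer(9.6, 0.5) == "landscape"
```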
Processor 114 may be an ARM-based mobile processor. In embodiments, mobile processor 114 is a mobile ARM-based processor such as Texas Instruments OMAP3430, Marvell PXA320, Freescale iMX51, or Qualcomm QSD8650/8250. However, mobile processor 114 may be another suitable ARM-based mobile processor or processor based on other processor architectures such as, for example, x86-based processor architectures or other RISC-based processor architectures.
While FIG. 2 illustrates one exemplary hardware implementation 112 for mobile computing device 110, other architectures are contemplated as within the scope of the invention. For example, various components illustrated in FIG. 2 as external to mobile processor 114 may be integrated into mobile processor 114. Optionally, external display interface 220, shown in FIG. 2 as integrated into mobile processor 114, may be external to mobile processor 114. Additionally, other computer architectures employing a system bus, discrete graphics processor, and/or other architectural variations are suitable for employing aspects of the present invention.
Returning to FIG. 1, mobile computing device 110 may be docked with a secondary terminal environment 140. Secondary terminal environment 140 may be some combination of visual rendering devices (e.g., monitor or display) 144, I/O devices (e.g., mouse, touch pad, touch-screen, keyboard, etc.) 146, and other computing peripherals (e.g., HDD, optical disc drive, memory stick, camera, printer, GPS, accelerometer, etc.) 148 connected to mobile computing device 110 by connecting port 142 on secondary terminal environment 140 with port 120 on mobile computing device 110 through interface 122. Interface 122 may be some combination of wired (e.g., USB, Firewire, Thunderbolt, HDMI, VGA, etc.) or wireless (e.g., Bluetooth, WiFi, Wireless HDMI, etc.) interfaces. While secondary terminal environments may have some processing or logic elements such as microcontrollers or other application specific integrated circuits (“ASICs”), they typically do not have a processor that runs a separate instance of an operating system.
Secondary terminal environments that define a second user environment may be suited for one or more of various use models, depending on the components that make up the secondary terminal environment. Some secondary terminal environments may be associated with a user computing experience that is similar to the user computing experience of the mobile computing device 110, while others may provide a user computing experience more traditionally associated with desktop computing. For example, secondary terminal environment 140 may be a device that includes a display 144 with a corresponding touch-screen sensor 146 that serves as the primary user input for the device. This type of secondary terminal environment may be called a tablet-style secondary terminal environment. While a tablet-style secondary terminal environment may have a larger touch-screen display than mobile computing device 110, the user experience of this type of secondary terminal environment may be similar in some ways to the user experience of mobile computing device 110. Specifically, it may be convenient for a user to interact with applications displayed on this type of secondary terminal environment through similar gesture-based techniques (i.e., touching, swiping, pinching, etc.) and/or virtual keyboards as they might use on mobile computing device 110. In one embodiment known as a “Smart Pad,” a tablet-style secondary terminal environment includes a 10.1-inch diagonal (1280×800 resolution) touch-enabled display, standard set of buttons (e.g., back, menu, home, search, etc.), one or more cameras, and an off-screen gesture area. A tablet-style secondary terminal environment may include other peripheral devices 148 that may be used to influence the configuration of applications presented to the user on the tablet-style secondary terminal environment. 
For example, a tablet-style secondary terminal environment may include a GPS receiver, accelerometer, gyroscope, magnetometer, and/or other sensors for determining its location and/or orientation.
Another type of secondary terminal environment is a laptop or notebook-style secondary terminal environment. A notebook-style secondary terminal environment generally includes a display screen 144, keyboard and pointing device(s) 146, and/or other peripheral devices 148 in a clam-shell type enclosure. In embodiments, a laptop or notebook-style secondary terminal environment may be known as a “Smart Display” or “LapDock.” Because this type of secondary terminal environment includes a larger display, keyboard, and pointing device(s), it typically has a user computing experience associated with a desktop computing experience. In this regard, this type of secondary terminal environment may not have a similar user experience profile to mobile computing device 110. A notebook-style secondary terminal environment may include other peripheral devices that may be used to influence the configuration of applications presented to the user on the secondary terminal environment. For example, a notebook-style secondary terminal environment may include a GPS receiver, accelerometer, gyroscope, magnetometer, and/or other sensors for determining its location and/or orientation.
The various secondary terminal environments may also include a variety of generic input/output device peripherals that make up a typical desktop computing environment. The I/O devices may be connected through a docking hub (or “dock cradle”) that includes port 142 and one or more device I/O ports for connecting various commercially available display monitors 144, I/O devices 146, and/or other peripheral devices 148. For example, a docking hub may include a display port (e.g., VGA, DVI, HDMI, Wireless HDMI, etc.), and generic device ports (e.g., USB, Firewire, etc.). As one example, a user may connect a commercially available display, keyboard, and pointing device(s) to the docking hub. In this way, the user may create a secondary terminal environment from a combination of input/output devices. Commonly, this secondary terminal environment will be suited to a desktop computing experience. In particular, this type of secondary terminal environment may be suited to a computing experience designed around the use of a pointing device(s) and physical keyboard to interact with a user interface on the display.
In embodiments, mobile computing device 110 includes multiple operating systems running concurrently and independently on a shared kernel. Concurrent execution of a mobile OS and a desktop OS on a shared kernel is described in more detail in U.S. patent application Ser. No. 13/217,108, filed Aug. 24, 2011, entitled “MULTI-OPERATING SYSTEM,” herein incorporated by reference. In this way, a single mobile computing device can concurrently provide a mobile computing experience through a first user environment associated with a mobile OS and a desktop computing experience through a second user environment associated with a full desktop OS.
FIG. 3 illustrates OS architecture 300 that may be employed to run mobile OS 130 and desktop OS 160 concurrently on mobile computing device 110, according to various embodiments. As illustrated in FIG. 3, mobile OS 130 and desktop OS 160 are independent operating systems running concurrently on shared kernel 320. Specifically, mobile OS 130 and desktop OS 160 are considered independent and concurrent because they are running on shared kernel 320 at the same time and may have separate and incompatible user libraries, graphics systems, and/or framework layers. For example, mobile OS 130 and desktop OS 160 may both interface to shared kernel 320 through the same kernel interface 322 (e.g., system calls, etc.). In this regard, shared kernel 320 manages task scheduling for processes of both mobile OS 130 and desktop OS 160 concurrently.
In addition, shared kernel 320 runs directly on mobile processor 114 of mobile computing device 110, as illustrated in FIG. 3. Specifically, shared kernel 320 directly manages the computing resources of processor 114 such as CPU scheduling, memory access, and I/O. In this regard, hardware resources are not virtualized, meaning that mobile OS 130 and desktop OS 160 make system calls through kernel interface 322 without virtualized memory or I/O access. Functions and instructions for OS architecture 300 may be stored as computer program code on a tangible computer readable medium of mobile computing device 110. For example, instructions for OS architecture 300 may be stored in storage device(s) 208 of mobile computing device 110.
As illustrated in FIG. 3, mobile OS 130 has libraries layer 330, application framework layer 340, and application layer 350. In mobile OS 130, applications 352 and 354 run in application layer 350 supported by application framework layer 340 of mobile OS 130. Application framework layer 340 includes manager(s) 342 and service(s) 344 that are used by applications running on mobile OS 130. Libraries layer 330 includes user libraries 332 that implement common functions such as I/O and string manipulation, graphics functions, database capabilities, communication capabilities, and/or other functions and capabilities.
Application framework layer 340 may include a window manager, activity manager, package manager, resource manager, telephony manager, gesture controller, and/or other managers and services for the mobile environment. Application framework layer 340 may include a mobile application runtime environment that executes applications developed for mobile OS 130. The mobile application runtime environment may be optimized for mobile computing resources such as lower processing power and/or limited memory space.
Applications running on mobile OS 130 may be composed of multiple application components that perform the functions associated with the application, where each component is a separate process. For example, a mobile OS application may be composed of processes for displaying graphical information, handling user input, managing data, communicating with other applications/processes, and/or other types of processes.
As illustrated in FIG. 3, desktop OS 160 has libraries layer 360, framework layer 370, and application layer 380. In desktop OS 160, applications 382 and 384 run in application layer 380 supported by application framework layer 370 of desktop OS 160. Application framework layer 370 includes manager(s) 372 and service(s) 374 that are used by applications running on desktop OS 160. For example, application framework layer 370 may include a window manager, activity manager, package manager, resource manager, and/or other managers and services common to a desktop environment. Libraries layer 360 may include user libraries 362 that implement common functions such as I/O and string manipulation, graphics functions, database capabilities, communication capabilities, and/or other functions and capabilities.
As described above, mobile operating systems typically do not use the same graphics environment as desktop operating systems. Specifically, graphics environments for mobile OSs are designed for efficiency and the specific user input devices of a mobile computing environment. For example, display devices of mobile computing devices are typically too small to present multiple active application screens at the same time. Accordingly, most mobile OS GUIs present a single active application screen that consumes all or substantially all of the active area of the display of the mobile computing device. In addition, presenting a single active application screen at a time allows the mobile OS to shut down or suspend graphical and/or user interaction processes of background applications. Shutting down or suspending background application processes conserves power which is critical to providing long battery life in a mobile computing device.
In contrast, graphics environments for desktop OSs are designed for flexibility and high performance. For example, desktop OSs typically provide a multi-tasking user interface where more than one application screen may be presented through the desktop OS GUI at the same time. Graphics information for multiple applications may be displayed within windows of the GUI of the desktop operating system that are cascaded, tiled, and/or otherwise displayed concurrently in overlapping or non-overlapping fashion. This type of graphical environment provides for greater flexibility because multiple applications may be presented through multiple active application screens. While only a single application may have the input focus (i.e., the application to which input such as keyboard entry is directed), switching back and forth between applications does not require resuming or restarting the application and rebuilding the application screen. Commonly, switching input focus back and forth between active applications may involve selecting an application window or switching the focus to an application window by placing the mouse pointer over the application window. In this regard, desktop OSs typically maintain graphics information (i.e., graphics and user input processes continue to run) for all running applications, whether the application screens associated with the applications are active or not (e.g., in the background, etc.). However, maintaining multiple active application screens requires greater processing and system resources.
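The desktop-style contrast just drawn can be sketched in a few lines: several application windows remain live and keep their graphics state resident, while exactly one window holds the input focus at a time, so switching focus does not resume or rebuild anything. The class and method names are hypothetical.

```python
# Sketch of the desktop multi-window model: all open windows retain
# live graphics state; only the input focus moves between them.

class DesktopWindowManager:
    def __init__(self):
        self.windows = {}             # title -> live graphics state
        self.focus = None             # single window with input focus

    def open_window(self, title):
        self.windows[title] = {"rendered": True}
        self.focus = title

    def set_focus(self, title):
        # Switching focus does not suspend or rebuild other windows;
        # their graphics processes keep running.
        if title not in self.windows:
            raise KeyError(title)
        self.focus = title

wm = DesktopWindowManager()
wm.open_window("editor")
wm.open_window("terminal")
wm.set_focus("editor")
```

Contrast this with the `switch_to` behavior of a mobile OS, where losing the foreground suspends an application's graphics and input processes; that difference is why the disclosed bitmap snapshots are needed for cross-environment navigation.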
Accordingly, mobile OS 130 and desktop OS 160 may be independent operating systems with incompatible user libraries, graphics systems, and/or application frameworks. Therefore, applications developed for mobile OS 130 may not run directly on desktop OS 160, and applications developed for desktop OS 160 may not run directly on mobile OS 130. For example, application 352, running in application layer 350 of mobile OS 130, may be incompatible with desktop OS 160, meaning that application 352 could not run on desktop OS 160. Specifically, application 352 may depend on manager(s) 342, service(s) 344, and/or libraries 332 of mobile OS 130 that are either not available or not compatible with manager(s) 372, service(s) 374, and/or libraries 362 of desktop OS 160.
In various embodiments of the present disclosure, desktop OS 160 runs in a separate execution environment from mobile OS 130. For example, mobile OS 130 may run in a root execution environment and desktop OS 160 may run in a secondary execution environment established under the root execution environment. Processes and applications running on mobile OS 130 access user libraries 332, manager(s) 342 and service(s) 344 in the root execution environment. Processes and applications running on desktop OS 160 access user libraries 362, manager(s) 372 and service(s) 374 in the secondary execution environment.
The most widely adopted mobile OS is Google's Android. While Android is based on Linux, it includes modifications to the kernel and other OS layers for the mobile environment and mobile processors. In particular, while the Linux kernel is designed for a PC (i.e., x86) CPU architecture, the Android kernel is modified for ARM-based mobile processors. Android device drivers are also particularly tailored for devices typically present in a mobile hardware architecture including touch-screens, mobile connectivity (GSM/EDGE, CDMA, Wi-Fi, etc.), battery management, GPS, accelerometers, and camera modules, among other devices. In addition, Android does not have a native X Window System nor does it support the full set of standard GNU libraries, and this makes it difficult to port existing GNU/Linux applications or libraries to Android.
Apple's iOS operating system (run on the iPhone) and Microsoft's Windows Phone 7 are similarly modified for the mobile environment and mobile hardware architecture. For example, while iOS is derived from the Mac OS X desktop OS, common Mac OS X applications do not run natively on iOS. Specifically, iOS applications are developed through a standard software development kit (“SDK”) to run within the “Cocoa Touch” runtime environment of iOS, which provides basic application infrastructure and support for key iOS features such as touch-based input, push notifications, and system services. Therefore, applications written for Mac OS X do not run on iOS without porting. In addition, it may be difficult to port Mac OS X applications to iOS because of differences between user libraries and/or application framework layers of the two OSs, and/or differences in system resources of the mobile and desktop hardware.
In one embodiment consistent with OS architecture 300, an Android mobile OS and a full Linux OS run independently and concurrently on a modified Android kernel. In this embodiment, the Android OS may be a modified Android distribution while the Linux OS (“Hydroid”) may be a modified Debian Linux desktop OS. FIGS. 4-6 illustrate Android mobile OS 430, Android kernel 520, and Hydroid OS 660 that may be employed in OS architecture 300 in more detail, according to various embodiments.
As illustrated in FIG. 4, Android OS 430 includes a set of C/C++ libraries in libraries layer 432 that are accessed through application framework layer 440. Libraries layer 432 includes the “bionic” system C library 439 that was developed specifically for Android to be smaller and faster than the “glibc” Linux C-library. Libraries layer 432 also includes inter-process communication (“IPC”) library 436, which includes the base classes for the “Binder” IPC mechanism of the Android OS. Binder was developed specifically for Android to allow communication between processes and services. Other libraries shown in libraries layer 432 in FIG. 4 include media libraries 435 that support recording and playback of media formats, surface manager 434 that manages access to the display subsystem and composites graphic layers from multiple applications, 2D and 3D graphics engines 438, and lightweight relational database engine 437. Other libraries that may be included in libraries layer 432 but are not pictured in FIG. 4 include bitmap and vector font rendering libraries, utilities libraries, browser tools (i.e., WebKit, etc.), and/or secure communication libraries (i.e., SSL, etc.).
Application framework layer 440 of Android OS 430 provides a development platform that allows developers to use components of the device hardware, access location information, run background services, set alarms, add notifications to the status bar, etc. Framework layer 440 also allows applications to publish their capabilities and make use of the published capabilities of other applications. Components of application framework layer 440 of Android mobile OS 430 include activity manager 441, resource manager 442, window manager 443, dock manager 444, hardware and system services 445, desktop monitor service 446, multi-display manager 447, and remote communication service 448. Other components that may be included in framework layer 440 of Android mobile OS 430 include a view system, telephony manager, package manager, location manager, and/or notification manager, among other managers and services.
Applications running on Android OS 430 run within the Dalvik virtual machine 431 in the Android runtime environment 433 on top of the Android object-oriented application framework. Dalvik virtual machine 431 is a register-based virtual machine, and runs a compact executable format that is designed to reduce memory usage and processing requirements. Applications running on Android OS 430 include home screen 451, email application 452, phone application 453, browser application 454, and/or other application(s) (“App(s)”) 455. Each application may include one or more application components including activities which define application screens through which the user interfaces with the application. That is, activities are processes within the Android runtime environment that manage user interaction through application screens. Other application components include services for performing long-running operations or non-user interface features, content providers for managing shared data, and broadcast receivers for responding to system broadcast messages.
The Android OS graphics system uses a client/server model. A surface manager (“SurfaceFlinger”) is the graphics server and applications are the clients. SurfaceFlinger maintains a list of display ID's and keeps track of assigning applications to display ID's. In one embodiment, mobile computing device 110 has multiple touch screen displays 116. In this embodiment, display ID 0 is associated with one of the touch screen displays 116 and display ID 1 is associated with the other touch screen display 116. Display ID 2 is associated with both touch screen displays 116 (i.e., the application is displayed on both displays at the same time).
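The display ID bookkeeping described above can be sketched as a small registry; this is a minimal illustrative model (the class and method names are invented, not SurfaceFlinger's actual API), assuming display IDs 0 and 1 name the two physical touch screens and display ID 2 mirrors an application to both:

```python
# Sketch of a SurfaceFlinger-style display ID registry (hypothetical names).
# Display 0 and 1 are the two physical touch-screen displays; display 2
# shows an application on both displays at the same time.

class DisplayRegistry:
    DISPLAY_A, DISPLAY_B, DISPLAY_BOTH = 0, 1, 2

    def __init__(self):
        self._assignments = {}  # application name -> display ID

    def assign(self, app, display_id):
        # keep track of which display ID each application is assigned to
        self._assignments[app] = display_id

    def screens_for(self, app):
        """Return the set of physical displays the application appears on."""
        display_id = self._assignments[app]
        if display_id == self.DISPLAY_BOTH:
            return {self.DISPLAY_A, self.DISPLAY_B}
        return {display_id}

registry = DisplayRegistry()
registry.assign("browser", DisplayRegistry.DISPLAY_A)
registry.assign("video", DisplayRegistry.DISPLAY_BOTH)
```

Looking up `screens_for("video")` under this model yields both physical displays, matching the mirrored display ID 2 case above.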
For each display device associated with a display ID, Android maintains a graphics context and frame buffer associated with the display device. In one embodiment, display ID's greater than 2 are virtual displays, meaning that they are not associated with a display physically present on mobile computing device 110.
Graphics information for Android applications and/or activities includes windows, views, and canvasses. Each window, view, and/or canvas is implemented with an underlying surface object. Surface objects are double-buffered (front and back buffers) and synchronized across processes for drawing. SurfaceFlinger maintains all surfaces in a shared memory pool which allows all processes within Android to access and draw into them without expensive copy operations and without using a server-side drawing protocol such as X-Windows. Applications always draw into the back buffer while SurfaceFlinger reads from the front buffer. SurfaceFlinger creates each surface object, maintains all surface objects, and also maintains a list of surface objects for each application. When the application finishes drawing in the back buffer, it posts an event to SurfaceFlinger, which swaps the back buffer to the front and queues the task of rendering the surface information to the frame buffer.
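The double-buffered draw/post/swap cycle described above can be modeled in a few lines. This is a hedged sketch of the mechanism only (the `Surface` class and its methods are illustrative, not the real surface object): the client draws only into the back buffer, and posting swaps buffers so the compositor never reads a half-drawn frame:

```python
# Minimal model of a double-buffered surface: the application draws into the
# back buffer while the compositor reads the front buffer; posting an event
# swaps the two. Names are illustrative, not the actual Android classes.

class Surface:
    def __init__(self):
        self.front = None   # what the compositor (SurfaceFlinger) reads
        self.back = None    # what the application draws into

    def draw(self, pixels):
        # client side: always draw into the back buffer
        self.back = pixels

    def post(self):
        # client signals it finished drawing; the server swaps back -> front
        self.front, self.back = self.back, self.front

s = Surface()
s.draw("frame 1")
s.post()           # compositor now reads "frame 1" from the front buffer
s.draw("frame 2")  # next frame is drawn without disturbing the front buffer
```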
SurfaceFlinger monitors all window change events. When one or more window change events occur, SurfaceFlinger renders the surface information to the frame buffer for one or more displays. Rendering includes compositing the surfaces, i.e., composing the final image frame based on dimensions, transparency, z-order, and visibility of the surfaces. Rendering may also include hardware acceleration (e.g., OpenGL 2D and/or 3D interface for graphics processing hardware). SurfaceFlinger loops over all surface objects and renders their front buffers to the frame buffer in their Z order.
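The compositing loop just described can be sketched as follows; this is a simplified model (frame buffer as a pixel dictionary, surfaces as tuples) that captures only the Z-order and visibility rules, not the real renderer:

```python
# Sketch of Z-order compositing: iterate surfaces in ascending Z order so
# higher-Z (topmost) surfaces overwrite lower ones; invisible surfaces are
# skipped. The frame buffer is modeled as a dict of (x, y) -> color.

def composite(surfaces):
    """surfaces: list of (z, visible, pixels) where pixels maps (x, y) -> color."""
    framebuffer = {}
    for z, visible, pixels in sorted(surfaces, key=lambda s: s[0]):
        if not visible:
            continue
        framebuffer.update(pixels)  # topmost surface wins at each pixel
    return framebuffer

frame = composite([
    (1, True, {(0, 0): "red", (1, 0): "red"}),   # background window
    (2, True, {(0, 0): "blue"}),                 # dialog drawn on top
    (3, False, {(1, 0): "green"}),               # hidden surface, skipped
])
```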
FIG. 5 illustrates modified Android kernel 520 in more detail, according to various embodiments. Modified Android kernel 520 includes touch-screen display driver 521, camera driver(s) 522, Bluetooth driver(s) 523, shared memory allocator 524, IPC driver(s) 525, USB driver(s) 526, WiFi driver(s) 527, I/O device driver(s) 528, and/or power management module 530. I/O device driver(s) 528 includes device drivers for external I/O devices, including devices that may be connected to mobile computing device 110 through port 120. Modified Android kernel 520 may include other drivers and functional blocks including a low memory killer, kernel debugger, logging capability, and/or other hardware device drivers.
FIG. 6 illustrates Hydroid OS 660 in more detail, according to various embodiments. Hydroid is a full Linux OS that is capable of running almost any application developed for standard Linux distributions. In particular, libraries layer 662 of Hydroid OS 660 includes Linux libraries that support networking, graphics processing, database management, and other common program functions. For example, user libraries 662 may include the “glibc” Linux C library 664, Linux graphics libraries 662 (e.g., GTK, OpenGL, etc.), Linux utilities libraries 661, Linux database libraries, and/or other Linux user libraries. Applications run on Hydroid within an X-Windows Linux graphical environment using X-Server 674, window manager 673, and/or desktop environment 672. Illustrated applications include word processor 681, email application 682, spreadsheet application 683, browser 684, and other application(s) 685.
The Linux OS graphics system is based on the X-windows (or “X11”) graphics system. X-windows is a platform-independent, networked graphics framework. X-windows uses a client/server model where the X-server is the graphics server and applications are the clients. The X-server controls input/output hardware associated with the Linux OS such as displays, touch-screen displays, keyboards, pointing device(s), etc. In this regard, X-windows provides a server-side drawing graphics architecture, i.e., the X-server maintains the content for drawables including windows and pixmaps. X-clients communicate with the X-server by exchanging data packets that describe drawing operations over a communication channel. X-clients access the X communication protocol through a library of standard routines (the “Xlib”). For example, an X-client may send a request to the X-server to draw a rectangle in the client window. The X-server sends input events to the X-clients, for example, keyboard or pointing device input, and/or window movement or resizing. Input events are relative to client windows. For example, if the user clicks when the pointer is within a window, the X-server sends a packet that includes the input event to the X-client associated with the window that includes the action and positioning of the event relative to the window.
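The server-side drawing model above can be sketched as a toy request/event exchange. This is a conceptual model only, assuming invented packet shapes and class names; it is not the real X protocol or Xlib API, but it shows the two directions of traffic: drawing requests flow client-to-server, input events flow server-to-client:

```python
# Toy model of the X-windows server-side drawing architecture: the server
# owns drawable content and routes input events to the client owning the
# window. All names and packet formats are illustrative.

class XClient:
    def __init__(self):
        self.events = []  # input events delivered by the server

class XServer:
    def __init__(self):
        self.drawables = {}  # window id -> list of drawing ops (server-side state)
        self.owners = {}     # window id -> owning client

    def create_window(self, client, window_id):
        self.drawables[window_id] = []
        self.owners[window_id] = client

    def handle_request(self, packet):
        # a client packet describing a drawing operation, e.g. draw a rectangle
        self.drawables[packet["window"]].append((packet["op"], packet["args"]))

    def dispatch_input(self, window_id, event):
        # input events are delivered relative to the client's window
        self.owners[window_id].events.append((window_id, event))

server = XServer()
client = XClient()
server.create_window(client, 1)
server.handle_request({"op": "draw_rect", "window": 1, "args": (0, 0, 10, 10)})
server.dispatch_input(1, ("button_press", 4, 5))
```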
Because of the differences in operating system frameworks, graphics systems, and/or libraries, applications written for Android do not generally run on Hydroid OS 660 and applications written for standard Linux distributions do not generally run on Android OS 430. In this regard, applications for Android OS 430 and Hydroid OS 660 are not bytecode compatible, meaning compiled and executable programs for one do not run on the other.
In one embodiment, Hydroid OS 660 includes components of a cross-environment communication framework that facilitates communication with Android OS 430 through shared kernel 520. These components include IPC library 663 that includes the base classes for the Binder IPC mechanism of the Android OS and remote communications service 671.
In one embodiment, Hydroid OS 660 is run within a chrooted (created with the ‘chroot’ command) secondary execution environment created within the Android root environment. Processes and applications within Hydroid OS 660 are run within the secondary execution environment such that the apparent root directory seen by these processes and applications is the root directory of the secondary execution environment. In this way, Hydroid OS 660 can run programs written for standard Linux distributions without modification because Linux user libraries 662 are available to processes running on Hydroid OS 660 in the chrooted secondary execution environment.
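The apparent-root behavior of the chrooted secondary environment can be illustrated with a path-translation sketch. Since `os.chroot` itself requires root privileges, this models only the remapping the kernel performs: a path seen inside the chroot resolves under the secondary root on the host. The mount point `/data/hydroid` is a hypothetical example, not a path from the disclosure:

```python
# Sketch of chroot path remapping: processes in the secondary execution
# environment see "/" as the secondary root directory, while the kernel
# resolves the same path under the host's real root.
import posixpath

SECONDARY_ROOT = "/data/hydroid"  # hypothetical chroot mount point

def host_path(chroot_path):
    """Translate a path as seen inside the chroot to the host's view."""
    # normalize against "/" first so ".." cannot escape the secondary root
    normalized = posixpath.normpath("/" + chroot_path.lstrip("/"))
    return posixpath.join(SECONDARY_ROOT, normalized.lstrip("/"))
```

Under this model, a Hydroid process opening `/lib/libc.so` actually touches the corresponding file inside the secondary environment's tree, which is why unmodified Linux binaries find their expected library paths.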
Referring back to FIG. 1, mobile computing device 110 may associate a connected secondary terminal environment 140 with desktop OS 160. In this configuration, computing environment 100 presents a first computing experience through a first active user environment 115 associated with mobile OS 130, and, concurrently, a second computing experience through the second active user environment 140 associated with desktop OS 160.
FIG. 7 illustrates a computing environment 700 with multiple active user environments, according to various embodiments. In computing environment 700, the mobile computing device 110 presents a first active user environment associated with the mobile OS 130 that includes touch-screen display(s) 116 and other I/O devices 118 of mobile computing device 110. As illustrated in FIG. 7, the user interface 750 of the mobile OS 130 is displayed on touch-screen display 116. In computing environment 700, mobile computing device 110 is docked to a second active user environment 140 through dock interface 122. The second active user environment 140 includes display monitor 144, keyboard 146-1, and/or pointing device 146-2. In this regard, the second active user environment 140 provides a desktop-like computing experience. When docked through dock interface 122, the mobile computing device 110 may associate desktop OS 160 with the second active user environment 140 such that the user interface 780 of the desktop OS 160 is displayed on the display 144 of the second active user environment 140. As illustrated in FIG. 7, mobile computing device 110 is connected to components of the second user environment 140 through a dock cradle 141 and/or dock cable 143. However, dock interface 122 may include different connectors including other wired or wireless connections to components of the second active user environment 140.
As described above, the first active user environment defined by mobile computing device 110 and the second active user environment 140 may provide different computing experiences. In addition, mobile OS 130 and desktop OS 160 may have different sets of available applications, meaning that at least some applications available on mobile OS 130 are not available on desktop OS 160 and vice-versa. As such, the configuration of computing environment 700 provides the advantages of two separate active user environments suited to different computing experiences. However, in some instances the user may wish to access various Apps and/or capabilities of one operating system through the active user environment associated with a different operating system. For example, the user may wish to access mobile telephony, location awareness capabilities, and/or other applications and/or services of mobile OS 130 through the second active user environment 140 associated with desktop OS 160.
Because the graphics environments of mobile and desktop OSs are often different, an application running on mobile OS 130 generally may not be displayed within the user environment associated with desktop OS 160 simply by re-directing the graphics information from the graphics server of the mobile OS 130 to the graphics server of the desktop OS 160. However, various techniques may be used to display application screens of applications running on mobile OS 130 within a console window of secondary user environment 140 associated with desktop OS 160. These techniques are described in more detail in U.S. patent application Ser. No. 13/246,665, filed Sep. 27, 2011, entitled “INSTANT REMOTE RENDERING,” the entire contents of which are incorporated herein for all purposes. Accordingly, one or more application screens displayed in windows 782, 784, and/or 786 of computing environment 700 may correspond to applications running on mobile OS 130.
Consider that initially a user is interacting with mobile computing device 110 in an undocked state. FIG. 8 shows a computing environment 800 which illustrates mobile computing device 110 in an undocked state with multiple applications running on mobile OS 130, according to various embodiments. In computing environment 800, desktop OS 160 may be in a suspended state.
A first application 832 may be running on mobile OS 130 and displayed through application screen 852 on display 116 of the first active user environment 115 defined by mobile computing device 110. Other applications may be running on mobile OS 130 but not actively displayed. For example, application 834 may represent an application that has been started by the user and not explicitly closed. In this instance, the first application 832 is in a foreground interaction state and application 834 is in a background interaction state. In one instance, the user may have started application 834 in the foreground and subsequently started application 832, which then replaced application 834 as the foreground application. In this instance, when application 834 is moved to the background of the first active user environment 115, it may be paused, stopped, and/or suspended. This means that mobile OS 130 does not continue to process instructions or pass user input to processes associated with application 834 beyond instructions that may be needed to put the application into the paused, stopped, and/or suspended state. In addition, some processes associated with application 834 may be stopped or killed. For example, mobile OS 130 may destroy an application screen process associated with a suspended or stopped application if the corresponding memory is in demand for other processes. Accordingly, application screen processes for applications such as application 834 may not be depended on for graphics information once the application is in a background state.
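The foreground-to-background transition described above can be sketched as a small state machine. This is an interpretive model (the class, state names, and `memory_pressure` flag are assumptions for illustration): backgrounding pauses the application, and under memory pressure the OS may additionally reclaim its application screen process:

```python
# Hedged sketch of interaction states: moving an application to the
# background pauses it, and the OS may kill its screen process to reclaim
# memory, so its graphics information cannot be relied on afterwards.

class App:
    def __init__(self, name):
        self.name = name
        self.state = "foreground"
        self.screen_process_alive = True

    def move_to_background(self, memory_pressure=False):
        # the OS stops processing instructions / passing input to the app
        self.state = "paused"
        if memory_pressure:
            # the screen process may be destroyed if its memory is in demand
            self.screen_process_alive = False

app_834 = App("application 834")
app_834.move_to_background(memory_pressure=True)
```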
In a computing system with only one active user environment 115 defined by mobile computing device 110, pausing, stopping, and/or suspending background applications is preferred because it reduces processing requirements related to applications that are not being actively interacted with by the user. For example, the user may not be able to interact with more than one application through the first active user environment 115 defined by mobile computing device 110. The user can switch between applications (i.e., change the interaction states of applications) using several techniques including returning to a home screen and selecting the desired application. Processes associated with background applications, including application screen processes, may be restarted or resumed when the application is returned to the foreground state.
Embodiments are directed to providing a seamless workflow when mobile computing device 110 is docked with a secondary terminal environment by providing configurable cross-environment application screen behavior. FIG. 9 shows computing environment 800 a which illustrates automatic cross-environment assortment of application screens, according to various embodiments. For example, in FIG. 9, mobile computing device 110 is docked to secondary terminal environment 140 through interface 122. Upon docking with secondary terminal environment 140, mobile computing device 110 may recognize that secondary terminal environment 140 has a user experience profile associated with desktop OS 160. Accordingly, desktop OS 160 may be unsuspended and associated with secondary terminal environment 140 to provide a second active user environment through which the user can interact with desktop OS 160.
To provide a more seamless user experience, one or more applications running on mobile OS 130 may be configured to be automatically displayed on display 144 of the second active user environment 140 using the cross-environment display techniques described above when mobile computing device 110 is docked. That is, one or more applications may have a user preference setting that determines if the application should be displayed within a second user environment when the mobile computing device 110 detects a docked condition. The user preference setting may include multiple settings according to different user experience profiles of various secondary terminal environments. For example, user settings may determine that an application should be automatically displayed across user environments for a first type of secondary terminal environment (e.g., desktop-like secondary terminal environment), and display of the application maintained on the first user environment for a second type of secondary terminal environment (e.g., tablet-like secondary terminal environment).
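The per-profile preference lookup described above can be sketched as a simple table. This is a hypothetical data shape, not the disclosure's actual settings store; it shows how a single application can carry different cross-environment display settings for desktop-like versus tablet-like secondary terminal environments:

```python
# Illustrative user preference table: whether an application's screen moves
# to the secondary environment on dock depends on the docked environment's
# user experience profile. Application and profile names are invented.

CROSS_ENV_PREFS = {
    "browser": {"desktop": True, "tablet": False},   # move on desktop dock only
    "phone":   {"desktop": False, "tablet": False},  # always stays on the handset
}

def display_on_secondary(app, profile):
    # default to keeping the application on the first user environment
    return CROSS_ENV_PREFS.get(app, {}).get(profile, False)
```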
In the embodiment according to FIG. 9, applications 832 and 834 are displayed on display 144 of the second active user environment 140 automatically when the mobile computing device 110 is docked. Specifically, a first console application 962 displays application screen 952 associated with application 832 within console window 972 on display 144. Likewise, a second console application 964 displays application screen 954 associated with application 834 within console window 974 on display 144. Assortment of console windows within the second active user environment 140 may occur in a variety of ways according to user preferences. In FIG. 9, console windows 972 and 974 are tiled horizontally on display 144. However, other window assortment configurations may be chosen by the user by selecting configuration settings. For example, assortment of console windows for cross-environment application display may be done by cascading console windows or other display layering and/or tiling techniques. In embodiments, window assortment configurations may be independently set according to user experience profiles of various secondary terminal environments.
Once mobile computing device 110 is docked, the multiple active user environments may be used in a variety of ways to provide a seamless computing experience. In one configuration, the user may interact with mobile OS 130 including applications available on mobile OS 130 through a first active user environment 115. At the same time, the user may interact with desktop OS 160 including applications available on desktop OS 160 through the second active user environment 140. The user may interact with applications available on mobile OS 130 through the active user environment associated with desktop OS 160 using the cross-environment application display techniques described above.
To access applications available on mobile OS 130 from the active user environment associated with desktop OS 160, the user may want to browse other applications available on mobile OS 130 through the second active user environment 140. In embodiments, menu icons or other elements of the GUI of desktop OS 160 may be used to browse and access applications available on mobile OS 130. In some embodiments, the user may want to browse all running applications on mobile OS 130 (i.e., applications that are in a foreground interaction state and other applications that the user may have opened on mobile OS 130 and not explicitly shut down). However, the lack of maintained or updated graphics information for background applications in mobile OS 130 presents issues for graphical cross-environment navigation and access of applications running in mobile OS 130 from the active user environment associated with desktop OS 160.
Embodiments are directed to supporting graphical browsing and navigation in a computing environment with multiple active user environments. FIG. 10 shows computing environment 800 b which illustrates cross environment application preview and navigation of foreground and background applications, according to various embodiments. In computing environment 800 b, mobile OS 130 and desktop OS 160 run concurrently on shared kernel 320 of mobile computing device 110. Mobile OS 130 is associated with the first active user environment 115 through which the user may interact with application screens of applications running on mobile OS 130. Desktop OS 160 is associated with the second active user environment 140 through which the user may interact with application screens of applications running on desktop OS 160. A first application screen 852, associated with a first application 832 running on mobile OS 130, is displayed on a display 116 of the first active user environment 115. That is, the first application 832 is in a foreground interaction state on the first active user environment 115. A second application screen 1054, associated with a second application 834 running on mobile OS 130, is displayed on the display 144 of the second user environment 140. Specifically, application screen 1054 is displayed within a console window 1062 associated with a console application 1060 running on desktop OS 160, as described above.
In computing environment 800 b, other applications may be running on mobile OS 130 but not actively displayed. For example, applications 1036, 1038, and/or 1040 may represent applications that have been started by the user and not explicitly closed as described above. In this instance, applications 1036, 1038, and/or 1040 may be paused, stopped, and/or suspended by mobile OS 130. As described above, this means that application screen processes associated with these applications may be stopped, killed, or otherwise not maintained by mobile OS 130. As such, graphical information for application screen processes associated with these applications may not be maintained while the applications are in the paused, stopped, and/or suspended state.
In embodiments, mobile OS 130 includes functionality for capturing a last graphical representation of applications before the application is swapped from a foreground interaction state to a background interaction state. For example, mobile OS 130 may capture a bitmap image and/or a copy of surface information for a current state of an application screen of a foreground application just before the application is transitioned to the background (i.e., before the interaction state change from the foreground state to the background state). In embodiments, the last graphical representation is captured in a bitmap server of the mobile OS as described in more detail below.
In embodiments, mobile OS 130 maintains an application activity stack 1042 that includes a list of applications that have been started on mobile OS 130 and not explicitly closed. For example, application activity stack 1042 may include place-holders for applications 832, 834, 1036, 1038, and 1040. Place-holders in application activity stack 1042 for applications 832 and 834 (i.e., applications that are in the foreground) may be references to the application processes. As such, to provide a graphical representation of the application, mobile OS 130 may use the current graphics and/or surface information from the application screen process associated with the application. The place-holder for applications 1036, 1038, and/or 1040 (i.e., applications currently in the background of mobile OS 130) may include a reference to the last graphical representation captured by mobile OS 130.
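The place-holder scheme above can be sketched as a stack whose entries hold either a live reference (foreground applications) or a captured last representation (background applications). The class and placeholder structure are assumptions made for illustration, not the actual data structures of the disclosure:

```python
# Sketch of an application activity stack: foreground entries reference the
# live application screen process; background entries reference the last
# graphical representation captured before the interaction state change.

class ActivityStack:
    def __init__(self):
        self.entries = []  # list of (app, placeholder), most recent first

    def push_foreground(self, app, live_surface):
        self.entries.insert(0, (app, {"live": live_surface}))

    def move_to_background(self, app, last_bitmap):
        # swap the live reference for the captured last representation
        self.entries = [
            (a, {"snapshot": last_bitmap} if a == app else ph)
            for a, ph in self.entries
        ]

    def preview(self, app):
        # current graphics for foreground apps, stored snapshot otherwise
        for a, ph in self.entries:
            if a == app:
                return ph.get("live") or ph.get("snapshot")

stack = ActivityStack()
stack.push_foreground("email", live_surface="email surface")
stack.push_foreground("browser", live_surface="browser surface")
stack.move_to_background("email", last_bitmap="email snapshot")
```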
As illustrated in FIG. 10, the last graphical representation may be used to provide a graphical browsing feature within desktop OS 160 of preview screens associated with background applications of mobile OS 130. Specifically, desktop OS 160 may present a preview window 1070 that shows preview screens associated with applications in application activity stack 1042. For example, preview screen 1072 may show a preview representation of a currently active application screen 852 associated with application 832. Preview screen 1074 may also show a preview representation of a currently active application screen 1054 associated with a currently active application 834 that is displayed within a console window 1062 associated with a console application 1060 of desktop OS 160 as described above. Additionally, preview window 1070 may show preview screens for background applications of mobile OS 130. For example, preview window 1070 may present preview screens 1076, 1078, and/or 1080 that represent last graphical representations of application screens associated with background applications 1036, 1038, and/or 1040. Mobile OS 130 may capture these last graphical representations and store them with a list of applications (e.g., via application activity stack 1042, etc.) that are currently running (i.e., started by the user and not explicitly closed).
Preview window 1070 may be activated within desktop OS 160 in a variety of ways. As illustrated in FIG. 10, preview window 1070 may be activated by selecting a menu icon 1066 with a mouse pointer 1064 within the GUI of desktop OS 160. Upon receiving a user input indicating that a preview window is to be displayed, desktop OS 160 queries mobile OS 130 for a list of applications on the application activity stack and the associated graphical preview representations. In response to the query, mobile OS 130 returns a list corresponding to the application activity stack and provides desktop OS 160 with the graphical preview representations associated with applications in the list. For example, mobile OS 130 may return a bitmap image for the preview screens or a pointer to a shared memory location that includes the preview screen graphics information.
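The query/response exchange above can be sketched as follows. The message shape is invented for illustration (the disclosure does not specify a wire format); the point is that the mobile-OS side answers one query with the stack order plus a preview reference per application:

```python
# Toy version of the preview-window query: the desktop OS asks the mobile OS
# for the activity stack and one graphical preview per application; the
# response pairs each stack entry with its preview (a bitmap or a reference
# to shared memory). The message format is hypothetical.

def handle_preview_query(activity_stack, previews):
    """Mobile-OS side: answer a preview-window query from the desktop OS."""
    return [{"app": app, "preview": previews[app]} for app in activity_stack]

response = handle_preview_query(
    activity_stack=["app 832", "app 1036"],
    previews={"app 832": "live frame", "app 1036": "last bitmap"},
)
```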
The user may select preview screens within preview window 1070 to reconfigure the active application screens. For example, the user may select preview screen 1072 to switch the currently displayed user environment for application 832 from the first active user environment 115 to the second active user environment 140. The user may select any of preview screens 1076, 1078, or 1080 to resume the corresponding application 1036, 1038, or 1040 within the second active user environment 140. That is, selecting one of preview screens 1076, 1078, or 1080 will open a new console window on display 144 (and corresponding console application running on desktop OS 160) and resume the corresponding application, with a restarted or resumed application screen associated with the application displayed within the new console window on display 144. Accordingly, the user can graphically browse and navigate running applications of mobile OS 130 through preview window 1070 of desktop OS 160, even where those applications are in a background state and graphical information for associated preview screens is unavailable.
As described above, embodiments include an Android mobile OS 430 and a full Linux desktop OS 660 (“Hydroid”) running on a modified Android kernel 520. These embodiments may include a window position control feature within the Android OS 430 that includes functionality for managing application preview screens and providing application preview screen data to components of Hydroid OS 660. FIG. 11 illustrates components of the window position control system 1100 of Android OS 430 that implement window position control features allowing the user to view and navigate through applications using multiple displays and/or multiple active user environments. The components of window position control system 1100 include services, managers, and/or application level components of Android OS 430.
The offscreen gesture controller 1104 (“OSGC”) and the navigation manager 1110 are user interface components that manipulate a linear model of the Android OS application stack by way of an intermediary known as the application model manager 1112. The application model manager 1112 provides methods to examine and modify the linear model, using the Android activity manager service 441 to do so. In effect, the application model manager 1112 transforms the stack view of Android OS 430 into the linear view presented in the navigation manager 1110. Both the OSGC 1104 (together with the transition controller 1106) and the navigation manager 1110 manipulate snapshot views of application screens. The navigation manager 1110 may use bitmap images of application screens, while the OSGC 1104 works with live (or apparently live) application screens themselves. To support both of these components, the bitmap server 1114 provides mechanisms for rapid capture, restore, and manipulation of application screen images via JNI calls to C++ code designed to minimize image data transfers.
The bitmap server 1114 is a class that resides in the Android surface manager (i.e., SurfaceFlinger) 434. The bitmap server 1114 provides a highly efficient mechanism to manipulate windows such that the window position control system 1100 can provide intuitive visual effects for application browsing and navigation. The bitmap server 1114 provides an interface to the Java layers (e.g., application layer 450, runtime environment 431) such that window manipulation may be controlled from within Java layers.
The bitmap server 1114 maintains a handle and reference to all active surfaces so that window animations can be applied to any active surfaces and created based on the data within the active surface. When a surface is destroyed by SurfaceFlinger, the bitmap server persists a copy of the surface data so that it can be used to provide window animations for those applications that are no longer active. The bitmap server 1114 also provides bitmap images of surface data to the application model manager and destroys resources (references to surfaces, and persisted surface data) when invoked to do so by the application model manager. The bitmap server 1114 also creates window transition graphics from surface data including arranging window surfaces in a particular Z-order and/or creating new surfaces from rectangle coordinates of existing surfaces for transitions.
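The persistence rule just described (keep a copy of surface data when SurfaceFlinger destroys the surface) can be sketched briefly. The class and method names here are stand-ins, not the bitmap server's actual interface:

```python
# Sketch of the bitmap server's persistence behavior: live surfaces are
# referenced directly; when a surface is destroyed, a copy of its data is
# persisted so animations can still be rendered for inactive applications.

class BitmapServer:
    def __init__(self):
        self.active = {}     # surface id -> live surface data
        self.persisted = {}  # surface id -> copied data of destroyed surfaces

    def surface_created(self, sid, data):
        self.active[sid] = data

    def surface_destroyed(self, sid):
        # persist a copy before the live surface goes away
        self.persisted[sid] = self.active.pop(sid)

    def bitmap_for(self, sid):
        # prefer live data; fall back to the persisted copy
        return self.active.get(sid, self.persisted.get(sid))

server = BitmapServer()
server.surface_created(7, "pixels for app 1036")
server.surface_destroyed(7)
```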
The offscreen gesture controller 1104 interprets user gestures, utilizing a factory class to create various types of transition objects that provide the on-screen visual effects corresponding to the type of gesture being performed. More particularly, the offscreen gesture controller may be a class in an existing system service that receives motion events from the window manager service 443 and processes them. Upon detecting the beginning of an offscreen gesture, the offscreen gesture controller may invoke the transition controller factory to create an instance of the appropriate transition controller type. The offscreen gesture controller may filter out motion events so that unintentional offscreen gestures are not processed, record statistics for all motion events within the scope of the gesture so that gesture velocity may be calculated, and pass “move” motion events to the current transition controller so it can process the events and direct the required visual animation. Upon receiving a response from the current transition controller indicating it cannot continue movement in a given direction, the offscreen gesture controller may invoke the transition controller factory 1102 to create a new instance of a transition controller so that gesture processing can continue. Upon receipt of an up motion event, the offscreen gesture controller may invoke the completion method of the current transition controller to automatically complete the transition. Upon completion of the transition, the window manager service 443 may freeze the display so that configuration changes can be processed by applications without unwanted visual effects.
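The dispatch loop just described (create a controller on gesture start, forward “move” events, swap controllers when one cannot continue, complete on “up”, and accumulate statistics for velocity) can be sketched as follows. All names and the velocity formula are illustrative assumptions, not the patented implementation.

```python
class SwipeController:
    """Stand-in transition controller (hypothetical interface)."""

    def __init__(self, limit):
        self.limit = limit
        self.completed = False

    def move(self, pos):
        # Returns False when movement cannot continue in this direction
        return pos <= self.limit

    def complete(self):
        self.completed = True


def process_motion_events(events, controller_factory):
    """Sketch of the offscreen gesture controller's dispatch loop.
    events: (kind, position) tuples; controller_factory plays the
    role of the transition controller factory."""
    controller, positions = None, []
    for kind, pos in events:
        if kind == "down":
            controller = controller_factory(pos)
        elif kind == "move" and controller is not None:
            positions.append(pos)  # statistics for velocity calculation
            if not controller.move(pos):
                # Cannot continue: create a new controller and go on
                controller = controller_factory(pos)
        elif kind == "up" and controller is not None:
            controller.complete()  # auto-complete the transition
    # Average per-event displacement stands in for gesture velocity
    if len(positions) < 2:
        return 0
    return (positions[-1] - positions[0]) / (len(positions) - 1)


made = []

def factory(pos):
    c = SwipeController(limit=10)
    made.append(c)
    return c

velocity = process_motion_events(
    [("down", 0), ("move", 4), ("move", 8), ("up", 8)], factory)
assert velocity == 4 and len(made) == 1 and made[0].completed
```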
The transition controller factory 1102 may receive invocations from the off-screen gesture controller 1104 to create instances of transition controllers. Upon invocation to create a transition controller instance, the transition controller factory may retrieve state data and handles to application surface data from the application model manager 1112. For example, the transition controller factory may retrieve state data and handles to graphical information for applications that are visible, for applications that will become visible due to the transition, and/or on which display the relevant applications are visible and the type of information for each application (e.g., application display properties). Upon invocation from the offscreen gesture controller 1104, the transition controller factory creates a transition controller of the appropriate type based on the direction of the gesture, which applications are visible (at the top of the stack), which application screen will become visible because of window movement, and/or on which display the application screen is intended to be visible. The transition controller factory 1102 may be a class residing in an existing mobile OS system service. In one embodiment, the transition controller factory 1102 is a class of the Android OS 430 system service.
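The factory's selection step can be illustrated with a small function: the controller type follows from the gesture direction and from which application screens are visible or about to become visible. The type names and state fields below are hypothetical, chosen only to show the shape of the decision.

```python
def create_transition_controller(gesture_direction, model_state):
    """Illustrative sketch of the transition controller factory's
    selection logic (hypothetical controller types and state keys)."""
    outgoing = model_state["visible_app"]       # top of the stack
    incoming = model_state["next_app"]          # becomes visible
    if gesture_direction in ("left", "right"):
        kind = "horizontal-swap"                # assumed type name
    else:
        kind = "vertical-reveal"                # assumed type name
    return {
        "kind": kind,
        "outgoing": outgoing,
        "incoming": incoming,
        "display": model_state.get("display", 0),  # target display
    }


ctl = create_transition_controller(
    "left", {"visible_app": "mail", "next_app": "maps"})
assert ctl["kind"] == "horizontal-swap" and ctl["incoming"] == "maps"
```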
FIG. 12 illustrates a process flow 1200 for cross-environment application navigation using a captured last graphical representation of an application screen, according to various embodiments. In a first block 1202 of process flow 1200, a first application is run on a first operating system of a mobile computing device. At a second block 1204, a first application screen associated with the application is displayed on an active display device. For example, the application screen may be displayed on a display device of the mobile computing device that defines a first active user environment. At block 1206, an application interaction state change event is received that indicates the current interaction state for the first application is to be changed from a foreground state to a background state. For example, another application running on the first operating system may be started or swapped to the foreground on the display device. As described above, a transition controller may be invoked by the transition controller factory 1102 to handle the interaction state change.
At block 1208, the first operating system captures a graphical representation of the first application screen. For example, the first operating system may include a bit-map server in a surface manager that captures a bit-map representation of the active application screen surfaces. At block 1210, the interaction state of the first application is changed from the foreground state to the background state. At this stage, processes associated with the first application screen may be paused, stopped, and/or suspended by the first operating system. The first operating system may maintain the first application on an application activity stack, and, at block 1212, the first operating system associates a position of the application activity stack with the captured bit-map image.
At block 1214, an input command is received related to the application activity stack. The input command may be received within a second active user environment. The second active user environment may be associated with the first operating system or a second operating system running concurrently with the first operating system on a shared kernel of the mobile computing device. At block 1216, a representation of the bitmap image is displayed within a graphical representation of the application activity stack. The graphical representation of the application activity stack may be displayed across active user environments to allow cross-environment graphical navigation of foreground and background applications of the mobile operating system.
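Blocks 1202 through 1216 can be condensed into a minimal model of the flow: run an application, capture its last screen on the foreground-to-background transition, associate the bitmap with its stack position, and expose the stack as a navigable representation. Names are illustrative only, not an actual Android or Z124 implementation.

```python
class MobileOSModel:
    """Minimal model of process flow 1200 (blocks 1202-1216);
    all names are illustrative assumptions."""

    def __init__(self):
        # Application activity stack: app, interaction state, snapshot
        self.activity_stack = []

    def run(self, app):
        # Blocks 1202-1204: application starts in the foreground
        self.activity_stack.append(
            {"app": app, "state": "foreground", "snapshot": None})

    def to_background(self, app, capture):
        # Blocks 1206-1212: on the state change event, capture the last
        # screen, change state, and associate the bitmap with the
        # application's position in the stack
        for i, entry in enumerate(self.activity_stack):
            if entry["app"] == app:
                entry["snapshot"] = capture(app)
                entry["state"] = "background"
                return i

    def stack_view(self):
        # Block 1216: graphical representation of the activity stack,
        # suitable for cross-environment navigation
        return [(e["app"], e["snapshot"]) for e in self.activity_stack]


model = MobileOSModel()
model.run("browser")
pos = model.to_background("browser", capture=lambda a: f"bitmap:{a}")
assert pos == 0
assert model.stack_view() == [("browser", "bitmap:browser")]
```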
The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the invention to the form disclosed herein. While a number of exemplary aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, permutations, additions, and sub-combinations thereof.
The various operations of methods described above may be performed by any suitable means capable of performing the corresponding functions. The means may include various hardware and/or software component(s) and/or module(s), including, but not limited to a circuit, an application specific integrated circuit (ASIC), or processor.
The various illustrative logical blocks, modules, and circuits described may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an ASIC, a field programmable gate array (FPGA) or other programmable logic device (PLD), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any commercially available processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method or algorithm described in connection with the present disclosure, may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in any form of tangible storage medium. Some examples of storage media that may be used include random access memory (RAM), read only memory (ROM), flash memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM and so forth. A storage medium may be coupled to a processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. A software module may be a single instruction, or many instructions, and may be distributed over several different code segments, among different programs, and across multiple storage media.
The methods disclosed herein comprise one or more actions for achieving the described method. The method and/or actions may be interchanged with one another without departing from the scope of the claims. In other words, unless a specific order of actions is specified, the order and/or use of specific actions may be modified without departing from the scope of the claims.
The functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions on a tangible computer-readable medium. A storage medium may be any available tangible medium that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM, or other optical disk storage, magnetic disk storage, or other magnetic storage devices, or any other tangible medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray® disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Thus, a computer program product may perform operations presented herein. For example, such a computer program product may be a computer readable tangible medium having instructions tangibly stored (and/or encoded) thereon, the instructions being executable by one or more processors to perform the operations described herein. The computer program product may include packaging material.
Software or instructions may also be transmitted over a transmission medium. For example, software may be transmitted from a website, server, or other remote source using a transmission medium such as a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technology such as infrared, radio, or microwave.
Further, modules and/or other appropriate means for performing the methods and techniques described herein can be downloaded and/or otherwise obtained by a user terminal and/or base station as applicable. For example, such a device can be coupled to a server to facilitate the transfer of means for performing the methods described herein. Alternatively, various methods described herein can be provided via storage means (e.g., RAM, ROM, a physical storage medium such as a CD or floppy disk, etc.), such that a user terminal and/or base station can obtain the various methods upon coupling or providing the storage means to the device. Moreover, any other suitable technique for providing the methods and techniques described herein to a device can be utilized.
Other examples and implementations are within the scope and spirit of the disclosure and appended claims. For example, due to the nature of software, functions described above can be implemented using software executed by a processor, hardware, firmware, hardwiring, or combinations of any of these. Features implementing functions may also be physically located at various positions, including being distributed such that portions of functions are implemented at different physical locations. Also, as used herein, including in the claims, “or” as used in a list of items prefaced by “at least one of” indicates a disjunctive list such that, for example, a list of “at least one of A, B, or C” means A or B or C or AB or AC or BC or ABC (i.e., A and B and C). Further, the term “exemplary” does not mean that the described example is preferred or better than other examples.
Various changes, substitutions, and alterations to the techniques described herein can be made without departing from the technology of the teachings as defined by the appended claims. Moreover, the scope of the disclosure and claims is not limited to the particular aspects of the process, machine, manufacture, composition of matter, means, methods, and actions described above. Processes, machines, manufacture, compositions of matter, means, methods, or actions, presently existing or later to be developed, that perform substantially the same function or achieve substantially the same result as the corresponding aspects described herein may be utilized. Accordingly, the appended claims include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or actions.

Claims (12)

What is claimed is:
1. A method comprising:
executing, by a computing device, a first operating system of a mobile device and a second operating system of the computing device, wherein the first operating system is different from the second operating system and wherein the first operating system and the second operating system execute concurrently on a shared kernel of the computing device;
providing, by the mobile computing device, a computing environment with first and second active user environments, wherein the first active user environment is associated with the first operating system, and wherein the second active user environment is associated with the second operating system;
running, by the computing device, a first application on the first operating system executing on the shared kernel of the mobile computing device;
displaying, by the mobile computing device, a first application screen associated with the first application on an active display;
receiving, by the computing device, an application interaction state change event indicating that a current interaction state of the first application is to be changed from a foreground state to a background state;
generating, by the computing device, a bitmap image corresponding to a graphical representation of a current state of an application screen of the first application just before the first application is transitioned from the foreground state to the background state;
changing, by the mobile computing device, the current interaction state of the first application from the foreground state to the background state;
associating, by the computing device, the bitmap image of the first application with a position within an application activity stack containing the first application, wherein the application activity stack is a list of applications that have been started in the first active user environment and the second active user environment and are in background states which are not explicitly closed, wherein the application activity stack includes information about the first application and a second application, and wherein the first and second applications are different and when the mobile computing device is docked to the computing device, the application activity stack is available to be displayed on the computing device;
receiving, by the computing device, a first user input command to display the application activity stack on the computing device, which is a representation of the bitmap images of the list of applications in background states;
receiving, at the computing device, a second user command indicative of a selection of the bitmap image of the first application within the application activity stack; and
changing the current interaction state of the first application from the background state to the foreground state on the computing device.
2. The method of claim 1, further comprising:
receiving a user command indicative of a selection of the bitmap image within the graphical representation of the application activity stack; and
changing the current interaction state of the first application from the background state to the foreground state.
3. The method of claim 1, wherein the shared kernel of the mobile computing device manages task scheduling for processes of both the first operating system and the second operating system.
4. The method of claim 3, wherein an application model manager maintains the application activity stack, the application activity stack including applications that have been started by the user and not actively closed by the user.
5. The method of claim 1, wherein a process associated with the first application screen is suspended in response to the change in the current interaction state of the first application from the foreground state to the background state.
6. The method of claim 5, wherein the bitmap image is generated from graphical information maintained by a bitmap server within a surface manager of the first operating system.
7. The method of claim 6, wherein the bitmap server provides an application level interface to the bitmap image data.
8. A mobile computing device comprising:
a display;
a processor coupled with the display; and
a memory coupled with and readable by the processor and storing therein a set of instructions which, when executed by the processor, causes the processor to:
execute a first operating system of a mobile device and a second operating system of the computing device, wherein the first operating system is different from the second operating system and wherein the first operating system and the second operating system execute concurrently on a shared kernel of the computing device;
provide a computing environment with first and second active user environments, wherein the first active user environment is associated with the first operating system, and wherein the second active user environment is associated with the second operating system;
run a first application on the first operating system executing on the shared kernel of the mobile computing device;
display a first application screen associated with the first application on an active display;
receive an application interaction state change event indicating that a current interaction state of the first application is to be changed from a foreground state to a background state;
generate a bitmap image corresponding to a graphical representation of a current state of an application screen of the first application just before the first application is transitioned from the foreground state to the background state;
change the current interaction state of the first application from the foreground state to the background state;
associate the bitmap image of the first application with a position within an application activity stack containing the first application, wherein the application activity stack is a list of applications that have been started in the first active user environment and the second active user environment and are in background states which are not explicitly closed, wherein the application activity stack includes information about the first application and a second application, and wherein the first and second applications are different and when the mobile computing device is docked to the computing device, the application activity stack is available to be displayed on the computing device;
receive a first user input command to display the application activity stack on the computing device, which is a representation of the bitmap images of the list of applications in background states;
receive, at the computing device, a second user command indicative of a selection of the bitmap image of the first application within the application activity stack; and
change the current interaction state of the first application from the background state to the foreground state on the computing device.
9. The system of claim 8, wherein the instructions further cause the processor to:
receive a user command indicative of a selection of the bitmap image within the graphical representation of the application activity stack; and
change the current interaction state of the first application from the background state to the foreground state.
10. The system of claim 8, wherein the shared kernel of the mobile computing device manages task scheduling for processes of both the first operating system and the second operating system.
11. The system of claim 10, wherein an application model manager maintains the application activity stack, the application activity stack including applications that have been started by the user and not actively closed by the user.
12. The system of claim 8, wherein a process associated with the first application screen is suspended in response to the change in the current interaction state of the first application from the foreground state to the background state.
US14/834,305 2010-10-01 2015-08-24 Foreground/background assortment of hidden windows Active 2033-12-30 US10528210B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/834,305 US10528210B2 (en) 2010-10-01 2015-08-24 Foreground/background assortment of hidden windows

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US38911710P 2010-10-01 2010-10-01
US201161507199P 2011-07-13 2011-07-13
US201161507206P 2011-07-13 2011-07-13
US201161507201P 2011-07-13 2011-07-13
US201161507209P 2011-07-13 2011-07-13
US201161507203P 2011-07-13 2011-07-13
US13/399,901 US20130024812A1 (en) 2011-07-13 2012-02-17 Foreground/background assortment of hidden windows
US14/834,305 US10528210B2 (en) 2010-10-01 2015-08-24 Foreground/background assortment of hidden windows

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/399,901 Continuation US20130024812A1 (en) 2011-07-13 2012-02-17 Foreground/background assortment of hidden windows

Publications (2)

Publication Number Publication Date
US20160054862A1 US20160054862A1 (en) 2016-02-25
US10528210B2 true US10528210B2 (en) 2020-01-07

Family

ID=47506966

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/399,901 Abandoned US20130024812A1 (en) 2011-07-13 2012-02-17 Foreground/background assortment of hidden windows
US14/834,305 Active 2033-12-30 US10528210B2 (en) 2010-10-01 2015-08-24 Foreground/background assortment of hidden windows

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/399,901 Abandoned US20130024812A1 (en) 2011-07-13 2012-02-17 Foreground/background assortment of hidden windows

Country Status (2)

Country Link
US (2) US20130024812A1 (en)
WO (1) WO2013010143A2 (en)

Families Citing this family (98)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8726294B2 (en) 2010-10-01 2014-05-13 Z124 Cross-environment communication using application space API
US8933949B2 (en) 2010-10-01 2015-01-13 Z124 User interaction across cross-environment applications through an extended graphics context
US9026709B2 (en) 2010-10-01 2015-05-05 Z124 Auto-waking of a suspended OS in a dockable system
US9529494B2 (en) 2011-09-27 2016-12-27 Z124 Unified desktop triad control user interface for a browser
US9678624B2 (en) 2011-09-27 2017-06-13 Z124 Unified desktop triad control user interface for a phone manager
US9268518B2 (en) 2011-09-27 2016-02-23 Z124 Unified desktop docking rules
US8990713B2 (en) 2011-09-27 2015-03-24 Z124 Unified desktop triad control user interface for an application manager
US8990712B2 (en) 2011-08-24 2015-03-24 Z124 Unified desktop triad control user interface for file manager
US20130024812A1 (en) 2011-07-13 2013-01-24 Z124 Foreground/background assortment of hidden windows
US20130024778A1 (en) 2011-07-13 2013-01-24 Z124 Dynamic cross-environment application configuration/orientation
US20130076592A1 (en) 2011-09-27 2013-03-28 Paul E. Reeves Unified desktop docking behavior for visible-to-visible extension
US9715252B2 (en) 2011-08-24 2017-07-25 Z124 Unified desktop docking behavior for window stickiness
US9405459B2 (en) 2011-08-24 2016-08-02 Z124 Unified desktop laptop dock software operation
US8819705B2 (en) 2010-10-01 2014-08-26 Z124 User interaction support across cross-environment applications
US20130104062A1 (en) 2011-09-27 2013-04-25 Z124 Unified desktop input segregation in an application manager
US8966379B2 (en) 2010-10-01 2015-02-24 Z124 Dynamic cross-environment application configuration/orientation in an active user environment
US9152176B2 (en) 2010-10-01 2015-10-06 Z124 Application display transitions between single and multiple displays
US8886773B2 (en) 2010-08-14 2014-11-11 The Nielsen Company (Us), Llc Systems, methods, and apparatus to monitor mobile internet activity
CN107122168A (en) 2010-10-01 2017-09-01 Z124 Multiple operating system
US20120084737A1 (en) 2010-10-01 2012-04-05 Flextronics Id, Llc Gesture controls for multi-screen hierarchical applications
US8761831B2 (en) 2010-10-15 2014-06-24 Z124 Mirrored remote peripheral interface
US8863232B1 (en) 2011-02-04 2014-10-14 hopTo Inc. System for and methods of controlling user access to applications and/or programs of a computer
US20130104051A1 (en) 2011-09-27 2013-04-25 Z124 Unified desktop big brother application pools
US20160124698A1 (en) 2011-08-24 2016-05-05 Z124 Unified desktop triad control user interface for an application launcher
US9063775B2 (en) 2011-09-01 2015-06-23 Microsoft Technology Licensing, Llc Event aggregation for background work execution
US9032413B2 (en) 2011-09-01 2015-05-12 Microsoft Technology Licensing, Llc Decoupling background work and foreground work
US20130080899A1 (en) 2011-09-27 2013-03-28 Paul E. Reeves Unified desktop big brother applications
US9182935B2 (en) 2011-09-27 2015-11-10 Z124 Secondary single screen mode activation through menu option
US20160054757A1 (en) 2012-02-29 2016-02-25 Z124 Unified desktop docking flow
US9703468B2 (en) 2011-09-27 2017-07-11 Z124 Unified desktop independent focus in an application manager
US11416131B2 (en) 2011-09-27 2022-08-16 Z124 Unified desktop input segregation in an application manager
US9164544B2 (en) 2011-12-09 2015-10-20 Z124 Unified desktop: laptop dock, hardware configuration
US9116663B2 (en) * 2011-12-22 2015-08-25 Blackberry Limited Method for changing device modes of an electronic device connected to a docking station and an electronic device configured for same
US9164803B2 (en) 2012-01-20 2015-10-20 Microsoft Technology Licensing, Llc Background task resource control
KR101325840B1 (en) * 2012-03-13 2013-11-05 주식회사 팬택 Method for managing sink device, source device and wlan system for the same
US9419848B1 (en) 2012-05-25 2016-08-16 hopTo Inc. System for and method of providing a document sharing service in combination with remote access to document applications
US8713658B1 (en) 2012-05-25 2014-04-29 Graphon Corporation System for and method of providing single sign-on (SSO) capability in an application publishing environment
US10296181B2 (en) * 2012-06-20 2019-05-21 Maquet Critical Care Ab Breathing apparatus having a display with user selectable background
US9743017B2 (en) * 2012-07-13 2017-08-22 Lattice Semiconductor Corporation Integrated mobile desktop
TW201407431A (en) * 2012-08-03 2014-02-16 Novatek Microelectronics Corp Portable apparatus
KR102009816B1 (en) * 2012-08-28 2019-08-12 삼성전자주식회사 Screen display method and apparatus
US9407961B2 (en) * 2012-09-14 2016-08-02 Intel Corporation Media stream selective decode based on window visibility state
US20140089092A1 (en) * 2012-09-27 2014-03-27 Livingsocial, Inc. Client-Based Deal Filtering and Display
US9489236B2 (en) 2012-10-31 2016-11-08 Microsoft Technology Licensing, Llc Application prioritization
EP2728893B1 (en) * 2012-11-05 2020-04-22 Accenture Global Services Limited Controlling a data stream
US10356579B2 (en) * 2013-03-15 2019-07-16 The Nielsen Company (Us), Llc Methods and apparatus to credit usage of mobile devices
US9378054B2 (en) * 2013-04-12 2016-06-28 Dropbox, Inc. Testing system with methodology for background application control
CN103273892B (en) * 2013-04-23 2015-03-25 上海纵目科技有限公司 Vehicle navigation panorama device and operation control method thereof
US10477454B2 (en) 2013-05-08 2019-11-12 Cellcontrol, Inc. Managing iOS-based mobile communication devices by creative use of CallKit API protocols
US10805861B2 (en) * 2013-05-08 2020-10-13 Cellcontrol, Inc. Context-aware mobile device management
US10268530B2 (en) 2013-05-08 2019-04-23 Cellcontrol, Inc. Managing functions on an iOS-based mobile device using ANCS notifications
US11751123B2 (en) 2013-05-08 2023-09-05 Cellcontrol, Inc. Context-aware mobile device management
US9524147B2 (en) 2013-05-10 2016-12-20 Sap Se Entity-based cross-application navigation
US9225799B1 (en) * 2013-05-21 2015-12-29 Trend Micro Incorporated Client-side rendering for virtual mobile infrastructure
US8825814B1 (en) * 2013-05-23 2014-09-02 Vonage Network Llc Method and apparatus for minimizing application delay by pushing application notifications
KR102601782B1 (en) 2013-07-02 2023-11-14 가부시키가이샤 한도오따이 에네루기 켄큐쇼 Data processing device
US9286097B2 (en) * 2013-07-23 2016-03-15 Intel Corporation Switching a first OS in a foreground to a standby state in response to a system event and resuming a second OS from a background
CN105339898B (en) * 2013-07-23 2019-08-06 英特尔公司 Operating system switching method and device
KR20150025635A (en) * 2013-08-29 2015-03-11 삼성전자주식회사 Electronic device and method for controlling screen
US9582904B2 (en) 2013-11-11 2017-02-28 Amazon Technologies, Inc. Image composition based on remote object data
US9578074B2 (en) 2013-11-11 2017-02-21 Amazon Technologies, Inc. Adaptive content transmission
US9413830B2 (en) 2013-11-11 2016-08-09 Amazon Technologies, Inc. Application streaming service
US9641592B2 (en) 2013-11-11 2017-05-02 Amazon Technologies, Inc. Location of actor resources
US9805479B2 (en) 2013-11-11 2017-10-31 Amazon Technologies, Inc. Session idle optimization for streaming server
US9604139B2 (en) 2013-11-11 2017-03-28 Amazon Technologies, Inc. Service for generating graphics object data
US9634942B2 (en) 2013-11-11 2017-04-25 Amazon Technologies, Inc. Adaptive scene complexity based on service quality
KR102148948B1 (en) * 2013-12-06 2020-08-27 삼성전자주식회사 Multi tasking method of electronic apparatus and electronic apparatus thereof
CN104699218B (en) * 2013-12-10 2019-04-19 华为终端(东莞)有限公司 A kind of task management method and equipment
US9645864B2 (en) * 2014-02-06 2017-05-09 Intel Corporation Technologies for operating system transitions in multiple-operating-system environments
US10866714B2 (en) * 2014-02-13 2020-12-15 Samsung Electronics Co., Ltd. User terminal device and method for displaying thereof
US10747416B2 (en) * 2014-02-13 2020-08-18 Samsung Electronics Co., Ltd. User terminal device and method for displaying thereof
US10712918B2 (en) 2014-02-13 2020-07-14 Samsung Electronics Co., Ltd. User terminal device and displaying method thereof
US10367814B2 (en) * 2014-06-22 2019-07-30 Citrix Systems, Inc. Enabling user entropy encryption in non-compliant mobile applications
US9830167B2 (en) * 2014-08-12 2017-11-28 Linkedin Corporation Enhancing a multitasking user interface of an operating system
US20160098238A1 (en) * 2014-10-06 2016-04-07 Brent Grandil System and method for printing from a mobile computing device to a narrow media printer
US9924017B2 (en) 2015-05-28 2018-03-20 Livio, Inc. Methods and systems for a vehicle computing system to launch an application
US10261985B2 (en) 2015-07-02 2019-04-16 Microsoft Technology Licensing, Llc Output rendering in dynamic redefining application
US10198252B2 (en) 2015-07-02 2019-02-05 Microsoft Technology Licensing, Llc Transformation chain application splitting
US20170010758A1 (en) * 2015-07-08 2017-01-12 Microsoft Technology Licensing, Llc Actuator module for building application
US10198405B2 (en) 2015-07-08 2019-02-05 Microsoft Technology Licensing, Llc Rule-based layout of changing information
US10031724B2 (en) 2015-07-08 2018-07-24 Microsoft Technology Licensing, Llc Application operation responsive to object spatial status
CN108027714B (en) * 2015-08-31 2020-12-25 三菱电机株式会社 Display control device and display control method
JP6069553B1 (en) * 2016-02-04 2017-02-01 京セラ株式会社 COMMUNICATION DEVICE, COMMUNICATION CONTROL METHOD, AND PROGRAM
CN107153528A (en) 2016-03-02 2017-09-12 Alibaba Group Holding Limited The method and apparatus that mixed model list items are reused
CN107291549B (en) * 2016-03-31 2020-11-24 Alibaba Group Holding Limited Method and device for managing application program
CN106020592A (en) * 2016-05-09 2016-10-12 Beijing Xiaomi Mobile Software Co., Ltd. Split screen display method and device
US10747467B2 (en) 2016-06-10 2020-08-18 Apple Inc. Memory management for application loading
US10725761B2 (en) 2016-06-10 2020-07-28 Apple Inc. Providing updated application data for previewing applications on a display
US10520979B2 (en) * 2016-06-10 2019-12-31 Apple Inc. Enhanced application preview mode
TR201608894A3 (en) * 2016-06-27 2018-03-21 Tusas Tuerk Havacilik Ve Uzay Sanayii Anonim Sirketi A real-time working method.
TWI605378B (en) * 2016-07-14 2017-11-11 財團法人工業技術研究院 Method of recording operations and method of automatically re-executing operations
US10212113B2 (en) * 2016-09-19 2019-02-19 Google Llc Uniform resource identifier and image sharing for contextual information display
US11178272B2 (en) 2017-08-14 2021-11-16 Cellcontrol, Inc. Systems, methods, and devices for enforcing do not disturb functionality on mobile devices
KR102536097B1 (en) * 2018-01-26 2023-05-25 Samsung Electronics Co., Ltd. Electronic apparatus and method for controlling display
CN110294372B (en) * 2018-03-23 2023-02-28 Otis Elevator Company Wireless signal device, elevator service request system and method
CA3150048A1 (en) * 2019-08-05 2021-02-11 Hoppr Ltd A method and system for providing content to a media playing device
US11150791B1 (en) * 2020-01-15 2021-10-19 Navvis & Company, LLC Unified ecosystem experience for managing multiple healthcare applications from a common interface with trigger-based layout control
CN111901686B (en) * 2020-08-03 2022-04-08 Hisense Visual Technology Co., Ltd. Method for keeping normal display of user interface stack and display equipment

Citations (99)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07219903A (en) 1994-02-04 1995-08-18 Canon Inc Computer, computer system and its controlling method
JPH08115144A (en) 1994-10-17 1996-05-07 Toshiba Corp Separation type work station device
US5673403A (en) 1992-11-13 1997-09-30 International Business Machines Corporation Method and system for displaying applications of different operating systems on a single system using the user interface of the different operating systems
US5764984A (en) 1993-02-26 1998-06-09 International Business Machines Corporation System for multiple co-existing operating system personalities on a microkernel
US6178503B1 (en) 1998-09-11 2001-01-23 Powerquest Corporation Managing multiple operating systems on a single computer
US6182158B1 (en) 1995-04-14 2001-01-30 Sun Microsystems, Inc. Method and system for providing interoperability among processes written to execute on different operating systems
US20020157001A1 (en) 2001-04-19 2002-10-24 Alec Huang Computer system capable of switching operating system
US20020158811A1 (en) 2000-06-02 2002-10-31 Davis Terry Glenn Dual-monitor duo-workpad TM device
US6477585B1 (en) 1995-08-18 2002-11-05 International Business Machines Corporation Filter mechanism for an event management service
KR20020092969A (en) 2000-06-20 2002-12-12 Hitachi, Ltd. Vehicle control device
US6507336B1 (en) 1999-02-04 2003-01-14 Palm, Inc. Keyboard for a handheld computer
US20030020954A1 (en) 2001-07-26 2003-01-30 Charlie Udom Versatile printing from portable electronic device
US20030079010A1 (en) 2001-10-17 2003-04-24 Osborn Andrew R. Method of communicating across an operating system
US20030079205A1 (en) 2001-10-22 2003-04-24 Takeshi Miyao System and method for managing operating systems
US20030115443A1 (en) 2001-12-18 2003-06-19 Cepulis Darren J. Multi-O/S system and pre-O/S boot technique for partitioning resources and loading multiple operating systems thereon
US20030174172A1 (en) * 1993-06-11 2003-09-18 Conrad Thomas J. Computer system with graphical user interface including drawer-like windows
US20040137855A1 (en) 2002-07-31 2004-07-15 Wiley Anthony John Wireless mobile printing
US6917963B1 (en) 1999-10-05 2005-07-12 Veritas Operating Corporation Snapshot image for the application state of unshareable and shareable data
US20050193267A1 (en) 2004-01-30 2005-09-01 Eldon Liu Application program sharing framework and method for operating systems
US6961941B1 (en) 2001-06-08 2005-11-01 Vmware, Inc. Computer configuration for resource management in systems including a virtual machine
US20050246505A1 (en) 2004-04-29 2005-11-03 Mckenney Paul E Efficient sharing of memory between applications running under different operating systems on a shared hardware system
US20060005187A1 (en) 2004-06-30 2006-01-05 Microsoft Corporation Systems and methods for integrating application windows in a virtual machine environment
KR100578592B1 (en) 2004-06-23 2006-05-12 공정배 System and method for providing the different operating system based terminal service client
US20060107020A1 (en) 2004-11-17 2006-05-18 Stillwell Paul M Jr Sharing data in a user virtual address range with a kernel virtual address range
US7069519B1 (en) 1999-09-24 2006-06-27 Hitachi, Ltd. Method, apparatus and navigation apparatus for sharing display by plural operating systems
KR20060081997A (en) 2005-01-11 2006-07-14 Widerthan Co., Ltd. Method and system for interworking plurality of applications
US20060227806A1 (en) 2005-01-17 2006-10-12 Lite-On Technology Corporation Multi-mode computer systems and operating methods thereof
US20060248404A1 (en) * 2005-04-29 2006-11-02 Microsoft Corporation System and Method for Providing a Window Management Mode
US20070005661A1 (en) 2005-06-30 2007-01-04 Yang Chiang H Shared file system management between independent operating systems
US20070033260A1 (en) 2003-07-30 2007-02-08 Sa, Jaluna Multiple operating systems sharing a processor and a network interface
US20070067769A1 (en) 2005-08-30 2007-03-22 Geisinger Nile J Method and apparatus for providing cross-platform hardware support for computer platforms
US20070198760A1 (en) 2006-02-17 2007-08-23 Samsung Electronics Co., Ltd. Digital multimedia device
US7284203B1 (en) 1999-07-27 2007-10-16 Verizon Laboratories Inc. Method and apparatus for application sharing interface
US20070245263A1 (en) * 2006-03-29 2007-10-18 Alltel Communications, Inc. Graphical user interface for wireless device
US20070271522A1 (en) 2006-05-22 2007-11-22 Samsung Electronics Co., Ltd. Apparatus and method for setting user interface according to user preference
US20070288941A1 (en) 2006-06-07 2007-12-13 Andrew Dunshea Sharing kernel services among kernels
US20080082815A1 (en) 2001-12-07 2008-04-03 International Business Machines Corporation Apparatus, method and program product for initiating computer system operation
US20080090525A1 (en) 2006-10-12 2008-04-17 Samsung Electronics Co., Ltd. Method of controlling printer using bluetooth function of mobile terminal
US20080119731A1 (en) 2006-11-20 2008-05-22 North American Medical Corporation Portable ultrasound with touch screen interface
US20080134061A1 (en) 2006-12-01 2008-06-05 Banerjee Dwip N Multi-Display System and Method Supporting Differing Accessibility Feature Selection
JP2008225546A (en) 2007-03-08 2008-09-25 Nec Corp Virtual device configuration system and its method
US20080244599A1 (en) 2007-03-30 2008-10-02 Microsoft Corporation Master And Subordinate Operating System Kernels For Heterogeneous Multiprocessor Systems
WO2008132924A1 (en) 2007-04-13 2008-11-06 Nec Corporation Virtual computer system and its optimization method
US7453465B2 (en) 2004-10-14 2008-11-18 Microsoft Corporation Encoding for remoting graphics to decoder device
US7478341B2 (en) 2000-04-19 2009-01-13 Broadcom Corporation Apparatus and method for persistent display interface
KR100883208B1 (en) 2007-12-13 2009-02-13 Sungkyunkwan University Industry-Academic Cooperation Foundation Mobile communication terminal available to update software based on virtualization technology and updating method thereof
US20090055749A1 (en) * 2007-07-29 2009-02-26 Palm, Inc. Application management framework for web applications
US20090094523A1 (en) * 2007-09-12 2009-04-09 Terry Noel Treder Methods and Systems for Maintaining Desktop Environments providing integrated access to remote and local resources
US20090119580A1 (en) 2000-06-12 2009-05-07 Gary B. Rohrabaugh Scalable Display of Internet Content on Mobile Devices
US20090138818A1 (en) 2005-12-26 2009-05-28 Kazuo Nemoto Method, program, and data processing system for manipulating display of multiple display objects
US7565535B2 (en) 2005-05-06 2009-07-21 Microsoft Corporation Systems and methods for demonstrating authenticity of a virtual machine using a security image
US20090193364A1 (en) * 2008-01-29 2009-07-30 Microsoft Corporation Displaying thumbnail copies of running items
US20090217071A1 (en) 2008-02-27 2009-08-27 Lenovo (Beijing) Limited Data processing device and method for switching states thereof
US20090249331A1 (en) 2008-03-31 2009-10-01 Mark Charles Davis Apparatus, system, and method for file system sharing
US20090278806A1 (en) 2008-05-06 2009-11-12 Matias Gonzalo Duarte Extended touch-sensitive control area for electronic device
US20090313440A1 (en) 2008-06-11 2009-12-17 Young Lak Kim Shared memory burst communications
US20090327905A1 (en) * 2008-06-27 2009-12-31 Microsoft Corporation Integrated client for access to remote resources
US20090327560A1 (en) 2008-06-29 2009-12-31 Microsoft Corporation Automatic transfer of information through physical docking of devices
US20100005396A1 (en) 2000-02-18 2010-01-07 Nason D David Method and system for controlling a complementary user interface on a display surface
US20100046026A1 (en) 2008-08-21 2010-02-25 Samsung Electronics Co., Ltd. Image forming apparatus and image forming method
US20100064228A1 (en) 2008-09-11 2010-03-11 Ely Tsern Expandable system architecture comprising a handheld computer device that dynamically generates different user environments with secondary devices with displays of various form factors
US20100063994A1 (en) 2008-09-08 2010-03-11 Celio Technology Corporation, Inc. dba Celio Corporation Client device for cellular telephone as server
US20100064244A1 (en) 2008-09-08 2010-03-11 Qualcomm Incorporated Multi-fold mobile device with configurable interface
US20100107163A1 (en) 2007-03-20 2010-04-29 Sanggyu Lee Movable virtual machine image
KR20100043434A (en) 2008-10-20 2010-04-29 Samsung Electronics Co., Ltd. Apparatus and method for application of multi operating systems in multi modem mobile communication terminal
US20100149121A1 (en) 2008-12-12 2010-06-17 Maxim Integrated Products, Inc. System and method for interfacing applications processor to touchscreen display for reduced data transfer
US20100211769A1 (en) 2009-02-19 2010-08-19 Subramonian Shankar Concurrent Execution of a Smartphone Operating System and a Desktop Operating System
US20100251233A1 (en) 2009-03-25 2010-09-30 Honeywell International Inc. Embedded computing system user interface emulated on a separate computing device
US20100250975A1 (en) 2009-03-27 2010-09-30 Qualcomm Incorporated System and method of providing scalable computing between a portable computing device and a portable computing device docking station
US20100246119A1 (en) 2009-03-27 2010-09-30 Qualcomm Incorporated Portable docking station for a portable computing device
US7880728B2 (en) 2006-06-29 2011-02-01 Microsoft Corporation Application switching via a touch screen interface
US20110093691A1 (en) 2009-07-20 2011-04-21 Galicia Joshua D Multi-environment operating system
US7950008B2 (en) 2005-07-06 2011-05-24 International Business Machines Corporation Software installation in multiple operating systems
US20110138314A1 (en) * 2009-12-09 2011-06-09 Abraham Mir Methods and systems for generating a combined display of taskbar button group entries generated on a local machine and on a remote machine
US7960945B1 (en) 2008-01-30 2011-06-14 Google Inc. Estimating remaining use time of a mobile device
US20110246904A1 (en) 2010-04-01 2011-10-06 Gus Pinto Interacting with Remote Applications Displayed Within a Virtual Desktop of a Tablet Computing Device
US20110273464A1 (en) 2006-08-04 2011-11-10 Apple Inc. Framework for Graphics Animation and Compositing Operations
WO2012044738A2 (en) 2010-10-01 2012-04-05 Imerj, Llc Instant remote rendering
US20120084697A1 (en) 2010-10-01 2012-04-05 Flextronics Id, Llc User interface with independent drawer control
US20120084675A1 (en) 2010-10-01 2012-04-05 Imerj, Llc Annunciator drawer
WO2012044872A2 (en) 2010-10-01 2012-04-05 Imerj, Llc Extended graphics context with divided compositing
US20120158829A1 (en) * 2010-12-20 2012-06-21 Kalle Ahmavaara Methods and apparatus for providing or receiving data connectivity
US20120172088A1 (en) 2010-12-31 2012-07-05 Motorola-Mobility, Inc. Method and System for Adapting Mobile Device to Accommodate External Display
US20120176413A1 (en) 2011-01-11 2012-07-12 Qualcomm Incorporated Methods and apparatuses for mobile device display mode selection based on motion direction
US20120188185A1 (en) 2010-10-01 2012-07-26 Ron Cassar Secondary single screen mode activation through off-screen gesture area activation
US20120278750A1 (en) 2011-04-28 2012-11-01 Motorola Mobility, Inc. Method and apparatus for presenting a window in a system having two operating system environments
US20120278747A1 (en) 2011-04-28 2012-11-01 Motorola Mobility, Inc. Method and apparatus for user interface in a system having two operating system environments
US20130024778A1 (en) 2011-07-13 2013-01-24 Z124 Dynamic cross-environment application configuration/orientation
US20130024812A1 (en) 2011-07-13 2013-01-24 Z124 Foreground/background assortment of hidden windows
US20130076683A1 (en) 2011-09-27 2013-03-28 Z124 Dual screen property detail display
US20130312106A1 (en) 2010-10-01 2013-11-21 Z124 Selective Remote Wipe
US8713474B2 (en) * 2010-10-05 2014-04-29 Citrix Systems, Inc. Providing user interfaces and window previews for hosted applications
US8761831B2 (en) 2010-10-15 2014-06-24 Z124 Mirrored remote peripheral interface
US8810533B2 (en) 2011-07-20 2014-08-19 Z124 Systems and methods for receiving gesture inputs spanning multiple input devices
US8819705B2 (en) 2010-10-01 2014-08-26 Z124 User interaction support across cross-environment applications
US8898443B2 (en) 2010-10-01 2014-11-25 Z124 Multi-operating system
US8933949B2 (en) 2010-10-01 2015-01-13 Z124 User interaction across cross-environment applications through an extended graphics context
US8966379B2 (en) 2010-10-01 2015-02-24 Z124 Dynamic cross-environment application configuration/orientation in an active user environment
US9063798B2 (en) 2010-10-01 2015-06-23 Z124 Cross-environment communication using application space API

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US8868899B2 (en) * 2009-07-20 2014-10-21 Motorola Mobility Llc System and method for switching between environments in a multi-environment operating system

Patent Citations (118)

Publication number Priority date Publication date Assignee Title
US5673403A (en) 1992-11-13 1997-09-30 International Business Machines Corporation Method and system for displaying applications of different operating systems on a single system using the user interface of the different operating systems
US5764984A (en) 1993-02-26 1998-06-09 International Business Machines Corporation System for multiple co-existing operating system personalities on a microkernel
US20030174172A1 (en) * 1993-06-11 2003-09-18 Conrad Thomas J. Computer system with graphical user interface including drawer-like windows
JPH07219903A (en) 1994-02-04 1995-08-18 Canon Inc Computer, computer system and its controlling method
JPH08115144A (en) 1994-10-17 1996-05-07 Toshiba Corp Separation type work station device
US6182158B1 (en) 1995-04-14 2001-01-30 Sun Microsystems, Inc. Method and system for providing interoperability among processes written to execute on different operating systems
US6477585B1 (en) 1995-08-18 2002-11-05 International Business Machines Corporation Filter mechanism for an event management service
US6178503B1 (en) 1998-09-11 2001-01-23 Powerquest Corporation Managing multiple operating systems on a single computer
US6507336B1 (en) 1999-02-04 2003-01-14 Palm, Inc. Keyboard for a handheld computer
US7284203B1 (en) 1999-07-27 2007-10-16 Verizon Laboratories Inc. Method and apparatus for application sharing interface
US7069519B1 (en) 1999-09-24 2006-06-27 Hitachi, Ltd. Method, apparatus and navigation apparatus for sharing display by plural operating systems
US6917963B1 (en) 1999-10-05 2005-07-12 Veritas Operating Corporation Snapshot image for the application state of unshareable and shareable data
US20100005396A1 (en) 2000-02-18 2010-01-07 Nason D David Method and system for controlling a complementary user interface on a display surface
US7478341B2 (en) 2000-04-19 2009-01-13 Broadcom Corporation Apparatus and method for persistent display interface
US20020158811A1 (en) 2000-06-02 2002-10-31 Davis Terry Glenn Dual-monitor duo-workpad TM device
US20090119580A1 (en) 2000-06-12 2009-05-07 Gary B. Rohrabaugh Scalable Display of Internet Content on Mobile Devices
KR20020092969A (en) 2000-06-20 2002-12-12 Hitachi, Ltd. Vehicle control device
US20020157001A1 (en) 2001-04-19 2002-10-24 Alec Huang Computer system capable of switching operating system
US6961941B1 (en) 2001-06-08 2005-11-01 Vmware, Inc. Computer configuration for resource management in systems including a virtual machine
US20030020954A1 (en) 2001-07-26 2003-01-30 Charlie Udom Versatile printing from portable electronic device
US20030079010A1 (en) 2001-10-17 2003-04-24 Osborn Andrew R. Method of communicating across an operating system
US20030079205A1 (en) 2001-10-22 2003-04-24 Takeshi Miyao System and method for managing operating systems
US20080082815A1 (en) 2001-12-07 2008-04-03 International Business Machines Corporation Apparatus, method and program product for initiating computer system operation
US20030115443A1 (en) 2001-12-18 2003-06-19 Cepulis Darren J. Multi-O/S system and pre-O/S boot technique for partitioning resources and loading multiple operating systems thereon
US20040137855A1 (en) 2002-07-31 2004-07-15 Wiley Anthony John Wireless mobile printing
US20070033260A1 (en) 2003-07-30 2007-02-08 Sa, Jaluna Multiple operating systems sharing a processor and a network interface
US20050193267A1 (en) 2004-01-30 2005-09-01 Eldon Liu Application program sharing framework and method for operating systems
US20050246505A1 (en) 2004-04-29 2005-11-03 Mckenney Paul E Efficient sharing of memory between applications running under different operating systems on a shared hardware system
KR100578592B1 (en) 2004-06-23 2006-05-12 공정배 System and method for providing the different operating system based terminal service client
US20060005187A1 (en) 2004-06-30 2006-01-05 Microsoft Corporation Systems and methods for integrating application windows in a virtual machine environment
US7453465B2 (en) 2004-10-14 2008-11-18 Microsoft Corporation Encoding for remoting graphics to decoder device
US20060107020A1 (en) 2004-11-17 2006-05-18 Stillwell Paul M Jr Sharing data in a user virtual address range with a kernel virtual address range
KR100616157B1 (en) 2005-01-11 2006-08-28 Widerthan Co., Ltd. Method and system for interworking plurality of applications
WO2006075859A1 (en) 2005-01-11 2006-07-20 Widerthan Co., Ltd. Method and system for interworking plurality of applications
KR20060081997A (en) 2005-01-11 2006-07-14 Widerthan Co., Ltd. Method and system for interworking plurality of applications
US20060227806A1 (en) 2005-01-17 2006-10-12 Lite-On Technology Corporation Multi-mode computer systems and operating methods thereof
US20060248404A1 (en) * 2005-04-29 2006-11-02 Microsoft Corporation System and Method for Providing a Window Management Mode
US7565535B2 (en) 2005-05-06 2009-07-21 Microsoft Corporation Systems and methods for demonstrating authenticity of a virtual machine using a security image
US20070005661A1 (en) 2005-06-30 2007-01-04 Yang Chiang H Shared file system management between independent operating systems
US7950008B2 (en) 2005-07-06 2011-05-24 International Business Machines Corporation Software installation in multiple operating systems
US20070067769A1 (en) 2005-08-30 2007-03-22 Geisinger Nile J Method and apparatus for providing cross-platform hardware support for computer platforms
US20090138818A1 (en) 2005-12-26 2009-05-28 Kazuo Nemoto Method, program, and data processing system for manipulating display of multiple display objects
US20070198760A1 (en) 2006-02-17 2007-08-23 Samsung Electronics Co., Ltd. Digital multimedia device
US20070245263A1 (en) * 2006-03-29 2007-10-18 Alltel Communications, Inc. Graphical user interface for wireless device
US20070271522A1 (en) 2006-05-22 2007-11-22 Samsung Electronics Co., Ltd. Apparatus and method for setting user interface according to user preference
US20070288941A1 (en) 2006-06-07 2007-12-13 Andrew Dunshea Sharing kernel services among kernels
US7880728B2 (en) 2006-06-29 2011-02-01 Microsoft Corporation Application switching via a touch screen interface
US20110273464A1 (en) 2006-08-04 2011-11-10 Apple Inc. Framework for Graphics Animation and Compositing Operations
US20080090525A1 (en) 2006-10-12 2008-04-17 Samsung Electronics Co., Ltd. Method of controlling printer using bluetooth function of mobile terminal
US20080119731A1 (en) 2006-11-20 2008-05-22 North American Medical Corporation Portable ultrasound with touch screen interface
US20080134061A1 (en) 2006-12-01 2008-06-05 Banerjee Dwip N Multi-Display System and Method Supporting Differing Accessibility Feature Selection
JP2008225546A (en) 2007-03-08 2008-09-25 Nec Corp Virtual device configuration system and its method
US20100107163A1 (en) 2007-03-20 2010-04-29 Sanggyu Lee Movable virtual machine image
US20080244599A1 (en) 2007-03-30 2008-10-02 Microsoft Corporation Master And Subordinate Operating System Kernels For Heterogeneous Multiprocessor Systems
WO2008132924A1 (en) 2007-04-13 2008-11-06 Nec Corporation Virtual computer system and its optimization method
US20090055749A1 (en) * 2007-07-29 2009-02-26 Palm, Inc. Application management framework for web applications
US20090094523A1 (en) * 2007-09-12 2009-04-09 Terry Noel Treder Methods and Systems for Maintaining Desktop Environments providing integrated access to remote and local resources
KR100883208B1 (en) 2007-12-13 2009-02-13 Sungkyunkwan University Industry-Academic Cooperation Foundation Mobile communication terminal available to update software based on virtualization technology and updating method thereof
US20090193364A1 (en) * 2008-01-29 2009-07-30 Microsoft Corporation Displaying thumbnail copies of running items
US7960945B1 (en) 2008-01-30 2011-06-14 Google Inc. Estimating remaining use time of a mobile device
US20090217071A1 (en) 2008-02-27 2009-08-27 Lenovo (Beijing) Limited Data processing device and method for switching states thereof
US20090249331A1 (en) 2008-03-31 2009-10-01 Mark Charles Davis Apparatus, system, and method for file system sharing
US20090278806A1 (en) 2008-05-06 2009-11-12 Matias Gonzalo Duarte Extended touch-sensitive control area for electronic device
US20090313440A1 (en) 2008-06-11 2009-12-17 Young Lak Kim Shared memory burst communications
US20090327905A1 (en) * 2008-06-27 2009-12-31 Microsoft Corporation Integrated client for access to remote resources
US20090327560A1 (en) 2008-06-29 2009-12-31 Microsoft Corporation Automatic transfer of information through physical docking of devices
US20100046026A1 (en) 2008-08-21 2010-02-25 Samsung Electronics Co., Ltd. Image forming apparatus and image forming method
US20100063994A1 (en) 2008-09-08 2010-03-11 Celio Technology Corporation, Inc. dba Celio Corporation Client device for cellular telephone as server
US20100064244A1 (en) 2008-09-08 2010-03-11 Qualcomm Incorporated Multi-fold mobile device with configurable interface
US20100064228A1 (en) 2008-09-11 2010-03-11 Ely Tsern Expandable system architecture comprising a handheld computer device that dynamically generates different user environments with secondary devices with displays of various form factors
US20100060549A1 (en) 2008-09-11 2010-03-11 Ely Tsern Method and system for dynamically generating different user environments with secondary devices with displays of various form factors
KR20100043434A (en) 2008-10-20 2010-04-29 Samsung Electronics Co., Ltd. Apparatus and method for application of multi operating systems in multi modem mobile communication terminal
US20100149121A1 (en) 2008-12-12 2010-06-17 Maxim Integrated Products, Inc. System and method for interfacing applications processor to touchscreen display for reduced data transfer
US20100211769A1 (en) 2009-02-19 2010-08-19 Subramonian Shankar Concurrent Execution of a Smartphone Operating System and a Desktop Operating System
US20100251233A1 (en) 2009-03-25 2010-09-30 Honeywell International Inc. Embedded computing system user interface emulated on a separate computing device
US20100250975A1 (en) 2009-03-27 2010-09-30 Qualcomm Incorporated System and method of providing scalable computing between a portable computing device and a portable computing device docking station
US20100246119A1 (en) 2009-03-27 2010-09-30 Qualcomm Incorporated Portable docking station for a portable computing device
US20110093691A1 (en) 2009-07-20 2011-04-21 Galicia Joshua D Multi-environment operating system
US20110138314A1 (en) * 2009-12-09 2011-06-09 Abraham Mir Methods and systems for generating a combined display of taskbar button group entries generated on a local machine and on a remote machine
US20110246904A1 (en) 2010-04-01 2011-10-06 Gus Pinto Interacting with Remote Applications Displayed Within a Virtual Desktop of a Tablet Computing Device
US8842080B2 (en) 2010-10-01 2014-09-23 Z124 User interface with screen spanning icon morphing
US8683496B2 (en) 2010-10-01 2014-03-25 Z124 Cross-environment redirection
US20120084675A1 (en) 2010-10-01 2012-04-05 Imerj, Llc Annunciator drawer
WO2012044872A2 (en) 2010-10-01 2012-04-05 Imerj, Llc Extended graphics context with divided compositing
WO2012044645A2 (en) 2010-10-01 2012-04-05 Imerj, Llc Extended graphics context with common compositing
US9152582B2 (en) 2010-10-01 2015-10-06 Z124 Auto-configuration of a docked system in a multi-OS environment
US9098437B2 (en) 2010-10-01 2015-08-04 Z124 Cross-environment communication framework
US9071625B2 (en) 2010-10-01 2015-06-30 Z124 Cross-environment event notification
US20120188185A1 (en) 2010-10-01 2012-07-26 Ron Cassar Secondary single screen mode activation through off-screen gesture area activation
US9063798B2 (en) 2010-10-01 2015-06-23 Z124 Cross-environment communication using application space API
US9026709B2 (en) 2010-10-01 2015-05-05 Z124 Auto-waking of a suspended OS in a dockable system
US8966379B2 (en) 2010-10-01 2015-02-24 Z124 Dynamic cross-environment application configuration/orientation in an active user environment
US8933949B2 (en) 2010-10-01 2015-01-13 Z124 User interaction across cross-environment applications through an extended graphics context
US8898443B2 (en) 2010-10-01 2014-11-25 Z124 Multi-operating system
WO2012044738A2 (en) 2010-10-01 2012-04-05 Imerj, Llc Instant remote rendering
US8819705B2 (en) 2010-10-01 2014-08-26 Z124 User interaction support across cross-environment applications
US20120084697A1 (en) 2010-10-01 2012-04-05 Flextronics Id, Llc User interface with independent drawer control
US20130312106A1 (en) 2010-10-01 2013-11-21 Z124 Selective Remote Wipe
US8713474B2 (en) * 2010-10-05 2014-04-29 Citrix Systems, Inc. Providing user interfaces and window previews for hosted applications
US20140201679A1 (en) * 2010-10-05 2014-07-17 Citrix Systems, Inc. Providing user interfaces and window previews for hosted applications
US8761831B2 (en) 2010-10-15 2014-06-24 Z124 Mirrored remote peripheral interface
US20120158829A1 (en) * 2010-12-20 2012-06-21 Kalle Ahmavaara Methods and apparatus for providing or receiving data connectivity
US20120172088A1 (en) 2010-12-31 2012-07-05 Motorola-Mobility, Inc. Method and System for Adapting Mobile Device to Accommodate External Display
US20120176413A1 (en) 2011-01-11 2012-07-12 Qualcomm Incorporated Methods and apparatuses for mobile device display mode selection based on motion direction
US20120278747A1 (en) 2011-04-28 2012-11-01 Motorola Mobility, Inc. Method and apparatus for user interface in a system having two operating system environments
US20120278750A1 (en) 2011-04-28 2012-11-01 Motorola Mobility, Inc. Method and apparatus for presenting a window in a system having two operating system environments
US20130024812A1 (en) 2011-07-13 2013-01-24 Z124 Foreground/background assortment of hidden windows
US20130024778A1 (en) 2011-07-13 2013-01-24 Z124 Dynamic cross-environment application configuration/orientation
US8810533B2 (en) 2011-07-20 2014-08-19 Z124 Systems and methods for receiving gesture inputs spanning multiple input devices
US20130219162A1 (en) 2011-09-27 2013-08-22 Z124 Unified desktop wake and unlock
US20130076683A1 (en) 2011-09-27 2013-03-28 Z124 Dual screen property detail display
US8868135B2 (en) 2011-09-27 2014-10-21 Z124 Orientation arbitration
US20130080945A1 (en) 2011-09-27 2013-03-28 Paul Reeves Reconfigurable user interface elements
US20130088411A1 (en) 2011-09-27 2013-04-11 Z124 Device wakeup orientation
US9104366B2 (en) 2011-09-27 2015-08-11 Z124 Separation of screen usage for complex language input
US9128660B2 (en) 2011-09-27 2015-09-08 Z124 Dual display pinyin touch input
US9128659B2 (en) 2011-09-27 2015-09-08 Z124 Dual display cursive touch input
US9152179B2 (en) 2011-09-27 2015-10-06 Z124 Portrait dual display and landscape dual display

Non-Patent Citations (29)

Title
"Lapdock™ for Motorola ATRIX," at http://www.motorola.com/Consumers/US-EN/Consumer-Product-and-Services/Mobile . . . , accessed Apr. 18, 2011, 1 page.
"Motorola ATRIX 4G Laptop Dock Review," at http://www.phonearena.com/reviews/Motorola-ATRIX-4G-Laptop-Dock-Review_id2667, Mar. 2, 2011, 6 pages.
Burns, C., "Motorola ATRIX 4G Laptop Dock Review," at http://androidcommunity.com/motorola-atrix-4g-laptop-dock-review-20110220/, Feb. 20, 2011, 5 pages.
Catacchio, Chad, "This smartphone has two huge screens . . . that rotate," The Next Web at http://thenextweb.com/asia/2010/10/07/this-smartphone-has-two-huge-screens-that-rotate/, Jul. 21, 2011, 2 pages.
Final Action for U.S. Appl. No. 13/399,901, dated Dec. 26, 2013 20 pages.
Google images, accessed Apr. 18, 2011, 6 pages.
Google Transliteration IME website, 2010, available at www.***.com/ime/transliteration/help.html#features, 8 pages.
Harman03, "Kyocera Echo Dual-screen Android Phone," posted 4 weeks from Apr. 18, 2011, 3 pages.
InputKing Online Input System, 2011, available at www.inputking.com, 2 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2012/046798, dated Jan. 23, 2014 6 pages.
International Search Report for International (PCT) Patent Application No. PCT/US2012/046798, dated Feb. 20, 2013 3 pages.
Official Action for U.S. Appl. No. 13/399,901, dated Aug. 2, 2013 17 pages.
Official Action for U.S. Appl. No. 13/399,901, dated May 1, 2014 17 pages.
Official Action for U.S. Appl. No. 13/399,901, dated May 22, 2015 23 pages.
Official Action for U.S. Appl. No. 13/399,901, dated Oct. 29, 2014 23 pages.
Sakhr Software-Arabic Optical Character Recognition, Jul. 15, 2011, available at www.sakhr.com/ocr.aspx, 1 page.
Stein, S., "How does the Motorola Atrix 4G Lapdock compare with a laptop?" Crave-CNET, at http://news.cnet.com/8301-17938_105-20031251-1.html, Feb. 9, 2011, 7 pages.
Sud, et al., "Dynamic Migration of Computation Through Virtualization of the Mobile Platform," Mobile Networks and Applications, 2012, (published online Feb. 22, 2011), vol. 17, Iss. 2, pp. 206-215.
Website entitled, "Kyocera Echo," at www.echobykyocera.com/, 2011, 6 pages.
Website entitled, "Sony Tablet," at www.store.sony.com/webapp/wcs/stores/servlet/CategoryDisplay?catalogId=10551&storeId=10151&langId=-1&categoryId=8198552921644795521, 2011, 3 pages.
Wikipedia, "Balloon help," Jul. 18, 2011, available at www.en.wikipedia.org/wiki/Balloon_help, 3 pages.
Wikipedia, "Google Pinyin," Aug. 27, 2011 available at www.en.wikipedia.org/wiki/Google_Pinyin, 3 pages.
Wikipedia, "Mouseover," Sep. 29, 2011, available at www.en.wikipedia.org/wiki/Mouseover, 2 pages.
Wikipedia, "Predictive text," Aug. 7, 2011, available at www.en.wikipedia.org/wiki/Predictive_test, 6 pages.
Wikipedia, "Sogou Pinyin," Jul. 23, 2011 available at www.en.wikipedia.org/wiki/Sogou_Pinyin, 3 pages.
Wikipedia, "Status bar," Sep. 8, 2011, available at www.en.wikipedia.org/wiki/Status_bar, 3 pages.
Wikipedia, "Tooltip," Sep. 17, 2011, available at www.en.wikipedia.org/wiki/Tooltip, 2 pages.

Also Published As

Publication number Publication date
WO2013010143A2 (en) 2013-01-17
US20160054862A1 (en) 2016-02-25
WO2013010143A3 (en) 2013-04-25
US20130024812A1 (en) 2013-01-24

Similar Documents

Publication Publication Date Title
US10528210B2 (en) Foreground/background assortment of hidden windows
US10503344B2 (en) Dynamic cross-environment application configuration/orientation
US8966379B2 (en) Dynamic cross-environment application configuration/orientation in an active user environment
US9047102B2 (en) Instant remote rendering
US8957905B2 (en) Cross-environment user interface mirroring
US8933949B2 (en) User interaction across cross-environment applications through an extended graphics context
US8819705B2 (en) User interaction support across cross-environment applications

Legal Events

Date Code Title Description
AS Assignment

Owner name: Z124, CAYMAN ISLANDS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:REEVES, BRIAN;REEVES, PAUL E.;LIU, WUKE;AND OTHERS;SIGNING DATES FROM 20120920 TO 20130117;REEL/FRAME:037170/0527

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

CC Certificate of correction
MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4