US20170060394A1 - System and method of modifying a user experience based on physical environment - Google Patents
- Publication number
- US20170060394A1 (application Ser. No. 15/352,780; publication US 2017/0060394 A1)
- Authority
- US
- United States
- Prior art keywords
- electronic device
- physical environment
- notification
- mode
- vibration motor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Links
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
Definitions
- User experience is a broad term covering many aspects of users' experiences with computing products or services accessed through computing products (such as web sites).
- the user experience includes not only the user interface, but also the graphics and physical interaction.
- a user experience is somewhat static in nature.
- the layout of an application on a smartphone is generally the same for most or all users who access the application, regardless of the physical environment in which the application is being used.
- Cold temperatures can make it difficult to use a touchscreen application due to shaking or numb fingers or the use of gloves. While certain gloves have been designed that are “touch-sensitive,” in that a touchscreen device can detect the location of the glove despite the absence of an actual finger touching the screen, these gloves still result in an ultimate footprint of the “touch” being larger than the user's finger.
- FIG. 1 is a network diagram depicting a client-server system within which one example embodiment may be deployed.
- FIG. 2 is a diagram illustrating a progression, in accordance with an example embodiment, of dynamic alteration of a user interface based on temperature.
- FIG. 3 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user experience.
- FIG. 4 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user experience.
- FIG. 5 is an interaction diagram illustrating a method, in accordance with an example embodiment, of dynamically altering a user interface.
- FIG. 6 is an interaction diagram illustrating a method, in accordance with an example embodiment, of dynamically altering a user interface.
- FIG. 7 is a flow diagram illustrating a method, in accordance with an example embodiment, of dynamically altering a user interface.
- FIG. 8 is a flow diagram illustrating a method, in accordance with another example embodiment, of dynamically altering a user interface.
- FIG. 9 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.
- various aspects of a user experience are dynamically altered in order to provide a customized and efficient experience for the user, based on the physical environment.
- temperature of the physical environment of a user device is taken into account, and the user experience is modified based on this temperature.
- Elements within a user interface, such as button size, advertising sizing, font, color, placement, the presence of certain interface objects, etc., can all be dynamically altered based on this physical environment information as well as other factors (e.g., demographic information, information from user profiles, etc.).
- a search bar displayed in an application may change in size and location on the screen of a touchscreen device based on the current temperature at the current location of the touchscreen device. In an extreme embodiment, all elements but the search bar may be removed when the temperature is extremely cold, reducing the user interface to its bare minimum elements.
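The threshold-driven behavior described above can be sketched in a few lines. This is a minimal illustrative model, not the patent's implementation: the element names, temperature thresholds, and scale factors are all assumptions chosen for the example.

```python
# Hypothetical sketch: enlarge touch targets as the ambient temperature drops,
# and strip the UI to its bare minimum element (the search bar) in extreme cold.
# Thresholds and element names are illustrative assumptions.
COLD_F = 40.0          # below this, enlarge touch targets
EXTREME_COLD_F = 10.0  # below this, reduce the UI to the search bar alone

def layout_for_temperature(temp_f, elements):
    """Return (visible_elements, scale_factor) for the given temperature in °F."""
    if temp_f <= EXTREME_COLD_F:
        # Bare-minimum layout: only the search bar, greatly enlarged.
        return [e for e in elements if e == "search_bar"], 3.0
    if temp_f <= COLD_F:
        # Keep everything, but enlarge for gloved, numb, or shivering fingers.
        return elements, 1.5
    return elements, 1.0  # default layout

elements = ["search_bar", "activity_dashboard", "merchandise_area", "ad"]
print(layout_for_temperature(72, elements))   # full layout at normal scale
print(layout_for_temperature(-10, elements))  # search bar only, enlarged
```

Because the check runs on every temperature update rather than once at startup, the layout is continuously adjusted, matching the observation above that there need not be a strict "default" state.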
- FIG. 1 is a network diagram depicting a client-server system 100 , within which one example embodiment may be deployed.
- a networked system 102 in the example forms of a network-based marketplace or publication system, provides server-side functionality, via a network 104 (e.g., the Internet or Wide Area Network (WAN)) to one or more clients.
- FIG. 1 illustrates, for example, a web client 106 (e.g., a browser), and a programmatic client 108 executing on respective client machines 110 and 112 .
- An Application Program Interface (API) server 114 and a web server 116 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 118 .
- the application servers 118 host one or more marketplace applications 120 and payment applications 122 .
- the application servers 118 are, in turn, shown to be coupled to one or more database servers 124 that facilitate access to one or more databases 126 .
- the marketplace applications 120 may provide a number of marketplace functions and services to users that access the networked system 102 .
- the payment applications 122 may likewise provide a number of payment services and functions to users.
- the payment applications 122 may allow users to accumulate value (e.g., in a commercial currency, such as the U.S. dollar, or a proprietary currency, such as “points”) in accounts, and then later to redeem the accumulated value for products (e.g., goods or services) that are made available via the marketplace applications 120 . While the marketplace and payment applications 120 and 122 are shown in FIG. 1 to both form part of the networked system 102 , it will be appreciated that, in alternative embodiments, the payment applications 122 may form part of a payment service that is separate and distinct from the networked system 102 .
- system 100 shown in FIG. 1 employs a client-server architecture
- present disclosure is of course not limited to such an architecture, and may equally well find application in a distributed, or peer-to-peer, architecture system, for example.
- the various marketplace and payment applications 120 and 122 may also be implemented as standalone software programs, which do not necessarily have networking capabilities.
- the web client 106 accesses the various marketplace and payment applications 120 and 122 via the web interface supported by the web server 116 .
- the programmatic client 108 accesses the various services and functions provided by the marketplace and payment applications 120 and 122 via the programmatic interface provided by the API server 114 .
- the programmatic client 108 may, for example, be a seller application (e.g., the TurboLister application developed by eBay Inc., of San Jose, Calif.) to enable sellers to author and manage listings on the networked system 102 in an off-line manner, and to perform batch-mode communications between the programmatic client 108 and the networked system 102 .
- FIG. 1 also illustrates a third party application 128 , executing on a third party server machine 130 , as having programmatic access to the networked system 102 via the programmatic interface provided by the API server 114 .
- the third party application 128 may, utilizing information retrieved from the networked system 102 , support one or more features or functions on a website hosted by the third party.
- the third party website may, for example, provide one or more promotional, marketplace, or payment functions that are supported by the relevant applications of the networked system 102 .
- FIG. 2 is a diagram illustrating a progression, in accordance with an example embodiment, of dynamic alteration of a user interface based on temperature.
- Pictured here are a mobile device 200 a, 200 b, 200 c in three states. It should be noted that while a mobile device is depicted, a similar process could run on any electronic device. Beginning with the mobile device 200 a in the first state, it can be seen that the user interface 202 a has various sections, including a search bar 204 , an activity dashboard 206 , a merchandise area 208 , and an advertisement 210 .
- the user interface 202 a may depict an interface to an online auction web site, although one of ordinary skill in the art will recognize that this disclosure can apply to other types of user interfaces as well.
- Within the activity dashboard 206 are three activities: watching 212 (for items in the online auction the user has selected as being of interest), buying 214 (for items in the online auction the user has bid on), and selling 216 (for items in the online auction the user is selling).
- Within the merchandise area 208 may be a number of buttons 218, 220, 222, 224, 226, 228.
- the temperature of the environment surrounding the mobile device 200 b has fallen below a predetermined threshold.
- the system may track this temperature change and adjust the user interface 202 b to better align with the temperature.
- the size of the search bar 204 may be increased, and the buttons 218-224 in the merchandise area 208 may be increased in size, making it easier for the user to select these items with a shivering, numb, or gloved finger.
- the activity dashboard 206 has been eliminated.
- the temperature of the environment surrounding the mobile device 200 c has fallen even more, past a further threshold. This temperature decrease may be so extreme that the system may decide to reduce the user interface 202 c to its barest minimum element, namely the search bar 204 , which here has been increased in size to fill the entire user interface 202 c.
- a location of a mobile device is obtained via, for example, global positioning system (GPS) information from a GPS module located in the mobile device.
- the location may be deduced using other information, such as Internet Protocol (IP) address, cell phone tower proximity or triangulation, or express user interaction (e.g., the user informs the application of the location).
- a temperature corresponding to the location may then be retrieved from a weather server, which may provide a current temperature for the user's location. In some embodiments, it may be appropriate to distinguish between when the user device is indoors or outdoors.
- in wintertime, the temperature outside in Minneapolis may be −10 degrees Fahrenheit, but inside in the same city the temperature may be 70 degrees Fahrenheit.
- while GPS information may be precise enough to determine whether the user is inside or outside, there may be instances where the determination is borderline (such as when the user is near an exit, or due to GPS interference).
- other factors may be used to help determine whether the mobile device is inside or outside, such as ambient light levels or information received from a microphone on the mobile device (e.g., crickets chirping, traffic sounds, or wind noise may all be indicative of being outside).
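Combining these signals can be modeled as a simple evidence vote. The sketch below is a hedged illustration: the sensor names, thresholds, and the two-of-three voting rule are assumptions, not details from the patent, which only names the categories of evidence (GPS, ambient light, microphone).

```python
# Illustrative sketch: vote across hypothetical sensor readings to guess
# whether the device is indoors or outdoors. All thresholds are assumptions.
def likely_outdoors(ambient_lux, gps_accuracy_m, outdoor_sound_score):
    """Each signal contributes one vote of evidence for being outside."""
    score = 0
    if ambient_lux > 10_000:       # direct daylight is far brighter than indoor light
        score += 1
    if gps_accuracy_m < 10:        # a tight GPS fix suggests open sky overhead
        score += 1
    if outdoor_sound_score > 0.5:  # e.g., wind, traffic, or crickets detected
        score += 1
    return score >= 2              # majority of signals point outdoors

print(likely_outdoors(25_000, 5, 0.8))  # bright, good fix, outdoor sounds
print(likely_outdoors(300, 40, 0.1))    # dim, weak fix, quiet
```

A real system would likely weight the signals or use a trained classifier, but the voting structure shows how borderline GPS cases can be resolved with the auxiliary cues mentioned above.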
- in some embodiments, a thermometer is embedded in the mobile device, and the thermometer may be accessed directly to obtain the local temperature.
- hot temperatures may also result in the system deciding to alter the user interface or other elements.
- the same types of user interface modifications described above with respect to cold temperatures may also apply to extremely hot temperatures as well.
- FIG. 3 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user experience.
- depicted are two states of the mobile device 300 a, 300 b.
- the user interface 302 a may be identical to the user interface 302 b, but in the presence of a lower temperature, the mobile device 300 b has automatically activated a voice recognition service, which allows the user to speak commands as opposed to, or in conjunction with, touch input.
- FIG. 4 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user experience.
- the user interface 402 a may be identical to the user interface 402 b, but in the presence of a lower temperature, the mobile device 400 b has automatically altered a mapping of a physical button 404 to operate one of the functions of the user interface.
- the physical button 404 which ordinarily may be used to raise a volume level of the mobile device, is automatically switched to activate a search function of a user interface.
- a mobile device upon detecting a low surrounding temperature, can activate a haptic mode of the mobile device, which provides tactile feedback for touch input through the use of, for example, vibrations. This can be helpful when fingers are numb and a user may not ordinarily be able to detect whether or not he or she is actually pressing a touchscreen.
- a mobile device upon detecting a low surrounding temperature, can take active steps to raise the temperature of the mobile device. This may include, for example, increasing the brightness of the screen, running a central processing unit (CPU) at maximum levels, and taking other steps to heat up the mobile device itself.
- Humidity, sun position, sunrise and sunset times, and other parameters that can be obtained from a weather server can also be used.
- other factors, such as ambient noise and location can be utilized as well. With respect to ambient noise, for example, a mobile device could dynamically place itself into a vibrate mode and/or a non-voice recognition mode when the ambient noise becomes too great, as the user may not be able to hear a ringer in such an environment or be able to speak voice commands that could be understood.
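This ambient-noise case is the same mode switch described in the Abstract, where a notification function moves into a mode that activates the vibration motor. A minimal sketch, assuming a single decibel threshold (the value 75 dB is an illustrative assumption, not from the patent):

```python
# Sketch of the Abstract's mode switch: when ambient noise passes a threshold,
# notifications change to a mode that activates the vibration motor, since the
# user likely cannot hear a ringer. The threshold value is an assumption.
NOISE_THRESHOLD_DB = 75.0

def notification_mode(ambient_db, current_mode="ring"):
    """Return 'vibrate' in loud environments; otherwise keep the current mode."""
    if ambient_db >= NOISE_THRESHOLD_DB:
        return "vibrate"
    return current_mode

print(notification_mode(85.0))  # loud environment
print(notification_mode(40.0))  # quiet environment
```

The same gate could also suppress voice recognition in loud environments, as the passage above suggests, since spoken commands would not be reliably understood.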
- FIG. 5 is an interaction diagram illustrating a method 500 , in accordance with an example embodiment, of dynamically altering a user interface.
- a user interface 502 which may be contained on a mobile device 504 , interacts with a physical environment tracking module 506 , also located on the mobile device 504 .
- the physical environment tracking module 506 detects a location of the mobile device 504 , and then at operation 510 requests a local temperature for this location from a weather server 512 .
- the weather server 512 returns the temperature at operation 514 .
- the physical environment tracking module 506 determines that the local temperature passes a predetermined threshold and dynamically modifies the user interface 502 based on this local temperature at operation 516 , and then at operation 518 passes the dynamically modified user interface to the user interface 502 for display.
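The FIG. 5 interaction can be condensed into a short on-device pipeline. This is a hedged sketch: the weather lookup is stubbed with a fixed value, and the function names, threshold, and returned structure are assumptions standing in for the tracking module's real behavior.

```python
# Sketch of the FIG. 5 flow: detect location, fetch the local temperature from
# a weather service, and modify the UI when a threshold is passed.
COLD_THRESHOLD_F = 40.0

def fetch_temperature(lat, lon):
    # Stub standing in for the request to a weather server (assumption);
    # a real client would issue an HTTP call to an actual weather API.
    return 25.0

def update_ui(lat, lon, ui_scale=1.0):
    """Fetch the local temperature and return a modified UI description."""
    temp = fetch_temperature(lat, lon)
    if temp <= COLD_THRESHOLD_F:
        ui_scale *= 1.5  # enlarge touch targets before redisplay
    return {"temperature_f": temp, "ui_scale": ui_scale}

print(update_ui(44.98, -93.27))  # → {'temperature_f': 25.0, 'ui_scale': 1.5}
```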
- FIG. 6 is an interaction diagram illustrating a method 600 , in accordance with an example embodiment, of dynamically altering a user interface.
- a user interface 602 is located on a user device 604 , along with a physical environment tracking module 606 .
- the physical environment tracking module 606 detects information from sensors on the user device 604 .
- this information is passed to a web server 612 .
- the web server 612 requests a local temperature from the weather server 616 , using the sensor information.
- the weather server 616 returns the local temperature, which the web server 612 uses at operation 620 to dynamically modify user interface 602 .
- the web server 612 passes the dynamically modified user interface to the user interface 602 for display.
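In the FIG. 6 variant the modification happens server-side: the device uploads raw sensor information, and the web server resolves a temperature and returns the modified interface. The sketch below is illustrative only; the function names, the freezing-point threshold, and the returned layout structure are assumptions.

```python
# Sketch of the FIG. 6 server-side flow: the server receives sensor info from
# the device, resolves a local temperature, and returns a UI description.
def resolve_temperature(sensor_info):
    # Stub for the web server's call out to a weather server (assumption).
    return 15.0 if sensor_info.get("lat") else 70.0

def build_ui(sensor_info):
    """Return a modified UI description based on the resolved temperature."""
    temp = resolve_temperature(sensor_info)
    if temp <= 32.0:
        # Cold: minimal layout with an enlarged search bar.
        return {"elements": ["search_bar"], "scale": 2.0}
    return {"elements": ["search_bar", "dashboard", "merchandise"], "scale": 1.0}

print(build_ui({"lat": 44.98, "lon": -93.27}))  # cold location → minimal layout
print(build_ui({}))                             # no fix → default layout
```

Moving the decision server-side keeps the thresholds and layout rules updatable without shipping a new client, which is one plausible reason the patent describes both placements.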
- FIG. 7 is a flow diagram illustrating a method 700 , in accordance with an example embodiment, of dynamically altering a user interface.
- a user interface is presented to a user.
- a first aspect of a physical environment of the electronic device is determined.
- An aspect of a physical environment may be any feature of the physical environment, such as temperature, rain patterns, snow conditions, allergy conditions, sun position, etc.
- one or more of the elements of the user interface are dynamically modified based on the first aspect.
- FIG. 8 is a flow diagram illustrating a method 800 , in accordance with another example embodiment, of dynamically altering a user interface.
- physical environment information is received from an electronic device.
- one or more of the elements of the user interface are dynamically modified based on the physical environment information.
- the dynamically modified user interface is passed to the electronic device for display.
- FIG. 9 shows a diagrammatic representation of a machine in the example form of a computer system 900 within which a set of instructions 924 for causing the machine to perform any one or more of the methodologies discussed herein may be executed.
- the machine operates as a standalone device or may be connected (e.g., networked) to other machines.
- the machine may operate in the capacity of a server or a client machine in server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
- the machine may be a server computer, a client computer, a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
- the example computer system 900 includes a processor 902 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 904 and a static memory 906 , which communicate with each other via a bus 908 .
- the computer system 900 may further include a video display unit 910 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)).
- the computer system 900 also includes an alphanumeric input device 912 (e.g., a keyboard), a cursor control device 914 (e.g., a mouse), a disk drive unit 916 , a signal generation device 918 (e.g., a speaker), and a network interface device 920 .
- the disk drive unit 916 includes a computer-readable medium 922 on which is stored one or more sets of instructions 924 (e.g., software) embodying any one or more of the methodologies or functions described herein.
- the instructions 924 may also reside, completely or at least partially, within the main memory 904 and/or within the processor 902 during execution thereof by the computer system 900 , with the main memory 904 and the processor 902 also constituting machine-readable media.
- the instructions 924 may further be transmitted or received over a network 926 via the network interface device 920 .
- machine-readable medium 922 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions 924 .
- the term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies described herein.
- the term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Business, Economics & Management (AREA)
- Software Systems (AREA)
- Finance (AREA)
- Accounting & Taxation (AREA)
- Development Economics (AREA)
- Economics (AREA)
- Marketing (AREA)
- Strategic Management (AREA)
- General Business, Economics & Management (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
In an example embodiment, a first aspect of a physical environment, other than location or current time, of the electronic device is determined. Then a mode of a notification function within the electronic device is dynamically modified such that the mode changes from a first mode in which a notification does not activate a vibration motor in the electronic device to a second mode in which the notification does activate the vibration motor, based on the determined first aspect of the physical environment.
Description
- This application is a continuation of and claims the benefit of priority to U.S. patent application Ser. No. 13/769,499, filed on Feb. 18, 2013, which is hereby incorporated by reference herein in its entirety.
- User experience is a broad term covering many aspects of users' experiences with computing products or services accessed through computing products (such as web sites). The user experience includes not only the user interface, but also the graphics and physical interaction. For the most part, such a user experience is somewhat static in nature. The layout of an application on a smartphone, for example, is generally the same for most or all users who access the application, regardless of the physical environment in which the application is being used. There are factors, however, that can affect the user's ability to interact with such a user experience. Cold temperatures, for example, can make it difficult to use a touchscreen application due to shaking or numb fingers or the use of gloves. While certain gloves have been designed that are “touch-sensitive,” in that a touchscreen device can detect location of the glove despite the absence of an actual finger touching the screen, these gloves still result in an ultimate footprint of the “touch” being larger than the user's finger.
- FIG. 1 is a network diagram depicting a client-server system within which one example embodiment may be deployed.
- FIG. 2 is a diagram illustrating a progression, in accordance with an example embodiment, of dynamic alteration of a user interface based on temperature.
- FIG. 3 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user experience.
- FIG. 4 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user experience.
- FIG. 5 is an interaction diagram illustrating a method, in accordance with an example embodiment, of dynamically altering a user interface.
- FIG. 6 is an interaction diagram illustrating a method, in accordance with an example embodiment, of dynamically altering a user interface.
- FIG. 7 is a flow diagram illustrating a method, in accordance with an example embodiment, of dynamically altering a user interface.
- FIG. 8 is a flow diagram illustrating a method, in accordance with another example embodiment, of dynamically altering a user interface.
- FIG. 9 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.
- The description that follows includes illustrative systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques have not been shown in detail.
- Although the present embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader scope of the embodiments. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
- In an example embodiment, various aspects of a user experience are dynamically altered in order to provide a customized and efficient experience for the user, based on the physical environment. In an example embodiment, temperature of the physical environment of a user device is taken into account, and the user experience is modified based on this temperature. Elements within a user interface, such as button size, advertising sizing, font, color, placement, the presence of certain interface objects, etc., can all be dynamically altered based on this physical environment information as well as other factors (e.g., demographic information, information from user profiles, etc.). For example, a search bar displayed in an application may change in size and location on the screen of a touchscreen device based on the current temperature at the current location of the touchscreen device. In an extreme embodiment, all elements but the search bar may be removed when the temperature is extremely cold, reducing the user interface to its bare minimum elements.
-
FIG. 1 is a network diagram depicting a client-server system 100, within which one example embodiment may be deployed. Anetworked system 102, in the example forms of a network-based marketplace or publication system, provides server-side functionality, via a network 104 (e.g., the Internet or Wide Area Network (WAN)) to one or more clients.FIG. 1 illustrates, for example, a web client 106 (e.g., a browser), and aprogrammatic client 108 executing onrespective client machines - An Application Program Interface (API)
server 114 and aweb server 116 are coupled to, and provide programmatic and web interfaces respectively to, one ormore application servers 118. Theapplication servers 118 host one ormore marketplace applications 120 andpayment applications 122. Theapplication servers 118 are, in turn, shown to be coupled to one ormore database servers 124 that facilitate access to one ormore databases 126. - The
marketplace applications 120 may provide a number of marketplace functions and services to users that access thenetworked system 102. Thepayment applications 122 may likewise provide a number of payment services and functions to users. Thepayment applications 122 may allow users to accumulate value (e.g., in a commercial currency, such as the U.S. dollar, or a proprietary currency, such as “points”) in accounts, and then later to redeem the accumulated value for products (e.g., goods or services) that are made available via themarketplace applications 120. While the marketplace andpayment applications FIG. 1 to both form part of thenetworked system 102, it will be appreciated that, in alternative embodiments, thepayment applications 122 may form part of a payment service that is separate and distinct from thenetworked system 102. - Further, while the
system 100 shown in FIG. 1 employs a client-server architecture, the present disclosure is of course not limited to such an architecture, and may equally well find application in a distributed, or peer-to-peer, architecture system, for example. The various marketplace and payment applications could also be implemented as standalone software programs, which do not necessarily have networking capabilities. - The
web client 106 accesses the various marketplace and payment applications via the web interface supported by the web server 116. Similarly, the programmatic client 108 accesses the various services and functions provided by the marketplace and payment applications via the programmatic interface provided by the API server 114. The programmatic client 108 may, for example, be a seller application (e.g., the TurboLister application developed by eBay Inc., of San Jose, Calif.) to enable sellers to author and manage listings on the networked system 102 in an off-line manner, and to perform batch-mode communications between the programmatic client 108 and the networked system 102.
-
FIG. 1 also illustrates a third party application 128, executing on a third party server machine 130, as having programmatic access to the networked system 102 via the programmatic interface provided by the API server 114. For example, the third party application 128 may, utilizing information retrieved from the networked system 102, support one or more features or functions on a website hosted by the third party. The third party website may, for example, provide one or more promotional, marketplace, or payment functions that are supported by the relevant applications of the networked system 102.
-
FIG. 2 is a diagram illustrating a progression, in accordance with an example embodiment, of dynamic alteration of a user interface based on temperature. Pictured here are a mobile device 200a, 200b, 200c in a first state, a second state, and a third state, respectively. Beginning with the mobile device 200a in the first state, it can be seen that the user interface 202a has various sections, including a search bar 204, an activity dashboard 206, a merchandise area 208, and an advertisement 210. For discussion purposes, this may be referred to as a default or beginning layout, although since the methods described herein are dynamically applied, there need not be any state that is strictly known as a default or beginning layout because the layout may simply be continuously adjusted. The user interface 202a here may depict an interface to an online auction web site, although one of ordinary skill in the art will recognize that this disclosure can apply to other types of user interfaces as well. - Within the
activity dashboard 206 are three activities: watching 212 (for items in the online auction the user has selected as being of interest), buying 214 (for items in the online auction the user has bid on), and selling 216 (for items in the online auction the user is selling). - Within the
merchandise area 208 may be a number of buttons 218-224. - Turning to
mobile device 200b, which is in the second state, the temperature of the environment surrounding the mobile device 200b has fallen below a predetermined threshold. The system may track this temperature change and adjust the user interface 202b to better align with the temperature. Specifically, the size of the search bar 204 may be increased, and the buttons 218-224 in the merchandise area 208 may be increased in size, making it easier for the user to select these items with a shivering, numb, or gloved finger. In order to compensate for the increase in size of these elements, the activity dashboard 206 has been eliminated. - Turning to
mobile device 200c, which is in the third state, the temperature of the environment surrounding the mobile device 200c has fallen even more, past a further threshold. This temperature decrease may be so extreme that the system may decide to reduce the user interface 202c to its barest minimum element, namely the search bar 204, which here has been increased in size to fill the entire user interface 202c. - In the example embodiments where temperature is used as the physical environmental factor affecting the user experience, the temperature may be retrieved in a number of different ways. In one example embodiment, a location of a mobile device is obtained via, for example, global positioning system (GPS) information from a GPS module located in the mobile device. Alternatively, the location may be deduced using other information, such as Internet Protocol (IP) address, cell phone tower proximity or triangulation, or express user interaction (e.g., the user informs the application of the location). A temperature corresponding to the location may then be retrieved from a weather server, which may provide a current temperature for the user's location. In some embodiments, it may be appropriate to distinguish between when the user device is indoors or outdoors. For example, wintertime outside in Minneapolis may be −10 degrees Fahrenheit, but inside in the same city the temperature may be 70 degrees Fahrenheit. While the GPS information may be precise enough to determine whether the user is inside or outside, there may be instances where it is borderline (such as if the user is near an exit, or due to GPS interference). In such cases, other factors may be used to help determine whether the mobile device is inside or outside, such as ambient light levels or information received from a microphone on the mobile device (e.g., crickets chirping, traffic sounds, or wind noise may all be indicative of being outside).
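The retrieval flow just described (prefer an on-device thermometer, otherwise a location-based weather lookup, with light and sound cues breaking indoor/outdoor ties) can be sketched as below. The weather lookup is injected as a callable so no particular weather-service API is implied, and the lux cutoff and sound tags are illustrative assumptions:

```python
def local_temperature(location, fetch_weather_temp, thermometer_read=None):
    """Prefer a device thermometer if present; otherwise ask a weather service."""
    if thermometer_read is not None:
        return thermometer_read()
    return fetch_weather_temp(location)

def likely_outdoors(gps_says_outside, ambient_lux, mic_tags):
    """Fuse GPS with light and sound cues when the GPS fix is borderline.

    gps_says_outside is True/False when the fix is decisive, None when
    borderline (near an exit, GPS interference); mic_tags is a set of
    sound classifications from the microphone.
    """
    if gps_says_outside is not None:        # GPS fix was decisive
        return gps_says_outside
    outdoor_sounds = {"crickets", "traffic", "wind"}
    if mic_tags & outdoor_sounds:           # outdoor-typical sounds heard
        return True
    return ambient_lux > 10_000             # daylight-level brightness (assumed)
```

For example, `likely_outdoors(None, 100, {"traffic"})` would report outdoors on the strength of the microphone cue alone.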
- Other mechanisms to detect temperature may also be used. In some example embodiments, a thermometer is embedded in a mobile device, and the thermometer may be accessed directly to obtain the local temperature.
- Additionally, it is not just cold temperatures that may affect the user experience. In some embodiments, hot temperatures may also result in the system deciding to alter the user interface or other elements. When temperatures exceed 90 or 100 degrees Fahrenheit, for example, it is quite common for users' hands to get sweaty, and the sweat can interfere with a touchscreen's ability to accurately detect user position. As such, the same types of user interface modifications described above with respect to cold temperatures may apply to extremely hot temperatures as well.
- Furthermore, while the above embodiments describe altering the visual user interface itself in response to the physical environment, other aspects of the user experience can be altered in lieu of or in conjunction with the visual user interface.
-
FIG. 3 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user experience. Here, depicted are two states of the mobile device 300a, 300b. The user interface 302a may be identical to the user interface 302b, but in the presence of a lower temperature, the mobile device 300b has automatically activated a voice recognition service, which allows the user to speak commands as opposed to, or in conjunction with, touch input.
-
FIG. 4 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user experience. Here, depicted are two states of the mobile device 400a, 400b. The user interface 402a may be identical to the user interface 402b, but in the presence of a lower temperature, the mobile device 400b has automatically altered a mapping of a physical button 404 to operate one of the functions of the user interface. Here, the physical button 404, which ordinarily may be used to raise a volume level of the mobile device, is automatically switched to activate a search function of a user interface. - Other changes in response to the physical environment may also be implemented. In one example embodiment, a mobile device, upon detecting a low surrounding temperature, can activate a haptic mode of the mobile device, which provides tactile feedback for touch input through the use of, for example, vibrations. This can be helpful when fingers are numb and a user may not ordinarily be able to detect whether or not he or she is actually pressing a touchscreen.
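The button remapping of FIG. 4 amounts to an indirection table between physical keys and actions. Real mobile platforms each expose key events through their own APIs, so the class and action names below are hypothetical stand-ins that only model the remap itself:

```python
class ButtonMap:
    """Maps physical buttons to actions; the mapping can change at runtime."""

    def __init__(self):
        # Default mapping: the volume-up key raises the volume.
        self.actions = {"volume_up": "raise_volume"}

    def on_cold(self):
        # Below a cold threshold, volume-up triggers the search function instead,
        # as in the FIG. 4 example.
        self.actions["volume_up"] = "activate_search"

    def press(self, button):
        """Return the action bound to a button press."""
        return self.actions.get(button, "unhandled")
```

The indirection keeps the UI code unchanged: it only ever sees action names, never raw key codes.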
- In another example embodiment, a mobile device, upon detecting a low surrounding temperature, can take active steps to raise the temperature of the mobile device. This may include, for example, increasing the brightness of the screen, running a central processing unit (CPU) at maximum levels, and taking other steps to heat up the mobile device itself.
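The self-heating steps listed above can be codified as a small control routine. The control dictionary keys, the cold threshold, and the governor value are hypothetical, since each mobile OS exposes brightness and CPU controls differently:

```python
def warm_device(controls, temp_f, cold_f=20.0):
    """Take active steps to raise device temperature when it is very cold.

    controls is a mutable mapping of device settings (hypothetical keys).
    """
    if temp_f < cold_f:
        controls["screen_brightness"] = 1.0        # maximum brightness
        controls["cpu_governor"] = "performance"   # run the CPU at maximum levels
    return controls
```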
- Other parameters of a physical environment can also be used to dynamically alter the user experience. Humidity, sun position, sunrise and sunset times, and other parameters that can be obtained from a weather server can also be used. Additionally, other factors, such as ambient noise and location, can be utilized as well. With respect to ambient noise, for example, a mobile device could dynamically place itself into a vibrate mode and/or a non-voice recognition mode when the ambient noise becomes too great, as the user may not be able to hear a ringer in such an environment or be able to speak voice commands that could be understood.
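The ambient-noise adaptation above, which is also the subject of the claims below (switching a notification from a mode that does not activate the vibration motor to one that does), can be sketched as a single decision function. The 80 dB cutoff and mode names are illustrative assumptions:

```python
LOUD_DB = 80.0  # assumed ambient-noise threshold

def notification_mode(ambient_db):
    """Pick how a notification (e.g., incoming call or text) is delivered.

    In a loud environment the ringer may be inaudible, so switch to a mode
    that activates the vibration motor; voice recognition is also disabled,
    since spoken commands would not be understood over the noise.
    """
    if ambient_db >= LOUD_DB:
        return {"vibrate": True, "ring": False, "voice_recognition": False}
    return {"vibrate": False, "ring": True, "voice_recognition": True}
```

A second environmental aspect (per claim 6) could additionally scale the vibration intensity, but the mode switch itself needs only the one measurement.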
-
FIG. 5 is an interaction diagram illustrating a method 500, in accordance with an example embodiment, of dynamically altering a user interface. In this method 500, a user interface 502, which may be contained on a mobile device 504, interacts with a physical environment tracking module 506, also located on the mobile device 504. At operation 508, the physical environment tracking module 506 detects a location of the mobile device 504, and then at operation 510 requests a local temperature for this location from a weather server 512. The weather server 512 returns the temperature at operation 514. The physical environment tracking module 506 then determines that the local temperature passes a predetermined threshold and dynamically modifies the user interface 502 based on this local temperature at operation 516, and then at operation 518 passes the dynamically modified user interface to the user interface 502 for display.
-
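The operation sequence of FIG. 5 can be sketched end to end as below. The injected callables stand in for the GPS module and the weather server, and the threshold and the specific modification (doubling the search bar scale) are illustrative assumptions:

```python
def run_method_500(detect_location, fetch_temp, threshold_f, base_ui):
    """Client-side flow of FIG. 5: locate, look up temperature, adapt the UI."""
    location = detect_location()        # operation 508: detect device location
    temp = fetch_temp(location)         # operations 510/514: weather round trip
    if temp < threshold_f:              # operation 516: threshold check
        ui = dict(base_ui, search_bar_scale=2.0)  # enlarge the search bar
    else:
        ui = dict(base_ui)
    return ui                           # operation 518: pass UI for display
```

The server-side variant of FIG. 6 differs only in where this function runs: the sensor readings go up to a web server, which performs the lookup and modification and returns the modified interface.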
FIG. 6 is an interaction diagram illustrating a method 600, in accordance with an example embodiment, of dynamically altering a user interface. In this method 600, a user interface 602 is located on a user device 604, along with a physical environment tracking module 606. At operation 608, the physical environment tracking module 606 detects information from sensors on the user device 604. At operation 610, this information is passed to a web server 612. At operation 614, the web server 612 requests a local temperature from the weather server 616, using the sensor information. At operation 618, the weather server 616 returns the local temperature, which the web server 612 uses at operation 620 to dynamically modify the user interface 602. At operation 622, the web server 612 passes the dynamically modified user interface to the user interface 602 for display.
-
FIG. 7 is a flow diagram illustrating a method 700, in accordance with an example embodiment, of dynamically altering a user interface. At operation 702, a user interface is presented to a user. At operation 704, a first aspect of a physical environment of an electronic device is determined. An aspect of a physical environment may be any feature of the physical environment, such as temperature, rain patterns, snow conditions, allergy conditions, sun position, etc. At operation 706, one or more of the elements of the user interface are dynamically modified based on the first aspect.
-
FIG. 8 is a flow diagram illustrating a method 800, in accordance with another example embodiment, of dynamically altering a user interface. At operation 802, physical environment information is received from an electronic device. At operation 804, one or more of the elements of the user interface are dynamically modified based on the physical environment information. At operation 806, the dynamically modified user interface is passed to the electronic device for display.
-
FIG. 9 shows a diagrammatic representation of a machine in the example form of a computer system 900 within which a set of instructions 924 for causing the machine to perform any one or more of the methodologies discussed herein may be executed. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a server computer, a client computer, a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein. - The
example computer system 900 includes a processor 902 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 904, and a static memory 906, which communicate with each other via a bus 908. The computer system 900 may further include a video display unit 910 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 900 also includes an alphanumeric input device 912 (e.g., a keyboard), a cursor control device 914 (e.g., a mouse), a disk drive unit 916, a signal generation device 918 (e.g., a speaker), and a network interface device 920. - The
disk drive unit 916 includes a computer-readable medium 922 on which is stored one or more sets of instructions 924 (e.g., software) embodying any one or more of the methodologies or functions described herein. The instructions 924 may also reside, completely or at least partially, within the main memory 904 and/or within the processor 902 during execution thereof by the computer system 900, with the main memory 904 and the processor 902 also constituting machine-readable media. The instructions 924 may further be transmitted or received over a network 926 via the network interface device 920. - While the machine-
readable medium 922 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions 924. The term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies described herein. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals. - Although the inventive concepts have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the inventive concepts. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
- The Abstract of the Disclosure is provided to comply with 37 C.F.R. §1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.
Claims (20)
1. An electronic device, comprising:
a processor;
a vibration motor configured to cause a physical vibration in the electronic device upon activation of the vibration motor; and
a physical environment tracking module configured to determine a first aspect of a physical environment, other than location or current time, of the electronic device and to dynamically alter a mode of a notification function within the electronic device such that the mode changes from a first mode in which a notification does not activate the vibration motor to a second mode in which the notification does activate the vibration motor, based on the determined first aspect of the physical environment.
2. The electronic device of claim 1, further comprising a microphone and wherein the first aspect of the physical environment is ambient noise levels recorded by the microphone.
3. The electronic device of claim 1, wherein the notification is an incoming phone call.
4. The electronic device of claim 1, wherein the notification is an incoming text message.
5. The electronic device of claim 1, further comprising a thermometer and wherein the first aspect of the physical environment is temperature recorded by the thermometer.
6. The electronic device of claim 1, wherein intensity of the physical vibration caused by the vibration motor is varied based on a second aspect of the physical environment.
7. The electronic device of claim 1, wherein the first aspect of the physical environment is acceleration of the electronic device as detected by an accelerometer.
8. A method of dynamically altering a user interface on an electronic device, comprising:
determining a first aspect of a physical environment, other than location or current time, of the electronic device; and
dynamically altering a mode of a notification function within the electronic device such that the mode changes from a first mode in which a notification does not activate a vibration motor in the electronic device to a second mode in which the notification does activate the vibration motor, based on the determined first aspect of the physical environment.
9. The method of claim 8, wherein the first aspect of the physical environment is ambient noise levels recorded by a microphone.
10. The method of claim 8, wherein the notification is an incoming phone call.
11. The method of claim 8, wherein the notification is an incoming text message.
12. The method of claim 8, wherein the first aspect of the physical environment is temperature recorded by a thermometer.
13. The method of claim 8, wherein intensity of the physical vibration caused by the vibration motor is varied based on a second aspect of the physical environment.
14. The method of claim 8, wherein the first aspect of the physical environment is acceleration of the electronic device as detected by an accelerometer.
15. A non-transitory machine-readable storage medium comprising instructions, which when implemented by one or more machines, cause the one or more machines to perform operations comprising:
determining a first aspect of a physical environment, other than location or current time, of an electronic device; and
dynamically altering a mode of a notification function within the electronic device such that the mode changes from a first mode in which a notification does not activate a vibration motor in the electronic device to a second mode in which the notification does activate the vibration motor, based on the determined first aspect of the physical environment.
16. The non-transitory machine-readable storage medium of claim 15, wherein the first aspect of the physical environment is ambient noise levels recorded by a microphone.
17. The non-transitory machine-readable storage medium of claim 15, wherein the notification is an incoming phone call.
18. The non-transitory machine-readable storage medium of claim 15, wherein the notification is an incoming text message.
19. The non-transitory machine-readable storage medium of claim 15, wherein the first aspect of the physical environment is temperature recorded by a thermometer.
20. The non-transitory machine-readable storage medium of claim 15, wherein intensity of the physical vibration caused by the vibration motor is varied based on a second aspect of the physical environment.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/352,780 US20170060394A1 (en) | 2013-02-18 | 2016-11-16 | System and method of modifying a user experience based on physical environment |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/769,499 US9501201B2 (en) | 2013-02-18 | 2013-02-18 | System and method of modifying a user experience based on physical environment |
US15/352,780 US20170060394A1 (en) | 2013-02-18 | 2016-11-16 | System and method of modifying a user experience based on physical environment |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/769,499 Continuation US9501201B2 (en) | 2013-02-18 | 2013-02-18 | System and method of modifying a user experience based on physical environment |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170060394A1 true US20170060394A1 (en) | 2017-03-02 |
Family
ID=51352236
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/769,499 Active 2034-01-06 US9501201B2 (en) | 2013-02-18 | 2013-02-18 | System and method of modifying a user experience based on physical environment |
US15/352,780 Abandoned US20170060394A1 (en) | 2013-02-18 | 2016-11-16 | System and method of modifying a user experience based on physical environment |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/769,499 Active 2034-01-06 US9501201B2 (en) | 2013-02-18 | 2013-02-18 | System and method of modifying a user experience based on physical environment |
Country Status (1)
Country | Link |
---|---|
US (2) | US9501201B2 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6060783B2 (en) * | 2013-04-08 | 2017-01-18 | 富士通株式会社 | Correction processing program, information processing apparatus, and correction processing method |
CN105373299A (en) * | 2014-08-25 | 2016-03-02 | 深圳富泰宏精密工业有限公司 | Electronic apparatus and display interface adjustment method therefor |
US9852355B2 (en) * | 2015-04-21 | 2017-12-26 | Thales Avionics, Inc. | Facial analysis for vehicle entertainment system metrics |
WO2020014984A1 (en) * | 2018-07-20 | 2020-01-23 | 华为技术有限公司 | Application program control method and electronic device |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080036591A1 (en) * | 2006-08-10 | 2008-02-14 | Qualcomm Incorporated | Methods and apparatus for an environmental and behavioral adaptive wireless communication device |
US20100321312A1 (en) * | 2009-06-19 | 2010-12-23 | Lg Electronics Inc. | Method for processing touch signal in mobile terminal and mobile terminal using the same |
US20120276947A1 (en) * | 2011-04-27 | 2012-11-01 | Daniel Adam Smith | Symmetrical communicator device that dynamically changes the function of the hardware as orientation of the device changes. |
US20130006404A1 (en) * | 2011-06-30 | 2013-01-03 | Nokia Corporation | Method and apparatus for providing audio-based control |
US20140049883A1 (en) * | 2012-08-20 | 2014-02-20 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling vibration intensity according to situation awareness in electronic device |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6567104B1 (en) * | 1999-05-20 | 2003-05-20 | Microsoft Corporation | Time-based dynamic user interface elements |
WO2002033541A2 (en) * | 2000-10-16 | 2002-04-25 | Tangis Corporation | Dynamically determining appropriate computer interfaces |
US6668177B2 (en) * | 2001-04-26 | 2003-12-23 | Nokia Corporation | Method and apparatus for displaying prioritized icons in a mobile terminal |
EP1429314A1 (en) * | 2002-12-13 | 2004-06-16 | Sony International (Europe) GmbH | Correction of energy as input feature for speech processing |
US7689256B2 (en) * | 2003-11-10 | 2010-03-30 | Research In Motion Limited | Methods and apparatus for limiting communication capabilities in mobile communication devices |
US20060107219A1 (en) * | 2004-05-26 | 2006-05-18 | Motorola, Inc. | Method to enhance user interface and target applications based on context awareness |
US7890863B2 (en) * | 2006-10-04 | 2011-02-15 | Immersion Corporation | Haptic effects with proximity sensing |
US20080168267A1 (en) * | 2007-01-09 | 2008-07-10 | Bolen Charles S | System and method for dynamically configuring a mobile device |
KR101390103B1 (en) * | 2007-04-03 | 2014-04-28 | 엘지전자 주식회사 | Controlling image and mobile terminal |
US9170649B2 (en) * | 2007-12-28 | 2015-10-27 | Nokia Technologies Oy | Audio and tactile feedback based on visual environment |
US8040233B2 (en) * | 2008-06-16 | 2011-10-18 | Qualcomm Incorporated | Methods and systems for configuring mobile devices using sensors |
EP3206381A1 (en) * | 2008-07-15 | 2017-08-16 | Immersion Corporation | Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging |
US8086265B2 (en) * | 2008-07-15 | 2011-12-27 | At&T Intellectual Property I, Lp | Mobile device interface and methods thereof |
WO2010074740A1 (en) * | 2008-12-24 | 2010-07-01 | Telecommunication Systems, Inc. | Point of interest (poi) navigation search using business hours |
US20110054776A1 (en) * | 2009-09-03 | 2011-03-03 | 21St Century Systems, Inc. | Location-based weather update system, method, and device |
US9132773B2 (en) * | 2009-12-07 | 2015-09-15 | Cobra Electronics Corporation | Mobile communication system and method for analyzing alerts associated with vehicular travel |
US20120081337A1 (en) * | 2010-10-04 | 2012-04-05 | Sony Ericsson Mobile Communications Ab | Active Acoustic Multi-Touch and Swipe Detection for Electronic Devices |
US20120197728A1 (en) * | 2011-01-27 | 2012-08-02 | Seven Networks, Inc. | Single action access to context specific content at a mobile device |
KR102078093B1 (en) * | 2011-11-10 | 2020-02-18 | 삼성전자 주식회사 | Device and method for controlling temperature of wireless terminal |
- 2013-02-18: US 13/769,499 filed; patent US9501201B2 (Active)
- 2016-11-16: US 15/352,780 filed; publication US20170060394A1 (Abandoned)
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080036591A1 (en) * | 2006-08-10 | 2008-02-14 | Qualcomm Incorporated | Methods and apparatus for an environmental and behavioral adaptive wireless communication device |
US20100321312A1 (en) * | 2009-06-19 | 2010-12-23 | Lg Electronics Inc. | Method for processing touch signal in mobile terminal and mobile terminal using the same |
US20120276947A1 (en) * | 2011-04-27 | 2012-11-01 | Daniel Adam Smith | Symmetrical communicator device that dynamically changes the function of the hardware as orientation of the device changes. |
US20130006404A1 (en) * | 2011-06-30 | 2013-01-03 | Nokia Corporation | Method and apparatus for providing audio-based control |
US20140049883A1 (en) * | 2012-08-20 | 2014-02-20 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling vibration intensity according to situation awareness in electronic device |
Also Published As
Publication number | Publication date |
---|---|
US9501201B2 (en) | 2016-11-22 |
US20140237400A1 (en) | 2014-08-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11112867B2 (en) | Surfacing related content based on user interaction with currently presented content | |
US11009958B2 (en) | Method and apparatus for providing sight independent activity reports responsive to a touch gesture | |
US9069443B2 (en) | Method for dynamically displaying a personalized home screen on a user device | |
EP2904471B1 (en) | Data and user interaction based on device proximity | |
US11012753B2 (en) | Computerized system and method for determining media based on selected motion video inputs | |
CN108073605B (en) | Method and device for loading and pushing service data and generating interactive information | |
US9395875B2 (en) | Systems, methods, and computer program products for navigating through a virtual/augmented reality | |
JP2021527247A (en) | Matching content to a spatial 3D environment | |
US20170060394A1 (en) | System and method of modifying a user experience based on physical environment | |
US9342490B1 (en) | Browser-based notification overlays | |
US20150082145A1 (en) | Approaches for three-dimensional object display | |
US9933931B2 (en) | Freeze pane with snap scrolling | |
US20200366556A1 (en) | Phone thermal context | |
JP5580924B1 (en) | Distribution device, terminal device, distribution method, and distribution program | |
JP6224682B2 (en) | Information display program, information display device, information display method, and distribution device | |
CN107609146A (en) | Information displaying method, device, terminal and server | |
US20150058792A1 (en) | Methods, systems and apparatuses for providing user interface navigation, display interactivity and multi-browser arrays | |
JP6158903B2 (en) | Information display program, information display device, information display method, and distribution device | |
US10628848B2 (en) | Entity sponsorship within a modular search object framework | |
JP2017129752A (en) | Information display program, information display method, and control device | |
JP6144245B2 (en) | Terminal device, distribution device, display method, and display program | |
JP5646709B1 (en) | Terminal device, distribution device, display method, and display program | |
US20230409352A1 (en) | Systems and Methods for Dynamically Generating Context Aware Active Icons on a Mobile Device | |
JP2019036219A (en) | Display control program, display controller, method for controlling display, and distribution device | |
CN116048335A (en) | Interaction method, interaction device, electronic apparatus, and computer-readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | AS | Assignment | Owner name: EBAY INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: HIGGINS, KRYSTAL ROSE; FARRARO, ERIC J.; TAPLEY, JOHN; REEL/FRAME: 040342/0068. Effective date: 20130215
 | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
 | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION