US20230072889A1 - Displaying a virtual environment of a session
- Publication number
- US20230072889A1 (application US 18/055,289)
- Authority
- US
- United States
- Prior art keywords
- item
- user
- dimensional representation
- virtual environment
- causing display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
- G06Q30/0643—Graphical representation of items or shoppers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0633—Lists, e.g. purchase orders, compilation or processing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/816—Monomedia components thereof involving special video data, e.g. 3D video
Definitions
- Embodiments of the present disclosure relate generally to data processing and, more particularly, but not by way of limitation, to facilitating display of a virtual environment of a session.
- a user may view an item page hosted by a network commerce system.
- the item page includes content that is displayed on a device of the user.
- FIG. 2 is a block diagram illustrating components of a virtual environment system, according to some example embodiments.
- FIGS. 3 - 6 are flowcharts illustrating operations of the virtual environment system in performing a method of displaying 3D content, according to some example embodiments.
- FIG. 7 is a block diagram illustrating an example user interface of an item page, according to some example embodiments.
- FIG. 8 is a block diagram illustrating an example user interface of a virtual item page, according to some example embodiments.
- FIG. 9 is a block diagram illustrating an example user interface of an item page that includes a result depicting processed user interactions, according to some example embodiments.
- FIG. 10 illustrates a diagrammatic representation of a machine in the form of a computer system within which a set of instructions may be executed for causing the machine to perform any one or more of the methodologies discussed herein, according to an example embodiment.
- a system presents a session that displays two-dimensional (2D) content on a first device of a user. For instance, the user may be viewing an item page of one or more items available for sale on the device. Also, the system may detect a second device of the user that is capable of viewing three-dimensional (3D) content corresponding to the session. Upon detection of the second device, the system retrieves the 3D content and causes display of the 3D content on the second device of the user.
- the 3D content corresponding to the session includes 3D objects representative of information on a network of the system. In some instances, the information includes items available for sale. Moreover, the 3D content is selectable by the user to perform interactions with the 3D content.
- the system may process the user interactions with the 3D content and display a result, on the first device, that depicts the user interactions with the 3D content as being processed.
- the actions performed by the user while viewing the 3D content on the second device are also reflected in the session that displays the 2D content. This allows for a smooth transition between the 2D session displayed on the first device of the user and the 3D session displayed on the second device of the user.
- one or more of the methodologies discussed herein may obviate a need for a user to interact with two separate sessions, which may have the technical effect of reducing computing resources used by one or more devices within the system.
- computing resources include, without limitation, processor cycles, network traffic, memory usage, storage space, and power consumption.
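To make the flow above concrete, the following is a minimal sketch of how a single session might be shared between a 2D device and a 3D-capable device. Every name in it (Session, detect_3d_device, retrieve_3d_content, process_interaction) is hypothetical; the patent describes the behavior, not this code.

```python
# Hypothetical sketch of the 2D-to-3D session hand-off described above.
# Names and data shapes are illustrative only, not taken from the patent.
from dataclasses import dataclass, field


@dataclass
class Session:
    user_id: str
    items_2d: list                       # 2D item listings shown on the first device
    cart: list = field(default_factory=list)


def detect_3d_device(session: Session, request: dict) -> bool:
    """A second device announces itself with the same user credentials."""
    return request.get("user_id") == session.user_id and request.get("supports_3d", False)


def retrieve_3d_content(session: Session) -> list:
    """Look up 3D objects stored as corresponding to the 2D items in the session."""
    return [{"item_id": item["id"], "model": f"{item['id']}.glb"} for item in session.items_2d]


def process_interaction(session: Session, interaction: dict) -> dict:
    """Apply a 3D interaction (e.g., a selection) to the shared session state."""
    if interaction["type"] == "select":
        session.cart.append(interaction["item_id"])
    # The result is what the first (2D) device re-renders, so both views stay in sync.
    return {"cart": list(session.cart)}


if __name__ == "__main__":
    session = Session(user_id="u1", items_2d=[{"id": "coin"}, {"id": "bat"}])
    if detect_3d_device(session, {"user_id": "u1", "supports_3d": True}):
        content = retrieve_3d_content(session)       # sent to the second device
        result = process_interaction(session, {"type": "select", "item_id": "bat"})
        print(content, result)                       # result is reflected on the first device
```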
- a networked system 102 in the example forms of a network-based publication or payment system, provides server-side functionality via a network 104 (e.g., the Internet or wide area network (WAN)) to one or more client devices 110 .
- FIG. 1 illustrates, for example, a web client 112 (e.g., a browser, such as the Internet Explorer® browser developed by Microsoft® Corporation of Redmond, Wash. State), a client application 114 , and a programmatic client 116 executing on the client device 110 .
- the client device 110 may comprise, but is not limited to, a mobile phone, desktop computer, laptop, portable digital assistants (PDAs), smart phones, tablets, virtual headsets, ultra-books, netbooks, laptops, multi-processor systems, microprocessor-based or programmable consumer electronics, game consoles, set-top boxes, or any other communication device that a user may utilize to access the networked system 102 .
- the client device 110 comprises a display module (not shown) to display information (e.g., in the form of user interfaces).
- the client device 110 may comprise one or more of touch screens, accelerometers, gyroscopes, cameras, microphones, global positioning system (GPS) devices, and so forth.
- the client device 110 may be a device of a user that is used to perform a transaction involving digital items within the networked system 102 .
- the networked system 102 is a network-based marketplace that responds to requests for product listings, publishes publications comprising item listings of products available on the network-based marketplace, and manages payments for these marketplace transactions.
- one or more portions of the network 104 may be an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, a wireless network, a WiFi network, a WiMax network, another type of network, or a combination of two or more such networks.
- Each of the client devices 110 includes one or more applications (also referred to as “apps”) such as, but not limited to, a web browser, messaging application, electronic mail (email) application, an e-commerce site application (also referred to as a marketplace application), and the like.
- this application is configured to locally provide the user interface and at least some of the functionalities with the application configured to communicate with the networked system 102 , on an as needed basis, for data and/or processing capabilities not locally available (e.g., access to a database of items available for sale, to authenticate a user, to verify a method of payment).
- the client device 110 may use its web browser to access the e-commerce site (or a variant thereof) hosted on the networked system 102 .
- One or more users 106 may be a person, a machine, or other means of interacting with the client device 110 .
- the user 106 is not part of the network architecture 100 , but interacts with the network architecture 100 via the client device 110 or other means.
- the user 106 provides input (e.g., touch screen input or alphanumeric input) to the client device 110 and the input is communicated to the networked system 102 via the network 104 .
- the networked system 102 in response to receiving the input from the user 106 , communicates information to the client device 110 via the network 104 to be presented to the user 106 . In this way, the user 106 can interact with the networked system 102 using the client device 110 .
- An application program interface (API) server 120 and a web server 122 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 140 .
- the application servers 140 host one or more publication systems 142 and payment systems 144 , each of which may comprise one or more modules or applications and each of which may be embodied as hardware, software, firmware, or any combination thereof.
- the application servers 140 are, in turn, shown to be coupled to one or more database servers 124 that facilitate access to one or more information storage repositories or database(s) 126 .
- the databases 126 are storage devices that store information to be posted (e.g., publications or listings) to the publication system 142 .
- the databases 126 may also store digital item information in accordance with example embodiments.
- a third party application 132 executing on third party server(s) 130 , is shown as having programmatic access to the networked system 102 via the programmatic interface provided by the API server 120 .
- the third party application 132 utilizing information retrieved from the networked system 102 , supports one or more features or functions on a website hosted by the third party.
- the third party website for example, provides one or more promotional, marketplace, or payment functions that are supported by the relevant applications of the networked system 102 .
- the publication systems 142 provide a number of publication functions and services to users 106 that access the networked system 102 .
- the payment systems 144 likewise provide a number of functions to perform or facilitate payments and transactions. While the publication system 142 and payment system 144 are shown in FIG. 1 to both form part of the networked system 102 , it will be appreciated that, in alternative embodiments, each system 142 and 144 may form part of a payment service that is separate and distinct from the networked system 102 . In some embodiments, the payment systems 144 may form part of the publication system 142 .
- the virtual environment system 150 provides virtual three-dimensional content that is displayed on a user device capable of viewing the three-dimensional content.
- the virtual environment system 150 upon detection of the user device, retrieves 3D content that corresponds to a session that a user is viewing.
- the virtual environment system 150 may access the 3D content from the databases 126 , the third party servers 130 , the publication system 142 , and other sources.
- the virtual environment system 150 communicates with the publication systems 142 (e.g., accessing item listings) and payment system 144 .
- the virtual environment system 150 may be a part of the publication system 142 .
- while the client-server-based network architecture 100 shown in FIG. 1 employs a client-server architecture, the present inventive subject matter is of course not limited to such an architecture, and could equally well find application in a distributed, or peer-to-peer, architecture system, for example.
- the various publication system 142 , payment system 144 , and virtual environment system 150 could also be implemented as standalone software programs, which do not necessarily have networking capabilities.
- the web client 112 accesses the various publication and payment systems 142 and 144 via the web interface supported by the web server 122 .
- the programmatic client 116 accesses the various services and functions provided by the publication and payment systems 142 and 144 via the programmatic interface provided by the API server 120 .
- the programmatic client 116 may, for example, be a seller application (e.g., the Turbo Lister application developed by eBay® Inc., of San Jose, Calif.) to enable sellers to author and manage listings on the networked system 102 in an off-line manner, and to perform batch-mode communications between the programmatic client 116 and the networked system 102 .
- FIG. 2 is a block diagram illustrating components of the virtual environment system 150 , according to some example embodiments.
- the virtual environment system 150 is shown as including a session module 210 , a detection module 220 , a virtual content module 230 , a display module 240 , a reception module 250 , and a process module 260 , all configured to communicate with each other (e.g., via a bus, shared memory, or a switch).
- Any one or more of the modules described herein may be implemented using hardware (e.g., one or more processors of a machine) or a combination of hardware and software.
- any module described herein may configure a processor (e.g., among one or more processors of a machine) to perform the operations described herein for that module.
- modules described herein as being implemented within a single machine, database, or device may be distributed across multiple machines, databases, or devices.
- the session module 210 is configured to cause presentation of a session that displays two-dimensional (2D) content.
- the 2D content is of one or more items available for sale.
- the session module 210 communicates with a first device of a user and presents the session on that device (e.g., client device 110 ).
- Example sessions may include browsing an item page of one or more items available for sale, viewing a website of a brick-and-mortar store, watching a video clip, and the like.
- the session module 210 is further to cause presentation of the session based on user credentials of a user. For instance, a user provides the user credentials to log into a user account. Upon login to the user account, the session module 210 presents the session that displays the 2D content.
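A minimal sketch of that credential-gated session setup follows; the credential store, session registry, and function name are assumptions made for illustration only.

```python
# Hypothetical session module: the 2D session is presented only after the
# supplied user credentials resolve to an account.
ACCOUNTS = {"alice": "secret"}          # stand-in credential store
SESSIONS = {}                           # user_id -> session state


def start_2d_session(user_id: str, password: str, items_2d: list) -> dict:
    if ACCOUNTS.get(user_id) != password:
        raise PermissionError("invalid user credentials")
    SESSIONS[user_id] = {"items_2d": items_2d, "cart": []}
    return SESSIONS[user_id]            # rendered as 2D content on the first device


print(start_2d_session("alice", "secret", items_2d=[{"id": "vintage_coin"}]))
```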
- the detection module 220 detects a second device of the user that is able to display three-dimensional (3D) content of the one or more items available for sale.
- the second device of the user may be a virtual reality (VR) headset, a VR component of a mobile device, or any other device compatible with displaying 3D content.
- the detection module 220 is to receive the user credentials from the second device of the user.
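The detection step can be sketched as matching the credentials presented by the second device against an active 2D session; the request fields and helper name below are hypothetical.

```python
# Hypothetical detection module: a 3D-capable device presenting the same
# credentials as an active 2D session is treated as the user's second device.
from typing import Optional


def detect_second_device(request: dict, accounts: dict, sessions: dict) -> Optional[dict]:
    """Return the existing 2D session if the second device authenticates as the same user."""
    user_id, password = request.get("user_id"), request.get("password")
    if accounts.get(user_id) == password and request.get("supports_3d"):
        return sessions.get(user_id)     # existing 2D session to pair with
    return None


# Example: a headset announces itself with the credentials used on the first device.
accounts = {"alice": "secret"}
sessions = {"alice": {"items_2d": [{"id": "coin"}], "cart": []}}
paired = detect_second_device(
    {"user_id": "alice", "password": "secret", "supports_3d": True}, accounts, sessions
)
print(paired is not None)   # True -> retrieve and display the corresponding 3D content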
- the virtual content module 230 retrieves the 3D content that corresponds to the 2D content for the session.
- the 3D content includes one or more 3D objects.
- the 3D content is of the one or more items available for sale, and each of the one or more items available for sale is represented as a 3D object.
- the 3D content will include a virtual environment for the session.
- the virtual environment is used to present the 3D objects representative of the one or more items available for sale.
- the virtual environment may include a 3D item page, a 3D layout of a brick-and-mortar store, a 3D layout of a mall, and the like.
- the 3D item page depicts the 2D components of the item page in 3D form.
- a 3D model of the item is presented in the 3D item page.
- the 3D layout of the brick-and-mortar store may depict the real-life layout of the brick-and-mortar store.
- the 3D layout of the mall may depict the real-life layout of a shopping mall. Therefore, in some instances, the one or more items available for sale will be arranged as 3D objects within the virtual environment in a manner that emulates their real-life counterparts, such as the brick-and-mortar store or the shopping mall.
- the virtual content module 230 is further to retrieve the 3D content based on the user credentials. In other words, the same user credentials used to present the session that displays the 2D content are also used to retrieve or access the 3D content.
- the 3D content is labeled as corresponding to the 2D content and is stored in a database, such as the database 126 . Therefore, the virtual content module 230 retrieves the 3D content corresponding to the 2D content from the database 126 .
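As one possible realization of that lookup, the sketch below stores each 3D asset with a label naming the 2D content it corresponds to and queries by that label; the table layout and column names are assumptions, not part of the disclosure.

```python
# Hypothetical storage of 3D assets labeled with the 2D content they correspond to,
# standing in for database 126.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE content_3d (item_id TEXT PRIMARY KEY, corresponds_to_2d TEXT, model_uri TEXT)")
db.execute("INSERT INTO content_3d VALUES ('coin', 'item_page_700', 'models/coin.glb')")
db.execute("INSERT INTO content_3d VALUES ('bat',  'item_page_700', 'models/bat.glb')")


def retrieve_3d_content(page_id: str) -> list:
    """Return every 3D object labeled as corresponding to the given 2D item page."""
    rows = db.execute(
        "SELECT item_id, model_uri FROM content_3d WHERE corresponds_to_2d = ?", (page_id,)
    ).fetchall()
    return [{"item_id": item_id, "model_uri": uri} for item_id, uri in rows]


print(retrieve_3d_content("item_page_700"))
```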
- the virtual content module 230 is further to retrieve the 3D content based on a location of the second device. For instance, the virtual content module 230 retrieves 3D content that depicts a layout of a brick-and-mortar store which is a pre-defined distance from the location of the second device. This allows for the user to view the 3D content pertaining to a local brick-and-mortar store familiar to the user. In this regard, the virtual content module 230 is further to identify the brick-and-mortar store within the pre-defined distance from the location of the second device.
- the display module 240 causes display of the 3D content on the second device of the user. Moreover, the 3D content displayed on the second device of the user is selectable by the user to perform user interactions with the 3D content. Further, since the 3D content depicts the one or more items available for sale as 3D objects, the interactions with the 3D content include selecting the 3D objects. For example, in order to select the 3D objects, the user is able to move the 3D objects representative of the one or more items available for sale to a virtual shopping cart. The interactions with the 3D content also include indicating a request to purchase one of the one or more items.
- a user performs a gesture with respect to a 3D object, the gesture corresponding to a request to purchase the item (e.g., shaking the item, flipping the item, and the like).
- the interactions with the 3D content may further include zooming in or navigating the 3D content, such as the virtual environment for the session.
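The interactions listed above (selection, purchase request, zoom, navigation) suggest a simple gesture-to-interaction mapping on the second device; the gesture names below are illustrative assumptions.

```python
# Hypothetical mapping from gestures performed on the second device to the
# interactions named above.
GESTURE_TO_INTERACTION = {
    "grab_and_drop_on_cart": "select",      # move a 3D object into the virtual shopping cart
    "shake": "purchase_request",            # request to purchase the item
    "flip": "purchase_request",
    "pinch": "zoom",
    "swipe": "navigate",
}


def classify_gesture(gesture: str, item_id: str) -> dict:
    """Translate a raw gesture event into an interaction the system can process."""
    interaction = GESTURE_TO_INTERACTION.get(gesture, "ignore")
    return {"type": interaction, "item_id": item_id}


print(classify_gesture("shake", "coin"))    # {'type': 'purchase_request', 'item_id': 'coin'}
```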
- the display module 240 also causes display of a set of controls that enable the user to perform interactions with the 3D content.
- the set of controls may allow the user to rotate the 3D content and view the 3D content from multiple angles.
- the interactions with the 3D content may be performed on the second device of the user.
- the reception module 250 is configured to receive an indication of user performed interactions with the 3D content.
- the indication of the user performed interaction may be sent from the second device to the reception module 250 .
- the indications may include receiving a selection of the 3D objects from the displayed 3D content.
- the user viewing the 3D content within the virtual environment, may select one or more 3D objects from the 3D content.
- the user may indicate selection of the one or more 3D objects by performing a gesture (e.g., a gesture of picking up the 3D object and moving it to a virtual shopping cart) on the second device.
- the 3D objects may be displayed as being arranged in a virtual layout of a brick-and-mortar store that the user is familiar with.
- the indications may also include receiving a request to purchase an item corresponding to a 3D object from the displayed 3D content.
- the user may send the request by performing a gesture (e.g., a gesture of shaking the 3D object, flipping the 3D object, and the like) on the second device.
- the reception module 250 is further to receive a location of the second device of the user.
- the location of the second device of the user may be indicated by geographical coordinates.
- a GPS receiver embodied within the second device is able to identify the location of the second device and send the location to the reception module 250 .
- the reception module 250 is to receive user credentials from the first device of the user.
- the user credentials include a user password and login information.
- the user credentials are used to login to a user account of the user.
- the user account of the user is used to access the session that displays the 2D content.
- the 3D content of the session is displayed to the user based on the user credentials.
- the process module 260 is configured to process the received indication of the user performed interactions with the 3D content. In some instances, the process module 260 processes the user performed interactions with the user account of the user. In this regard, the process module 260 is further to add items corresponding to the selected 3D objects to a virtual shopping cart (e.g., 3D shopping cart) that is associated with the account of the user. In some instances, the process module 260 is further to debit or to subtract a purchase price of the item that the user has requested to purchase from an account of the user, the item corresponding to the 3D object from the displayed 3D content. Therefore, in some instances, the process module 260 is to process the user performed interactions based on the received user credentials.
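A compact sketch of that processing step follows: selections become cart entries and a purchase request debits the item's price from the user's account. The account and catalog structures are assumptions for illustration.

```python
# Hypothetical process module: apply interactions received from the second device
# to the user's account, then return a result for the first device to display.
def process_interactions(account: dict, catalog: dict, interactions: list) -> dict:
    for interaction in interactions:
        item_id = interaction["item_id"]
        if interaction["type"] == "select":
            account["cart"].append(item_id)
        elif interaction["type"] == "purchase_request":
            account["balance"] -= catalog[item_id]["price"]   # debit the purchase price
            account["purchased"].append(item_id)
    # The returned result is displayed on the first device as the processed outcome.
    return {"cart": account["cart"], "purchased": account["purchased"], "balance": account["balance"]}


account = {"cart": [], "purchased": [], "balance": 100.0}
catalog = {"coin": {"price": 25.0}, "bat": {"price": 60.0}}
print(process_interactions(account, catalog,
                           [{"type": "select", "item_id": "bat"},
                            {"type": "purchase_request", "item_id": "coin"}]))
```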
- the display module 240 is configured to cause display, on the first device of the user, of a user interface that includes a result that depicts the user performed interactions as being processed for the session (e.g., transmit instructions and information to cause the display on the first device).
- the display module 240 in some instances is configured to cause display of the items corresponding to the 3D objects as being added to the virtual shopping cart (e.g., transmit instructions and information to cause the display on the second device).
- the display module 240 is to cause display of the purchase price of the purchased item as being debited from the account of the user.
- the display module 240 is to cause display of the purchased item as being sold to the user.
- the display module 240 is configured to cause display, on the second device of the user, of a user interface that depicts the user performed interactions as being processed for the session.
- FIGS. 3 - 6 are flowcharts illustrating operations of the virtual environment system 150 in performing a method 300 of displaying 3D content, according to some example embodiments. Operations in the method 300 may be performed by the virtual environment system 150 , using modules described above with respect to FIG. 2 . As shown in FIG. 3 , the method 300 includes operations 310 , 320 , 330 , 340 , 350 , 360 , and 370 .
- the session module 210 causes presentation, on a first device, of a session that displays 2D content of one or more items available for sale.
- the session may include browsing an item page of one or more items available for sale, viewing a website of a brick-and-mortar store, watching a video clip, and the like.
- the detection module 220 detects a second device that is able to receive 3D content of the one or more items available for sale. In some instances, the detection module 220 detects the second device by receiving a request from the second device to view 3D content. Moreover, the second device may include identical user credentials as those used in operation 310 as part of the request.
- the virtual content module 230 retrieves 3D content that corresponds to the 2D content for the session.
- the 3D content may be of the one or more items available for sale.
- the virtual content module 230 retrieves 3D content which includes a virtual environment for the session.
- the display module 240 causes display of the 3D content on the second device.
- the 3D content is selectable by the user to perform interactions with the 3D content.
- the user may select 3D objects from among the 3D content.
- the user may zoom in on the 3D content, or the user may navigate through the 3D content.
- the display module 240 is further to cause display of a set of controls that enable the user to perform interactions with the 3D content. The set of controls enable the user to rotate and view the 3D content from one or more angles.
- the reception module 250 receives, from the second device, an indication of user performed interactions with the 3D content. For example, the reception module 250 may receive a user selection of the 3D objects from among the 3D content.
- the process module 260 processes the received indication of the user performed interactions with the 3D content, as further explained below.
- the display module 240 causes display of a user interface that includes a result that depicts the user performed interactions as being processed for the session, as further explained below.
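Read together, operations 310 through 370 form a single pipeline, sketched below as one driver routine; all helper callables are hypothetical placeholders for the modules of FIG. 2.

```python
# Sketch of method 300 as one driver routine. Helpers are passed in as plain
# callables so the flow (operations 310-370) reads top to bottom; every name
# here is an illustrative assumption.
def method_300(present_2d, detect_second, retrieve_3d, display_3d,
               receive_interactions, process, display_result):
    session = present_2d()                        # operation 310: 2D session on first device
    second = detect_second(session)               # operation 320: find a 3D-capable device
    if second is None:
        return
    content = retrieve_3d(session)                # operation 330: 3D content for the session
    display_3d(second, content)                   # operation 340
    interactions = receive_interactions(second)   # operation 350
    result = process(session, interactions)       # operation 360
    display_result(session, result)               # operation 370: shown back on the first device
```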
- the method 300 may include one or more of operations 410 , 420 , and 430 .
- Operations 410 and 420 may be performed prior to the operation 330 .
- Operation 430 may be performed prior to the operation 310 .
- the reception module 250 receives a location of the second device.
- the location of the second device includes GPS coordinates of the second device.
- the GPS coordinates of the second device are identified using a GPS receiver embodied on the second device.
- the location of the second device may also include a physical address.
- the virtual content module 230 identifies a brick-and-mortar store within a pre-defined distance from the location of the second device. In doing so, the virtual content module 230 retrieves a list of brick-and-mortar stores that are within the pre-defined distance from the identified location of the second device. For example, the virtual content module 230 analyzes a map that includes the location of the second device and retrieves, from the map, brick-and-mortar stores that are identified as being within the pre-defined distance from the location of the second device.
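One plausible way to perform that lookup is a straight-line distance filter over a store directory keyed by coordinates; the store list, distance threshold, and haversine helper below are illustrative assumptions standing in for the "map" the passage mentions.

```python
# Hypothetical nearby-store lookup: return brick-and-mortar stores within a
# pre-defined distance of the second device's GPS coordinates.
from math import asin, cos, radians, sin, sqrt

STORES = [  # hypothetical store directory: (name, latitude, longitude)
    ("Downtown Collectibles", 37.3358, -121.8907),
    ("Valley Sports Cards", 37.3894, -122.0819),
]


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))


def stores_within(device_lat, device_lon, max_km):
    return [name for name, lat, lon in STORES
            if haversine_km(device_lat, device_lon, lat, lon) <= max_km]


print(stores_within(37.3382, -121.8863, max_km=5))   # stores near the second device
```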
- the reception module 250 receives user credentials pertaining to the user.
- the user credentials include a user password and login information.
- the user credentials are used to login to a user account of the user.
- the user account of the user is thereafter used to access the session that displays the 2D content and the 3D content.
- the reception module 250 receives the user credentials from either the first device of the user or the second device of the user.
- the method 300 may include one or more of operations 510 , 520 , and 530 .
- Operation 510 may be included as part of operation 350 .
- Operation 520 may be included as part of operation 360 .
- Operation 530 may be included as part of operation 370 .
- the reception module 250 receives a selection of 3D objects from the 3D content displayed.
- the 3D objects represent one or more items.
- the user is able to indicate the selection by performing a gesture with respect to the 3D objects.
- the user is able to indicate a selection of a 3D object by moving the 3D object to a further 3D object, the further 3D object representing a virtual shopping cart.
- the user may indicate a selection of a 3D object by performing a grabbing gesture with respect to the 3D object.
- the process module 260 adds the items corresponding to the selected 3D objects to a virtual shopping cart (e.g., 3D shopping cart).
- the display module 240 displays the items corresponding to the selected 3D objects as being added to the virtual shopping cart.
- the display module 240 causes display, on the first device of the user, of the items corresponding to the 3D objects selected at operation 510 as being added to the virtual shopping cart.
- items corresponding to the 3D objects are shown as being inside the virtual shopping cart.
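A small sketch of operations 510 through 530 follows, under the assumption that each 3D object carries an identifier that maps to the item it represents; all names are illustrative.

```python
# Hypothetical handler for operations 510-530: a "move to cart" selection received
# from the second device adds the corresponding item to the virtual shopping cart,
# and the updated cart is what both devices display.
def handle_selection(cart: list, object_to_item: dict, selected_object_id: str) -> dict:
    item_id = object_to_item[selected_object_id]          # 3D object -> item it represents
    if item_id not in cart:
        cart.append(item_id)                              # operation 520: add to virtual cart
    return {"event": "cart_updated", "cart": list(cart)}  # operation 530: displayed result


cart = []
mapping = {"3d_object_810": "vintage_coin", "3d_object_820": "vintage_bat"}
print(handle_selection(cart, mapping, "3d_object_820"))
```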
- the method 300 may include one or more of operations 610 , 620 , and 630 .
- Operation 610 may be included as part of operation 350 .
- Operation 620 may be included as part of operation 360 .
- Operation 630 may be included as part of operation 370 .
- the reception module 250 receives a request to purchase an item corresponding to a 3D object from the displayed 3D content.
- the user is able to indicate the request by performing a gesture with respect to the 3D object. For example, the user may shake the 3D object, or flip the 3D object, to send the request to purchase the item corresponding to the 3D object.
- the process module 260 subtracts a purchase price of the item from an account of the user.
- the account of the user may be linked to a credit card of the user. Further, the process module 260 charges the credit card of the user for the purchase price of the item. Alternatively, the account of the user may have a certain amount of credits, and the process module 260 subtracts the purchase price of the item from the amount of credits linked to the account.
- the display module 240 causes display of the purchase price of the item as being subtracted from the account of the user. In other words, the display module 240 causes display, on the first device of the user, of the purchase price of the item as being subtracted from the account of the user, as a result of operation 620.
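Operations 610 through 630 might look like the following sketch, which debits either a linked credit card or a stored credit balance; the account fields and helper name are assumptions.

```python
# Hypothetical handler for operations 610-630: a purchase request debits the account
# either by charging a linked credit card or by subtracting stored credits, and the
# resulting debit is then displayed on the first device.
def handle_purchase(account: dict, catalog: dict, item_id: str) -> dict:
    price = catalog[item_id]["price"]
    if account.get("credit_card"):
        charge = {"card": account["credit_card"], "amount": price}   # operation 620 (card path)
    else:
        account["credits"] -= price                                  # operation 620 (credits path)
        charge = {"credits_remaining": account["credits"]}
    return {"event": "purchased", "item": item_id, "price": price, **charge}   # operation 630


account = {"credit_card": None, "credits": 100.0}
catalog = {"vintage_coin": {"price": 25.0}}
print(handle_purchase(account, catalog, "vintage_coin"))
```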
- FIG. 7 is a block diagram illustrating an example user interface of an item page 700 , according to some example embodiments.
- the item page 700 includes an image corresponding to a first item 710 (e.g., vintage coin) available for sale and an image corresponding to a second item 720 (e.g., vintage baseball bat) available for sale.
- the item page 700 is presented on a first client device as part of a session (e.g., operation 310 of FIG. 3 ).
- a user operating the client device may be logged into a user account, as indicated by a description 730 , in order to view the item page 700 .
- the user may provide user credentials in order to log into the user account.
- FIG. 8 is a block diagram illustrating an example user interface of a virtual item page 800 , according to some example embodiments.
- the virtual item page 800 is presented on a second client device that is able to display three-dimensional content.
- the virtual item page 800 may be a virtual environment including 3D content that corresponds to the item page 700 of FIG. 7 .
- the virtual item page 800 includes 3D content that depicts the first item 710 and the second item 720 as 3D objects.
- the virtual item page 800 includes a first 3D object 810 which depicts the image corresponding to the first item 710 of FIG. 7 in 3D form.
- the virtual item page 800 includes a second 3D object 820 which depicts the image corresponding to the second item 720 of FIG. 7 in 3D form.
- the 3D content is further selectable by the user to perform interactions with the 3D content.
- the virtual item page 800 includes a first set of controls 815 that allow the user to rotate and view the first 3D object 810 (e.g., vintage coin).
- the virtual item page 800 includes a second set of controls 825 that allow the user to rotate and view the second 3D object 820 (e.g., vintage baseball bat).
- a virtual shopping cart 850 where the user may place the 3D objects displayed in the virtual item page 800 .
- the user may drag each of the 3D objects into the virtual shopping cart 850 .
- the virtual shopping cart 850 includes the 3D object of the vintage baseball bat.
- these user interactions are sent from the second client device to the virtual environment system 150 to be processed by the virtual environment system 150 (e.g., operations 350 and 360 of FIG. 3 ).
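The interaction events sent from the second client device to the virtual environment system 150 might be serialized as small structured messages like the hypothetical example below; the field names are illustrative, not specified by the disclosure.

```python
# Hypothetical shape of an interaction event emitted when the user drags the
# baseball-bat object into the virtual shopping cart 850.
import json

event = {
    "session_user": "alice",          # same account as the 2D session (description 860)
    "source_device": "vr_headset",
    "interaction": "select",          # drag into virtual shopping cart 850
    "object_id": "3d_object_820",     # the vintage baseball bat
    "cart_id": "virtual_cart_850",
}
print(json.dumps(event))              # serialized and sent for processing (operations 350 and 360)
```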
- the user may access the virtual item page 800 while logged into a user account, as indicated by the description 860 .
- FIG. 9 is a block diagram illustrating an example user interface of an item page 900 , according to some example embodiments.
- the item page 900 is displayed on the first client device that was also used to display the item page 700 of FIG. 7 .
- the item page 900 includes a result 905 that depicts the user performed interactions with the 3D objects as being processed. For instance, the result 905 indicates that both the first item 910 and the second item 920 have been added to the shopping cart as a result of the interactions performed by the user in the virtual item page 800 .
- an image corresponding to the first item 910 corresponds to the image corresponding to the first item 710 of FIG. 7 and the first 3D object 810 of FIG. 8 .
- an image corresponding to the second item 920 corresponds to the image corresponding to the second item 720 of FIG. 7 and the second 3D object 820 of FIG. 8 .
- the user may access the item page 900 while logged into a user account, as indicated by a description 930 .
- Modules may constitute either software modules (e.g., code embodied on a machine-readable medium) or hardware modules.
- a “hardware module” is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner.
- one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
- a hardware module may be implemented mechanically, electronically, or any suitable combination thereof.
- a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations.
- a hardware module may be a special-purpose processor, such as a Field-Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC).
- a hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations.
- a hardware module may include software executed by a general-purpose processor or other programmable processor. Once configured by such software, hardware modules become specific machines (or specific components of a machine) uniquely tailored to perform the configured functions and are no longer general-purpose processors. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
- hardware module should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein.
- “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware modules) at different times. Software accordingly configures a particular processor or processors, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
- Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
- processors may be temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein.
- processor-implemented module refers to a hardware module implemented using one or more processors.
- the methods described herein may be at least partially processor-implemented, with a particular processor or processors being an example of hardware.
- the operations of a method may be performed by one or more processors or processor-implemented modules.
- the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS).
- at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an Application Program Interface (API)).
- the performance of certain of the operations may be distributed among the processors, not only residing within a single machine, but deployed across a number of machines.
- the processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the processors or processor-implemented modules may be distributed across a number of geographic locations.
- FIG. 10 is a block diagram illustrating components of a machine 1000 , according to some example embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein.
- FIG. 10 shows a diagrammatic representation of the machine 1000 in the example form of a computer system, within which instructions 1016 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 1000 to perform any one or more of the methodologies discussed herein may be executed.
- the instructions may cause the machine to execute the flow diagrams of FIGS. 3 - 6 .
- the instructions may implement the modules described in FIG. 2 , and so forth.
- the instructions transform the general, non-programmed machine into a particular machine programmed to carry out the described and illustrated functions in the manner described.
- the machine 1000 operates as a standalone device or may be coupled (e.g., networked) to other machines.
- the machine 1000 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
- the machine 1000 may comprise, but not be limited to, a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a smart home device (e.g., a smart appliance), other smart devices, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 1016 , sequentially or otherwise, that specify actions to be taken by machine 1000 .
- the term “machine” shall also be taken to include a collection of machines 1000 that individually or jointly execute the instructions 1016 to perform any one or more of the methodologies discussed herein.
- the machine 1000 may include processors 1010 , memory 1030 , and I/O components 1050 , which may be configured to communicate with each other such as via a bus 1002 .
- the processors 1010 (e.g., a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Radio-Frequency Integrated Circuit (RFIC), another processor, or any suitable combination thereof) may include, for example, processor 1012 and processor 1014 that may execute instructions 1016 .
- processor is intended to include a multi-core processor that may comprise two or more independent processors (sometimes referred to as “cores”) that may execute instructions contemporaneously.
- although FIG. 10 shows multiple processors, the machine 1000 may include a single processor with a single core, a single processor with multiple cores (e.g., a multi-core processor), multiple processors with a single core, multiple processors with multiple cores, or any combination thereof.
- the memory/storage 1030 may include a memory 1032 , such as a main memory, or other memory storage, and a storage unit 1036 , both accessible to the processors 1010 such as via the bus 1002 .
- the storage unit 1036 and memory 1032 store the instructions 1016 embodying any one or more of the methodologies or functions described herein.
- the instructions 1016 may also reside, completely or partially, within the memory 1032 , within the storage unit 1036 , within at least one of the processors 1010 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by the machine 1000 .
- the memory 1032 , the storage unit 1036 , and the memory of processors 1010 are examples of machine-readable media.
- machine-readable medium means a device able to store instructions and data temporarily or permanently and may include, but is not limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, optical media, magnetic media, cache memory, other types of storage (e.g., Erasable Programmable Read-Only Memory (EEPROM)), and/or any suitable combination thereof.
- machine-readable medium shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., instructions 1016 ) for execution by a machine (e.g., machine 1000 ), such that the instructions, when executed by one or more processors of the machine 1000 (e.g., processors 1010 ), cause the machine 1000 to perform any one or more of the methodologies described herein.
- a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices.
- the term “machine-readable medium” excludes signals per se.
- the I/O components 1050 may include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on.
- the specific I/O components 1050 that are included in a particular machine will depend on the type of machine. For example, portable machines such as mobile phones will likely include a touch input device or other such input mechanisms, while a headless server machine will likely not include such a touch input device. It will be appreciated that the I/O components 1050 may include many other components that are not shown in FIG. 10 .
- the I/O components 1050 are grouped according to functionality merely for simplifying the following discussion and the grouping is in no way limiting. In various example embodiments, the I/O components 1050 may include output components 1052 and input components 1054 .
- the output components 1052 may include visual components (e.g., a display such as a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor, resistance mechanisms), other signal generators, and so forth.
- the input components 1054 may include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), tactile input components (e.g., a physical button, a touch screen that provides location and/or force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like.
- the I/O components 1050 may include biometric components 1056 , motion components 1058 , environmental components 1060 , or position components 1062 among a wide array of other components.
- the biometric components 1056 may include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like.
- the motion components 1058 may include acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope), and so forth.
- the environmental components 1060 may include, for example, illumination sensor components (e.g., photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., barometer), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensors (e.g., gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to a surrounding physical environment.
- the position components 1062 may include location sensor components (e.g., a Global Positioning System (GPS) receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like.
- the I/O components 1050 may include communication components 1064 operable to couple the machine 1000 to a network 1080 or devices 1070 via coupling 1082 and coupling 1072 respectively.
- the communication components 1064 may include a network interface component or other suitable device to interface with the network 1080 .
- communication components 1064 may include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components to provide communication via other modalities.
- the devices 1070 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).
- the communication components 1064 may detect identifiers or include components operable to detect identifiers.
- the communication components 1064 may include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as Universal Product Code (UPC) bar code, multi-dimensional bar codes such as Quick Response (QR) code, Aztec code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, UCC RSS-2D bar code, and other optical codes), or acoustic detection components (e.g., microphones to identify tagged audio signals).
- one or more portions of the network 1080 may be an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), the Internet, a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a plain old telephone service (POTS) network, a cellular telephone network, a wireless network, a Wi-Fi® network, another type of network, or a combination of two or more such networks.
- VPN virtual private network
- LAN local area network
- WLAN wireless LAN
- WAN wide area network
- WWAN wireless WAN
- MAN metropolitan area network
- PSTN Public Switched Telephone Network
- POTS plain old telephone service
- the network 1080 or a portion of the network 1080 may include a wireless or cellular network and the coupling 1082 may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or other type of cellular or wireless coupling.
- CDMA Code Division Multiple Access
- GSM Global System for Mobile communications
- the coupling 1082 may implement any of a variety of types of data transfer technology, such as Single Carrier Radio Transmission Technology (1 ⁇ RTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, third Generation Partnership Project (3GPP) including 3G, fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), Long Term Evolution (LTE) standard, others defined by various standard setting organizations, other long range protocols, or other data transfer technology.
- RTT Single Carrier Radio Transmission Technology
- GPRS General Packet Radio Service
- EDGE Enhanced Data rates for GSM Evolution
- 3GPP Third Generation Partnership Project
- 4G fourth generation wireless (4G) networks
- Universal Mobile Telecommunications System (UMTS) Universal Mobile Telecommunications System
- HSPA High Speed Packet Access
- WiMAX Worldwide Interoperability for Microwave Access
- LTE
- the instructions 1016 may be transmitted or received over the network 1080 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 1064 ) and utilizing any one of a number of well-known transfer protocols (e.g., hypertext transfer protocol (HTTP)).
- a network interface device e.g., a network interface component included in the communication components 1064
- HTTP hypertext transfer protocol
- the instructions 1016 may be transmitted or received using a transmission medium via the coupling 1072 (e.g., a peer-to-peer coupling) to devices 1070 .
- the term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions 1016 for execution by the machine 1000 , and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
- inventive subject matter has been described with reference to specific example embodiments, various modifications and changes may be made to these embodiments without departing from the broader scope of embodiments of the present disclosure.
- inventive subject matter may be referred to herein, individually or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single disclosure or inventive concept if more than one is, in fact, disclosed.
- the term “or” may be construed in either an inclusive or exclusive sense. Moreover, plural instances may be provided for resources, operations, or structures described herein as a single instance. Additionally, boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in a context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within a scope of various embodiments of the present disclosure. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within a scope of embodiments of the present disclosure as represented by the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Theoretical Computer Science (AREA)
- Finance (AREA)
- Accounting & Taxation (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Development Economics (AREA)
- Economics (AREA)
- Marketing (AREA)
- Strategic Management (AREA)
- General Business, Economics & Management (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
In various example embodiments, a system and method for facilitating display of virtual content are presented. A session that displays two-dimensional (2D) content of one or more items available for sale is presented on a first device of a user. A second device of the user is detected, the second device being able to display three-dimensional (3D) content of the one or more items available for sale. 3D content of the one or more items available for sale is retrieved. Display of the 3D content on the second device is caused, the 3D content selectable by the user to perform interactions with the 3D content. An indication of the user performed interactions is received and processed. A result that depicts the user performed interactions as being processed is displayed on the first device of the user.
Description
- This application is a continuation of U.S. patent application Ser. No. 17/033,590 by Tapley et al., entitled "Displaying a Virtual Environment of a Session," filed Sep. 25, 2020; which is a continuation of U.S. patent application Ser. No. 14/712,829 by Tapley et al., entitled "Displaying a Virtual Environment of a Session," filed May 14, 2015, now U.S. Pat. No. 10,825,081 B2, issued Nov. 3, 2020; each of which is incorporated herein by reference in its entirety.
- Embodiments of the present disclosure relate generally to data processing and, more particularly, but not by way of limitation, to facilitating display of a virtual environment of a session.
- Conventionally, a user, during a browsing session, may view an item page hosted by a network commerce system. The item page includes content that is displayed on a device of the user.
- Various ones of the appended drawings merely illustrate example embodiments of the present disclosure and cannot be considered as limiting its scope.
- FIG. 1 is a block diagram illustrating a networked system, according to some example embodiments.
- FIG. 2 is a block diagram illustrating components of a virtual environment system, according to some example embodiments.
- FIGS. 3-6 are flowcharts illustrating operations of the virtual environment system in performing a method of displaying 3D content, according to some example embodiments.
- FIG. 7 is a block diagram illustrating an example user interface of an item page, according to some example embodiments.
- FIG. 8 is a block diagram illustrating an example user interface of a virtual item page, according to some example embodiments.
- FIG. 9 is a block diagram illustrating an example user interface of an item page, according to some example embodiments.
- FIG. 10 illustrates a diagrammatic representation of a machine in the form of a computer system within which a set of instructions may be executed for causing the machine to perform any one or more of the methodologies discussed herein, according to an example embodiment.
- The headings provided herein are merely for convenience and do not necessarily affect the scope or meaning of the terms used.
- The description that follows includes systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments of the disclosure. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art, that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques are not necessarily shown in detail.
- In various example embodiments, a system presents a session that displays two-dimensional (2D) content on a first device of a user. For instance, the user may be viewing an item page of one or more items available for sale on the device. Also, the system may detect a second device of the user that is capable of viewing three-dimensional (3D) content corresponding to the session. Upon detection of the second device, the system retrieves the 3D content and causes display of the 3D content on the second device of the user. The 3D content corresponding to the session includes 3D objects representative of information on a network of the system. In some instances, the information includes items available for sale. Moreover, the 3D content is selectable by the user to perform interactions with the 3D content. The system may process the user interactions with the 3D content and display a result, on the first device, that depicts the user interactions with the 3D content as being processed. In other words, the actions performed by the user while viewing the 3D content on the second device are also reflected in the session that displays the 2D content. This allows for a smooth transition between the 2D session displayed on the first device of the user and the 3D session displayed on the second device of the user.
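- The handoff described in the preceding paragraph can be illustrated with a short sketch. The Python below is purely illustrative and is not part of the disclosure; the class names, method names, and data shapes (SessionServer, on_second_device_detected, the catalog dictionary, and so on) are assumptions made for the example.

```python
# Hypothetical sketch of the 2D/3D session handoff described above.
from dataclasses import dataclass, field

@dataclass
class Session:
    user_id: str
    items_2d: list                      # 2D listings shown on the first device
    cart: list = field(default_factory=list)

class SessionServer:
    def __init__(self, catalog_3d):
        self.catalog_3d = catalog_3d    # maps item id -> 3D model reference
        self.sessions = {}              # user_id -> Session

    def present_2d_session(self, user_id, items):
        self.sessions[user_id] = Session(user_id, items)
        return self.sessions[user_id]

    def on_second_device_detected(self, user_id):
        # Retrieve the 3D content that corresponds to the existing 2D session.
        session = self.sessions[user_id]
        return [self.catalog_3d[item["id"]] for item in session.items_2d]

    def on_interaction(self, user_id, item_id, gesture):
        # Interactions performed on the 3D device are reflected in the 2D session.
        session = self.sessions[user_id]
        if gesture == "move_to_cart":
            session.cart.append(item_id)
        return {"cart": list(session.cart)}   # result rendered on the first device

# Example: one user, one item, one gesture.
server = SessionServer(catalog_3d={"coin-1": "models/coin-1.glb"})
server.present_2d_session("alice", [{"id": "coin-1", "title": "Vintage coin"}])
print(server.on_second_device_detected("alice"))   # ['models/coin-1.glb']
print(server.on_interaction("alice", "coin-1", "move_to_cart"))
```

- The point of the sketch is only that a single server-side session record backs both presentations, so an interaction received from the 3D device is immediately visible to the 2D device.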
- Accordingly, one or more of the methodologies discussed herein may obviate a need for a user to interact with two separate sessions, which may have the technical effect of reducing computing resources used by one or more devices within the system. Examples of such computing resources include, without limitation, processor cycles, network traffic, memory usage, storage space, and power consumption.
- With reference to FIG. 1, an example embodiment of a high-level client-server-based network architecture 100 is shown. A networked system 102, in the example forms of a network-based publication or payment system, provides server-side functionality via a network 104 (e.g., the Internet or wide area network (WAN)) to one or more client devices 110. FIG. 1 illustrates, for example, a web client 112 (e.g., a browser, such as the Internet Explorer® browser developed by Microsoft® Corporation of Redmond, Wash. State), a client application 114, and a programmatic client 116 executing on the client device 110.
- The client device 110 may comprise, but is not limited to, a mobile phone, desktop computer, laptop, portable digital assistant (PDA), smart phone, tablet, virtual headset, ultra-book, netbook, multi-processor system, microprocessor-based or programmable consumer electronics, game console, set-top box, or any other communication device that a user may utilize to access the networked system 102. In some embodiments, the client device 110 comprises a display module (not shown) to display information (e.g., in the form of user interfaces). In further embodiments, the client device 110 may comprise one or more of touch screens, accelerometers, gyroscopes, cameras, microphones, global positioning system (GPS) devices, and so forth. The client device 110 may be a device of a user that is used to perform a transaction involving digital items within the networked system 102. In one embodiment, the networked system 102 is a network-based marketplace that responds to requests for product listings, publishes publications comprising item listings of products available on the network-based marketplace, and manages payments for these marketplace transactions. For example, one or more portions of the network 104 may be an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, a wireless network, a WiFi network, a WiMax network, another type of network, or a combination of two or more such networks.
- Each of the client devices 110 includes one or more applications (also referred to as "apps") such as, but not limited to, a web browser, messaging application, electronic mail (email) application, an e-commerce site application (also referred to as a marketplace application), and the like. In some embodiments, if the e-commerce site application is included in a given one of the client devices 110, then this application is configured to locally provide the user interface and at least some of the functionalities, with the application configured to communicate with the networked system 102, on an as-needed basis, for data and/or processing capabilities not locally available (e.g., access to a database of items available for sale, to authenticate a user, to verify a method of payment). Conversely, if the e-commerce site application is not included in the client device 110, the client device 110 may use its web browser to access the e-commerce site (or a variant thereof) hosted on the networked system 102.
- One or more users 106 may be a person, a machine, or other means of interacting with the client device 110. In example embodiments, the user 106 is not part of the network architecture 100, but interacts with the network architecture 100 via the client device 110 or other means. For instance, the user 106 provides input (e.g., touch screen input or alphanumeric input) to the client device 110 and the input is communicated to the networked system 102 via the network 104. In this instance, the networked system 102, in response to receiving the input from the user 106, communicates information to the client device 110 via the network 104 to be presented to the user 106. In this way, the user 106 can interact with the networked system 102 using the client device 110.
- An application program interface (API) server 120 and a web server 122 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 140. The application servers 140 host one or more publication systems 142 and payment systems 144, each of which may comprise one or more modules or applications and each of which may be embodied as hardware, software, firmware, or any combination thereof. The application servers 140 are, in turn, shown to be coupled to one or more database servers 124 that facilitate access to one or more information storage repositories or database(s) 126. In an example embodiment, the databases 126 are storage devices that store information to be posted (e.g., publications or listings) to the publication system 142. The databases 126 may also store digital item information in accordance with example embodiments.
- Additionally, a third party application 132, executing on third party server(s) 130, is shown as having programmatic access to the networked system 102 via the programmatic interface provided by the API server 120. For example, the third party application 132, utilizing information retrieved from the networked system 102, supports one or more features or functions on a website hosted by the third party. The third party website, for example, provides one or more promotional, marketplace, or payment functions that are supported by the relevant applications of the networked system 102.
- The publication systems 142 provide a number of publication functions and services to users 106 that access the networked system 102. The payment systems 144 likewise provide a number of functions to perform or facilitate payments and transactions. While the publication system 142 and payment system 144 are shown in FIG. 1 to both form part of the networked system 102, it will be appreciated that, in alternative embodiments, each system 142 and 144 may form part of a payment or publication service that is separate and distinct from the networked system 102. In some embodiments, the payment systems 144 may form part of the publication system 142.
- The virtual environment system 150 provides virtual three-dimensional content that is displayed on a user device capable of viewing the three-dimensional content. The virtual environment system 150, upon detection of the user device, retrieves 3D content that corresponds to a session that a user is viewing. For example, the virtual environment system 150 may access the 3D content from the databases 126, the third party servers 130, the publication system 142, and other sources. In some example embodiments, the virtual environment system 150 communicates with the publication systems 142 (e.g., accessing item listings) and payment system 144. In an alternative embodiment, the virtual environment system 150 may be a part of the publication system 142.
- Further, while the client-server-based network architecture 100 shown in FIG. 1 employs a client-server architecture, the present inventive subject matter is of course not limited to such an architecture, and could equally well find application in a distributed, or peer-to-peer, architecture system, for example. The various publication system 142, payment system 144, and virtual environment system 150 could also be implemented as standalone software programs, which do not necessarily have networking capabilities.
- The web client 112 accesses the various publication and payment systems 142 and 144 via the web interface supported by the web server 122. Similarly, the programmatic client 116 accesses the various services and functions provided by the publication and payment systems 142 and 144 via the programmatic interface provided by the API server 120. The programmatic client 116 may, for example, be a seller application (e.g., the Turbo Lister application developed by eBay® Inc., of San Jose, Calif.) to enable sellers to author and manage listings on the networked system 102 in an off-line manner, and to perform batch-mode communications between the programmatic client 116 and the networked system 102.
- FIG. 2 is a block diagram illustrating components of the virtual environment system 150, according to some example embodiments. The virtual environment system 150 is shown as including a session module 210, a detection module 220, a virtual content module 230, a display module 240, a reception module 250, and a process module 260, all configured to communicate with each other (e.g., via a bus, shared memory, or a switch). Any one or more of the modules described herein may be implemented using hardware (e.g., one or more processors of a machine) or a combination of hardware and software. For example, any module described herein may configure a processor (e.g., among one or more processors of a machine) to perform the operations described herein for that module. Moreover, any two or more of these modules may be combined into a single module, and the functions described herein for a single module may be subdivided among multiple modules. Furthermore, according to various example embodiments, modules described herein as being implemented within a single machine, database, or device may be distributed across multiple machines, databases, or devices.
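- By way of illustration only, the six modules could be composed behind a single facade along the following lines. This is a hypothetical sketch, not the claimed implementation; every class and method name is an assumption.

```python
# Hypothetical composition of the virtual environment system's modules.
class SessionModule:
    """Causes presentation of the 2D session on the first device."""
    def present(self, user, device): ...

class DetectionModule:
    """Detects a second, 3D-capable device belonging to the same user."""
    def detect(self, user, device): ...

class VirtualContentModule:
    """Retrieves 3D content corresponding to the 2D content of the session."""
    def retrieve(self, session): ...

class DisplayModule:
    """Causes display of content on either device."""
    def show(self, device, content): ...

class ReceptionModule:
    """Receives indications of user performed interactions."""
    def receive(self, device): ...

class ProcessModule:
    """Processes interactions against the user's account."""
    def process(self, indication, user): ...

class VirtualEnvironmentSystem:
    """Facade wiring the modules so they can call one another (e.g., a bus or shared memory)."""
    def __init__(self):
        self.session = SessionModule()
        self.detection = DetectionModule()
        self.virtual_content = VirtualContentModule()
        self.display = DisplayModule()
        self.reception = ReceptionModule()
        self.process = ProcessModule()
```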
- In various example embodiments, the session module 210 is configured to cause presentation of a session that displays two-dimensional (2D) content. In some instances, the 2D content is of one or more items available for sale. In doing so, the session module 210 communicates with a first device of a user and presents the session on that device (e.g., client device 110). Example sessions may include browsing an item page of one or more items available for sale, viewing a website of a brick-and-mortar store, watching a video clip, and the like. Moreover, the session module 210 is further to cause presentation of the session based on user credentials of a user. For instance, a user provides the user credentials to log into a user account. Upon login to the user account, the session module 210 presents the session that displays the 2D content.
- In various example embodiments, the detection module 220 detects a second device of the user that is able to display three-dimensional (3D) content of the one or more items available for sale. The second device of the user may be a virtual reality (VR) headset, a VR component of a mobile device, or any other device compatible with displaying 3D content. In further embodiments, the detection module 220 is to receive the user credentials from the second device of the user.
- In various example embodiments, the virtual content module 230 retrieves the 3D content that corresponds to the 2D content for the session. The 3D content includes one or more 3D objects. For instance, the 3D content is of the one or more items available for sale, and each of the one or more items available for sale is represented as a 3D object. Additionally, in some instances, the 3D content will include a virtual environment for the session. In some instances, the virtual environment is used to present the 3D objects representative of the one or more items available for sale. The virtual environment may include a 3D item page, a 3D layout of a brick-and-mortar store, a 3D layout of a mall, and the like. The 3D item page depicts the 2D components of the item page in 3D form. For instance, instead of an image of the item, a 3D model of the item is presented in the 3D item page. The 3D layout of the brick-and-mortar store may depict the real-life layout of the brick-and-mortar store. Also, the 3D layout of the mall may depict the real-life layout of a shopping mall. Therefore, in some instances, the one or more items available for sale will be arranged as 3D objects within the virtual environment in a manner that emulates their real-life counterparts, such as the brick-and-mortar store or the shopping mall.
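- A minimal sketch of how such a virtual environment and its 3D objects might be modeled is shown below, assuming one record per object with a model asset and a placement. The field names, asset paths, and coordinates are hypothetical and only illustrate the idea of arranging items to mirror a real-life layout.

```python
# Hypothetical data model for a virtual environment and its 3D objects.
from dataclasses import dataclass
from typing import Literal

@dataclass
class Object3D:
    item_id: str
    model_uri: str          # e.g., a glTF asset for the item
    position: tuple         # (x, y, z) placement inside the environment

@dataclass
class VirtualEnvironment:
    kind: Literal["item_page", "store_layout", "mall_layout"]
    objects: list           # Object3D instances arranged to mirror real life

# A 3D item page: the image of the item is replaced by a 3D model.
item_page_3d = VirtualEnvironment(
    kind="item_page",
    objects=[Object3D("coin-1", "models/coin-1.glb", (0.0, 1.2, -0.5))],
)

# A store layout: items sit on virtual shelves where they would sit in the real store.
store_3d = VirtualEnvironment(
    kind="store_layout",
    objects=[
        Object3D("coin-1", "models/coin-1.glb", (2.0, 1.0, 4.0)),
        Object3D("bat-7", "models/bat-7.glb", (5.5, 1.0, 1.0)),
    ],
)
```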
- In various example embodiments, the virtual content module 230 is further to retrieve the 3D content based on the user credentials. In other words, the same user credentials used to present the session that displays the 2D content are also used to retrieve or access the 3D content. In further embodiments, the 3D content is labeled as corresponding to the 2D content and is stored in a database, such as the database 126. Therefore, the virtual content module 230 retrieves the 3D content corresponding to the 2D content from the database 126.
- Moreover, in some instances, the virtual content module 230 is further to retrieve the 3D content based on a location of the second device. For instance, the virtual content module 230 retrieves 3D content that depicts a layout of a brick-and-mortar store that is within a pre-defined distance from the location of the second device. This allows the user to view 3D content pertaining to a local brick-and-mortar store familiar to the user. In this regard, the virtual content module 230 is further to identify the brick-and-mortar store within the pre-defined distance from the location of the second device.
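- For illustration, a pre-defined-distance check of this kind could be implemented with a great-circle distance over a store directory, as in the following hypothetical sketch; the store records, coordinates, and distance thresholds are invented for the example.

```python
# Hypothetical nearby-store lookup using a great-circle (haversine) distance.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Approximate distance in kilometres between two latitude/longitude points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

STORES = [  # illustrative records; a real system would query a map or store directory
    {"name": "Downtown Outlet", "lat": 37.3361, "lon": -121.8907, "layout": "downtown.glb"},
    {"name": "Airport Branch",  "lat": 37.3639, "lon": -121.9289, "layout": "airport.glb"},
]

def stores_within(device_lat, device_lon, max_km=10.0):
    return [s for s in STORES
            if haversine_km(device_lat, device_lon, s["lat"], s["lon"]) <= max_km]

# The second device reports its GPS position; pick a familiar local store's 3D layout.
print([s["name"] for s in stores_within(37.3382, -121.8863, max_km=3.0)])
# only the downtown store is within 3 km in this made-up example
```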
- In various example embodiments, the display module 240 causes display of the 3D content on the second device of the user. Moreover, the 3D content displayed on the second device of the user is selectable by the user to perform user interactions with the 3D content. Further, since the 3D content depicts the one or more items available for sale as 3D objects, the interactions with the 3D content include selecting the 3D objects. For example, in order to select the 3D objects, the user is able to move the 3D objects representative of the one or more items available for sale to a virtual shopping cart. The interactions with the 3D content also include indicating a request to purchase one of the one or more items. For instance, a user performs a gesture with respect to a 3D object, the gesture corresponding to a request to purchase the item (e.g., shaking the item, flipping the item, and the like). The interactions with the 3D content may further include zooming in or navigating the 3D content, such as the virtual environment for the session. In some instances, the display module 240 also causes display of a set of controls that enable the user to perform interactions with the 3D content. For example, the set of controls may allow the user to rotate the 3D content and view the 3D content from multiple angles. Moreover, the interactions with the 3D content may be performed on the second device of the user.
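- One illustrative way to relate gestures performed on the second device to the interactions described above is a simple lookup table, as in the hypothetical sketch below; the gesture names and the indication format are assumptions made for the example.

```python
# Hypothetical mapping from gestures on the 3D device to interaction indications.
GESTURE_TO_INTERACTION = {
    "move_to_cart": "select_item",    # dragging an object into the virtual cart
    "grab":         "select_item",
    "shake":        "purchase_item",  # shaking or flipping signals intent to buy
    "flip":         "purchase_item",
    "pinch":        "zoom",
    "swipe":        "navigate",
    "rotate":       "inspect",        # the on-screen controls rotate the object
}

def indication_from_gesture(user_id: str, object_id: str, gesture: str) -> dict:
    """Build the indication the second device sends back for processing."""
    interaction = GESTURE_TO_INTERACTION.get(gesture)
    if interaction is None:
        raise ValueError(f"unrecognized gesture: {gesture}")
    return {"user": user_id, "object": object_id, "interaction": interaction}

print(indication_from_gesture("alice", "bat-7", "shake"))
# {'user': 'alice', 'object': 'bat-7', 'interaction': 'purchase_item'}
```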
- In various example embodiments, the reception module 250 is configured to receive an indication of user performed interactions with the 3D content. The indication of the user performed interactions may be sent from the second device to the reception module 250. The indications may include receiving a selection of the 3D objects from the displayed 3D content. For example, the user, viewing the 3D content within the virtual environment, may select one or more 3D objects from the 3D content. As stated above, the user may indicate selection of the one or more 3D objects by performing a gesture (e.g., a gesture of picking up the 3D object and moving it to a virtual shopping cart) on the second device. Moreover, the 3D objects may be displayed as being arranged in a virtual layout of a brick-and-mortar store that the user is familiar with. The indications may also include receiving a request to purchase an item corresponding to a 3D object from the displayed 3D content. The user may send the request by performing a gesture (e.g., a gesture of shaking the 3D object, flipping the 3D object, and the like) on the second device.
- In various example embodiments, the reception module 250 is further to receive a location of the second device of the user. The location of the second device of the user may be indicated by geographical coordinates. Moreover, a GPS receiver embodied within the second device is able to identify the location of the second device and send the location to the reception module 250.
- In further embodiments, the reception module 250 is to receive user credentials from the first device of the user. The user credentials include a user password and login information. Moreover, the user credentials are used to log into a user account of the user. In some instances, the user account of the user is used to access the session that displays the 2D content. Moreover, the 3D content of the session is displayed to the user based on the user credentials.
- In various example embodiments, the process module 260 is configured to process the received indication of the user performed interactions with the 3D content. In some instances, the process module 260 processes the user performed interactions with the user account of the user. In this regard, the process module 260 is further to add items corresponding to the selected 3D objects to a virtual shopping cart (e.g., 3D shopping cart) that is associated with the account of the user. In some instances, the process module 260 is further to debit or to subtract a purchase price of an item that the user has requested to purchase from an account of the user, the item corresponding to a 3D object from the displayed 3D content. Therefore, in some instances, the process module 260 is to process the user performed interactions based on the received user credentials.
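- As an illustration of this processing step, the hypothetical sketch below applies a received indication to an in-memory account record, adding selected items to a virtual cart or debiting a purchase price; the account fields, prices, and indication keys are invented for the example (they follow the indication format sketched earlier).

```python
# Hypothetical processing of received interaction indications against a user account.
from dataclasses import dataclass, field

@dataclass
class Account:
    user_id: str
    balance: float
    cart: list = field(default_factory=list)

PRICES = {"coin-1": 120.00, "bat-7": 350.00}   # illustrative purchase prices

def process_indication(account: Account, indication: dict) -> dict:
    item = indication["object"]
    if indication["interaction"] == "select_item":
        account.cart.append(item)                 # add to the virtual shopping cart
        return {"result": "added_to_cart", "cart": list(account.cart)}
    if indication["interaction"] == "purchase_item":
        price = PRICES[item]
        account.balance -= price                  # debit the purchase price
        return {"result": "purchased", "item": item, "new_balance": account.balance}
    return {"result": "ignored"}

acct = Account("alice", balance=500.00)
print(process_indication(acct, {"object": "coin-1", "interaction": "select_item"}))
print(process_indication(acct, {"object": "bat-7", "interaction": "purchase_item"}))
```

- The returned dictionaries stand in for the result that the display module would render back on the first device.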
- In various example embodiments, the display module 240 is configured to cause display, on the first device of the user, of a user interface that includes a result that depicts the user performed interactions as being processed for the session (e.g., transmit instructions and information to cause the display on the first device). For example, the display module 240 in some instances is configured to cause display of the items corresponding to the 3D objects as being added to the virtual shopping cart (e.g., transmit instructions and information to cause the display on the second device). As another example, the display module 240 is to cause display of the purchase price of the purchased item as being debited from the account of the user. Also, the display module 240 is to cause display of the purchased item as being sold to the user. In further embodiments, the display module 240 is configured to cause display, on the second device of the user, of a user interface that depicts the user performed interactions as being processed for the session.
- FIGS. 3-6 are flowcharts illustrating operations of the virtual environment system 150 in performing a method 300 of displaying 3D content, according to some example embodiments. Operations in the method 300 may be performed by the virtual environment system 150, using modules described above with respect to FIG. 2. As shown in FIG. 3, the method 300 includes operations 310, 320, 330, 340, 350, 360, and 370.
- At operation 310, the session module 210 causes presentation, on a first device, of a session that displays 2D content of one or more items available for sale. As stated earlier, the session may include browsing an item page of one or more items available for sale, viewing a website of a brick-and-mortar store, watching a video clip, and the like.
- At operation 320, the detection module 220 detects a second device that is able to receive 3D content of the one or more items available for sale. In some instances, the detection module 220 detects the second device by receiving a request from the second device to view 3D content. Moreover, the second device may include identical user credentials as those used in operation 310 as part of the request.
- At operation 330, the virtual content module 230 retrieves 3D content that corresponds to the 2D content for the session. For instance, the 3D content may be of the one or more items available for sale. Additionally, the virtual content module 230 retrieves 3D content which includes a virtual environment for the session.
- At operation 340, the display module 240 causes display of the 3D content on the second device. Moreover, the 3D content is selectable by the user to perform interactions with the 3D content. For example, the user may select 3D objects from among the 3D content. As another example, the user may zoom in on the 3D content, or the user may navigate through the 3D content. In some instances, the display module 240 is further to cause display of a set of controls that enable the user to perform interactions with the 3D content. The set of controls enables the user to rotate and view the 3D content from one or more angles.
- At operation 350, the reception module 250 receives, from the second device, an indication of user performed interactions with the 3D content. For example, the reception module 250 may receive a user selection of the 3D objects from among the 3D content.
- At operation 360, the process module 260 processes the received indication of the user performed interactions with the 3D content, as further explained below.
- At operation 370, the display module 240 causes display of a user interface that includes a result that depicts the user performed interactions as being processed for the session, as further explained below.
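- Read end to end, operations 310-370 amount to the following hypothetical driver, written against the module names sketched earlier (any objects exposing those attributes would do); it is illustrative only and not the claimed method.

```python
# Hypothetical driver tying operations 310-370 together; illustrative only.
def run_method_300(system, user, first_device, second_device):
    session = system.session.present(user, first_device)       # operation 310: 2D session
    if not system.detection.detect(user, second_device):       # operation 320: find 3D device
        return None                                            # no 3D-capable device detected
    content_3d = system.virtual_content.retrieve(session)      # operation 330: fetch 3D content
    system.display.show(second_device, content_3d)             # operation 340: display in 3D
    indication = system.reception.receive(second_device)       # operation 350: user interactions
    result = system.process.process(indication, user)          # operation 360: apply to account
    system.display.show(first_device, result)                  # operation 370: reflect in 2D UI
    return result
```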
- As shown in FIG. 4, the method 300 may include one or more of operations 410, 420, and 430. Operations 410 and 420 may be included as part of operation 330. Operation 430 may be performed prior to the operation 310.
- At operation 410, the reception module 250 receives a location of the second device. The location of the second device includes GPS coordinates of the second device. The GPS coordinates of the second device are identified using a GPS receiver embodied on the second device. The location of the second device may also include a physical address.
- At operation 420, the virtual content module 230 identifies a brick-and-mortar store within a pre-defined distance from the location of the second device. In doing so, the virtual content module 230 retrieves a list of brick-and-mortar stores that are within the pre-defined distance from the identified location of the second device. For example, the virtual content module 230 analyzes a map that includes the location of the second device and retrieves, from the map, brick-and-mortar stores that are identified as being within the pre-defined distance from the location of the second device.
- At operation 430, the reception module 250 receives user credentials pertaining to the user. The user credentials include a user password and login information. Moreover, the user credentials are used to log into a user account of the user. The user account of the user is thereafter used to access the session that displays the 2D content and the 3D content. The reception module 250 receives the user credentials from either the first device of the user or the second device of the user.
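- The credential handling in operations 310 and 430 can be illustrated as a single login check that either device may call, as in the hypothetical sketch below; the account store, token format, and hashing choice are assumptions made for the example.

```python
# Hypothetical credential check shared by the first and second devices.
import hashlib

ACCOUNTS = {"alice": hashlib.sha256(b"correct horse").hexdigest()}  # username -> password hash

def login(username: str, password: str) -> str:
    """Return an account id if the credentials match, regardless of which device sent them."""
    digest = hashlib.sha256(password.encode()).hexdigest()
    if ACCOUNTS.get(username) != digest:
        raise PermissionError("invalid credentials")
    return f"account:{username}"

# The same account id keys both the 2D session and the 3D content retrieved later.
print(login("alice", "correct horse"))
```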
- As shown in FIG. 5, the method 300 may include one or more of operations 510, 520, and 530. Operation 510 may be included as part of operation 350. Operation 520 may be included as part of operation 360. Operation 530 may be included as part of operation 370.
- At operation 510, the reception module 250 receives a selection of 3D objects from the 3D content displayed. In some instances, the 3D objects represent one or more items. The user is able to indicate the selection by performing a gesture with respect to the 3D objects. For example, the user is able to indicate a selection of a 3D object by moving the 3D object to a further 3D object, the further 3D object representing a virtual shopping cart. Alternatively, the user may indicate a selection of a 3D object by performing a grabbing gesture with respect to the 3D object.
- At operation 520, the process module 260 adds the items corresponding to the selected 3D objects to a virtual shopping cart (e.g., 3D shopping cart).
- At operation 530, the display module 240 displays the items corresponding to the selected 3D objects as being added to the virtual shopping cart. In other words, the display module 240 causes display, on the first device of the user, of the items corresponding to the 3D objects selected at operation 510 as being added to the virtual shopping cart. In some instances, items corresponding to the 3D objects are shown as being inside the virtual shopping cart.
- As shown in FIG. 6, the method 300 may include one or more of operations 610, 620, and 630. Operation 610 may be included as part of operation 350. Operation 620 may be included as part of operation 360. Operation 630 may be included as part of operation 370.
- At operation 610, the reception module 250 receives a request to purchase an item corresponding to a 3D object from the displayed 3D content. The user is able to indicate the request by performing a gesture with respect to the 3D object. For example, the user may shake the 3D object, or flip the 3D object, to send the request to purchase the item corresponding to the 3D object.
- At operation 620, the process module 260 subtracts a purchase price of the item from an account of the user. For example, the account of the user may be linked to a credit card of the user. Further, the process module 260 charges the credit card of the user for the purchase price of the item. Alternatively, the account of the user may have a certain amount of credits, and the process module 260 subtracts the purchase price of the item from the amount of credits linked to the account.
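- Operation 620 could be realized along the following lines, distinguishing an account linked to a credit card from one holding a credit balance. The sketch is hypothetical; a real system would call a payment processor rather than merely recording the charge, and the field names are assumptions.

```python
# Hypothetical settlement for operation 620: charge a linked card or deduct credits.
def settle_purchase(account: dict, price: float) -> dict:
    if account.get("credit_card"):
        # A real system would call a payment gateway here; this only records the charge.
        account.setdefault("card_charges", []).append(price)
        return {"method": "credit_card", "charged": price}
    if account.get("credits", 0) >= price:
        account["credits"] -= price
        return {"method": "credits", "remaining": account["credits"]}
    raise ValueError("insufficient funds for purchase")

print(settle_purchase({"credit_card": "****4242"}, 350.00))
print(settle_purchase({"credits": 500.00}, 350.00))
```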
- At operation 630, the display module 240 causes display of the purchase price of the item as being subtracted from the account of the user. In other words, the display module 240 causes display, on the first device of the user, of the purchase price of the item as being subtracted from the account of the user, as a result of the operation 620.
- FIG. 7 is a block diagram illustrating an example user interface of an item page 700, according to some example embodiments. The item page 700 includes an image corresponding to a first item 710 (e.g., vintage coin) available for sale and an image corresponding to a second item 720 (e.g., vintage baseball bat) available for sale. Moreover, the item page 700 is presented on a first client device as part of a session (e.g., operation 310 of FIG. 3). Also, a user operating the client device may be logged into a user account, as indicated by a description 730, in order to view the item page 700. The user may provide user credentials in order to log into the user account.
- FIG. 8 is a block diagram illustrating an example user interface of a virtual item page 800, according to some example embodiments. The virtual item page 800 is presented on a second client device that is able to display three-dimensional content. The virtual item page 800 may be a virtual environment including 3D content that corresponds to the item page 700 of FIG. 7. Further, the virtual item page 800 includes 3D content that depicts the first item 710 and the second item 720 as 3D objects. For example, the virtual item page 800 includes a first 3D object 810 which depicts the image corresponding to the first item 710 of FIG. 7 in 3D form. Also, the virtual item page 800 includes a second 3D object 820 which depicts the image corresponding to the second item 720 of FIG. 7 in 3D form. The 3D content is further selectable by the user to perform interactions with the 3D content. For instance, the virtual item page 800 includes a first set of controls 815 that allow the user to rotate and view the first 3D object 810 (e.g., vintage coin). Also, the virtual item page 800 includes a second set of controls 825 that allow the user to rotate and view the second 3D object 820 (e.g., vintage baseball bat).
- Also included in the virtual item page 800 is a virtual shopping cart 850 where the user may place the 3D objects displayed in the virtual item page 800. For example, the user may drag each of the 3D objects into the virtual shopping cart 850. As shown, the virtual shopping cart 850 includes the 3D object of the vintage baseball bat. Once the 3D objects are placed into the virtual shopping cart 850, these user interactions are sent from the second client device to the virtual environment system 150 to be processed by the virtual environment system 150 (e.g., operations 350 and 360 of FIG. 3). Moreover, the user may access the virtual item page 800 while logged into a user account, as indicated by the description 860.
- FIG. 9 is a block diagram illustrating an example user interface of an item page 900, according to some example embodiments. The item page 900 is displayed on the first client device that was also used to display the item page 700 of FIG. 7. The item page 900 includes a result 905 that depicts the user performed interactions with the 3D objects as being processed. For instance, the result 905 indicates that both the first item 910 and the second item 920 have been added to the shopping cart as a result of the interactions performed by the user in the virtual item page 800. Moreover, an image corresponding to the first item 910 corresponds to the image corresponding to the first item 710 of FIG. 7 and the first 3D object 810 of FIG. 8. Likewise, an image corresponding to the second item 920 corresponds to the image corresponding to the second item 720 of FIG. 7 and the second 3D object 820 of FIG. 8. Moreover, the user may access the item page 900 while logged into a user account, as indicated by a description 930.
- Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium) or hardware modules. A "hardware module" is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner. In various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
- In some embodiments, a hardware module may be implemented mechanically, electronically, or any suitable combination thereof. For example, a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations. For example, a hardware module may be a special-purpose processor, such as a Field-Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC). A hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware module may include software executed by a general-purpose processor or other programmable processor. Once configured by such software, hardware modules become specific machines (or specific components of a machine) uniquely tailored to perform the configured functions and are no longer general-purpose processors. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
- Accordingly, the phrase “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware modules) at different times. Software accordingly configures a particular processor or processors, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
- Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
- The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein. As used herein, “processor-implemented module” refers to a hardware module implemented using one or more processors.
- Similarly, the methods described herein may be at least partially processor-implemented, with a particular processor or processors being an example of hardware. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. Moreover, the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an Application Program Interface (API)).
- The performance of certain of the operations may be distributed among the processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the processors or processor-implemented modules may be distributed across a number of geographic locations.
-
FIG. 10 is a block diagram illustrating components of amachine 1000, according to some example embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein. Specifically,FIG. 10 shows a diagrammatic representation of themachine 1000 in the example form of a computer system, within which instructions 1016 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing themachine 1000 to perform any one or more of the methodologies discussed herein may be executed. For example the instructions may cause the machine to execute the flow diagrams ofFIGS. 3-6 . Additionally, or alternatively, the instructions may implement the modules described ofFIG. 2 , and so forth. The instructions transform the general, non-programmed machine into a particular machine programmed to carry out the described and illustrated functions in the manner described. In alternative embodiments, themachine 1000 operates as a standalone device or may be coupled (e.g., networked) to other machines. In a networked deployment, themachine 1000 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. Themachine 1000 may comprise, but not be limited to, a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a smart home device (e.g., a smart appliance), other smart devices, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing theinstructions 1016, sequentially or otherwise, that specify actions to be taken bymachine 1000. Further, while only asingle machine 1000 is illustrated, the term “machine” shall also be taken to include a collection ofmachines 1000 that individually or jointly execute theinstructions 1016 to perform any one or more of the methodologies discussed herein. - The
machine 1000 may includeprocessors 1010,memory 1030, and I/O components 1050, which may be configured to communicate with each other such as via a bus 1002. In an example embodiment, the processors 1010 (e.g., a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Radio-Frequency Integrated Circuit (RFIC), another processor, or any suitable combination thereof) may include, for example,processor 1012 andprocessor 1014 that may executeinstructions 1016. The term “processor” is intended to include multi-core processor that may comprise two or more independent processors (sometimes referred to as “cores”) that may execute instructions contemporaneously. AlthoughFIG. 10 shows multiple processors, themachine 1000 may include a single processor with a single core, a single processor with multiple cores (e.g., a multi-core process), multiple processors with a single core, multiple processors with multiples cores, or any combination thereof. - The memory/
storage 1030 may include amemory 1032, such as a main memory, or other memory storage, and astorage unit 1036, both accessible to theprocessors 1010 such as via the bus 1002. Thestorage unit 1036 andmemory 1032 store theinstructions 1016 embodying any one or more of the methodologies or functions described herein. Theinstructions 1016 may also reside, completely or partially, within thememory 1032, within thestorage unit 1036, within at least one of the processors 1010 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by themachine 1000. Accordingly, thememory 1032, thestorage unit 1036, and the memory ofprocessors 1010 are examples of machine-readable media. - As used herein, “machine-readable medium” means a device able to store instructions and data temporarily or permanently and may include, but is not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, optical media, magnetic media, cache memory, other types of storage (e.g., Erasable Programmable Read-Only Memory (EEPROM)) and/or any suitable combination thereof. The term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store
instructions 1016. The term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., instructions 1016) for execution by a machine (e.g., machine 1000), such that the instructions, when executed by one or more processors of the machine 1000 (e.g., processors 1010), cause themachine 1000 to perform any one or more of the methodologies described herein. Accordingly, a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” excludes signals per se. - The I/
O components 1050 may include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on. The specific I/O components 1050 that are included in a particular machine will depend on the type of machine. For example, portable machines such as mobile phones will likely include a touch input device or other such input mechanisms, while a headless server machine will likely not include such a touch input device. It will be appreciated that the I/O components 1050 may include many other components that are not shown inFIG. 10 . The I/O components 1050 are grouped according to functionality merely for simplifying the following discussion and the grouping is in no way limiting. In various example embodiments, the I/O components 1050 may includeoutput components 1052 andinput components 1054. Theoutput components 1052 may include visual components (e.g., a display such as a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor, resistance mechanisms), other signal generators, and so forth. Theinput components 1054 may include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), tactile input components (e.g., a physical button, a touch screen that provides location and/or force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like. - In further example embodiments, the I/
O components 1050 may includebiometric components 1056,motion components 1058,environmental components 1060, orposition components 1062 among a wide array of other components. For example, thebiometric components 1056 may include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like. Themotion components 1058 may include acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope), and so forth. Theenvironmental components 1060 may include, for example, illumination sensor components (e.g., photometer), temperature sensor components (e.g., one or more thermometer that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., barometer), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensors (e.g., gas detection sensors to detection concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to a surrounding physical environment. Theposition components 1062 may include location sensor components (e.g., a Global Position System (GPS) receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like. - Communication may be implemented using a wide variety of technologies. The I/
O components 1050 may includecommunication components 1064 operable to couple themachine 1000 to anetwork 1080 ordevices 1070 viacoupling 1082 andcoupling 1072 respectively. For example, thecommunication components 1064 may include a network interface component or other suitable device to interface with thenetwork 1080. In further examples,communication components 1064 may include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components to provide communication via other modalities. Thedevices 1070 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)). - Moreover, the
communication components 1064 may detect identifiers or include components operable to detect identifiers. For example, thecommunication components 1064 may include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as Universal Product Code (UPC) bar code, multi-dimensional bar codes such as Quick Response (QR) code, Aztec code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, UCC RSS-2D bar code, and other optical codes), or acoustic detection components (e.g., microphones to identify tagged audio signals). In addition, a variety of information may be derived via thecommunication components 1064, such as, location via Internet Protocol (IP) geo-location, location via Wi-Fi® signal triangulation, location via detecting a NFC beacon signal that may indicate a particular location, and so forth. - In various example embodiments, one or more portions of the
network 1080 may be an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), the Internet, a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a plain old telephone service (POTS) network, a cellular telephone network, a wireless network, a Wi-Fi® network, another type of network, or a combination of two or more such networks. For example, thenetwork 1080 or a portion of thenetwork 1080 may include a wireless or cellular network and thecoupling 1082 may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or other type of cellular or wireless coupling. In this example, thecoupling 1082 may implement any of a variety of types of data transfer technology, such as Single Carrier Radio Transmission Technology (1×RTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, third Generation Partnership Project (3GPP) including 3G, fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), Long Term Evolution (LTE) standard, others defined by various standard setting organizations, other long range protocols, or other data transfer technology. - The
- The instructions 1016 may be transmitted or received over the network 1080 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 1064) and utilizing any one of a number of well-known transfer protocols (e.g., hypertext transfer protocol (HTTP)). Similarly, the instructions 1016 may be transmitted or received using a transmission medium via the coupling 1072 (e.g., a peer-to-peer coupling) to devices 1070. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying the instructions 1016 for execution by the machine 1000, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.
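As a concrete, non-normative illustration of an HTTP transfer of the kind mentioned above, the sketch below uses the standard fetch API; the endpoint URL and payload shape are hypothetical placeholders.

```typescript
// Illustrative sketch only: sending a small payload over HTTP with fetch.
// The endpoint and the InteractionEvent shape are hypothetical.

interface InteractionEvent {
  itemId: string;
  action: 'add_to_cart';
}

async function sendInteraction(event: InteractionEvent): Promise<void> {
  const response = await fetch('https://example.com/api/interactions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(event),
  });
  if (!response.ok) {
    throw new Error(`HTTP ${response.status}`);
  }
  console.log('interaction delivered over HTTP');
}

sendInteraction({ itemId: 'item-123', action: 'add_to_cart' }).catch(console.error);
```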
- Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
- Although an overview of the inventive subject matter has been described with reference to specific example embodiments, various modifications and changes may be made to these embodiments without departing from the broader scope of embodiments of the present disclosure. Such embodiments of the inventive subject matter may be referred to herein, individually or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single disclosure or inventive concept if more than one is, in fact, disclosed.
- The embodiments illustrated herein are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
- As used herein, the term “or” may be construed in either an inclusive or exclusive sense. Moreover, plural instances may be provided for resources, operations, or structures described herein as a single instance. Additionally, boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in a context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within a scope of various embodiments of the present disclosure. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within a scope of embodiments of the present disclosure as represented by the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
Claims (20)
1. A system comprising:
at least one processor; and
at least one memory including instructions that, when executed by the at least one processor, cause the system to perform operations comprising:
causing display, by a first device, of an item page comprising a two-dimensional representation of an item;
causing display, by a second device, of a virtual environment comprising a three-dimensional representation of the item corresponding to the two-dimensional representation of the item;
receiving, from the second device, an indication of a user interaction with the three-dimensional representation of the item in the virtual environment; and
causing display of a result of the user interaction on the item page of the first device, wherein the result indicates that the item has been added to a virtual shopping cart of the virtual environment.
2. The system of claim 1, wherein the instructions, when executed by the at least one processor, further cause the system to perform operations comprising:
receiving user credentials via the first device, wherein causing display of the virtual environment on the second device is based at least in part on the user credentials received via the first device.
3. The system of claim 1, wherein causing display of the virtual environment on the second device is based at least in part on detecting the second device.
4. The system of claim 1, wherein the instructions to receive, from the second device, the indication of the user interaction with the three-dimensional representation further cause the system to perform operations comprising:
detecting a gesture performed with respect to the three-dimensional representation of the item in the virtual environment, the gesture corresponding to a request to purchase the item; and
causing display of an amount debited from an account based at least in part on detecting the gesture.
5. The system of claim 1, wherein the instructions, when executed by the at least one processor, further cause the system to perform operations comprising:
retrieving a location of a store that is within a predefined distance of the second device; and
causing presentation of a map that indicates a location of the second device and the location of the store.
6. The system of claim 1, wherein the instructions to receive the indication of the user interaction further cause the system to perform operations comprising:
receiving the indication of the user interaction that indicates a user selection of the three-dimensional representation of the item in the virtual environment.
7. The system of claim 1, wherein the instructions, when executed by the at least one processor, further cause the system to perform operations comprising:
causing display, by the second device, of the three-dimensional representation of the item inside of a three-dimensional representation of the virtual shopping cart based at least in part on the indication of the user interaction.
8. The system of claim 1, wherein the instructions to cause display of the result of the user interaction further cause the system to perform operations comprising:
causing display, by the first device, of the two-dimensional representation of the item inside of a two-dimensional representation of the virtual shopping cart based at least in part on the indication of the user interaction.
9. A computer-implemented method, comprising:
causing display, by a first device, of an item page comprising a two-dimensional representation of an item;
causing display, by a second device, of a virtual environment comprising a three-dimensional representation of the item corresponding to the two-dimensional representation of the item;
receiving, from the second device, an indication of a user interaction with the three-dimensional representation of the item in the virtual environment; and
causing display of a result of the user interaction on the item page of the first device, wherein the result indicates that the item has been added to a virtual shopping cart of the virtual environment.
10. The computer-implemented method of claim 9, further comprising:
receiving user credentials via the first device, wherein causing display of the virtual environment on the second device is based at least in part on the user credentials received via the first device.
11. The computer-implemented method of claim 9, wherein causing display of the virtual environment on the second device is based at least in part on detecting the second device.
12. The computer-implemented method of claim 9, wherein receiving, from the second device, the indication of the user interaction with the three-dimensional representation comprises:
detecting a gesture performed with respect to the three-dimensional representation of the item in the virtual environment, the gesture corresponding to a request to purchase the item; and
causing display of an amount debited from an account based at least in part on detecting the gesture.
13. The computer-implemented method of claim 9, further comprising:
retrieving a location of a store that is within a predefined distance of the second device; and
causing presentation of a map that indicates a location of the second device and the location of the store.
14. The computer-implemented method of claim 9, wherein receiving the indication of the user interaction comprises:
receiving the indication of the user interaction that indicates a user selection of the three-dimensional representation of the item in the virtual environment.
15. The computer-implemented method of claim 9, further comprising:
causing display, by the second device, of the three-dimensional representation of the item inside of a three-dimensional representation of the virtual shopping cart based at least in part on the indication of the user interaction.
16. The computer-implemented method of claim 9, wherein causing display of the result of the user interaction comprises:
causing display, by the first device, of the two-dimensional representation of the item inside of a two-dimensional representation of the virtual shopping cart based at least in part on the indication of the user interaction.
17. A non-transitory computer-readable storage medium including instructions that, when executed by at least one processor, cause a system to perform operations comprising:
causing display, by a first device, of an item page comprising a two-dimensional representation of an item;
causing display, by a second device, of a virtual environment comprising a three-dimensional representation of the item corresponding to the two-dimensional representation of the item;
receiving, from the second device, an indication of a user interaction with the three-dimensional representation of the item in the virtual environment; and
causing display of a result of the user interaction on the item page of the first device, wherein the result indicates that the item has been added to a virtual shopping cart of the virtual environment.
18. The non-transitory computer-readable storage medium of claim 17, the operations further comprising:
receiving user credentials via the first device, wherein causing display of the virtual environment on the second device is based at least in part on the user credentials received via the first device.
19. The non-transitory computer-readable storage medium of claim 17, wherein causing display of the virtual environment on the second device is based at least in part on detecting the second device.
20. The non-transitory computer-readable storage medium of claim 17, wherein the instructions to receive, from the second device, the indication of the user interaction with the three-dimensional representation further cause the system to perform operations comprising:
detecting a gesture performed with respect to the three-dimensional representation of the item in the virtual environment, the gesture corresponding to a request to purchase the item; and
causing display of an amount debited from an account based at least in part on detecting the gesture.
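For readers less familiar with claim language, the following is a minimal, non-normative sketch of the two-device flow recited in independent claims 1, 9, and 17 above: the system receives an interaction with the three-dimensional item representation from the second device and causes the first device's item page to reflect the result. Every type, class, and name below is hypothetical and purely illustrative.

```typescript
// Non-normative sketch of the claimed flow; console output stands in for
// "causing display" on each device. All names are hypothetical.

type DeviceId = string;

interface Interaction {
  itemId: string;
  kind: 'add_to_cart';
}

class SessionCoordinator {
  private cart = new Set<string>();

  constructor(
    private showOnItemPage: (message: string) => void,   // first device
    private showInVirtualEnv: (message: string) => void, // second device
  ) {}

  displayItemPage(itemId: string): void {
    this.showOnItemPage(`Item page: 2D representation of ${itemId}`);
  }

  displayVirtualEnvironment(itemId: string): void {
    this.showInVirtualEnv(`Virtual environment: 3D representation of ${itemId}`);
  }

  // Called when the second device reports a user interaction with the 3D item.
  onInteraction(from: DeviceId, interaction: Interaction): void {
    if (interaction.kind === 'add_to_cart') {
      this.cart.add(interaction.itemId);
      // The result of the interaction is surfaced on the first device's item page.
      this.showOnItemPage(
        `${interaction.itemId} added to the virtual shopping cart (${this.cart.size} item(s)) via ${from}`,
      );
    }
  }
}

// Usage example.
const coordinator = new SessionCoordinator(
  (m) => console.log(`[first device] ${m}`),
  (m) => console.log(`[second device] ${m}`),
);
coordinator.displayItemPage('item-123');
coordinator.displayVirtualEnvironment('item-123');
coordinator.onInteraction('second-device', { itemId: 'item-123', kind: 'add_to_cart' });
```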
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/055,289 US20230072889A1 (en) | 2015-05-14 | 2022-11-14 | Displaying a virtual environment of a session |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/712,829 US10825081B2 (en) | 2015-05-14 | 2015-05-14 | Displaying a virtual environment of a session |
US17/033,590 US11514508B2 (en) | 2015-05-14 | 2020-09-25 | Displaying a virtual environment of a session |
US18/055,289 US20230072889A1 (en) | 2015-05-14 | 2022-11-14 | Displaying a virtual environment of a session |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/033,590 Continuation US11514508B2 (en) | 2015-05-14 | 2020-09-25 | Displaying a virtual environment of a session |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230072889A1 (en) | 2023-03-09 |
Family
ID=57249531
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/712,829 Active 2036-06-12 US10825081B2 (en) | 2015-05-14 | 2015-05-14 | Displaying a virtual environment of a session |
US17/033,590 Active 2035-10-15 US11514508B2 (en) | 2015-05-14 | 2020-09-25 | Displaying a virtual environment of a session |
US18/055,289 Pending US20230072889A1 (en) | 2015-05-14 | 2022-11-14 | Displaying a virtual environment of a session |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/712,829 Active 2036-06-12 US10825081B2 (en) | 2015-05-14 | 2015-05-14 | Displaying a virtual environment of a session |
US17/033,590 Active 2035-10-15 US11514508B2 (en) | 2015-05-14 | 2020-09-25 | Displaying a virtual environment of a session |
Country Status (5)
Country | Link |
---|---|
US (3) | US10825081B2 (en) |
EP (1) | EP3295295A1 (en) |
KR (2) | KR102381857B1 (en) |
CN (1) | CN107533428A (en) |
WO (1) | WO2016183476A1 (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10825081B2 (en) | 2015-05-14 | 2020-11-03 | Ebay Inc. | Displaying a virtual environment of a session |
US20190272585A1 (en) * | 2016-11-08 | 2019-09-05 | Misho MILICEVIC | Virtual shopping software system and method |
US10586379B2 (en) * | 2017-03-08 | 2020-03-10 | Ebay Inc. | Integration of 3D models |
US20190188918A1 (en) * | 2017-12-14 | 2019-06-20 | Tsunami VR, Inc. | Systems and methods for user selection of virtual content for presentation to another user |
WO2019151323A1 (en) * | 2018-02-05 | 2019-08-08 | 株式会社ソニー・インタラクティブエンタテインメント | Entertainment device, display control method and display control program |
US11048374B2 (en) * | 2018-03-08 | 2021-06-29 | Ebay Inc. | Online pluggable 3D platform for 3D representations of items |
US10523921B2 (en) * | 2018-04-06 | 2019-12-31 | Zspace, Inc. | Replacing 2D images with 3D images |
US11727656B2 (en) | 2018-06-12 | 2023-08-15 | Ebay Inc. | Reconstruction of 3D model with immersive experience |
US10916220B2 (en) * | 2018-08-07 | 2021-02-09 | Apple Inc. | Detection and display of mixed 2D/3D content |
US10902506B2 (en) * | 2018-09-11 | 2021-01-26 | Ebay Inc. | Crowd sourcing locations for seller privacy |
US11288733B2 (en) * | 2018-11-14 | 2022-03-29 | Mastercard International Incorporated | Interactive 3D image projection systems and methods |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040153371A1 (en) * | 2003-01-30 | 2004-08-05 | Razumov Sergey N. | Graphical user interface for product ordering in retail system |
US20100010902A1 (en) * | 2008-05-16 | 2010-01-14 | Ginger Casey | Systems and Methods for Virtual Markets with Product Pickup |
US20120005717A1 (en) * | 2010-06-30 | 2012-01-05 | At&T Intellectual Property I, L.P. | Apparatus and method for managing the presentation of media content |
US20130066749A1 (en) * | 2010-11-19 | 2013-03-14 | Mastercard International Incorporated | Method and system for consumer transactions using voice or human based gesture actions |
US20130141428A1 (en) * | 2011-11-18 | 2013-06-06 | Dale L. Gipson | Computer-implemented apparatus, system, and method for three dimensional modeling software |
US20130218912A1 (en) * | 2012-02-22 | 2013-08-22 | Ebay Inc. | Systems and methods to provide search results based on time to obtain |
US20130311340A1 (en) * | 2012-05-18 | 2013-11-21 | Ebay Inc. | Systems and methods for displaying items |
US20140365272A1 (en) * | 2013-06-07 | 2014-12-11 | Bby Solutions, Inc. | Product display with emotion prediction analytics |
Family Cites Families (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU1333895A (en) * | 1993-11-30 | 1995-06-19 | Raymond R. Burke | Computer system for allowing a consumer to purchase packaged goods at home |
US6480204B1 (en) * | 1996-04-30 | 2002-11-12 | Sun Microsystems, Inc. | Transparent sunpad for home shopping |
US20010019337A1 (en) | 2000-03-03 | 2001-09-06 | Jong Min Kim | System for providing clients with a three dimensional virtual reality |
KR20020084148A (en) | 2000-03-10 | 2002-11-04 | 리츠에프엑스 리미티드 | Natural user interface for virtual reality shopping systems |
WO2001091016A1 (en) * | 2000-05-25 | 2001-11-29 | Realitybuy, Inc. | A real time, three-dimensional, configurable, interactive product display system and method |
US20030163438A1 (en) | 2000-10-19 | 2003-08-28 | General Electric Company | Delegated administration of information in a database directory using at least one arbitrary group of users |
US6917370B2 (en) | 2002-05-13 | 2005-07-12 | Charles Benton | Interacting augmented reality and virtual reality |
US7680694B2 (en) * | 2004-03-11 | 2010-03-16 | American Express Travel Related Services Company, Inc. | Method and apparatus for a user to shop online in a three dimensional virtual reality setting |
US7696992B2 (en) | 2007-01-16 | 2010-04-13 | Motorola, Inc. | Method and apparatus to facilitate multi-setting virtual reality experiences |
US8117089B2 (en) * | 2007-02-13 | 2012-02-14 | Claudia Juliana Minsky | System for segmentation by product category of product images within a shopping cart |
US20090132309A1 (en) | 2007-11-21 | 2009-05-21 | International Business Machines Corporation | Generation of a three-dimensional virtual reality environment from a business process model |
US8065200B2 (en) * | 2007-11-26 | 2011-11-22 | International Business Machines Corporation | Virtual web store with product images |
KR20090094526A (en) | 2008-03-03 | 2009-09-08 | 이동천 | Smart purchasing support system |
JP4740990B2 (en) * | 2008-10-10 | 2011-08-03 | 東芝テック株式会社 | Table for restaurant and electronic menu device using this table |
JP5157969B2 (en) * | 2009-03-09 | 2013-03-06 | ソニー株式会社 | Information processing apparatus, threshold setting method and program thereof |
US9122707B2 (en) | 2010-05-28 | 2015-09-01 | Nokia Technologies Oy | Method and apparatus for providing a localized virtual reality environment |
US8443300B2 (en) * | 2010-08-24 | 2013-05-14 | Ebay Inc. | Three dimensional navigation of listing information |
US20140201029A9 (en) * | 2010-09-03 | 2014-07-17 | Joseph Anthony Plattsmier | 3D Click to Buy |
WO2012048252A1 (en) | 2010-10-07 | 2012-04-12 | Aria Glassworks, Inc. | System and method for transitioning between interface modes in virtual and augmented reality applications |
US9269096B2 (en) * | 2011-05-23 | 2016-02-23 | Microsoft Technology Licensing, Llc | Advertisement rendering for multiple synced devices |
US8190749B1 (en) | 2011-07-12 | 2012-05-29 | Google Inc. | Systems and methods for accessing an interaction state between multiple devices |
US20130110666A1 (en) | 2011-10-28 | 2013-05-02 | Adidas Ag | Interactive retail system |
US9875480B2 (en) * | 2012-01-27 | 2018-01-23 | Sony Network Entertainment International Llc | System, method, and infrastructure for real-time live streaming content |
US8497859B1 (en) | 2012-04-03 | 2013-07-30 | Google Inc. | Display of information on or within a three-dimensional image |
US20140007205A1 (en) * | 2012-06-28 | 2014-01-02 | Bytemobile, Inc. | No-Click Log-In Access to User's Web Account Using a Mobile Device |
US9665905B2 (en) * | 2012-08-21 | 2017-05-30 | Matthew Lehrer | Three dimensional shopping cart |
US20140067624A1 (en) | 2012-09-05 | 2014-03-06 | Microsoft Corporation | Accessing a shopping service through a game console |
US9430752B2 (en) | 2012-11-02 | 2016-08-30 | Patrick Soon-Shiong | Virtual planogram management, systems, and methods |
US10304037B2 (en) * | 2013-02-04 | 2019-05-28 | Haworth, Inc. | Collaboration system including a spatial event map |
US20140337149A1 (en) * | 2013-03-12 | 2014-11-13 | Taco Bell Corp. | Systems, methods, and devices for a rotation-based order module |
US20150026012A1 (en) * | 2013-07-16 | 2015-01-22 | Ron Gura | Systems and methods for online presentation of storefront images |
US9451162B2 (en) * | 2013-08-21 | 2016-09-20 | Jaunt Inc. | Camera array including camera modules |
US20150084837A1 (en) | 2013-09-19 | 2015-03-26 | Broadcom Corporation | Coordination of multiple mobile device displays |
US20150120496A1 (en) | 2013-10-25 | 2015-04-30 | Stuart Watson | Shopping System |
US20190037611A1 (en) * | 2013-12-23 | 2019-01-31 | Google Llc | Intuitive inter-device connectivity for data sharing and collaborative resource usage |
US20150213496A1 (en) * | 2014-01-24 | 2015-07-30 | Aol Inc. | Methods and systems for displaying electronic content to individuals in geographic zone having inner boundary |
CN104123664A (en) | 2014-08-04 | 2014-10-29 | 中网一号电子商务有限公司 | Correlation method of 3D (three-dimensional) virtual world and real city |
US10235714B2 (en) * | 2014-12-01 | 2019-03-19 | Verizon Patent And Licensing Inc. | Customized virtual reality user environment control |
US10825081B2 (en) | 2015-05-14 | 2020-11-03 | Ebay Inc. | Displaying a virtual environment of a session |
- 2015
  - 2015-05-14 US US14/712,829 patent/US10825081B2/en active Active
- 2016
  - 2016-05-13 KR KR1020217003761A patent/KR102381857B1/en active IP Right Grant
  - 2016-05-13 CN CN201680028037.2A patent/CN107533428A/en active Pending
  - 2016-05-13 WO PCT/US2016/032438 patent/WO2016183476A1/en unknown
  - 2016-05-13 KR KR1020177036153A patent/KR20180006976A/en active Application Filing
  - 2016-05-13 EP EP16793629.3A patent/EP3295295A1/en not_active Ceased
- 2020
  - 2020-09-25 US US17/033,590 patent/US11514508B2/en active Active
- 2022
  - 2022-11-14 US US18/055,289 patent/US20230072889A1/en active Pending
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040153371A1 (en) * | 2003-01-30 | 2004-08-05 | Razumov Sergey N. | Graphical user interface for product ordering in retail system |
US20100010902A1 (en) * | 2008-05-16 | 2010-01-14 | Ginger Casey | Systems and Methods for Virtual Markets with Product Pickup |
US9633384B2 (en) * | 2008-05-16 | 2017-04-25 | Ginger Casey | Systems and methods for virtual markets with product pickup |
US20120005717A1 (en) * | 2010-06-30 | 2012-01-05 | At&T Intellectual Property I, L.P. | Apparatus and method for managing the presentation of media content |
US20130066749A1 (en) * | 2010-11-19 | 2013-03-14 | Mastercard International Incorporated | Method and system for consumer transactions using voice or human based gesture actions |
US20130141428A1 (en) * | 2011-11-18 | 2013-06-06 | Dale L. Gipson | Computer-implemented apparatus, system, and method for three dimensional modeling software |
US20130218912A1 (en) * | 2012-02-22 | 2013-08-22 | Ebay Inc. | Systems and methods to provide search results based on time to obtain |
US20130311340A1 (en) * | 2012-05-18 | 2013-11-21 | Ebay Inc. | Systems and methods for displaying items |
US20140365272A1 (en) * | 2013-06-07 | 2014-12-11 | Bby Solutions, Inc. | Product display with emotion prediction analytics |
Also Published As
Publication number | Publication date |
---|---|
CN107533428A (en) | 2018-01-02 |
US10825081B2 (en) | 2020-11-03 |
KR20210018541A (en) | 2021-02-17 |
KR20180006976A (en) | 2018-01-19 |
EP3295295A4 (en) | 2018-03-21 |
US20210012414A1 (en) | 2021-01-14 |
WO2016183476A1 (en) | 2016-11-17 |
US20160335712A1 (en) | 2016-11-17 |
US11514508B2 (en) | 2022-11-29 |
EP3295295A1 (en) | 2018-03-21 |
KR102381857B1 (en) | 2022-04-04 |
Similar Documents
Publication | Title
---|---
US11514508B2 (en) | Displaying a virtual environment of a session
US20170193544A1 (en) | Modification of content according to user engagement
US11640633B2 (en) | Enhanced shopping actions on a mobile device
US20160313888A1 (en) | Graphical user interface for distraction free shopping on a mobile device
US11568474B2 (en) | On-line session trace system
US11907938B2 (en) | Redirecting to a trusted device for secured data transmission
US20150379045A1 (en) | Obtaining item listings relating to a look of image selected in a user interface
US11954723B2 (en) | Replaced device handler
US20210158371A1 (en) | Verified video reviews
US20190295172A1 (en) | Transmitting data to select users
WO2016172419A1 (en) | Generating a discovery page depicting item aspects
US10157240B2 (en) | Systems and methods to generate a concept graph
US20160314513A1 (en) | Automatic negotiation using real time messaging
US20160314523A1 (en) | Presentation of bidding activity
Legal Events
Code | Title | Description
---|---|---
AS | Assignment | Owner name: EBAY INC., CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: TAPLEY, JOHN; LEACH, SKOT; BEACH, DAVID; SIGNING DATES FROM 20150512 TO 20150513; REEL/FRAME: 061763/0592
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED