GB2512571A - System and method for managing interaction with a simulated object - Google Patents

System and method for managing interaction with a simulated object Download PDF

Info

Publication number
GB2512571A
Authority
GB
United Kingdom
Prior art keywords
user
simulation
actions
selectable interaction
simulated object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1301814.8A
Other versions
GB201301814D0 (en)
Inventor
Nathan Summers
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1301814.8A priority Critical patent/GB2512571A/en
Publication of GB201301814D0 publication Critical patent/GB201301814D0/en
Priority to PCT/EP2014/051299 priority patent/WO2014118072A1/en
Publication of GB2512571A publication Critical patent/GB2512571A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0621Item configuration or customization
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0641Shopping interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/003Navigation within 3D models or images

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Human Computer Interaction (AREA)
  • Marketing (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method of managing user interaction with a simulated object, the method comprising: generating a simulation of the object, the simulation being capable of being manipulated by a user and being associated with a plurality of user-selectable interaction actions; displaying the simulation on a first display device; manipulating the simulation of the object in response to user inputs; monitoring which of the plurality of user-selectable interaction actions the user selects; identifying currently unused user-selectable interaction actions; and prompting the user regarding at least one of the currently unused user-selectable interaction actions. Interacting with the simulation may include a gesture control input mechanism, and the simulation may be animated. Actions may include: calling up information overlays; calling up menus; changing a viewpoint of the simulation; changing a parameter of the simulated object; changing the simulated object. Prompting the user may comprise sending an audible alert to the user or displaying a visual indication, such as a text-based message. The embodiment relates to an interactive system for customising a car before ordering it.

Description

SYSTEM AND METHOD FOR MANAGING INTERACTION WITH A SIMULATED OBJECT
TECHNICAL FIELD
The present invention relates to a system and method for managing interaction with a simulated object. The present invention relates particularly, but not exclusively, to interacting with a simulation of a vehicle during the lifecycle of a transaction in which a display server prompts the user with respect to the various simulation interactions available to them. Aspects of the invention relate to a method, to a system, to a carrier medium and to a server for managing the user interaction.
BACKGROUND
Prospective customers wishing to purchase a transaction item generally have one or more purchase routes available to them: purchase in a store, telephone order or online purchase via an online transaction platform (e.g. manufacturer's website). With the proliferation of high speed broadband internet connections many customers are favouring the online purchase route.
In some circumstances, particularly for large value purchases, a customer may initially research a transaction item online before visiting a retail outlet to either complete the transaction or to view the transaction item prior to an online purchase.
In some transaction environments the transaction item may comprise configurable elements and the online transaction platform that is available for the customer to use may allow these various configurable options to be displayed to the customer. For example, where the transaction item is a vehicle then the customer may have the option of exploring various configuration options relating to the vehicle they are interested in, e.g. paint colour and finish, interior trim options, exterior trim options etc. Any changes made while viewing the vehicle on the manufacturer's website may be represented via an online rendering of the vehicle that has been selected.
Although the ability to configure aspects of the vehicle may be provided to a customer on the online transaction platform, often the visual experience that is available to them is limited by the display and processing limitations of the device they are viewing the vehicle from. For example, if a customer visits an online vehicle configurator via a mobile device then there are likely to be processing and display limitations. Even if the customer visits the configurator from a home PC then there may be display limitations that mean that they do not receive a representative experience of the vehicle they are interested in.
When interacting with a simulated version of a vehicle the display system may be able to provide a number of configuration options to the user and to provide an array of supporting information, e.g. about performance specifications, delivery times etc. However, in some circumstances such vehicle-related data may either not be explored or may simply be ignored by the user.
It is an aim of the present invention to address disadvantages associated with a user's interaction with a simulated object.
SUMMARY OF THE INVENTION
According to an aspect of the present invention, there is provided a method of managing user interaction with a simulated object, the method comprising: generating a simulation of the object, the simulation being capable of being manipulated by a user and being associated with a plurality of user-selectable interaction actions; displaying the simulation on a first display device; manipulating the simulation of the object in response to user inputs; monitoring which of the plurality of user-selectable interaction actions the user selects; identifying currently unused user-selectable interaction actions; prompting the user regarding at least one of the currently unused user-selectable interaction actions.
The present invention provides a method of interacting with a simulated object and prompting a user who is manipulating the object that there are user-selectable interaction options that they have not yet used. In the context of a simulation of a vehicle, "manipulating the simulation" may mean rotating the vehicle or zooming into and out of it, in other words, spatial manipulation of the object. "User-selectable interaction actions", on the other hand, may comprise changing the view of the vehicle such that the user is given the effect of "getting into the vehicle", changing paint colour and/or trim options on the simulated object, inspecting features on the vehicle (e.g. light array, wheel hubs etc.), calling up menus or information overlays etc. The first display device may conveniently be in communication with a rendering means, the rendering means rendering the simulation of the object. The simulation may be a three dimensional simulation of the object.
Interacting with the simulation on the first display device may comprise interacting with a gesture control input mechanism. The simulation may be animated so that interaction with the simulated object may be arranged to mimic real world responses of the object.
Manipulating the object in response to user inputs may conveniently refer to spatial manipulation of the simulated object. User-selectable interaction actions may comprise one or more of the following actions: calling up information overlays; calling up menus; changing a viewpoint of the simulation (e.g. a step change in the viewpoint of the object rather than a gradual and continuous change in viewpoint that is achieved via manipulation of the object); changing a parameter of the simulated object; changing the simulated object.
The user-selectable interaction actions may be associated with a feature of the simulated object.
Generating a simulation of the object may comprise generating a list of all the user-selectable interaction actions that may be selected by the user for the simulated object.
Identifying currently unused user-selectable interaction actions may comprise monitoring the complete list of all the user-selectable interaction actions against selected user-selectable interaction actions. Identifying currently unused user-selectable interaction actions may comprise monitoring a subset of the complete list of all the user-selectable interaction actions against selected user-selectable interaction actions. The subset may relate to user-selectable actions associated with a portion of the simulated object that is being displayed.
In this manner if a user is, for example, looking at the outside of a simulated object the system may not track the actions that are associated with the inside of the object. In this way, the system may identify a list of currently unused user-selectable actions and prioritise the order of those actions by relevance to the current user interaction with that object. In other words, the system may not monitor user-selectable options that are currently inaccessible, e.g. because they relate to a different part of the object or to something that is not currently being simulated (an inside versus an outside view). However, if the user appears not to have recognised the extent of the available interaction and functionality of the simulated object, the user may be prompted to take a guided tour or guided to interact with a feature intended to extend the user's interactive experience with the simulated object.
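By way of illustration only, the bookkeeping described above might be sketched as follows. The class and field names (InteractionAction, region, priority) are assumptions chosen for clarity and are not taken from the patent; the priority field could, for example, be raised for actions relevant to the user's chosen configuration settings, as discussed further below.

```python
# Minimal sketch, assuming each interaction action is tagged with the
# portion of the object it belongs to and a relevance priority.
from dataclasses import dataclass

@dataclass
class InteractionAction:
    name: str          # e.g. "open_doors" or "wheel_info_overlay" (hypothetical)
    region: str        # portion of the object the action belongs to, e.g. "exterior"
    priority: int = 0  # higher values are suggested to the user first
    used: bool = False

class ActionMonitor:
    """Tracks which of the user-selectable interaction actions have been used."""

    def __init__(self, actions):
        self.actions = list(actions)

    def mark_used(self, name):
        for action in self.actions:
            if action.name == name:
                action.used = True

    def unused(self, visible_region=None):
        """Return unused actions, optionally restricted to the portion of the
        simulated object currently on display, ordered by relevance."""
        candidates = [a for a in self.actions
                      if not a.used
                      and (visible_region is None or a.region == visible_region)]
        return sorted(candidates, key=lambda a: a.priority, reverse=True)
```

A call such as monitor.unused(visible_region="exterior") would then yield only the untried actions relevant to the current outside view.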
Prompting the user may comprise sending an audible alert prompt type to the user.
Alternatively, prompting the user may comprise displaying a visual indication prompt type (for example on the first display device or another display device).
The visual indication may also comprise a text based message to the user (e.g. instructions to the user). Alternatively or additionally, the visual indication may comprise an icon based indication. The icon based indication may be an animated indication which is arranged to move towards a feature of the simulated object which is associated with at least one of the currently unused user-selectable interaction actions. Such an icon may move to the vicinity of the feature or may move over the feature in question.
The prompt type may be configured by a system administrator or by the user manipulating the simulated object. The frequency of the appearance of the prompt type may be configured by the system administrator or by the user manipulating the simulated object.
The method may comprise updating a list of currently unused user-selectable interaction actions once a user has selected one or more of the actions from the list. The appearance of a prompt, or the order in which successive prompts are generated, may be determined at least in part in dependence on key user-selected configuration settings of the simulated object itself. In this way, if the user has chosen options to accentuate or enhance a key performance characteristic of the object, then they may be guided to interact with simulated features relevant to that performance characteristic as a priority over, or as an alternative to, other features relevant to other performance characteristics.
In the event that the user-selectable interaction actions are associated with a feature of the simulated object, the method may further comprise initially displaying in an initial view of the simulated object a feature to the user without prompting the user regarding a currently unused user-selectable interaction action(s) associated with that feature. The user may then be subsequently prompted of the currently unused user-selectable interaction actions at a later point in time. The user may view an intermediate view of the simulated object in which the feature is not visible before subsequently returning to the initial view of the feature at the later point in time.
According to a further aspect of the present invention, there is provided a display system for interacting with an object, the system comprising: a first display device; a computing device arranged to: generate a simulation of the object, the simulation being capable of being manipulated by a user and being associated with a plurality of user-selectable interaction actions; display the simulation on a first display device; manipulate the simulation of the object in response to user inputs; monitor which of the plurality of user-selectable interaction actions the user selects; identify currently unused user-selectable interaction actions; prompt the user regarding at least one of the currently unused user-selectable interaction actions.
The invention extends to a carrier medium for carrying a computer readable code for controlling a server to carry out the method of the first aspect of the invention.
Within the scope of this application it is expressly envisaged that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. Features described in connection with one embodiment are applicable to all embodiments, unless such features are incompatible.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 shows an overview of the architecture of a transaction management system in accordance with an embodiment of the present invention;
Figure 2 is a flow chart of the lifecycle of a transaction in accordance with another embodiment of the present invention;
Figure 3 shows the elements of a system component of Figure 1 in greater detail;
Figure 4 shows a display system in accordance with an embodiment of the present invention;
Figure 5 shows a flow chart of a user interacting with a simulated object;
Figure 6 shows a user interacting with components of the system shown in Figure 3;
Figure 7 shows a method of interacting with a simulated object according to an embodiment of the present invention; and
Figures 8 and 9 show a user interacting with components of the system shown in Figure 3.
DETAILED DESCRIPTION
Turning to Figure 1 a transaction management system 1 in accordance with an embodiment of the present invention is shown. The transaction management system 1 comprises a transaction server 3 and a display system 5.
As shown in Figure 1, the server 3 and display system 5 are located remotely from one another and are in communication with one another via the internet 9 (or any other suitable communications network, e.g. a bespoke communications network or a mobile communications based network). It is however noted that the server 3 and display system 5 could be co-located at the same physical location.
As well as being in communication with the display system 5, the server 3 may also be accessed by users at a user computing device 11 (such as a PC, smartphone, laptop or any other suitable computing device). For the sake of clarity only one user computing device is shown in Figure 1 although it is to be appreciated that a plurality of such computing devices may interact with the server 3 at any given time.
The server further comprises a portal means 13 in the form of a portal module through which a user at the computing device 11 may interact with the server 3 (and through which the server 3 may interact with the display system 5), configuration means 15 in the form of a configuration module and customer relationship management (CRM) means 17 in the form of a customer relationship management module.
In use, the server may be arranged to output data (via the portal means 13) to the computing device 11 to allow a visual representation of a transaction item to be displayed on a display screen 19 of the computing device. The user may configure the transaction item to display various different configuration options and the configuration means 15 is arranged to manage the configuration process.
Any user related data entered during the user's interaction with the server 3 may be recorded and captured within the CRM means 17 and stored within a database 21.
Database 21 may also store details of the various transaction items that the user can access along with each item's potential configuration settings/options.
Also shown in Figure 1 is an information element 23 in accordance with embodiments of the present invention, the operation of which is described in detail below. The information element is shown being supplied to the user's computing device 11. It is also noted that the information element 23 and/or the visual representation of the transaction item may also be sent to the display system 5, as described in greater detail below.
The transaction management system 1 may be used to manage the lifecycle of a transaction made by a user. The lifecycle management process is depicted in Figure 2 which is described with further reference to Figure 1.
In Step 201 of Figure 2 a user at a computing device 11 connects to the transaction management system 1 and in particular the server 3 via the portal means 13 and the internet 9 and accesses a configurable transaction item. In one embodiment of the present invention the transaction item may be a vehicle and the accessing of a configurable transaction item may comprise choosing a vehicle model.
In Step 203 the user interacts with the configuration means 15 to configure the transaction item. The configuration options may relate to configurable elements on the selected vehicle, e.g. paint colour and finish, interior and exterior trim options etc. As different configuration options are selected the server 3 may output an updated representation of the transaction item for display on the display screen 19 of the computing device 11.
Once the user has configured the transaction item, the server 3 stores the configured transaction item, e.g. in the database 21, to allow later retrieval and, in step 205, generates an information element 23 that is linked to the configured transaction item data. The information element 23 may be in the form of an optical representation, examples of which are a barcode, such as a two-dimensional barcode, a QR code, a glyph or a dynamic optical encoding of content. The CRM means 17 may be arranged to generate the information element and to manage the link between the information element 23, the configured transaction item and the user details. The data associated with the configured transaction item that is stored in the database 21 comprises the transaction item selected by the user and the user-selected configuration options relating to that transaction item.
Once the configured transaction item has been linked to the information element the user is able to retrieve the configuration settings (the selected user configuration options) for the transaction element at a later point in time, in step 207, by scanning the information element.
For example, where the information element comprises an optical information element, the action of scanning may comprise placing the information element in the field of view of a camera or scanning with a barcode reader. In the example of an information element that is encoded in a near-field communication (NFC) device the action of scanning may comprise bringing an NFC reader into close proximity with the NFC device that stores the information element.
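As a rough sketch of how such an information element might carry only a reference that is resolved server-side (one of the two variants described later with reference to Figure 3), consider the following. The function names, the JSON payload and the in-memory store are all assumptions for illustration; the resulting string could be rendered by any barcode/QR library or written to an NFC tag.

```python
# Hypothetical sketch of linking an information element to stored
# configuration data; CONFIG_DB stands in for database 21 / CRM means 17.
import json
import uuid

CONFIG_DB = {}

def make_information_element(user_id, item, options):
    """Store the configured transaction item and return a compact payload
    suitable for encoding as a barcode/QR code or writing to an NFC tag."""
    ref = uuid.uuid4().hex
    CONFIG_DB[ref] = {"user": user_id, "item": item, "options": options}
    return json.dumps({"ref": ref})  # only a reference travels with the user

def retrieve_configuration(scanned_payload):
    """Called after the element is scanned, e.g. at a point of sale system."""
    ref = json.loads(scanned_payload)["ref"]
    return CONFIG_DB[ref]
```

The alternative variant, in which the element encodes the user data and configuration options directly, would simply serialise the whole record into the payload instead of a reference.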
Having scanned the information element 23, the configuration settings for the transaction item may be retrieved at a point of sale system from the database 21/CRM means 17 on the server 3 and the user may make a transaction to acquire the transaction item (step 209). In more detail, the data from the scanned information element 23 is received at the portal means 13 and passed to the CRM means 17, which can retrieve the selected configuration options for the transaction item from the database 21.
Although not shown in Figure 2 it is noted that a security check/validation step may be incorporated within the process flow of Figure 2 (for example within either step 209 or 211 (described below)) in which a user identification process is triggered and possibly a credit check. For example, a user may be required to scan an identification item (such as a driving licence) as part of the scanning step 207 in order to retrieve their configuration options. A credit check step may also be initiated, especially for high value transaction items, in which a user's ability to acquire the transaction item is determined and verified. This may be an automated credit check or may involve interaction with a human operator (who may be remotely located at, for example, a call centre). As well as determining the user's ability to acquire the transaction item such a credit check may also prevent a user from inadvertently acquiring a transaction item by accidentally selecting a purchase option.
In Step 211 the database 21 is updated (via the CRM means 17) with details of the transaction. For the user in question the database now stores details of a unique information element for that user, the transaction item (such as the vehicle model), the configuration settings for that vehicle and details of the transaction.
In Step 213 the information element may be used to access post-transaction services. For the vehicle transaction example this may comprise scanning the information element again to receive details of the progress of the vehicle build or to access servicing or maintenance options (e.g. the transaction element could be scanned and the user presented with the option of booking a regular service).
Figure 3 shows a system component of the transaction management system 1 of Figure 1 in more detail. In particular, Figure 3 shows the display system 5 of Figure 1 in greater detail. It can be seen that the display system comprises a display server 25 which includes a means for rendering 27 in the form of a render processor (it is noted that the display server 25 is labelled as "dealership server 25" in Figure 3. The transaction example described herein is a particular example of the application of the present invention; within such an example the display server may be referred to as a "dealership server", e.g. a server located at a car dealership. In other examples of the application of the present invention the feature 25 in Figure 3 may more generally be referred to as a "display server"). The display server is also in communication with a first display device 29, a first input device 31, second display devices 33 (which are represented in the figure by a portable tablet computing device such as an iPad® but which may be any suitable computing device such as a laptop, PC etc.) and a further display device 35 and further input device 37.
The display system 5 also includes an image capture device 39, such as a camera or barcode scanner, an audio output device 41, such as a loudspeaker or arrangement of speakers, and a data store 43.
One of the second display devices 33 shown in Figure 3 is depicted with a pair of image capture devices 34a and 34b. The device 34a may be represented by a forward facing camera for, for example, video calling. The device 34b may comprise a rear facing camera for taking photographs or video.
The display system 5 shown in Figures 1 and 3 is in communication with the server 3 and may receive from the server 3 data relating to the transaction item that the user has configured according to the process of Figure 2 above. Such data may comprise information to allow the render processor 27 to render a simulation/representation of the transaction item for display on the first display device 29 and/or the second display devices 33.
It is noted that the simulation of the transaction item that is displayed on the first display device 29 may be manipulated via the first input device 31. Suitable input devices include touchpads (which may be embedded within the display screen of the first display device or which may be a standalone input device in communication with the render processor 27), gesture recognition input devices (such as the Microsoft Kinect® system), speech recognition input devices, keyboard and mouse input devices etc. It is also noted that the second display devices 33 may also allow manipulation of the representation of the transaction item that is displayed, e.g. in the case of a tablet computing device the input may be received via a touchscreen, or by manipulation of the orientation of the device 33 relative to its environment, such as may be achieved using motion sensors integrated into the device 33 itself.
According to embodiments of the present invention, the display system 5 represents a location a user visits to interact with a computer generated simulation of the transaction item that they have configured according to the process depicted in Figure 2. For example, the display system 5 may represent an actual or a "virtual" car dealership where the user can view and interact with a near life-size rendering of the transaction item that they have configured.
The display system 5 may be located in the same physical location that the transaction item would normally be purchased from (e.g. it may be located in a car showroom, an actual car dealership) or alternatively it may be located in another environment (e.g. a shopping mall, an airport departure lounge etc., a "virtual" car dealership).
The display system 5 affords the user the opportunity to see a rendering, prior to purchase, of their selected and configured transaction item on a display device with superior display functionality to that of the computing device 11 on which they started the transaction lifecycle.
In embodiments of the invention the first display device 29 may comprise a high definition screen of sufficient dimensions to be able to display the transaction item on substantially life-size scale.
As noted above the transaction item may be configured by the user from the computing device 11 and data relating to the configured transaction item may be stored in the database 21. The display server 25 may retrieve this data using the information element 23 that is provided to the user at the end of the configuration process.
The information element 23 may be scanned by the image capture device 39 and the display server 25 may use the information encoded within the information element to contact the server 3 and request details of the transaction item that the user is interested in and the configuration settings/options for that item. Depending on the particular embodiment of the invention the information element may represent a unique identification code that is linked at the server 3 side to the user and their configured transaction item. Alternatively, the information element may encode user data, transaction item data and configuration options data.
The information element 23 may be generated by the server 3 or any one of the modules 13, 15, 17 comprising the server. The information stored on the information element 23 may be automatically associated with a user data record stored in either the server 3 and/or the database 21. The data record may further comprise user related data and/or data relating to the transaction item.
Prior to displaying a render of the transaction item on the first display device 29 or second display devices 33 the user may be able to fine tune the configuration of the transaction item via a further display device 35 and further input device 37. In one embodiment the first display device 29 and further display device 35 may be of similar dimensions to one another and be located side by side such that updates to the configuration of the transaction item can be "moved" from the further display device 35 to the high definition render of the transaction item on the first display device 29. In embodiments where the further input device 37 is a touchscreen within the further display device 35 then the "movement" of the updated configured transaction item may comprise the user "swiping" the updated configured transaction item across from the further display 35 to the first display device 29.
The audio output 41 may be used to simulate a sound environment normally associated with the transaction item. For example, where the transaction item is a vehicle then the sounds may comprise simulated traffic noise or muffled traffic noise if the interior of the vehicle is being displayed.
Figure 4 shows an embodiment of the present invention which depicts an arrangement of a first display device 29, a number of second display devices 33 and a further display device 35.
It can be seen that the first display device 29 is displaying a simulation of a vehicle 46 which a user may interact with via the input device 31 located above the screen area of the first display device.
In this embodiment a further display device 35 is provided which displays the configuration options/settings selected by the user from their computing device 11 (not shown in Figure 4) in steps 201 and 203 described above. These settings are retrieved from the server 3 upon presentation of an information element 23 in accordance with further embodiments of the present invention at the image capture device 39. The further display device 35 essentially comprises a large scale configuration screen which is touch enabled (input device 37) to allow the user to make further adjustments to their configuration settings before rendering the transaction item (vehicle) on the first display device 29 or to make further adjustments upon reviewing the simulation on the first display device 29.
In step 205 of Figure 2 above the server 3 generates an information element 23 (in response to the user's configuration of the transaction item in Step 203 of Figure 2) that is linked to the user's details and also to the configured transaction item that the user has configured via their user computing device 11.
Figure 5 is a flow chart of the process of interacting with elements of the system component (display system 5) shown in Figure 3.
In Step 221 a simulation of the transaction item (i.e. the object to be simulated) is generated by the render processor 27. In the embodiment depicted in Figure 3 the rendering means 27 is located within the display system 5. In alternative embodiments the rendering means may be located remote from the display system, for example in the server 3.
In step 223 the simulation is displayed on the first display device 29 and in step 225 the user may interact with the simulation shown on the first display device 29.
The simulation that is generated and rendered by the rendering means 27 may be a 3D simulation of the transaction item which is arranged to react to input from the input device 31 to simulate real world interactions with the transaction item (for example the vehicle orientation may be changed by moving relative to the first display device 29. The relative size of the simulated object may also be changed by moving further away from or closer to the first display device 29). In the example of a vehicle as the transaction item the simulation may respond to user input such that doors on the vehicle may be opened and closed within the simulation. The user may also be able to change the view provided on the first display device 29 such that the point of view of the simulation changes from an outside view of the vehicle to an inside view. The user may also interact with controls within the cockpit of the vehicle within the context of the simulation.
In Step 227 the user or another user may capture a representation of the simulation on the first display device 29 for display on a second display device 33 and in Step 229 the representation of the simulation may be displayed on the second display device 33. In Step 231 the user (or the other user) may interact with the representation of the simulation on the second display device 33.
The second display device 33 may comprise an image capture device of its own, e.g. a built in camera, to enable a representation of the simulation on the first display device to be captured (see, for example, feature 34b in Figure 3). The process of capturing the representation may comprise taking a photograph of the first display device using the second display device. The captured representation may then be manipulated on the second display device.
In step 225 (as described above) the user may interact with the simulation on the first display device 29.
Figure 6 shows a sequence of five views (29a to 29e) of the first display device 29 over time.
The first image in the sequence is at the top left of the figure and the final image in the sequence is at bottom right. The input device 31 for the first display device is shown above the display and, in the example shown, comprises a Kinect® style control device. The user 45 may therefore interact with the on-screen representation of the transaction item via a gesture control mechanism.
In the first view 29a in the sequence the display device 29 is showing a side view of a vehicle (the vehicle, in this context, representing the user configured transaction item 46). A first gesture 47 by the user 45 causes the image of the vehicle to rotate so that a rear view is shown, the second view 29b in the sequence. The user 45 then repeats the first gesture 47 to rotate the vehicle again so that a perspective side view is shown in view 29c.
As noted above the simulation is rendered such that real world interactions with the rendered object may be made. In view 29c some interaction prompt symbols 49 have been overlaid on the simulated object to indicate to the user 45 that they may interact with the simulation in some way. In the present example the symbols 49 are located over the vehicle doors to indicate to the user 45 that the doors of the vehicle may be opened. It is noted that the prompt symbols may not be visible at all times to the user and may only be displayed once the display server determines that they have overlooked certain interaction options available to them.
Alternative interaction prompt symbols may indicate to the user that they can switch the vehicle lights on and off within the simulation, that they can alter the seat tilt, that they can view the luggage bay (and potentially place virtual luggage within the bay), that they can turn the steering wheel and alter the ride height of the vehicle. Other interactions may be highlighted to the user depending on the configuration of the simulation.
The user 45 then performs a second gesture 51 which causes the doors of the vehicle to open (view 29d). A further overlay symbol 53 has appeared in view 29d to indicate that the user may enter the vehicle within the context of the simulation. The user then performs a third gesture 55 to enter the vehicle within the simulation (view 29e).
Although the embodiment shown in Figure 6 uses gesture controls, specifically arm movements of the user 45, to control the simulation on the first display device 29 it is to be appreciated that other control commands may be used and alternative control devices may be used.
For example, where a gesture based input device 31 is used, the simulation of the transaction item may respond to the user 45 physically moving position. For example, movement towards the screen may bring the simulated object closer, and movement to the left or right may initiate rotation of the simulated object.
Alternative input devices may comprise a voice input device so that the simulation can be manipulated by voice command, a control device that incorporates one or more motion sensors, a separate touchpad for touch based input etc. It is further noted that in alternative embodiments more than one input device type may be used to interact with the simulated object, e.g. a combination of gesture control as shown in Figure 6 plus voice control could be used such that the transition between views 29c and 29d could be accomplished by speaking the command "open doors" instead of performing gesture 51.
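A minimal sketch of how such heterogeneous input devices might be unified is given below; the event tokens and command names are invented for illustration, since the patent does not prescribe an input-mapping scheme.

```python
# Illustrative mapping from input events to shared simulation commands;
# all event tokens and command names here are assumptions.
SIMULATION_COMMANDS = {
    ("gesture", "swipe_horizontal"): "rotate_object",  # cf. gesture 47 in Figure 6
    ("gesture", "push_forward"):     "open_doors",     # cf. gesture 51 in Figure 6
    ("voice",   "open doors"):       "open_doors",     # spoken alternative to gesture 51
    ("voice",   "rotate"):           "rotate_object",
}

def dispatch(input_type, token, simulation):
    """Route an event from any input device to the shared command set.
    `simulation` is assumed to expose one method per command name."""
    command = SIMULATION_COMMANDS.get((input_type, token))
    if command is not None:
        getattr(simulation, command)()
```

Under this shape, adding a new input modality only means adding rows to the table, which is why the transition between views 29c and 29d can equally be driven by gesture 51 or by the spoken command "open doors".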
Steps 227 to 231 above describe how a second display device 33 may capture a representation of the simulated object from the first display device 29 and display that on the second display device 33 such that the user or a further user may interact with the representation.
Figure 7 is a flow chart showing a method of interacting with a simulated object according to an embodiment of the present invention. In Step 241 a simulation of an object to be displayed is generated, for example by the server computer 25. In the examples above the object is represented by a vehicle and the simulation is provided in support of a vehicle purchase. It is noted however that the present invention may be applied to any object that may be simulated and with which a user may want to interact. The simulation of the object may support a number of manipulations of the simulated object, e.g. rotation of the object, changing the view of the simulated object (such as from a perspective view to a top-down view), and zooming towards and away from the simulated object.
Manipulation of the simulated object may be via any suitable input control method, for example the gesture based input device 31 described above.
In addition to supporting manipulations of the simulated object a plurality of user-selectable interaction actions may be associated with the simulated object. Such interaction actions may comprise: changing the colour of the object, changing the object that is being simulated (for example the user may be able to switch vehicle models), calling up an information overlay, calling up an options menu (e.g. to change a configurable feature on the simulated object such as the wheel design, tyre selection or light array design), and moving inside the object (e.g. getting into the vehicle).
In Step 242 the simulated object may be displayed on the display device 29 and the user 45 may, in Step 243, interact with the simulated object 46.
In Step 244, the server computer 25 monitors which of the user-selectable options the user has selected. Such monitoring may be done in stages. For example, while the user is interacting with an outside view of the object the system 25 may not monitor the user-selectable options that are associated with the inside of the object. The system 25 may be configurable to have a list of user-selectable options which are prioritised over others, and if the user fails to interact with these options within a pre-determined time, the system 25 may automatically generate prompt(s) for the user to explore these options before they quit the interactive experience. In this way, the system 25 may generate user prompts to notify the user of the existence of key user-selectable options regardless of whether those options are in context with the current user interaction. To avoid causing confusion or irritation for the user, such prompts may take the form of an animated or text-based prompt presented on one of the displays 33, 35 not currently being used to present the simulated object 46.
In Step 245, the server computer identifies which interaction options are currently unused by the user 45. Again, the server computer may be arranged to identify those currently unused interaction options that are associated with a current view being displayed on the display device 29.
In Step 246, the server computer is arranged to prompt the user regarding at least one of the currently unused user-selectable interaction options. Such prompting may take a number of forms, including but not limited to one or more of the following: an audible alert (e.g. a pre-recorded message), an on-screen textual prompt (e.g. "click on X for more details regarding this feature") or an on-screen visual prompt (e.g. arrows, a highlighted area, a colour-changed area, an animated prompt icon) etc. In response to a prompt the user 45 may choose to ignore the prompt and continue manipulating the simulated object. Alternatively, the user 45 may select a user-selectable interaction option as suggested by the server computer 25. Following selection of a prompted option the server may update the list of currently unused user-selectable interaction options that are available to the user. As noted above, such an updated list may refer to a "global" list of all the available user-selectable interaction options or alternatively may refer to those currently unused interaction options that are associated with a current view being displayed on the display device 29.
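Steps 243 to 246 could be tied together in a loop along the following lines, reusing the ActionMonitor sketched earlier. This is a sketch only: every attribute of the `simulation` object used here (running, poll_user_input, visible_region, show_prompt) is an assumption rather than an interface defined by the patent.

```python
# Hypothetical monitor/identify/prompt cycle corresponding to Figure 7.
import time

def interaction_loop(simulation, monitor, min_prompt_interval_s=300):
    last_prompt = 0.0
    while simulation.running:
        event = simulation.poll_user_input()                 # step 243: user interaction
        if event is not None and event.kind == "action":
            monitor.mark_used(event.name)                    # step 244: record selections
        pending = monitor.unused(simulation.visible_region)  # step 245: unused actions in view
        if pending and time.time() - last_prompt >= min_prompt_interval_s:
            simulation.show_prompt(pending[0])               # step 246: prompt highest priority
            last_prompt = time.time()
```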
The prompting of unused and available user-selectable interaction options may be limited to a certain number of prompts in a given time period (e.g. to maintain the user experience any given user may only be prompted once every 5 minutes). The time period between prompts may be configurable either via a system administrator or by the user themselves.
The prompting of unused and available user-selectable interaction options may be limited with respect to the current view on the display device 29. There may also be a limit on the number of times a particular feature is prompted (e.g. prompt once per session or allow multiple prompts). Again such prompts may be configurable either via a system administrator or by the user themselves.
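These limits suggest a small, configurable prompt policy. The sketch below is one hypothetical shape for it; the defaults mirror the examples above (one prompt every five minutes, one prompt per feature per session), and the interaction loop sketched earlier could consult policy.allow(feature) before issuing a prompt, with both parameters exposed to the system administrator or the user.

```python
# Hypothetical prompt policy capturing the configurable limits described above.
import time
from collections import defaultdict

class PromptPolicy:
    def __init__(self, min_interval_s=300, max_per_feature=1):
        self.min_interval_s = min_interval_s    # e.g. at most once every 5 minutes
        self.max_per_feature = max_per_feature  # e.g. prompt once per session
        self._last_prompt = 0.0
        self._feature_counts = defaultdict(int)

    def allow(self, feature):
        """Return True if a prompt for this feature may be shown now."""
        now = time.time()
        if now - self._last_prompt < self.min_interval_s:
            return False
        if self._feature_counts[feature] >= self.max_per_feature:
            return False
        self._last_prompt = now
        self._feature_counts[feature] += 1
        return True
```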
The appearance of prompts relating to a particular view of the simulated object may also be delayed. For example, the user may initially view a particular view without the prompts being displayed and may then move on to another view of the object. On returning to the original view the server 25 may display interaction prompts to indicate that one or more interaction options were not accessed during the initial interaction with the original view.
Figure 8 shows one example of a prompt that has been presented to a user by the server computer 25. In Figure 8 the display screen 29 is showing an outside view of a vehicle 46 and the wheel design is associated with a user-selectable interaction action. The interaction action may, for example, be an information pop-up that comprises details on the particular wheel design that has been chosen as part of the vehicle configuration. Alternatively, or additionally, the interaction action may be a menu option for changing the selected wheel design.
In Figure 8 the prompt is provided visually on-screen to the user 45 in the form of a circular icon 300 that has appeared in the top right-hand side of the screen 29. The icon is then shown to travel 302 to a second location 304 over one of the wheels. It is noted that the icon 300 may not be configured to move over the simulated object 46 in all embodiments and may always appear in a fixed location on the screen. The user or a system administrator may, for example, disable the movement of the icon in the event that it is distracting or unwanted.
It is noted however that in the embodiment of Figure 8 the movement 302 of the icon to the relevant part of the simulated object 46 provides the user 45 with an indication of the type of interaction action available for selection without the user needing to actually select the icon 300. In this manner the user 45 may be able to filter out/ignore prompts that are not of interest to them without needing to open or interact with the prompt. The user's experience of interacting with the simulated object is thereby enhanced, since they do not have to break their train of thought to interact with a prompt that is associated with a feature they are not interested in.
Although a circular icon is shown in Figure 8 the skilled person will appreciate that other types of visual icons may be used, such as flashing/animated icons, manipulation of lighting effects or alternative shapes.
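The travelling icon of Figure 8 amounts to interpolating the icon's on-screen position from an idle corner towards the advertised feature. A trivial sketch follows, with all coordinates hypothetical:

```python
# Hypothetical linear tween for the animated prompt icon 300 of Figure 8.
def icon_position(start, target, t):
    """Position of the icon at animation progress t (0.0 at the idle corner,
    1.0 over the feature being advertised)."""
    (x0, y0), (x1, y1) = start, target
    return (x0 + (x1 - x0) * t, y0 + (y1 - y0) * t)

# e.g. halfway from the top right of the screen towards a wheel:
# icon_position((1800, 100), (650, 720), 0.5) -> (1225.0, 410.0)
```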
Figure 9 shows a user 45 interacting via a gesture control system 31 with the simulated object 46 on the display device 29. Four separate views of the screen are shown (29f, 29g, 29h, 29i). In view 29f the user is interacting with the displayed object 46. The server computer 25 then notes that certain elements of the simulation have not been selected by the user and in view 29g places an on-screen prompt 306 in the form of a text based instruction. In view 29h the user has ignored the instruction 306 and has continued to interact with the outside view of the vehicle 46. In view 29i however the user has followed the text based instruction and has opened the doors of the vehicle 46. By selecting this particular user-selectable interaction action 306 the user has been presented with a further on-screen prompt in the form of the arrow 308 indicating that the user is able to move "inside" the vehicle 46 within the context of the simulation.
In further embodiments the user may be prompted to interact with the simulation via an information element 23. For example, the user 45 may be able to unlock further interaction actions by scanning an information element or part of an information element 23 with the image capture device 39.

Claims (26)

  1. A method of managing user interaction with a simulated object, the method comprising: generating a simulation of the object, the simulation being capable of being manipulated by a user and being associated with a plurality of user-selectable interaction actions; displaying the simulation on a first display device; manipulating the simulation of the object in response to user inputs; monitoring which of the plurality of user-selectable interaction actions the user selects; identifying currently unused user-selectable interaction actions; and prompting the user regarding at least one of the currently unused user-selectable interaction actions.
  2. A method as claimed in Claim 1, wherein the first display device is in communication with rendering means, the rendering means rendering the simulation of the object.
  3. A method as claimed in Claim 1 or Claim 2, wherein the simulation is a three dimensional simulation of the object.
  4. A method as claimed in any preceding claim, wherein interacting with the simulation on the first display device comprises interacting with a gesture control input mechanism.
  5. A method as claimed in any preceding claim, wherein the simulation is animated so that interaction with the simulated object is arranged to mimic real world responses of the object.
  6. A method as claimed in any preceding claim, wherein manipulating the object in response to user inputs refers to spatial manipulation of the simulated object.
  7. A method as claimed in any preceding claim, wherein user-selectable interaction actions comprise one or more of the following actions: calling up information overlays; calling up menus; changing a viewpoint of the simulation; changing a parameter of the simulated object; changing the simulated object.
  8. A method as claimed in any preceding claim, wherein the user-selectable interaction actions are associated with a feature of the simulated object.
  9. A method as claimed in any preceding claim, wherein generating a simulation of the object comprises generating a list of all the user-selectable interaction actions that may be selected by the user for the simulated object.
  10. A method as claimed in Claim 9, wherein identifying currently unused user-selectable interaction actions comprises monitoring the complete list of all the user-selectable interaction actions against selected user-selectable interaction actions.
  11. A method as claimed in Claim 9 or Claim 10, wherein identifying currently unused user-selectable interaction actions comprises monitoring a subset of the complete list of all the user-selectable interaction actions against selected user-selectable interaction actions.
  12. A method as claimed in Claim 11, wherein the subset relates to user-selectable actions associated with a portion of the simulated object that is being displayed.
  13. A method as claimed in any preceding claim, wherein prompting the user comprises sending an audible alert prompt type to the user.
  14. A method as claimed in any one of Claims 1 to 13, wherein prompting the user comprises displaying a visual indication prompt type.
  15. A method as claimed in Claim 14, wherein the visual indication comprises a text based message to the user.
  16. A method as claimed in Claim 14 or Claim 15, wherein the visual indication comprises an icon based indication.
  17. A method as claimed in Claim 16, wherein the icon based indication is an animated indication which is arranged to move towards a feature of the simulated object which is associated with at least one of the currently unused user-selectable interaction actions.
  18. A method as claimed in any one of Claims 12 to 17, wherein the prompt type is configurable by a system administrator or by the user manipulating the simulated object.
  19. A method as claimed in Claim 18, wherein the frequency of the appearance of the prompt type is configurable by the system administrator or by the user manipulating the simulated object.
  20. A method as claimed in any preceding claim, comprising updating a list of currently unused user-selectable interaction actions once a user has selected one or more of the actions from the list.
  21. A method as claimed in any preceding claim, wherein the user-selectable interaction actions are associated with a feature of the simulated object and the method further comprises initially displaying the feature to the user without prompting the user regarding at least one of the currently unused user-selectable interaction actions and subsequently prompting the user of currently unused user-selectable interaction actions at a later point in time.
  22. A method as claimed in Claim 21, wherein the user views an intermediate view of the simulated object in which the feature is not visible before subsequently returning to the initial view of the feature at the later point in time.
  23. A display system for interacting with an object, the system comprising: a first display device; and a computing device arranged to: generate a simulation of the object, the simulation being capable of being manipulated by a user and being associated with a plurality of user-selectable interaction actions; display the simulation on the first display device; manipulate the simulation of the object in response to user inputs; monitor which of the plurality of user-selectable interaction actions the user selects; identify currently unused user-selectable interaction actions; and prompt the user regarding at least one of the currently unused user-selectable interaction actions.
  24. A carrier medium for carrying a computer readable code for controlling a server to carry out the method of any one of Claims 1 to 22.
  25. A method of managing user interaction with a simulated object as hereinbefore described with reference to the accompanying figures.
  26. A display system for interacting with an object as hereinbefore described in relation to the accompanying figures.
GB1301814.8A 2013-02-01 2013-02-01 System and method for managing interaction with a simulated object Withdrawn GB2512571A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1301814.8A GB2512571A (en) 2013-02-01 2013-02-01 System and method for managing interaction with a simulated object
PCT/EP2014/051299 WO2014118072A1 (en) 2013-02-01 2014-01-23 System and method for managing interaction with a simulated object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1301814.8A GB2512571A (en) 2013-02-01 2013-02-01 System and method for managing interaction with a simulated object

Publications (2)

Publication Number Publication Date
GB201301814D0 GB201301814D0 (en) 2013-03-20
GB2512571A true GB2512571A (en) 2014-10-08

Family

ID=47988559

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1301814.8A Withdrawn GB2512571A (en) 2013-02-01 2013-02-01 System and method for managing interaction with a simulated object

Country Status (2)

Country Link
GB (1) GB2512571A (en)
WO (1) WO2014118072A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201707595D0 (en) * 2017-05-11 2017-06-28 Howden Joinery Group Plc Swatch display
CN110874781A (en) * 2018-09-03 2020-03-10 上海汽车集团股份有限公司 Automobile matching method and system
CN111062767A (en) * 2018-10-16 2020-04-24 上海汽车集团股份有限公司 Full-process C2B intelligent matching method and system
TWI763380B (en) * 2021-03-17 2022-05-01 同致電子企業股份有限公司 Method for achieving interactions between user and automobile

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070118431A1 (en) * 2005-11-21 2007-05-24 Johansson Mikael I System for configuring a chemical separation system
US20070156540A1 (en) * 2006-01-05 2007-07-05 Yoram Koren Method and apparatus for re-configurable vehicle interior design and business transaction
US20080177639A1 (en) * 2007-01-19 2008-07-24 Marc Kuppersmith System and method for facilitating the retail sale of customizable products

Also Published As

Publication number Publication date
GB201301814D0 (en) 2013-03-20
WO2014118072A1 (en) 2014-08-07

Similar Documents

Publication Publication Date Title
JP7461941B2 (en) Optimizing virtual data views using voice commands and defined perspectives
US10109041B2 (en) Method of interacting with a simulated object
US20200310624A1 (en) Augmentable and spatially manipulable 3d modeling
KR101803168B1 (en) Data manipulation based on real world object manipulation
CN105229566B (en) Indicating observations or visual patterns in augmented reality systems
US20140025529A1 (en) Systems and Methods for Generating Three-Dimensional Product Configuration
CN105191330A (en) Display apparatus and graphic user interface screen providing method thereof
US20080103913A1 (en) System and method for guided sales
WO2014118072A1 (en) System and method for managing interaction with a simulated object
KR102121107B1 (en) Method for providing virtual reality tour and record media recorded program for implement thereof
US9299097B2 (en) Information element
US20150242920A1 (en) System and method for managing lifecycle of a transaction item

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)