GB2532010A - Display method and device - Google Patents

Display method and device

Info

Publication number
GB2532010A
GB2532010A (application GB1419622.4A)
Authority
GB
United Kingdom
Prior art keywords
content
display
user
user input
displayed
Prior art date
Legal status
Withdrawn
Application number
GB1419622.4A
Other versions
GB201419622D0 (en)
Inventor
Scio Edward
Current Assignee
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to GB1419622.4A priority Critical patent/GB2532010A/en
Publication of GB201419622D0 publication Critical patent/GB201419622D0/en
Publication of GB2532010A publication Critical patent/GB2532010A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Abstract

Disclosed is a content display method for small computing devices. First content is displayed on a screen, and user input is received to change the displayed content, the user input taking the form of relative movement between a user of the device and the device itself. The first content is then processed, based on the user input, to produce and display second content, which is a modification of the first content. The first content fills the screen, while the second content extends beyond the extent of the screen. The first content is processed to produce a magnified virtual version of the content, part of which is visible through the display screen. Further input is received to navigate around the enlarged virtual first content.

Description

Display Method and Device The present invention relates generally to a display method and device.
Traditionally, user interaction with a display device would have taken the form of one or more button presses, or inputs via a wired or wireless remote control. More recently, however, user interaction has become more versatile, and has taken forms such as touch input, gesture input, and the like. Such advances have led to a more immersive or enhanced user experience. At the same time as developments in user input have occurred, the display devices themselves have also changed accordingly. For instance, sophisticated display devices are now found in portable formats, which include wearable formats.
Although not always the case, the portable and/or wearable nature of the device restricts its screen size. The restriction in screen size imposes, of course, a restriction on the visibility of content that can be displayed on that screen. As a result, content may be too small to easily read or understand, or even interact with, which might be impractical for the user. A result might be a requirement for significant and repeated user input to manipulate the content or its display to improve ease of viewing. In one example, multiple and repeated touch inputs or gestures might be required in order to zoom in and out of content that is of interest to the user.
In a related problem, content may be specifically formatted for use on a particular device, for example a device with a particular screen size, format or aspect ratio. Again, in order to interact with this content the user may require multiple or repeated touch inputs, for example to select, access or search through on-screen menus or items or the like.
It is an example aim of example embodiments of the present invention to at least partially obviate or mitigate one or more disadvantages as described above, or as present elsewhere in the prior art, or to provide an alternative to existing devices and methods.
According to the present invention there is provided an apparatus and method as set forth in the appended claims. Other features of the invention will be apparent from the dependent claims, and the description which follows.
According to a first aspect of the present invention there is provided a content display method for a display device, comprising: displaying first content on a display screen; receiving user input to change the display of content, the user input being in the form of relative movement between a user of the device and the device itself; processing the first content to provide second content, the second content being a modification of the first content; and displaying the second content.
The first content may be arranged to be displayed in a format wherein, without processing, the first content would extend to an extent of the display screen.
The second content may, in its entirety, extend beyond an extent of the first content and/or the display screen.
The processing may comprise: magnification of a part or all of the first content; and/or addition of content at one or more boundaries of the first content.
The processing may be such that the second content forms, or provides the effect of forming, a larger or magnified virtual screen visible through the display screen.
The method may comprise receiving further user input in the form of relative movement between a user of the device and the device itself, in order to navigate about and/or selectively display different parts of the first content and/or second content.
The second content: may be stored for subsequent and selective display; or may be displayed on-the-fly (i.e. in substantially real time) as a result of the processing.
The processing may be undertaken: before displaying the first content; or after the displaying of first content, and before receiving of user input; or after receiving user input to change the display of content.
The method may comprise receiving an indication from the user that first content is to be displayed, or that second content is to be displayed. That is, the user may switch between first and/or second content viewing modes or formats.
A configuration of the display of second content may be dependent on the content.
The configuration may comprise or result in one or more of: ensuring that an entire image forming at least a part of the second content is visible, by an appropriate degree of magnification; or ensuring that an entire section of text forming at least a part of the second content is visible by an appropriate degree of magnification; or ensuring that an entire section of text forming at least a part of the second content is visible by re-configuring the text.
At least part of a displayed or non-displayed portion of the second content may comprise at least a part of the first content, in modified or unmodified form.
User input in the form of relative movement between a user of the device and the device itself may comprise one or more of movement of a user's head relative to the device; and/or movement of at least a part of a user's face relative to the device; and/or movement of a user's eye or eyes relative to the device; and/or a change in orientation and/or position of the device.
According to a second aspect of the present invention there is provided a display device, comprising: a display screen arranged to display first content; a user interface arranged to receive user input to change the display of content, the user input being in the form of relative movement between a user of the device and the device itself; a processor arranged to process the first content to provide second content, the second content being a modification of the first content; and the display screen being arranged to display the second content.
The device may be a user-wearable device, and/or a portable device.
It will be appreciated that one or more features of one or more aspects of the present invention may be used in combination with, and/or in place of, one or more features of one or more other aspects of the present invention, unless such a combination and/or replacement would be understood by the skilled person to be clearly mutually exclusive from a reading of this disclosure.
For a better understanding of the invention, and to show how embodiments of the same may be carried into effect, reference will now be made, by way of example, to the accompanying diagrammatic Figures in which: Figures 1 and 2 schematically depict a user interacting with a display device; Figures 3 to 5 schematically depict a user interacting with a display device in order to modify displayed content, in accordance with an example embodiment; Figures 6 and 7 schematically depict further interactive principles associated with
example embodiments;
Figures 8 to 10 schematically depict configuration of content for display to a user, the configuration being dependent on the content, according to an example embodiment; Figures 11 and 12 depict a different modification of content according to another example embodiment; and Figure 13 schematically depicts a combination of content modifications according to another example embodiment.
Figure 1 schematically depicts a display device 2. In this example, the display device takes the form of a mobile telephone. The display device 2 includes a touch sensitive display screen 4 for displaying content 6. A finger 8 of a user is shown, and the user may provide input to the display device 2 via a touch input using the finger 8 via the display screen 4.
Figure 2 shows how touch input of one form or another, for example a screen tap, screen press, or a pinch touch gesture, or the like, may be used to zoom-in on content of interest 10. Zooming in on the content of interest 10 may allow for easier user interaction with that content 10, for example viewing or reading that content.
Although the sort of user interaction depicted in Figures 1 and 2 works well in many examples, and has done so for many years, developments in technology and user experience or expectations mean that alternative and/or better ways of interacting with devices might be beneficial. For instance, referring to Figures 1 and 2 in combination, it can be seen that the size of the display screen 4 is not significantly bigger than the user's finger 8. For coarse or rudimentary touch input, this may not be a problem. However, when the display content 6 includes fine or small details, or for example small objects or menus to interact with, it may be difficult to strike a good balance between display of content, and interaction with that content, when the display screen is limited in size. This is particularly true of portable and/or wearable devices.
Example embodiments of the present invention at least partially obviate or mitigate disadvantages of the prior art, or at least provide an alternative to existing display devices and methods. According to one aspect, the present invention relates to displaying first content on a display screen of a display device. The display device receives user input in order to change the display of content in some way. The user input is in the form of relative movement between the user of the device and the device itself, which negates the need for touch input (or solely touch input) and the related problems of touch manipulation on small and/or difficult to engage with display screens. The first content is processed to provide second content, the second content being a modification (i.e. not a movement, or just a movement, for example a scroll) of the first content. Typically, the modification will allow the first content to be more easily viewed, or will be an addition to the first content to functionally improve the user experience. That second content is then displayed to the user. Overall, the example embodiment improves upon, or at least provides an alternative to, existing methods of display device interaction, and leads to it being easier to interact with content on the device.
Example embodiments will now be described in more detail with reference to Figures 3 to 13. In those Figures, the same features have been given the same reference numerals for consistency and clarity. The Figures are not drawn to any particular scale, and are simply given as an aid to understanding underlying concepts.
Figure 3 schematically depicts a display device 20 according to an example embodiment. In this example, the display device takes the form of a mobile telephone 20.
However, the display device could, in other examples, take different forms, for example any portable (which includes wearable) display device, such as a watch, a bracelet or ankle bracelet, glasses, a tablet, a laptop, and the like. The display device is not restricted to being portable or wearable, but such functionality generally imposes a practical limit on screen size, and example embodiments are well suited to solving problems associated with such limits.
The display device 20 comprises a user interface arranged to receive user input to change a display of content 22 displayed on a display screen 24 of the device 20. User input is in the form of relative movement between the user (which includes a part thereof) and the device itself. In this example, the user interface comprises one or both of a camera 26 and a device movement sensor 28. The camera 26 may be for tracking movement of the user relative to the device 20. The device movement sensor 28 may be in the form of one or more gyroscopes or accelerometers 28, for detecting movement of the device 20, which includes changes in position and/or orientation of the device.
Figure 3 shows that the content 22, conveniently denoted as first content 22, is arranged to be displayed in a format that extends to an extent of the display screen 24. That is, the content may be a web page, menu or picture which fills or extends to a display screen 24 length or width.
As described above, the content 22 may be difficult for the user to easily interact or engage with, for example being difficult to see without zooming in on certain content by appropriate touch input or similar. In accordance with an example embodiment, the first content may therefore be modified in some way to improve the user experience of viewing, or engaging with, the content 22. The user may provide an indication to the device that such modification to provide second content for display is desired, for example entering (or indeed leaving) a modified or second content mode. This indication may be in any convenient form, for example a button press, a particular gesture, voice control, a particular orientation of the device relative to the user, or the like. Alternatively, the device may always be in this mode, or have this mode set as a default, or have this mode controllable in advance from general user preferences or settings.
Figure 4 shows the situation where particular content 30 is deemed of interest to the user. This may be determined by the camera 26 determining that the movement of the user's eye (denoted by zone 32) results in a particular focus on a particular region of the display screen 24. This might be a combination of eye-tracking (e.g. gaze location) over a period of time, with a sustained period of time spent in one area being deemed to indicate user interest in that area. Once this user input is received, in the form of relative movement between the user and the device 20, second content is displayed to the user, the second content being a modification of the first content. Figure 5 shows that, in this example, the second content 34 is modified by a magnification of at least part of (or all of) the first content, to allow the user to more easily see the content of interest 34.
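The dwell-based determination of user interest described above could be sketched as follows. The specification does not prescribe an implementation, so the class name, thresholds and sampling interface here are purely illustrative assumptions: gaze samples are fed in as timestamped screen coordinates, and a region is deemed "of interest" once the gaze has remained within a small radius for the dwell period.

```python
from collections import deque


class DwellDetector:
    """Deems a screen region 'of interest' when gaze samples remain
    within radius_px of one point for at least dwell_seconds.
    (Illustrative sketch only; names and thresholds are assumptions.)"""

    def __init__(self, dwell_seconds=1.0, radius_px=40.0):
        self.dwell_seconds = dwell_seconds
        self.radius_px = radius_px
        self.samples = deque()  # (timestamp, x, y) gaze samples

    def feed(self, t, x, y):
        """Add a gaze sample; return the (x, y) of the region of
        interest once the dwell condition is met, else None."""
        self.samples.append((t, x, y))
        # Discard history no longer needed to cover the dwell window.
        while len(self.samples) > 1 and t - self.samples[1][0] >= self.dwell_seconds:
            self.samples.popleft()
        t0, x0, y0 = self.samples[0]
        if t - t0 < self.dwell_seconds:
            return None  # not enough gaze history yet
        # Every sample in the window must lie close to the oldest one.
        if all((sx - x0) ** 2 + (sy - y0) ** 2 <= self.radius_px ** 2
               for _, sx, sy in self.samples):
            return (x0, y0)
        return None
```

A steady gaze triggers detection after the dwell period elapses, while a gaze that wanders outside the radius never does; the detected coordinates would then drive the magnification step described above.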
In this example, of course, the second content 34 includes at least a part of the first content. Typically, this will be the case in most examples. The processing required to modify the first content to result in the second content does, in this example, comprise a magnification by rendering or otherwise of the first content. The processing could be undertaken after receiving user input to change the display of content. For instance, the processing could be undertaken in real-time, or in other words on-the-fly. In other examples, the processing could be undertaken before the first content is even displayed, for example comprising a form of preprocessing ready for subsequent display. The processing might take place after displaying of the first content, and before receiving of the user input, for similar reasons. In a related manner, the second content may be displayed on-the-fly as a result of the processing, or may be stored for subsequent and selective display as and when necessary. Different implementations might be more or less advantageous in different circumstances, for example depending on the pre-processing ability of the device, the processing power of the device, available system resources, the nature of the content and so on.
Processing of the content may be undertaken by a processor of the device (e.g. a CPU, a GPU, and the like), or a processor in connection with the device (e.g. wired or wireless connection). That is, the processing may be undertaken remote from the device, the processing and/or access to content being achieved via a server or similar.
It has already been described that the first content may typically be arranged to be displayed in a format wherein, without processing for modification, the first content will extend to an extent of the display screen. With modification to form second content, the second content might, in its entirety, extend beyond an extent of the first content and/or the display screen. For instance, this might be the case if the second content is, in some way, a magnification of some or all of the first content.
Figure 5 shows second content 34 that is deemed of interest to the user by appropriate tracking of the user's eyes. In some examples, the user input and appropriate display of second content may effectively terminate at this point, and for example the display screen may revert to the initial display of first content as shown in Figures 3 or 4, ready for subsequent detection of movement and appropriate modification and display of second content. This might be achieved by the user indicating in some way that they want to return to the first content view mode, as discussed above. However, Figures 6 and 7 show a related example, which may be viewed as an extension of the examples shown in Figures 3 to 5.
Figures 6 and 7 show that when in the enlarged/magnified modified second content view, the second content may be navigated around and/or be controlled to selectively display different parts of the second content by further user input in the form of relative movement between the user and the device. For example, Figures 6 and 7 show how the detected face or head position of a user 36 may be used to navigate about and/or selectively display different parts of the second, modified, content 38. To this extent, Figures 6 and 7 show how the processing of the first content is such that the modified, second content forms, or provides the effect of forming, a larger or magnified virtual screen that is visible through the physical display screen 24. In other words, the embodiment provides the illusion that a larger, and in some instances a far larger, screen is visible through the smaller physical display screen. This sort of implementation may find particular advantage when the display screen is particularly small, for example in the form of a watch face, or a screen on a bracelet or similar. The user can enjoy the benefits of a large virtual screen, using only a small physical screen, and does not have the problem of interacting with that small screen using (solely) potentially problematic touch input, but instead interacts using relative movement.
In Figures 6 and 7, the dashed outline of content 38 shows non-displayed portions of the second content that may be visible to the user by appropriate input in the form of relative movement between the device and the user.
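The virtual-screen navigation of Figures 6 and 7 amounts to mapping the detected head displacement to the position of the physical-screen viewport on the larger virtual screen. A minimal sketch, under the assumption that head displacement is already expressed in screen-pixel units and that a gain factor amplifies small movements (both assumptions; the patent specifies neither):

```python
def viewport_origin(head_dx, head_dy, screen_w, screen_h,
                    virtual_w, virtual_h, gain=3.0):
    """Map relative head displacement to the top-left corner of the
    physical-screen viewport on the larger virtual screen."""
    # Centre the viewport on the virtual screen when the head is centred.
    cx = (virtual_w - screen_w) / 2 + gain * head_dx
    cy = (virtual_h - screen_h) / 2 + gain * head_dy
    # Clamp so the viewport never leaves the virtual screen.
    cx = max(0.0, min(cx, virtual_w - screen_w))
    cy = max(0.0, min(cy, virtual_h - screen_h))
    return cx, cy
```

The clamping reflects the dashed outline of content 38: only the portion of the second content under the viewport is displayed, and the remainder becomes visible as the user moves relative to the device.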
In some examples, the processing and modification of the first content may be set to some sort of default. For instance, the second content may be a certain default magnification of the first content, and the user may change the degree of magnification, either continuously or in a step-wise manner, with appropriate user input in the form of voice control, gesture, touch, movement or so on. The degree of magnification may be user controllable on-the-fly or set in advance in settings of the device. If, for instance, the modification or magnification is pre-set in advance, and fixed, Figure 8 depicts a potential problem.
In Figure 8, it can be seen that of the modified, second content 38, content of particular interest 40 (for example, selected for display on the display screen 24 by relative movement between the user and the device 20 as described previously) extends beyond an extent of the display screen 24. This may not always be a significant problem. For instance, the user may get the gist of the content from that portion displayed on the display screen 24, and to the satisfaction of the user. However, it is conceivable that such partial display of content may be more problematic, for example when the content comprises text or similar. To therefore overcome this problem, in one example embodiment a configuration of the display of the second content may in fact be dependent on that content, for example the nature or type of the content. For instance, the configuration may comprise or result in one or more of ensuring that an entire image forming at least a part of the second content is visible on the display screen by an appropriate degree of magnification of the first content; or ensuring that an entire section of text forming at least a part of the second content is visible by an appropriate degree of magnification of the first content; or ensuring that an entire section of text forming at least a part of the second content is visible by re-configuring the text.
Figure 9 shows an example where magnification to fit has been employed, to ensure that content of interest 40 is fully visible.
Figure 10 shows an example where the content 40 has been re-configured, for example changing the format of the content to make it easier to display the content on the display screen. In the example of Figure 10, this is achieved by splitting the content 40. Splitting may be more appropriate for certain content (e.g. text) than other content (e.g. pictures). When the content comprises text, the configuring or re-configuring of the text could be a change of font, or a change of font size, or a changing of the text layout, for example splitting a first section of text into two smaller sections, for example a large column into two smaller columns.
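The two content-dependent configuration strategies of Figures 9 and 10 can be sketched as a pair of helper functions: one caps the magnification so an image or text block still fits the screen, the other splits a tall text column into shorter columns. Function names and the line-based text model are illustrative assumptions, not anything prescribed by the specification:

```python
def fit_scale(content_w, content_h, screen_w, screen_h, requested_scale):
    """Return the largest magnification, not exceeding requested_scale,
    at which the whole content region still fits on the screen
    (the magnify-to-fit configuration of Figure 9)."""
    max_scale = min(screen_w / content_w, screen_h / content_h)
    return min(requested_scale, max_scale)


def split_column(lines, columns=2):
    """Split one tall column of text lines into `columns` shorter
    columns of roughly equal height (the re-configuration of Figure 10)."""
    per = -(-len(lines) // columns)  # ceiling division
    return [lines[i:i + per] for i in range(0, len(lines), per)]
```

Which helper is applied would depend on the nature of the content, as described above: magnify-to-fit suits images, while splitting suits text that would otherwise overflow the screen.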
The configuring or re-configuring of the display of the second content could be part of the processing of the first content, or could be a separate stage or separate step. Typically, the configuration or re-configuration of the display of second content will have at least some dependence on the first content, on the basis of which the second content is formed for display.
Figures 3 to 10 have shown how this first content may be modified by appropriate magnification to form the second content. Figure 11 shows that, in a related example, the second content may comprise first content 22, and additional content 42 located at one or more boundaries of the first content 22. The additional content 42 may enhance user functionality, and might include menu items, icons, information related to the first content, and so on. Again, this provides an enhanced user experience providing modified, second content that can be viewed by relative movement between the user and the device, as depicted in Figure 12. In other words, the user experience is again not limited by the physical screen size.
Figure 13 shows that the magnification modification of Figures 3 to 10 can be combined with the addition to content modification of Figures 11 and 12.
In all the above examples, first content is modified to provide modified second content that in some way enhances the user experience. This may be achieved either by the addition of content or by the magnification of content, in combination with accessing, or viewing, or navigating about that second content by relative movement between the user of the device and the device itself. This means that the user enjoys the benefit of what appears to be a greater (virtual) screen size without having to increase the physical size of the screen, and without also needing to interact with the screen in a button press or other touch or tactile method.
The present invention may find use in any display device, but may find particular use in display devices of a more portable nature, which includes, in particular, wearable devices. This is because the screen size of the devices will typically be limited to achieve the desired portability. Examples of such devices include watches, wrist or ankle bracelets, glasses, tablets, mobile telephones, laptops and even projection systems. For projection systems this may include a screen on the projector itself, or even a way of interacting with content projected by the projector, the user interface being located in the projection system, or in connection with the projection system.
User input in the form of relative movement between the user of the device and the device itself might comprise one or more of: movement of a user's head relative to the device; and/or movement of at least a part of a user's face relative to the device (e.g. a facial gesture or similar, or a change in orientation of the face); and/or movement of a user's eye or eyes relative to the device; and/or a change in orientation and/or position of the device relative to the user. In some examples, the user inputs as previously described may be used in the alternative or in combination, or even in a cascading manner. For instance, example embodiments as described above may be implemented by using eye tracking. If eye tracking is unavailable, or is not functioning as intended, then the user could provide the relevant input by appropriately moving the display screen, for example changing its orientation.
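The cascading use of input sources described above can be sketched as a prioritized fallback chain. The reading interface here, callables returning a movement delta or None when a source is unavailable, is an illustrative assumption:

```python
def relative_movement(sources):
    """Return the first available movement reading from a prioritized
    list of input sources (e.g. eye tracker, then face tracker, then
    device motion sensor), cascading to the next source whenever one
    is unavailable or not functioning."""
    for read in sources:
        reading = read()  # (dx, dy) movement delta, or None
        if reading is not None:
            return reading
    return (0.0, 0.0)  # no usable source: report no movement
```

In use, the eye tracker would sit first in the list, so device-orientation input is consulted only when eye tracking fails, matching the cascade described in the paragraph above.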
Although a few preferred embodiments have been shown and described, it will be appreciated by those skilled in the art that various changes and modifications might be made without departing from the scope of the invention, as defined in the appended claims.
Attention is directed to all papers and documents which are filed concurrently with or previous to this specification in connection with this application and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference.
All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.
Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
The invention is not restricted to the details of the foregoing embodiment(s). The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.

Claims (16)

  1. A content display method for a display device, comprising: displaying first content on a display screen; receiving user input to change the display of content, the user input being in the form of relative movement between a user of the device and the device itself; processing the first content to provide second content, the second content being a modification of the first content; and displaying the second content.
  2. The method of claim 1, wherein the first content is arranged to be displayed in a format wherein, without processing, the first content would extend to an extent of the display screen.
  3. The method of claim 1 or claim 2, wherein the second content would, in its entirety, extend beyond an extent of the first content and/or the display screen.
  4. The method of any preceding claim, wherein the processing comprises: magnification of a part or all of the first content; and/or addition of content at one or more boundaries of the first content.
  5. The method of any preceding claim, wherein the processing is such that the second content forms, or provides the effect of forming, a larger or magnified virtual screen visible through the display screen.
  6. The method of any preceding claim, wherein the method comprises receiving further user input in the form of relative movement between a user of the device and the device itself, in order to navigate about and/or selectively display different parts of the first content and/or second content.
  7. The method of any preceding claim, wherein the second content: is stored for subsequent and selective display; or is displayed on-the-fly as a result of the processing.
  8. The method of any preceding claim, wherein the processing is undertaken: before displaying the first content; or after the displaying of first content, and before receiving of user input; or after receiving user input to change the display of content.
  9. The method of any preceding claim, wherein the method comprises receiving an indication from the user of whether first content or second content is to be displayed.
  10. The method of any preceding claim, wherein a configuration of the display of second content is dependent on the content.
  11. The method of claim 10, wherein the configuration comprises or results in one or more of: ensuring that an entire image forming at least a part of the second content is visible, by an appropriate degree of magnification; or ensuring that an entire section of text forming at least a part of the second content is visible by an appropriate degree of magnification; or ensuring that an entire section of text forming at least a part of the second content is visible by re-configuring the text.
  12. The method of any preceding claim, wherein at least part of a displayed or non-displayed portion of the second content comprises at least a part of the first content, in modified or unmodified form.
  13. The method of any preceding claim, wherein user input being in the form of relative movement between a user of the device and the device itself comprises one or more of: movement of a user's head relative to the device; and/or movement of at least a part of a user's face relative to the device; and/or movement of a user's eye or eyes relative to the device; and/or a change in orientation and/or position of the device.
  14. A display device, comprising: a display screen arranged to display first content; a user interface arranged to receive user input to change the display of content, the user input being in the form of relative movement between a user of the device and the device itself; a processor arranged to process the first content to provide second content, the second content being a modification of the first content; and the display screen being arranged to display the second content.
  15. The display device of claim 14, wherein the device is a user-wearable device, and/or a portable device.
  16. A method or a display device substantially as described herein, substantially as described herein with reference to the accompanying Figures, or substantially as shown in the accompanying Figures.
GB1419622.4A 2014-11-04 2014-11-04 Display method and device Withdrawn GB2532010A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1419622.4A GB2532010A (en) 2014-11-04 2014-11-04 Display method and device

Publications (2)

Publication Number Publication Date
GB201419622D0 GB201419622D0 (en) 2014-12-17
GB2532010A true GB2532010A (en) 2016-05-11

Family

ID=52118676

Country Status (1)

Country Link
GB (1) GB2532010A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002007027A (en) * 2000-06-27 2002-01-11 Masanori Idesawa Image information display device
US20020140666A1 (en) * 2001-03-29 2002-10-03 Bradski Gary R. Intuitive mobile device interface to virtual spaces
JP2005215031A (en) * 2004-01-27 2005-08-11 Sony Corp Display apparatus and display control method, recording medium and program
WO2006036069A1 (en) * 2004-09-27 2006-04-06 Hans Gude Gudensen Information processing system and method
US20100174421A1 (en) * 2009-01-06 2010-07-08 Qualcomm Incorporated User interface for mobile devices
US20110209090A1 (en) * 2010-02-19 2011-08-25 Sony Europe Limited Display device
US20120194692A1 (en) * 2011-01-31 2012-08-02 Hand Held Products, Inc. Terminal operative for display of electronic record
US20120310588A1 (en) * 2011-05-31 2012-12-06 Hon Hai Precision Industry Co., Ltd. Electronic device and display adjustment method
US20140002351A1 (en) * 2012-07-02 2014-01-02 Sony Computer Entertainment Inc. Methods and systems for interaction with an expanded information space
US20140009504A1 (en) * 2012-07-09 2014-01-09 Hon Hai Precision Industry Co., Ltd. Handheld device and method for displaying software interface

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108121491A (en) * 2017-12-18 2018-06-05 威创集团股份有限公司 A kind of display methods and device
CN108121491B (en) * 2017-12-18 2021-02-09 威创集团股份有限公司 Display method and device

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)