GB2496134A - Touch sensitive, interactive image system - Google Patents

Touch sensitive, interactive image system

Info

Publication number
GB2496134A
Authority
GB
United Kingdom
Prior art keywords
text
image
information
tactile
computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1118804.2A
Other versions
GB201118804D0 (en)
Inventor
Julian Warwick Coleman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEXUS ALPHA LOW POWER SYSTEMS Ltd
Original Assignee
NEXUS ALPHA LOW POWER SYSTEMS Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEXUS ALPHA LOW POWER SYSTEMS Ltd filed Critical NEXUS ALPHA LOW POWER SYSTEMS Ltd
Priority to GB1118804.2A priority Critical patent/GB2496134A/en
Publication of GB201118804D0 publication Critical patent/GB201118804D0/en
Publication of GB2496134A publication Critical patent/GB2496134A/en
Withdrawn legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B41 - PRINTING; LINING MACHINES; TYPEWRITERS; STAMPS
    • B41M - PRINTING, DUPLICATING, MARKING, OR COPYING PROCESSES; COLOUR PRINTING
    • B41M3/00 - Printing processes to produce particular kinds of printed work, e.g. patterns
    • B41M3/16 - Braille printing

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An image (4), such as a tactile map, is made interactive by placing the image on a touch sensor panel (5) such as a projected capacitance touch panel. The touch panel is connected to a control circuit such as a computer which is arranged so as to sense touches on the image and may trigger events as may be programmed into the computer. The system may be reprogrammed and updated, over network links where available, thus providing a rapid and cost effective way to manage changes. A typical use would be a tactile map of the locale, where the playback of audio files provides a visually impaired user with access to information such as local facilities, transport services etc. Where the computer is connected to a network, interactive services such as live transport information may also be provided. Inputs and outputs are provided so as to support the attachment of third party systems designed to support the visually impaired.

Description

A METHOD TO PROVIDE PRINTED AND TACTILE SURFACES WITH A PROGRAMMABLE TOUCH RESPONSE FACILITY FOR TRIGGERING RESPONSES.
The present invention relates to a method which uses a projected capacitance type touch panel which is placed behind a tactile or printed image in such a way that touches on the image can be sensed and used to generate responses. This technique is of particular use for the generation of interactive tactile maps for use by the visually impaired.
Various types of tactile images are known. Typically an image is created using layers of plastic bonded together to form a three dimensional representation of a location such as a building, public facility or a public space such that the visually impaired can feel the geography of that location. Symbols may be bonded to parts of the map and a key to those symbols may be used to facilitate identification. Braille and raised text may be used to explain the symbols and provide explanatory information.
It is also known to trigger audio by means of embedded switches which are connected to an audio playback device.
Tactile images constructed in this way suffer from inherent problems which limit their efficacy particularly when used to provide information to visually impaired users.
Specifically, it is not possible to design a sufficiently extensive symbol set so as to be able to have a unique symbol for every unique feature. Hence tactile keys are required for each image enabling generic symbols to be used for different functions in different locations, thus requiring the user both to read the key and to adapt to these changes.
It is also not possible, using conventional methods, to change the image information without a change to the entire image, since the image must itself contain all relevant details rendered in Braille or raised text. For example, if a map of a shopping area is created and deployed, subsequent to which the occupancy of one or more shop units changes, it is necessary to change the entire image so as to update information such as the shop names.
In addition, for conventional technologies, the provision of audio is expensive to implement and creates additional and significant costs when changes must be made, since the entire map surface, together with its switch elements, must be constructed anew.
As a result of these problems tactile maps are not frequently deployed because they are not cost effective. Where they are used they may frequently be out of date as changes are both slow and expensive to implement.
In accordance with an aspect of the invention, there is provided an interactive tactile image comprising: the image; a projected capacitance type touch panel on which the tactile image is placed; an optional backlight which is placed behind the touch panel; and a computer (with associated additional components as may be required) to which the touch panel is connected.
The image may be constructed in any suitable way such as by using a multi layer print technique or by using separate machined parts which are bonded together.
The touch panel is of the projected capacitance type, which is able to sense the presence of touches through the tactile image which overlays it. At this time only certain projected capacitance panels may achieve this effect in a manner suitable for this application; however, in principle any other touch panel type which is able to operate to similar effect may be used.
The backlight, which is optional, is employed where it is of benefit to partially sighted users to be able to see the image more clearly. In this instance it is clear that the touch panel and the tactile image must be constructed of materials which allow light to pass through and that the image must be suitably designed for such a purpose and will preferably be of simple, high contrast design.
The computer may be of any suitable type and may also be attached to any other devices or support infrastructure as may be required to achieve the desired functions.
In accordance with another aspect of the invention, there is provided a method of configuring the touch sensor and computer, the method comprising: defining in software by use of a configuration file or other such mechanism, the regions which must respond to touch events, these regions to relate to pertinent regions of the overlying image; defining in software what result is required to happen when the region is triggered in accordance with the configuration file; and providing the means by any combination of software and hardware to generate the required output event.
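By way of illustration only (the patent does not prescribe any particular configuration format), the region definitions might be held in a simple data file mapping named regions of the image to bounding boxes in panel coordinates and to the responses they trigger. The region names, coordinates and audio file names in the sketch below are invented for the example and are not part of the disclosure.

    # A minimal sketch, assuming a JSON configuration; the region names,
    # panel coordinates (x1, y1, x2, y2) and audio file paths are hypothetical.
    import json

    EXAMPLE_CONFIG = """
    {
      "ticket_office": {"rect": [120, 40, 260, 110],
                        "action": {"type": "play_audio", "file": "audio/ticket_office.wav"}},
      "platform_1":    {"rect": [300, 60, 480, 140],
                        "action": {"type": "play_audio", "file": "audio/platform_1.wav"}},
      "help_point":    {"rect": [40, 200, 90, 250],
                        "action": {"type": "trigger_output", "output": "assistance_call"}}
    }
    """

    def load_region_config(text=EXAMPLE_CONFIG):
        """Parse the region definitions: region name -> bounding box and response."""
        return json.loads(text)

Redefining a trigger region or its response then requires only an edit to this file, which is what allows the system to be updated without reconstructing the image.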
The method may further comprise interactive elements on the image such as a keyboard or other graphic or tactile device which is configured so as to support interactive services.
The provision of the projected capacitance type sensor panel located behind the image enables a variety of interactive functions to be provided according to the required function of the individual image. For example, regions of the image may be defined that trigger audio responses, giving the user access to additional information. Alternatively, or in addition, regions may be defined which trigger other actions as may be required.
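Purely as an illustrative sketch of how the control circuit might resolve a sensed touch into one of the configured responses (the rectangle hit test and the use of the command line player aplay are assumptions, not features disclosed here):

    import subprocess

    def region_for_touch(x, y, config):
        """Return the name of the configured region containing the touch point, if any."""
        for name, region in config.items():
            x1, y1, x2, y2 = region["rect"]
            if x1 <= x <= x2 and y1 <= y <= y2:
                return name
        return None

    def handle_touch(x, y, config):
        """Trigger the response configured for the touched region."""
        name = region_for_touch(x, y, config)
        if name is None:
            return  # touch fell outside all defined regions
        action = config[name]["action"]
        if action["type"] == "play_audio":
            # Example response: play a pre-recorded description through a
            # command-line audio player (aplay on ALSA-based Linux systems).
            subprocess.run(["aplay", action["file"]])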
The use of a projected capacitance (or similar) type panel removes the need for mechanical switches which are complex to install and which are prone to failure.
The ability to trigger audio or other responses reduces the need for the image to include explanatory text, either in Braille or in relief script. This has the benefit of simplifying the map whilst supporting the availability of more detailed information.
The provision of audio which provides feedback and information to the user removes the need for a separate key to the symbols used, thus both simplifying the design and making the image easier to use, even for the delivery of complex information. In addition, the audio gives the user further information about the features on the image, and hence the number of symbols required to impart that information can be reduced. By these means the images may be more consistent in their use of symbols, as symbols are not required to be unique.
The inclusion of the computer provides a means by which the trigger regions may easily be redefined if necessary and by which the responses may also be easily redefined. In this manner the function and information can be kept up to date easily and cost effectively.
The measures previously detailed simplify image design and construction and consequently reduce the cost of any image replacement that may be necessary.
The inclusion of the computer together with network links enables remote monitoring and numerous other additional functions.
Preferably the image is made translucent and the fitment of the backlight increases the legibility of the image for users with limited vision.
Preferably the image is provided with a tactile keyboard which may be configured so as to provide the user with a way to access further information. For example, at a bus station a user may type in a destination and the device will speak details of the services expected for that destination. Where access to dynamic information, for example bus movement data, is available, the method enables the system to offer up-to-the-minute information to the user.
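A sketch of that bus station example is given below. The departure data endpoint, its response format and the speak callback are hypothetical placeholders; the patent does not specify a particular data source or speech facility.

    import json
    from urllib.parse import quote
    from urllib.request import urlopen

    # Hypothetical live-data endpoint; a real deployment would use whichever
    # transport information service is available over the network link.
    DEPARTURES_URL = "http://example.org/api/departures"

    def fetch_departures(destination):
        """Fetch live departures for a destination (response assumed to be a JSON list)."""
        with urlopen(f"{DEPARTURES_URL}?to={quote(destination)}", timeout=5) as response:
            return json.load(response)

    def announce_services(destination, speak):
        """Speak the services expected for a destination typed on the tactile keyboard.

        `speak` stands in for whatever text-to-speech or recorded-audio facility
        the installation provides.
        """
        services = fetch_departures(destination)
        if not services:
            speak(f"No services found for {destination}.")
            return
        for service in services:
            speak(f"The {service['time']} service to {destination} "
                  f"leaves from stand {service['stand']}.")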
For a better understanding of the present invention, and to show more clearly how it may be carried into effect, reference will now be made, by way of example, to the following accompanying drawings, in which:
Fig 1 is an example image which is a representation of a small railway station;
Fig 2 is a schematic cross section view of the interactive tactile image system;
Fig 3 is a schematic of the electronic components.
Figure 1 shows an example image representing, in this case, a railway station showing its key features and facilities. Where it is of benefit for this image to be tactile, the image may be created using any suitable technique, such as 3D printing or by assembling machined parts, such that relevant features of the location are modelled so as to provide a raised element which can be sensed by touch.
Image features may include representations of elements such as railway lines (1) and buildings containing facilities (2) which are of interest to the map user. These facilities may be further defined by use of raised symbols of which (3) is an example.
Any number of features and symbols may be defined according to local requirements.
The image (4) is shown in cross section in Figure 2, laid on top of the touch sense panel (5) with an optional lighting module (6) beneath. The configuration illustrated in Figure 2 enables the projected capacitance sensor panel to be used so as to sense touches on the image surface.
The lighting module (6), if fitted, is used to illuminate the map from beneath so as to enhance the contrast of the image so as to improve visibility for partially sighted users. In this configuration the image must be constructed from a mixture of translucent elements (for the main body) and opaque elements (for the features).
Figure 3 shows a typical electronic assembly wherein the sensor panel (5) is connected to a computer (7) such that the computer can read the sensor data from the panel. Additionally a radio trigger module (8), together with an aerial (9), may be fitted to the system so as to make the system compatible with optional third party proprietary personal transmitter devices such as the 'React' (RNIB) and 'Step-Hear' (Geemarc) systems. Additionally a communications interface (10) may be connected to the computer so that the computer may gain access to other resources. This interface may use any suitable connection method, such as a wired network, WiFi or GPRS, or may combine any number of such modules.
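For concreteness, a minimal event loop is sketched below. It assumes the panel controller enumerates as a serial device and reports each touch as a comma separated "x,y" line; the port name, baud rate and message format are assumptions, since real projected capacitance controllers expose their own protocols or standard driver interfaces. It reuses the handle_touch sketch given earlier and the pyserial package.

    import serial  # pyserial; assumes the panel controller appears as a serial device

    def run_event_loop(config, port="/dev/ttyUSB0", baudrate=115200):
        """Read touch reports from the sensor panel and dispatch configured responses.

        Assumes each touch is reported as a line of the form "x,y"; a real
        controller protocol will differ.
        """
        with serial.Serial(port, baudrate, timeout=1) as panel:
            while True:
                line = panel.readline().decode("ascii", errors="ignore").strip()
                if not line:
                    continue
                try:
                    x, y = (int(value) for value in line.split(","))
                except ValueError:
                    continue  # ignore malformed reports
                handle_touch(x, y, config)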

Claims (12)

  1. A method for rendering an image interactive such that it may respond to touches, comprising: a sensor panel; an image, such as a tactile map, which is laid on top of the sensor panel; and a control circuit, such as a computer with the necessary peripheral elements, which is connected to the sensor panel and which is arranged so as to sense touches on the image.
  2. A method, as claimed in claim 1, wherein the control circuit is configured to recognise specific touch events on specific regions of the image and to trigger any suitable responses as may be configured for the specific instance of the system.
  3. A method, as claimed in any preceding claim, wherein the image may be either a flat image or a tactile image which is constructed such that the surface relief renders the image accessible to visually impaired users, who may touch the image and thus discern its salient features.
  4. A method, as claimed in any preceding claim, wherein the image may be printed using a suitable three dimensional printing technique.
  5. A method, as claimed in any preceding claim, wherein the image may be constructed using machined parts.
  6. A method, as claimed in any preceding claim, wherein the system is programmed to recognise touch events at specific places in the image such that responses may be programmed which are specific to that place, thus making it possible for the system to respond to the user.
  7. A method, as claimed in any preceding claim, wherein the system is programmed to differentiate between a touch which is a movement, a single tap or a double tap (or other pattern of taps), thus allowing the system to be configured to generate different responses according to the touch event so recognised.
  8. A method, as claimed in any preceding claim, wherein the system may be programmed to generate any type of response, such as the playback of sound (including speech), the triggering of other devices such as machinery, or any combination thereof.
  9. A method, as claimed in any preceding claim, wherein the image includes an interactive element such as a tactile keyboard enabling the user to enter requests for information, such as by typing in the destination for which they are seeking service information.
  10. A method, as claimed in any preceding claim, wherein the image is made partially translucent and is fitted with a backlight so as to enhance the contrast for the benefit of partially sighted users or for use in low light conditions.
  11. A method, as claimed in any preceding claim, wherein the system is fitted with a receiver so as to be able to operate with third party devices designed to assist the visually impaired with navigation and local information (such as the RNIB 'React' or the Geemarc 'Step-Hear' devices) and to respond as may be necessary to any received transmissions.
  12. A method, as claimed in any preceding claim, wherein the system is connected via some form of networking to other computer services and is able to extract additional information, such as transport service details, as may be requested by the user.
GB1118804.2A 2011-11-01 2011-11-01 Touch sensitive, interactive image system Withdrawn GB2496134A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1118804.2A GB2496134A (en) 2011-11-01 2011-11-01 Touch sensitive, interactive image system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1118804.2A GB2496134A (en) 2011-11-01 2011-11-01 Touch sensitive, interactive image system

Publications (2)

Publication Number Publication Date
GB201118804D0 GB201118804D0 (en) 2011-12-14
GB2496134A true GB2496134A (en) 2013-05-08

Family

ID=45375609

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1118804.2A Withdrawn GB2496134A (en) 2011-11-01 2011-11-01 Touch sensitive, interactive image system

Country Status (1)

Country Link
GB (1) GB2496134A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030098803A1 (en) * 2001-09-18 2003-05-29 The Research Foundation Of The City University Of New York Tactile graphic-based interactive overlay assembly and computer system for the visually impaired
US20030235452A1 (en) * 2002-06-21 2003-12-25 Microsoft Corporation Method and system for using a keyboard overlay with a touch-sensitive display screen
WO2007035115A1 (en) * 2005-09-20 2007-03-29 David Norris Kenwright Apparatus and method for proximity-responsive display materials
GB2475253A (en) * 2009-11-11 2011-05-18 Novalia Ltd A sheet with capacitive elements for input

Also Published As

Publication number Publication date
GB201118804D0 (en) 2011-12-14

Similar Documents

Publication Publication Date Title
CN107660301B (en) Electronic display panel and systems and methods associated therewith
US20130271482A1 (en) Electronic map generation
KR20130040131A (en) Touch keypad module
US20170215032A1 (en) Geotags For Accessing Local Information By The Visually Impaired
JP2006164929A (en) Keyboard device for displaying character by luminescent array and key unit thereof
US20220068199A1 (en) Display device and a vehicle with the display device
TW200516436A (en) Information display system and information display method
CN1460237A (en) Notification service on transportation network
EP3670229A1 (en) A display device and a vehicle comprising the display device
JP2008538872A (en) Simultaneous information context distribution system in public mode and individual mode
CN103863216A (en) Method and device for operating an electronic device
KR101789331B1 (en) Apparatus and method for sharing informaion in virtual space
US20170092153A1 (en) Assistive Grid For Mobile Devices
JPWO2014196157A1 (en) Display device, display unit and conference system
GB2496134A (en) Touch sensitive, interactive image system
US20190281068A1 (en) Method for providing an access device for a personal data source
Giménez et al. Augmented reality as an enabling factor for the Internet of Things
Tomitsch Interactive media facades—research prototypes, application areas and future directions
US6370395B1 (en) Interactive office nameplate
GB2428503A (en) Configurable touch screen keypad
Iqbal et al. Intelligent Bus Stops in the Flexible Bus Systems.
RU161048U1 (en) INFORMATION TERMINAL
Ringbauer et al. From “design for all” towards “design for one”–A modular user interface approach
JP2006268863A (en) Display board system
WO2007004193A1 (en) Communication system and method

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)