GB2622184A - Personal assistance systems and methods - Google Patents

Personal assistance systems and methods

Info

Publication number
GB2622184A
Authority
GB
United Kingdom
Prior art keywords
user
controller
actuators
sensor
actuator
Prior art date
Legal status
Pending
Application number
GB2206522.1A
Other versions
GB202206522D0 (en)
Inventor
Clark Billy
Current Assignee
Kp Enview Ltd
Original Assignee
Kp Enview Ltd
Priority date
Filing date
Publication date
Application filed by Kp Enview Ltd filed Critical Kp Enview Ltd
Priority to GB2206522.1A
Publication of GB202206522D0
Publication of GB2622184A
Legal status: Pending

Classifications

    • A61H 3/06 Walking aids for blind persons
    • A61H 3/061 Walking aids for blind persons with electronic detecting or guiding means
    • A61H 2003/063 Walking aids for blind persons with electronic detecting or guiding means with tactile perception
    • A61F 9/08 Devices or methods enabling eye-patients to replace direct visual perception by another kind of perception
    • A61H 2201/1619 Physical interface with patient: thorax
    • A61H 2201/165 Physical interface with patient: wearable interfaces
    • A61H 2201/501 Control means thereof: computer controlled, connected to external computer devices or networks
    • A61H 2201/5064 Sensors or detectors: position sensors
    • A61H 2201/5092 Sensors or detectors: optical sensor

Landscapes

  • Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Rehabilitation Therapy (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Pain & Pain Management (AREA)
  • Epidemiology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Vascular Medicine (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

There is provided a personal assistance system 100, comprising: a sensor unit 200 comprising at least one sensor arranged to obtain data from the surrounding environment; at least one feedback unit 300 comprising at least one actuator arranged to provide haptic feedback to a user; and a controller arranged to receive the data from the at least one sensor 200 and to control the at least one actuator; wherein the controller is configured to: process the data from the at least one sensor 200 to detect an object in the surrounding environment and to determine the proximity of the object and at least one further property of the object; and control the at least one actuator in dependence on the proximity and the at least one further property of the object. The device is useful for assisting the blind and visually impaired.

Description

Personal Assistance Systems and Methods
Field of Invention
This invention relates to personal assistance systems and methods for assisting a visually impaired person, particularly in navigating their environment or interacting with objects in their environment.
Background
A visually impaired person faces a number of difficulties when navigating their environment or interacting with objects in their environment and thus typically relies on assistance in doing so. The most widely used solution for providing this assistance is a cane (known typically as a "white cane") which allows a user to scan their surroundings for objects (e.g. obstacles or orientation markers).
However, the white cane only provides assistance in identifying objects immediately in front of a user and provides no prior warning of objects further ahead or objects moving towards the user from the side or from behind. In addition, the white cane is unreliable for use in navigating, because it is prone to miss orientation markers causing a user to miss turnings.
Furthermore, the white cane provides no assistance at all for a user in interacting with their environment, such as opening doors or picking up objects.
The present invention seeks to ameliorate some of these problems.
Summary of the Invention
Aspects and embodiments of the present invention are set out in the appended claims. These and other aspects and embodiments of the invention are also described herein.
According to at least one aspect disclosed herein, there is provided a personal assistance system, comprising: a sensor unit comprising at least one sensor arranged to obtain data from the surrounding environment; at least one feedback unit comprising at least one actuator arranged to provide haptic feedback to a user; and a controller arranged to receive the data from the at least one sensor and to control the at least one actuator; wherein the controller is configured to: process the data from the at least one sensor to detect an object in the surrounding environment and to determine the proximity of the object and at least one further property of the object; and control the at least one actuator in dependence on the proximity and the at least one further property of the object.
Preferably, the controller is configured to control the at least one actuator to provide a haptic feedback having a characteristic that is dependent on the determined proximity and/or the at least one further property of the object.
According to another aspect disclosed herein, there is provided a personal assistance system comprising: a sensor unit comprising at least one sensor arranged to obtain data from the surrounding environment; at least one feedback unit comprising at least one actuator arranged to provide haptic feedback; and a controller arranged to receive the data from the at least one sensor and to control the at least one actuator, wherein the controller is configured to: process the data from the at least one sensor to detect an object in the surrounding environment and to determine at least one property of the object; and control the at least one actuator to provide a haptic feedback having a characteristic that is correlated with the determined at least one property of the object.
Preferably, the characteristic of the haptic feedback provided by the at least one actuator is at least one of: a strength of the haptic feedback provided by the at least one actuator; and/or a frequency of the haptic feedback provided by the at least one actuator.
Preferably, the at least one property, or at least one further property, of the object determined by the controller is an identity of the object.
Preferably, the identity of the object is determined based on at least one physical attribute of the object detected by the at least one sensor.
Preferably, the at least one physical attribute is at least one of: a shape of the object or part of the object; a colour of the object or part of the object, and/or a movement pattern of the object across the field of view of the at least one sensor.
Preferably, the identity of the object is determined based on the data from the at least one sensor and stored data relating to objects of known identities.
Preferably, the identity of the object is determined using image processing software comprising machine learning algorithms.
Preferably, the at least one property, or at least one further property, of the object determined by the controller is at least one of: a position of the object relative to the sensor unit; a proximity of the object relative to the sensor unit; a size of the object relative to other objects within a field of view of the at least one sensor; an absolute size of the object; and a significance of the object.
Preferably, the controller is configured to determine the position of the object relative to the sensor unit based on the position of the object in the field of view of the at least one sensor.
Preferably, the controller is configured to determine the proximity of the object relative to the sensor unit based on the position of the object relative to other surrounding objects and/or the extent of the object as a proportion of the field of view of the at least one sensor.
Preferably, the sensor unit comprises at least two sensors, and the controller is configured to determine the proximity of the object relative to the sensor unit based on a comparison of the positions of the object in each of the sensor's fields of view.
Preferably, the controller is configured to determine the size of the object relative to other objects within a field of view of the at least one sensor by comparing the relative extents of the objects as a proportion of the field of view of the at least one sensor.
Preferably, the controller is configured to determine the absolute size of the object based on: a determined identity of the object and predetermined information relating to the absolute size of objects of the determined identity; and/or a determined proximity of the object and a determined extent of the object relative to the field of view of the at least one sensor.
Preferably, the controller is configured to determine the significance of the object based on a determined identity of the object and predetermined information relating to the danger associated with objects of the determined identity.
Preferably, the feedback unit comprises an array of actuators, and the controller is configured to control the actuators in a region of the array to provide haptic feedback having a characteristic that is dependent on at least one determined property of the object.
Preferably, the feedback unit comprises an array of actuators, and the controller is configured to control the actuators in a region of the array to provide haptic feedback having a characteristic that is correlated with at least one determined property of the object.
Preferably, the characteristic of the haptic feedback provided by the actuators in the region of the array is at least one of: an intensity of haptic feedback provided by the actuators in the region, the intensity preferably being the average strength of the haptic feedback across the region; an extent of the region in the array; a shape of the region in the array; and/or a location of the region in the array.
Preferably, the at least one actuator comprises at least one motor, and/or wherein the haptic feedback comprises a vibration.
Preferably, the at least one sensor comprises at least one camera. Preferably, the at least one sensor comprises a set of two cameras. Preferably, the at least one camera is at least one 360 degree camera.
Preferably, at least one of the sensor unit, feedback unit, and controller are arranged to be worn by a user.
Preferably, the sensor unit and feedback unit are arranged to be worn by a user. Preferably, the sensor unit is configured to be worn over the torso of a user.
Preferably, the at least one sensor is mounted on the sensor unit such that, in use, the at least one sensor is located on the shoulder of the user.
Preferably, the controller is integrated with either or both of the sensor unit or the feedback unit.
Preferably, the system further comprises a belt arranged to be worn around the torso, preferably the abdomen, of a user, the belt comprising one of the at least one feedback units.
Preferably, the system comprises an array of actuators, wherein the array of actuators is arranged as a grid on the belt, such that in use the actuators provide haptic feedback to the torso, preferably the abdomen, of a user.
Preferably, the controller is configured to: process the data from the at least one sensor to determine the position of a user's limb relative to the detected object; and control the at least one actuator in dependence on the determined position of the user's limb relative to the detected object.
Preferably, the controller is configured to control the at least one actuator by adjusting a characteristic of the haptic feedback provided by the or each actuator in dependence on, preferably in correlation with, the determined position of the user's limb relative to the detected object.
Preferably, the system comprises an array of actuators, wherein the actuators are arranged in groups wherein the controller is configured to control the groups of actuators in dependence on the direction from the user's limb to the detected object, preferably to activate the actuators in a group indicative of the direction from the user's limb to the detected object.
Preferably, the system comprises an array of actuators, wherein the actuators are arranged in groups wherein at least one of the groups is arranged in a linear configuration, and wherein the controller is configured to vary the strength of the haptic feedback provided by each actuator in the group along the length of the linear configuration in dependence on the proximity and/or direction from the user's limb to the detected object.
Preferably, the system comprises an array of actuators, wherein the actuators are arranged in groups wherein the groups of actuators are distributed substantially equidistant from one another around the feedback unit.
Preferably, each group consists of three actuators.
Preferably, the controller is configured to control the at least one actuator to provide a signature haptic feedback when it is determined that the position of the user's limb corresponds with the position of the object.
Preferably, the signature haptic feedback comprises at least one of activating a specific actuator or group of actuators; controlling an actuator or group of actuators to provide haptic feedback with a particular characteristic; and activating a group of actuators in a specific order, pattern, or configuration.
Preferably, the controller is configured to control the at least one actuator to provide at least one of: a first signature haptic feedback when it is determined that the vertical position of the user's limb corresponds with the vertical position of the object; a second signature haptic feedback when it is determined that the horizontal position of the user's limb corresponds with the horizontal position of the object; and a third signature haptic feedback when it is determined that both the vertical and horizontal positions of the user's limb correspond with the vertical and horizontal position of the object.
Preferably, the system comprises a glove and/or wristband, and wherein the glove and/or wristband comprises one of the at least one feedback units.
Preferably, the glove and/or wristband comprises a movable portion arranged to be moved, in use, to overlap a user's palm, wherein at least one actuator is located on the movable portion.
Preferably, the at least one actuator located on the movable portion is configured to provide the signature haptic feedback.
Preferably, the controller is configured to: receive information relating to a route; and control the at least one actuator in dependence on the information relating to the route.
Preferably, the controller is configured to control the at least one actuator to provide haptic feedback representative of navigation instructions along the route.
Preferably, the system comprises a user device configured to receive user input of the route, and to transmit information relating to the route to the controller.
Preferably, the system comprises means for determining a position of the user relative to the route, and wherein the controller is configured to control the at least one actuator in dependence on the determined position of the user relative to the route.
Preferably, the system comprises an array of actuators, wherein the array of actuators comprises two groups of actuators arranged in linear configurations, and wherein the controller is configured to vary the strength of the haptic feedback provided by each actuator in the group along the length of the linear configurations to provide haptic feedback representative of navigation instructions along the route.
Preferably, the two groups of actuators are arranged to intersect one another.
Preferably, the two groups of actuators are arranged to intersect one another such that one of the groups is oriented substantially perpendicularly to the other.
Preferably, the intersecting groups of actuators form multiple branches of actuators, each branch being indicative of a navigation direction.
Preferably, the controller is configured to: activate at least one of the actuators of a branch in dependence on the information relating to the route so as to provide haptic feedback indicative of the navigation direction corresponding to that branch and/or control the actuators of the branch in dependence on the determined position of the user relative to the route.
Preferably, the controller is configured to control the actuators of the branch in dependence on the proximity of the user to a waypoint on the route and/or adjust at least one of: the strength of haptic feedback of the actuators in the branch in dependence on the proximity of the user to a waypoint on the route; and the number of actuators in the branch that are activated.
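To make the branch-based navigation feedback described above more concrete, the following sketch shows one possible, purely illustrative mapping from a navigation instruction and the distance to the next waypoint onto a cross-shaped actuator array. The branch names, actuator indices, distance thresholds and strength values are all assumptions; the aspects set out above do not fix any of these.

```python
# Illustrative sketch of branch-based navigation feedback.
# Branch names, actuator indices, thresholds and strengths are assumed values.
BRANCHES = {
    "ahead": [0, 1, 2],   # actuator indices along the forward branch of the cross
    "left":  [3, 4, 5],
    "right": [6, 7, 8],
    "back":  [9, 10, 11],
}

def navigation_feedback(direction: str, distance_to_waypoint_m: float) -> dict:
    """Return actuator index -> vibration strength (0-1) for one navigation instruction."""
    branch = BRANCHES[direction]
    # Nearer the waypoint: fewer actuators are active but at a higher strength,
    # so the upcoming turn becomes more noticeable as it approaches.
    if distance_to_waypoint_m > 50:
        active, strength = branch, 0.3
    elif distance_to_waypoint_m > 15:
        active, strength = branch[:2], 0.6
    else:
        active, strength = branch[:1], 1.0
    return {idx: strength for idx in active}

# Example: a left turn coming up in 10 metres drives only the innermost
# actuator of the "left" branch at full strength.
print(navigation_feedback("left", 10.0))
```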
According to another aspect disclosed herein, there is provided a personal assistance system comprising: a sensor unit comprising at least one sensor arranged to obtain data from the surrounding environment; at least one feedback unit comprising at least one actuator arranged to provide haptic feedback; and a controller arranged to receive the data from the at least one sensor and to control the at least one actuator; wherein the system comprises means for receiving a user input of information relating to an object; and wherein the controller is configured to process the data from the at least one sensor to: detect the object in the surrounding environment based on the data from the at least one sensor and the user input information relating to the object; determine at least one property of the detected object; and control the at least one actuator in dependence on the at least one property of the object, preferably to convey to a user information relating to the location of the object.
According to another aspect disclosed herein, there is provided a computer implemented method comprising: receiving, from at least one sensor of a sensor unit, data relating to a surrounding environment; processing the data to detect an object in the surrounding environment and to determine at least one property of the object; and controlling an array of actuators of a feedback unit to provide haptic feedback, said controlling comprising controlling the actuators in dependence on the determined at least one property of the object.
As used herein, the word 'array' preferably refers to any ordered arrangement. For example, in some cases this word is used to refer to a grid arrangement, while in other cases this word is used to refer to a linear arrangement, a cross-shaped arrangement, or a T-shaped arrangement. The arrays described herein may comprise other arrays, for example a cross-shaped array comprises two intersecting linear arrays.
As used herein, the word 'haptic' preferably refers to the sense of touch and may refer to a vibration or a force for example. The term 'haptic feedback' should be interpreted accordingly as a signal provided to a user to be registered by their sense of touch.
As used herein, the word 'object' preferably refers to anything in the user's environment, such as something that poses a danger to the user, something a user might want to interact with, or a waypoint marker. An object may be stationary, moving, living, or inanimate.
As used herein, references to a characteristic of the haptic feedback being 'correlated with' a property of an object preferably refers to the haptic feedback changing in response to change in the property of the object. For example, the strength of the haptic feedback may be controlled to be proportional to the proximity of the object to the user.
Any apparatus feature as described herein may also be provided as a method feature, and vice versa. As used herein, means plus function features may be expressed alternatively in terms of their corresponding structure.
Any feature in one aspect of the invention may be applied to other aspects of the invention, in any appropriate combination. In particular, method aspects may be applied to apparatus aspects, and vice versa. Furthermore, any, some and/or all features in one aspect can be applied to any, some and/or all features in any other aspect, in any appropriate combination.
It should also be appreciated that particular combinations of the various features described and defined in any aspects of the invention can be implemented and/or supplied and/or used independently.
The invention also provides a computer program or a computer program product for carrying out any of the methods described herein, and/or for embodying any of the apparatus features described herein, and a computer readable medium having stored thereon a program for carrying out any of the methods described herein and/or for embodying any of the apparatus features described herein.
The invention also provides a signal embodying a computer program or a computer program product for carrying out any of the methods described herein, and/or for embodying any of the apparatus features described herein, a method of transmitting such a signal, and a computer product having an operating system which supports a computer program for carrying out the methods described herein and/or for embodying any of the apparatus features described herein.
Any feature in one aspect of the invention may be applied to other aspects of the invention, in any appropriate combination. In particular, method aspects may be applied to apparatus aspects, and vice versa. As used herein, means plus function features may be expressed alternatively in terms of their corresponding structure, such as a suitably programmed processor and associated memory.
Furthermore, features implemented in hardware may generally be implemented in software, and vice versa. Any reference to software and hardware features herein should be construed accordingly.
The invention extends to methods, systems and apparatus substantially as herein described and/or as illustrated with reference to the accompanying figures.
One or more aspects will now be described, by way of example only and with reference to the accompanying drawings having like reference numerals, in which:
Figure 1 is an overview of the personal assistance system, showing an exemplary sensor unit and multiple exemplary feedback units;
Figure 2 is a rear view of the sensor unit;
Figure 3a is a front view of the inner face of a first feedback unit;
Figure 3b is a perspective view of the first feedback unit, fastened as when worn by a user;
Figure 3c shows front, rear and side views of a user wearing the first feedback unit around their torso;
Figures 4a and 4b are front and rear views of a second feedback unit, as worn on a user's wrist and hand, with a movable portion overlapping the user's palm;
Figure 4c is a front view of the second feedback unit, as worn on a user's wrist and hand, with the movable portion overlapping the user's wrist; and
Figures 5a and 5b are front and rear views of a third feedback unit, as worn on a user's wrist.
Detailed description
The personal assistance system disclosed herein comprises a sensor unit, at least one feedback unit, and a controller. An overview of the main components of the system is described below with reference to Figure 1. The sensor unit provides a means for detecting objects in the user's surrounding environment and is described below with reference to Figure 2. The feedback unit comprises an array of actuators arranged to provide haptic feedback to the user. In one example, the feedback unit is formed as a belt for being worn around a user's torso, as described below with reference to Figures 3a to 3c, and is configured to warn a user of obstacles in their surrounding environment. In another example, the feedback unit is formed as a glove for being worn on a user's hand and wrist, as described below with reference to Figures 4a and 4b, and is configured to assist a user in interacting with objects in their surrounding environment. In another example, the feedback unit is formed as a wristband for being worn on a user's wrist, as described below with reference to Figures 5a and 5b, and is configured to assist a user in navigating their surrounding environment. The personal assistance system may comprise a combination of two or three of these feedback units, along with the sensor unit, to provide combined assistance to the user.
System overview
Figure 1 shows an overview of the personal assistance system 100. The system 100 comprises a sensor unit 200 comprising at least one sensor arranged to obtain measurements, i.e. data, of the environment surrounding the sensor unit. The system 100 also comprises a first haptic feedback unit 300 for warning a user of obstacles in their surrounding environment, a second haptic feedback unit 400 for assisting a user in interacting with objects in their surrounding environment, and a third haptic feedback unit 500 for assisting a user in navigating their surrounding environment. Each of the feedback units 300, 400, 500 comprises an array of actuators arranged to provide haptic feedback to the user.
The system also comprises a controller arranged to receive the measurements, i.e. data, from the at least one sensor of the sensor unit and to control the actuators of the feedback units. The controller is configured to process the measurements, i.e. data, from the at least one sensor of the sensor unit to detect objects in the surrounding environment and to determine at least one property of the objects. The controller then controls the actuators in dependence on the determined at least one property of the objects.
The feedback units 300, 400, 500 are connected to the controller by a wired connection in this example. In other examples, the feedback units may be connected to the controller by a wireless connection. The sensor unit 200 and the feedback units 300, 400, 500 each have features that enable these units to be worn by a user, so that the system travels with the user as they move around to provide assistance to the user.
Object detection
Figure 2 is a rear view of the sensor unit 200. The sensor unit 200 in this example is formed as a back-pack, having a pair of straps 202 through which a user passes their arms to wear the sensor unit on their back. The straps may also comprise a clasp for attaching the straps together across the front of the user's torso to secure the sensor unit to the user. Figure 1 shows front and rear views of a user wearing the sensor unit in this way.
The sensor unit 200 comprises a pair of sensors, which in this example are a pair of cameras 204a, 204b providing a stereo vision system. The cameras in this example are 360 degree cameras and as such are each capable of imaging the environment surrounding the user in all directions. The cameras 204a, 204b are mounted on the sensor unit 200, specifically on the straps 202 of the unit, so that when the unit is worn by a user each of the cameras sits on top of one of the user's shoulders. The location of the cameras 204a, 204b on the sensor unit 200 in this way enables the cameras to achieve a clear field of view (except for the portion of each camera's field of view obscured by the user's head, which is detected by means of the other camera) while minimising inconvenience to the user.
The personal assistance system 100 also comprises a controller which in this example is a microprocessor 206 integrated with sensor unit 200. The microprocessor is configured to process the images received from the cameras 204a, 204b (received, for example, via a wired or wireless connection between the microprocessor and the cameras). By this processing, the microprocessor detects objects in the surrounding environment and determines at least one property of the objects. The controller then controls the actuators of at least one of the feedback units in dependence on the determined at least one property of the objects.
The system 100 also comprises a power supply which in this example is a battery, or battery pack, 208. The battery 208 in this example is integrated with the sensor unit 200 so that the weight of the battery 208 is conveniently carried on the user's back with the sensor unit 200.
The battery 208 provides power to the microprocessor 206, for example by a wired connection between the two. The battery 208 also provides power to the feedback units 300, 400, 500, for example also by a wired connection, to power the actuators of the feedback units.
The sensor unit 200 comprises a set of connection points 210a, 210b, 210c for connecting the sensor unit to each of the feedback units 300, 400, 500 respectively. The connection points provide means for connecting a power supply of the sensor unit (i.e. the battery 208) to the feedback units to power the actuators, and for connecting the controller (i.e. the microprocessor 206) to the actuators to enable the controller to transmit electrical control signals to the actuators.
The cameras 204a, 204b are configured to obtain images continuously or periodically of the environment surrounding the sensor unit 200. The microprocessor then processes these images to detect objects in the surrounding environment (as will be described) and to determine at least one property of the objects. Exemplary properties of the detected objects, and how these properties are determined by the controller, are set out below (a brief code sketch follows this list):
* Object position
o This property refers to the direction from the user to the object relative to the direction the user is facing. The direction the user is facing is taken to be the direction forward from the sensor unit 200. Therefore, the direction forward from the sensor unit is used as a reference direction relative to which the position of an object is measured. As the cameras 204a, 204b are 360 degree cameras, the position of the object may be expressed as a bearing angle from the sensor unit relative to the direction forward from the sensor unit (i.e. the direction forward from the sensor unit being 0 degrees).
o This property is determined based on the position of the object within the field of view of the camera. This determination may also be based on a calibration of the field of view of the cameras with the surrounding environment. As the cameras 204a, 204b are 360 degree cameras, the position of the object within the field of view of the camera provides the position of the object relative to the forward direction. For example, if the field of view of the camera is calibrated to start at the forward direction, and if it is determined that the object is exactly in the middle of the field of view of the camera, the object's position will be at 180 degrees to the forward direction (i.e. directly behind the user).
* Object proximity
o This property refers to the distance between the user and the object. It could be an absolute proximity, that is, an absolute distance between the user and the object, or a relative proximity, that is, whether the object is closer to or further from the user relative to another object.
o This property is determined using the stereo vision system provided by the two cameras 204a, 204b. The object proximity may be based on the object's position relative to other surrounding objects and/or position within each of the two cameras' fields of view. In the former case, this property is determined based on the relative size of the object within the camera image. In the latter case, this property may be determined based on a parallax measurement by the cameras 204a, 204b of the object relative to a background.
* Object identity
o This property refers to the classification of the object. For example, this property may refer to a general category of object (e.g. vehicle, pedestrian, building). Alternatively, the property may relate to a more specific category or a specific object, such as the user's personal telecommunication device.
o This property is determined based on the physical attributes of the object as recorded in the camera images. These attributes may be, for example, the shape, colour, or movement pattern of the object. This determination is made by comparing, in real time using image processing software comprising machine learning algorithms, the images from the cameras to images of known identities of objects stored within an internal database. Based on identified similarities between the camera images and the images in the database, the identity of the object in the camera image is determined. Various known object recognition methods may be used; for example, neural network approaches based on region-based convolutional neural networks or You Only Look Once (YOLO) may be used.
o Determining an identity of the object, and controlling the actuators in dependence on the determined identity, may be advantageous because it enables a user to distinguish between different objects. For example, in the context of the protection assistance (as described below) a user can recognise the identity of objects based on the haptic feedback and thereby understand the danger posed by those objects. In the context of interaction assistance (as also described below) distinguishing between different objects enables a user to be guided towards a specific object, or type of object, with which they wish to interact. This may be particularly advantageous if the actuators are controlled in dependence on the determined identity of the object and the proximity of the object as this combination of information allows a user to navigate and interact with their environment effectively.
* Relative object size
o This property refers to the size of the object as compared to other nearby objects, or as a proportion of the total camera field of view.
o This property is determined based on the extent of the object in the field of view of the camera, as compared to the extent of another object also in the field of view of the camera. For example, this property may be derived from the number of camera pixels covered by the object as compared to the number of pixels covered by another detected object, or as compared to the total number of pixels over the entire camera field of view.
* Absolute object size
o This property refers to the objective size of the object, rather than as compared to another object, or as a proportion of the total camera field of view.
o This property is determined based on the relative size of the object (as described above), in combination with other information to calculate the absolute size. For example, if the relative size of the object is known as a proportion of the total camera field of view, and the distance to the object is also known, then the absolute size of the object can be calculated based on this known information, using a calibration of the absolute size of the camera field of view as a function of distance from the camera. In another example, if the relative size of the object is known as compared to another nearby object, and if the absolute size of the other object and the difference in the proximities of the two objects to the camera are known, the absolute size of the object can be calculated.
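By way of illustration only, the sketch below shows one way the properties listed above might be computed from a detected bounding box. The panorama dimensions, stereo baseline and focal length are assumed values, and the Detection structure is hypothetical; the disclosure does not prescribe any particular implementation.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Hypothetical result of the object-recognition step for one object."""
    x_centre_px: float    # horizontal centre of the bounding box in the panorama
    box_width_px: float   # bounding box width in pixels
    box_height_px: float  # bounding box height in pixels

# Assumed camera constants for a 360 degree panorama; pixel column 0 = straight ahead.
PANORAMA_WIDTH_PX = 3840
PANORAMA_HEIGHT_PX = 1920
STEREO_BASELINE_M = 0.3    # assumed shoulder-to-shoulder camera spacing
FOCAL_LENGTH_PX = 1200.0   # assumed effective focal length

def object_position_deg(det: Detection) -> float:
    """Object position: bearing from the forward direction (0-360 degrees)."""
    return (det.x_centre_px / PANORAMA_WIDTH_PX) * 360.0

def relative_object_size(det: Detection) -> float:
    """Relative object size: bounding-box area as a fraction of the field of view."""
    return (det.box_width_px * det.box_height_px) / (PANORAMA_WIDTH_PX * PANORAMA_HEIGHT_PX)

def object_proximity_m(x_left_px: float, x_right_px: float) -> float:
    """Object proximity from the stereo pair, via the standard disparity relation
    depth = focal_length * baseline / disparity."""
    disparity = abs(x_left_px - x_right_px)
    return float("inf") if disparity == 0 else FOCAL_LENGTH_PX * STEREO_BASELINE_M / disparity

def absolute_object_size_m(det: Detection, distance_m: float) -> float:
    """Absolute object size (height) via the pinhole relation: height = pixels * distance / focal length."""
    return det.box_height_px * distance_m / FOCAL_LENGTH_PX
```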
The processing of the camera images by the controller to determine the properties listed above is based on artificial intelligence image recognition algorithms. This image processing can be performed on-the-fly to provide the determined properties of the objects in real time and with a high level of accuracy. These algorithms preferably determine the properties listed above by recognising one or more object features which can be translated into each of the listed properties, such as object position, object identity and relative and absolute object size.
Alternatively or additionally, the algorithms may utilise elements of motion and/or distance detection to assist with identifying the properties listed above, such as object proximity, position, and size. Such motion and/or distance detection may involve, for example, passive infrared, ultrasonic, radar, laser telemetric or microwave motion/distance detection.
Once the controller has determined the at least one property of the object, the controller is configured to control the actuators of the feedback units so as to provide haptic feedback to the user. The controller is configured to control the actuators in dependence on the determined properties of the objects. In this way the haptic feedback to the user is representative of the properties of the object, and thus provides information to the user about their surrounding environment. With time, the user will become accustomed to interpreting the haptic feedback to understand their surrounding environment.
Protection assistance
The sensor unit can be used in combination with a first feedback unit to protect the user by warning them about potentially dangerous obstacles in their surrounding environment.
Figure 3a is a front view of the inner face of the first feedback unit 300. Figure 3b is a perspective view of the first exemplary feedback unit 300, fastened as when worn by a user.
This feedback unit is formed as a belt, arranged to be worn around a user's torso, preferably around their abdomen, so as to provide haptic feedback to the user's torso, preferably to their abdomen.
As shown in Figure 3a, the feedback unit 300 comprises a belt body 302 which is an elongate strip of material sized to be wrapped around the torso of the user. The feedback unit 300 comprises an array of actuators. The actuators of this feedback unit 300 (and the actuators of the feedback units 400, 500 described below) are preferably vibration actuators, and have a controllable frequency, amplitude, and occurrence. Suitable options for the actuators are as follows:
* Eccentric Rotating Mass (ERM) actuator: advantageously, this actuator is widely used, readily available, and simple to use and control.
* Piezo Haptic Actuator: advantageously, this actuator has a quick response time, and a controllable frequency and amplitude.
* Linear Resonant Actuator: advantageously, this actuator is compact, has a controllable amplitude, and is readily available.
* Solenoid Haptic Actuator: advantageously, this actuator has a wide frequency range and amplitude control.
In the examples described herein, the actuators are motors 304 (such as ERM motors). The motors 304 are arranged in rows and columns forming a rectangular grid configuration. The motors 304 are individually activatable by the controller, and the characteristics of the haptic feedback provided by each motor can be adjusted by the controller as described below. In this example, the haptic feedback provided by the motors is a vibration.
To wear the feedback unit, the ends of the belt body 302 are secured together by a fastener 306 which may be, for example, Velcro™ or a buckle, as shown in Figure 3b, with the motors 304 on the inner facing side of the belt. In this way, when the belt is secured around the torso of a user, the motors 304 contact the user's torso such that the vibrational haptic feedback provided by the motors is felt by the user's torso. Figure 3c shows front, rear and side views of a user wearing the feedback unit as described. In use, the belt should be worn with the fastener 306 at the front of the user's torso such that the orientation of the belt is consistent, and the positions of the motors are calibrated (i.e. if the controller activates motors associated with the left of the unit, those motors will in fact be on the left hand side of the user's torso when the unit is being worn).
The controller is configured to control the motors 304 in dependence on the determined properties of the objects detected by the cameras 204a, 204b so as to provide information about the user's surrounding environment. This information may be provided by selectively activating certain motors 304 in the array, or by adjusting the characteristics of the haptic feedback provided by certain motors 304 in the array. Exemplary characteristics of the vibrational haptic feedback that may be adjusted are set out below (a brief code sketch follows this list):
* Position: the controller selects which of the motors in the array are activated so that the vibrational haptic feedback is provided to the user at a position on the belt representative of the position of the object relative to the user. For example, if a user is stood with an object on their left, the controller will activate the motors in a region on the left of the belt to provide a vibrational haptic feedback on the left-hand side of the user's torso to indicate the presence of the object on their left. If the user were to turn to face the object, or if the object moved across the front of the user's body, the controller would move the haptic feedback across the belt to the front of the belt, by gradually deactivating the motors on the left of the region of the belt (e.g. column by column in the array) and gradually activating the motors on the right of the region of the belt.
* Strength: the strength of the vibrational haptic feedback can be adjusted by the controller. The strength of the haptic feedback is a measure of the amplitude of the vibrations. The strength of the feedback may be adjusted by the controller to indicate the proximity of the object to the user, with the controller powering the motors at a low strength when the object is far away and increasing the strength as the object draws nearer to the user.
* Extent and/or shape: the extent and/or shape of the region of motors on the belt that are activated by the controller can be adjusted to provide haptic feedback over a region of the user's body representative of a size (relative or absolute) and/or shape of the object. For example, if the controller identifies a lamppost ahead of the user, based on the camera images, the controller may activate a narrow vertical strip of motors (e.g. just a single column of motors) in the array as would be representative of the size and shape of the lamppost. Equally, if the controller identifies a large building ahead, the controller may activate a large rectangular block of motors in the array as would be representative of the size and shape of the building. The extent of the region of motors on the belt that are activated, as a proportion of the total area of the belt's inner face, is preferably proportional to the relative size of the object as a proportion of the camera's field of view.
* Frequency: the frequency of the vibrational haptic feedback provided by the motors can be adjusted by the controller. The frequency of the haptic feedback is preferably adjusted to be representative of the absolute size of the object. Preferably, the larger the absolute size of the object, the lower the frequency of the haptic feedback provided by the motors; this relationship between haptic feedback frequency and object size is the most intuitive for a user. For example: if the object is a pedestrian, the controller would adjust the frequency of the haptic feedback to a high frequency; if the object is a vehicle, the controller would adjust the frequency of the haptic feedback to a medium frequency; and if the object is a building the controller would adjust the frequency of the haptic feedback to a low frequency.
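As a rough illustration of how these characteristics might be combined in software, the sketch below maps a detected object's bearing, proximity and sizes onto a grid of belt motors. The grid dimensions, the 20 m distance range and the 50-250 Hz frequency range are assumed values chosen for illustration and are not taken from this disclosure.

```python
# Illustrative mapping of object properties onto the belt's motor grid.
# Grid size, distance range and frequency range are assumptions.
ROWS, COLS = 4, 16  # assumed belt grid: 4 rows x 16 columns of motors

def belt_pattern(bearing_deg: float, distance_m: float,
                 relative_size: float, absolute_height_m: float) -> dict:
    """Return (row, col) -> (amplitude 0-1, frequency Hz) for the motors to activate."""
    # Position: the bearing selects the centre column (0 degrees = front of the belt).
    centre_col = int((bearing_deg % 360.0) / 360.0 * COLS)

    # Extent: the activated region widens with the object's share of the field of view.
    half_width = int(relative_size * COLS / 2)

    # Strength: stronger vibration for closer objects (clamped to the 0-1 range).
    amplitude = max(0.0, min(1.0, 1.0 - distance_m / 20.0))

    # Frequency: larger objects map to lower frequencies (assumed 50-250 Hz band).
    frequency = max(50.0, 250.0 - 40.0 * absolute_height_m)

    pattern = {}
    for col in range(centre_col - half_width, centre_col + half_width + 1):
        for row in range(ROWS):
            pattern[(row, col % COLS)] = (amplitude, frequency)
    return pattern

# Example: a vehicle 2 m tall, 5 m away and slightly to the user's right produces
# a strong, mid-frequency vibration over a block of columns near the front of the belt.
print(len(belt_pattern(bearing_deg=20.0, distance_m=5.0,
                       relative_size=0.15, absolute_height_m=2.0)))
```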
Therefore, the feedback unit 300 of the system provides protection for the user by alerting the user to obstacles in their surrounding environment. The user can therefore move around with confidence based on their awareness of the obstacles ahead as provided by the haptic feedback of the belt. This system has a number of advantages over existing solutions, such as white canes, particularly in that it provides the user with a pre-warning of objects much further ahead than is possible with a cane, and it provides 360 degree awareness of the objects so that the user is warned of obstacles at their side or behind them.
The haptic feedback provided by the motors is also practical to use, because the adjustment of the characteristics of the haptic feedback in dependence on the properties of the detected objects enables a user to understand the significance of the danger posed by each object, and thus adjust their reaction to each object proportionately or to prioritise their attention to the most significant dangers. This is achieved because particularly dangerous objects (i.e. large or close objects) are indicated with a vigorous haptic feedback which would be very noticeable to a user, whereas less dangerous objects (i.e. small or distant objects) are indicated with a gentler haptic feedback which would be less noticeable to the user. For example, a large vehicle close to a user would be indicated by a high strength, mid frequency, large extent haptic feedback whereas a stationary lamppost in the distance would be indicated with a low strength, high frequency, small extent haptic feedback.
Interaction assistance
In addition to protecting the user against potential obstacles, the system also provides assistance to the user when interacting with objects, such as when picking up or grasping objects. This interaction assistance is provided by a second feedback unit.
Figures 4a and 4b are front and rear views of the second feedback unit 400, as worn on a user's wrist and hand, in one configuration, and Figure 4c is a front view of the second feedback unit 400 in another configuration. The front view is a view towards the palm side of the user's hand, while the rear view is a view towards the back of the user's hand.
The second feedback unit 400 comprises a wristband 402 arranged to be worn on a user's wrist, and a movable portion arranged to be worn on the user's hand in use. The movable portion comprises a palm pad 404 and an elastic hoop 406. The palm pad is joined to the wristband by a folding joint, and the elastic hoop is joined to the palm pad at the opposite end of the palm pad to the folding joint. The palm pad is movable between a first position where it is secured to the user's hand such that it overlaps the user's palm (as shown in Figure 4a), and a second position where it is instead secured to the user's wrist such that it overlaps the wristband on the user's wrist (as shown in Figure 4c).
The elastic hoop 406 secures the palm pad 404 in the first or second positions. When not in use, the palm pad is kept in the second position so that it does not obstruct the user's palm and is also protected while it is not needed. To keep the palm pad in the second position, the elastic hoop is secured around the user's wrist so that the palm pad is folded at its foldable joint with the wristband 402, keeping the palm pad secured against the user's wrist and out of the way of the user's palm as shown in Figure 4c. When in use, the elastic hoop is moved from the user's wrist up to their hand, preferably past the user's thumb, so that the elastic hoop is secured around the user's palm, which then secures the palm pad over the user's palm as shown in Figure 4a.
The second feedback unit 400 comprises an array of actuators for providing haptic feedback. The actuators in this example are motors 410. Therefore, in use, the motors 410 are located against the user's hand and wrist so as to provide haptic feedback to the user's hand and wrist. The motors 410 are arranged in an array on the second feedback unit 400. The array comprises a set of four linear arrays 408 of motors 410, which in this example each consist of three motors. The four linear arrays 408 are distributed evenly around the wristband 402, with the linear arrays being arranged so that, when the unit is worn, one array is located along the underside of the user's wrist, another is located along the topside of the user's wrist, and the other two arrays are located along the left and right sides of the user's wrist respectively. The linear arrays are aligned along the longitudinal axis of the user's arm. The motors in each of the linear arrays closest to the user's hand (i.e. furthest from the user's body) are referred to as the distal motors while the motors furthest from the user's hand (i.e. closest to the user's body) are referred to as the proximal motors.
The array of motors 410 also comprises a fifth array 409 located on the palm pad. The fifth array 409 in this example consists of four motors arranged in a triangular or T-shaped configuration, with three of the four motors arranged in a linear configuration and the fourth motor located adjacent the central motor in the linear configuration. The palm pad itself is also triangular, such that the configuration of motors fits substantially concentrically within the palm pad. When the palm pad is in the first position, the fifth array 409 of motors is located against the user's palm so as to provide haptic feedback to the user's palm.
For use with the second feedback unit 400, the sensor unit 200 also comprises means for determining the position of the user's hand relative to the object they are attempting to interact with. In this example, this means is provided by the cameras 204a, 204b in combination with the controller 206. The controller is configured to process the images taken by the cameras 204a, 204b to detect the user's hand and to determine the position of the user's hand relative to the sensor unit using the methods described with reference to Figure 2. The controller detects the user's hand by, for example, comparing the images taken by the camera to stored images that are known to include a hand and detecting similarities between the images, or by recognising an identifier on the feedback unit 400. The identifier may comprise a fiducial marker (that is, an object placed in the field of view of an imaging system that appears in the image produced for use as a point of reference or a measure) within the design of the glove. This marker may comprise a dedicated data matrix or infrared light that is recognisable and trackable by the cameras and controller. This may allow for a more reliable communication of the position of the user's hand which does not rely on recognising a hand using object detection. The controller also determines the position of the object the user is attempting to interact with (using the methods described with reference to Figure 2), and thus the controller can determine the position of the user's hand relative to the object based on the positions of the user's hand and the object relative to the sensor unit.
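Purely as an illustration of the fiducial-marker option mentioned above, the snippet below uses OpenCV's ArUco module as a stand-in, since the disclosure does not specify a marker format, to locate a marker assumed to be printed on the feedback unit 400. The marker ID and dictionary are hypothetical, and the ArUco API shown requires opencv-contrib-python 4.7 or later.

```python
import cv2
import numpy as np

# Hypothetical marker assumed to be printed on the glove (feedback unit 400).
GLOVE_MARKER_ID = 7
_aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
_detector = cv2.aruco.ArucoDetector(_aruco_dict, cv2.aruco.DetectorParameters())

def locate_glove(frame_bgr: np.ndarray):
    """Return the pixel coordinates (x, y) of the glove marker's centre, or None if not seen."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = _detector.detectMarkers(gray)
    if ids is None:
        return None
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        if marker_id == GLOVE_MARKER_ID:
            # The centre of the four marker corners gives the hand's position in the image.
            return marker_corners.reshape(4, 2).mean(axis=0)
    return None
```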
In other examples the means for determining the position of the user's hand may be provided by a transmitter and receiver respectively located in the second feedback unit 400 and on the object the user is attempting to interact with (or vice versa). The transmitter and receiver could be, for example, a short-range communication transmitter and receiver such as a radio frequency identification (RFID) or near-field communication (NFC) transmitter and receiver. The position and/or proximity of the user's hand from the object could be determined based on a signal communicated between the transmitter and receiver.
The controller 206 is configured to convey to the user instructions to assist the user in interacting with the object based on the position of the user's hand relative to the object. The configuration of the motors on the second feedback unit 400 provides intuitive instructions to the user by way of haptic feedback in order to assist the user to move their hand accurately to interact with the object in their environment, such as to grasp the object. This is achieved by the controller selectively activating motors in the motor array associated with the direction in which the user should move their hand in order for their hand to contact the object they are attempting to interact with. The controller may also adjust the characteristics of the haptic feedback provided by the motors, or adjust the number of motors activated in the array, to convey further information to the user.
So that the user can translate the haptic feedback provided by the motors 410 into instructions as to how they should move their hand to interact with the object, the arrays of motors 408 on the second feedback unit 400 are each associated with a direction and/or a position of the user's hand relative to the object. The four linear arrays 408 located on the wristband 402 of the second feedback unit 400 are each associated with a direction, the direction being defined relative to the orientation of the feedback unit 400 when the user holds their hand out in front of them with their palm facing down. Thus, the linear array located along the underside of the user's wrist is associated with the 'down' direction, the linear array located along the topside of the user's wrist is associated with the 'up' direction, and the linear arrays located along the sides of the user's wrist are associated with the 'left' and 'right' directions respectively.
The controller controls the motors 410 of the various arrays to provide information to the user regarding how they should move their hand in order to interact with the object, as follows. To convey information relating to the horizontal or vertical position of the user's hand relative to the object, the controller activates the motors 410 in the array corresponding to the direction in which the user should move their hand in order to align the horizontal and vertical position of their hand with the object they are attempting to interact with. For example, if the user extends their hand to grasp a doorhandle, the sensor unit 200 determines the position of the user's hand relative to the position of the doorhandle (the object the user is trying to interact with), and if the user's hand is positioned to the left of the doorhandle, the controller will activate the motors 410 in the linear array along the right-hand side of the user's wrist so as to convey to the user that they should move their hand to the right in order for their hand to be aligned with the doorhandle. When the user has moved their hand sufficiently far to the right so that it is horizontally aligned with the doorhandle, the controller deactivates the motors on the right of the user's wrist. Equally, if the user's hand is positioned above the doorhandle, the controller will activate the motors 410 in the linear array along the underside of the user's wrist so as to convey to the user that they should move their hand downwards in order for their hand to be aligned with the doorhandle. When the user has moved their hand sufficiently far downward so that it is vertically aligned with the doorhandle, the controller deactivates the motors on the underside of the user's wrist.
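Purely by way of illustration, the directional mapping just described might be expressed as follows (Python; the array names, sign convention and dead-band are assumptions):

def select_wrist_arrays(dx, dy, dead_band=0.02):
    """dx > 0 means the object lies to the right of the hand, dy > 0 means above it.
    Returns the wristband arrays to activate so the user moves their hand toward the object."""
    active = []
    if dx > dead_band:
        active.append("right_array")      # hand is left of the object: move right
    elif dx < -dead_band:
        active.append("left_array")       # hand is right of the object: move left
    if dy > dead_band:
        active.append("topside_array")    # hand is below the object: move up
    elif dy < -dead_band:
        active.append("underside_array")  # hand is above the object: move down
    return active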
To convey information relating to the distance between the user's hand and the object (i.e. towards or away from the user), the controller adjusts the number of motors 410 in the linear arrays that are activated and/or the location of the motors 410 in the linear arrays that are activated. The number of motors in the linear arrays that are activated conveys to the user how far away their hand is from the object, and the location of the motors that are activated conveys to the user the direction (i.e. forward or backward) in which to move their hand to reach the object. For example, if the user's hand is close to the object but must be extended further forward to reach the object, the controller will activate the first and second motors 410 of the linear array(s) closest to the user's hand (i.e. the distal end and middle motors). This indicates to the user that the user should move their hand forward to reach the object. As the user extends their hand toward the object, the controller deactivates the middle motor, leaving just the distal end motor activated; this reduces the intensity of the haptic feedback experienced by the user, indicating that their hand is closer to the object. Equally, if the user were to pull their arm away from the object, the controller would activate the third (i.e. proximal end) motor in the linear array(s) so that the user feels a more intense vibration, indicating to the user that their hand is now further away from the object. If the user moved their hand too far forward and beyond the object, the controller would deactivate the distal end motor and activate only the proximal end motor, indicating that the user should pull their hand back slightly to align it with the object.
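Purely by way of illustration, the depth cue described above might be sketched as follows (the distance thresholds are assumptions):

def select_depth_motors(distance_m, overshoot=False):
    """Return the motors to activate in a three-motor linear array
    (index 0 = distal end, 1 = middle, 2 = proximal end)."""
    if overshoot:
        return [2]        # hand is beyond the object: pull back slightly
    if distance_m > 0.30:
        return [0, 1, 2]  # far from the object: most intense cue
    if distance_m > 0.10:
        return [0, 1]     # closing in: distal and middle motors
    return [0]            # nearly there: distal end motor only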
Therefore, in broad terms, the controller is configured to control the motors of the second feedback device 400 based on their circumferential position on the device in dependence on the direction between the user's hand and the object, and based on their longitudinal position (along the user's limb) in dependence on the distance between the feedback unit and the object.
These two functions may be performed in sequence or in parallel. For example, the controller may first control the motors to assist the user in aligning their hand horizontally and vertically with the object, and then subsequently control the motors to assist the user in aligning the distance of their hand with the distance of the object from them (or vice versa). Alternatively, the controller may control the motors to provide this assistance simultaneously; for example, if the user must move their hand upwards, to the right, and forwards, the controller will activate the distal motors of the topside and right-hand side linear arrays.
The fifth array 409 of motors, located on the palm pad, is used to notify the user when their hand is aligned with the object they are attempting to interact with (as determined by the sensor unit 200). The array of motors on the palm pad comprises one linear row of three motors, and a fourth motor adjacent the central motor of the row. To convey to the user that their hand is aligned with the object, the controller is configured to activate the motors on the palm pad to provide a signature haptic feedback. In this example: when the user's hand is horizontally aligned with the object, the controller is configured to activate the two motors on the ends of the row (for example, the two motors might provide a short, coordinated vibration pulse); when the user's hand is vertically aligned with the object, the controller is configured to activate the central motor of the row (for example, the motor might provide a short vibration pulse); and when the distance of the user's hand from the user is aligned with the distance of the object from the user, the controller is configured to activate both the central motor in the row and the adjacent fourth motor (for example, the two motors might provide a short, coordinated vibration pulse). When the user's hand is aligned with the object in all directions, the controller may control the fourth motor (which is located centrally on the user's palm in use) to provide a signature haptic feedback, such as a high strength or high frequency pulse; this informs the user that their hand is at the object and they can close their fingers to grasp the object.
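Purely by way of illustration, the palm-pad signatures described in this example might be encoded as follows (motor indices 0-2 are the linear row, with 0 and 2 at its ends; index 3 is the adjacent fourth motor; the pulse parameters are assumptions):

def palm_signature(h_aligned, v_aligned, depth_aligned):
    """Return (motor_indices, pulse) for the signature haptic feedback to play."""
    if h_aligned and v_aligned and depth_aligned:
        return [3], {"duration_ms": 400, "strength": 1.0}    # hand is at the object
    if depth_aligned:
        return [1, 3], {"duration_ms": 150, "strength": 0.6}
    if v_aligned:
        return [1], {"duration_ms": 150, "strength": 0.6}
    if h_aligned:
        return [0, 2], {"duration_ms": 150, "strength": 0.6}
    return [], {}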
The system may also comprise means for receiving a user input of information specifying an object, or a type of object, with which the user wishes to interact. For example, the user may provide this information by way of a voice command. The controller may then be configured to process the data from the cameras to detect the object specified by the user in the surrounding environment. For example, if a user is in front of a door, they may specify (e.g. by way of a voice command) that they wish to interact with the door handle. The system may then process the data from the cameras to identify a door handle in the user's field of view, and then guide the user's hand to the handle as described above.
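Purely by way of illustration, matching a spoken request against the labels produced by whatever object detector the controller runs might look as follows (the detection format is an assumption):

def find_requested_object(detections, requested_label):
    """Return the highest-confidence detection whose label matches the spoken request."""
    matches = [d for d in detections
               if requested_label.lower() in d["label"].lower()]
    return max(matches, key=lambda d: d["confidence"]) if matches else None

# Example: find_requested_object(
#     [{"label": "door handle", "confidence": 0.91, "box": (412, 300, 40, 18)}],
#     "door handle")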
In other examples, the system may be configured to identify objects in the user's surroundings and to compile a list of identified objects from which the user could select an object with which they wish to interact. For example, the controller may be configured to process the data from the cameras on request, essentially to scan the user's entire surroundings, to identify the objects in the user's surroundings and to compile a list of those objects for the user. Alternatively or additionally, the controller may be configured to repeatedly scan the user's surroundings and continually update a list of objects in the user's surroundings as the user moves around. In order for the system to be used by blind and deaf individuals, the system may comprise means for communicating the list of objects in a user's surroundings through use of a refreshable braille display (for example integrated with the sensor unit 200). The system may also comprise means for receiving a user input of information from a blind and deaf user by way of a braille keyboard (for example integrated with the sensor unit 200).
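Purely by way of illustration, a continually updated list of identified objects (for example for presentation on a refreshable braille display) might be maintained as follows; the detection format and list size are assumptions:

from collections import OrderedDict

class SurroundingsList:
    """Rolling, de-duplicated list of objects identified around the user."""

    def __init__(self, max_items=20):
        self.items = OrderedDict()
        self.max_items = max_items

    def update(self, detections):
        for d in detections:                  # the most recent sighting wins
            self.items[d["label"]] = d
            self.items.move_to_end(d["label"])
        while len(self.items) > self.max_items:
            self.items.popitem(last=False)    # drop the oldest entry

    def labels(self):
        return list(self.items.keys())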
Navigation assistance
In addition to assisting the user to interact with objects in their environment, the system also provides assistance to the user when navigating their environment. This navigation assistance is provided by a third feedback unit.
Figures 5a and 5b are front and rear views of the third feedback unit 500, as worn on a user's wrist. The third feedback unit comprises a wristband 501, and an array of actuators, which in this example are motors 502, embedded in the wristband 501. Thus, in use, when the user is wearing the wristband 501, the motors 502 will be in contact with the underside of the user's wrist, which is an area sensitive to haptic feedback.
The array of motors 502 comprises two arrays, each arranged in a linear configuration.
Specifically, the two arrays are each made up of five motors 502 arranged in a straight line.
The two arrays intersect one another such that one of the motors 502 is shared between the two arrays. In this example, the arrays intersect such that they are perpendicular to one another, forming a cross configuration, where the middle motor in each array of five motors is shared between the two arrays and forms the centre of the cross. The other motors 502 in the cross configuration form four branches, each comprising two motors and extending from the central motor, offset from one another by 90 degrees. The four branches are associated with four directions (i.e. forward, backward, left, right). In this example, the linear array which is aligned longitudinally along the user's arm has one branch (closest to the user's hand) which is associated with the forward direction and an opposite branch (furthest from the user's hand) which is associated with the backward direction. The linear array which is aligned transversely across the user's arm has one branch associated with the left-hand direction and an opposite branch associated with the right-hand direction.
The system 100 comprises a global positioning system (GPS), for example integrated with the third feedback unit 500, the sensor unit 200, or the controller 206, for determining the location of the user. The system 100 also comprises either an integrated mapping system or a means for connecting to an external mapping system, such as a mapping application on a smartphone. The mapping system is configured to receive input of information (e.g. via user input, such as via a voice command, or from GPS information) relating to an origin location and a destination location and to calculate a route between these locations. The mapping system is configured then to transmit information relating to the route to the controller, for example information about the waypoints along the route, information about the directions and/or distances between the waypoints, and the action required (e.g. the turn direction required) at each waypoint. The controller is configured to control the motors 502 of the third feedback unit 500 to convey navigation instructions to the user, by way of haptic feedback, based on the received information relating to the route, to guide the user along the route.
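Purely by way of illustration, the route information handed to the controller might be represented as below, together with a helper for the user's proximity to the next waypoint; the field names are assumptions:

import math
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Waypoint:
    lat: float
    lon: float
    action: Optional[str] = None  # e.g. "turn_left", "turn_right", "arrive"

@dataclass
class Route:
    waypoints: List[Waypoint]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))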
The cross configuration of the motors provides a means for providing intuitive navigation instructions to the user by way of haptic feedback. This is achieved by the controller selectively activating motors in the branch associated with the direction in which the user should move. The controller may also adjust the characteristics of the haptic feedback provided by the motors in a branch or the number of motors activated in the branch.
An exemplary mode of control of the motors to convey navigation instructions to the user is as follows (a sketch of this scheme is given after the list):
* When, according to the planned route, the user should move forward, the controller activates the motors in the branch associated with the forward direction (i.e. the two motors in the branch closest to the user's hand) to provide a haptic feedback to the user, which the user interprets as a forward instruction
* As the user approaches a waypoint which, according to the planned route, requires a left-hand turn, the controller activates one of the motors in the branch associated with the left-hand direction (preferably only the motor in the left-hand branch closest to the centre of the cross configuration), to provide a haptic feedback which the user interprets as a warning that a left-hand turn is approaching
* As the user nears the left-hand turn, the controller reduces the strength of the motors associated with the forward direction, and/or deactivates one of those motors (preferably the motor furthest from the centre of the cross configuration, leaving the motor closest to the centre activated), which the user interprets as an instruction to slow down in the forward direction
* When the user arrives at the waypoint, the controller deactivates the remaining motor in the forward direction and activates the second motor in the left-hand direction, which the user interprets as an instruction to stop moving in the forward direction and turn left
* When the user has completed the left turn, the controller deactivates the motors associated with the left-hand direction and reactivates the motors associated with the forward direction, which the user interprets as an instruction to move forward in the new direction
* If a user misses a turning, the controller activates the motors associated with the backward direction, first by activating one of the motors (preferably the motor closest to the centre of the cross configuration) and, if the user continues to move further beyond the waypoint without turning, by also activating the second motor, which the user interprets as an instruction to turn around and walk back in the opposite direction
* If at any point an emergency stop is required, for example because a dangerous object has been detected ahead by the sensor unit, the central motor in the cross configuration is activated by the controller with a high strength that will be interpreted by the user as an instruction to stop immediately. When it is again safe to walk, the navigation instructions will restart, with the controller activating the motor(s) associated with the forward direction.
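Purely by way of illustration, the scheme listed above might be sketched as a simple mapping from the distance to the next waypoint (e.g. from the haversine helper above) and the turn it requires onto branch activations; the thresholds, branch names and motor indexing (0 nearest the centre of the cross, 1 the outer motor) are assumptions:

def navigation_command(distance_to_waypoint_m, turn=None, emergency_stop=False):
    """Return a dict of branch -> motor indices to activate on the cross configuration."""
    if emergency_stop:
        return {"centre": [0]}                 # strong stop cue from the central motor
    cmd = {"forward": [0, 1]}                  # default: keep walking forward
    if turn and distance_to_waypoint_m < 30:
        cmd[turn] = [0]                        # early warning: inner motor of the turn branch
    if turn and distance_to_waypoint_m < 10:
        cmd["forward"] = [0]                   # slow-down cue: drop the outer forward motor
    if turn and distance_to_waypoint_m < 3:
        cmd.pop("forward", None)               # stop the forward cue at the waypoint
        cmd[turn] = [0, 1]                     # full turn cue
    return cmd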
In this way, the controller provides navigation instructions to a user by way of haptic feedback, the haptic feedback comprising: the selective activation of the motors in a branch associated with a particular direction; the number of motors in the branch that are activated; and the strength of the haptic feedback provided by the activated motors, where the specific haptic feedback that is provided to the user depends on the position of the user relative to the waypoint along the route as described.
The controller also adjusts the navigation instructions provided to the user based on a record of previous journeys along the route selected by the user and/or based on the objects detected by the sensor unit. In the former case, the mapping system may calculate the route between the origin and destination locations at least partly on the basis of a record of previous journeys along a route between those locations; in this way, the mapping system can avoid areas (such as busy roads) that have previously caused difficulties or delays for the user, and thus provide a route which is easier and quicker for the user to navigate. In the latter case, the controller may adjust its navigation instructions on the fly to take account of obstacles identified by the sensor unit 200; for example, if at a junction two possible routes are available to reach the destination, the controller may instruct the user, via the third feedback unit 500, to follow the route which appears, based on the measurements of the sensor unit 200, to have fewer obstacles or fewer significant obstacles for the user.
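Purely by way of illustration, the two adjustments described above might be combined into a simple route score when choosing between candidate routes; the penalty weights are assumptions, not values from the description:

def score_route(base_time_s, past_difficulty_events, obstacles_detected,
                difficulty_penalty_s=60, obstacle_penalty_s=30):
    """Lower scores are preferred when choosing between candidate routes."""
    return (base_time_s
            + difficulty_penalty_s * past_difficulty_events
            + obstacle_penalty_s * obstacles_detected)

def choose_route(candidates):
    """candidates: iterable of dicts with 'base_time_s', 'past_difficulty_events'
    and 'obstacles_detected' (plus any identifying fields)."""
    return min(candidates, key=lambda c: score_route(
        c["base_time_s"], c["past_difficulty_events"], c["obstacles_detected"]))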
Alternatives and modifications
While the sensors have been described above as being cameras, in other examples the sensors may be electromagnetic radiation based sensors (e.g. LiDAR sensors), ultrasonic sensors, or other sensors.
Similarly, while the actuators have been described above as being motors providing vibrational haptic feedback, the motors could instead provide a different haptic feedback (e.g. a force against the user). Alternatively, the actuators may not be motors at all, and may be alternative means for providing a haptic or tactile feedback to the user.
It will be understood that the invention has been described above purely by way of example, and modifications of detail can be made within the scope of the invention.
Each feature disclosed in the description, and (where appropriate) the claims and drawings may be provided independently or in any appropriate combination.
Reference numerals appearing in the claims are by way of illustration only and shall have no limiting effect on the scope of the claims.

Claims (24)

  1. A personal assistance system, comprising: a sensor unit comprising at least one sensor arranged to obtain data from the surrounding environment; at least one feedback unit comprising at least one actuator arranged to provide haptic feedback to a user; and a controller arranged to receive the data from the at least one sensor and to control the at least one actuator; wherein the controller is configured to: process the data from the at least one sensor to detect an object in the surrounding environment and to determine the proximity of the object and at least one further property of the object; and control the at least one actuator in dependence on the proximity and the at least one further property of the object.
  2. A system according to Claim 1, wherein the controller is configured to control the at least one actuator to provide a haptic feedback having a characteristic that is dependent on, preferably correlated with, the determined proximity and/or the at least one further property of the object.
  3. A personal assistance system comprising: a sensor unit comprising at least one sensor arranged to obtain data from the surrounding environment; at least one feedback unit comprising at least one actuator arranged to provide haptic feedback; and a controller arranged to receive the data from the at least one sensor and to control the at least one actuator, wherein the controller is configured to: process the data from the at least one sensor to detect an object in the surrounding environment and to determine at least one property of the object; and control the at least one actuator to provide a haptic feedback having a characteristic that is correlated with the determined at least one property of the object.
  4. A system according to Claim 2 or 3, wherein the characteristic of the haptic feedback provided by the at least one actuator is at least one of: a strength of the haptic feedback provided by the at least one actuator; and/or a frequency of the haptic feedback provided by the at least one actuator.
  5. A system according to any preceding claim, wherein the at least one property, or at least one further property, of the object determined by the controller is an identity of the object; preferably wherein the identity of the object is determined based on at least one physical attribute of the object detected by the at least one sensor, the at least one physical attribute preferably being at least one of: a shape of the object or part of the object; a colour of the object or part of the object; and/or a movement pattern of the object across the field of view of the at least one sensor; more preferably wherein the identity of the object is determined based on the data from the at least one sensor and stored data relating to objects of known identities, yet more preferably wherein the identity of the object is determined using image processing software comprising machine learning algorithms.
  6. A system according to any preceding claim, wherein the at least one property, or at least one further property, of the object determined by the controller is at least one of: a position of the object relative to the sensor unit; a proximity of the object relative to the sensor unit; a size of the object relative to other objects within a field of view of the at least one sensor; an absolute size of the object; and a significance of the object.
  7. A system according to Claim 6, wherein: the controller is configured to determine the position of the object relative to the sensor unit based on the position of the object in the field of view of the at least one sensor; and/or the controller is configured to determine the proximity of the object relative to the sensor unit based on the position of the object relative to other surrounding objects and/or the extent of the object as a proportion of the field of view of the at least one sensor; and/or the sensor unit comprises at least two sensors, and the controller is configured to determine the proximity of the object relative to the sensor unit based on a comparison of the positions of the object in each of the sensors' fields of view; and/or the controller is configured to determine the size of the object relative to other objects within a field of view of the at least one sensor by comparing the relative extents of the objects as a proportion of the field of view of the at least one sensor; and/or the controller is configured to determine the absolute size of the object based on: a determined identity of the object and predetermined information relating to the absolute size of objects of the determined identity; and/or a determined proximity of the object and a determined extent of the object relative to the field of view of the at least one sensor; and/or the controller is configured to determine the significance of the object based on a determined identity of the object and predetermined information relating to the danger associated with objects of the determined identity.
  8. A system according to any preceding claim, wherein the sensor unit comprises an array of actuators, and the controller is configured to control the actuators in a region of the array to provide haptic feedback having a characteristic that is dependent on, preferably correlated with, at least one determined property of the object.
  9. A system according to Claim 8, wherein the characteristic of the haptic feedback provided by the actuators in the region of the array is at least one of: an intensity of haptic feedback provided by the actuators in the region, the intensity preferably being the average strength of the haptic feedback across the region; an extent of the region in the array; a shape of the region in the array; and/or a location of the region in the array.
  10. A system according to any preceding claim, wherein the at least one actuator comprises at least one motor, and/or wherein the haptic feedback comprises a vibration.
  11. A system according to any preceding claim, wherein the at least one sensor comprises at least one camera, preferably a set of two cameras, more preferably wherein the at least one camera is at least one 360 degree camera.
  12. A system according to any preceding claim, wherein at least one of the sensor unit, feedback unit, and controller are arranged to be worn by a user, preferably wherein the sensor unit and feedback unit are arranged to be worn by a user, more preferably wherein the sensor unit is configured to be worn over the torso of a user, and yet more preferably wherein the at least one sensor is mounted on the sensor unit such that, in use, the at least one sensor is located on the shoulder of the user.
  13. A system according to any preceding claim, wherein the controller is integrated with either or both of the sensor unit or the feedback unit.
  14. A system according to any preceding claim, further comprising a belt arranged to be worn around the torso, preferably the abdomen, of a user, the belt comprising one of the at least one feedback units, preferably wherein the system comprises an array of actuators, wherein the array of actuators is arranged as a grid on the belt, such that in use the actuators provide haptic feedback to the torso, preferably the abdomen, of a user.
  15. A system according to any preceding claim, wherein the controller is configured to: process the data from the at least one sensor to determine the position of a user's limb relative to the detected object; and control the at least one actuator in dependence on the determined position of the user's limb relative to the detected object.
  16. A system according to Claim 15, wherein the controller is configured to control the at least one actuator by adjusting a characteristic of the haptic feedback provided by the or each actuator in dependence on, preferably in correlation with, the determined position of the user's limb relative to the detected object.
  17. A system according to any of Claims 15 or 16, comprising an array of actuators, wherein the actuators are arranged in groups, wherein: the controller is configured to control the groups of actuators in dependence on the direction from the user's limb to the detected object, preferably to activate the actuators in a group indicative of the direction from the user's limb to the detected object; and/or at least one of the groups is arranged in a linear configuration, and wherein the controller is configured to vary the strength of the haptic feedback provided by each actuator in the group along the length of the linear configuration in dependence on the proximity and/or direction from the user's limb to the detected object, preferably wherein the groups of actuators are distributed substantially equidistant from one another around the feedback unit, more preferably wherein each group consists of three actuators.
  18. A system according to any of Claims 15 to 17, wherein the controller is configured to control the at least one actuator to provide a signature haptic feedback when it is determined that the position of the user's limb corresponds with the position of the object, preferably wherein the signature haptic feedback comprises at least one of: activating a specific actuator or group of actuators; controlling an actuator or group of actuators to provide haptic feedback with a particular characteristic; and activating a group of actuators in a specific order, pattern, or configuration; more preferably wherein the controller is configured to control the at least one actuator to provide at least one of: a first signature haptic feedback when it is determined that the vertical position of the user's limb corresponds with the vertical position of the object; a second signature haptic feedback when it is determined that the horizontal position of the user's limb corresponds with the horizontal position of the object; and a third signature haptic feedback when it is determined that both the vertical and horizontal positions of the user's limb correspond with the vertical and horizontal position of the object.
  19. A system according to any of Claims 15 to 18, wherein the system comprises a glove and/or wristband, and wherein the glove and/or wristband comprises one of the at least one feedback units, preferably wherein the glove and/or wristband comprises a movable portion arranged to be moved, in use, to overlap a user's palm, wherein at least one actuator is located on the movable portion, more preferably wherein the at least one actuator located on the movable portion is configured to provide the signature haptic feedback.
  20. A system according to any preceding claim, wherein the controller is configured to: receive information relating to a route; and control the at least one actuator in dependence on the information relating to the route, preferably wherein the controller is configured to control the at least one actuator to provide haptic feedback representative of navigation instructions along the route.
  21. A system according to Claim 20, comprising a user device configured to receive user input of the route, and transmit information relating to the route to the controller, preferably wherein the system comprises means for determining a position of the user relative to the route, and wherein the controller is configured to control the at least one actuator in dependence on the determined position of the user relative to the route.
  22. A system according to any of Claims 20 or 21, comprising an array of actuators, wherein the array of actuators comprises two groups of actuators arranged in linear configurations, and wherein the controller is configured to vary the strength of the haptic feedback provided by each actuator in the group along the length of the linear configurations to provide haptic feedback representative of navigation instructions along the route.
  23. A system according to Claim 22, wherein the two groups of actuators are arranged to intersect one another, preferably such that one of the groups is oriented substantially perpendicularly to the other, more preferably wherein the intersecting groups of actuators form multiple branches of actuators, each branch being indicative of a navigation direction, yet more preferably wherein the controller is configured to: activate at least one of the actuators of a branch in dependence on the information relating to the route so as to provide haptic feedback indicative of the navigation direction corresponding to that branch; and preferably control the actuators of the branch in dependence on the determined position of the user relative to the route, preferably wherein the controller is configured to control the actuators of the branch in dependence on the proximity of the user to a waypoint on the route; and yet more preferably adjust at least one of: the strength of haptic feedback of the actuators in the branch in dependence on the proximity of the user to a waypoint on the route; and the number of actuators in the branch that are activated.
  24. A personal assistance system comprising: a sensor unit comprising at least one sensor arranged to obtain data from the surrounding environment; at least one feedback unit comprising at least one actuator arranged to provide haptic feedback; and a controller arranged to receive the data from the at least one sensor and to control the at least one actuator; wherein the system comprises means for receiving a user input of information relating to an object; and wherein the controller is configured to process the data from the at least one sensor to: detect the object in the surrounding environment based on the data from the at least one sensor and the user input information relating to the object; determine at least one property of the detected object; and control the at least one actuator in dependence on the at least one property of the object, preferably to convey to a user information relating to the location of the object.

  A computer implemented method comprising: receiving, from at least one sensor of a sensor unit, data relating to a surrounding environment; processing the data to detect an object in the surrounding environment and to determine at least one property of the object; and controlling an array of actuators of a feedback unit to provide haptic feedback, said controlling comprising controlling the actuators in dependence on the determined at least one property of the object.
GB2206522.1A 2022-05-04 2022-05-04 Personal assistance systems and methods Pending GB2622184A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2206522.1A GB2622184A (en) 2022-05-04 2022-05-04 Personal assistance systems and methods

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2206522.1A GB2622184A (en) 2022-05-04 2022-05-04 Personal assistance systems and methods

Publications (2)

Publication Number Publication Date
GB202206522D0 GB202206522D0 (en) 2022-06-15
GB2622184A true GB2622184A (en) 2024-03-13

Family

ID=81943975

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2206522.1A Pending GB2622184A (en) 2022-05-04 2022-05-04 Personal assistance systems and methods

Country Status (1)

Country Link
GB (1) GB2622184A (en)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6055048A (en) * 1998-08-07 2000-04-25 The United States Of America As Represented By The United States National Aeronautics And Space Administration Optical-to-tactile translator
US20080088469A1 (en) * 2005-01-13 2008-04-17 Siemens Aktiengesellschaft Device for Communicating Environmental Information to a Visually Impaired Person
ES1073252U (en) * 2010-09-30 2010-11-25 Gerardo Labernia Tomas Touching device and volumetric perception (Machine-translation by Google Translate, not legally binding)
WO2010142689A2 (en) * 2009-06-08 2010-12-16 Kieran O'callaghan An object detection device
WO2013018090A1 (en) * 2011-08-01 2013-02-07 Abir Eliahu System and method for non-visual sensory enhancement
WO2014066516A1 (en) * 2012-10-23 2014-05-01 New York University Somatosensory feedback wearable object
US20150196101A1 (en) * 2014-01-14 2015-07-16 Toyota Motor Engineering & Manufacturing North America, Inc. Smart necklace with stereo vision and onboard processing
US20160127698A1 (en) * 2014-11-04 2016-05-05 iMerciv Inc. Apparatus and method for detecting objects
US20180012376A1 (en) * 2016-07-08 2018-01-11 Toyota Motor Engineering & Manufacturing North America, Inc. Aligning vision-assist device cameras based on physical characteristics of a user
US20180078444A1 (en) * 2016-09-17 2018-03-22 Noah Eitan Gamerman Non-visual precision spatial awareness device.
US20180303702A1 (en) * 2017-04-20 2018-10-25 Neosensory, Inc. Method and system for providing information to a user
US20190070064A1 (en) * 2016-03-07 2019-03-07 Wicab, Inc. Object detection, analysis, and alert system for use in providing visual information to the blind
WO2019156990A1 (en) * 2018-02-09 2019-08-15 Vasuyantra Corp., A Delaware Corporation Remote perception of depth and shape of objects and surfaces
WO2021072460A1 (en) * 2019-10-15 2021-04-22 Sedlackova Katerina Clothing item
WO2022061380A1 (en) * 2020-09-22 2022-03-31 Thomas Scheu Guide apparatus for persons with impaired vision
GB2599471A (en) * 2021-05-20 2022-04-06 Hope Tech Plus Ltd System and method for guiding user

Also Published As

Publication number Publication date
GB202206522D0 (en) 2022-06-15

Similar Documents

Publication Publication Date Title
EP1293184B1 (en) Walking auxiliary for person with dysopia
US8803699B2 (en) Object detection device
US20190235630A1 (en) Haptic guidance system
US8204679B2 (en) Mobile apparatus, control device and control program
KR101898582B1 (en) A stick for the blind
KR101715472B1 (en) Smart walking assistance device for the blind and Smart walking assistance system using the same
CA2250961A1 (en) X-ray guided surgical location system with extended mapping volume
JP2003340764A (en) Guide robot
US11497673B2 (en) Motion-liberating smart walking stick
US8825389B1 (en) Mobility device and method for guiding the visually impaired
KR101893374B1 (en) A stick for the blind
KR20160125215A (en) The drone, the route guidance drone set and the method of route guidance using them
CN104020446A (en) Autonomous navigation and positioning system in intelligent nursing bed and positioning and navigation method
US11703881B2 (en) Method of controlling a guide machine and a navigation system
TW201831920A (en) Auto moving device
US10593058B2 (en) Human radar
GB2622184A (en) Personal assistance systems and methods
Gurubaran et al. A survey of voice aided electronic stick for visually impaired people
KR101398880B1 (en) Wearable robot with humanoid function and control method of the same
JP4839939B2 (en) Autonomous mobile device
Bolla et al. Object Detection in Computer Vision Using Machine Learning Algorithm For Visually Impaired People
JP2006258717A (en) Distance-measuring system
Dias A sensor platform for the visually impaired to walk straight avoiding obstacles
KR101880611B1 (en) Space perecption device
KR102472144B1 (en) Interaction robot, and control method for the same