US20210345913A1 - System and Method for Detecting Handwriting Problems - Google Patents

System and Method for Detecting Handwriting Problems

Info

Publication number
US20210345913A1
Authority
US
United States
Prior art keywords
handwriting
neural network
instrument
strokes
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/306,507
Inventor
Eric Humbert
Amélie Caudron
Arthur Belhomme
Fabrice Devige
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Invoxia SAS
Original Assignee
Invoxia SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Invoxia SAS filed Critical Invoxia SAS
Publication of US20210345913A1 publication Critical patent/US20210345913A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545Pens or stylus
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1114Tracking parts of the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1124Determining motor skills
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/40Detecting, measuring or recording for evaluating the nervous system
    • A61B5/4076Diagnosing or monitoring particular conditions of the nervous system
    • A61B5/4088Diagnosing of monitoring cognitive diseases, e.g. Alzheimer, prion diseases or dementia
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06K9/00416
    • G06K9/00429
    • G06K9/224
    • G06K9/6256
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/22Character recognition characterised by the type of writing
    • G06V30/228Character recognition characterised by the type of writing of three-dimensional handwriting, e.g. writing in the air
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/32Digital ink
    • G06V30/333Preprocessing; Feature extraction
    • G06V30/347Sampling; Contour coding; Stroke extraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/32Digital ink
    • G06V30/36Matching; Classification
    • G06V30/373Matching; Classification using a special pattern or subpattern alphabet
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/02Operational features
    • A61B2560/0204Operational features of power management
    • A61B2560/0209Operational features of power management adapted for power saving
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/02Operational features
    • A61B2560/0204Operational features of power management
    • A61B2560/0214Operational features of power management of power generation or supply
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0247Pressure sensors
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0257Proximity sensors
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A61B5/002Monitoring the patient using a local or closed circuit, e.g. in a room or building
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1121Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B5/1122Determining geometric values, e.g. centre of rotation or angular range of movement of movement trajectories
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1124Determining motor skills
    • A61B5/1125Grasping motions of hands
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Evolutionary Computation (AREA)
  • Psychiatry (AREA)
  • Mathematical Physics (AREA)
  • Fuzzy Systems (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Primary Health Care (AREA)
  • Evolutionary Biology (AREA)
  • Epidemiology (AREA)
  • Databases & Information Systems (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Character Discrimination (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Neurology (AREA)

Abstract

A method for detecting handwriting problems, comprising:
    • acquiring, by a handwriting instrument comprising one motion sensor, motion data while a user is using the handwriting instrument,
    • analyzing the motion data by an artificial intelligence trained to detect a handwriting problem.

Description

  • This application claims priority from European patent application EP20305430.9, filed on 4 May 2020, the content of which is incorporated herein by reference.
  • TECHNICAL FIELD
  • This disclosure pertains to the field of systems and methods for detecting handwriting problems.
  • BACKGROUND ART
  • Currently, several systems or methods exist for detecting particular handwriting problems. Among these handwriting problems, one can find dyslexia, learning disabilities, difficulty in forming letters, etc.
  • Currently, the methods used to detect handwriting problems are based on the use of a stylus pen with which a user is asked to write on a mobile terminal specifically dedicated to detecting handwriting problems. Such a method is disclosed, for example, in U.S. Pat. No. 6,304,667. In this document, the user writes on a mobile terminal such as an electronic tablet. The tablet comprises an embedded pressure sensor able to measure the pressure of the stylus on the electronic tablet while the user is writing. Furthermore, the method uses a database of distorted letters with which the letters written by the user are compared. This method allows handwriting problems such as dyslexia to be detected. However, the equipment used to detect the problem is very specific and does not make it easy to detect handwriting problems for a large number of people.
  • SUMMARY
  • One purpose of this disclosure is to improve the situation.
  • A system for detecting handwriting problems is proposed, comprising:
      • a handwriting instrument including a body extending longitudinally between a first end and a second end, said first end having a writing tip which is able to write on a support, said handwriting instrument further including at least one motion sensor configured to acquire data on the handwriting of a user when the user is using said handwriting instrument,
      • a calculating unit communicating with said motion sensor and configured to analyze said data using an artificial intelligence trained to detect whether said user has handwriting problems.
  • This system presents the technical advantage of being able to detect a handwriting problem with a regular handwriting instrument. The proposed system is easy to use since no additional electronic components, such as a dedicated tablet and stylus, are needed.
  • The following features can be optionally implemented, separately or in combination with one another:
  • The motion sensor and the calculating unit are embedded in a second extremity of the handwriting instrument.
  • The system further comprises a detection device, said detection device comprising the motion sensor and the calculation unit, said detection device being mounted on the second extremity of the handwriting instrument.
  • The system can then be used with any already existing handwriting instrument.
  • The motion sensor is embedded in the handwriting instrument, the handwriting instrument further including a short-range radio communication interface configured to communicate raw data acquired by the motion sensor to a mobile device comprising said calculating unit via a communication interface of said mobile device.
  • The motion sensor comprises two three-axis accelerometers.
  • Therefore, the motion sensor used to detect handwriting problems consumes little power.
  • The handwriting instrument comprises two motion sensors being one three-axis accelerometer and one three-axis gyroscope.
  • This increases the precision of the acquired motion data.
  • The gyroscope comprises a wake-up input suited for receiving a wake-up signal from said calculating unit when a movement is detected by the accelerometer, said gyroscope being configured for switching into an active state when said wake-up signal is received.
  • This configuration reduces the power consumption of the system; a minimal control-loop sketch of such a scheme is given below.
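  • By way of illustration only, the following Python sketch shows one way such a wake-up scheme could be organized. The driver objects (accel, gyro, calculating_unit) and their methods are hypothetical placeholders, not an API described in the disclosure, and the threshold and timeout values are arbitrary.

```python
import time

MOTION_THRESHOLD_G = 0.05   # illustrative wake-up threshold, in g
IDLE_TIMEOUT_S = 5.0        # put the gyroscope back to sleep after this much stillness

def power_managed_acquisition(accel, gyro, calculating_unit):
    """Keep the gyroscope asleep until the accelerometer reports movement.

    Only the low-power accelerometer runs continuously; the gyroscope is
    activated by a wake-up signal and deactivated again when writing stops.
    """
    gyro.set_active(False)
    last_motion = 0.0
    while True:
        sample = accel.read()                       # always-on, low-power sensor
        if sample.magnitude() > MOTION_THRESHOLD_G:
            last_motion = time.monotonic()
            if not gyro.is_active():
                calculating_unit.send_wakeup(gyro)  # wake-up signal to the gyroscope
        elif gyro.is_active() and time.monotonic() - last_motion > IDLE_TIMEOUT_S:
            gyro.set_active(False)                  # writing has stopped: sleep again
        gyro_sample = gyro.read() if gyro.is_active() else None
        calculating_unit.record(sample, gyro_sample)
```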
  • The system further comprises a pressure sensor embedded in the pen or the pencil, said calculating unit being configured to further receive data acquired by said pressure sensor.
  • The system is then able to detect handwriting problems on the basis of different kinds of data: motion data, pressure data, etc.
  • The system further comprises a stroke sensor configured to acquire stroke data while the user is using the handwriting instrument, said artificial intelligence being trained with said stroke data to determine handwriting problems.
  • The artificial intelligence is then able to determine when the user is actually using the handwriting instrument on a support and to differentiate the data corresponding to an actual use of the handwriting instrument from the data acquired while the handwriting instrument is merely held in the air.
  • The stroke sensor is the motion sensor.
  • Advantageously, the stroke sensor is a pressure sensor, or a contact sensor, or a vibration sensor.
  • Advantageously, the system does not use another embedded sensor. The system remains compact and low-power consuming.
  • In another aspect, a method for detecting handwriting problems is proposed, comprising:
      • acquiring, by means of a handwriting instrument comprising at least one motion sensor, motion data while a user is using said handwriting instrument,
      • analyzing said motion data by an artificial intelligence trained to detect a handwriting problem.
  • The following features can be optionally implemented, separately or in combination with one another:
  • The artificial intelligence is a neural network.
  • Preferably, the neural network is a deep neural network.
  • The method further comprises a prior learning step comprising:
      • acquiring a plurality of motion data from a plurality of persons using said handwriting instrument,
      • labeling said acquired data,
      • using end-to-end supervised learning on the labeled data to train the neural network until it converges,
      • storing said neural network.
  • The neural network can then be trained by means of end-to-end classification, which improves the precision of the results; a minimal training sketch follows below.
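  • As a rough illustration of such end-to-end training, the sketch below uses PyTorch (an assumption; the disclosure does not name a framework). The window shape, network size and class list are illustrative only.

```python
import torch
import torch.nn as nn

# Illustrative class list loosely based on the global labels mentioned in this disclosure.
CLASSES = ["no_problem", "dyslexia", "dysgraphia", "wrong_grip", "bad_character_writing"]

class MotionClassifier(nn.Module):
    """Small 1D CNN over raw motion windows (channels = sensor axes)."""

    def __init__(self, n_channels: int = 6, n_classes: int = len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(64, n_classes)

    def forward(self, x):                    # x: (batch, channels, time)
        return self.head(self.features(x).squeeze(-1))

def train(model, loader, epochs: int = 10):
    """End-to-end supervised learning on labelled raw motion windows."""
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for windows, labels in loader:       # windows: (B, C, T), labels: (B,)
            optimizer.zero_grad()
            loss = loss_fn(model(windows), labels)
            loss.backward()
            optimizer.step()
    return model
```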
  • The acquired data are classified in at least one of the following classes:
      • type of grip on the handwriting instrument,
      • pressure applied on the handwriting instrument,
      • use of the handwriting instrument among writing, drawing or coloring,
      • fluidity of writing,
      • dyslexia,
      • dysgraphia,
      • wrong ductus.
  • The system can then detect many different and specific handwriting problems.
  • The method further comprises acquiring vibration data by a stroke sensor, the method further comprising a prior learning step comprising:
      • acquiring a plurality of motion data and vibration data from a plurality of persons using said handwriting instrument,
      • processing the vibration data to obtain stroke timestamp labels,
      • using supervised learning on the processed data to train said neural network until it converges,
      • storing said neural network.
  • The neural network can then be trained by segmentation and classification of strokes. The size of this neural network is then smaller than that of a neural network trained by end-to-end classification.
  • The features extracted from the stroke timestamps comprise:
      • total strokes duration,
      • total in-air stroke duration,
      • strokes mean duration,
      • strokes mean and peak velocity,
      • number of pauses during use of the handwriting instrument,
      • ballistic index, an indicator of handwriting fluency that measures the smoothness of the movement, defined as the ratio between the number of zero crossings in the acceleration and the number of zero crossings in the velocity,
      • number of zero crossings in the acceleration during strokes,
      • number of zero crossings in the velocity during strokes.
  • The classification of the stroke timestamp features is performed by a hand-crafted algorithm or a learned model.
  • Two approaches can then be used to extract the features.
  • The extracted features of the stroke timestamps and the motion data are classified in at least one of the following classes:
      • type of grip on the handwriting instrument,
      • pressure applied on the handwriting instrument,
      • use of the handwriting instrument among writing, drawing or coloring,
      • fluidity of writing,
      • dyslexia,
      • dysgraphia,
      • wrong ductus.
  • The neural network is further trained with a database of correctly formed letters and numbers, a sequence of strokes and a direction of said strokes being associated with each letter and number of the database, and, based on the motion and vibration data acquired during use of the handwriting instrument, the neural network determines whether the user is forming letters and numbers correctly.
  • The system is then adapted to detect many different handwriting problems, with high precision, a small number of components and easy use.
  • BRIEF DESCRIPTION OF DRAWINGS
  • Other features, details and advantages will be shown in the following detailed description and on the figures, on which:
  • FIG. 1 is an illustration of a system for detecting handwriting problems according to a first embodiment.
  • FIG. 2 is a block diagram of the system illustrated in FIG. 1.
  • FIG. 3 is an illustration of a system for detecting handwriting problems according to a second embodiment.
  • FIG. 4 is a block diagram of the system illustrated in FIG. 3.
  • FIG. 5 is an illustration of a system for detecting handwriting problems according to a third embodiment.
  • FIG. 6 is an illustration of a system for detecting handwriting problems according to a fourth embodiment.
  • FIG. 7 is an illustration of a system for detecting handwriting problems according to an alternative embodiment of FIGS. 1 and 2.
  • FIG. 8 is a block diagram illustrating the training phase of the neural network according to a first embodiment.
  • FIG. 9 is a block diagram illustrating the training phase of the neural network according to a second embodiment.
  • FIGS. 10A to 10C illustrate block diagrams of the collection phase, training phase and inference phase of the trained neural network.
  • DESCRIPTION OF EMBODIMENTS
  • The figures and the following detailed description contain, for the most part, elements of a definite nature. They can be used to enhance understanding of the disclosure and, also, to define the invention if necessary.
  • Reference is now made to FIGS. 1 to 7, which illustrate embodiments of a system 1 for detecting handwriting problems. The same reference numbers are used to describe identical elements of the system.
  • In an embodiment, a handwriting problem which can be detected according to the present disclosure can be dyslexia, dysgraphia or a difficulty to reproduce characters.
  • FIGS. 1 and 2 generally illustrate a system 1 according to a first embodiment. The system 1 comprises a handwriting instrument 2. Typically, the handwriting instrument 2 can be a pen, a pencil, a brush or any element with which a user can write or draw on a support. Typically, the support can be paper, canvas, or any surface on which a user can write or draw. The support can also be a coloring book.
  • The handwriting instrument 2 comprises a body 3 extending longitudinally between a first end 4 and a second end 5. The first end 4 comprises a writing tip 6 which is able to write on a support. Typically, the tip 6 can deliver ink or color.
  • The handwriting instrument 2 further includes at least one motion sensor 7. In one embodiment, the motion sensor 7 can be a three-axis accelerometer or a three-axis gyroscope.
  • In the embodiments illustrated in FIGS. 1 to 7, the handwriting instrument 2 preferably includes two motion sensors 7. In a preferred embodiment, the handwriting instrument 2 comprises two three-axis accelerometers. In another preferred embodiment, the handwriting instrument 2 comprises one three-axis accelerometer and one three-axis gyroscope.
  • The at least one motion sensor 7 is able to acquire data on the handwriting of the user when the user is using the handwriting instrument 2. These data are communicated to a calculating unit 8 which is configured to analyze the data and detect a possible handwriting problem of the user. The calculating unit 8 can comprise a volatile memory to store the data acquired by the motion sensor 7 and a non-volatile memory to store a model enabling the detection of handwriting problems.
  • The handwriting instrument 2 can also comprise a short-range radio communication interface 9 allowing the communication of data between the motion sensor 7 and the calculating unit 8. In one embodiment, the short-range radio communication interface uses a Wi-Fi, Bluetooth®, LORA®, SigFox® or NBIoT network. In another embodiment, it can also communicate using a 2G, 3G, 4G or 5G network.
  • The handwriting instrument 2 further includes a battery 10 providing power to at least the motion sensor 7 when the user is using the handwriting instrument. The battery 10 can also provide power to the calculating unit 8 when the calculating unit is included in the handwriting instrument 2.
  • More specifically, in the embodiment of FIGS. 3 and 4, the handwriting instrument 2 comprises the at least one motion sensor 7, the short-range radio communication interface 9 and the battery 10. The system 1 further comprises a mobile device 11, distinct from the handwriting instrument 2. The mobile device 11 can typically be an electronic tablet, a mobile phone or a computer. The mobile device 11 comprises the calculating unit 8. The mobile device 11 further comprises a short-range radio communication interface 12 enabling communication between the calculating unit 8 and the handwriting instrument 2.
  • In this embodiment, the calculating unit 8 of the mobile device 11 receives the raw data acquired by the motion sensor 7 and analyzes them to detect a possible handwriting problem.
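  • Purely for illustration, the raw samples could be serialized for the radio link as fixed-size records, as in the Python sketch below. The packet layout is an assumption made for this example and is not specified in the disclosure.

```python
import struct
from typing import Iterable, Iterator, Tuple

# One raw motion sample: timestamp in ms (uint32) followed by three accelerometer
# axes and three gyroscope axes as signed 16-bit raw counts (assumed layout).
SAMPLE_FORMAT = "<I6h"
SAMPLE_SIZE = struct.calcsize(SAMPLE_FORMAT)

Sample = Tuple[int, int, int, int, int, int, int]

def pack_samples(samples: Iterable[Sample]) -> bytes:
    """Serialize raw motion samples on the handwriting-instrument side."""
    return b"".join(struct.pack(SAMPLE_FORMAT, *s) for s in samples)

def unpack_samples(payload: bytes) -> Iterator[Sample]:
    """Recover the raw samples on the mobile-device side before analysis."""
    for offset in range(0, len(payload), SAMPLE_SIZE):
        yield struct.unpack_from(SAMPLE_FORMAT, payload, offset)
```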
  • In another embodiment, illustrated in FIGS. 5 and 6, the motion sensors 7, the calculating unit 8, the short-range radio communication interface 9 and the battery 10 are not embedded in the handwriting instrument 2. These electronics 20 can be comprised in a detection device 13, distinct from the handwriting instrument 2. The detection device 13 can be mounted on the second end 5 of the handwriting instrument 2.
  • In this embodiment, the detection device 13 comprises a body 14 to be mounted on the second end 5 of the handwriting instrument 2 and a protuberant tip 15 able to be inserted into the body 3 of the handwriting instrument 2. Preferably, one motion sensor 7 can be provided on the protuberant tip 15 and another motion sensor 7 can be provided in the body 14 of the detection device 13. By this means, the two motion sensors 7 are able to acquire different data during the handwriting of the user.
  • In another embodiment, the motion sensors 7 are provided in the body 14 of the detection device 13. By this means, the detection device 13 can be mounted on any type of handwriting instrument 2, without requiring the body 3 of the handwriting instrument 2 to be hollow.
  • In another embodiment, illustrated in FIG. 7, the at least one motion sensor 7, the calculating unit 8, the short-range radio communication interface 9 and the battery 10 are directly embedded in the handwriting instrument 2.
  • In this embodiment, one motion sensor 7 can be provided close to the first end 4 of the handwriting instrument 2, while another motion sensor 7 can be provided on the second end 5 of the handwriting instrument 2.
  • In an embodiment, the handwriting instrument 2 can also comprise a pressure sensor able to acquire data. These data can be transmitted to the calculating unit, which analyzes them together with the data acquired by the at least one motion sensor 7.
  • The pressure sensor can be embedded in the handwriting instrument 2 or in the detection device 13.
  • In all the embodiments described above, the calculating unit 8 receives the data acquired from the at least one motion sensor 7 and, if applicable, from the pressure sensor, in order to analyze them and detect a handwriting problem.
  • More specifically, the calculating unit 8 can store an artificial intelligence model able to analyze the data acquired by the motion sensor 7. The artificial intelligence can comprise a trained neural network.
  • In one embodiment, illustrated in FIG. 8, the neural network is trained according to a method using intermediate feature extraction.
  • More particularly, at step S1, the motion sensor 7 acquires data during the use of the handwriting instrument 2.
  • At step S2, the neural network receives the raw signals of the data acquired at step S1. The neural network also receives the sample labels at step S3. These labels correspond to whether or not the signal corresponds to a stroke.
  • More precisely, the neural network is able to determine if the signal corresponds to a stroke on a support. The neural network is then able to determine stroke timestamps.
  • More particularly, this means that the neural network is able to determine, for each stroke timestamp, whether a stroke has actually been made on the support by the user during the use of the handwriting instrument 2.
  • At step S4, the calculating unit 8 performs stroke feature extraction to obtain intermediate features at step S5.
  • These intermediate features comprise, but are not limited to:
      • total strokes duration,
      • total in-air stroke duration,
      • strokes mean duration,
      • strokes mean and peak velocity,
      • number of pauses during use of the handwriting instrument,
      • ballistic index, an indicator of handwriting fluency that measures the smoothness of the movement, defined as the ratio between the number of zero crossings in the acceleration and the number of zero crossings in the velocity,
      • number of zero crossings in the acceleration during strokes,
      • number of zero crossings in the velocity during strokes.
  • From these intermediate features, the neural network is able to derive indications about handwriting problems.
  • At step S6, an algorithm is able to derive indications about handwriting problems.
  • This algorithm can be a learned model such as a second neural network, or a handcrafted algorithm.
  • In the embodiment where a learned model such as a neural network is used, the model is trained on a supervised classification task, where the inputs are stroke features with labels, and the outputs are handwriting problems.
  • In the embodiment where a hand-crafted algorithm is used, the hand-crafted algorithm can compute statistics on the stroke features and compare them to thresholds found in the scientific literature, in order to detect handwriting problems.
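  • The sketch below illustrates, under assumptions, what such hand-crafted feature statistics could look like in Python: the stroke representation, the fixed threshold and the single fluency rule are placeholders, not values or criteria taken from the disclosure or from the literature.

```python
import numpy as np

def zero_crossings(signal: np.ndarray) -> int:
    """Count sign changes in a signed 1D profile (e.g., acceleration along one axis)."""
    return int(np.sum(np.signbit(signal[:-1]) != np.signbit(signal[1:])))

def ballistic_index(acceleration: np.ndarray, velocity: np.ndarray) -> float:
    """Zero crossings in the acceleration divided by zero crossings in the velocity."""
    return zero_crossings(acceleration) / max(zero_crossings(velocity), 1)

def stroke_features(strokes) -> dict:
    """Compute intermediate features from stroke segments.

    `strokes` is assumed to be a list of dicts with 'duration' (seconds) and
    signed 'acceleration' and 'velocity' profiles, as produced at step S5.
    """
    durations = np.array([s["duration"] for s in strokes])
    return {
        "total_stroke_duration": float(durations.sum()),
        "mean_stroke_duration": float(durations.mean()),
        "mean_ballistic_index": float(np.mean(
            [ballistic_index(s["acceleration"], s["velocity"]) for s in strokes])),
    }

def flag_fluency_problem(features: dict, ballistic_threshold: float = 3.0) -> bool:
    """Toy hand-crafted rule: flag a fluency problem above an arbitrary threshold."""
    return features["mean_ballistic_index"] > ballistic_threshold
```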
  • Finally, at step S7, the system is able to detect handwriting problems. These handwriting problems include but are not limited to:
      • dyslexia,
      • dysgraphia,
      • wrong grip of the handwriting instrument,
      • bad character writing.
  • In another embodiment, illustrated in FIG. 9, the neural network is trained by end-to-end classification.
  • According to this embodiment, at step S10, the data are acquired by the motion sensor 7.
  • The classification is made in step S11. To learn the classification task, the neural network receives the raw signal of the data acquired by the motion sensor 7 and global labels (step S12). The global labels correspond to the handwriting problems to be detected by the neural network, which can be, but are not limited to:
      • dyslexia,
      • dysgraphia,
      • wrong grip of the handwriting instrument,
      • bad character writing.
  • In step S13, the neural network delivers the result.
  • The trained neural network described with reference to FIGS. 8 and 9 is stored.
  • The neural network can be stored in the calculating unit 8.
  • FIGS. 10A to 10C illustrate more specifically the embodiment described with reference to FIG. 8.
  • In order to segment the strokes (step S2 of FIG. 8), the neural network may determine the timestamps of the strokes on the support.
  • This information can be detected by a stroke sensor 16. The stroke sensor 16 is advantageously embedded in the handwriting instrument or in the detection device 13 mounted on the handwriting instrument.
  • In an embodiment, the stroke sensor 16 may be a pressure sensor, a contact sensor or a vibration sensor. Then, the neural network receives the data collected by the stroke sensor 16 at step S3.
  • In a preferred embodiment, illustrated in FIGS. 10A to 10C, the stroke sensor 16 is the motion sensor 7. More preferably, the motion sensor 7 is a three-axis accelerometer.
  • FIG. 10A illustrates the collection of the data used during the training phase of the neural network, which is illustrated in FIG. 10B. Finally, FIG. 10C illustrates the inference phase, in which the trained neural network is used by a user of the handwriting instrument.
  • To use the motion sensor 7 as the stroke sensor 16, the accelerometer must first be set such that its sample rate is at least twice the maximum frequency of the vibrations to be detected.
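  • Expressed as a formula, this is the usual Nyquist condition; writing f_max for the highest vibration frequency of interest (a symbol introduced here, not used in the disclosure), the collection-phase sample rate must satisfy:

```latex
F_2 \geq 2\, f_{\max}
```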
  • Preferably, the accelerometer is highly sensitive. To allow detection of the vibrations by the accelerometer, the accelerometer may be bound to the writing tip 6 of the handwriting instrument 2 by rigid contacts with little damping.
  • In an embodiment, it is possible to enhance the precision of the vibration detection by using a support presenting a rough surface with known spatial frequency.
  • In FIG. 10A, representing the collection phase, the accelerometer is set with a sample rate F2. While the user is using the handwriting instrument 2, the accelerometer acquires data at step S20. These data can be sent by short-range radio to a recording device at step S21.
  • In an embodiment, during the collection phase, if the handwriting instrument 2 also comprises a three-axis gyroscope as another motion sensor 7, the three-axis gyroscope can also acquire data that are sent to the recording device at step S21.
  • FIG. 10B illustrates the training phase of the neural network.
  • At step S22, the data sent to the recording device are provided. The data are analyzed at step S23A to determine the labels (step S23B). For example, the labels comprise the stroke timestamps, detected when vibration is present in the data, and the stroke velocity. The stroke velocity is advantageously determined using the acceleration data and the high frequencies contained in the vibration.
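One way this labeling could be sketched is to high-pass filter the accelerometer magnitude and threshold its short-time energy to obtain pen-down intervals. The cutoff frequency, window length and energy threshold below are illustrative assumptions, not values taken from the patent.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def stroke_timestamps(acc: np.ndarray, fs: float, cutoff_hz: float = 50.0,
                      win: int = 64, energy_thresh: float = 1e-3) -> list:
    """Label pen-down intervals (in seconds) from high-frequency vibration energy.

    acc is a (n_samples, 3) three-axis accelerometer recording sampled at fs Hz.
    """
    sos = butter(4, cutoff_hz, btype="highpass", fs=fs, output="sos")
    hf = sosfiltfilt(sos, np.linalg.norm(acc, axis=1))              # high-frequency vibration content
    energy = np.convolve(hf ** 2, np.ones(win) / win, mode="same")  # short-time energy
    pen_down = np.r_[False, energy > energy_thresh, False]          # pad so every interval has two edges
    edges = np.flatnonzero(np.diff(pen_down.astype(int)))
    return [(start / fs, end / fs) for start, end in zip(edges[::2], edges[1::2])]
```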
  • Step S24 comprises the undersampling of the data. In particular, during the preceding steps, the frequency of the accelerometer was set to be higher than the one used for the inference phase. Moreover, the vibration analysis was made on the basis of both the three-axis accelerometer and the three-axis gyroscope. However, the constant use of the gyroscope leads to high energy consumption.
  • The undersampling step S24 comprises the degradation of these parameters: the frequency F2 of the accelerometer is reduced to a frequency F1, smaller than F2, and the training is performed using only the three-axis accelerometer data.
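A minimal sketch of this degradation step, assuming the ratio F2/F1 is an integer and using standard decimation (an anti-aliasing filter followed by downsampling), could be:

```python
import numpy as np
from scipy.signal import decimate

def degrade_to_inference_config(acc_f2: np.ndarray, f2: float, f1: float) -> np.ndarray:
    """Downsample a (n_samples, 3) accelerometer recording from rate F2 to rate F1.

    Gyroscope channels are assumed to have been dropped beforehand; F2/F1 must be an integer.
    """
    factor = int(round(f2 / f1))
    return np.stack([decimate(acc_f2[:, axis], factor) for axis in range(3)], axis=1)
```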
  • At step S25, the neural network is trained to be able to perform strokes segmentation, as described with reference to FIG. 8, step S2.
  • FIG. 10C illustrates the inference phase. In this phase, the trained neural network is used to detect handwriting problems by means of stroke segmentation.
  • At step S26, a user is using the handwriting instrument 2 with a view to detecting a possible handwriting problem.
  • The accelerometer in the handwriting instrument is set to the frequency F1 and, advantageously, the data are acquired along three axes.
  • At step S27, the trained neural network is fed with the acquired data. At step S28, the neural network is able to deliver the stroke timestamps and the velocity.
  • Finally, the neural network is able to perform the intermediate stroke feature extraction and the classification at step S29. Step S29 actually corresponds to steps S4 to S7, already described with reference to FIG. 8.
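Putting steps S26 to S29 together, the inference pipeline could be sketched as follows. All callables are hypothetical placeholders standing in for the trained segmentation network, the feature extraction of step S5 and the classification of step S6.

```python
def detect_handwriting_problems(raw_acc, fs, segmentation_net, extract_features, classify):
    """Sketch of steps S27 to S29: segmentation, per-stroke features, classification."""
    strokes = segmentation_net(raw_acc, fs)                          # stroke timestamps and velocities (S28)
    features = [extract_features(raw_acc, fs, s) for s in strokes]   # intermediate stroke features (S5)
    return classify(features)                                        # e.g. dysgraphia, wrong grip, ... (S7)
```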
  • In an embodiment, the neural network can be trained continuously with the data acquired by the user of the handwriting instrument 2 after the storage of the neural network.
  • In an embodiment, the neural network can also be trained to detect a wrong ductus of the user. The ductus corresponds to the formation of letters and numbers.
  • More specifically, the neural network is able to determine if a sequence of strokes corresponds to a letter or a number.
  • To this end, the neural network can also be fed with a large database of letters and numbers. Each letter and number can be associated with a sequence of strokes. The sequence of strokes can advantageously correspond to the acceleration signals acquired by the accelerometer during the collection phase when forming the letters and numbers.
  • The labels to be determined by the neural network may be the direction and the order of the sequence of strokes for each letter and number.
  • In step S5 of FIG. 8, the intermediate features can then also comprise the temporal sequence of strokes and their direction.
  • In step S7, the neural network is able to determine if the user is forming letters and numbers correctly.
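As a toy illustration of comparing a recognized stroke sequence against a reference ductus, the snippet below checks stroke order and direction for a character. The reference entries and the direction vocabulary are invented for the example and are not part of the disclosure.

```python
# Hypothetical reference ductus: ordered stroke directions for a few characters.
REFERENCE_DUCTUS = {
    "t": ["down", "right"],        # vertical bar first, then the horizontal cross
    "7": ["right", "down-left"],
}

def ductus_is_correct(character: str, predicted_strokes: list) -> bool:
    """True when the predicted stroke order and directions match the reference for that character."""
    return REFERENCE_DUCTUS.get(character) == predicted_strokes
```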

Claims (8)

1. A method for detecting a handwriting problem, comprising:
acquiring, by means of a handwriting instrument comprising at least one motion sensor, motion data while a user is using said handwriting instrument,
analyzing said motion data by an artificial intelligence trained to detect a handwriting problem.
2. The method according to claim 1, wherein the artificial intelligence is a neural network.
3. The method according to claim 2, the method further comprising a prior learning step comprising:
acquiring a plurality of motion data from a plurality of persons using said handwriting instrument,
labeling said acquired data,
using end-to-end supervised learning to train the neural network until it converges,
storing said neural network.
4. The method according to claim 3, wherein the acquired data are classified in at least one of the following classes:
type of grip on the handwriting instrument,
pressure applied on the handwriting instrument,
use of the handwriting instrument among writing, drawing or coloring,
fluidity of writing,
dyslexia,
dysgraphia,
wrong ductus.
5. The method according to claim 8, further comprising acquiring vibration data by a stroke sensor, the method further comprising a prior learning step comprising:
acquiring a plurality of motion data and vibration data from a plurality of persons using said handwriting instrument,
processing the vibration data to obtain stroke timestamps labels,
using supervised learning to train said neural network until it converges,
storing said neural network.
6. The method according to claim 5, wherein the features extracted from the stroke timestamps comprise:
total strokes duration,
total in air stroke duration,
strokes mean duration,
strokes mean and peak velocity,
number of pauses during use of the handwriting instrument,
ballistic index, which corresponds to an indicator of handwriting fluency which measures smoothness of the movement defined by the ratio between the number of zero crossings in the acceleration and the number of zero crossings in the velocity,
number of zero-crossing in the acceleration during strokes,
number of zero-crossing in the velocity during strokes.
7. The method according to claim 5, wherein the extracted features of the stroke timestamps are classified in at least one of the following classes:
type of grip on the handwriting instrument,
pressure applied on the handwriting instrument,
use of the handwriting instrument among writing, drawing or coloring,
fluidity of writing,
dyslexia,
dysgraphia,
wrong ductus.
8. The method according to claim 2, wherein the neural network is further trained with a database of letters and numbers correctly formed, a sequence of strokes and a direction of said strokes of the sequence of strokes being associated with each letter and number of the database, and wherein, based on the motion and vibration data acquired during the use of the handwriting instrument, the neural network determines if the user is forming letters and numbers correctly.
US17/306,507 2020-05-04 2021-05-03 System and Method for Detecting Handwriting Problems Pending US20210345913A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP20305430.9A EP3907587A1 (en) 2020-05-04 2020-05-04 System and method for detecting handwriting problems
EP20305430.9 2020-05-04

Publications (1)

Publication Number Publication Date
US20210345913A1 true US20210345913A1 (en) 2021-11-11

Family

ID=70802821

Family Applications (2)

Application Number Title Priority Date Filing Date
US17/997,904 Pending US20230148911A1 (en) 2020-05-04 2021-04-30 System and method for detecting handwriting problems
US17/306,507 Pending US20210345913A1 (en) 2020-05-04 2021-05-03 System and Method for Detecting Handwriting Problems

Country Status (4)

Country Link
US (2) US20230148911A1 (en)
EP (2) EP3907587A1 (en)
CN (1) CN115769177A (en)
WO (1) WO2021224147A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4198810A1 (en) * 2021-12-20 2023-06-21 Société BIC Chatbot systems and methods using dynamic features of handwriting

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6697524B1 (en) * 1992-09-04 2004-02-24 Canon Kabushiki Kaisha Information processing method and apparatus
US6304667B1 (en) 2000-06-21 2001-10-16 Carmen T. Reitano System and method for incorporating dyslexia detection in handwriting pattern recognition systems
GB201008089D0 (en) * 2010-05-14 2010-06-30 Manus Neurodynamica Ltd Apparatus for use in diagnosing neurological disorder
US9329703B2 (en) * 2011-06-22 2016-05-03 Apple Inc. Intelligent stylus
TW201504859A (en) * 2013-07-26 2015-02-01 Hon Hai Prec Ind Co Ltd Pen cover and recording method for handwriting
US20190150837A1 (en) * 2017-11-20 2019-05-23 International Business Machines Corporation Detecting Human Input Activity for Engagement Assessment Using Wearable Inertia and Audio Sensors
CN108376086A (en) * 2018-02-05 2018-08-07 广东欧珀移动通信有限公司 Display control method and device, terminal, computer readable storage medium
US11226683B2 (en) * 2018-04-20 2022-01-18 Hewlett-Packard Development Company, L.P. Tracking stylus in a virtual reality system
EP3989108A1 (en) * 2020-10-26 2022-04-27 Invoxia System and method for recognizing online handwriting

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10248652B1 (en) * 2016-12-09 2019-04-02 Google Llc Visual writing aid tool for a mobile writing device
US20190076078A1 (en) * 2017-09-08 2019-03-14 Randall Davis Systems and methods for tremor detection and quantification
US20220147791A1 (en) * 2019-06-21 2022-05-12 Intel Corporation A generic modular sparse three-dimensional (3d) convolution design utilizing sparse 3d group convolution
US11464443B2 (en) * 2019-11-26 2022-10-11 The Chinese University Of Hong Kong Methods based on an analysis of drawing behavior changes for cognitive dysfunction screening
US20200251217A1 (en) * 2019-12-12 2020-08-06 Renee CASSUTO Diagnosis Method Using Image Based Machine Learning Analysis of Handwriting
US20230126043A1 (en) * 2020-03-31 2023-04-27 Politecnico Di Milano Writing instrument, system and method for transparent monitoring and analysis of writing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Chellapilla, Kumar; Puri, Sidd; Simard, Patrice (2006). High Performance Convolutional Neural Networks for Document Processing. Available at: https://www.researchgate.net/publication/228344387_High_Performance_Convolutional_Neural_Networks_for_Document_Processing (Year: 2006) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200057414A1 (en) * 2018-08-14 2020-02-20 Invoxia Computer-Implemented Method And System For Diagnosing Mechanical Default Of A Mechanical Watch, And Mechanical Watch For Implementing Said Method
US11619913B2 (en) * 2018-08-14 2023-04-04 Invoxia Computer-implemented method and system for diagnosing mechanical default of a mechanical watch, and mechanical watch for implementing said method

Also Published As

Publication number Publication date
CN115769177A (en) 2023-03-07
EP3910454A1 (en) 2021-11-17
WO2021224147A1 (en) 2021-11-11
EP3907587A1 (en) 2021-11-10
US20230148911A1 (en) 2023-05-18

Legal Events

Code Title Description
STPP Information on status: patent application and granting procedure in general. Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP Information on status: patent application and granting procedure in general. Free format text: NON FINAL ACTION MAILED
STPP Information on status: patent application and granting procedure in general. Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP Information on status: patent application and granting procedure in general. Free format text: FINAL REJECTION MAILED
STPP Information on status: patent application and granting procedure in general. Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION