WO2018085583A1 - Apparatuses, methods, and systems for motivating quality home-based spirometry maneuvers and automated evaluation and coaching

Info

Publication number
WO2018085583A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
lung function
maneuver
sensor
flow detection
Prior art date
Application number
PCT/US2017/059783
Other languages
French (fr)
Inventor
Abigail COHEN
Andrew Brimer
Original Assignee
Sparo, Inc.
Priority date
Filing date
Publication date
Application filed by Sparo, Inc. filed Critical Sparo, Inc.
Publication of WO2018085583A1 publication Critical patent/WO2018085583A1/en

Classifications

    • A61B5/7253: Details of waveform analysis characterised by using transforms
    • A61B5/0022: Monitoring a patient using a global network, e.g. telephone networks, internet
    • A61B5/085: Measuring impedance of respiratory organs or lung elasticity
    • A61B5/087: Measuring breath flow
    • A61B5/0871: Peak expiratory flowmeters
    • A61B5/091: Measuring volume of inspired or expired gases, e.g. to determine lung capacity
    • A61B5/097: Devices for facilitating collection of breath or for directing breath into or through measuring devices
    • A61B5/486: Bio-feedback
    • A61B5/6898: Sensors mounted on portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • A61B5/7203: Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
    • A61B5/7275: Determining trends in physiological measurement data; predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • A61B5/7278: Artificial waveform generation or derivation, e.g. synthesising signals from measured signals
    • A61B5/7465: Arrangements for interactive communication between patient and care services, e.g. by using a telephone network
    • G16H30/20: ICT specially adapted for the handling or processing of medical images, e.g. DICOM, HL7 or PACS
    • G16H40/63: ICT specially adapted for the operation of medical equipment or devices for local operation
    • A61B2505/09: Rehabilitation or training
    • A61B2560/0242: Operational features adapted to measure environmental factors, e.g. temperature, pollution
    • A61B2560/0443: Modular apparatus
    • A61B2562/0204: Acoustic sensors
    • A61B2562/06: Arrangements of multiple sensors of different types
    • A61B5/0205: Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B5/7405: Details of notification to user or communication with user or patient using sound
    • A61B5/742: Details of notification to user or communication with user or patient using visual displays

Definitions

  • Disclosed herein are computer-implemented systems, methods, and/or non-transitory computer readable media for motivating quality home-based spirometry maneuvers and automated evaluation and coaching.
  • the systems, methods, and/or media disclosed herein are configured to provide full feedback to the user if any element of the hardware or software goes wrong. Further, the systems, methods, and/or media disclosed herein are configured to automatically identify problems in the hardware, software, and/or maneuver by the user and provide coaching information for the user to solve the problems and ensure correct, reliable, and convenient maneuvers using a home-based spirometer.
  • Some exemplary embodiments of a home-based spirometer are discussed in US Patent No. 9,706,946, the disclosure of which is hereby expressly incorporated by reference in its entirety.
  • the systems, methods, and/or media disclosed herein include a home-based portable spirometer that is configurable to connect to an external portable device.
  • the systems, methods, and/or media disclosed herein include an application for automatic evaluation and coaching of the user for quality lung function test using a portable spirometer.
  • the systems, methods, and/or media disclosed herein utilize a machine learning algorithm to generate automatic coaching information for the user that is capable of providing customized coaching based on the user's previous maneuvers or medical information.
  • the systems, methods, and/or media disclosed herein enable accurate evaluation of lung function maneuvers performed by the user so that the information can be used by the user or his/her doctor for current or future intervention, such as planning medication or future treatment.
  • a lung function motivation method for motivating quality operation of a lung flow detection device may include receiving, at a processor associated with a mobile device, an indication of the start of a lung function maneuver for measuring lung function of a user using a flow detection device. Data captured at least during the maneuver may be received, via the processor, from at least one flow detection sensor of the flow detection device. A visualization of the lung function maneuver may be initiated via the processor, and the visualization may be displayed on the mobile device. The visualization displayed on the screen may be adjusted via the processor and based on the data from the at least one flow detection sensor.
  • An end of the lung function maneuver may be determined via the processor, and data received from the at least one flow detection sensor between the indicator to start the lung function maneuver and the end of the lung function maneuver may be aggregated via the processor. At least one pre-processing error detection may be performed, via the processor, on the aggregated data, and a result of the lung function maneuver may be displayed via the processor.
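A minimal sketch of this client-side flow (start indication, streaming sensor data, live visualization updates, end detection, aggregation, a pre-processing check, and result display) is shown below in Python. The callables `read_sample` and `show` stand in for the flow detection sensor and the mobile display; all names and threshold values are illustrative assumptions rather than the patent's actual API.

```python
import time
from typing import Callable, List

def run_maneuver(read_sample: Callable[[], float],
                 show: Callable[[str], None],
                 end_threshold: float = 0.05,
                 timeout_s: float = 15.0) -> List[float]:
    """Illustrative client-side maneuver loop (names and thresholds are assumptions)."""
    samples: List[float] = []
    show("Blow into the device when ready")
    started = time.monotonic()
    while time.monotonic() - started < timeout_s:
        amplitude = read_sample()                        # data from the at least one flow detection sensor
        samples.append(amplitude)
        show(f"visualization level: {amplitude:.2f}")    # adjust the displayed visualization
        # end of maneuver: flow rose above the threshold and has now fallen back below it
        if max(samples) > end_threshold and amplitude < end_threshold:
            break
        time.sleep(0.01)
    # a pre-processing error detection on the aggregated data
    if len(samples) < 10 or max(samples, default=0.0) <= end_threshold:
        show("Error: maneuver too short or not detected")
    else:
        show(f"Result: peak amplitude {max(samples):.2f}")
    return samples

# Example with stand-in callables (not a real sensor):
# run_maneuver(read_sample=lambda: 0.0, show=print)
```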
  • the method also includes performing, via the processor, at least one pretest error detection prior to the start of the lung function maneuver and/or performing, via the processor, at least one error detection test during the lung function maneuver.
  • the at least one error detection test during the lung function maneuver may comprise comparing data from the at least one flow detection sensor to a threshold value in real time and/or may comprise determining power or input data and comparing the power or input data to a threshold in real time.
  • the method may further comprise calculating a scaled force vector based on data from the at least one flow detection sensor, and comparing the scaled force vector to a threshold. The threshold comparison may occur in real time.
  • the scaled force vector may be calculated based on data from the at least one flow detection sensor and at least one of randomized game variables and interactive game variables.
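As an illustration of the scaled force vector described above, the sketch below blends a sensor reading with a calibration gain and randomized/interactive game variables, then compares the result to a threshold. The specific blend (multiplicative randomization plus an additive interactive boost) is an assumption; the excerpt only states that sensor data and game variables feed the vector.

```python
import random
from typing import Optional

def scaled_force_vector(sensor_amplitude: float,
                        calibration_gain: float = 1.0,
                        randomized_variable: Optional[float] = None,
                        interactive_boost: float = 0.0) -> float:
    """Blend sensor data with game variables into a scaled force vector (illustrative blend)."""
    if randomized_variable is None:
        randomized_variable = random.uniform(0.95, 1.05)   # randomized game variable
    return sensor_amplitude * calibration_gain * randomized_variable + interactive_boost

def exceeds_threshold(force: float, threshold: float = 0.2) -> bool:
    """Real-time comparison of the scaled force vector against a threshold."""
    return force >= threshold
```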
  • the method may also include sending, via a processor, the aggregated data to at least one server, and receiving, from the server and via the processor, the result of the lung function maneuver.
  • the pre-test error detection includes collecting data comprising at least one of data from the at least one flow detection sensor, at least one of microphone power data and input data from the mobile device, environmental data, and video data from a camera associated with the mobile device.
  • the environmental data may comprise at least one of ambient pressure, ambient temperature, and ambient volume.
  • the pre-test error detection may comprise determining environmental data comprising at least one of ambient pressure, ambient temperature, and ambient volume, and scaling parameters for measurements from the at least one flow detection sensor based on at least one of the ambient pressure, ambient temperature, and ambient volume.
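For context on scaling measurements to BTPS (body temperature and pressure, saturated), one widely used correction factor converts a flow or volume measured at ambient conditions to body conditions using ambient temperature and barometric pressure. The formula below is the conventional one and is offered only as an illustration; the excerpt does not specify the exact scaling used.

```python
import math

def btps_factor(ambient_temp_c: float, barometric_pressure_mmhg: float) -> float:
    """Conventional BTPS correction factor (offered as an illustration only)."""
    # saturated water vapour pressure at ambient temperature (Magnus approximation, hPa -> mmHg)
    svp_ambient_mmhg = 6.1094 * math.exp(17.625 * ambient_temp_c /
                                         (ambient_temp_c + 243.04)) * 0.750062
    return ((273.15 + 37.0) / (273.15 + ambient_temp_c)) * \
           ((barometric_pressure_mmhg - svp_ambient_mmhg) / (barometric_pressure_mmhg - 47.0))

# Example: at 22 C and 760 mmHg the factor is approximately 1.09,
# so a measured flow of 5.0 L/s corresponds to roughly 5.45 L/s at BTPS.
corrected_flow = 5.0 * btps_factor(22.0, 760.0)
```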
  • the flow detection device may comprise an oscillation chamber, and the at least one flow detection sensor may comprise a microphone.
  • Data from the microphone may comprise amplitude data, and the aggregated data may be an audio file.
  • the visualization may comprise a display of at least one moving object, and the adjustment of the visualization may comprise changing at least one of a speed and a direction of the at least one moving object.
  • the visualization may be configured to provide motivation to a user to correctly perform the lung function maneuver.
  • the flow detection sensor may comprise at least one of an acoustic sensor, a microphone, an oscillation rate sensor, a pressure sensor, a pressure transducer, an ultrasound sensor, a rotary wheel sensor, a piezoelectric sensor, a cantilever beam sensor, a thermal sensor, and a hot wire sensor.
  • a countdown may be displayed via the screen to tell a user when to initiate the lung function maneuver.
  • the at least one pre-processing error detection on the aggregated data may comprise a pre-processing error detection machine learning algorithm.
  • a lung function maneuver motivation method for motivating quality operation of a lung flow detection device may include receiving, at a server, the aggregated data. At least one post-processing error detection may be performed, at the server, based on the aggregated data.
  • the post-processing error detection may include an analysis of at least one of lung function results, data from the at least one flow detection sensor, fast Fourier transform heat map, data from prior lung function maneuvers, and results from prior lung function maneuvers.
  • a result of the lung function maneuver may also be determined. The result of the lung function maneuver may be sent to the mobile device.
  • a lung function maneuver motivation system for motivating quality operation by a user of a lung function detection device.
  • the system includes a lung flow detection device having at least one flow detection sensor and configured to produce lung function maneuver data, a mobile device having a processor, a memory, and a display, and an application program operating on the mobile device.
  • the lung flow detection device may connect with or otherwise send signals to the mobile device.
  • the application program may comprise a plurality of computer instructions configured to cause the processor of the mobile device to collect lung function maneuver data from the at least one flow detection sensor, generate and adjust a visualization based on the lung function maneuver data, and display a result of the lung function maneuver.
  • the flow detection device may comprise an oscillation chamber, and the at least one flow detection sensor may comprise a microphone.
  • the visualization may comprise a display of at least one moving object, and the adjustment of the visualization may comprise changing at least one of a speed and a direction of the at least one moving object.
  • the mobile device may determine a scaled force vector based on the lung function maneuver data, and the scaled force vector may determine the adjustment of the visualization.
  • the mobile device further comprises at least one sensor.
  • the mobile device may set at least one parameter based on a reading from the at least one sensor, and data from the at least one flow detection sensor may be interpreted based on the at least one parameter.
  • the mobile device may aggregate the lung function maneuver data.
  • the mobile device may verify that the aggregated lung function maneuver data meets at least one threshold.
  • the system may further comprise a server, and when the threshold is met, the mobile device may upload the aggregated lung function maneuver data to the server.
  • the server may perform at least one post-processing error detection based on the aggregated lung function maneuver data.
  • the post-processing error detection comprises an analysis of lung function results, data from the at least one flow detection sensor, a fast Fourier transform heat map, data from prior lung function maneuvers, results from prior lung function maneuvers, and the result of the lung function maneuver.
  • the server may compare the aggregated lung function maneuver data to a threshold range, wherein the threshold range is determined based on at least one of generic or patient specific data.
  • the server inputs the aggregated lung function maneuver data into a machine learning algorithm that is configured to identify errors during the post-processing error detection.
  • FIG. 1 shows a non-limiting exemplary embodiment of a portable home-based spirometer according to some embodiments.
  • Figs. 2A-C show an exemplary flow diagram according to some embodiments.
  • Figs. 3A-3C show non-limiting exemplary embodiments of pre-test error detection according to some embodiments.
  • Figs. 4A-4E show non-limiting exemplary embodiments of during test motivation and error detection according to some embodiments.
  • Figs. 5A-5C show non-limiting exemplary embodiments of pre-processing error detection according to some embodiments.
  • Figs. 6A-6E show non-limiting exemplary embodiments of post-processing error detection according to some embodiments.
  • Fig. 7 shows a non-limiting exemplary embodiment of a results screen according to some embodiments.
  • Figs. 8A-B show a flow diagram of evaluating and coaching a user for quality home-based spirometry maneuvers according to some embodiments.
  • FIG. 9 shows a non-limiting exemplary embodiment of visualization when a user starts a single lung function maneuver according to some embodiments.
  • Fig. 10 shows a non-limiting exemplary embodiment of visualization when a user finishes a single lung function maneuver according to some embodiments.
  • Fig. 11 shows a non-limiting exemplary embodiment that is configurable to evaluate and provide visualization of lung function of the user.
  • Fig. 12 shows a non-limiting exemplary embodiment of tracking lung function performance over time according to some embodiments.
  • Fig. 13 shows a non-limiting exemplary embodiment of visualizing lung function performance according to some embodiments.
  • Fig. 14 shows a non-limiting exemplary embodiment of a digital processing device according to some embodiments.

DETAILED DESCRIPTION OF THE INVENTION
  • a visualization may be started by the user beginning a lung function maneuver, and the visualization may be completed when the lung function maneuver is complete.
  • the quality of the lung function maneuver may be determined and/or evaluated, and feedback may be displayed to the user.
  • the feedback may include the maneuver quality, and, in some embodiments, may also include coaching on technique changes that the user can implement in future uses to obtain better results.
  • a home-based spirometer may include a device that measures expired flow rate and volume that can be used in a home setting for screening, diagnosis, and/or monitoring acute or chronic respiratory conditions such as asthma, COPD, Cystic Fibrosis, bronchitis, or pneumonia.
  • the spirometer comprises a display screen and a modality for capturing expired flow rate and/or volume.
  • the spirometer disclosed herein may include one or more of: cloud-based software; a display, such as a mobile device display; a user interface, such as one displayed through a mobile application ("mobile app" or "app"); at least one sensor, wherein at least one of the sensor(s) transmits measured data to the user interface (e.g., the app) in real-time during a lung function maneuver; and/or the like.
  • the spirometer may include a flow sensor configured to determine at least one of speed and volume of the expired flow.
  • a flow meter may be in communication with a computing device executing one or more applications to process and analyze data generated at the spirometer.
  • a device for measuring a continuous flow rate of an airstream includes a nozzle having at least one channel to vent a portion of the airstream into an environment external to the device and at least one other channel to direct another portion of the airstream into a fluidic oscillator.
  • the device also includes a fluidic oscillator having a housing and at least one obstacle to induce oscillations in the airstream. A frequency of the oscillations correlates to the continuous flow rate of the airstream.
  • the device also includes at least one sensor to measure the oscillations of the airstream. The at least one sensor also generates an electronic signal corresponding to the oscillations measured and transmits the electronic signal to a computing device.
  • a device for measuring a continuous flow rate of an air stream includes a nozzle to direct a portion of the airstream into a fluidic oscillator.
  • the device also includes a detachable mouthpiece to reduce back pressure within the device.
  • the detachable mouthpiece has a diameter approximately or exactly equal to the diameter of the nozzle and defines a plurality of channels. At least one of the plurality of channels directs a portion of the airstream to the nozzle, while at least one other channel vents another portion of the airstream to an environment external to the device.
  • the device also includes a fluidic oscillator having a housing and at least one obstacle to induce oscillations in the airstream. A frequency of the oscillations correlates to the continuous flow rate of the airstream.
  • the device also includes at least one sensor to measure the oscillations of the airstream. The at least one sensor also generates an electronic signal corresponding to the oscillations measured and transmits the electronic signal to a computing device.
  • a flow meter for monitoring lung function of a user may include a flow sensor capable of determining a rate and/or volume of air expired by the user.
  • an oscillation chamber may be used to induce at least one oscillation in an airflow traversing the oscillation chamber. The airflow is generated by the user during a respiratory test.
  • the sensor may measure the rate of oscillation and transmit a data signal to a computing device. In some embodiments, the rate of oscillation may be measured in the oscillation chamber.
  • the computing device has at least one processor and receives the data signal. The computing device also processes the data signal to determine at least one of a flow rate, time duration, and a volume of the airflow in the oscillation chamber.
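Since the oscillation frequency correlates with the continuous flow rate, a straightforward way to recover flow from a microphone recording is to estimate the dominant frequency of each audio segment and map it through a calibration curve. The sketch below assumes a simple linear calibration constant (`liters_per_sec_per_hz`), which is purely illustrative; the real device's calibration is not given in this excerpt.

```python
import numpy as np

def dominant_frequency(audio: np.ndarray, sample_rate: int) -> float:
    """Estimate the oscillation frequency of a microphone segment via an FFT."""
    spectrum = np.abs(np.fft.rfft(audio * np.hanning(len(audio))))
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sample_rate)
    return float(freqs[np.argmax(spectrum)])

def flow_rate_from_frequency(freq_hz: float,
                             liters_per_sec_per_hz: float = 0.01) -> float:
    """Map oscillation frequency to flow rate using an assumed linear calibration."""
    return freq_hz * liters_per_sec_per_hz

# Example: a synthetic 1 kHz tone sampled at 44.1 kHz
t = np.arange(0, 0.1, 1.0 / 44100)
tone = np.sin(2 * np.pi * 1000 * t)
flow = flow_rate_from_frequency(dominant_frequency(tone, 44100))
```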
  • the system further includes a display device in communication with the computing device to display an assessment of respiratory health and a risk level to the user.
  • the sensors may include an acoustic sensor, a microphone, an oscillation rate sensor, a pressure sensor, a pressure transducer, an ultrasound sensor, a rotary wheel sensor, a piezoelectric sensor, a cantilever beam sensor, a thermal sensor, a hot wire sensor, and/or the like.
  • an acoustic sensor or microphone may detect sounds produced by oscillations within the oscillation chamber. In some embodiments, more than one of these sensors may be used in combination. For example, an acoustic sensor or microphone may be used in conjunction with an oscillation rate sensor and/or a pressure sensor.
  • a system for monitoring lung function of a user includes at least one processor, at least one data storage device and an application executing on the at least one processor to determine spirometric characteristics of a data signal received from a spirometer.
  • the data signal is generated in response to an airflow in the spirometer.
  • the system also generates at least one display on a display device to display the spirometric characteristics.
  • a user may perform a lung function maneuver using a device that measures lung function.
  • the device can be a spirometer similar to that described above or may be any other device.
  • the lung function maneuver may comprise an individual breathing in as much air as possible and expiring that air as hard and as fast as possible in a way that enables the flow sensor to capture the data. The user blasts out their air until they cannot expel any more air from their lungs.
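Given a flow-versus-time trace of such a forced maneuver, the standard spirometric indices can be computed directly: peak expiratory flow (PEF) is the maximum flow, forced vital capacity (FVC) is the integral of flow over the whole exhalation, and FEV1 is the volume expired in the first second. The sketch below uses these textbook definitions; it is not the patent's processing pipeline.

```python
import numpy as np

def spirometry_indices(flow_lps: np.ndarray, sample_rate_hz: float) -> dict:
    """Standard indices from a forced expiratory flow-time trace (sketch).

    flow_lps: expiratory flow in litres/second, one value per sample,
    starting at the beginning of the blast.
    """
    dt = 1.0 / sample_rate_hz
    pef = float(np.max(flow_lps))                      # peak expiratory flow (L/s)
    fvc = float(np.sum(flow_lps) * dt)                 # forced vital capacity (L), rectangle-rule integral
    one_second = int(round(sample_rate_hz))
    fev1 = float(np.sum(flow_lps[:one_second]) * dt)   # volume expired in the first second (L)
    return {"PEF": pef, "FVC": fvc, "FEV1": fev1, "FEV1/FVC": fev1 / fvc if fvc else 0.0}
```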
  • a lung function maneuver may be initiated by a user.
  • the lung function maneuver may be initiated by the user blowing into the device, by the user pressing a start button on the device, by the user pressing a start button on the app, and/or the like.
  • a processor at the device and/or a processor associated with a mobile device associated with the app determines whether the user is in an environment that is conducive for taking the lung function test.
  • various data may be collected.
  • the data may be one or more of: amplitude data from a microphone associated with the device, microphone power and input data associated with the mobile device, and other data, such as pressure, temperature, video from phone, and/or the like.
  • the data may be compared in real time (or approximate real time) to a set threshold.
  • a connectivity test may be performed to ensure that the device is connected to the app.
  • the mobile device may connect to the device via a cable or wirelessly, such as by using WiFi, Bluetooth, a digital connection, and/or the like.
  • the device API may be pinged and a response may be confirmed.
  • parameters for measurement may be set. For example, parameters for body temperature and pressure, saturated (BTPS) may be scaled. In some implementations, the BTPS may be scaled based on ambient temperature and pressure. Several other pre-test checks may also be performed.
  • Such tests may include determining whether the sensor is plugged in, whether the environment is too loud, whether the device and/or mobile device is connected to the internet, whether the ambient pressure or temperature is outside of an acceptable range, and whether the user has bad posture. If any of these pre-test checks fails, an error message may be displayed to the user via a display associated with the mobile device and/or the device. The error message may be descriptive and/or may contain technique coaching to the user based on the error that is detected.
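A hedged sketch of such a pre-test check routine is shown below; the specific limits (noise level, pressure and temperature ranges) are placeholder assumptions, and the posture check is omitted because it would require additional sensing.

```python
from typing import List

def pretest_errors(sensor_connected: bool,
                   ambient_noise_db: float,
                   internet_ok: bool,
                   ambient_pressure_mmhg: float,
                   ambient_temp_c: float,
                   noise_limit_db: float = 70.0) -> List[str]:
    """Sketch of the pre-test checks listed above; limits are illustrative assumptions."""
    errors = []
    if not sensor_connected:
        errors.append("Sensor is not plugged in or not transmitting data.")
    if ambient_noise_db > noise_limit_db:
        errors.append("Environment is too loud for an accurate reading.")
    if not internet_ok:
        errors.append("No internet connection; results cannot be uploaded.")
    if not (600.0 <= ambient_pressure_mmhg <= 800.0):
        errors.append("Ambient pressure is outside the acceptable range.")
    if not (0.0 <= ambient_temp_c <= 40.0):
        errors.append("Ambient temperature is outside the acceptable range.")
    return errors   # an empty list means all pre-test checks passed
```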
  • a countdown to lung function measurement is displayed and/or verbally indicated to the user.
  • the countdown may be displayed on the app and/or on the device.
  • the mobile device and/or device may provide verbal instructions, such as "breathe in all of your air," "blast your air out," "keep going," and/or the like.
  • a processor on the device and/or associated with the mobile device may determine whether a start visualization threshold has been met.
  • the threshold may be applied to amplitude data from a microphone associated with the device, and a real-time comparison may be made against the set threshold.
  • one or more types of sensors may be used, such as an acoustic sensor, a microphone, an oscillation rate sensor, a pressure sensor, a pressure transducer, an ultrasound sensor, a rotary wheel sensor, a piezoelectric sensor, a thermal sensor, and/or the like. If the threshold is not met within a set period of time, an error message may be displayed to the user via a display associated with the mobile device and/or the device. The error message may be descriptive and/or may contain technique coaching to the user based on the error that is detected.
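One simple way to implement the start-visualization threshold with a timeout is to poll the sensor amplitude until it crosses the threshold or the allotted time elapses, as sketched below; the threshold, timeout, and polling rate are illustrative assumptions.

```python
import time
from typing import Callable

def wait_for_start(read_amplitude: Callable[[], float],
                   start_threshold: float = 0.1,
                   timeout_s: float = 10.0) -> bool:
    """Wait until the sensor amplitude crosses the start-visualization threshold.

    Returns True if the maneuver started in time, False if the timeout elapsed
    (in which case a descriptive error message would be shown to the user).
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if read_amplitude() >= start_threshold:
            return True
        time.sleep(0.01)    # poll at roughly 100 Hz
    return False
```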
  • a visualization may be initiated.
  • the visualization may be initiated via and shown in the app.
  • the visualization may have an initial scaled force vector, which is determined based on data from the one or more sensors associated with the device. In one embodiment, amplitude from the device microphone may be used.
  • the scaled force vector may also be based on randomized and/or interactive game variables. The scaled force vector may be updated and the updated scaled force vector may be applied to the visualization. Additionally, the sensor data, randomized game variables, and interactive game variables may be used to motivate the user to perform the test correctly.
  • the visualization may be a game animation, such as a rocket taking off.
  • When the user blows into the device, the sensor detects a metric associated with the expiration rate and/or volume. A scaled force vector is determined based on the metric, and the scaled force vector may correlate to the speed of the rocket in the visualization. Thus, when the user expires air at an increased rate and/or volume, the rocket appears to move faster, thereby motivating the user to continue expiring air.
  • the data may be monitored and compared to a set threshold in real time (or approximate real time) to determine whether an in-test error has occurred. This may prevent a user from continuing a lung function maneuver or waiting for a test that cannot be processed. For example, input data may be monitored to ensure there is no interruption in the data stream. In some implementations, microphone power and other inputs may be compared to a set threshold to determine if there has been an interruption in the data stream. If so, an error message may be displayed to the user.
  • sensor data, such as amplitude data from a microphone associated with the device, and/or a scaled force vector, which is calculated with data from the one or more sensors, may also be compared in real time (or approximate real time) to a threshold to determine if a visualization timeout has been reached.
  • data associated with the completed maneuver may be pre-processed for errors.
  • the pre-processing error detection may prevent a user from waiting for full processing on a test that cannot be processed or that has errors.
  • sensor data may be analyzed and compared to thresholds.
  • an audio file associated with the lung function maneuver may be generated from a microphone associated with the device. The audio file may be compared to set thresholds, and the data collected during the lung function test may be inputted into pre-processing error detection machine learning algorithms for future evaluation and detection of errors.
  • a length of the audio file may be compared to a threshold length or a threshold file size, and if the audio file is too short or too small, an error message may be displayed to the user to indicate that the user did not expire long enough or the lung function test was not captured or did not occur. If other pre-processing quality checks are not met, a different error message corresponding to the error that occurred may be displayed to the user.
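A minimal version of this length/size pre-processing check might look like the following; the one-second minimum duration and amplitude floor are placeholder values, not thresholds taken from the patent.

```python
from typing import List, Optional

def audio_precheck(audio_samples: List[float], sample_rate_hz: int,
                   min_duration_s: float = 1.0, min_peak: float = 0.05) -> Optional[str]:
    """Pre-processing check on the aggregated audio; limits are placeholder assumptions."""
    duration = len(audio_samples) / sample_rate_hz
    if duration < min_duration_s:
        return "The maneuver was too short: you may not have expired long enough."
    if max((abs(s) for s in audio_samples), default=0.0) < min_peak:
        return "No maneuver was captured; please blow directly into the device."
    return None   # no pre-processing error detected
```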
  • a processor associated with the mobile device having the app may determine if there is appropriate connectivity for the test data to be uploaded. If not, an error message may be displayed to the user.
  • the test data may be stored at a memory associated with the mobile device until appropriate connectivity is obtained.
  • the app may automatically push the data to the server or may prompt the user to initiate the upload when connectivity is obtained, or the app may store the data until the user reopens the app and sends the data.
  • the test data is received at the server, and the server may perform post-processing error detection. This may ensure that the signal and the data meet various quality standards. Additionally, this may prevent the user from getting poor or incorrect results, as well as understand what to change to get improved results.
  • the post-processing error detection may include an analysis of lung function data (such as FEV1, peak flow, flow volume curve, etc.), data from the at least one sensor (such as amplitude data from a microphone associated with the device), a fast Fourier Transform (FFT) heat map of the audio file, user's previous data (including previous test results, FFT, sensor data, etc.), and/or the like. These are inputted into various post-processing error detection machine learning algorithms.
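The fast Fourier transform heat map referenced above is essentially a short-time spectrogram of the maneuver audio. A generic way to compute one is sketched below; the window and hop sizes are arbitrary choices, and the exact representation fed to the error-detection algorithms is not specified in this excerpt.

```python
import numpy as np

def fft_heat_map(audio: np.ndarray, sample_rate: int,
                 window: int = 1024, hop: int = 256) -> np.ndarray:
    """Short-time FFT magnitude ("heat map") of the maneuver audio, one column per frame."""
    frames = []
    win = np.hanning(window)
    for start in range(0, len(audio) - window + 1, hop):
        segment = audio[start:start + window] * win
        frames.append(np.abs(np.fft.rfft(segment)))
    return np.array(frames).T   # shape: (frequency bins, time frames)
```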
  • Post-processing error detection may include testing for technical errors and user errors.
  • Example technical errors include low signal-to-noise ratio, trace/heatmap inconsistencies (such as poor trace, incorrect PEF, incorrect FEV1, and/or the like), microphone static, and/or the like.
  • User errors may include a cough, slow start, vocalization, incorrect mouth placement on the device and/or relative to the sensor, insufficient expiratory time, user hunched over, and/or the like. If any of these errors (or any others tested for) occur, a display message is displayed to the user. The display message may include a description of the error that occurred, and may also include suggestions or coaching for the user to avoid the error in subsequent maneuvers.
  • a machine learning algorithm associated with post-processing error detection may determine when a test result and/or test data falls outside of a normal range (e.g., by comparing the test result and/or test data with threshold values). When the result/data is within a threshold of deviation, the test may be deemed to be a good test; when the result/data is outside of the threshold of deviation, the test may be deemed to be a bad test.
  • the threshold values and/or range may be based on inputted data and/or a set of inputted rules and/or vectors. In other implementations, the threshold values and/or range may update based on test data and/or test results.
  • the test data and/or test results that inform the threshold(s) may be specific to the user or may be generic to all app users.
  • User-specific threshold(s) may allow for further customization of test result outcomes. For example, each user may have a unique identifier associated with how (s)he performs a test, similar to a fingerprint. By tracking and learning from user-specific data/results, the thresholds may account for the user-specific tendencies or characteristics. Additionally, the user-specific data/results may be correlated to user-specific warning signs. For example, if a user consistently has an asthma attack after a unique type of test result (i.e., different from the generic population of users), a warning indicator may be generated and displayed to the user when the user has a test result that corresponds to that unique result.
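A very simple stand-in for a user-specific threshold is a band built from the user's own historical results, as sketched below. The patent describes thresholds that are learned and updated from data, so this mean-plus-deviation band is only an illustration of the idea.

```python
import statistics
from typing import List

def classify_test(result: float, prior_results: List[float],
                  deviation_multiplier: float = 2.0) -> str:
    """Flag a result that falls outside the user's own historical range (sketch)."""
    if len(prior_results) < 3:
        return "good"   # not enough history to personalise; fall back to generic rules
    mean = statistics.mean(prior_results)
    spread = statistics.stdev(prior_results)
    lower = mean - deviation_multiplier * spread
    upper = mean + deviation_multiplier * spread
    return "good" if lower <= result <= upper else "flagged"
```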
  • the app may monitor to determine if a processing timeout has been reached.
  • a timer may start once the signal is uploaded to the server ("cloud").
  • the app monitors the cloud/API for available test results, such as test results determined from the test data. If the test results are not achieved within a preset time threshold, the session times out and a display message is shown to the user. The user may be given the option to rerun the test, for example by re-uploading the test data to the server, or may be asked to restart with a new lung function maneuver. If test results are successfully determined, the processed results are returned to the mobile device. The results may be displayed within the app, and/or the user may be prompted to perform another lung function maneuver, such as a next lung function measurement in a set.
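The timeout behaviour described above can be expressed as a polling loop, sketched below; `fetch_results` is a hypothetical stand-in for the app's call to the cloud/API, and the timeout and polling interval are assumptions.

```python
import time
from typing import Callable, Optional

def poll_for_results(fetch_results: Callable[[], Optional[dict]],
                     timeout_s: float = 30.0, interval_s: float = 1.0) -> Optional[dict]:
    """Poll for processed results until they arrive or the session times out.

    `fetch_results` should return None while processing and a result dict when ready.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        results = fetch_results()
        if results is not None:
            return results
        time.sleep(interval_s)
    return None   # caller shows the timeout message and offers to rerun or restart
```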
  • a lung function maneuver may begin when a user starts a lung function test(s) by pressing the start button.
  • the start button may be on the spirometer device or may be displayed in the app.
  • the app and/or a processor included in the spirometer may perform several tests to ensure that the device and app are prepared for the test. For example, the app and/or a processor included in the spirometer may determine whether the sensor is transmitting data to the mobile device. If the sensor is not transmitting data, an error message may be displayed to the user, such as the error message shown in Fig. 3A.
  • the app and/or a processor included in the spirometer may determine if the mobile device associated with the app and/or the spirometer device is connected to the internet. If not, an error message, such as the one shown in Fig. 3B, is displayed.
  • the app and/or processor included in the spirometer may determine a noise level, such as an ambient noise level before the test begins or as the test is running, in order to determine if the auditory sensor will be able to obtain an accurate reading. If the ambient noise is too loud, the error message shown in Fig. 3C may be displayed. In some embodiments, no pre-maneuver tests may be performed, while in other embodiments, one or more pre-maneuver tests are performed.
  • the user may be prompted to complete a lung function test.
  • the user may be prompted via a display screen in the app on the user's mobile device, via a display (such as a light or display screen) on the spirometer, and/or the like.
  • the display may, in some implementations, include a countdown.
  • the app may display a user interface indicating that the app and/or device are ready for the test to begin.
  • a screen may be displayed to indicate that the test is ready for launch, and may display a button for the user to press to start the test.
  • An exemplary screen display is shown in Fig. 4A. When the user presses the start button, the display may change to indicate that the user may blow into the device when ready, as shown in Fig. 4B.
  • a visualization may be displayed to the user.
  • An exemplary visualization is shown in Fig. 4C, which shows a virtual dashboard in which greater force and/or volume of expired air correlates to an increase in the at least one scale shown.
  • the displayed visualization may include at least one object that moves and a background.
  • the sensor(s) send data corresponding to flow rate and/or volume to the user's mobile device, which uses the data to generate the visualization.
  • once an initial threshold of rate and/or volume is reached, the object(s) may begin to move relative to the background.
  • Such object(s) may be indicative of the expired rate and/or volume of air. If the initial threshold of signal that is indicative of expired flow or volume is not reached, the objects do not move and the end state is not reached. Once the initial threshold has been met, the objects begin to move so as to provide motivation to the user.
  • the motivation may be randomized movement of the objects progressing toward the end state and/or accumulation of points or a score.
  • the movement of the objects and/or score may be varied or weighted based on data indicative of expired flow, such as flow rate, flow volume, the length of the expiration, and/or the like.
  • the weighting of the score system may favor data indicative of expired flow or volume early in the lung function maneuver (i.e., the user blasting out their air) and favor expiration length later in the maneuver, encouraging the user to blast out their air as fast as possible at the beginning and to keep expiring until they have no air left in their lungs.
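One way to realize such a weighting is to give early samples a larger flow weight and to add a small per-sample bonus that rewards a long exhalation, as in the sketch below; the particular weights are illustrative, not values from the patent.

```python
from typing import List

def maneuver_score(flow_samples: List[float], sample_rate_hz: float) -> float:
    """Score weighted toward a strong initial blast and a long expiration (sketch)."""
    score = 0.0
    for i, flow in enumerate(flow_samples):
        t = i / sample_rate_hz
        weight = 2.0 if t < 1.0 else 1.0      # favour the initial blast
        score += weight * flow / sample_rate_hz
        score += 0.01                          # small per-sample bonus favours expiration length
    return score
```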
  • an end state may be reached.
  • the end state threshold may be the same as the initial threshold, or it may be higher or lower than the initial threshold.
  • the end state may be based on a minimum signal level that is indicative of expired flow or volume, which, when reached, indicates the user is no longer expiring air.
  • the visualization may end automatically based on when the user completes their lung function test. Once the end state is reached, the visualization screen may display an end screen. The end screen may indicate that the test is being processed, as shown in Fig. 4D.
  • an error message may be generated and displayed to the user.
  • An exemplary error screen is shown in Fig. 4E.
  • the signal is evaluated for maneuver quality and technical errors. This can include evaluating the maneuver by comparing the data to the user's predicted values, to other maneuvers within a set of previously completed maneuvers, to other maneuvers from previously completed sets, based on heuristics of the flow volume curve, and/or the like. The comparisons may be made against or based on patient-specific data, or they may be made against or based on generic patient data, such as overall average or recommended data.
  • exemplary error messages displayed to the user may include an indication that the maneuver was too short, as shown in Fig. 5A; an indication of an error with the connection to the servers and/or cloud, as shown in Fig. 5B; an indication of an unknown processing error, as shown in Fig. 5C; and/or the like.
  • Figs. 6A-E show various other exemplary error messages that may be displayed to the user.
  • Fig. 6A shows a vocalization or cough error
  • Fig. 6B shows an incorrect placement error
  • Fig. 6C shows a slow start error
  • Fig. 6D shows a technical error
  • Fig. 6E shows a device error and indicates that the user may want to contact customer service, which the user may do directly through the app or the user may be provided with a link and/or contact information for customer service.
  • a results screen may be displayed to show the user the results of the lung function maneuver.
  • An exemplary results screen according to some embodiments is shown in Fig. 7.
  • Figs. 8A-B show an exemplary flowchart for automatic evaluation and coaching for a lung function test.
  • a user may initiate a lung function measurement by either using the spirometer or via the app on their mobile device.
  • a visualization environment is generated.
  • the visualization includes a countdown to the lung function test to prepare the user, and the visualization starts if one or more start visualization threshold(s) have been met. If a threshold has not been met, errors that may be caused by the hardware, software, and/or user operation may be identified, and a display describing the identified errors may be presented to the user, for example, via the mobile app. Coaching and feedback associated with correcting the identified errors may also be provided to the user.
  • Such coaching and feedback may be interactive via an input device to the user.
  • visualization parameters may be updated in real-time using randomized or interactive game variables.
  • the visualization may be customized to reflect various aspects of the user's performance during each individual lung function test.
  • the signal and data are checked to determine whether one or more pre-processing quality checks are met. Further, the connectivity for uploading or transmitting data to an external device may be checked, and the signal and/or data is processed and uploaded for storage.
  • motivational visualization may start in an application.
  • An exemplary embodiment is shown in Fig. 9, which shows a user beginning a test on a spirometer and a screen showing the test is ready to begin.
  • the visualization only starts when one or more of the specific visualization threshold(s) has been met.
  • the visualization may be real-time, and may be reflective of the user's expiratory activity.
  • a visualization in a motivational application may end when the user stops the lung function test.
  • the visualization may end when the user stops blowing into the spirometer.
  • the visualization may end when the flow speed and/or volume drops below a threshold speed/volume.
  • the visualization may stop when a predetermined time has elapsed since the start of visualization.
  • a user's lung function may be automatically evaluated and visualized. For example, in an embodiment shown in Fig. 11, lung function evaluation may be color-coded with different colors, thereby indicating different lung performance levels. If the test goes well (i.e., for embodiments having error detection, no errors are detected), an assessment of the user's lung health is performed based on the data from the lung function test(s). If the user's lung function is good, the user may be given a green indicator (e.g., via the app or via a light on the spirometer).
  • the user may be given a yellow indicator.
  • the yellow indicator may be a yellow light on the spirometer device and/or may be shown on the user's mobile device via the app.
  • the yellow indicator may be a warning indicator, which signals the user to take a rescue medication, such as an inhaler.
  • the device and/or app may prompt the user to perform another test after a set period of time, such as twenty minutes.
  • the set period of time may be shorter, such as five minutes, ten minutes, fifteen minutes, and/or the like, or the set period of time may be longer, such as twenty-five minutes, thirty minutes, forty-five minutes, an hour, and/or the like.
  • the user may be given a red indicator, which may be a red indicator on the spirometer device and/or may be shown on the user's mobile device via the app.
  • a red indicator may indicate an emergency and/or that an emergency situation is imminent.
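For illustration, the sketch below classifies a result into green, yellow, and red zones using the common asthma action-plan convention of 80% and 50% of the user's personal best peak flow; the excerpt does not state which cut-offs or metrics the system actually uses.

```python
def zone(peak_flow: float, personal_best: float) -> str:
    """Classify a result into green/yellow/red zones (conventional 80%/50% cut-offs)."""
    ratio = peak_flow / personal_best if personal_best else 0.0
    if ratio >= 0.8:
        return "green"    # lung function is good
    if ratio >= 0.5:
        return "yellow"   # warning: consider rescue medication and retest
    return "red"          # emergency: contact a medical professional
```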
  • the user may be told to communicate with a medical professional or to optimize medication options.
  • the app may provide a prompt to the user to call or otherwise message (e.g., message via the app, text message, SMS message, MMS message, email, etc.) the user's medical professional or caregivers (such as family, friends, coworkers, etc.).
  • the prompt may be a pop up notification.
  • the app may automatically contact the medical professional without further input from the user.
  • a sample yellow zone test result is shown in Fig. 13.
  • the user may be prompted to send test results (whether green, yellow, or red) to their medical professional directly from the app on their mobile device.
  • the results may be sent to the medical professional via text message, SMS message, MMS message, app message, app notification, email, fax, and/or the like.
  • lung function performance may be tracked over time. Additionally, lung function performance may be correlated with medication, activity, triggers, or other influence factors over time. As such, factors for improvement of lung performance can be determined and functional deterioration of the lung can be prevented.
  • the platforms, media, methods and applications described herein include a digital processing device, a processor, or use of the same.
  • the digital processing device includes one or more hardware central processing units (CPU) that carry out the device's functions.
  • the digital processing device further comprises an operating system configured to perform executable instructions.
  • the digital processing device is optionally connected to a computer network.
  • the digital processing device is optionally connected to the Internet such that it accesses the World Wide Web.
  • the digital processing device is optionally connected to a cloud computing infrastructure.
  • the digital processing device is optionally connected to an intranet.
  • the digital processing device is optionally connected to a data storage device.
  • suitable digital processing devices include, by way of non-limiting examples, server computers, desktop computers, laptop computers, notebook computers, sub-notebook computers, netbook computers, netpad computers, set-top computers, handheld computers, Internet appliances, mobile smartphones, tablet computers, personal digital assistants, video game consoles, and vehicles.
  • smartphones are suitable for use in the system described herein.
  • Suitable tablet computers include those with booklet, slate, and convertible configurations, known to those of skill in the art.
  • the digital processing device includes an operating system configured to perform executable instructions.
  • the operating system is, for example, software, including programs and data, which manages the device's hardware and provides services for execution of applications.
  • suitable server operating systems include, by way of non-limiting examples, FreeBSD, OpenBSD, NetBSD®, Linux, Apple® Mac OS X Server®, Oracle® Solaris®, Windows Server®, and Novell® NetWare®.
  • suitable personal computer operating systems include, by way of non-limiting examples, Microsoft® Windows®, Apple® Mac OS X®, UNIX®, and UNIX-like operating systems such as GNU/Linux®.
  • the operating system is provided by cloud computing.
  • suitable mobile smart phone operating systems include, by way of non-limiting examples, Nokia® Symbian® OS, Apple® iOS®, Research In Motion® BlackBerry OS®, Google® Android®, Microsoft® Windows Phone® OS, Microsoft® Windows Mobile® OS, Linux®, and Palm® WebOS®.
  • the device includes a storage and/or memory device.
  • the storage and/or memory device is one or more physical apparatuses used to store data or programs on a temporary or permanent basis.
  • the device is volatile memory and requires power to maintain stored information.
  • the device is non-volatile memory and retains stored information when the digital processing device is not powered.
  • the nonvolatile memory comprises flash memory.
  • the non-volatile memory comprises dynamic random-access memory (DRAM).
  • the non-volatile memory comprises ferroelectric random access memory (FRAM).
  • the non-volatile memory comprises phase-change random access memory (PRAM).
  • the device is a storage device including, by way of non-limiting examples, CD-ROMs, DVDs, flash memory devices, magnetic disk drives, magnetic tape drives, optical disk drives, and cloud computing based storage.
  • the storage and/or memory device is a combination of devices such as those disclosed herein.
  • the digital processing device includes a display to send visual information to a user.
  • the display is a cathode ray tube (CRT).
  • the display is a liquid crystal display (LCD).
  • the display is a thin film transistor liquid crystal display (TFT-LCD).
  • the display is an organic light emitting diode (OLED) display.
  • an OLED display is a passive-matrix OLED (PMOLED) or active-matrix OLED (AMOLED) display.
  • the display is a plasma display.
  • the display is a video projector.
  • the display is a combination of devices such as those disclosed herein.
  • the digital processing device includes an input device to receive information from a user.
  • the input device is a keyboard.
  • the input device is a pointing device including, by way of non-limiting examples, a mouse, trackball, track pad, joystick, game controller, or stylus.
  • the input device is a touch screen or a multi-touch screen.
  • the input device is a microphone to capture voice or other sound input.
  • the input device is a video camera or other sensor to capture motion or visual input.
  • the input device is a Kinect, Leap Motion, or the like.
  • the input device is a combination of devices such as those disclosed herein.
  • an exemplary digital processing device 101 is programmed or otherwise configured to measure test-retest reliability using FRP, MAP or other suitable precision evaluation methods.
  • the device 101 can regulate various aspects of test-retest precision measurement of the present disclosure, such as, for example, formulating test-retest reliability as an information retrieval problem, and ranking retest measurements by their distance to a subject's test measurement. As another example, it may assess a similarity between a test result and a retest result.
  • the digital processing device 101 includes a central processing unit (CPU, also "processor" and "computer processor" herein) 105, which can be a single core or multi core processor, or a plurality of processors for parallel processing.
  • the digital processing device 101 also includes memory or memory location 110 (e.g., random-access memory, read-only memory, flash memory), electronic storage unit 115 (e.g., hard disk), communication interface 120 (e.g., network adapter) for communicating with one or more other systems, and peripheral devices 125, such as cache, other memory, data storage and/or electronic display adapters.
  • the memory 110, storage unit 115, interface 120 and peripheral devices 125 are in communication with the CPU 105 through a communication bus (solid lines), such as a motherboard.
  • the storage unit 115 can be a data storage unit (or data repository) for storing data.
  • the digital processing device 101 can be operatively coupled to a computer network (“network") 130 with the aid of the communication interface 120.
  • the network 130 can be the Internet, an internet and/or extranet, or an intranet and/or extranet that is in communication with the Internet.
  • the network 130 in some cases is a telecommunication and/or data network.
  • the network 130 can include one or more computer servers, which can enable distributed computing, such as cloud computing.
  • the network 130, in some cases with the aid of the device 101, can implement a peer-to-peer network, which may enable devices coupled to the device 101 to behave as a client or a server.
  • the digital processing device 101 can be operatively connected to one or more specialized medical devices (not shown) via the network 130. Such connection may enable data collection from the medical device; the data may include one or more test results, retest results, and other related test and subject information.
  • the specialized medical device is configured to measure visionary feature(s) of one or more subjects.
  • the CPU 105 can execute a sequence of machine-readable instructions, which can be embodied in a program or software.
  • the instructions may be stored in a memory location, such as the memory 110.
  • the instructions can be directed to the CPU 105, which can subsequently program or otherwise configure the CPU 105 to implement methods of the present disclosure. Examples of operations performed by the CPU 105 can include fetch, decode, execute, and write back.
  • the CPU 105 can be part of a circuit, such as an integrated circuit. One or more other components of the device 101 can be included in the circuit. In some cases, the circuit is an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA).
  • the storage unit 115 can store files, such as drivers, libraries and saved programs.
  • the storage unit 115 can store user data, e.g., user preferences and user programs.
  • the digital processing device 101 in some cases can include one or more additional data storage units that are external, such as located on a remote server that is in communication through an intranet or the Internet.
  • the digital processing device 101 can communicate with one or more remote computer systems through the network 130.
  • the device 101 can communicate with a remote computer system of a user.
  • remote computer systems include personal computers (e.g., portable PC), slate or tablet PCs (e.g., Apple® iPad, Samsung® Galaxy Tab), telephones, Smart phones (e.g., Apple® iPhone, Android-enabled device, Blackberry®), or personal digital assistants.
  • Methods as described herein can be implemented by way of machine (e.g., computer processor) executable code stored on an electronic storage location of the digital processing device 101, such as, for example, on the memory 110 or electronic storage unit 115.
  • the machine executable or machine readable code can be provided in the form of software.
  • the code can be executed by the processor 105.
  • the code can be retrieved from the storage unit 115 and stored on the memory 110 for ready access by the processor 105.
  • the electronic storage unit 115 can be precluded, and machine-executable instructions are stored on memory 110.
  • Non-transitory computer readable storage medium
  • the platforms, media, methods and applications described herein include one or more non-transitory computer readable storage media encoded with a program including instructions executable by the operating system of an optionally networked digital processing device.
  • a computer readable storage medium is a tangible component of a digital processing device.
  • a computer readable storage medium is optionally removable from a digital processing device.
  • a computer readable storage medium includes, by way of non-limiting examples, CD-ROMs, DVDs, flash memory devices, solid state memory, magnetic disk drives, magnetic tape drives, optical disk drives, cloud computing systems and services, and the like.
  • the program and instructions are permanently, substantially permanently, semi-permanently, or non-transitorily encoded on the media.
  • the platforms, media, methods and applications described herein include at least one computer program, or use of the same.
  • a computer program includes a sequence of instructions, executable in the digital processing device's CPU, written to perform a specified task.
  • Computer readable instructions may be implemented as program modules, such as functions, objects, Application Programming Interfaces (APIs), data structures, and the like, that perform particular tasks or implement particular abstract data types.
  • a computer program may be written in various versions of various languages.
  • a computer program comprises one sequence of instructions. In some embodiments, a computer program comprises a plurality of sequences of instructions. In some embodiments, a computer program is provided from one location. In other embodiments, a computer program is provided from a plurality of locations. In various embodiments, a computer program includes one or more software modules. In various embodiments, a computer program includes, in part or in whole, one or more web applications, one or more mobile applications, one or more standalone applications, one or more web browser plug-ins, extensions, add-ins, or addons, or combinations thereof.
  • a computer program includes a web application.
  • a web application in various embodiments, utilizes one or more software frameworks and one or more database systems.
  • a web application is created upon a software framework such as Microsoft ® .NET or Ruby on Rails (RoR).
  • a web application utilizes one or more database systems including, by way of non-limiting examples, relational, non-relational, object oriented, associative, and XML database systems.
  • suitable relational database systems include, by way of non-limiting examples, Microsoft ® SQL Server, mySQLTM, and Oracle ® .
  • a web application in various embodiments, is written in one or more versions of one or more languages.
  • a web application may be written in one or more markup languages, presentation definition languages, client-side scripting languages, server-side coding languages, database query languages, or combinations thereof.
  • a web application is written to some extent in a markup language such as Hypertext Markup Language (HTML), Extensible Hypertext Markup Language (XHTML), or extensible Markup Language (XML).
  • a web application is written to some extent in a presentation definition language such as Cascading Style Sheets (CSS).
  • a web application is written to some extent in a client-side scripting language such as Asynchronous Javascript and XML (AJAX), Flash ® Actionscript, Javascript, or Silverlight ® .
  • a web application is written to some extent in a server-side coding language such as Active Server Pages (ASP), ColdFusion ® , Perl, JavaTM, JavaServer Pages (JSP), Hypertext Preprocessor (PHP), PythonTM, Ruby, Tcl, Smalltalk, WebDNA ® , or Groovy.
  • a web application is written to some extent in a database query language such as Structured Query Language (SQL).
  • a web application integrates enterprise server products such as IBM ® Lotus Domino ® .
  • a web application includes a media player element.
  • a media player element utilizes one or more of many suitable multimedia technologies including, by way of non-limiting examples, Adobe ® Flash ® , HTML 5, Apple ® QuickTime ® , Microsoft ® Silverlight ® , JavaTM, and Unity ® .
  • a computer program includes a mobile application provided to a mobile digital processing device.
  • the mobile application is provided to a mobile digital processing device at the time it is manufactured.
  • the mobile application is provided to a mobile digital processing device via the computer network described herein.
  • a mobile application is created by techniques known to those of skill in the art using hardware, languages, and development environments known to the art. Those of skill in the art will recognize that mobile applications are written in several languages. Suitable programming languages include, by way of non-limiting examples, C, C++, C#, Objective-C, Swift, JavaTM, Kotlin, Javascript, Pascal, Object Pascal, PythonTM, Ruby, VB.NET, WML, and XHTML/HTML with or without CSS, or combinations thereof.
  • Suitable mobile application development environments are available from several sources. Commercially available development environments include, by way of non-limiting examples, AirplaySDK, alcheMo, Appcelerator ® , Celsius, Bedrock, Flash Lite, .NET Compact Framework, Rhomobile, and WorkLight Mobile Platform. Other development environments are available without cost including, by way of non-limiting examples, Lazarus, MobiFlex, MoSync, and Phonegap. Also, mobile device manufacturers distribute software developer kits including, by way of non-limiting examples, iPhone and iPad (iOS) SDK, AndroidTM SDK, BlackBerry ® SDK, BREW SDK, Palm ® OS SDK, Symbian SDK, webOS SDK, and Windows ® Mobile SDK.
  • An authorized third party may integrate an SDK into their own application, which may enable the third party application to access data from the flow detection sensor and/or access the server and processing algorithms for the display of results through the third party application.
  • a computer program includes a standalone application, which is a program that is run as an independent computer process, not an add-on to an existing process, e.g., not a plug-in.
  • standalone applications are often compiled.
  • a compiler is a computer program(s) that transforms source code written in a programming language into binary object code such as assembly language or machine code. Suitable compiled programming languages include, by way of non-limiting examples, C, C++, Objective-C, COBOL, Delphi, Eiffel, JavaTM, Lisp, PythonTM, Visual Basic, and VB .NET, or combinations thereof. Compilation is often performed, at least in part, to create an executable program.
  • a computer program includes one or more executable compiled applications.
  • the platforms, media, methods and applications described herein include software, server, and/or database modules, or use of the same.
  • software modules are created by techniques known to those of skill in the art using machines, software, and languages known to the art.
  • the software modules disclosed herein are implemented in a multitude of ways.
  • a software module comprises a file, a section of code, a programming object, a programming structure, or combinations thereof.
  • a software module comprises a plurality of files, a plurality of sections of code, a plurality of programming objects, a plurality of programming structures, or combinations thereof.
  • the one or more software modules comprise, by way of non-limiting examples, a web application, a mobile application, and a standalone application.
  • software modules are in one computer program or application. In other embodiments, software modules are in more than one computer program or application. In some embodiments, software modules are hosted on one machine. In other embodiments, software modules are hosted on more than one machine. In further embodiments, software modules are hosted on cloud computing platforms. In some embodiments, software modules are hosted on one or more machines in one location. In other embodiments, software modules are hosted on one or more machines in more than one location.
  • the platforms, systems, media, and methods disclosed herein include one or more databases, or use of the same.
  • the entries of the database(s) disclosed herein are searchable using the search module/engine as disclosed herein.
  • each of the seats includes one or more parameters of a seat space.
  • one or more parameters of a seat space is a direct parameter or a derived/calculated parameter based on one or more parameters of the seat space.
  • each of the seats includes one or more parameters that is not related to direct or derived parameters of the seat space.
  • the database is a database of seats containing information extracted from sources external to the systems, methods, and media disclosed herein.
  • the external sources may include commercial airline databases, webpages, websites, online cloud database, or any other commercially available data sources.
  • the database disclosed herein includes at least one parameter of a seat space.
  • the database disclosed herein includes at least one parameter of a seat space that is not directly obtained from external commercial sources.
  • the database includes at least one parameter of a seat space that is derived/calculated based on data from external commercial sources.
  • the database includes at least one parameter of a seat space that is directly measured and entered using the systems, media and methods disclosed herein.
  • the database may include a leg space that is derived/calculated from commercially available information of the seat space.
  • the database includes a price per unit usable space of the seat that is calculated based on the flight ticket price and the usable space available from external sources.
  • the database disclosed herein includes a plurality of users, each of the users includes one or more parameters of physical information of the user.
  • one or more parameters of the physical information is a direct physical information or a derived/calculated parameter based on the physical information of the user.
  • each of the users includes one or more parameters that is not related to direct or derived parameter of the physical information of the user.
  • suitable databases include, by way of non-limiting examples, relational databases, non-relational databases, object oriented databases, object databases, entity-relationship model databases, associative databases, and XML databases.
  • a database is internet-based.
  • a database is web-based.
  • a database is cloud computing-based.
  • a database is based on one or more local computer storage devices.
  • the computer program includes a web browser plug-in.
  • a plug-in is one or more software components that add specific functionality to a larger software application. Makers of software applications support plug-ins to enable third-party developers to create abilities which extend an application, to support easily adding new features, and to reduce the size of an application. When supported, plug-ins enable customizing the functionality of a software application. For example, plug-ins are commonly used in web browsers to play video, generate interactivity, scan for viruses, and display particular file types. Those of skill in the art will be familiar with several web browser plug-ins including, Adobe ® Flash ® Player, Microsoft ® Silverlight ® , and Apple ® QuickTime ® .
  • the toolbar comprises one or more web browser extensions, add-ins, or add-ons. In some embodiments, the toolbar comprises one or more explorer bars, tool bands, or desk bands.
  • plug-in frameworks are available that enable development of plug-ins in various programming languages, including, by way of non-limiting examples, C++, Delphi, JavaTM, PHP, PythonTM, and VB .NET, or combinations thereof.
  • Web browsers are software applications, designed for use with network-connected digital processing devices, for retrieving, presenting, and traversing information resources on the World Wide Web. Suitable web browsers include, by way of non-limiting examples, Microsoft ® Internet Explorer ® , Mozilla ® Firefox ® , Google ® Chrome, Apple ® Safari ® , Opera Software ® Opera ® , and KDE Konqueror. In some embodiments, the web browser is a mobile web browser.
  • Mobile web browsers are designed for use on mobile digital processing devices including, by way of non-limiting examples, handheld computers, tablet computers, netbook computers, subnotebook computers, smartphones, music players, personal digital assistants (PDAs), and handheld video game systems.
  • Suitable mobile web browsers include, by way of non-limiting examples, Google ® Android ® browser, RFM BlackBerry ® Browser, Apple ® Safari ® , Palm ® Blazer, Palm ® WebOS ® Browser, Mozilla ® Firefox ® for mobile, Microsoft ® Internet Explorer ® Mobile, Amazon ® Kindle ® Basic Web, Nokia ® Browser, Opera Software ® Opera ® Mobile, and Sony ® PSPTM browser.

Abstract

A user may initiate a lung function maneuver via a flow detection device and/or a mobile device. Pre-test error detection may occur before the maneuver begins, and the user may perform the maneuver. The mobile device may receive data from one or more sensors associated with the flow detection device, and the mobile device may initiate a visualization to be shown on the mobile device display. The visualization may be adjusted in real time based on the data received from the sensor(s). At the end of the maneuver, the test data may be aggregated and tested for errors. The result of the maneuver may be displayed to the user.

Description

APPARATUSES, METHODS, AND SYSTEMS FOR MOTIVATING QUALITY HOME-BASED SPIROMETRY MANEUVERS AND AUTOMATED EVALUATION AND COACHING
CROSS-REFERENCE TO RELATED APPLICATIONS
[001] This application is a non-provisional of and claims priority to US Provisional Application No. 62/416,657, filed November 2, 2016, and entitled "Method for Motivating Quality Home-Based Spirometry Maneuvers and Automated Evaluation and Coaching," the disclosure of which is hereby expressly incorporated by reference in its entirety.
SUMMARY OF THE INVENTION
[002] Disclosed herein are computer-implemented systems, methods, and/or non-transitory computer readable media for motivating quality home-based spirometry maneuvers and automated evaluation and coaching. The systems, methods, and/or media disclosed herein are configured to provide full feedback to the user if any element of the hardware or software goes wrong. Further, the systems, methods, and/or media disclosed herein are configured to automatically identify problems in the hardware, software, and/or maneuver by the user and provide coaching information for the user to solve the problems and ensure correct, reliable, and convenient maneuvers using a home-based spirometer. Some exemplary embodiments of a home-based spirometer are discussed in US Patent No. 9,706,946, the disclosure of which is hereby expressly incorporated by reference in its entirety.
[003] In some cases, the systems, methods, and/or media disclosed herein include a home-based portable spirometer that is configurable to connect to an external portable device. In some instances, the systems, methods, and/or media disclosed herein include an application for automatic evaluation and coaching of the user for quality lung function testing using a portable spirometer. In some cases, the systems, methods, and/or media disclosed herein utilize a machine learning algorithm to generate automatic coaching information for the user that is capable of providing customized coaching based on the user's previous maneuvers or medical information. Further, the systems, methods, and/or media disclosed herein enable accurate evaluation of lung function maneuvers performed by the user so that the information can be used by the user or his/her doctor for current or future intervention, such as planning medication or future treatment.
[004] A lung function motivation method for motivating quality operation of a lung flow detection device is disclosed. The method may include receiving, at a processor associated with a mobile device, an indication of the start of a lung function maneuver for measuring lung function of a user using a flow detection device. Data covering at least the duration of the maneuver may be received via the processor from at least one flow detection sensor of the flow detection device. A visualization of the lung function maneuver may be initiated via the processor, and the visualization may be displayed on the mobile device. The visualization displayed on the screen may be adjusted via the processor and based on the data from the at least one flow detection sensor. An end of the lung function maneuver may be determined via the processor, and data received from the at least one flow detection sensor between the indication of the start of the lung function maneuver and the end of the lung function maneuver may be aggregated via the processor. At least one pre-processing error detection may be performed, via the processor, on the aggregated data, and a result of the lung function maneuver may be displayed via the processor.
[005] In some embodiments, the method also includes performing, via the processor, at least one pretest error detection prior to the start of the lung function maneuver and/or performing, via the processor, at least one error detection test during the lung function maneuver. The at least one error detection test during the lung function maneuver may comprise comparing data from the at least one flow detection sensor to a threshold value in real time and/or may comprise determining power or input data and comparing the power or input data to a threshold in real time. In some implementations, the method may further comprise calculating a scaled force vector based on data from the at least one flow detection sensor, and comparing the scaled force vector to a threshold. The threshold comparison may occur in real time. The scaled force vector may be calculated based on data from the at least one flow detection sensor and at least one of randomized game variables and interactive game variables.
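For illustration, the scaled force vector described above might be computed along the following lines. This is a minimal sketch in Python; the function name, scaling constants, and randomized game variable are hypothetical rather than taken from the disclosure.

```python
import random

def scaled_force_vector(amplitude, baseline=0.02, gain=4.0, jitter=0.05):
    """Map one microphone amplitude sample to a force applied to the
    visualization.  `baseline`, `gain`, and `jitter` are illustrative
    placeholders, not values from the disclosure."""
    effort = max(amplitude - baseline, 0.0)          # remove the ambient noise floor
    force = gain * effort                            # scale sensor data to game units
    force *= 1.0 + random.uniform(-jitter, jitter)   # randomized game variable
    return force

# Example: a stream of amplitude samples drives the speed of an on-screen object.
speed = 0.0
for sample in [0.10, 0.35, 0.60, 0.42, 0.05]:
    speed += scaled_force_vector(sample)
    print(f"amplitude={sample:.2f} -> object speed={speed:.2f}")
```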
[006] The method may also include sending, via a processor, the aggregated data to at least one server, and receiving, from the server and via the processor, the result of the lung function maneuver.
[007] In embodiments where pre-test error detection is performed, the pre-test error detection includes collecting data comprising at least one of data from the at least one flow detection sensor, at least one of microphone power data and input data from the mobile device, environmental data, and video data from a camera associated with the mobile device. The environmental data may comprise at least one of ambient pressure, ambient temperature, and ambient volume. The pre-test error detection may comprise determining environmental data comprising at least one of ambient pressure, ambient temperature, and ambient volume, and scaling parameters for measurements from the at least one flow detection sensor based on at least one of the ambient pressure, ambient temperature, and ambient volume.
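The disclosure does not specify how the scaling for body temperature and pressure, saturated (BTPS) is performed; as one hedged illustration, a conventional ATPS-to-BTPS correction factor computed from phone-reported ambient temperature and pressure might look like the sketch below, with all names and constants being illustrative.

```python
def btps_factor(ambient_temp_c, ambient_pressure_mmhg):
    """Conventional ATPS-to-BTPS correction factor; the disclosure does not
    state which correction is applied, so this is shown for illustration."""
    # Saturated water-vapor pressure at ambient temperature (Antoine
    # approximation, result in mmHg).
    ph2o_ambient = 10 ** (8.07131 - 1730.63 / (233.426 + ambient_temp_c))
    # 47 mmHg is the water-vapor pressure at body temperature (37 C).
    return ((273.15 + 37.0) / (273.15 + ambient_temp_c)
            * (ambient_pressure_mmhg - ph2o_ambient)
            / (ambient_pressure_mmhg - 47.0))

# Example: scale a measured flow value using phone-reported ambient conditions.
measured_flow_lps = 6.2
corrected = measured_flow_lps * btps_factor(ambient_temp_c=22.0,
                                            ambient_pressure_mmhg=755.0)
print(f"BTPS-corrected flow: {corrected:.2f} L/s")
```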
[008] In some implementations, the flow detection device may comprise an oscillation chamber and the at least one flow detection sensor comprises a microphone. Data from the microphone may comprise amplitude data, and the aggregated data may be an audio file.
[009] The visualization may comprise a display of at least one moving object, and the adjustment of the visualization may comprise changing at least one of a speed and a direction of the at least one moving object. The visualization may be configured to provide motivation to a user to correctly perform the lung function maneuver.
[010] The flow detection sensor may comprise at least one of an acoustic sensor, a microphone, an oscillation rate sensor, a pressure sensor, a pressure transducer, an ultrasound sensor, a rotary wheel sensor, a piezoelectric sensor, a cantilever beam sensor, a thermal sensor, and a hot wire sensor.
[011] In some embodiments, a countdown may be displayed via the screen to tell a user when to initiate the lung function maneuver. The at least one pre-processing error detection on the aggregated data may comprise a pre-processing error detection machine learning algorithm.
[012] A lung function maneuver motivation method for motivating quality operation of a lung flow detection device may include receiving, at a server, the aggregated data. At least one post-processing error detection may be performed, at the server, based on the aggregated data. The post-processing error detection may include an analysis of at least one of lung function results, data from the at least one flow detection sensor, fast Fourier transform heat map, data from prior lung function maneuvers, and results from prior lung function maneuvers. A result of the lung function maneuver may also be determined. The result of the lung function maneuver may be sent to the mobile device.
[013] A lung function maneuver motivation system for motivating quality operation by a user of a lung function detection device is disclosed. The system includes a lung flow detection device having at least one flow detection sensor and configured to produce lung function maneuver data, a mobile device having a processor, a memory, and a display, and an application program operating on the mobile device. The lung flow detection device may connect with or otherwise send signals to the mobile device. The application program may comprise a plurality of computer instructions configured to cause the processor of the mobile device to collect lung function maneuver data from the at least one flow detection sensor, generate and adjust a visualization based on the lung function maneuver data, and display a result of the lung function maneuver.
[014] In some embodiments, the flow detection device may comprise an oscillation chamber, and the at least one flow detection sensor may comprise a microphone. The visualization may comprise a display of at least one moving object, and the adjustment of the visualization may comprise changing at least one of a speed and a direction of the at least one moving object. The mobile device may determine a scaled force vector based on the lung function maneuver data, and the scaled force vector may determine the adjustment of the visualization.
[015] In some implementations, the mobile device further comprises at least one sensor. The mobile device may set at least one parameter based on a reading from the at least one sensor, and data from the at least one flow detection sensor may be interpreted based on the at least one parameter.
[016] In some embodiments, the mobile device may aggregate the lung function maneuver data. The mobile device may verify that the aggregated lung function maneuver data meets at least one threshold. The system may further comprise a server, and when the threshold is met, the mobile device may upload the aggregated lung function maneuver data to the server. The server may perform at least one post-processing error detection based on the aggregated lung function maneuver data. The postprocessing error detection comprises an analysis of lung function results, data from the at least one flow detection sensor, fast Fourier transform heat map, data from prior lung function maneuvers, results from prior lung function maneuvers, and the result of the lung function maneuver. In some embodiments, the server may compare the aggregated lung function maneuver data to a threshold range, wherein the threshold range is determined based on at least one of generic or patient specific data. In some implementations, the server inputs the aggregated lung function maneuver data into a machine learning algorithm that is configured to identify errors during the post-processing error detection.
BRIEF DESCRIPTION OF THE DRAWINGS
[017] The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:
[018] Fig. 1 shows a non-limiting exemplary embodiment of a portable home-based spirometer according to some embodiments.
[019] Figs. 2A-C show an exemplary flow diagram according to some embodiments.
[020] Figs. 3A-3C show non-limiting exemplary embodiments of pre-test error detection according to some embodiments.
[021] Figs. 4A-4E show non-limiting exemplary embodiments of during test motivation and error detection according to some embodiments.
[022] Figs. 5A-5C show non-limiting exemplary embodiments of pre-processing error detection according to some embodiments.
[023] Figs. 6A-6E show non-limiting exemplary embodiments of post-processing error detection according to some embodiments.
[024] Fig. 7 shows a non-limiting exemplary embodiment of a results screen according to some embodiments.
[025] Figs. 8A-B shows a flow diagram of evaluating and coaching user for quality home-based spirometry maneuvers according to some embodiments.
[026] Fig. 9 shows a non-limiting exemplary embodiment of visualization when a user starts a single lung function maneuver according to some embodiments.
[027] Fig. 10 shows a non-limiting exemplary embodiment of visualization when a user finishes a single lung function maneuver according to some embodiments.
[028] Fig. 11 shows a non-limiting exemplary embodiment that is configurable to evaluate and provide visualization of lung function of the user.
[029] Fig. 12 shows a non-limiting exemplary embodiment of tracking lung function performance over time according to some embodiments.
[030] Fig. 13 shows a non-limiting exemplary embodiment of visualizing lung function performance according to some embodiments.
[031] Fig. 14 shows a non-limiting exemplary embodiment of a digital processing device according to some embodiments.
DETAILED DESCRIPTION OF THE INVENTION
[032] In some embodiments, disclosed herein are computer-implemented systems, methods, and/or non-transitory computer readable media for motivating use of a home-based spirometer during the completion of a single lung function maneuver. A visualization may be started by the user beginning a lung function maneuver, and the visualization may be completed when the lung function maneuver is complete. The quality of the lung function maneuver may be determined and/or evaluated, and feedback may be displayed to the user. The feedback may include the maneuver quality, and, in some embodiments, may also include coaching on technique changes that the user can implement in future uses to obtain better results.
[033] In some cases, a home-based spirometer may include a device that measures expired flow rate and volume that can be used in a home setting for screening, diagnosis, and/or monitoring acute or chronic respiratory conditions such as asthma, COPD, Cystic Fibrosis, bronchitis, or pneumonia. The spirometer comprises a display screen and a modality for capturing expired flow rate and/or volume. In further cases, the spirometer disclosed herein may include one or more of: cloud-based software; a display, such as a mobile device display; a user interface, such as one displayed through a mobile application ("mobile app" or "app"); at least one sensor, wherein at least one of the sensor(s) transmits measured data to the user interface (e.g., the app) in real-time during a lung function maneuver; and/or the like. In some embodiments, the spirometer may include a flow sensor configured to determine at least one of speed and volume of the expired flow.
[034] An exemplary spirometer is shown in Fig. 1, and additional embodiments are also described in US Patent No. 9,706,946, which is incorporated by reference herein in its entirety. The spirometer may be configured to measure airflow, which may be used to monitor or assess respiratory function. In various embodiments, a flow meter may be in communication with a computing device executing one or more applications to process and analyze data generated at the spirometer.
[035] In some embodiments, a device for measuring a continuous flow rate of an airstream includes a nozzle having at least one channel to vent a portion of the airstream into an environment external to the device and at least one other channel to direct another portion of the airstream into a fluidic oscillator. The device also includes a fluidic oscillator having a housing and at least one obstacle to induce oscillations in the airstream. A frequency of the oscillations correlates to the continuous flow rate of the airstream. The device also includes at least one sensor to measure the oscillations of the airstream. The at least one sensor also generates an electronic signal corresponding to the oscillations measured and transmits the electronic signal to a computing device.
[036] In another embodiment, a device for measuring a continuous flow rate of an air stream includes a nozzle to direct a portion of the airstream into a fluidic oscillator. The device also includes a detachable mouthpiece to reduce back pressure within the device. The detachable mouthpiece has a diameter approximately or exactly equal to the diameter of the nozzle and defines a plurality of channels. At least one of the plurality of channels directs a portion of the airstream to the nozzle while, at least one other channel vents another portion of the airstream to an environment external to the device. The device also includes a fluidic oscillator having a housing and at least one obstacle to induce oscillations in the airstream. A frequency of the oscillations correlates to the continuous flow rate of the airstream. The device also includes at least one sensor to measure the oscillations of the airstream. The at least one sensor also generates an electronic signal corresponding to the oscillations measured and transmits the electronic signal to a computing device.
[037] A flow meter for monitoring lung function of a user may include a flow sensor capable of determining a rate and/or volume of air expired by the user. In some embodiments, an oscillation chamber may be used to induce at least one oscillation in an airflow traversing the oscillation chamber. The airflow is generated by the user during a respiratory test. The sensor may measure the rate of oscillation and transmit a data signal to a computing device. In some embodiments, the rate of oscillation may be measured in the oscillation chamber. The computing device has at least one processor and receives the data signal. The computing device also processes the data signal to determine at least one of a flow rate, time duration, and a volume of the airflow in the oscillation chamber. The system further includes a display device in communication with the computing device to display an assessment of respiratory health and a risk level to the user.
[038] Various other sensors may also be used to determine flow rate and/or volume of air expired by the user. For example, the sensors may include an acoustic sensor, a microphone, an oscillation rate sensor, a pressure sensor, a pressure transducer, an ultrasound sensor, a rotary wheel sensor, a piezoelectric sensor, a cantilever beam sensor, a thermal sensor, a hot wire sensor, and/or the like. In several implementations, an acoustic sensor or microphone may detect sounds produced by oscillations within the oscillation chamber. In some embodiments, more than one of these sensors may be used in combination. For example, an acoustic sensor or microphone may be used in conjunction with an oscillation rate sensor and/or a pressure sensor.
[039] A system for monitoring lung function of a user includes at least one processor, at least one data storage device and an application executing on the at least one processor to determine spirometric characteristics of a data signal received from a spirometer. The data signal is generated in response to an airflow in the spirometer. The system also generates at least one display on a display device to display the spirometric characteristics.
[040] Various methods for measuring airflow and monitoring lung function may be performed using various embodiments. The methods may also be performed with other suitable devices and systems.
[041] A user may perform a lung function maneuver using a device that measures lung function. The device can be a spirometer similar to that described above or may be any other device. The lung function maneuver may comprise an individual breathing in as much air as possible and expiring that air as hard and as fast as possible in a way that enables the flow sensor to capture the data. The user blasts out their air until they cannot expel any more air from their lungs.
[042] As shown in Figs. 2A-C, in some embodiments, a lung function maneuver may be initiated by a user. The lung function maneuver may be initiated by the user blowing into the device, by the user pressing a start button on the device, by the user pressing a start button on the app, and/or the like. A processor at the device and/or a processor associated with a mobile device associated with the app determines whether the user is in an environment that is conducive for taking the lung function test. As part of this pre-test error detection, various data may be collected. The data may be one or more of: amplitude data from a microphone associated with the device, microphone power and input data associated with the mobile device, and other data, such as pressure, temperature, video from phone, and/or the like. The data may be compared in real time (or approximate real time) to a set threshold. A connectivity test may be performed to ensure that the device is connected to the app. The mobile device may connect to the device via a cable or wirelessly, such as by using WiFi, Bluetooth, a digital connection, and/or the like. In some embodiments, the device API may be pinged and a response may be confirmed. In some embodiments, parameters for measurement may be set. For example, parameters for body temperature and pressure, saturated (BTPS) may be scaled. In some implementations, the BTPS may be scaled based on ambient temperature and pressure. Several other pre-test checks may also be performed. Such tests may include determining whether the sensor is plugged in, whether the environment is too loud, whether the device and/or mobile device is connected to the internet, whether the ambient pressure or temperature is outside of an acceptable range, and whether the user has bad posture. If any of these pre-test checks fails, an error message may be displayed to the user via a display associated with the mobile device and/or the device. The error message may be descriptive and/or may contain technique coaching to the user based on the error that is detected.
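A minimal sketch of the pre-test error detection described above is given below; the check names, threshold values, and coaching strings are hypothetical placeholders, not values from the disclosure.

```python
def run_pretest_checks(mic_rms, ambient_noise_db, temp_c, pressure_mmhg,
                       device_connected, internet_ok):
    """Return user-facing error/coaching messages, or an empty list if the
    environment looks suitable.  All thresholds are illustrative."""
    errors = []
    if not device_connected:
        errors.append("Sensor not detected - check that the device is connected.")
    if not internet_ok:
        errors.append("No internet connection - results cannot be uploaded.")
    if ambient_noise_db > 70:            # hypothetical noise ceiling
        errors.append("It's too loud here - move somewhere quieter.")
    if not 0 <= temp_c <= 40:            # hypothetical acceptable ranges
        errors.append("Ambient temperature is outside the supported range.")
    if not 600 <= pressure_mmhg <= 800:
        errors.append("Ambient pressure is outside the supported range.")
    if mic_rms < 1e-4:                   # microphone appears to produce no signal
        errors.append("No signal from the sensor microphone.")
    return errors

msgs = run_pretest_checks(mic_rms=0.01, ambient_noise_db=45, temp_c=22,
                          pressure_mmhg=755, device_connected=True, internet_ok=True)
print(msgs or "Environment OK - show the countdown")
```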
[043] If no pre-test errors are detected, thereby indicating that the user is in the right environment to take the lung function test, a countdown to lung function measurement is displayed and/or verbally indicated to the user. The countdown may be displayed on the app and/or on the device. In some embodiments, the mobile device and/or device may provide verbal instructions, such as "breathe in all of your air," "blast your air out," "keep going," and/or the like. A processor on the device and/or associated with the mobile device may determine whether a start visualization threshold has been met. In some embodiments, the threshold may be applied to amplitude data from a microphone associated with the device, and a real-time comparison may be made with the set threshold. In other implementations, one or more types of sensors may be used, such as an acoustic sensor, a microphone, an oscillation rate sensor, a pressure sensor, a pressure transducer, an ultrasound sensor, a rotary wheel sensor, a piezoelectric sensor, a thermal sensor, and/or the like. If the threshold is not met within a set period of time, an error message may be displayed to the user via a display associated with the mobile device and/or the device. The error message may be descriptive and/or may contain technique coaching to the user based on the error that is detected.
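One way the start-visualization threshold check could be sketched is shown below; the polling rate, threshold, and timeout are assumptions for illustration only.

```python
import time

def wait_for_start(read_amplitude, start_threshold=0.15, timeout_s=10.0):
    """Poll the sensor until the amplitude crosses the start-visualization
    threshold, or give up after `timeout_s`.  Threshold and timeout are
    illustrative, not values from the disclosure."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if read_amplitude() >= start_threshold:
            return True            # begin the visualization
        time.sleep(0.01)           # ~100 Hz polling of the sensor stream
    return False                   # show a "we didn't hear a blow" coaching message

# Example with a simulated amplitude stream (zeros after the samples run out).
samples = iter([0.02, 0.05, 0.40])
print(wait_for_start(lambda: next(samples, 0.0)))   # True: 0.40 crosses the threshold
```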
[044] When the user is attempting a lung function test and the threshold has been met, a visualization may be initiated. In some embodiments, the visualization may be initiated via and shown in the app. The visualization may have an initial scaled force vector, which is determined based on data from the one or more sensors associated with the device. In one embodiment, amplitude from the device microphone may be used. In some embodiments, the scaled force vector may also be based on randomized and/or interactive game variables. The scaled force vector may be updated and the updated scaled force vector may be applied to the visualization. Additionally, the sensor data, randomized game variables, and interactive game variables may be used to motivate the user to perform the test correctly. For example, the visualization may be a game animation, such as a rocket taking off. When the user blows into the device, the sensor detects a metric associated with the expiration rate and/or volume. A scaled force vector is determined based on the metric, and the scaled force vector may correlate to the speed of the rocket in the visualization. Thus, when the user expires air at an increased rate and/or volume, the rocket appears to move faster, thereby motivating the user to continue expiring air.
[045] As the test continues, the data may be monitored and compared to a set threshold in real time (or approximate real time) to determine whether an in-test error has occurred. This may prevent a user from continuing a lung function maneuver or waiting for a test that cannot be processed. For example, input data may be monitored to ensure there is no interruption in the data stream. In some implementations, microphone power and other inputs may be compared to a set threshold to determine if there has been an interruption in the data stream. If so, an error message may be displayed to the user.
[046] Additionally or alternatively, sensor data, such as amplitude data from a microphone associated with the device, may also be compared in real time (or approximate real time) to a set threshold to determine if an end visualization threshold has been reached. In some embodiments, a scaled force vector, which is calculated with data from the one or more sensors, may also be compared in real time (or approximate real time) to a threshold to determine if a visualization timeout has been reached. When a test is running and no errors have occurred, these factors are continually monitored. As long as there is no interruption in the data stream, the maneuver is complete and the visualization ends once the end visualization threshold or a timeout has been reached.
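A combined sketch of the in-test monitoring described in the two preceding paragraphs follows; the end threshold, dropout window, and timeout values are illustrative assumptions.

```python
import time

def run_maneuver_loop(read_sample, end_threshold=0.05, timeout_s=20.0, dropout_s=0.5):
    """In-test monitoring: end the visualization when flow falls below the end
    threshold, abort on a data-stream interruption, and enforce a visualization
    timeout.  All constants are illustrative placeholders."""
    start = last_sample = time.monotonic()
    samples = []
    while True:
        now = time.monotonic()
        sample = read_sample()                 # returns None if no data arrived
        if sample is not None:
            samples.append(sample)
            last_sample = now
            # Ignore the end threshold for the first few samples so the test
            # does not end before flow has ramped up.
            if len(samples) > 5 and sample < end_threshold:
                return "complete", samples
        if now - last_sample > dropout_s:
            return "stream_interrupted", samples
        if now - start > timeout_s:
            return "timeout", samples
        time.sleep(0.01)

# Example with a simulated flow trace.
flow = iter([0.5, 0.9, 0.7, 0.4, 0.2, 0.08, 0.03])
print(run_maneuver_loop(lambda: next(flow, None))[0])   # 'complete'
```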
[047] In some embodiments, data associated with the completed maneuver may be pre-processed for errors. The pre-processing error detection may prevent a user from waiting for full processing on a test that cannot be processed or that has errors. To achieve this, sensor data may be analyzed and compared to thresholds. In some implementations, an audio file associated with the lung function maneuver may be generated from a microphone associated with the device. The audio file may be compared to set thresholds, and the data collected during the lung function test may be inputted into pre-processing error detection machine learning algorithms for future evaluation and detection of errors. For example, a length of the audio file may be compared to a threshold length or a threshold file size, and if the audio file is too short or too small, an error message may be displayed to the user to indicate that the user did not expire long enough or the lung function test was not captured or did not occur. If other pre-processing quality checks are not met, a different error message corresponding to the error that occurred may be displayed to the user.
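The pre-processing checks on the aggregated audio file might be sketched as follows; the file path, minimum duration, and minimum size are hypothetical.

```python
import os
import wave

def preprocess_checks(wav_path, min_seconds=1.0, min_bytes=16_000):
    """Cheap local checks before uploading: was anything captured, and is the
    recording long enough to represent a full exhalation?  The limits and the
    file name below are hypothetical."""
    if not os.path.exists(wav_path) or os.path.getsize(wav_path) < min_bytes:
        return "Maneuver not captured - please try again."
    with wave.open(wav_path, "rb") as wf:
        duration = wf.getnframes() / float(wf.getframerate())
    if duration < min_seconds:
        return "That blow was too short - blast out all of your air."
    return None   # passes local quality checks; OK to upload

print(preprocess_checks("maneuver.wav"))   # hypothetical file path
```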
[048] If the test meets the pre-processing quality checks (if any), a processor associated with the mobile device having the app may determine if there is appropriate connectivity for the test data to be uploaded. If not, an error message may be displayed to the user. In some embodiments, the test data may be stored at a memory associated with the mobile device until appropriate connectivity is obtained. The app may automatically push the data to the server or may prompt the user to initiate the upload when connectivity is obtained, or the app may store the data until the user reopens the app and sends the data.
[049] The test data is received at the server, and the server may perform post-processing error detection. This may ensure that the signal and the data meet various quality standards. Additionally, this may prevent the user from getting poor or incorrect results, and it may help the user understand what to change to get improved results. In some embodiments, the post-processing error detection may include an analysis of lung function data (such as FEV1, peak flow, flow volume curve, etc.), data from the at least one sensor (such as amplitude data from a microphone associated with the device), a fast Fourier Transform (FFT) heat map of the audio file, the user's previous data (including previous test results, FFT, sensor data, etc.), and/or the like. These are inputted into various post-processing error detection machine learning algorithms. Post-processing error detection may include testing for technical errors and user errors. Example technical errors include low signal-to-noise ratio, trace/heatmap inconsistencies (such as poor trace, incorrect PEF, incorrect FEV1, and/or the like), microphone static, and/or the like. User errors may include a cough, slow start, vocalization, incorrect mouth placement on the device and/or relative to the sensor, insufficient expiratory time, user hunched over, and/or the like. If any of these errors (or any others tested for) occur, a display message is displayed to the user. The display message may include a description of the error that occurred, and may also include suggestions or coaching for the user to avoid the error in subsequent maneuvers.
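As one illustration of the fast Fourier transform heat map mentioned above, a short-time FFT magnitude matrix could be computed from the maneuver audio roughly as follows; the window and hop sizes are assumptions, and the synthetic signal merely stands in for a real recording.

```python
import numpy as np

def fft_heatmap(signal, window=1024, hop=256):
    """Short-time FFT magnitude matrix ("heat map") of the maneuver audio, one
    possible input feature for post-processing error detection.  Window and hop
    sizes are illustrative."""
    hann = np.hanning(window)
    frames = [np.abs(np.fft.rfft(signal[i:i + window] * hann))
              for i in range(0, len(signal) - window, hop)]
    return np.array(frames).T          # rows = frequency bins, columns = time

# Example: a synthetic decaying tone standing in for an oscillation-chamber recording.
sr = 44_100
t = np.linspace(0.0, 2.0, 2 * sr, endpoint=False)
audio = np.exp(-t) * np.sin(2 * np.pi * 900.0 * (1.0 - 0.3 * t) * t)
print(fft_heatmap(audio).shape)        # (frequency bins, time frames)
```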
[050] In some embodiments, a machine learning algorithm associated with post-processing error detection may determine when a test result and/or test data falls outside of a normal range (e.g., by comparing the test result and/or test data with threshold values). When the result/data is within a threshold of deviation, the test may be deemed to be a good test; when the result/data is outside of the threshold of deviation, the test may be deemed to be a bad test. In some embodiments, the threshold values and/or range may be based on inputted data and/or a set of inputted rules and/or vectors. In other implementations, the threshold values and/or range may update based on test data and/or test results. The test data and/or test results that inform the threshold(s) may be specific to the user or may be generic to all app users. User-specific threshold(s) may allow for further customization of test result outcomes. For example, each user may have a unique identifier associated with how (s)he performs a test, similar to a fingerprint. By tracking and learning from user-specific data/results, the thresholds may account for the user-specific tendencies or characteristics. Additionally, the user-specific data/results may be correlated to user-specific warning signs. For example, if a user consistently has an asthma attack after a unique type of test result (i.e., different from the generic population of users), a warning indicator may be generated and displayed to the user when the user has a test result that corresponds to that unique result.
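As a stand-in for the machine learning comparison described above, a simple user-specific range check based on the user's own history might look like the following sketch; the mean plus/minus k standard deviations band is a simplification chosen for illustration, not the disclosed algorithm.

```python
import statistics

def within_personal_range(new_value, prior_values, k=2.0, min_history=5):
    """Flag a result that falls outside the user's own historical range, using a
    simple mean +/- k*stdev band as a stand-in for the disclosed machine
    learning models.  `k` and `min_history` are illustrative."""
    if len(prior_values) < min_history:
        return True          # not enough history; fall back to generic thresholds
    mu = statistics.mean(prior_values)
    sd = statistics.pstdev(prior_values) or 1e-9
    return abs(new_value - mu) <= k * sd

# Example with peak-flow values (L/min) from a user's earlier maneuvers.
history = [410, 425, 418, 430, 422, 415]
print(within_personal_range(300, history))   # False: outside the user's usual band
```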
[051] While the processing occurs at the server, the app (and mobile device associated therewith) may monitor to determine if a processing timeout has been reached. In some embodiments, a timer may start once the signal is uploaded to the server ("cloud"). The app monitors the cloud/ API for available test results, such as test results determined from the test data. If the test results are not achieved within a preset time threshold, the session times out and a display message is shown to the user. The user may be given the option to rerun the test, for example by re-uploading the test data to the server, or may be asked to restart with a new lung function maneuver. If test results are successfully determined, the processed results are returned to the mobile device. The results may be displayed within the app, and/or the user may be prompted to perform another lung function maneuver, such as a next lung function measurement in a set.
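The processing-timeout monitoring could be sketched as a simple polling loop, as below; the endpoint URL, JSON fields, and timing constants are hypothetical, since the disclosure does not describe the API contract.

```python
import json
import time
import urllib.request

def poll_for_results(result_url, timeout_s=60.0, interval_s=2.0):
    """Poll the cloud API for processed results until a timeout elapses.  The
    URL, JSON fields, and timing values are hypothetical; the disclosure does
    not describe the API contract."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(result_url, timeout=5) as resp:
                payload = json.loads(resp.read().decode("utf-8"))
            if payload.get("status") == "done":
                return payload["results"]        # e.g., FEV1, PEF, FVC
        except OSError:
            pass                                 # transient network error; keep polling
        time.sleep(interval_s)
    return None   # session timed out: offer to re-upload or restart the maneuver
```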
[052] In further embodiments, a lung function maneuver may begin when a user starts a lung function test(s) by pressing the start button. The start button may be on the spirometer device or may be displayed in the app. Before the maneuver begins, the app and/or a processor included in the spirometer may perform several tests to ensure that the device and app are prepared for the test. For example, the app and/or a processor included in the spirometer may determine whether the sensor is transmitting data to the mobile device. If the sensor is not transmitting data, an error message may be displayed to the user, such as the error message shown in Fig. 3A. Additionally or alternatively, the app and/or a processor included in the spirometer may determine if the device associated with the app and/or device is connected to the internet. If not, an error message, such as the one shown in Fig. 3B, is displayed. In some embodiments where an auditory sensor is used, the app and/or processor included in the spirometer may determine a noise level, such as an ambient noise level before the test begins or as the test is running, in order to determine if the auditory sensor will be able to obtain an accurate reading. If the ambient noise is too loud, the error message shown in Fig. 3C may be displayed. In some embodiments, no pre-maneuver tests may be performed, while in other embodiments, one or more pre-maneuver tests are performed.
[053] If no errors are detected after pre-maneuver tests are performed, the user may be prompted to complete a lung function test. For example, the user may be prompted via a display screen in the app on the user's mobile device, via a display (such as a light or display screen) on the spirometer, and/or the like. The display may, in some implementations, include a countdown. In some embodiments, the app may display a user interface indicating that the app and/or device are ready for the test to begin. For example, a screen may be displayed to indicate that the test is ready for launch, and may display a button for the user to press to start the test. An exemplary screen display is shown in Fig. 4A. When the user presses the start button, the display may change to indicate that the user may blow into the device when ready, as shown in Fig. 4B.
[054] When the user starts blowing into the device and through the sensor (e.g., the flow sensor of the spirometer), a visualization may be displayed to the user. An exemplary visualization is shown in Fig. 4C, which shows a virtual dashboard in which greater force and/or volume of expired air corresponds to an increase in the at least one scale shown.
[055] In some cases, the displayed visualization may include at least one object that moves and a background. When the user expires air into the spirometer, the sensor(s) send data corresponding to flow rate and/or volume to the user's mobile device, which uses the data to generate the visualization. When an initial threshold of rate and/or volume is reached, the object(s) may begin to move relative to the background. Such object(s) may be indicative of the expired rate and/or volume of air. If the initial threshold of signal that is indicative of expired flow or volume is not reached, the objects do not move and the end state is not reached. Once the initial threshold has been met, the objects begin to move so as to provide motivation to the user. The motivation may be randomized movement of the objects progressing toward the end state and/or accumulation of points or a score. The movement of the objects and/or score may be varied or weighted based on data indicative of expired flow, such as flow rate, flow volume, the length of the expiration, and/or the like. The weighting of the score system may favor the user blasting out their air initially (i.e., data indicative of expired flow or volume) and favor the length later in the lung function maneuver to encourage the user to blast out their air as fast as possible at the beginning and keep blasting out their air until they have no more air left in their lungs.
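An illustrative weighting of the score system along the lines described above is sketched below; the one-second crossover and the weights are placeholders.

```python
def maneuver_score(flow_samples, sample_dt=0.01):
    """Illustrative scoring that rewards high flow early in the maneuver and
    sustained duration later.  The one-second crossover and the weights are
    placeholders, not values from the disclosure."""
    score = 0.0
    for i, flow in enumerate(flow_samples):
        if i * sample_dt < 1.0:
            score += 3.0 * flow * sample_dt      # early: reward blasting air out hard
        else:
            score += 1.0 * sample_dt             # later: reward simply keeping going
    return score

# One second of hard flow (8 L/s) followed by four seconds of tapering effort.
print(maneuver_score([8.0] * 100 + [2.0] * 400))   # 28.0
```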
[056] Once the flow rate and/or volume drops below a threshold level, an end state may be reached. The end state threshold may be the same as the initial threshold, or it may be higher or lower than the initial threshold. The end state may be based on a minimum signal level that is indicative of expired flow or volume, which, when reached, indicates the user is no longer expiring air. The visualization may end automatically based on when the user completes their lung function test. Once the end state is reached, the visualization screen may display an end screen. The end screen may indicate that the test is being processed, as shown in Fig. 4D.
[057] If, at some point during the test and before the end state is reached, the app is no longer connected to the spirometer, an error message may be generated and displayed to the user. An exemplary error screen is shown in Fig. 4E.
[058] After the lung function test is completed, the signal is evaluated for maneuver quality and technical errors. This can include evaluating the maneuver by comparing the data to the user's predicted values, to other maneuvers within a set of previously completed maneuvers, to other maneuvers from previously completed sets, based on heuristics of the flow volume curve, and/or the like. The comparisons may be made against or based on patient-specific data, or they may be made against or based on generic patient data, such as overall average or recommended data.
[059] In some cases, once the signal is evaluated and/or graded for maneuver quality and technical errors, feedback is displayed to the user. The feedback may include information about the quality of the maneuver, information about any technical errors, and coaching about ways to change lung function maneuver technique to improve maneuver quality. Exemplary error messages displayed to the user may include an indication that the maneuver was too short, as shown in Fig. 5A; an indication of an error with the connection to the servers and/or cloud, as shown in Fig. 5B; an indication of an unknown processing error, as shown in Fig. 5C; and/or the like.
[060] Figs. 6A-E show various other exemplary error messages that may be displayed to the user. Fig. 6A shows a vocalization or cough error, Fig. 6B shows an incorrect placement error, Fig. 6C shows a slow start error, Fig. 6D shows a technical error, and Fig. 6E shows a device error and indicates that the user may want to contact customer service, which the user may do directly through the app or the user may be provided with a link and/or contact information for customer service.
[061] When no error is found, a results screen may be displayed to show the user the results of the lung function maneuver. An exemplary results screen according to some embodiments is shown in Fig. 7.
[062] Figs. 8A-B show an exemplary flowchart for automatic evaluation and coaching for a lung function test. In some embodiments, a user may initiate a lung function measurement either by using the spirometer or via the app on their mobile device. When the measurement is initiated, a visualization environment is generated. The visualization includes a countdown to the lung function test to prepare the user, and the visualization starts if one or more start visualization threshold(s) have been met. If the threshold has not been met, the error that may be caused by the hardware, software, and/or user operation may be identified, and a display that is descriptive of the identified errors may be presented to the user, for example, via the mobile app. Coaching and feedback associated with correcting the identified errors may also be provided to the user. Such coaching and feedback may be interactive, provided to the user via an input device. When visualization is in progress, visualization parameters may be updated in real-time using randomized or interactive game variables. For example, the visualization may be customized to reflect various aspects of the user's performance during each individual lung function test. When the visualization ends, the signal and data are checked to determine whether one or more pre-processing quality checks are met. Further, the connectivity for uploading or transmitting data to an external device may be checked, and the signal and/or data are processed and uploaded for storage.
[063] As discussed above, when the user begins a lung function test, motivational visualization may start in an application. An exemplary embodiment is shown in Fig. 9, which shows a user beginning a test on a spirometer and a screen indicating that the test is ready to begin. In some embodiments, the visualization only starts when one or more of the specific visualization threshold(s) has been met. The visualization may be real-time, and may be reflective of the user's expiratory activity.
[064] As shown in Fig. 10, a visualization in a motivational application may end when the user stops the lung function test. For example, the visualization may end when the user stops blowing into the spirometer. In another embodiment, the visualization may end when the flow speed and/or volume drops below a threshold speed/volume. In some implementations, if one or more of the end visualization thresholds has not been met, the visualization may stop when a predetermined time has elapsed since the start of visualization.
[065] Based on one or more lung function test(s), a user's lung function may be automatically evaluated and visualized. For example, in an embodiment shown in Fig. 11, the lung function evaluation may be color-coded, with different colors indicating different lung performance levels. If the test goes well (i.e., for embodiments having error detection, no errors are detected), an assessment of the user's lung health is performed based on the data from the lung function test(s). If the user's lung function is good, the user may be given a green indicator (e.g., via the app or via a light on the spirometer).
[066] If the lung function falls in a yellow zone, the user may be given a yellow indicator. The yellow indicator may be a yellow light on the spirometer device and/or may be shown on the user's mobile device via the app. The yellow indicator may be a warning indicator, which signals the user to take a rescue medication, such as an inhaler. In some embodiments, after a yellow zone test is performed, the device and/or app may prompt the user to perform another test after a set period of time, such as twenty minutes. The set period of time may be shorter, such as five minutes, ten minutes, fifteen minutes, and/or the like, or the set period of time may be longer, such as twenty-five minutes, thirty minutes, forty-five minutes, an hour, and/or the like.
[067] If the lung function falls in the red zone, the user may be given a red indicator, which may be a red light on the spirometer device and/or may be shown on the user's mobile device via the app. A red indicator may indicate an emergency and/or that an emergency situation is imminent. Thus, in some embodiments, the user may be told to communicate with a medical professional or to optimize medication options. The app may provide a prompt to the user to call or otherwise message (e.g., message via the app, text message, SMS message, MMS message, email, etc.) the user's medical professional or caregivers (such as family, friends, coworkers, etc.). In some implementations, the prompt may be a pop-up notification. In some implementations, the app may automatically contact the medical professional without further input from the user.
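The zone boundaries themselves are not specified above; asthma action plans commonly define them as a percentage of the patient's personal-best peak flow, and the sketch below uses that common convention (80% and 50% cut-offs) purely as an assumed example, not as values taken from this disclosure.

```python
# Illustrative zone classification. The 80% / 50% personal-best cut-offs
# follow a common asthma action-plan convention and are not specified by
# this disclosure.

def classify_zone(pef: float, personal_best_pef: float) -> str:
    ratio = pef / personal_best_pef
    if ratio >= 0.8:
        return "green"   # lung function good; no action suggested
    if ratio >= 0.5:
        return "yellow"  # warning; e.g., prompt rescue medication and a retest
    return "red"         # emergency indicator; prompt contact with a clinician


# Example: a reading of 320 L/min against a personal best of 500 L/min
# falls in the yellow zone (64% of personal best).
print(classify_zone(320, 500))  # -> "yellow"
```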
[068] A sample yellow zone test result is shown in Fig. 13. In some implementations, the user may be prompted to send test results (whether green, yellow, or red) to their medical professional directly from the app on their mobile device. The results may be sent to the medical professional via text message, SMS message, MMS message, app message, app notification, email, fax, and/or the like.
[069] Referring to Fig. 12, lung function performance may be tracked over time. Additionally, lung function performance may be correlated with medication, activity, triggers, or other influence factors over time. As such, factors for improvement of lung performance can be determined and functional deterioration of the lung can be prevented.
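As a small illustration of how such tracking might flag deterioration, the sketch below compares the most recent week of readings with the preceding week; the weekly window and the 10% decline threshold are assumptions made for illustration only, not a clinical rule or a method required by this disclosure.

```python
# Hypothetical week-over-week trend check on dated lung function readings;
# the window size and 10% threshold are illustrative assumptions.

from datetime import date
from statistics import mean
from typing import List, Tuple


def weekly_decline(history: List[Tuple[date, float]], threshold: float = 0.10) -> bool:
    """history holds (date, value) pairs, oldest first; value may be PEF or FEV1."""
    if len(history) < 14:
        return False  # not enough data to compare two full weeks
    recent = mean(value for _, value in history[-7:])
    previous = mean(value for _, value in history[-14:-7])
    return (previous - recent) / previous > threshold
```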
Digital processing device
[070] In some embodiments, the platforms, media, methods and applications described herein include a digital processing device, a processor, or use of the same. In further embodiments, the digital processing device includes one or more hardware central processing units (CPU) that carry out the device's functions. In still further embodiments, the digital processing device further comprises an operating system configured to perform executable instructions. In some embodiments, the digital processing device is optionally connected to a computer network. In further embodiments, the digital processing device is optionally connected to the Internet such that it accesses the World Wide Web. In still further embodiments, the digital processing device is optionally connected to a cloud computing infrastructure. In other embodiments, the digital processing device is optionally connected to an intranet. In other embodiments, the digital processing device is optionally connected to a data storage device.
[071] In accordance with the description herein, suitable digital processing devices include, by way of non-limiting examples, server computers, desktop computers, laptop computers, notebook computers, sub-notebook computers, netbook computers, netpad computers, set-top computers, handheld computers, Internet appliances, mobile smartphones, tablet computers, personal digital assistants, video game consoles, and vehicles. Those of skill in the art will recognize that many smartphones are suitable for use in the system described herein. Those of skill in the art will also recognize that select televisions, video players, and digital music players with optional computer network connectivity are suitable for use in the system described herein. Suitable tablet computers include those with booklet, slate, and convertible configurations, known to those of skill in the art.
[072] In some embodiments, the digital processing device includes an operating system configured to perform executable instructions. The operating system is, for example, software, including programs and data, which manages the device's hardware and provides services for execution of applications. Those of skill in the art will recognize that suitable server operating systems include, by way of non-limiting examples, FreeBSD, OpenBSD, NetBSD®, Linux, Apple® Mac OS X Server®, Oracle® Solaris®, Windows Server®, and Novell® NetWare®. Those of skill in the art will recognize that suitable personal computer operating systems include, by way of non-limiting examples, Microsoft® Windows®, Apple® Mac OS X®, UNIX®, and UNIX-like operating systems such as GNU/Linux®. In some embodiments, the operating system is provided by cloud computing. Those of skill in the art will also recognize that suitable mobile smart phone operating systems include, by way of non-limiting examples, Nokia® Symbian® OS, Apple® iOS®, Research In Motion® BlackBerry OS®, Google® Android®, Microsoft® Windows Phone® OS, Microsoft® Windows Mobile® OS, Linux®, and Palm® WebOS®.
[073] In some embodiments, the device includes a storage and/or memory device. The storage and/or memory device is one or more physical apparatuses used to store data or programs on a temporary or permanent basis. In some embodiments, the device is volatile memory and requires power to maintain stored information. In some embodiments, the device is non-volatile memory and retains stored information when the digital processing device is not powered. In further embodiments, the non-volatile memory comprises flash memory. In some embodiments, the non-volatile memory comprises dynamic random-access memory (DRAM). In some embodiments, the non-volatile memory comprises ferroelectric random access memory (FRAM). In some embodiments, the non-volatile memory comprises phase-change random access memory (PRAM). In other embodiments, the device is a storage device including, by way of non-limiting examples, CD-ROMs, DVDs, flash memory devices, magnetic disk drives, magnetic tape drives, optical disk drives, and cloud computing based storage. In further embodiments, the storage and/or memory device is a combination of devices such as those disclosed herein.
[074] In some embodiments, the digital processing device includes a display to send visual information to a user. In some embodiments, the display is a cathode ray tube (CRT). In some embodiments, the display is a liquid crystal display (LCD). In further embodiments, the display is a thin film transistor liquid crystal display (TFT-LCD). In some embodiments, the display is an organic light emitting diode (OLED) display. In various further embodiments, an OLED display is a passive-matrix OLED (PMOLED) or active-matrix OLED (AMOLED) display. In some embodiments, the display is a plasma display. In other embodiments, the display is a video projector. In still further embodiments, the display is a combination of devices such as those disclosed herein.
[075] In some embodiments, the digital processing device includes an input device to receive information from a user. In some embodiments, the input device is a keyboard. In some embodiments, the input device is a pointing device including, by way of non-limiting examples, a mouse, trackball, track pad, joystick, game controller, or stylus. In some embodiments, the input device is a touch screen or a multi-touch screen. In other embodiments, the input device is a microphone to capture voice or other sound input. In other embodiments, the input device is a video camera or other sensor to capture motion or visual input. In further embodiments, the input device is a Kinect, Leap Motion, or the like. In still further embodiments, the input device is a combination of devices such as those disclosed herein.
[076] Referring to Fig. 14, in a particular embodiment, an exemplary digital processing device 101 is programmed or otherwise configured to measure test-retest reliability using FRP, MAP or other suitable precision evaluation methods. The device 101 can regulate various aspects of test-retest precision measurement of the present disclosure, such as, for example, formulating test-retest reliability as an information retrieval problem, and ranking retest measurements by their distance to a subject's test measurement. As another example, it may assess a similarity between a test result and a retest result. In this embodiment, the digital processing device 101 includes a central processing unit (CPU, also "processor" and "computer processor" herein) 105, which can be a single core or multi core processor, or a plurality of processors for parallel processing. The digital processing device 101 also includes memory or memory location 110 (e.g., random-access memory, read-only memory, flash memory), electronic storage unit 115 (e.g., hard disk), communication interface 120 (e.g., network adapter) for communicating with one or more other systems, and peripheral devices 125, such as cache, other memory, data storage and/or electronic display adapters. The memory 110, storage unit 115, interface 120 and peripheral devices 125 are in communication with the CPU 105 through a communication bus (solid lines), such as a motherboard. The storage unit 115 can be a data storage unit (or data repository) for storing data. The digital processing device 101 can be operatively coupled to a computer network ("network") 130 with the aid of the communication interface 120. The network 130 can be the Internet, an internet and/or extranet, or an intranet and/or extranet that is in communication with the Internet. The network 130 in some cases is a telecommunication and/or data network. The network 130 can include one or more computer servers, which can enable distributed computing, such as cloud computing. The network 130, in some cases with the aid of the device 101, can implement a peer-to-peer network, which may enable devices coupled to the device 101 to behave as a client or a server. The digital processing device 101 can be operatively connected to one or more specialized medical devices (not shown) via the network 130. Such connection may enable data collection from the medical device; the data may include one or more test results, retest results, and other related test and subject information. The specialized medical device is configured to measure visionary feature(s) of one or more subjects.
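The ranking idea mentioned above can be pictured in a few lines: order all retest measurements by their distance to a subject's test measurement and note where the subject's own retest lands. The reciprocal-rank score shown is only an illustrative stand-in for the FRP and MAP metrics named in the paragraph, whose exact definitions are not reproduced here.

```python
# Orders retest measurements by distance to a test measurement; the
# reciprocal-rank score is an illustrative stand-in, not the FRP/MAP
# definitions referenced in the text.

from typing import Dict


def retest_rank(test_value: float, retests: Dict[str, float], subject_id: str) -> int:
    """Return the 1-based rank of the subject's own retest when all retests
    are ordered by absolute distance to the subject's test measurement."""
    ordered = sorted(retests, key=lambda sid: abs(retests[sid] - test_value))
    return ordered.index(subject_id) + 1


retests = {"subject_a": 3.1, "subject_b": 2.4, "subject_c": 3.6}
rank = retest_rank(3.0, retests, "subject_a")
print(rank, 1.0 / rank)  # -> 1 1.0 (subject_a's retest is closest to their test value)
```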
[077] Continuing to refer to Fig. 14, the CPU 105 can execute a sequence of machine-readable instructions, which can be embodied in a program or software. The instructions may be stored in a memory location, such as the memory 110. The instructions can be directed to the CPU 105, which can subsequently program or otherwise configure the CPU 105 to implement methods of the present disclosure. Examples of operations performed by the CPU 105 can include fetch, decode, execute, and write back. The CPU 105 can be part of a circuit, such as an integrated circuit. One or more other components of the device 101 can be included in the circuit. In some cases, the circuit is an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA).
[078] Continuing to refer to Fig. 14, the storage unit 115 can store files, such as drivers, libraries and saved programs. The storage unit 115 can store user data, e.g., user preferences and user programs. The digital processing device 101 in some cases can include one or more additional data storage units that are external, such as located on a remote server that is in communication through an intranet or the Internet.
[079] Continuing to refer to Fig. 14, the digital processing device 101 can communicate with one or more remote computer systems through the network 130. For instance, the device 101 can communicate with a remote computer system of a user. Examples of remote computer systems include personal computers (e.g., portable PC), slate or tablet PCs (e.g., Apple® iPad, Samsung® Galaxy Tab), telephones, Smart phones (e.g., Apple® iPhone, Android-enabled device, Blackberry®), or personal digital assistants.
[080] Methods as described herein can be implemented by way of machine (e.g., computer processor) executable code stored on an electronic storage location of the digital processing device 101, such as, for example, on the memory 110 or electronic storage unit 115. The machine executable or machine readable code can be provided in the form of software. During use, the code can be executed by the processor 105. In some cases, the code can be retrieved from the storage unit 115 and stored on the memory 110 for ready access by the processor 105. In some situations, the electronic storage unit 115 can be precluded, and machine-executable instructions are stored on memory 110.
Non-transitory computer readable storage medium
[081] In some embodiments, the platforms, media, methods and applications described herein include one or more non-transitory computer readable storage media encoded with a program including instructions executable by the operating system of an optionally networked digital processing device. In further embodiments, a computer readable storage medium is a tangible component of a digital processing device. In still further embodiments, a computer readable storage medium is optionally removable from a digital processing device. In some embodiments, a computer readable storage medium includes, by way of non-limiting examples, CD-ROMs, DVDs, flash memory devices, solid state memory, magnetic disk drives, magnetic tape drives, optical disk drives, cloud computing systems and services, and the like. In some cases, the program and instructions are permanently, substantially permanently, semi-permanently, or non-transitorily encoded on the media.
Computer program
[082] In some embodiments, the platforms, media, methods and applications described herein include at least one computer program, or use of the same. A computer program includes a sequence of instructions, executable in the digital processing device's CPU, written to perform a specified task. Computer readable instructions may be implemented as program modules, such as functions, objects, Application Programming Interfaces (APIs), data structures, and the like, that perform particular tasks or implement particular abstract data types. In light of the disclosure provided herein, those of skill in the art will recognize that a computer program may be written in various versions of various languages.
[083] The functionality of the computer readable instructions may be combined or distributed as desired in various environments. In some embodiments, a computer program comprises one sequence of instructions. In some embodiments, a computer program comprises a plurality of sequences of instructions. In some embodiments, a computer program is provided from one location. In other embodiments, a computer program is provided from a plurality of locations. In various embodiments, a computer program includes one or more software modules. In various embodiments, a computer program includes, in part or in whole, one or more web applications, one or more mobile applications, one or more standalone applications, one or more web browser plug-ins, extensions, add-ins, or add-ons, or combinations thereof.
Web application
[084] In some embodiments, a computer program includes a web application. In light of the disclosure provided herein, those of skill in the art will recognize that a web application, in various embodiments, utilizes one or more software frameworks and one or more database systems. In some embodiments, a web application is created upon a software framework such as Microsoft® .NET or Ruby on Rails (RoR). In some embodiments, a web application utilizes one or more database systems including, by way of non-limiting examples, relational, non-relational, object oriented, associative, and XML database systems. In further embodiments, suitable relational database systems include, by way of non-limiting examples, Microsoft® SQL Server, mySQL™, and Oracle®. Those of skill in the art will also recognize that a web application, in various embodiments, is written in one or more versions of one or more languages. A web application may be written in one or more markup languages, presentation definition languages, client-side scripting languages, server-side coding languages, database query languages, or combinations thereof. In some embodiments, a web application is written to some extent in a markup language such as Hypertext Markup Language (HTML), Extensible Hypertext Markup Language (XHTML), or Extensible Markup Language (XML). In some embodiments, a web application is written to some extent in a presentation definition language such as Cascading Style Sheets (CSS). In some embodiments, a web application is written to some extent in a client-side scripting language such as Asynchronous Javascript and XML (AJAX), Flash® Actionscript, Javascript, or Silverlight®. In some embodiments, a web application is written to some extent in a server-side coding language such as Active Server Pages (ASP), ColdFusion®, Perl, Java™, JavaServer Pages (JSP), Hypertext Preprocessor (PHP), Python™, Ruby, Tcl, Smalltalk, WebDNA®, or Groovy. In some embodiments, a web application is written to some extent in a database query language such as Structured Query Language (SQL). In some embodiments, a web application integrates enterprise server products such as IBM® Lotus Domino®. In some embodiments, a web application includes a media player element. In various further embodiments, a media player element utilizes one or more of many suitable multimedia technologies including, by way of non-limiting examples, Adobe® Flash®, HTML 5, Apple® QuickTime®, Microsoft® Silverlight®, Java™, and Unity®.
Mobile application
[085] In some embodiments, a computer program includes a mobile application provided to a mobile digital processing device. In some embodiments, the mobile application is provided to a mobile digital processing device at the time it is manufactured. In other embodiments, the mobile application is provided to a mobile digital processing device via the computer network described herein.
[086] In view of the disclosure provided herein, a mobile application is created by techniques known to those of skill in the art using hardware, languages, and development environments known to the art. Those of skill in the art will recognize that mobile applications are written in several languages. Suitable programming languages include, by way of non-limiting examples, C, C++, C#, Objective-C, Swift, Java™, Kotlin, Javascript, Pascal, Object Pascal, Python™, Ruby, VB.NET, WML, and XHTML/HTML with or without CSS, or combinations thereof.
[087] Suitable mobile application development environments are available from several sources. Commercially available development environments include, by way of non-limiting examples, AirplaySDK, alcheMo, Appcelerator®, Celsius, Bedrock, Flash Lite, .NET Compact Framework, Rhomobile, and WorkLight Mobile Platform. Other development environments are available without cost including, by way of non-limiting examples, Lazarus, MobiFlex, MoSync, and Phonegap. Also, mobile device manufacturers distribute software developer kits including, by way of non-limiting examples, iPhone and iPad (iOS) SDK, Android™ SDK, BlackBerry® SDK, BREW SDK, Palm® OS SDK, Symbian SDK, webOS SDK, and Windows® Mobile SDK.
[088] Those of skill in the art will recognize that several commercial forums are available for distribution of mobile applications including, by way of non-limiting examples, Apple® App Store, Android™ Market, BlackBerry® App World, App Store for Palm devices, App Catalog for webOS, Windows® Marketplace for Mobile, Ovi Store for Nokia® devices, Samsung® Apps, and Nintendo® DSi Shop.
[089] An authorized third party may integrate an SDK into their own application, which may enable the third party application to access data from the flow detection sensor and/or to access the server and processing algorithms for the display of results through the third party application.
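No interface for such an integration is described in this disclosure; the sketch below only illustrates the general shape a third-party integration could take. The class name, method names, and API key parameter are hypothetical placeholders, not an actual SDK.

```python
# Hypothetical third-party integration shape; FlowSensorSDK and its methods
# are invented placeholders and do not describe a real SDK.

from typing import List


class FlowSensorSDK:
    """Placeholder wrapper a vendor SDK might expose to third-party apps."""

    def __init__(self, api_key: str):
        self.api_key = api_key

    def read_flow_samples(self) -> List[float]:
        # Would stream raw data from the flow detection sensor.
        raise NotImplementedError

    def submit_for_processing(self, samples: List[float]) -> dict:
        # Would send samples to the server-side processing algorithms and
        # return the computed lung function result for display.
        raise NotImplementedError


def display_result_in_third_party_app(sdk: FlowSensorSDK) -> dict:
    samples = sdk.read_flow_samples()
    return sdk.submit_for_processing(samples)
```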
Standalone application
[090] In some embodiments, a computer program includes a standalone application, which is a program that is run as an independent computer process, not an add-on to an existing process, e.g., not a plug-in. Those of skill in the art will recognize that standalone applications are often compiled. A compiler is a computer program (or set of programs) that transforms source code written in a programming language into binary object code such as assembly language or machine code. Suitable compiled programming languages include, by way of non-limiting examples, C, C++, Objective-C, COBOL, Delphi, Eiffel, Java™, Lisp, Python™, Visual Basic, and VB.NET, or combinations thereof. Compilation is often performed, at least in part, to create an executable program. In some embodiments, a computer program includes one or more executable compiled applications.
Software modules
[091] In some embodiments, the platforms, media, methods and applications described herein include software, server, and/or database modules, or use of the same. In view of the disclosure provided herein, software modules are created by techniques known to those of skill in the art using machines, software, and languages known to the art. The software modules disclosed herein are implemented in a multitude of ways. In various embodiments, a software module comprises a file, a section of code, a programming object, a programming structure, or combinations thereof. In further various embodiments, a software module comprises a plurality of files, a plurality of sections of code, a plurality of programming objects, a plurality of programming structures, or combinations thereof. In various embodiments, the one or more software modules comprise, by way of non-limiting examples, a web application, a mobile application, and a standalone application. In some embodiments, software modules are in one computer program or application. In other embodiments, software modules are in more than one computer program or application. In some embodiments, software modules are hosted on one machine. In other embodiments, software modules are hosted on more than one machine. In further embodiments, software modules are hosted on cloud computing platforms. In some embodiments, software modules are hosted on one or more machines in one location. In other embodiments, software modules are hosted on one or more machines in more than one location.
Databases
[092] In some embodiments, the platforms, systems, media, and methods disclosed herein include one or more databases, or use of the same. The entries of the database(s) disclosed herein are searchable using the search module/engine as disclosed herein.
[093] In some cases, disclosed herein is a database of a plurality of seats, each of the seats including one or more parameters of a seat space. In further cases, one or more parameters of a seat space is a direct parameter or a derived/calculated parameter based on one or more parameters of the seat space. In other cases, each of the seats includes one or more parameters that are not related to direct or derived parameters of the seat space.
[094] In some cases, disclosed herein is a database of seats containing information extracted from sources external to the systems, methods, and media disclosed herein. The external sources may include commercial airline databases, webpages, websites, online cloud databases, or any other commercially available data sources. In some cases, the database disclosed herein includes at least one parameter of a seat space. In further cases, the database disclosed herein includes at least one parameter of a seat space that is not directly obtained from external commercial sources. In yet further cases, the database includes at least one parameter of a seat space that is derived/calculated based on data from external commercial sources. In alternative cases, the database includes at least one parameter of a seat space that is directly measured and entered using the systems, media and methods disclosed herein. For example, the database may include a leg space that is derived/calculated from commercially available information of the seat space. As another example, the database includes a price per unit usable space of the seat that is calculated based on the flight ticket price and the usable space available from external sources.
[095] The database disclosed herein, in some instances, includes a plurality of users, each of the users including one or more parameters of physical information of the user. In further cases, one or more parameters of the physical information is a direct physical parameter or a derived/calculated parameter based on the physical information of the user. In other cases, each of the users includes one or more parameters that are not related to direct or derived parameters of the physical information of the user.
[096] In view of the disclosure provided herein, those of skill in the art will recognize that many databases are suitable for storage and retrieval of barcode, route, parcel, user, or network information. In various embodiments, suitable databases include, by way of non-limiting examples, relational databases, non-relational databases, object oriented databases, object databases, entity-relationship model databases, associative databases, and XML databases. In some embodiments, a database is internet-based. In further embodiments, a database is web-based. In still further embodiments, a database is cloud computing-based. In other embodiments, a database is based on one or more local computer storage devices.
Web browser plug-in
[098] In some embodiments, the computer program includes a web browser plug-in. In computing, a plug-in is one or more software components that add specific functionality to a larger software application. Makers of software applications support plug-ins to enable third-party developers to create abilities which extend an application, to support easily adding new features, and to reduce the size of an application. When supported, plug-ins enable customizing the functionality of a software application. For example, plug-ins are commonly used in web browsers to play video, generate interactivity, scan for viruses, and display particular file types. Those of skill in the art will be familiar with several web browser plug-ins including, by way of non-limiting examples, Adobe® Flash® Player, Microsoft® Silverlight®, and Apple® QuickTime®. In some embodiments, the toolbar comprises one or more web browser extensions, add-ins, or add-ons. In some embodiments, the toolbar comprises one or more explorer bars, tool bands, or desk bands.
[099] In view of the disclosure provided herein, those of skill in the art will recognize that several plug-in frameworks are available that enable development of plug-ins in various programming languages, including, by way of non-limiting examples, C++, Delphi, Java™, PHP, Python™, and VB .NET, or combinations thereof.
[0100] Web browsers (also called Internet browsers) are software applications, designed for use with network-connected digital processing devices, for retrieving, presenting, and traversing information resources on the World Wide Web. Suitable web browsers include, by way of non-limiting examples, Microsoft® Internet Explorer®, Mozilla® Firefox®, Google® Chrome, Apple® Safari®, Opera Software® Opera®, and KDE Konqueror. In some embodiments, the web browser is a mobile web browser. Mobile web browsers (also called microbrowsers, mini-browsers, and wireless browsers) are designed for use on mobile digital processing devices including, by way of non-limiting examples, handheld computers, tablet computers, netbook computers, subnotebook computers, smartphones, music players, personal digital assistants (PDAs), and handheld video game systems. Suitable mobile web browsers include, by way of non-limiting examples, Google® Android® browser, RIM BlackBerry® Browser, Apple® Safari®, Palm® Blazer, Palm® WebOS® Browser, Mozilla® Firefox® for mobile, Microsoft® Internet Explorer® Mobile, Amazon® Kindle® Basic Web, Nokia® Browser, Opera Software® Opera® Mobile, and Sony® PSP™ browser.
[0101] All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.
[0102] While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.

Claims

1. A lung function maneuver motivation method for motivating quality operation of a lung flow detection device, the method comprising:
receiving, via a processor associated with a mobile device, an indication of the start of a lung function maneuver for measuring lung function of a user using a flow detection device;
receiving data at least during the maneuver, via the processor, from at least one flow detection sensor of the flow detection device;
initiating, via the processor, a visualization of the lung function maneuver;
displaying the visualization on a screen associated with the mobile device;
adjusting, via the processor and based on the data from the at least one flow detection sensor, the visualization displayed on the screen;
determining, via the processor, an end of the lung function maneuver;
aggregating, via the processor, data received from the at least one flow detection sensor between the indication of the start of the lung function maneuver and the end of the lung function maneuver;
performing, via the processor, at least one pre-processing error detection on the aggregated data; and
displaying, via the processor, a result of the lung function maneuver.
2. The method of claim 1, further comprising performing, via the processor, at least one pre-test error detection test prior to or at the start of the lung function maneuver.
3. The method of claim 1, further comprising performing, via the processor, at least one error detection test during the lung function maneuver.
4. The method of claim 3, wherein the at least one error detection test during the lung function maneuver comprises comparing data from the at least one flow detection sensor to a threshold value in real time.
5. The method of claim 3, wherein the at least one error detection test during the lung function maneuver comprises determining power or input data and comparing the power or input data to a threshold in real time.
6. The method of claim 3, further comprising: calculating a scaled force vector based on data from the at least one flow detection sensor; and comparing the scaled force vector to a threshold.
7. The method of claim 6, wherein the scaled force vector is compared to the threshold in real time.
8. The method of claim 6, wherein the scaled force vector is calculated based on the data from the at least one flow detection sensor and at least one of randomized game variables and interactive game variables.
9. The method of claim 1, further comprising: sending, via the processor, the aggregated data to at least one server; and receiving, from the server and via the processor, the result of the lung function maneuver.
10. The method of claim 9, wherein the pre-test error detection comprises collecting data comprising at least one of: data from the at least one flow detection sensor; at least one of microphone power data and input data from the mobile device; environmental data; and video data from a camera associated with the mobile device.
11. The method of claim 10, wherein the environmental data comprises at least one of ambient pressure, ambient temperature, and an ambient volume.
12. The method of claim 2, wherein the pre-test error detection further comprises: determining environmental data comprising at least one of ambient pressure, ambient temperature, and ambient volume; and scaling parameters for measurements from the at least one flow detection sensor based on at least one of the ambient pressure, the ambient temperature, and the ambient volume.
13. The method of claim 1, wherein the flow detection device comprises an oscillation chamber and the at least one flow detection sensor comprises a microphone.
14. The method of claim 13, wherein the data from the microphone comprises amplitude data.
15. The method of claim 14, wherein the aggregated data is an audio file.
16. The method of claim 1, wherein the visualization comprises a display of at least one moving object, and wherein the adjustment of the visualization comprises changing at least one of a speed and a direction of the at least one moving object.
17. The method of claim 1, wherein the flow detection sensor is at least one of: an acoustic sensor, a microphone, an oscillation rate sensor, a pressure sensor, a pressure transducer, an ultrasound sensor, a rotary wheel sensor, a piezoelectric sensor, a cantilever beam sensor, a thermal sensor, and a hot wire sensor.
18. The method of claim 1, further comprising displaying a countdown via the screen to tell a user when to initiate the lung function maneuver.
19. The method of claim 1, wherein the visualization is configured to provide motivation to a user to correctly perform the lung function maneuver.
20. The method of claim 1, wherein the at least one pre-processing error detection on the aggregated data comprises a pre-processing error detection machine learning algorithm.
21. A lung function maneuver motivation method for motivating quality operation of a lung flow detection device, the method comprising: receiving, at a server, the aggregated data of claim 1; performing, at the server, at least one post-processing error detection based on the aggregated data, the post-processing error detection comprising an analysis of at least one of: lung function results, data from the at least one flow detection sensor, fast Fourier transform heat map, data from prior lung function maneuvers, and results from prior lung function maneuvers; and determining a result of the lung function maneuver.
22. The method of claim 21, further comprising sending the determined result of the lung function maneuver to the mobile device.
23. A lung function maneuver motivation system for motivating quality operation by a user of a lung flow detection device, the system comprising:
a lung flow detection device comprising at least one flow detection sensor and configured to produce lung function maneuver data;
a mobile device comprising: a processor, a memory, and a display, and
an application program operating on the mobile device,
wherein: the lung flow detection device connects with or otherwise sends signals to the mobile device; the application program comprises a plurality of computer instructions configured to cause the processor of the mobile device to: collect lung function maneuver data from the at least one flow detection sensor, generate and adjust a visualization based on the lung function maneuver data, and display a result of the lung function maneuver.
24. The system of claim 23, wherein the flow detection device further comprises an oscillation chamber, and wherein the at least one flow detection sensor comprises a microphone.
25. The system of claim 23, wherein the visualization comprises a display of at least one moving object, and wherein the adjustment of the visualization comprises changing at least one of a speed and a direction of the at least one moving object.
26. The system of claim 23, wherein the mobile device determines a scaled force vector based on the lung function maneuver data, wherein the scaled force vector determines the adjustment of the visualization.
27. The system of claim 23, wherein: the mobile device further comprises at least one sensor, the mobile device sets at least one parameter based on a reading from the at least one sensor, and data from the at least one flow detection sensor is interpreted based on the at least one parameter.
28. The system of claim 23, wherein the mobile device aggregates the lung function maneuver data.
29. The system of claim 28, wherein the mobile device verifies that the aggregated lung function maneuver data meets at least one threshold.
30. The system of claim 29, further comprising a server, wherein when the threshold is met, the mobile device uploads the aggregated lung function maneuver data to the server.
31. The system of claim 30, wherein the server performs at least one post-processing error detection based on the aggregated lung function maneuver data, the post-processing error detection comprising an analysis of at least one of: lung function results, data from the at least one flow detection sensor, fast Fourier transform heat map, data from prior lung function maneuvers, and results from prior lung function maneuvers; and wherein the server determines the result of the lung function maneuver.
32. The system of claim 30, wherein the server compares the aggregated lung function maneuver data to a threshold range, wherein the threshold range is determined based on at least one of generic or patient specific data.
33. The system of claim 30, wherein the server inputs the aggregated lung function maneuver data into a machine learning algorithm configured to identify errors during the post-processing error detection.
PCT/US2017/059783 2016-11-02 2017-11-02 Apparatuses, methods, and systems for motivating quality home-based spirometry maneuvers and automated evaluation and coaching WO2018085583A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662416657P 2016-11-02 2016-11-02
US62/416,657 2016-11-02

Publications (1)

Publication Number Publication Date
WO2018085583A1 true WO2018085583A1 (en) 2018-05-11

Family

ID=62076811

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/059783 WO2018085583A1 (en) 2016-11-02 2017-11-02 Apparatuses, methods, and systems for motivating quality home-based spirometry maneuvers and automated evaluation and coaching

Country Status (1)

Country Link
WO (1) WO2018085583A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190015081A1 (en) * 2017-07-17 2019-01-17 Spirosure, Inc. Apparatus and Method for Offline Collection of Breath Samples for Nitric Oxide Measurement
US20210045657A1 (en) * 2018-03-02 2021-02-18 Singapore Health Services Pte Ltd Device and method for measuring respiratory air flow

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150283341A1 (en) * 2014-04-07 2015-10-08 Boehringer Ingelheim International Gmbh Method, electronic device, inhalation training system and information storage medium for practicing and/or controlling an inhalation process of a patient
WO2016079336A1 (en) * 2014-11-20 2016-05-26 Digidoc Technologies As Measurement of respiratory function
WO2016154139A1 (en) * 2015-03-20 2016-09-29 University Of Washington Sound-based spirometric devices, systems, and methods using audio data transmitted over a voice communication channel

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 17868162; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 17868162; Country of ref document: EP; Kind code of ref document: A1)