WO2021040428A1 - Method and system for identifying an activity of a user - Google Patents

Method and system for identifying an activity of a user

Info

Publication number
WO2021040428A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
activity
sensor
candidate
sensor data
Prior art date
Application number
PCT/KR2020/011478
Other languages
French (fr)
Inventor
Vijayanand KUMAR
Mayank MEGHAWAT
Bharat GUPTA
Dinesh BANSAL
Ankit Verma
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. filed Critical Samsung Electronics Co., Ltd.
Publication of WO2021040428A1 publication Critical patent/WO2021040428A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1112Global tracking of patients, e.g. by using GPS
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1118Determining activity level
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1123Discriminating type of movement, e.g. walking or running
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/6804Garments; Clothes
    • A61B5/6807Footwear
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/681Wristwatch-type devices
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7253Details of waveform analysis characterised by using transforms
    • A61B5/7257Details of waveform analysis characterised by using transforms using Fourier transforms
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0252Load cells
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/04Arrangements of multiple sensors of the same type
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/06Arrangements of multiple sensors of different types
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813Specially adapted to be attached to a specific body part
    • A61B5/6814Head
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813Specially adapted to be attached to a specific body part
    • A61B5/6824Arm or wrist
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813Specially adapted to be attached to a specific body part
    • A61B5/6828Leg
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813Specially adapted to be attached to a specific body part
    • A61B5/6829Foot or ankle

Definitions

  • the present disclosure relates to signal processing, machine learning, statistical analysis, duty cycles of signals, sensor configuration and smart wearables, and more specifically to a smart wearable device for identifying an activity of a user.
  • the activity attributes are, for example but not limited to, step counts, cycling distance and the like. Accelerometer, gyroscope and pressure sensors used in conventional smart wearable devices increase the cost of the smart wearable device significantly.
  • the conventional art attempts to improve the accuracy of activity attributes through sensor configurations in shoes used as smart wearable devices.
  • Multiple proximity sensors and magnetically coupled devices are used for computing step counts.
  • Pressure sensors in shoes are used to determine the gait cycle.
  • Smart insoles are used for diabetic foot treatment.
  • GPS-enabled shoes that help in tracking pathways are also used.
  • Prior art also deals with fall-detection shoes. Sweat-analysis shoes are also available in the art. Prior art also deals with determining heat energy generated from shoes and using it for self-charging, etc.
  • the principal object of the embodiments herein is to identify an activity of a user.
  • Another object of the embodiments herein is to obtain sensor data from a plurality of sensors of the electronic device operating in a synchronous mode, wherein the synchronous mode configures each of the plurality of sensors in a first configuration.
  • Another object of the embodiments herein is to determine an activity class based on the sensor data obtained from each of the sensors.
  • Another object of the embodiments herein is to configure the electronic device in an asynchronous mode based on the activity class.
  • Another object of the embodiments herein is to identify at least one activity, from the plurality of candidate activities, currently performed by the user based on the sensor data obtained from the at least one candidate sensor.
  • Another object of the embodiments herein is to compare the sensor data from the plurality of sensors with predefined sensor data present in a predefined sensor database.
  • Another object of the embodiments herein is to determine a change in the activity class based on the obtained sensor data from the at least one candidate sensor by comparing the sensor data from the plurality of sensors with the predefined sensor data.
  • Another object of the embodiments herein is to reconfigure the electronic device in the synchronous mode in response to determining that there is a change in the activity class.
  • Another object of the embodiments herein is to determine a new activity class based on the sensor data obtained from each of the sensors.
  • Another object of the embodiments herein is to configure the electronic device in the asynchronous mode based on the new activity class.
  • Another object of the embodiments herein is to determine a duty cycle with respect to a predetermined time based on the sensor data from at least one candidate sensor.
  • Another object of the embodiments herein is to sample the duty cycle into a set of discrete signals based on a sampling frequency.
  • Another object of the embodiments herein is to identify the at least one activity, from the plurality of candidate activities based on the set of discrete signals.
  • Another object of the embodiments herein is to determine whether the discrete signal is a non-overlapping signal.
  • Another object of the embodiments herein is to map the sampling frequency with a predefined frequency in the predefined sensor database.
  • Another object of the embodiments herein is to sample the duty cycle into a set of discrete signals based on a candidate sampling frequency until the discrete signal is non-overlapping.
  • Another object of the embodiments herein is to identify the at least one activity from the plurality of candidate activities based on at least one of the sampling frequency and the candidate sampling frequency.
  • Another object of the embodiments herein is to determine a plurality of attributes associated with the predefined frequency from the predefined sensor database for the at least one activity.
  • Embodiments herein disclose a method to identify an activity of a user.
  • the method includes obtaining, by an electronic device, sensor data from a plurality of sensors of the electronic device operating in a synchronous mode, wherein the synchronous mode configures each of the plurality of sensors in a first configuration.
  • the method also includes determining, by the electronic device, an activity class based on the sensor data obtained from each of the sensors, wherein the activity class indicates a plurality of candidate activities associated with the user and configuring the electronic device in an asynchronous mode based on the activity class.
  • the asynchronous mode configures at least one candidate sensor from the plurality of sensors in a second configuration different than the first configuration.
  • the method further includes obtaining sensor data from the at least one candidate sensor and identifying at least one activity, from the plurality of candidate activities, currently performed by the user based on the sensor data obtained from the at least one candidate sensor.
  • determining the activity class based on the sensor data obtained from each of the sensors comprises comparing the sensor data from the plurality of sensors with predefined sensor data present in a predefined sensor database and determining the activity class of the sensor data based on the comparison.
  • identifying the at least one activity, from the plurality of candidate activities, currently performed by the user comprises determining a change in the activity class based on the obtained sensor data from the at least one candidate sensor by comparing the sensor data from the plurality of sensors with the predefined sensor data.
  • the identifying further comprises reconfiguring the electronic device in the synchronous mode in response to determining that there is a change in the activity class and determining a new activity class based on the sensor data obtained from each of the sensors.
  • the identifying further comprises configuring the electronic device in the asynchronous mode based on the new activity class.
  • the asynchronous mode configures at least one candidate sensor from the plurality of sensors in a third configuration different than the first configuration and second configuration.
  • identifying the at least one activity, from the plurality of candidate activities, currently performed by the user further comprises determining a duty cycle with respect to a predetermined time based on the sensor data from at least one candidate sensor.
  • the identifying further comprises sampling the duty cycle into a set of discrete signals based on a sampling frequency and identifying the at least one activity, from the plurality of candidate activities based on the set of discrete signals.
  • identifying the at least one activity, from the plurality of candidate activities based on the set of discrete signals comprises determining whether the discrete signal is a non-overlapping signal.
  • the identifying also includes mapping the sampling frequency with a predefined frequency in the predefined sensor database, in response to determining that the discrete signal is the non-overlapping signal.
  • the identifying further includes sampling the duty cycle into a set of discrete signals based on a candidate sampling frequency until the discrete signal is non-overlapping, if the discrete signal is overlapping.
  • the method further includes identifying the at least one activity from the plurality of candidate activities based on at least one of the sampling frequency and the candidate sampling frequency and determining a plurality of attributes associated with the predefined frequency from the predefined sensor database for the at least one activity.
  • the duty cycle is determined using a baseband signal having a base frequency.
  • the plurality of sensors comprises short-range sensors.
  • the sensor data obtained from the at least one candidate sensor is at least one of non-continuous, continuous, periodic and non-periodic in nature.
  • the plurality of attributes comprises acceleration, speed, angle, pressure, step gap, step distance, stair height, revolution distance, driving behavior, type of vehicle and terrain type.
  • the present invention describes a method to identify an activity of a user.
  • the method comprises obtaining sensor data from a plurality of sensors associated with the electronic device.
  • the method also includes determining a duty cycle with respect to a predetermined time based on the sensor data from the at least one candidate sensor.
  • the method also includes sampling the duty cycle into a set of discrete signals based on a sampling frequency and identifying at least one activity, from a plurality of activities associated with the user, based on the set of discrete signals.
  • identifying the at least one activity, from the plurality of activities based on the set of discrete signals comprises determining whether the discrete signal is a non-overlapping signal.
  • the identifying further includes mapping the sampling frequency with a predefined frequency in the predefined sensor database, in response to determining that the discrete signal is the non-overlapping signal.
  • the method also includes sampling the duty cycle into a set of discrete signals based on a candidate sampling frequency until the discrete signal is non-overlapping, if the discrete signal is overlapping.
  • the method further includes identifying the at least one activity from the plurality of candidate activities based on at least one of the sampling frequency and the candidate sampling frequency and determining a plurality of attributes associated with the predefined frequency from the predefined sensor database for the at least one activity.
  • obtaining sensor data from the plurality of sensors further comprises obtaining sensor data from a plurality of sensors of the electronic device operating in a synchronous mode, wherein the synchronous mode configures each of the plurality of sensors in a first configuration.
  • the obtaining further includes determining an activity class based on the sensor data obtained from each of the sensors.
  • the obtaining also includes configuring the electronic device in an asynchronous mode based on the activity class, wherein the asynchronous mode configures at least one candidate sensor from the plurality of sensors in a second configuration different than the first configuration and obtaining sensor data from the at least one candidate sensor.
  • obtaining sensor data from the at least one candidate sensor comprises determining a change in the activity class based on the obtained sensor data from the at least one candidate sensor.
  • the obtaining also includes reconfiguring the electronic device in the synchronous mode in response to determining that there is a change in the activity class.
  • the obtaining also includes determining, by the electronic device, a new activity class based on the sensor data obtained from each of the sensors and configuring the electronic device in the asynchronous mode based on the new activity class, wherein the asynchronous mode configures at least one candidate sensor from the plurality of sensors in a third configuration different than the first configuration and second configuration.
  • the present invention provides an electronic device for identifying an activity of a user.
  • the electronic device comprises a memory, a processor, an attribute determiner and a communicator.
  • the processor is configured to obtain sensor data from a plurality of sensors of the electronic device operating in a synchronous mode, wherein the synchronous mode configures each of the plurality of sensors in a first configuration.
  • the processor is further configured to determine an activity class based on the sensor data obtained from each of the sensors, wherein the activity class indicates a plurality of candidate activities associated with the user.
  • the processor is also configured to configure the electronic device in an asynchronous mode based on the activity class, wherein the asynchronous mode configures at least one candidate sensor from the plurality of sensors in a second configuration different than the first configuration.
  • the processor is also configured to obtain sensor data from the at least one candidate sensor and identify at least one activity, from the plurality of candidate activities, currently performed by the user based on the sensor data obtained from the at least one candidate sensor.
  • the present invention provides an electronic device for identifying an activity of a user.
  • the electronic device comprises a memory, a processor, an attribute determiner and a communicator.
  • the processor is configured to obtain sensor data from a plurality of sensors associated with the electronic device and determine a duty cycle with respect to a predetermined time based on the sensor data from the at least one candidate sensor.
  • the processor is also configured to sample the duty cycle into a set of discrete signals based on a sampling frequency; and identify at least one activity, from a plurality of activities associated with the user, based on the set of discrete signals.
  • Fig. 1 is a block diagram of an electronic device 100 for identifying an activity of a user of the electronic device 100, according to the embodiments as disclosed herein;
  • Fig. 2 is a flow diagram of the proposed method for identifying the activity of the user and determining the attributes associated with the identified activity, according to the embodiments as disclosed herein;
  • Fig. 3 is a schematic diagram, illustrating the synchronous and asynchronous operating mode for the sensors, according to the embodiments as disclosed herein;
  • Fig. 4 is a schematic diagram illustrating how the plurality of sensors work when in the asynchronous mode, according to the embodiments as disclosed herein;
  • Fig. 5 is a flow diagram illustrating a flow for determining the attributes associated with the activity of the user after obtaining the candidate sensor data, according to the embodiments as disclosed herein;
  • Fig. 6 is a schematic diagram, illustrating the flow 500 with an exemplary activity performed by the user, according to the embodiments as disclosed herein;
  • Fig. 7 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is cycling, according to an embodiment as disclosed herein;
  • Fig. 8 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is swimming, according to an embodiment as disclosed herein;
  • Fig. 9 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is neck rotation, according to an embodiment as disclosed herein;
  • Fig. 10 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is driving, according to an embodiment as disclosed herein;
  • Fig. 11 is a schematic diagram, illustrating an example embodiment, wherein the activity is being performed by a group of users, and the activity is dancing;
  • Fig. 12 is a schematic diagram illustrating an example embodiment, wherein the activity being performed by the user is climbing steps; and
  • Fig. 13 is a table illustrating the grouping of similar activities.
  • circuits may, for example, be embodied in one or more semiconductor chips, or on substrate supports such as printed circuit boards and the like.
  • circuits constituting a block may be implemented by dedicated hardware, or by a processor (e.g., one or more programmed microprocessors and associated circuitry), or by a combination of dedicated hardware to perform some functions of the block and a processor to perform other functions of the block.
  • Each block of the embodiments may be physically separated into two or more interacting and discrete blocks without departing from the scope of the disclosure.
  • the blocks of the embodiments may be physically combined into more complex blocks without departing from the scope of the disclosure.
  • the embodiments herein disclose a method to identify an activity of a user.
  • the method includes obtaining, by an electronic device, sensor data from a plurality of sensors of the electronic device operating in a synchronous mode, wherein the synchronous mode configures each of the plurality of sensors in a first configuration.
  • the method also includes determining, by the electronic device, an activity class based on the sensor data obtained from each of the sensors, wherein the activity class indicates a plurality of candidate activities associated with the user and configuring the electronic device in an asynchronous mode based on the activity class.
  • the asynchronous mode configures at least one candidate sensor from the plurality of sensors in a second configuration different than the first configuration.
  • the method further includes obtaining sensor data from the at least one candidate sensor and identifying at least one activity, from the plurality of candidate activities, currently performed by the user based on the sensor data obtained from the at least one candidate sensor.
  • Fig. 1 is a block diagram of an electronic device 100 for identifying an activity of a user of the electronic device 100, according to the embodiments as disclosed herein.
  • the electronic device 100 may be, for example, but not limited to a smart social robot, a smart watch, a cellular phone, a smart phone, a Personal Digital Assistant (PDA), a tablet computer, a head mounted smart device with display, a laptop computer, a music player, a video player, an Internet of things (IoT) device or the like.
  • the electronic device 100 includes a memory 110, a processor 120, attribute determiner 130 and a communicator 140.
  • the electronic device 100 may be a smart wearable device.
  • the smart wearable device may allow the user to track the activities performed by the user and may provide details about the attributes associated with the activities.
  • the electronic device 100 may comprise a plurality of sensors for sensing the activities performed by the user.
  • the plurality of sensors are short-range sensors.
  • the short-range sensors may be, for example but not limited to, Near Field Communication (NFC) sensors and proximity sensors.
  • the activities performed by the user may be, for example but not limited to, cycling, jumping, skipping, walking, running, swimming, dancing, jogging, paddling, skiing, standing, sitting and the like.
  • the attributes associated with the activities may be, for example but not limited to, step count, distance, calories burnt, speed, angle, pressure, voltage, current, flux and the like.
  • the attribute determiner 130 determines the various attributes associated with the activities performed by the user.
  • the attribute determiner 130 obtains data (sensor data) from the plurality of sensors operating in a synchronous mode.
  • the plurality of sensors are configured in a first configuration.
  • the first configuration specifies that each sensor of the plurality of sensors works as a transmitter as well as a receiver.
  • the attribute determiner 130 determines an activity class of the activity performed by the user based on the sensor data.
  • the activity class associated with the activity performed by the user indicates a plurality of candidate activities.
  • the activities performed by the user may be classified in class A, class B, class C, class D and the like.
  • the class may indicate a plurality of candidate activities.
  • class A may indicate activities such as walking, running and jogging.
  • Class B may indicate activities such as cycling and paddling.
  • Class C may indicate activities such as skiing and cross-country machine.
  • Class D may indicate activities such as stand, sitting and the like.
  • the above-mentioned activities are only examples, and the classes may include activities other than those mentioned.
  • the attribute determiner 130 configures the electronic device 100 in an asynchronous mode based on the activity class.
  • the asynchronous mode configures the plurality of sensors in a second configuration.
  • each of the sensors of the plurality of sensors may work either as a transmitter or a receiver.
  • all the sensors may be configured into the second configuration.
  • at least one sensor may be configured in the second configuration.
  • the sensors which are configured into the asynchronous mode may be termed candidate sensors.
  • after configuring the candidate sensors in the asynchronous mode, the attribute determiner 130 obtains candidate sensor data from the candidate sensors and identifies an activity of the user from the plurality of candidate activities of the activity class based on the candidate sensor data.
  • the attribute determiner 130 also determines a plurality of attributes for the identified activity of the user based on a predefined frequency.
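  • As a minimal illustrative sketch (not taken from the patent; the class names, signature ranges and data structures below are assumptions), the flow described above could look roughly as follows: compare the synchronous-mode readings with a predefined sensor database to obtain an activity class, then switch the candidate sensors into the asynchronous (single-role) configuration.

```python
# Illustrative sketch only; class names, signature ranges and the database
# layout are assumptions and not taken from the patent text.
PREDEFINED_SENSOR_DB = {
    "class A": {"activities": ["walking", "running", "jogging"],
                "proximity_mm": (20, 200), "nfc_mm": (50, 400)},
    "class B": {"activities": ["cycling", "paddling"],
                "proximity_mm": (5, 60), "nfc_mm": (150, 160)},
    "class D": {"activities": ["standing", "sitting"],
                "proximity_mm": (0, 5), "nfc_mm": (0, 5)},
}

def determine_activity_class(proximity_mm, nfc_mm):
    """Compare synchronous-mode readings with the predefined database."""
    for cls, entry in PREDEFINED_SENSOR_DB.items():
        p_lo, p_hi = entry["proximity_mm"]
        n_lo, n_hi = entry["nfc_mm"]
        if p_lo <= proximity_mm <= p_hi and n_lo <= nfc_mm <= n_hi:
            return cls, entry["activities"]
    return None, []

def configure_candidate_sensors(sensors, candidate_ids):
    """Second configuration: a candidate sensor acts either as a sender or
    as a receiver, instead of both (the first configuration)."""
    for sensor in sensors:
        if sensor["id"] in candidate_ids:
            sensor["mode"] = "asynchronous"
    return sensors

cls, candidate_activities = determine_activity_class(proximity_mm=80, nfc_mm=120)
print(cls, candidate_activities)  # class A ['walking', 'running', 'jogging']
```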
  • the processor 120 is coupled to the attribute determiner 130, the memory 110 and the communicator 140.
  • the processor 120 is configured to execute instructions stored in the memory 110 and to perform various other processes.
  • the memory 110 stores the effect and condition.
  • the memory 110 also stores instructions to be executed by the processor 120.
  • the memory 110 may include non-volatile storage elements. Examples of such non-volatile storage elements may include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of erasable programmable memories (EPROM) or electrically erasable and programmable read only (EEPROM) memories.
  • the memory 110 may, in some examples, be considered a non-transitory storage medium.
  • the term “non-transitory” may indicate that the storage medium is not embodied in a carrier wave or a propagated signal. However, the term “non-transitory” should not be interpreted that the memory 110 is non-movable.
  • the memory 110 can be configured to store larger amounts of information.
  • a non-transitory storage medium may store data that can, over time, change (e.g., in Random Access Memory (RAM) or cache).
  • the communicator 140 is configured for communicating internally between internal hardware components and with external devices via one or more networks.
  • Fig. 1 shows a limited overview of the electronic device 100, but it is to be understood that other embodiments are not limited thereto. Further, the electronic device 100 may include any number of hardware or software components communicating with each other. By way of illustration, both an application running on the electronic device 100 and the device itself can be a component.
  • Fig. 2 is a flow diagram of the proposed method for identifying the activity of the user and determining the attributes associated with the identified activity.
  • the method starts with signal acquisition in the synchronous mode. Signal acquisition refers to measuring the sensor data for the plurality of sensors.
  • the electronic device 100 measures the data from the plurality of sensors of the electronic device 100.
  • the electronic device 100 compares the measured sensor data with a predefined sensor database (salient point database) values present in the electronic device 100 and determines the activity class based on the comparison.
  • the predefined sensor database may be present outside the electronic device 100.
  • the activity class may be, for example but not limited to, class A, class B, class C and class D.
  • the method iteratively compares the measured sensor data with a predefined sensor database for determining a change in the activity class.
  • the method includes configuring the plurality of sensors in the asynchronous mode.
  • the plurality of sensors may work either as a transmitter or a receiver. Configuring the plurality of sensors in the asynchronous mode increases the battery life of the plurality of sensors, thereby increasing the battery life of the electronic device 100.
  • the method includes obtaining by the electronic device 100, data from the plurality of sensors configured in asynchronous mode which may be termed as candidate sensor data.
  • the electronic device 100 determines a duty cycle with respect to a predetermined time based on the candidate sensor data from at least one candidate sensor. After determining the duty cycle, the electronic device 100 samples the duty cycle into a set of discrete signals based on a sampling frequency.
  • the method determines whether the discrete signal is an overlapping signal or a non-overlapping signal. If the signal is a non-overlapping signal, then the flow proceeds to 214. If the signal is an overlapping signal, then the flow proceeds to 212.
  • the method includes re-sampling, by the electronic device 100, the duty cycle into a set of discrete signals based on another sampling frequency and performing the sampling iteratively until the discrete signal is non-overlapping.
  • the method includes mapping the sampling frequency for which the discrete signal is non-overlapping with a predefined frequency in the predefined sensor database and determining the activity of the user based on the mapping.
  • the method includes determining, by the electronic device 100, the plurality of attributes associated with the determined user activity from the predefined sensor database.
  • the method helps in increasing the battery life of the electronic device 100 and also provides accurate output.
  • the output provided is the attributes associated with the user activity.
  • Fig. 3 is a schematic diagram, illustrating the synchronous and asynchronous operating mode for the sensors.
  • in Fig. 3, signal 1 illustrates the clock cycles, wherein the x-axis indicates time and the y-axis indicates the voltage.
  • signal 2 indicates the sensor signal wherein the plurality of sensors are in the synchronous mode and each sensor acts as a sender as well as a receiver. As seen in signal 2, the sensor signal is obtained for each clock cycle.
  • signal 3 indicates the sensor signal wherein the plurality of sensors are in the asynchronous mode and each sensor acts either as a sender or as a receiver.
  • the sensor signal is available only for alternate clock cycles, thereby saving battery life.
  • signal 4 shows a combined sensor signal for the asynchronous and synchronous modes. As seen in signal 4, for the first clock cycle the plurality of sensors are in the synchronous mode, for the second clock cycle the plurality of sensors are in the asynchronous mode, and for the third, fourth and fifth clock cycles the sensor signals are in the synchronous mode.
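  • As a rough illustration of this duty-cycling (a sketch with assumed cycle numbering and function names, not taken from the patent), the difference between the two modes can be expressed as follows:

```python
# Sketch of the Fig. 3 behaviour: in the synchronous mode a sensor signal
# is obtained on every clock cycle, while in the asynchronous mode it is
# obtained only on alternate clock cycles, which is what saves battery life.
def active_cycles(num_cycles, mode):
    if mode == "synchronous":
        return list(range(num_cycles))        # signal on every clock cycle
    return list(range(0, num_cycles, 2))      # signal on alternate cycles only

print(active_cycles(6, "synchronous"))   # [0, 1, 2, 3, 4, 5]
print(active_cycles(6, "asynchronous"))  # [0, 2, 4]
```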
  • Fig. 4 is a schematic diagram illustrating how the plurality of sensors work when in the asynchronous mode.
  • Fig. 4 illustrates the proximity sensor and the NFC sensors present in the electronic device 100 (wearable device/smart shoes) worn by the user while performing a walking activity.
  • S1 indicates a first station of the NFC sensor and S2 indicates a second station of the NFC sensor.
  • a left leg of the user with the smart shoes is indicated by station S1 and a right leg of the user with smart shoes is indicated by station S2 of the NFC sensors.
  • S1 is at ground and at state II, S2 is at ground.
  • While walking the user lifts the right leg and hence at state III the S2 is in mid-air and has some vertical distance from the ground.
  • the proximity sensor measures some vertical distance and S2 acts as a sender and sends the measured data to either the electronic device 100 or to S1.
  • the right leg is not completely at ground and thus S2 still acts as a sender.
  • from state I to state IV, S1 acts as a receiver, as S1 is not measuring any values and thus does not need to send any data.
  • Table 1 shows the configuration of S1 and S2 and the proximity sensors.
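  • A minimal sketch of this role assignment (the threshold value and function names are assumptions, not from the patent): the station whose proximity sensor reports a non-zero vertical distance acts as a sender, while the grounded station acts as a receiver.

```python
# Sketch with an assumed threshold (not from the patent text).
GROUND_THRESHOLD_MM = 1.0  # assumed tolerance for "at ground"

def assign_roles(prox_s1_mm, prox_s2_mm):
    """The lifted foot (non-zero proximity reading) becomes the sender;
    the grounded foot becomes the receiver."""
    return {
        "S1": "sender" if prox_s1_mm > GROUND_THRESHOLD_MM else "receiver",
        "S2": "sender" if prox_s2_mm > GROUND_THRESHOLD_MM else "receiver",
    }

# State III of Fig. 4: right leg (S2) in mid-air, left leg (S1) on the ground.
print(assign_roles(prox_s1_mm=0.0, prox_s2_mm=35.0))  # {'S1': 'receiver', 'S2': 'sender'}
```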
  • Fig. 5 is a flow diagram illustrating a flow for determining the attributes associated with the activity of the user after obtaining the candidate sensor data.
  • the electronic device 100, after obtaining the candidate sensor data, determines a duty cycle having a baseband frequency (fm) using the obtained candidate sensor data.
  • the method includes selecting a discrete signal with a sampling frequency (fs).
  • the method includes applying a sampling process on the duty cycle with frequency fm and the discrete signal with frequency fs for obtaining a sampled signal.
  • the method includes applying a Fourier transform on the sampled signal and obtaining an output signal.
  • the flow determines whether the signal obtained at 508 is overlapping or non-overlapping.
  • the flow 500 proceeds to 512 if the signal is a non-overlapping signal, and to 504 if the signal is an overlapping signal.
  • the sampling frequency for which the discrete signal obtained at 508 is non-overlapping is compared with a predefined frequency in the frequency database and the activity of the user is determined based on the comparison.
  • the electronic device 100 determines, from the predefined database, the attributes associated with the sampling frequency for which the discrete signal obtained at 508 is non-overlapping.
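  • A minimal sketch of this loop (the overlap test, the frequency step and the database values below are assumptions rather than the patent's exact procedure): sample the duty cycle m(t) at a candidate sampling frequency fs, apply a Fourier transform, raise fs until the spectrum is treated as non-overlapping, and map the resulting fs to a predefined frequency to identify the activity.

```python
import numpy as np

def is_non_overlapping(m_t, t, fs, edge_fraction=0.9, energy_ratio=0.01):
    """Sample m(t) at fs, take an FFT, and treat the result as overlapping
    (aliased) if a noticeable share of spectral energy sits near the
    Nyquist edge. Thresholds here are assumed values."""
    sample_times = np.arange(t[0], t[-1], 1.0 / fs)
    s_t = np.interp(sample_times, t, m_t)            # discrete signal s(t)
    spectrum = np.abs(np.fft.rfft(s_t))              # transformed signal
    edge = spectrum[int(edge_fraction * spectrum.size):]
    return edge.sum() / (spectrum.sum() + 1e-12) < energy_ratio

def identify_activity(m_t, t, predefined_db, fs=2.0, fs_step=2.0, max_fs=200.0):
    """Increase the candidate sampling frequency until non-overlapping,
    then pick the activity whose predefined frequency is closest to fs."""
    while fs < max_fs and not is_non_overlapping(m_t, t, fs):
        fs += fs_step
    activity = min(predefined_db, key=lambda a: abs(predefined_db[a] - fs))
    return activity, fs

# Hypothetical predefined sensor database: activity -> frequency (Hz).
db = {"walking": 4.0, "running": 8.0, "cycling": 12.0}
t = np.linspace(0.0, 4.0, 2000)
m = (np.sign(np.sin(2 * np.pi * 1.5 * t)) + 1) / 2   # toy duty cycle, fm = 1.5 Hz
print(identify_activity(m, t, db))
```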
  • Fig. 6 is a schematic diagram, illustrating the flow 500 with an exemplary activity performed by the user.
  • Fig. 6.1 shows the duty cycle m(t).
  • the signal m(t) may be non-continuous, continuous, periodic or non-periodic in nature.
  • Fig. 6.2 indicates the signal c(t) with sampling frequency fs.
  • Fig. 6.3 indicates a sampling signal s(t) after combining the signals m(t) and c(t).
  • Fig. 6.4 indicates the sampled discrete signal s(t) obtained after performing the sampling.
  • Fig. 6.5 indicates a non-overlapping signal w(t) obtained after performing a Fourier transform on the signal s(t).
  • the electronic device 100 determines the activity of the user and the attributes associated with the activity.
  • the sampled cycle s(t) obtained from a first user may be compared with the sampled cycle obtained from a second user, and the activity of the second user may be determined just by comparing the sampled cycles.
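  • One hedged way to realise such a comparison (the correlation metric below is an assumption; the patent only states that sampled cycles can be compared):

```python
import numpy as np

# Sketch only: normalized correlation between two sampled duty cycles,
# passed in as numpy arrays.
def cycle_similarity(s_user1, s_user2):
    a = (s_user1 - s_user1.mean()) / (s_user1.std() + 1e-12)
    b = (s_user2 - s_user2.mean()) / (s_user2.std() + 1e-12)
    n = min(a.size, b.size)
    return float(np.dot(a[:n], b[:n]) / n)

# A score close to 1.0 would suggest that both users perform the same activity.
```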
  • Fig. 7 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is cycling. While cycling, the distance between the two paddles is going to remain constant and hence the NFC sensor input will be constant. However, the distance between the paddle and the ground may vary and is captured using the proximity sensor.
  • Fig. 7 shows the proximity sensor waveform at S1 and S2, when the plurality of sensors are configured asynchronously.
  • the input to the electronic device 100 is the output of the NFC and the proximity sensor. Based on the input values the electronic device 100 determines the activity and the plurality of attributes associated with the activity such as cycling distance, cycling speed, duration and the like.
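  • A minimal sketch of such a cycling check (the tolerances below are assumed values): the NFC reading stays roughly constant while the paddle-to-ground proximity reading keeps varying.

```python
import numpy as np

# Sketch with assumed tolerances (not from the patent text).
def looks_like_cycling(nfc_mm, proximity_mm, nfc_tol=5.0, prox_swing=20.0):
    """NFC reading roughly constant + proximity reading oscillating."""
    nfc_constant = (np.max(nfc_mm) - np.min(nfc_mm)) < nfc_tol
    prox_varies = (np.max(proximity_mm) - np.min(proximity_mm)) > prox_swing
    return bool(nfc_constant and prox_varies)
```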
  • Fig. 8 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is swimming.
  • the user is wearing the smart device (electronic device 100) on both the hands and legs.
  • the wearable device includes proximity and NFC sensors.
  • the proximity sensor and the NFC sensors are used in combination for determining the activity of the user and the attributes associated with the activity.
  • the proximity sensor and the NFC sensors assist the electronic device 100 in determining the activity of the user as swimming and also in determining the attributes associated with swimming, such as swimming distance, swimming speed, hand-leg synchronization, and the like.
  • State | Hand (Left) | Hand (Right) | Proximity sensor distance A [mm] | Proximity sensor distance B [mm]
    State I | in-water | in-water | 0 | 0
    State II, State III | in-water | mid-air | 0 | Some value
    State IV, State V | mid-air | in-water | Some value | 0
  • Fig. 9 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is neck rotation.
  • the user is wearing the smart device (electronic device 100) as a headphone.
  • the wearable device includes proximity and NFC sensors.
  • the proximity sensor and the NFC sensors are used in combination for determining the activity of the user and the attributes associated with the activity.
  • the magnetic field of the NFC sensor is used for determining the head rotation and the proximity sensor is used for determining the change in polarity.
  • the proximity sensor and the NFC sensors together assist the electronic device 100 in determining the activity of the user as neck rotation and also in determining the attributes associated with neck rotation, such as neck rotation speed, number of repetitions, duration, effective posture and the like.
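  • As a hedged sketch (the signed-reading representation and the two-changes-per-repetition rule are assumptions), the number of neck-rotation repetitions could be estimated from polarity changes in the NFC field readings:

```python
# Sketch only: a signed NFC field reading whose polarity flips as the head
# turns left/right is an assumed representation.
def count_neck_rotations(field_readings):
    """Two polarity changes are treated as one full repetition (assumption)."""
    sign_changes = sum(1 for a, b in zip(field_readings, field_readings[1:]) if a * b < 0)
    return sign_changes // 2

print(count_neck_rotations([5, 3, -4, -6, 2, 7, -3]))  # 3 polarity changes -> 1 repetition
```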
  • Fig. 10 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is driving.
  • the user is wearing the smart device (electronic device 100) in both hands as well as both legs denoted by A, B, C and D.
  • the wearable device includes proximity, NFC and NFC electromagnetic sensors.
  • the proximity sensor and the NFC sensors are used in combination for determining the activity of the user and the attributes associated with the activity.
  • the proximity sensor and the NFC sensors together assist the electronic device 100 in determining the activity of the user as driving and also in determining the attributes associated with driving, such as the pressure-applying pattern (brake & acceleration), frequency, intensity & duration of the user's driving activities, and the like.
  • Table 4 illustrates the configuration of the sensors.
  • Fig. 11 is a schematic diagram, illustrating an example embodiment, wherein the activity is being performed by a group of user, and the activity is dancing.
  • the group of dancers (users) may be wearing the smart device (electronic device 100) in hand and/or legs.
  • the electronic device 100 may determine whether the users are in synchronization, whether their postures are similar and the like.
  • user 1 of the user group is wearing devices A, B, C and D.
  • user 2 and user 3 are wearing devices A’ and A’’ respectively.
  • the proximity sensor and the NFC sensors together assist the electronic device 100 in determining the activity of the group of users as dancing and also in determining the attributes associated with dancing, such as synchronization between users, posture determination and the like.
  • Table 5 illustrates the configuration of the sensors.
  • the proposed method may be used in health care domain.
  • the proposed method may be used for determining the walking speed of the user and the progress of the user speed using the NFC and the proximity sensors.
  • the proposed method may also be used for monitoring a child's walking pattern using the sensors and may provide a detailed report indicating a nutrient deficiency found in the child and the like.
  • the proposed method may also be used for monitoring a pregnant woman for a smooth journey.
  • Fig. 12 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is climbing steps.
  • the user is wearing the smart device (electronic device 100) in both legs.
  • the wearable device includes proximity, NFC and NFC electromagnetic sensors.
  • the proximity sensor and the NFC sensors are used in combination for determining the activity of the user and the attributes associated with the activity.
  • the attributes associated with climbing steps are the number of steps and the distance covered by the user.
  • the height of the stair is determined using the values obtained from the proximity sensor.
  • the method determines the strength of the user.
  • when the method determines that the strength of the user is more than usual, the method detects that the user is climbing stairs.
  • Fig. 12.3 illustrates the values obtained through the NFC sensor using an NFC waveform. Further, consecutive proximity readings result in determining the stair height.
  • the proximity sensor and the NFC sensors together assist the electronic device 100 in determining the activity of the user as climbing stairs and also in determining the attributes associated with climbing stairs.
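  • A minimal sketch of the stair-height estimate (taking the largest consecutive difference is an assumption): consecutive proximity readings are differenced to approximate the stair height.

```python
# Sketch only; the choice of the maximum difference is assumed, not stated
# in the patent text.
def estimate_stair_height(proximity_readings_mm):
    diffs = [abs(b - a) for a, b in zip(proximity_readings_mm, proximity_readings_mm[1:])]
    return max(diffs) if diffs else 0.0

print(estimate_stair_height([0.0, 40.0, 180.0, 20.0]))  # 160.0 mm between two readings
```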
  • Fig. 13 is a table illustrating the grouping of similar activities. The similar activities are grouped into one class based on various inputs.
  • table 6 shows the smart devices (electronic device 100) to be worn by the user and the possible activities that may be detected using the electronic device 100.
  • Recoverable fragment of Table 6 (wearable device - possible activities): wrist-worn device - controller for IoT devices (TV, AC, etc.), wrist exercise; gloves - sign language; earpiece (IconX, LevelU) / head gear (glasses) - 1. virtual gaming controller, 2. text neck (neck bending problem), 3. neck exercise; walking stick - blind and elderly care.
  • table 7 indicates the plurality of attributes for various sampling frequencies and the activities associated with the sampling frequency.
  • sampling frequency may be replaced with sampling current, voltage, flux etc. for determining activity related attributes.
  • the sampling frequency for each sampling class is mapped with different attributes; however, it may be possible that the sampling frequency value is the same for different class groups, but its attribute mapping would be different for the classes.
  • for example, for a given fs, class A is mapped with step length, and for the same fs, class B may be mapped with circular distance.
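  • A minimal sketch of such class-dependent attribute mapping (the values below are made up and do not reproduce Table 7):

```python
# Sketch only: the same sampling frequency maps to different attributes
# depending on the activity class; entries here are hypothetical.
ATTRIBUTE_MAP = {
    ("class A", 8.0): "step length",
    ("class B", 8.0): "circular distance",
}

def attribute_for(activity_class, fs):
    return ATTRIBUTE_MAP.get((activity_class, fs), "unknown")

print(attribute_for("class A", 8.0))  # step length
print(attribute_for("class B", 8.0))  # circular distance for the same fs
```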
  • the embodiments disclosed herein can be implemented using at least one software program running on at least one hardware device and performing network management functions to control the elements.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Physiology (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Dentistry (AREA)
  • Artificial Intelligence (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Signal Processing (AREA)
  • Psychiatry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

Embodiments herein disclose a method to identify an activity of a user. The method comprises obtaining sensor data from a plurality of sensors of the electronic device (100) operating in a synchronous mode, wherein the synchronous mode configures each of the plurality of sensors in a first configuration. The method also includes determining an activity class based on the sensor data obtained from each of the sensors and configuring the electronic device (100) in an asynchronous mode based on the activity class. The method also includes obtaining sensor data from the at least one candidate sensor and identifying at least one activity, from the plurality of candidate activities, currently performed by the user based on the sensor data obtained from the at least one candidate sensor.

Description

METHOD AND SYSTEM FOR IDENTIFYING AN ACTIVITY OF A USER
The present disclosure relates to signal processing, machine learning, statistical analysis, duty cycles of signals, sensor configuration and smart wearables, and more specifically to a smart wearable device for identifying an activity of a user.
There is an increasing demand for measuring user activity attributes accurately using a smart wearable device. The activity attributes are, for example but not limited to, step counts, cycling distance and the like. Accelerometer, gyroscope and pressure sensors used in conventional smart wearable devices increase the cost of the smart wearable device significantly.
The conventional art attempts to improve the accuracy of activity attributes through sensor configurations in shoes used as smart wearable devices. Multiple proximity sensors and magnetically coupled devices are used for computing step counts. Pressure sensors in shoes are used to determine the gait cycle. Smart insoles are used for diabetic foot treatment. GPS-enabled shoes that help in tracking pathways are also used.
Further, prior art also deals with fall-detection shoes. Sweat-analysis shoes are also available in the art. Prior art also deals with determining heat energy generated from shoes and using it for self-charging, etc.
However, for all the scenarios mentioned above, the cost of the smart wearable device (shoes) is very high. Further, the prior arts mentioned above do not produce accurate results and hence are not up to the mark.
Thus, it is desired to address the above-mentioned disadvantages or other shortcomings or at least provide a useful alternative.
The principal object of the embodiments herein is to identify an activity of a user.
Another object of the embodiments herein is to obtain sensor data from a plurality of sensors of the electronic device operating in a synchronous mode, wherein the synchronous mode configures each of the plurality of sensors in a first configuration.
Another object of the embodiments herein is to determine an activity class based on the sensor data obtained from each of the sensors.
Another object of the embodiments herein is to configure the electronic device in an asynchronous mode based on the activity class.
Another object of the embodiments herein is to identify at least one activity, from the plurality of candidate activities, currently performed by the user based on the sensor data obtained from the at least one candidate sensor.
Another object of the embodiments herein is to compare the sensor data from the plurality of sensors with predefined sensor data present in a predefined sensor database.
Another object of the embodiments herein is to determine a change in the activity class based on the obtained sensor data from the at least one candidate sensor by comparing the sensor data from the plurality of sensors with the predefined sensor data.
Another object of the embodiments herein is to reconfigure the electronic device in the synchronous mode in response to determining that there is a change in the activity class.
Another object of the embodiments herein is to determine a new activity class based on the sensor data obtained from each of the sensors.
Another object of the embodiments herein is to configure the electronic device in the asynchronous mode based on the new activity class.
Another object of the embodiments herein is to determine a duty cycle with respect to a predetermined time based on the sensor data from at least one candidate sensor.
Another object of the embodiments herein is to sample the duty cycle into a set of discrete signals based on a sampling frequency.
Another object of the embodiments herein is to identify the at least one activity, from the plurality of candidate activities based on the set of discrete signals.
Another object of the embodiments herein is to determine whether the discrete signal is a non-overlapping signal.
Another object of the embodiments herein is to map the sampling frequency with a predefined frequency in the predefined sensor database.
Another object of the embodiments herein is to sample the duty cycle into a set of discrete signals based on a candidate sampling frequency until the discrete signal is non-overlapping.
Another object of the embodiments herein is to identify the at least one activity from the plurality of candidate activities based on at least one of the sampling frequency and the candidate sampling frequency.
Another object of the embodiments herein is to determine a plurality of attributes associated with the predefined frequency from the predefined sensor database for the at least one activity.
Embodiments herein disclose a method to identify an activity of a user. The method includes obtaining, by an electronic device, sensor data from a plurality of sensors of the electronic device operating in a synchronous mode, wherein the synchronous mode configures each of the plurality of sensors in a first configuration. The method also includes determining, by the electronic device, an activity class based on the sensor data obtained from each of the sensors, wherein the activity class indicates a plurality of candidate activities associated with the user, and configuring the electronic device in an asynchronous mode based on the activity class. The asynchronous mode configures at least one candidate sensor from the plurality of sensors in a second configuration different than the first configuration. The method further includes obtaining sensor data from the at least one candidate sensor and identifying at least one activity, from the plurality of candidate activities, currently performed by the user based on the sensor data obtained from the at least one candidate sensor.
In an embodiment, determining the activity class based on the sensor data obtained from each of the sensors comprises comparing the sensor data from the plurality of sensors with predefined sensor data present in a predefined sensor database and determining the activity class of the sensor data based on the comparison.
In another embodiment, identifying the at least one activity, from the plurality of candidate activities, currently performed by the user comprises determining a change in the activity class based on the obtained sensor data from the at least one candidate sensor by comparing the sensor data from the plurality of sensors with the predefined sensor data. The identifying further comprises reconfiguring the electronic device in the synchronous mode in response to determining that there is a change in the activity class and determining a new activity class based on the sensor data obtained from each of the sensors. The identifying further comprises configuring the electronic device in the asynchronous mode based on the new activity class. The asynchronous mode configures at least one candidate sensor from the plurality of sensors in a third configuration different than the first configuration and the second configuration.
In another embodiment, identifying the at least one activity, from the plurality of candidate activities, currently performed by the user further comprises determining a duty cycle with respect to a predetermined time based on the sensor data from at least one candidate sensor. The identifying further comprises sampling the duty cycle into a set of discrete signals based on a sampling frequency and identifying the at least one activity, from the plurality of candidate activities, based on the set of discrete signals.
In another embodiment, identifying the at least one activity, from the plurality of candidate activities based on the set of discrete signals, comprises determining whether the discrete signal is a non-overlapping signal. The identifying also includes mapping the sampling frequency with a predefined frequency in the predefined sensor database, in response to determining that the discrete signal is the non-overlapping signal. The identifying further includes sampling the duty cycle into a set of discrete signals based on a candidate sampling frequency until the discrete signal is non-overlapping, if the discrete signal is overlapping. The method further includes identifying the at least one activity from the plurality of candidate activities based on at least one of the sampling frequency and the candidate sampling frequency and determining a plurality of attributes associated with the predefined frequency from the predefined sensor database for the at least one activity.
In another embodiment the duty cycle is determined using a base band signal having a base frequency. The plurality of sensors comprises short range sensor. The sensor data obtained from the at least one candidate sensor is at least one of non-continuous, continuous, periodic and non-periodic in nature.
In another embodiment the plurality of attributes comprises acceleration, speed, angle, pressure, step gap, step distance, stair height, revolution distance, driving behavior, type of vehicle and terrain type.
In an embodiment the present invention describes a method to identify an activity of a user. The method comprises obtaining sensor data from a plurality of sensors associated with the electronic device. The method also includes determining a duty cycle with respect to a predetermined time based on the sensor data from the at least one candidate sensor. The method also includes sampling the duty cycle into a set of discrete signals based on a sampling frequency and identifying at least one activity, from a plurality of activities associated with the user, based on the set of discrete signals.
In another embodiment, identifying the at least one activity, from the plurality of activities based on the set of discrete signals comprises determining whether the discrete signal is a non-overlapping signal. The identifying further includes mapping the sampling frequency with a predefined frequency in the predefined sensor database, in response to determining that the discrete signal is the non-overlapping signal. The method also includes, if the discrete signal is overlapping, sampling the duty cycle into a set of discrete signals based on a candidate sampling frequency until the discrete signal is non-overlapping. The method further includes identifying the at least one activity from the plurality of candidate activities based on at least one of the sampling frequency and the candidate sampling frequency and determining a plurality of attributes associated with the predefined frequency from the predefined sensor database for the at least one activity.
In another embodiment obtaining sensor data from the plurality of sensor further comprises obtaining a sensor data from a plurality of sensors of the electronic device operating in a synchronous mode, wherein the synchronous mode configures each of the plurality of sensors in a first configuration. The obtaining further includes determining an activity class based on the sensor data obtained from each of the sensors. The obtaining also includes configuring the electronic device in an asynchronous mode based on the activity class, wherein the asynchronous mode configures at least one candidate sensor from the plurality of sensors in a second configuration different than the first configuration and obtaining sensor data from the at least one candidate sensor.
In another embodiment obtaining sensor data from the at least one candidate sensor comprises determining a change in the activity class based on the obtained sensor data from the at least one candidate sensor. The obtaining also includes reconfiguring the electronic device in the synchronous mode in response to determining that there is a change in the activity class. The obtaining also includes determining, by the electronic device, a new activity class based on the sensor data obtained from each of the sensors and configuring the electronic device in the asynchronous mode based on the new activity class, wherein the asynchronous mode configures at least one candidate sensor from the plurality of sensors in a third configuration different than the first configuration and second configuration.
Accordingly the present invention provides an electronic device for identifying an activity of a user. The electronic device comprises a memory, a processor, an attribute determiner and a communicator. The processor is configured to obtain sensor data from a plurality of sensors of the electronic device operating in a synchronous mode, wherein the synchronous mode configures each of the plurality of sensors in a first configuration. The processor is further configured to determine an activity class based on the sensor data obtained from each of the sensors, wherein the activity class indicates a plurality of candidate activities associated with the user. The processor is also configured to configure the electronic device in an asynchronous mode based on the activity class, wherein the asynchronous mode configures at least one candidate sensor from the plurality of sensors in a second configuration different than the first configuration. The processor is also configured to obtain sensor data from the at least one candidate sensor and identify at least one activity, from the plurality of candidate activities, currently performed by user based on the sensor data obtained from the at least one candidate sensor.
Accordingly, the present invention provides an electronic device for identifying an activity of a user. The electronic device comprises a memory, a processor, an attribute determiner and a communicator. The processor is configured to obtain sensor data from a plurality of sensors associated with the electronic device and determine a duty cycle with respect to a predetermined time based on the sensor data from the at least one candidate sensor. The processor is also configured to sample the duty cycle into a set of discrete signals based on a sampling frequency, and identify at least one activity, from a plurality of activities associated with the user, based on the set of discrete signals.
This method and system are illustrated in the accompanying drawings, throughout which like reference letters indicate corresponding parts in the various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:
Fig. 1 is a block diagram of an electronic device 100 for identifying an activity of a user of the electronic device 100, according to the embodiments as disclosed herein;
Fig.2 is a flow diagram of the proposed method for identifying the activity of the user and determining the attributes associated with the identified activity, according to the embodiments as disclosed herein;
Fig. 3 is a schematic diagram, illustrating the synchronous and asynchronous operating mode for the sensors, according to the embodiments as disclosed herein;
Fig. 4 is a schematic diagram illustrating how the plurality of sensors work when in the asynchronous mode, according to the embodiments as disclosed herein;
Fig. 5 is a flow diagram illustrating a flow for determining the attributes associated with the activity of the user after obtaining the candidate sensor data, according to the embodiments as disclosed herein;
Fig. 6 is a schematic diagram, illustrating the flow 500 with an exemplary activity performed by the user, according to the embodiments as disclosed herein;
Fig. 7 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is cycling, according to an embodiment as disclosed herein;
Fig. 8 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is swimming, according to an embodiment as disclosed herein;
Fig. 9 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is neck rotation, according to an embodiment as disclosed herein;
Fig. 10 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is driving, according to an embodiment as disclosed herein;
Fig. 11 is a schematic diagram, illustrating an example embodiment, wherein the activity is being performed by a group of users, and the activity is dancing;
Fig. 12 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is climbing steps;
Fig. 13 is a table illustrating the grouping of similar activities.
The embodiments herein and the various features and advantageous details thereof are explained with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. Also, the various embodiments described herein are not necessarily mutually exclusive, as some embodiments can be combined with one or more other embodiments to form new embodiments. The term “or” as used herein, refers to a non-exclusive or, unless otherwise indicated. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein can be practiced and to further enable those skilled in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
As is traditional in the field, embodiments may be described and illustrated in terms of blocks which carry out a described function or functions. These blocks, which may be referred to herein as managers, units, modules, hardware components or the like, are physically implemented by analog and/or digital circuits such as logic gates, integrated circuits, microprocessors, microcontrollers, memory circuits, passive electronic components, active electronic components, optical components, hardwired circuits and the like, and may optionally be driven by firmware and software. The circuits may, for example, be embodied in one or more semiconductor chips, or on substrate supports such as printed circuit boards and the like. The circuits constituting a block may be implemented by dedicated hardware, or by a processor (e.g., one or more programmed microprocessors and associated circuitry), or by a combination of dedicated hardware to perform some functions of the block and a processor to perform other functions of the block. Each block of the embodiments may be physically separated into two or more interacting and discrete blocks without departing from the scope of the disclosure. Likewise, the blocks of the embodiments may be physically combined into more complex blocks without departing from the scope of the disclosure.
The accompanying drawings are used to help easily understand various technical features and it should be understood that the embodiments presented herein are not limited by the accompanying drawings. As such, the present disclosure should be construed to extend to any alterations, equivalents and substitutes in addition to those which are particularly set out in the accompanying drawings. Although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are generally only used to distinguish one element from another.
Accordingly, the embodiments herein disclose a method to identify an activity of a user. The method includes, obtaining by an electronic device, sensor data from a plurality of sensors of the electronic device operating in a synchronous mode, wherein the synchronous mode configures each of the plurality of sensors in a first configuration. The method also includes determining, by the electronic device, an activity class based on the sensor data obtained from each of the sensors, wherein the activity class indicates a plurality of candidate activities associated with the user and configuring the electronic device in an asynchronous mode based on the activity class. The asynchronous mode configures at least one candidate sensor from the plurality of sensors in a second configuration different than the first configuration. The method further includes obtaining sensor data from the at least one candidate sensor and identifying at least one activity, from the plurality of candidate activities, currently performed by user based on the sensor data obtained from the at least one candidate sensor.
Referring now to the drawings, and more particularly to Fig. 1 through Fig. 13, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
Fig. 1 is a block diagram of an electronic device 100 for identifying an activity of a user of the electronic device 100, according to the embodiments as disclosed herein. The electronic device 100 may be, for example, but not limited to, a smart social robot, a smart watch, a cellular phone, a smart phone, a Personal Digital Assistant (PDA), a tablet computer, a head mounted smart device with display, a laptop computer, a music player, a video player, an Internet of Things (IoT) device or the like. The electronic device 100 includes a memory 110, a processor 120, an attribute determiner 130 and a communicator 140.
In an embodiment the electronic device 100 may be a smart wearable device. The smart wearable device may allow the user to track the activities performed by the user and may provide details about the attributes associated with the activities. The electronic device 100 may comprise a plurality of sensors for sensing the activities performed by the user. The plurality of sensors are short-range sensors. In an embodiment the short-range sensors may be, for example but not limited to, Near Field Communication (NFC) sensors and proximity sensors. The activities performed by the user may be, for example but not limited to, cycling, jumping, skipping, walking, running, swimming, dancing, jogging, paddling, skiing, standing, sitting and the like. The attributes associated with the activities may be, for example but not limited to, step count, distance, calories burnt, speed, angle, pressure, voltage, current, flux and the like.
The attribute determiner 130 determines the various attributes associated with the activities performed by the user. In an embodiment the attribute determiner 130 obtains data (sensor data) from the plurality of sensors operating in a synchronous mode. In the synchronous mode, the plurality of sensors are configured in a first configuration. The first configuration describes that each sensor of the plurality of sensors works as a transmitter as well as a receiver.
After obtaining the data from the plurality of sensors, the attribute determiner 130 determines an activity class of the activity performed by the user based on the sensor data. The activity class associated with the activity performed by the user indicates a plurality of candidate activities.
In an embodiment the activities performed by the user may be classified into class A, class B, class C, class D and the like. Each class may indicate a plurality of candidate activities. For example, class A may indicate activities such as walking, running and jogging. Class B may indicate activities such as cycling and paddling. Class C may indicate activities such as skiing and cross-country machine. Class D may indicate activities such as standing, sitting and the like. The above-mentioned activities are an example, and the classes may include activities other than those mentioned.
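The sketch below illustrates one way such a class determination could be realized: the measured sensor data is compared against a predefined sensor database and the closest-matching class is returned. The class groupings follow the example above, but the feature vectors, the signatures and the nearest-match rule are assumptions for illustration only and are not taken from the disclosure.

# Minimal sketch of activity-class determination (illustrative only).
# The signature values and the nearest-match rule are assumptions.

PREDEFINED_SENSOR_DATABASE = {
    "A": {"activities": ["walking", "running", "jogging"], "signature": [1.0, 0.2]},
    "B": {"activities": ["cycling", "paddling"],           "signature": [0.4, 0.9]},
    "C": {"activities": ["skiing", "cross-country"],       "signature": [0.7, 0.7]},
    "D": {"activities": ["standing", "sitting"],           "signature": [0.1, 0.1]},
}

def determine_activity_class(sensor_data):
    """Return the class whose stored signature is closest to the measured data."""
    def distance(signature):
        return sum((a - b) ** 2 for a, b in zip(sensor_data, signature))
    return min(PREDEFINED_SENSOR_DATABASE,
               key=lambda c: distance(PREDEFINED_SENSOR_DATABASE[c]["signature"]))

# Example: data close to the class-A signature yields class "A".
print(determine_activity_class([0.95, 0.25]))  # -> "A"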
After classifying the activities into the activity class, the attribute determiner 130 configures the electronic device 100 in an asynchronous mode based on the activity class. The asynchronous mode configures the plurality of sensors in a second configuration. In the second configuration each of the sensors of the plurality of sensors may work either as a transmitter or a receiver. In an embodiment all the sensors may be configured into the second configuration. In another embodiment at least one sensor may be configured in the second configuration. The sensors configured into the asynchronous mode may be termed candidate sensors.
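The following sketch illustrates the two configurations described above: in the synchronous first configuration every sensor both transmits and receives, while in the asynchronous second configuration each candidate sensor is assigned only one role. The role names, sensor names and the alternating-role rule for candidates are assumptions for illustration, not the disclosed method.

# Illustrative sketch of the first (synchronous) and second (asynchronous) configurations.
from enum import Enum

class Role(Enum):
    TRANSCEIVER = "transmitter+receiver"   # first configuration (synchronous mode)
    TRANSMITTER = "transmitter"            # second configuration (asynchronous mode)
    RECEIVER = "receiver"

def configure_synchronous(sensors):
    """First configuration: every sensor both transmits and receives."""
    return {s: Role.TRANSCEIVER for s in sensors}

def configure_asynchronous(sensors, candidate_sensors):
    """Second configuration: each candidate sensor is only a transmitter or a receiver."""
    config = {}
    for i, s in enumerate(sensors):
        if s in candidate_sensors:
            config[s] = Role.TRANSMITTER if i % 2 == 0 else Role.RECEIVER
        else:
            config[s] = Role.RECEIVER          # non-candidate sensors stay passive
    return config

sensors = ["nfc_left", "nfc_right", "proximity_left", "proximity_right"]
print(configure_asynchronous(sensors, {"nfc_left", "nfc_right"}))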
After configuring the candidate sensors in the asynchronous mode, the attribute determiner 130 obtains candidate sensor data from the candidate sensors and identifies an activity of the user from the plurality of candidate activities of the activity class based on the candidate sensor data.
The attribute determiner 130 also determines a plurality of attributes for the identified activity of the user based on a predefined frequency.
In an embodiment, the processor 120 is coupled to the attribute determiner 130, the memory 110 and the communicator 140. The processor 120 is configured to execute instructions stored in the memory 110 and to perform various other processes.
The memory 110 stores the effect and condition. The memory 110 also stores instructions to be executed by the processor 120. The memory 110 may include non-volatile storage elements. Examples of such non-volatile storage elements may include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of erasable programmable memories (EPROM) or electrically erasable and programmable read only (EEPROM) memories. In addition, the memory 110 may, in some examples, be considered a non-transitory storage medium. The term “non-transitory” may indicate that the storage medium is not embodied in a carrier wave or a propagated signal. However, the term “non-transitory” should not be interpreted that the memory 110 is non-movable. In some examples, the memory 110 can be configured to store larger amounts of information than the memory. In certain examples, a non-transitory storage medium may store data that can, over time, change (e.g., in Random Access Memory (RAM) or cache).
The communicator 140 is configured for communicating internally between internal hardware components and with external devices via one or more networks.
Fig. 1 shows a limited overview of the electronic device 100, but it is to be understood that other embodiments are not limited thereto. Further, the electronic device 100 may include any number of hardware or software components communicating with each other. By way of illustration, both an application running on the electronic device 100 and the device itself can be a component.
Fig. 2 is a flow diagram of the proposed method for identifying the activity of the user and determining the attributes associated with the identified activity. At 202, the method performs signal acquisition in the synchronous mode, where signal acquisition refers to measuring the sensor data for the plurality of sensors; the electronic device 100 measures the data from the plurality of sensors of the electronic device 100. At 204, the electronic device 100 compares the measured sensor data with predefined sensor database (salient point database) values present in the electronic device 100 and determines the activity class based on the comparison. In an embodiment the predefined sensor database may be present outside the electronic device 100. The activity class may be, for example but not limited to, class A, class B, class C or class D. In an embodiment, the method iteratively compares the measured sensor data with the predefined sensor database for determining a change in the activity class.
At 206, the method includes configuring the plurality of sensors in the asynchronous mode. By configuring the plurality of sensors in the asynchronous mode, each of the plurality of sensors may work either as a transmitter or a receiver. Configuring the plurality of sensors in the asynchronous mode increases the battery life of the plurality of sensors, thereby increasing the battery life of the electronic device 100.
At 208, the method includes obtaining by the electronic device 100, data from the plurality of sensors configured in asynchronous mode which may be termed as candidate sensor data.
At 210, the electronic device 100, determines a duty cycle with respect to a predetermined time based on the candidate sensor data from at least one candidate sensor. After determining the duty cycle, the electronic device 100 samples the duty cycle into a set of discrete signals based on a sampling frequency.
At 212, the method determines whether the discrete signal is an overlapping signal or a non-overlapping signal. If the signal is a non-overlapping signal, then the flow proceeds to 214. If the signal is an overlapping signal, then the flow remains at 212, where the method includes re-sampling, by the electronic device 100, the duty cycle into a set of discrete signals based on another sampling frequency and performing the sampling iteratively until the discrete signal is non-overlapping.
At 214, the method includes mapping the sampling frequency for which the discrete signal is non-overlapping with a predefined frequency in the predefined sensor database and determining the activity of the user based on the mapping.
At 216, the method includes determining, by the electronic device 100, the plurality of attributes associated with the determined user activity from the predefined sensor database.
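The steps 202-216 described above can be summarized in the following sketch. Every helper named here (acquire_synchronous, classify, configure_asynchronous, acquire_candidate, to_duty_cycle, sample, is_non_overlapping, lookup_activity, lookup_attributes) is hypothetical and merely stands in for the corresponding step of Fig. 2; this is a structural outline under stated assumptions, not the disclosed implementation.

# Hedged sketch of the Fig. 2 flow; all device methods are hypothetical placeholders.
def identify_activity_and_attributes(device, candidate_frequencies):
    raw = device.acquire_synchronous()                 # 202: signal acquisition in synchronous mode
    activity_class = device.classify(raw)              # 204: compare with salient-point database
    device.configure_asynchronous(activity_class)      # 206: candidate sensors act as Tx or Rx
    candidate_data = device.acquire_candidate()        # 208: obtain candidate sensor data
    duty_cycle = device.to_duty_cycle(candidate_data)  # 210: duty cycle over a predetermined time

    for fs in candidate_frequencies:                   # 210/212: sample, re-sample if overlapping
        discrete = device.sample(duty_cycle, fs)
        if device.is_non_overlapping(discrete):
            activity = device.lookup_activity(fs)              # 214: map fs to predefined frequency
            return activity, device.lookup_attributes(fs)      # 216: attributes for the activity
    return None, None                                   # no non-overlapping frequency found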
Thus, as seen above, the method helps increase the battery life of the electronic device 100 and also provides an accurate output, namely the attributes associated with the user activity.
Fig. 3 is a schematic diagram, illustrating the synchronous and asynchronous operating modes for the sensors. As seen in fig. 3, part 1 illustrates the clock cycles, wherein the x-axis indicates time and the y-axis indicates the voltage. Part 2 indicates the sensor signal when the plurality of sensors are in the synchronous mode and each sensor acts as a sender as well as a receiver; as seen in part 2, a sensor signal is obtained for each clock cycle. Part 3 indicates the sensor signal when the plurality of sensors are in the asynchronous mode and each sensor acts either as a sender or as a receiver; thus, as seen in part 3, the sensor signal is available only for alternate clock cycles, thereby saving battery life. Part 4 shows a combined sensor signal for the asynchronous and synchronous modes. As seen in part 4, for the first clock cycle the plurality of sensors are in the synchronous mode, for the second clock cycle the plurality of sensors are in the asynchronous mode, and for the third, fourth and fifth cycles the sensor signals are in the synchronous mode.
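A per-clock-cycle schedule like the combined signal in part 4 of Fig. 3 could be represented as in the sketch below. The schedule function, the choice of which cycle is asynchronous, and the rough relative-power estimate are assumptions made purely to illustrate why alternating modes can save battery.

# Illustrative sketch of a combined mode schedule; values and power estimate are assumptions.
def mode_schedule(num_cycles, async_cycles):
    """Return the operating mode for each clock cycle."""
    return ["asynchronous" if c in async_cycles else "synchronous"
            for c in range(num_cycles)]

schedule = mode_schedule(5, async_cycles={1})     # the second cycle (index 1) is asynchronous
print(schedule)
# ['synchronous', 'asynchronous', 'synchronous', 'synchronous', 'synchronous']

# Rough relative power estimate: an asynchronous cycle drives only one radio role per sensor.
relative_power = sum(0.5 if m == "asynchronous" else 1.0 for m in schedule) / len(schedule)
print(relative_power)  # 0.9 under this assumed schedule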
Fig. 4 is a schematic diagram illustrating how the plurality of sensors work when in the asynchronous mode. Fig. 4 illustrates the proximity sensor and the NFC sensors present in the electronic device 100 (wearable device/smart shoes) worn by the user while performing a walking activity. S1 indicates a first station of the NFC sensor and S2 indicates a second station of the NFC sensor. A left leg of the user with the smart shoes is indicated by station S1 and a right leg of the user with the smart shoes is indicated by station S2 of the NFC sensors. At state I, S1 is at ground, and at state II, S2 is at ground. While walking, the user lifts the right leg, and hence at state III, S2 is in mid-air and has some vertical distance from the ground. Here the proximity sensor measures some vertical distance, and S2 acts as a sender and sends the measured data to either the electronic device 100 or to S1. At state IV the right leg is not completely at ground and thus S2 still acts as a sender. In the meantime, from state I to state IV, S1 acts as a receiver, as S1 is not measuring any values and thus does not need to send any data.
At state V the right leg is completely grounded and hence S2 acts as a receiver. From state VI to state VIII the left leg is in the air and the proximity sensor has to measure and send the vertical distance; thus S1 acts as a sender. Meanwhile, from state VI to state VIII the right leg is at ground and need not measure any values, and hence S2 acts as a receiver.
Table 1, below, shows the configuration of S1 and S2 and the proximity sensors.

Leg state                       | Proximity sensor 1 [vertical distance] | Proximity sensor 2 [vertical distance] | Station S1 (left leg) | Station S2 (right leg)
Right (in air); Left (grounded) | 0                                      | Some variable value                    | Receiver              | Sender
Left (in air); Right (grounded) | Some variable value                    | 0                                      | Sender                | Receiver
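The sender/receiver switching of Fig. 4 and Table 1 can be expressed as a simple rule: the station whose proximity sensor reports a non-zero vertical distance (its leg is in the air) sends its reading, while the grounded station only receives. The sketch below illustrates this rule; the function name and the "both grounded" fallback are assumptions for illustration.

# Sketch of the Table 1 role assignment (illustrative names and fallback behavior).
def assign_roles(vertical_distance_s1, vertical_distance_s2):
    """Return (role_of_S1, role_of_S2) for one walking state."""
    if vertical_distance_s2 > 0:          # right leg lifted (states III-IV)
        return "receiver", "sender"
    if vertical_distance_s1 > 0:          # left leg lifted (states VI-VIII)
        return "sender", "receiver"
    return "receiver", "receiver"         # both legs grounded, nothing to report

print(assign_roles(0.0, 42.0))   # -> ('receiver', 'sender')
print(assign_roles(37.5, 0.0))   # -> ('sender', 'receiver')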
Fig. 5 is a flow diagram illustrating a flow for determining the attributes associated with the activity of the user after obtaining the candidate sensor data. At 502, the electronic device 100, after obtaining the candidate sensor data, determines a duty cycle having a baseband frequency (fm) using the obtained candidate sensor data. At 504, the method includes selecting a discrete signal with a sampling frequency (fs). At 506, the method includes applying a sampling process on the duty cycle with frequency fm and the discrete signal with frequency fs for obtaining a sampled signal. At 508, the method includes applying a Fourier transform on the sampled signal and obtaining an output signal. At 510, the flow determines whether the signal obtained at 508 is overlapping or non-overlapping. If the signal obtained is a non-overlapping signal, then the flow 500 proceeds to 512; if the signal is an overlapping signal, the flow returns to 504. At 512, the sampling frequency for which the signal obtained at 508 is non-overlapping is compared with a predefined frequency in the frequency database, and the activity of the user is determined based on the comparison. At 514, after determining the activity, the electronic device 100 determines, from the predefined database, the attributes associated with the sampling frequency for which the signal obtained at 508 is non-overlapping.
Fig. 6 is a schematic diagram, illustrating the flow 500 with an exemplary activity performed by the user. As seen in fig. 6, 6.1 shows the duty cycle m(t). The signal m(t) may be non-continuous, continuous, periodic or non-periodic in nature. 6.2 indicates the signal c(t) with sampling frequency fs. The sampling frequency is selected using the Nyquist criterion, where fs >= 2fm. 6.3 indicates a sampling signal s(t) after combining the signals m(t) and c(t). 6.4 indicates the sampled discrete signal s(t) obtained after performing the sampling. 6.5 indicates a non-overlapping signal w(t) obtained after performing a Fourier transform on the signal s(t). Using the sampling frequency for which the non-overlapping signal is obtained, the electronic device 100 determines the activity of the user and the attributes associated with the activity.
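A minimal numerical sketch of this loop is given below. It iterates over candidate sampling frequencies and treats a candidate as "overlapping" when the Nyquist criterion fs >= 2fm stated above is not met (the spectral replicas of the sampled signal would overlap). The example duty-cycle waveform, the candidate frequency list and the use of the Nyquist test as the overlap check are assumptions for illustration; the spectrum w(t) of the accepted samples could additionally be inspected with np.fft.rfft.

# Hedged sketch of the Fig. 5/6 sampling loop, assuming the Nyquist test as the overlap check.
import numpy as np

def sample_duty_cycle(m, f_m, f_s, duration=1.0):
    """Sample m(t) (baseband frequency f_m) at f_s; report whether the result is non-overlapping."""
    t = np.arange(0.0, duration, 1.0 / f_s)
    s = m(t)                                  # sampled discrete signal s(t)
    non_overlapping = f_s >= 2.0 * f_m        # Nyquist criterion: spectral replicas do not overlap
    return s, non_overlapping

def find_non_overlapping_frequency(m, f_m, candidate_frequencies):
    """Iterate over candidate sampling frequencies until a non-overlapping one is found."""
    for f_s in candidate_frequencies:
        _, ok = sample_duty_cycle(m, f_m, f_s)
        if ok:
            return f_s
    return None

duty_cycle = lambda t: 0.5 * (1 + np.sign(np.sin(2 * np.pi * 2.0 * t)))  # assumed 2 Hz duty cycle
print(find_non_overlapping_frequency(duty_cycle, f_m=2.0, candidate_frequencies=[1.0, 3.0, 8.0]))
# -> 8.0, the first candidate satisfying fs >= 2*fm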
In an embodiment the sampled cycle s(t) obtained from a first user may be compared with the sampled cycle obtained from a second user, and the activity of the second user may be determined just by comparing the sampled cycles.
Fig. 7 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is cycling. While cycling, the distance between the two paddles is going to remain constant, and hence the NFC sensor input will be constant. However, the distance between the paddle and the ground may vary and is captured using the proximity sensor. Fig. 7 shows the proximity sensor waveform at S1 and S2 when the plurality of sensors are configured asynchronously. Here the input to the electronic device 100 is the output of the NFC and the proximity sensors. Based on the input values, the electronic device 100 determines the activity and the plurality of attributes associated with the activity, such as cycling distance, cycling speed, duration and the like.
Fig. 8 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is swimming. The user is wearing the smart device (electronic device 100) on both hands and legs. The wearable device includes proximity and NFC sensors. Here the proximity sensor and the NFC sensors are used in combination for determining the activity of the user and the attributes associated with the activity. As seen in fig. 8 and as stated in table 2, the proximity sensor and the NFC sensors assist the electronic device 100 in determining the activity of the user as swimming and also determining the attributes associated with swimming, such as swimming distance, swimming speed, hand-leg synchronization, and the like.
State               | Left hand | Right hand | Proximity sensor A [distance, mm] | Proximity sensor B [distance, mm]
State I             | in-water  | in-water   | 0                                 | 0
State II, State III | in-water  | mid-air    | 0                                 | Some value
State IV, State V   | mid-air   | in-water   | Some value                        | 0
Fig. 9 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is neck rotation. The user is wearing the smart device (electronic device 100) as a headphone. The wearable device includes proximity and NFC sensors. Here the proximity sensor and the NFC sensors are used in combination for determining the activity of the user and the attributes associated with the activity. The magnetic field of the NFC sensor is used for determining the head rotation, and the proximity sensor is used for determining the change in polarity. Thus, as seen in fig. 9 and as stated in table 3, the proximity sensor and the NFC sensors together assist the electronic device 100 in determining the activity of the user as neck rotation and also determining the attributes associated with neck rotation, such as neck rotation speed, number of repetitions, duration, effective posture and the like.
Head-related exercises (repetitive movement - linear/non-linear)

Station              | Left ear (E1)   | Right ear (E2)  | Signal analysis - single ear (left OR right)                     | Signal analysis - both ears (between left AND right)
NFC [magnetic field] | Variable        | Variable        | ΔHeight, ΔDistance, ΔStrength, ΔVoltage, ΔCurrent, Power, ΔFlux  | RMS values, in-phase/out-of-phase voltage or current, phase difference/polarity change, wavelength differences
Proximity            | On/Off (toggle) | On/Off (toggle) | Toggle level (On/Off)                                             | Polarity change
Fig. 10 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is driving. The user is wearing the smart device (electronic device 100) on both hands as well as both legs, denoted by A, B, C and D. The wearable device includes proximity, NFC and NFC electromagnetic sensors. Here the proximity sensor and the NFC sensors are used in combination for determining the activity of the user and the attributes associated with the activity. Thus, as seen in fig. 10 and as stated in table 4, the proximity sensor and the NFC sensors together assist the electronic device 100 in determining the activity of the user as driving and also determining the attributes associated with driving, such as the pressure-applying pattern (brake and acceleration) and the frequency, intensity and duration of the user's driving activities, and the like. Table 4 illustrates the configuration of the sensors.
Sensor                 | A | B | C | D
NFC                    | O | O | X | X
Proximity              | X | X | O | O
NFC [electro-magnetic] | X | X | O | O
Fig. 11 is a schematic diagram, illustrating an example embodiment, wherein the activity is being performed by a group of users, and the activity is dancing. In an embodiment the group of dancers (users) may be wearing the smart device (electronic device 100) on the hands and/or legs. Using the data obtained from the NFC and proximity sensors, the electronic device 100 may determine whether the users are in synchronization, whether their postures are similar, and the like. As seen in fig. 11, user 1 of the user group is wearing devices A, B, C and D. Similarly, user 2 and user 3 are wearing devices A' and A'' respectively. Thus, as seen in fig. 11 and as stated in table 5, the proximity sensor and the NFC sensors together assist the electronic device 100 in determining the activity of the group of users as dancing and also determining the attributes associated with dancing, such as synchronization between users, posture determination and the like. Table 5 illustrates the configuration of the sensors.
Input                                                                   | Output
NFC (variable), Proximity (variable), NFC [electro-magnetic] (variable) | Communication: Intra (A-B) / Inter (A-A')
                                                                        | Dance type: Solo / Paired / Group
                                                                        | Analysis: Synchronization, posture difficulty level
In other embodiments the proposed method may be used in the health care domain. In an example embodiment, where a user is recovering from a leg fracture, the proposed method may be used for determining the walking speed of the user and the progress of the user's speed using the NFC and the proximity sensors. The proposed method may also be used for monitoring a child's walking pattern using the sensors and may provide a detailed report indicating a nutrient deficiency found in the child, and the like. The proposed method may also be used for monitoring a pregnant woman to support a smooth pregnancy.
Fig. 12 is a schematic diagram, illustrating an example embodiment, wherein the activity being performed by the user is climbing steps. The user is wearing the smart device (electronic device 100) on both legs. The wearable device includes proximity, NFC and NFC electromagnetic sensors. Here the proximity sensor and the NFC sensors are used in combination for determining the activity of the user and the attributes associated with the activity. The attributes associated with climbing steps are the number of steps and the distance covered by the user. Further, as seen in fig. 12, the height of the stair is determined using the values obtained from the proximity sensor. Similarly, the method determines the strength of the user. As seen in 12.2, if there is a drastic change in the proximity, then the method determines that the strength of the user is more than usual and hence detects that the user is climbing stairs. Fig. 12.3 shows the values obtained through the NFC sensor as an NFC waveform. Further, consecutive proximity readings result in determining the stair height. Thus, as discussed above, the proximity sensor and the NFC sensors together assist the electronic device 100 in determining the activity of the user as climbing stairs and also determining the attributes associated with climbing stairs.
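The stair-height idea above can be illustrated with a short sketch: the height of each step is estimated from the difference between consecutive proximity (vertical-distance) readings taken as each foot lands, and the step count follows from the number of such differences. The sample readings and the simple differencing rule are assumptions for illustration only.

# Sketch of stair-height and step-count estimation from consecutive proximity readings.
def estimate_stair_heights(proximity_readings_mm):
    """Return per-step height estimates from consecutive landing readings."""
    return [abs(b - a) for a, b in zip(proximity_readings_mm, proximity_readings_mm[1:])]

landings_mm = [0, 170, 340, 515]                      # assumed readings at successive landings
print(estimate_stair_heights(landings_mm))            # -> [170, 170, 175] (per-step heights)
print(len(estimate_stair_heights(landings_mm)))       # -> 3 (number of steps climbed)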
In another embodiment fig. 13 is a table illustrating the grouping of similar activities. The similar activities are grouped into one class based on various inputs.
In another embodiment table 6 shows the smart devices (electronic device 100) to be worn by the user and the possible activities that may be detected using the electronic device 100.
Electronic device                              | Activities
Shoe / Leg-Band / Anklet-Toe ring / Ankle Band | 1. Steps, Activity Detection (Walk, Run, Recovery Monitoring, Fall Detection, Exercise - Repetition, Count, Duration) 2. Body Pressure Index (Obesity Detection) 3. Vehicle Type Detection (Terrain detection) 4. Driving Pattern 5. Leg Synchronization 6. Stamina Deduction 7. Gaming - Virtual Dance Floor, etc. 8. Controller - IoT Devices (Vacuum Cleaner)
Wrist-Band / Ring / Bangle / Bracelet / Glove  | 1. Gloves - Grip 2. Arm Patch - Exercise Repetition Count 3. IoT Control 4. Gaming 5. Controller - IoT Devices (TV, AC, etc.) 6. Wrist Exercise 7. Gloves - Sign Language
Earpiece (IconX, LevelU) / Head Gear (Glasses) | 1. Virtual Gaming Controller 2. Text Neck (Neck Bending Problem) 3. Exercise (Neck)
Walking Stick                                  | 1. Blind and Elderly Care
In another embodiment, table 7 indicates the plurality of attributes for various sampling frequencies and the activities associated with the sampling frequencies. In an embodiment the sampling frequency may be replaced with a sampling current, voltage, flux, etc. for determining activity-related attributes.
As seen in table 7, the sampling frequency for a sampling class is mapped with different attributes; however, the sampling frequency value may be the same for different class groups while its attribute mapping differs between the classes. In an example scenario, for fs, class A is mapped with step length, and for the same fs, class B may be mapped with circular distance.
Sampling frequency | Activity | Attribute 1         | Attribute 2               | Attribute 3             | Attribute 4   | Attribute 5        | Attribute 6               | Attribute 7      | Attribute 8
fs                 | Walk     | Step distance       | Step gap (H - horizontal) | Pressure (V - vertical) | Angle (H + V) | Speed (horizontal) | Acceleration (horizontal) | Speed (vertical) | Acceleration (vertical)
'fs                | Stair    | Stair height        | Step gap (H - horizontal) | Pressure (V - vertical) | Angle (H + V) | Speed (horizontal) | Acceleration (horizontal) | Speed (vertical) | Acceleration (vertical)
''fs               | Cycling  | Revolution distance | Step gap (H - horizontal) | Pressure (V - vertical) | Angle (H + V) | Speed (horizontal) | Acceleration (horizontal) | Speed (vertical) | Acceleration (vertical)
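The table 7 lookup can be sketched as a keyed mapping in which the key combines the activity class and the sampling frequency, reflecting the note above that the same frequency can map to different attributes in different classes. All frequency values and attribute entries below are illustrative placeholders, not values from the disclosure.

# Sketch of a (class, sampling frequency) -> (activity, attributes) lookup; values are illustrative.
ATTRIBUTE_TABLE = {
    ("A", 4.0): ("Walk",    {"step_distance_m": 0.7,  "step_gap_s": 0.5}),
    ("B", 4.0): ("Cycling", {"revolution_distance_m": 2.1}),
    ("A", 6.0): ("Stair",   {"stair_height_m": 0.17, "step_gap_s": 0.8}),
}

def lookup(activity_class, sampling_frequency):
    """Return (activity, attributes) for a class/frequency pair, or (None, {}) if unknown."""
    return ATTRIBUTE_TABLE.get((activity_class, sampling_frequency), (None, {}))

print(lookup("A", 4.0))   # -> ('Walk', {...})
print(lookup("B", 4.0))   # same frequency, different class -> ('Cycling', {...})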
The embodiments disclosed herein can be implemented using at least one software program running on at least one hardware device and performing network management functions to control the elements.
The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.

Claims (15)

  1. A method to identify an activity of a user, the method comprising:
    obtaining, by an electronic device (100), sensor data from a plurality of sensors of the electronic device (100) operating in a synchronous mode, wherein the synchronous mode configures each of the plurality of sensors in a first configuration;
    determining, by the electronic device (100), an activity class based on the sensor data obtained from each of the sensors, wherein the activity class indicates a plurality of candidate activities associated with the user;
    configuring the electronic device (100) in an asynchronous mode based on the activity class, wherein the asynchronous mode configures at least one candidate sensor from the plurality of sensors in a second configuration different than the first configuration;
    obtaining sensor data from the at least one candidate sensor; and
    identifying at least one activity, from the plurality of candidate activities, currently performed by user based on the sensor data obtained from the at least one candidate sensor.
  2. The method as claimed in claim 1, wherein determining, by the electronic device (100), the activity class based on the sensor data obtained from each of the sensors comprises:
    comparing, by the electronic device (100), the sensor data for the plurality of the sensor with predefined sensor data present in a predefined sensor database; and
    determining, by the electronic device (100), the activity class of the sensor data based on the comparison.
  3. The method as claimed in claim 1 and claim 2, wherein identifying the at least one activity, from the plurality of candidate activities, currently performed by user comprises:
    determining, by the electronic device (100), a change in the activity class based on the obtained sensor data from the at least one candidate sensor by comparing the sensor data for the plurality of the sensor with predefined sensor data;
    reconfiguring the electronic device (100) in the synchronous mode in response to determining that there is a change in the activity class;
    determining, by the electronic device (100), a new activity class based on the sensor data obtained from each of the sensors; and
    configuring the electronic device (100) in the asynchronous mode based on the new activity class, wherein the asynchronous mode configures at least one candidate sensor from the plurality of sensors in a third configuration different than the first configuration and second configuration.
  4. The method as claimed in claim 1, wherein identifying the at least one activity, from the plurality of candidate activities, currently performed by user further comprises:
    determining, by the electronic device (100), a duty cycle with respect to a predetermined time based on the sensor data from at least one candidate sensor;
    sampling, by the electronic device (100), the duty cycle into a set of discrete signals based on a sampling frequency; and
    identifying, by the electronic device (100), the at least one activity, from the plurality of candidate activities based on the set of discrete signals.
  5. The method as claimed in claim 4, wherein identifying, by the electronic device (100), the at least one activity, from the plurality of candidate activities based on the set of discrete signals comprises:
    determining, by the electronic device (100), whether the discrete signal is a non-overlapping signal;
    performing by the electronic device (100) one of:
    mapping, the sampling frequency with a predefined frequency in the predefined sensor database, in response to determining that the discrete signal is the non-overlapping signal; and
    if the discrete signal is overlapping, then sampling, by the electronic device (100), the duty cycle into a set of discrete signals based on a candidate sampling frequency until the discrete signal is non-overlapping;
    identifying, by the electronic device (100), the at least one activity from the plurality of candidate activities based on at least one of the sampling frequency and the candidate sampling frequency; and
    determining, by the electronic device (100) a plurality of attributes associated with the predefined frequency from the predefined sensor database for the at least one activity.
  6. The method as claimed in claim 4, wherein the plurality of attributes comprises acceleration, speed, angle, pressure, step gap, step distance, stair height, revolution distance and terrain type.
  7. A method to identify an activity of a user, the method comprising:
    obtaining, by the electronic device (100), sensor data from a plurality of sensor associated with the electronic device (100);
    determining, by the electronic device (100), a duty cycle with respect to a predetermined time based on the sensor data from the at least one candidate sensor;
    sampling, by the electronic device (100), the duty cycle into a set of discrete signals based on a sampling frequency; and
    identifying, by the electronic device (100), at least one activity, from a plurality of activities associated with user based on the set of discrete signals.
  8. The method as claimed in claim 7, wherein identifying, by the electronic device (100), the at least one activity, from the plurality of activities based on the set of discrete signals comprises:
    determining, by the electronic device (100), whether the discrete signal is a non-overlapping signal;
    performing by the electronic device (100) one of:
    mapping, the sampling frequency with a predefined frequency in the predefined sensor database, in response to determining that the discrete signal is the non-overlapping signal; and
    if the discrete signal is overlapping, then sampling, by the electronic device (100), the duty cycle into a set of discrete signals based on a candidate sampling frequency until the discrete signal is non-overlapping;
    identifying, by the electronic device (100), the at least one activity from the plurality of candidate activities based on at least one of the sampling frequency and the candidate sampling frequency; and
    determining, by the electronic device (100) a plurality of attributes associated with the predefined frequency from the predefined sensor database for the at least one activity.
  9. The method as claimed in claim 7, wherein obtaining sensor data from the plurality of sensor further comprises:
    obtaining, by an electronic device (100), sensor data from a plurality of sensors of the electronic device (100) operating in a synchronous mode, wherein the synchronous mode configures each of the plurality of sensors in a first configuration;
    determining, by the electronic device (100), an activity class based on the sensor data obtained from each of the sensors;
    configuring the electronic device (100) in an asynchronous mode based on the activity class, wherein the asynchronous mode configures at least one candidate sensor from the plurality of sensors in a second configuration different than the first configuration; and
    obtaining sensor data from the at least one candidate sensor.
  10. The method as claimed in claim 7, wherein obtaining sensor data from the at least one candidate sensor comprises:
    determining, by the electronic device (100), a change in the activity class based on the obtained sensor data from the at least one candidate sensor;
    reconfiguring the electronic device (100) in the synchronous mode in response to determining that there is a change in the activity class;
    determining, by the electronic device (100), a new activity class based on the sensor data obtained from each of the sensors; and
    configuring the electronic device (100) in the asynchronous mode based on the new activity class, wherein the asynchronous mode configures at least one candidate sensor from the plurality of sensors in a third configuration different than the first configuration and second configuration.
  11. An electronic device (100) for identifying an activity of a user, comprising:
    a memory (110);
    a processor (120);
    an attribute determiner (130); and
    a communicator (140), wherein the processor is configured to:
    obtain sensor data from a plurality of sensors of the electronic device (100) operating in a synchronous mode, wherein the synchronous mode configures each of the plurality of sensors in a first configuration;
    determine an activity class based on the sensor data obtained from each of the sensors, wherein the activity class indicates a plurality of candidate activities associated with the user;
    configure the electronic device (100) in an asynchronous mode based on the activity class, wherein the asynchronous mode configures at least one candidate sensor from the plurality of sensors in a second configuration different than the first configuration;
    obtain sensor data from the at least one candidate sensor; and
    identify at least one activity, from the plurality of candidate activities, currently performed by user based on the sensor data obtained from the at least one candidate sensor.
  12. The electronic device (100) as claimed in claim 11, wherein determine the activity class based on the sensor data obtained from each of the sensors comprises:
    compare the sensor data for the plurality of the sensor with predefined sensor data present in a predefined sensor database; and
    determine the activity class of the sensor data based on the comparison.
  13. The electronic device (100) as claimed in claim 11 and claim 12, wherein identify the at least one activity, from the plurality of candidate activities, currently performed by user comprises:
    determine a change in the activity class based on the obtained sensor data from the at least one candidate sensor by comparing the sensor data for the plurality of the sensor with predefined sensor data;
    reconfigure the electronic device (100) in the synchronous mode in response to determining that there is a change in the activity class;
    determine a new activity class based on the sensor data obtained from each of the sensors; and
    configure the electronic device (100) in the asynchronous mode based on the new activity class, wherein the asynchronous mode configures at least one candidate sensor from the plurality of sensors in a third configuration different than the first configuration and second configuration.
  14. The electronic device (100) as claimed in claim 11, wherein identify the at least one activity, from the plurality of candidate activities, currently performed by user further comprises:
    determine a duty cycle with respect to a predetermined time based on the sensor data from at least one candidate sensor;
    sample the duty cycle into a set of discrete signals based on a sampling frequency; and
    identify the at least one activity, from the plurality of candidate activities based on the set of discrete signals.
  15. The electronic device (100) as claimed in claim 11, wherein identify the at least one activity, from the plurality of candidate activities based on the set of discrete signals comprises:
    determine whether the discrete signal is a non-overlapping signal;
    perform by the electronic device (100) one of:
    map the sampling frequency with a predefined frequency in the predefined sensor database, in response to determining that the discrete signal is the non-overlapping signal; and
    if the discrete signal is overlapping, then sample the duty cycle into a set of discrete signals based on a candidate sampling frequency until the discrete signal is non-overlapping;
    identify the at least one activity from the plurality of candidate activities based on at least one of the sampling frequency and the candidate sampling frequency; and
    determine a plurality of attributes associated with the predefined frequency from the predefined sensor database for the at least one activity.
PCT/KR2020/011478 2019-08-29 2020-08-27 Method and system for identifying an activity of a user WO2021040428A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201941034907 2019-08-29
IN201941034907 2019-08-29

Publications (1)

Publication Number Publication Date
WO2021040428A1 true WO2021040428A1 (en) 2021-03-04

Family

ID=74685587

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/011478 WO2021040428A1 (en) 2019-08-29 2020-08-27 Method and system for identifying an activity of a user

Country Status (1)

Country Link
WO (1) WO2021040428A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009502231A (en) * 2005-07-19 2009-01-29 カーディアック・ペースメーカーズ・インコーポレーテッド Monitoring physiological responses to steady-state activity
US20160074706A1 (en) * 2013-03-05 2016-03-17 Microsoft Technology Licensing, Llc Automatic exercise segmentation and recognition
US20160367855A1 (en) * 2012-01-19 2016-12-22 Nike, Inc. Action Detection and Activity Classification
US20170325740A1 (en) * 2016-05-11 2017-11-16 WiseWear Corporation Waistband monitoring analysis for a user
US20190167102A1 (en) * 2009-06-01 2019-06-06 The Curators Of The University Of Missouri Integrated Sensor Network Methods and Systems


Similar Documents

Publication Publication Date Title
WO2018217060A1 (en) Method and wearable device for performing actions using body sensor array
WO2018070718A1 (en) Output device outputting audio signal and control method thereof
WO2020105841A1 (en) Electronic device for obtaining blood pressure value using pulse wave velocity algorithm and method for obtaining blood pressure value
WO2019050277A1 (en) Method of obtaining biometric information based on wearing state and electronic device thereof
CN106705989B (en) step recording method, device and terminal
WO2016111592A1 (en) Wearable device and method for controlling the same
WO2016165665A1 (en) Motion-sensing interactive system and motion-sensing interaction method
WO2018124809A1 (en) Wearable terminal and method for operating same
WO2015192416A1 (en) Electronic device and wearable input apparatus for electronic device
WO2019156416A1 (en) Apparatus and method for tracking movement of electronic device
EP3691521A1 (en) Electronic device and method for providing stress index corresponding to activity of user
WO2017090971A1 (en) Wear system and method for providing service
WO2021040428A1 (en) Method and system for identifying an activity of a user
WO2019221438A1 (en) Electronic device for measuring blood pressure and operating method thereof
WO2017222100A1 (en) Body composition measurement device and server for compensating for body composition measurement result
WO2018044059A1 (en) Fitness monitoring system.
AU2019208949B2 (en) Apparatus and method for determining calibration timing for blood pressure measurement in electronic device
WO2021172843A1 (en) Method for transceiving information and electronic device thereof
WO2023003232A1 (en) Electronic devices comprising sensors and method for operating same
Weghorn Applying mobile phone technology for making health and rehabilitation monitoring more affordable
CN114209298A (en) PPG sensor control method and device and electronic equipment
CN113867666A (en) Information display method and device and wearable device
WO2020096311A1 (en) Electronic device and method for identifying occurrence of hypotension
WO2019143056A1 (en) Apparatus and method for determining calibration timing for blood pressure measurement in electronic device
WO2020009327A1 (en) Electronic device and method for providing personalized biometric information based on biometric signal using same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20858445

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20858445

Country of ref document: EP

Kind code of ref document: A1