WO2023119305A1 - Posture tracking - Google Patents

Posture tracking

Info

Publication number
WO2023119305A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
posture
laser beam
reflected laser
reflected
Application number
PCT/IN2022/050256
Other languages
French (fr)
Inventor
Syed Irfan AHMED
Akhil S. KUMAR
Akash MURTHY
Original Assignee
Ahmed Syed Irfan
Kumar Akhil S
Application filed by Ahmed Syed Irfan, Kumar Akhil S filed Critical Ahmed Syed Irfan
Publication of WO2023119305A1 publication Critical patent/WO2023119305A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A61B5/0022 Monitoring a patient using a global network, e.g. telephone networks, internet
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/45 For evaluating or diagnosing the musculoskeletal system or teeth
    • A61B5/4538 Evaluating a particular part of the musculoskeletal system or a particular medical condition
    • A61B5/4561 Evaluating static posture, e.g. undesirable back curvature

Definitions

  • the present subject matter described herein in general, relates to tracking a posture of a user.
  • a device for tracking posture of a user in real time is disclosed.
  • one or more sensors may be configured to project a Light Amplification by Stimulated Emission of Radiation (LASER) beam to generate a depth map.
  • the LASER beam is simultaneously and separately projected on a user’s forehead and the user’s chest.
  • a receiver may be configured to receive a first reflected LASER beam, a second reflected LASER beam, and the depth map. The first reflected LASER beam may be reflected from the user’s forehead and the second reflected LASER beam may be reflected from the user’s chest.
  • a distance sensor may be configured to calculate a distance travelled by the first reflected LASER beam, and a distance travelled by the second reflected LASER beam.
  • a transmitter may be configured to transmit the distance travelled by the first reflected LASER beam, and the distance travelled by the second reflected LASER beam to a system for tracking posture of the user in real time.
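The device-side flow in the bullets above (project the beams, receive the reflections, compute the two distances, transmit them to the system) can be sketched as follows. The function names, the time-of-flight conversion, and the payload shape are illustrative assumptions, not taken from the specification:

```python
# Hypothetical sketch of the device-side pipeline: convert the round-trip
# times of the two reflected LASER beams into distances, then bundle them
# with the depth map for transmission to the posture-tracking system.

SPEED_OF_LIGHT_M_S = 3.0e8

def measure_distances(forehead_echo_s, chest_echo_s):
    """Return one-way distances d1 (forehead) and d2 (chest) in metres
    from the round-trip echo times of the two beams."""
    d1 = forehead_echo_s * SPEED_OF_LIGHT_M_S / 2
    d2 = chest_echo_s * SPEED_OF_LIGHT_M_S / 2
    return d1, d2

def build_payload(d1, d2, depth_map):
    """Bundle d1, d2, and the depth map as the transmitter would send
    them to the system (field names are assumptions)."""
    return {"d1_m": d1, "d2_m": d2, "depth_map": depth_map}
```

A 4 ns echo, for instance, corresponds to a one-way distance of 0.6 m, a plausible screen-to-forehead range.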
  • a system for tracking posture of a user in real time is disclosed. Initially, a distance travelled by a first reflected LASER beam, a distance travelled by a second reflected LASER beam, and a depth map may be received from a device.
  • the first reflected LASER beam may be understood to be reflected from a user’s forehead and the second reflected LASER beam may be reflected from the user’s chest.
  • an angular displacement of the user’s neck may be computed. It may be noted that the angular displacement is computed based on the distance travelled by the first reflected LASER beam and the distance travelled by the second reflected LASER beam.
  • an angle of flexion between a tragus of the user’s ear and a seventh cervical vertebra may be measured.
  • a strain on the user’s neck may be determined in real time based on the angle of flexion, the angular displacement, and historic data.
  • a posture of the user may be detected as an acceptable posture or as an unacceptable posture.
  • the posture may be detected as the acceptable posture when the strain is within or less than a threshold of the strain.
  • the posture may be detected as the unacceptable posture when the strain is more than the threshold of the strain.
  • a change in the posture of the user may be recommended upon detecting the unacceptable posture.
  • the user may be alerted if the unacceptable posture is maintained for more than a predetermined interval of time. It may be understood that the user may be alerted as the unacceptable posture is not changed after recommending the change in posture for the predetermined interval of time.
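The decision flow in the bullets above (classify the posture against a strain threshold, then recommend or escalate to an alert) can be sketched as follows. The function names and the numeric threshold default are illustrative assumptions; the specification elsewhere gives a shift threshold of -2 cm to 2 cm:

```python
# Minimal sketch of the system-side decision flow described above.

def detect_posture(strain, threshold=2.0):
    """Acceptable when the strain is within or less than the threshold,
    unacceptable when it is more than the threshold."""
    return "acceptable" if abs(strain) <= threshold else "unacceptable"

def respond(posture, held_longer_than_interval):
    """Recommend a change on an unacceptable posture; alert if the
    unacceptable posture has persisted past the predetermined interval."""
    if posture == "acceptable":
        return None
    return "alert" if held_longer_than_interval else "recommend"
```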
  • a method for tracking a posture of a user in real time using a device is disclosed.
  • a Light Amplification by Stimulated Emission of Radiation (LASER) beam may be projected by one or more sensors to generate a depth map.
  • the LASER beam may be projected simultaneously and separately on the user’s forehead, and the user’s chest.
  • a first reflected LASER beam, a second reflected LASER beam, and the depth map may be received by a receiver.
  • the first reflected LASER beam may be understood to be reflected from the user’s forehead and the second reflected LASER beam may be understood to be reflected from the user’s chest.
  • a distance travelled by the first reflected LASER beam and a distance travelled by the second reflected LASER beam may be calculated by a distance sensor.
  • the distance travelled by the first reflected LASER beam and the distance travelled by the second reflected LASER beam may be transmitted by a transmitter to a system.
  • the aforementioned method for tracking posture of a user in real time may be performed by a processor using programmed instructions stored in a memory.
  • a method for tracking posture of a user in real time is disclosed. Initially, a distance travelled by a first reflected LASER beam, a distance travelled by a second reflected LASER beam, and a depth map may be received from a device.
  • the first reflected LASER beam may be understood to be reflected from a user’s forehead
  • the second reflected LASER beam may be understood to be reflected from the user’s chest.
  • an angular displacement of the user’s neck may be computed. The angular displacement may be computed based on the distance travelled by the first reflected LASER beam, and the distance travelled by the second reflected LASER beam.
  • an angle of flexion may be measured between a tragus of the user’s ear and a seventh cervical vertebra of the user.
  • a strain on the user’s neck may be determined in real time based on the angle of flexion, the angular displacement, and historic data.
  • a posture of the user may be detected as an acceptable posture, or as an unacceptable posture.
  • the posture may be detected as the acceptable posture when the strain is within or less than a threshold of the strain and the posture may be detected as the unacceptable posture when the strain is more than the threshold of the strain.
  • a change in the posture of the user may be recommended upon detecting the unacceptable posture.
  • the user may be alerted if the unacceptable posture is maintained for more than a predetermined interval of time. It may be understood that the user is alerted as the unacceptable posture is not changed after recommending the change in the posture for the predetermined interval of time.
  • Figure 1 illustrates a network implementation for tracking posture of a user in real time, in accordance with an embodiment of the present subject matter.
  • Figure 2 illustrates an exemplary representation for tracking posture of a user in real time, in accordance with an embodiment of the present subject matter.
  • Figure 3 illustrates an exemplary representation for tracking posture of a user in real time, in accordance with an embodiment of the present subject matter.
  • Figure 4a and Figure 4b illustrate a device for tracking posture of a user in real time, in accordance with an embodiment of the present subject matter.
  • Figure 5 illustrates an exemplary representation for tracking posture of a user in real time, in accordance with an embodiment of the present subject matter.
  • Figure 6 illustrates an exemplary representation for tracking posture of a user in real time, in accordance with an embodiment of the present subject matter.
  • Figure 7 illustrates a method for tracking posture of a user in real time using a device. The method is illustrated in the form of a flow chart, in accordance with an embodiment of the present subject matter.
  • Figure 8 illustrates a method for tracking posture of a user in real time. The method is illustrated in the form of a flow chart, in accordance with an embodiment of the present subject matter.
  • the present subject matter discloses a method and a system for tracking posture of a user in real time.
  • a device for tracking the posture is disclosed in the present invention.
  • the device comprises one or more sensors, a receiver, a distance sensor, and a transmitter installed in the device.
  • the sensor in the device is configured to project a Light Amplification by Stimulated Emission of Radiation (LASER) beam simultaneously and separately on the user’s forehead and the user’s chest to generate a depth map.
  • the receiver is configured to receive the depth map, a first reflected LASER beam, and a second reflected LASER beam from the user’s forehead and the user’s chest respectively.
  • the distance sensor is configured to calculate a distance travelled by the first reflected LASER beam and a distance travelled by the second reflected LASER beam. Further, the transmitter is configured to transmit the distance travelled by the first reflected LASER beam and the distance travelled by the second reflected LASER beam to a system for tracking the posture of the user in real time.
  • the device for tracking posture of the user in real time transmits the distance travelled by the first reflected LASER beam, the distance travelled by the second reflected LASER beam, and the depth map to a system.
  • the system may receive the depth map, the distance travelled by the first reflected LASER beam, and the distance travelled by the second reflected LASER beam from the device and compute an angular displacement of the user’s neck. Further, the system measures an angle of flexion between a tragus of the user’s ear and a seventh cervical vertebra of the user, determines a strain on the user’s neck, and detects a posture of the user as an acceptable posture, or as an unacceptable posture. Upon detecting the unacceptable posture, the system may recommend a change in the posture to the user and alert the user if the posture is not changed in a predetermined interval of time.
  • a network implementation 100 of a device 102 for tracking posture of a user in real time is disclosed.
  • the device 102 sends information to a system 114 upon calculating a distance travelled by a first reflected LASER beam and a distance travelled by a second reflected LASER beam.
  • the user uses one or more display devices 104-1, 104-2, ... 104-N, collectively referred to as display devices 104, hereinafter, or applications residing on the display devices 104 for receiving notifications related to the tracking in real time from the system 114.
  • the system 114 may send a recommendation for change in posture, followed by an alert upon detection of an unacceptable posture of the user in real time.
  • the device 102 for tracking posture of the user in real time may be accessed by multiple users through one or more display devices 104-1, 104-2... 104-N.
  • the device 102 may be communicatively connected to a cloud-based computing environment.
  • the display devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, a mobile device, a tablet, and a workstation.
  • the display devices 104 are communicatively coupled to the device 102 through a network 106.
  • the system 114 may be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a virtual environment, a mainframe computer, a server, a network server, the cloud-based computing environment. It will be understood that the system 114 may be accessed by multiple users through one or more display devices 104-1, 104-2... 104-N. In one implementation, the system 114 may comprise the cloud-based computing environment in which the user may operate individual computing systems configured to execute remotely located applications.
  • Examples of the display devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, a mobile device, a tablet, and a workstation.
  • the display devices 104 are communicatively coupled to the system 114 for tracking the posture of the user in real time through a network 106.
  • the network 106 may be a wireless network, a wired network, or a combination thereof.
  • the network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and the like.
  • the network 106 may either be a dedicated network or a shared network.
  • the shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another.
  • the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
  • the device 102 for tracking posture of the user in real time may include at least one processor 108-A, an input/output (I/O) interface 110-A, and a memory 112-A.
  • the system 114 for tracking posture of the user in real time may also include at least one processor 108-B, an input/output (I/O) interface 110-B, and a memory 112-B.
  • at least one processor 108-A, 108-B may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, Central Processing Units (CPUs), state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions.
  • the at least one processor 108-A, 108-B is configured to fetch and execute computer-readable instructions stored in the memory 112-A, 112-B.
  • the I/O interface 110-A, 110-B may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like.
  • the I/O interface 110-A may allow the device 102 to interact with the user directly or through the display devices 104.
  • the I/O interface 110-B may allow the system 114 to interact with the user directly or through the display devices 104.
  • the I/O interface 110 may enable the device 102 and the system 114 to communicate with other computing devices, such as web servers and external data servers (not shown).
  • the I/O interface 110-A, 110-B can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite.
  • the I/O interface 110-A, 110-B may include one or more ports for connecting a number of devices to one another or to another server.
  • the memory 112-A, 112-B may include any computer-readable medium or computer program product known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, Solid State Disks (SSD), optical disks, and magnetic tapes.
  • the memory 112-A, 112-B may include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types.
  • the memory 112-A, 112-B may include programs or coded instructions that supplement applications and functions of the device 102 and the system 114.
  • a user may use the display device 104 to access the device 102 via the I/O interface 110-A.
  • the user may register the display devices 104 using the I/O interface 110-A in order to use the device 102.
  • the user may access the I/O interface 110-A of the device 102.
  • the system 114 may also be used for tracking the posture of the user in real time.
  • a user may use the display device 104 to access the system 114 via the I/O interface 110-B.
  • the user may register the display devices 104 using the I/O interface 110-B in order to use the system 114.
  • the user may access the I/O interface 110-B of the system 114.
  • the detailed functioning of the device 102 and the system 114 is described below with the help of figures.
  • the present subject matter discloses a system and a method for tracking posture of a user in real time.
  • a user may be understood to be using a display device 104.
  • the user may be a student.
  • the user may be an employee using the display device such as a desktop or a laptop to work.
  • the user may be a person using the display device for entertainment and recreational purposes.
  • the user may be provided with a device 102 for tracking the posture of the user in real time.
  • the device 102 may be mounted on the display device 104.
  • the device may comprise at least a sensor 116, a receiver 118, a distance sensor 120, and a transmitter 122.
  • the device 102 may also comprise a housing 124, a LASER opening 128, a hook 130, and a USB port 132.
  • the housing 124 may be understood as a structure forming the outermost covering of the device 102.
  • the housing 124 may be made of a non-conductive, temperature-stable and non-reactive material. In one example, the housing 124 may be manufactured from a high-quality plastic material.
  • the LASER opening 128 may be provided as a point for projecting the LASER beam from the device 102. The number of LASER openings may be changed as per the requirement and purpose of the device 102.
  • the device 102 may comprise a pair of LASER openings 128.
  • the device 102 may be mounted on to a screen of the display device 104 with the hook 130.
  • the hook 130 may be understood to provide anchoring support to the device 102 while it is mounted on the screen of the display device 104.
  • the hook 130 may also be made of a non-conductive, temperature-stable and non-reactive material. In one example, the hook 130 may be manufactured from a high-quality plastic material.
  • the device 102 may also comprise the USB port 132.
  • the USB port 132 may be used to connect the device 102 to the display device 104 through a cable.
  • the USB port 132 may also be used to transfer data from the device 102 to the display device 104.
  • the USB port 132 may also serve as a charging point for the device 102.
  • one or more of the sensor 116 may be configured to project a Light Amplification by Stimulated Emission of Radiation (LASER) beam.
  • the LASER beam may be simultaneously and separately projected on the user at two or more points.
  • the LASER beam may be projected at the user’s head and the LASER beam projected at the user’s head may be referred to as Beam 1.
  • the LASER beam may be projected at the user’s chest and the LASER beam projected at the user’s chest may be referred to as Beam 2.
  • the LASER beam may be projected by multiple sensors 116.
  • a total number of the sensor 116 used may be four.
  • a total number of the sensor 116 used may be six. It may be understood that with increase in the number of sensors 116 used in the device 102, an accuracy of the depth map being generated by the sensors 116 would increase.
  • the depth map may be understood as a representation of an image or an image channel that contains information regarding distance of a point or object of interest from a viewpoint.
  • the depth map may be referred to as a depth buffer.
  • the depth map may be referred to as a Z-buffer.
  • the depth map may be referred to as a Z-depth. It may be understood that the depth map may be generated from one or more LASER beam projected by one or more of the sensors 116.
  • the depth map may be a 2-dimensional map creating a distance representation from the viewpoint.
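As a toy illustration of the depth-map idea described above, a 2-dimensional map can be held as a grid of distances from the viewpoint. All values and the grid layout below are invented for illustration, not taken from the specification:

```python
# Toy 2-dimensional depth map: each cell stores the distance (in metres)
# from the viewpoint to the surface seen along that pixel's ray.
depth_map = [
    [0.62, 0.61, 0.60],  # rows nearer the top of the frame (forehead region)
    [0.80, 0.79, 0.81],  # rows nearer the bottom of the frame (chest region)
]

# The nearest surface in the scene is simply the smallest stored distance,
# which is how a Z-buffer resolves what is in front.
nearest_m = min(min(row) for row in depth_map)
```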
  • the sensor 116 used in the device 102 may be an optic sensor, a motor sensor, a proximity sensor, a light sensor, a time-of-flight sensor, or an ultrasonic sensor.
  • the sensor 116 may include a reflective, a thru-beam, and a retro-reflective LASER sensor.
  • the device 102 may comprise two sets of Class 1 Eye-Safe LASER sensors.
  • the Class 1 Eye-safe LASER may be understood to function as Time of Flight (TOF) sensors.
  • the Class 1 Eye-safe LASER sensors may be replaced with a myriad of solutions, such as a multi-focal LASER projector of the kind typically used in face-identifying LASER sensors.
  • the receiver 118 of the device 102 may be configured to receive a first reflected LASER beam and a second reflected LASER beam. It may be understood that the first reflected LASER beam is reflected from the user’s forehead after projection of the Beam 1, and the second reflected LASER beam is reflected from the user’s chest, after projection of the Beam 2.
  • the distance sensor 120 of the device 102 may be configured to calculate a distance travelled by the first reflected LASER beam, referred to as ‘d1’, and a distance travelled by the second reflected LASER beam, referred to as ‘d2’.
  • the transmitter 122 may be configured to transmit the distance travelled by the first reflected LASER beam, i.e. d1, and the distance travelled by the second reflected LASER beam, i.e. d2. During operation, the transmitter 122 may transmit d1 and d2 to a system 114.
  • the system 114 may receive the distance travelled by the first reflected LASER beam (d1) and the distance travelled by the second reflected LASER beam (d2).
  • the system 114 may compute an angular displacement (θ) between d1 and d2. In one example, the system 114 may use an algorithm to compute the angular displacement (θ). In another example, the algorithm may extrapolate d1 and d2 to form a right-angled triangle as illustrated in Figure 2.
  • ‘S’ Shift, i.e. the difference between d1 (the distance travelled by the first reflected LASER beam projected at the user’s forehead) and d2 (the distance travelled by the second reflected LASER beam projected at the user’s chest);
  • ‘θ’ the angular displacement, i.e. the angle formed between d1 and d2;
  • ‘116-1’ a point on the device 102 from where the LASER beam is projected at the user’s forehead;
  • the algorithm may compute the shift (S) using the following Equation 1: S = d2 − d1 … (Equation 1)
  • Shift calculated by the algorithm of the system 114 may also be regarded as the linear distance between the user’s forehead and the user’s chest. Further, statistically the value of Shift (S) would be close to zero for an ideal posture of the user without any deviation.
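The shift and the angular displacement described above can be sketched as follows. The shift follows the definition of S as the difference between d1 and d2; the `atan2` relation and the beam-separation parameter are assumptions for illustration, since the text describes the geometry only as a right-angled triangle extrapolated from d1 and d2:

```python
import math

def shift_cm(d1_cm, d2_cm):
    # S is defined in the text as the difference between d1 (forehead
    # distance) and d2 (chest distance); it is near zero for an ideal posture.
    return d2_cm - d1_cm

def angular_displacement_deg(s_cm, beam_separation_cm=20.0):
    # Assumed right-triangle relation: theta recovered from the shift S and
    # the vertical separation between the two beam origins.
    # beam_separation_cm is an illustrative value, not from the text.
    return math.degrees(math.atan2(s_cm, beam_separation_cm))
```

With equal distances (no slouch) the shift is zero and so is the angle; a 20 cm forward shift against a 20 cm beam separation would give a 45° displacement.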
  • the system 114 may measure an angle of flexion (D).
  • the angle of flexion (D) may be understood as an angle between the tragus of the user’s ear and the seventh cervical vertebra of the user. It may be understood that the angle of flexion (D) may provide an additional angular displacement of the user’s head from a mean resting position of the user’s head.
  • the representation in Figure 3 may be used to measure the angle of flexion (D), wherein:
  • ‘S’ Shift is the difference between d1 (the distance travelled by the first reflected LASER beam projected at the user’s forehead) and d2 (the distance travelled by the second reflected LASER beam projected at the user’s chest); S is also the linear distance between the user’s forehead and the user’s chest; and
  • ‘P’ the average length of the user’s neck, generally known to be approximately 11 centimetres (cm).
  • the algorithm of the system 114 may follow Equation 2, as given below, to determine the angle of flexion (D).
  • the system 114 may then use the algorithm to determine a strain on the user’s neck in real time based on the angle of flexion (D), the angular displacement (θ), and historic data stored in the system 114.
  • the historic data may be understood to comprise a demographic data of a plurality of users, a medical history of the plurality of the users, and a posture data of the users.
  • the historic data would include the angle of flexion (D) for a user suffering from a head slouch condition, and a crooked spine condition.
  • the historic data may also include neck length of the plurality of users, which may be used to calculate an average neck length of the users.
  • the average neck length may be identified as 11 cm and an average head weight of the user may be identified as 5 kilograms (kg). It may be understood that the strain may be experienced at the base of the user’s neck.
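Since the text of Equation 2 is not reproduced here, the sketch below uses an assumed reconstruction: treating the neck (average length P ≈ 11 cm) as the hypotenuse of the right triangle and the shift S as the opposite side gives D = asin(S / P). The strain heuristic combining D with the 5 kg average head weight is likewise an illustrative assumption, not the specification's formula:

```python
import math

AVG_NECK_LENGTH_CM = 11.0  # average neck length noted in the text
AVG_HEAD_WEIGHT_KG = 5.0   # average head weight noted in the text

def angle_of_flexion_deg(s_cm, neck_length_cm=AVG_NECK_LENGTH_CM):
    # Assumed reconstruction of Equation 2: D = asin(S / P),
    # with the ratio clamped so extreme shifts stay in asin's domain.
    ratio = max(-1.0, min(1.0, s_cm / neck_length_cm))
    return math.degrees(math.asin(ratio))

def effective_head_load_kg(flexion_deg, head_kg=AVG_HEAD_WEIGHT_KG):
    # Illustrative strain heuristic: the load borne at the base of the neck
    # grows as the head tips forward from its mean resting position.
    return head_kg / math.cos(math.radians(min(flexion_deg, 89.0)))
```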
  • the system 114 may detect a posture of the user as an acceptable posture, or as an unacceptable posture. It may be understood that the posture may be detected as the acceptable posture when the strain on the user’s neck is within or less than a threshold of the strain. In one example, the threshold of the strain is between −2 cm and 2 cm.
  • the system 114 may recommend a change in the posture of the user upon detecting the unacceptable posture.
  • the system 114 may recommend the user on the display device 104, wherein the display device 104 may be a mobile, or a laptop device.
  • the recommendation may be sent in real time as a notification to the user on the display device 104.
  • the system 114 may alert the user if the unacceptable posture is maintained for more than a predetermined interval of time. It may be noted that the alert may be sent to the display device 104 of the user in a situation when, even after recommending a change in posture, the unacceptable posture continues for a time more than the predetermined interval of time. In one example, the predetermined interval of time given to the user for changing the unacceptable posture to the acceptable posture may be 15 minutes. Upon completion of the 15 minutes, the system 114 may send an alert to the user on the display device 104. The alert may be sent in the form of a notification on the display device 104 connected through the network 106.
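The recommend-then-alert behaviour described above can be sketched as a small state machine; the class name, method shape, and the use of a caller-supplied timestamp are illustrative assumptions:

```python
class PostureMonitor:
    """Sketch of the recommend-then-alert flow: on an unacceptable posture,
    recommend a change; if it persists past the predetermined interval
    (15 minutes in the example above), escalate to an alert."""

    def __init__(self, interval_s=15 * 60):
        self.interval_s = interval_s
        self.bad_since = None  # timestamp when the unacceptable run began

    def update(self, posture, now_s):
        """Feed one posture reading with its timestamp (seconds);
        returns None, "recommend", or "alert"."""
        if posture == "acceptable":
            self.bad_since = None
            return None
        if self.bad_since is None:
            self.bad_since = now_s
            return "recommend"
        if now_s - self.bad_since >= self.interval_s:
            return "alert"
        return "recommend"
```

In practice `now_s` would come from a monotonic clock, and the returned action would be pushed to the display device 104 as a notification.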
  • Alpha may be a full-time employee working for a company XYZ. It may be understood that, as a part of his job, user Alpha is required to spend a minimum of 8 hours working on his display device 104, which is a laptop in this case. While working on the display device 104, a posture of the user Alpha may, multiple times, not be acceptable and may exceed the threshold of strain allowed for the user’s neck.
  • the company XYZ may decide to install a device 102 to track posture of the user Alpha in real time. Therefore, Alpha may mount the device 102 on the display device 104 and continue working.
  • the device 102 and a system 114 may continuously monitor the posture of the user Alpha in real time and recommend a change in posture at every instance of detecting an unacceptable posture of the user Alpha. Further, the system 114 may wait for a predetermined period of 15 minutes for the user Alpha to change the unacceptable posture to an acceptable posture.
  • the device 102 and the system 114 may perform a plurality of steps as explained in earlier paragraphs for tracking the posture of the user Alpha in real time daily. This may assist the user Alpha to maintain good posture throughout the workday and prevent possibility of upper body postural disorders for him.
  • Figure 4a may be understood to be a front view of a prototype for the device 102 for tracking posture of a user in real time.
  • the device 102 may comprise a housing 124, a LASER opening 128 and a sensor 116.
  • Figure 4b may be understood to be a side view of the prototype for the device 102 for tracking the posture of the user in real time.
  • the device 102 may comprise a housing 124, a hook 130, and a USB port 132.
  • referring to Figure 5, an exemplary representation for tracking posture of a user in real time, in accordance with an embodiment of the present subject matter, is shown.
  • This may be understood as an example of an acceptable posture of the user where, Laser 1 and Laser 2 are the beams projected by a device 102 on the user’s forehead and the user’s chest respectively.
  • the shift of the user as shown in Figure 5 may be within a threshold of −2 cm to 2 cm. Therefore, the posture is an example of an acceptable posture of the user.
  • referring to Figure 6, an exemplary representation for tracking posture of a user in real time, in accordance with an embodiment of the present subject matter, is shown.
  • This may be understood as an example of an unacceptable posture of the user where, Laser 1 and Laser 2 are the beams projected by a device 102 on the user’s forehead and the user’s chest respectively.
  • the shift of the user as shown in Figure 6 may be more than 3 cm, which is outside a threshold of −2 cm to 2 cm. Therefore, the posture is an example of an unacceptable posture of the user.
  • the system 114 may recommend the user to change the posture and alert the user if the unacceptable posture is maintained for a predetermined interval of time.
  • the system 114 may recommend and alert the user in form of a notification sent on a display device 104.
  • the display device 104 may be a mobile or a laptop of the user.
  • referring to Figure 7, a method 700 for tracking posture of a user in real time using a device is shown, in accordance with an embodiment of the present subject matter.
  • the method 700 may be described in the general context of computer executable instructions.
  • computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types.
  • the order in which the method 700 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 700 or alternate methods for tracking posture of the user in real time. Additionally, individual blocks may be deleted from the method 700 without departing from the scope of the subject matter described herein. Furthermore, the method 700 for tracking posture of the user in real time can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below the method 700 may be considered to be implemented in the above-described device 102.
  • a Light Amplification by Stimulated Emission of Radiation (LASER) beam may be projected by one or more sensors 116.
  • the LASER beam may be projected simultaneously and separately on the user’s forehead and the user’s chest. It may be understood that the one or more sensors 116 project the LASER beams to generate a depth map.
  • a first reflected LASER beam, a second reflected LASER beam, and the depth map may be received by a receiver 118. It may be noted that the first reflected LASER beam is reflected from the user’s forehead, and the second reflected LASER beam is reflected from the user’s chest.
  • a distance travelled by the first reflected LASER beam, and a distance travelled by the second reflected LASER beam may be calculated by a distance sensor.
  • the distance travelled by the first reflected LASER beam, and the distance travelled by the second reflected LASER beam may be transmitted by a transmitter to a system 114.
  • Referring to Figure 8, a method 800 for tracking the posture of a user in real time using the system 114 is shown, in accordance with an embodiment of the present subject matter.
  • the method 800 may be described in the general context of computer executable instructions.
  • computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types.
  • the order in which the method 800 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 800 or alternate methods for tracking posture of the user in real time. Additionally, individual blocks may be deleted from the method 800 without departing from the scope of the subject matter described herein. Furthermore, the method 800 for tracking posture in real time can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below the method 800 may be considered to be implemented in the above-described system 114.
  • a first reflected LASER beam, a second reflected LASER beam, and a depth map may be received in real time from a device 102.
  • the first reflected LASER beam may be understood to be reflected from a user’s forehead and the second reflected LASER beam may be reflected from the user’s chest.
  • an angular displacement of the user’s neck may be computed.
  • the angular displacement may be computed based on the distance travelled by the first reflected LASER beam, and the distance travelled by the second reflected LASER beam.
  • an angle of flexion between a tragus of the user’s ear and a seventh cervical vertebra of the user may be measured.
  • a strain on the user’s neck may be determined in real time.
  • the strain may be based on the angle of flexion, the angular displacement, and historic data.
  • a posture of the user may be detected as an acceptable posture, or as an unacceptable posture.
  • the posture may be detected as an acceptable posture when the strain is within or less than a threshold of the strain, and the posture may be detected as an unacceptable posture when the strain is more than the threshold of the strain.
  • a change in the posture of the user may be recommended upon detecting the unacceptable posture.
  • the user may be alerted if the unacceptable posture is maintained for more than a predetermined interval of time. It may be understood that the user may be alerted as the unacceptable posture is not changed after recommending the change in the posture for the predetermined interval of time.
  • the system for tracking posture of a user in real time may help in prevention of upper body disorders related to the neck and spine for the user by correcting the user’s posture in real time.
  • the system may assist in treatment of an upper body disorder due to improper posture of the user by alerting the user in real time of any unacceptable posture.
  • the system may help in tracking posture of the user remotely and without the requirement of a camera-based monitoring system.
  • the system may provide a posture tracking alternative for users who do not prefer wearing a posture tracking device on the body.
  • the system may inculcate a habit of maintaining a healthy posture in the user by tracking, recommending changes, and alerting the user of any unacceptable postures in real time.
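The recommend-then-alert behaviour described in the bullets above can be sketched in a few lines of code. This is an illustrative sketch only: the threshold value, the alert interval, and the function names are assumptions for illustration, not values fixed by the present disclosure.

```python
STRAIN_THRESHOLD = 1.0   # assumed threshold of the strain (normalized units)
ALERT_INTERVAL_S = 30.0  # assumed "predetermined interval of time", in seconds

def classify_posture(strain, threshold=STRAIN_THRESHOLD):
    """Acceptable when the strain is within or less than the threshold."""
    return "acceptable" if strain <= threshold else "unacceptable"

def track(strain_samples, interval_s=ALERT_INTERVAL_S, sample_period_s=1.0):
    """Recommend a change on the first unacceptable reading; alert if the
    unacceptable posture is maintained past the predetermined interval."""
    events, unacceptable_since = [], None
    for i, strain in enumerate(strain_samples):
        now = i * sample_period_s
        if classify_posture(strain) == "unacceptable":
            if unacceptable_since is None:
                unacceptable_since = now
                events.append((now, "recommend"))
            elif now - unacceptable_since >= interval_s:
                events.append((now, "alert"))
                unacceptable_since = now  # re-arm for a further alert
        else:
            unacceptable_since = None  # posture corrected, reset the timer
    return events
```

For example, one acceptable reading followed by a sustained unacceptable posture produces a recommendation immediately and an alert once the interval elapses; correcting the posture before the interval resets the timer.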


Abstract

A method (700) for tracking posture using a device (102) comprises projecting a LASER beam, receiving a first reflected LASER beam, a second reflected LASER beam and a depth map. Further, the method (700) comprises calculating a distance travelled by the first reflected LASER beam and the second reflected LASER beam. Finally, the method (700) comprises transmitting the distance travelled by the first reflected LASER beam, the second reflected LASER beam and the depth map to a system (114). A method (800) for tracking posture comprises receiving the distance from the device (102), computing an angular displacement of the user's neck, measuring an angle of flexion, determining a strain on the user's neck, and detecting a posture of the user. Further, the method (800) comprises recommending a change in the posture of the user upon detection of an unacceptable posture and alerting the user.

Description

POSTURE TRACKING
PRIORITY INFORMATION
[001] The present application claims priority from the Indian patent application no. 202141060052 filed on 22nd December, 2021.
TECHNICAL FIELD
[002] The present subject matter described herein, in general, relates to tracking a posture of a user.
BACKGROUND
[003] The advent of digital technology and display devices such as computers, laptops, and televisions has led to several advancements in the corporate world and work culture. In recent times, the increased use of remote desktops and internet services has led to an increase in remote working opportunities. Working in front of display devices like laptops is challenging, as the working hours of the corporate work culture are long. Along with work efficiency, long working hours also pose a threat to the health of a worker using laptops for most of the working day. Several factors of this sedentary work style may affect the worker’s physical, mental, and metabolic health. Moreover, in recent pandemic times, the work-from-home culture increased manifold, which also increased screen time for the working as well as the non-working population. Excess use of display devices has aggravated health risks among all users of such devices. Therefore, a sustainable approach for use of display devices in a safe and secure manner is imperative.
SUMMARY
[004] Before the present system(s) and method(s), are described, it is to be understood that this application is not limited to the particular systems, and methodologies described, as there can be multiple possible embodiments which are not expressly illustrated in the present disclosure. It is also to be understood that the terminology used in the description is for the purpose of describing the particular versions or embodiments only and is not intended to limit the scope of the present application. This summary is provided to introduce concepts related to systems and methods for tracking posture of a user in real time and the concepts are further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[005] In one implementation, a device for tracking posture of a user in real time is disclosed. Initially, one or more sensors may be configured to project a Light Amplification by Stimulated Emission of Radiation (LASER) beam to generate a depth map. It may be noted that the LASER beam is simultaneously and separately projected on a user’s forehead and the user’s chest. Further, a receiver may be configured to receive a first reflected LASER beam, a second reflected LASER beam, and the depth map. The first reflected LASER beam may be reflected from the user’s forehead and the second reflected LASER beam may be reflected from the user’s chest. Furthermore, a distance sensor may be configured to calculate a distance travelled by the first reflected LASER beam, and a distance travelled by the second reflected LASER beam. Finally, a transmitter may be configured to transmit the distance travelled by the first reflected LASER beam, and the distance travelled by the second reflected LASER beam to a system for tracking posture of the user in real time.
[006] In another implementation, a system for tracking posture of a user in real time is disclosed. Initially, a distance travelled by a first reflected LASER beam, a distance travelled by a second reflected LASER beam, and a depth map may be received from a device. The first reflected LASER beam may be understood to be reflected from a user’s forehead and the second reflected LASER beam may be reflected from the user’s chest. Further, an angular displacement of the user’s neck may be computed. It may be noted that the angular displacement is computed based on the distance travelled by the first reflected LASER beam and the distance travelled by the second reflected LASER beam. Furthermore, an angle of flexion between a tragus of the user’s ear and a seventh cervical vertebra may be measured. Subsequently, a strain on the user’s neck may be determined in real time based on the angle of flexion, the angular displacement, and historic data. Further, a posture of the user may be detected as an acceptable posture or as an unacceptable posture. The posture may be detected as the acceptable posture when the strain is within or less than a threshold of the strain. The posture may be detected as the unacceptable posture when the strain is more than the threshold of the strain. Furthermore, a change in the posture of the user may be recommended upon detecting the unacceptable posture. Finally, the user may be alerted if the unacceptable posture is maintained for more than a predetermined interval of time. It may be understood that the user may be alerted as the unacceptable posture is not changed after recommending the change in posture for the predetermined interval of time.
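The disclosure does not specify a formula for combining the angle of flexion, the angular displacement, and the historic data into a strain value. Purely as a hypothetical placeholder, a weighted deviation from a historic baseline might be computed as follows; the weights, the function name, and the structure of the history are assumptions, not part of the disclosed method.

```python
def neck_strain(flexion_deg, displacement_deg, history,
                w_flexion=0.6, w_displacement=0.4):
    """Hypothetical strain score: weighted absolute deviation of the
    current angles from the user's historic baseline, taken here as the
    mean of past (flexion, displacement) readings; empty history is
    treated as a zero baseline."""
    if history:
        base_flex = sum(f for f, _ in history) / len(history)
        base_disp = sum(d for _, d in history) / len(history)
    else:
        base_flex = base_disp = 0.0
    return (w_flexion * abs(flexion_deg - base_flex)
            + w_displacement * abs(displacement_deg - base_disp))
```

A score like this would then be compared against the threshold of the strain to classify the posture as acceptable or unacceptable.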
[007] In yet another implementation, a method for tracking a posture of a user in real time using a device is disclosed. Initially, a Light Amplification by Stimulated Emission of Radiation (LASER) beam may be projected by one or more sensors to generate a depth map. The LASER beam may be projected simultaneously and separately on the user’s forehead, and the user’s chest. Further, a first reflected LASER beam, a second reflected LASER beam, and the depth map may be received by a receiver. The first reflected LASER beam may be understood to be reflected from the user’s forehead and the second reflected LASER beam may be understood to be reflected from the user’s chest. Furthermore, a distance travelled by the first reflected LASER beam and a distance travelled by the second reflected LASER beam may be calculated by a distance sensor. Finally, the distance travelled by the first reflected LASER beam and the distance travelled by the second reflected LASER beam may be transmitted by a transmitter to a system. In one implementation, the aforementioned method for tracking posture of a user in real time may be performed by a processor using programmed instructions stored in a memory.
[008] In yet another implementation, a method for tracking posture of a user in real time is disclosed. Initially, a distance travelled by a first reflected LASER beam, a distance travelled by a second reflected LASER beam, and a depth map may be received from a device. The first reflected LASER beam may be understood to be reflected from a user’s forehead, and the second reflected LASER beam may be understood to be reflected from the user’s chest. Further, an angular displacement of the user’s neck may be computed. The angular displacement may be computed based on the distance travelled by the first reflected LASER beam, and the distance travelled by the second reflected LASER beam. Furthermore, an angle of flexion may be measured between a tragus of the user’s ear and a seventh cervical vertebra of the user. Subsequently, a strain on the user’s neck may be determined in real time based on the angle of flexion, the angular displacement, and historic data. Further, a posture of the user may be detected as an acceptable posture, or as an unacceptable posture. The posture may be detected as the acceptable posture when the strain is within or less than a threshold of the strain and the posture may be detected as the unacceptable posture when the strain is more than the threshold of the strain. Furthermore, a change in the posture of the user may be recommended upon detecting the unacceptable posture. Finally, the user may be alerted if the unacceptable posture is maintained for more than a predetermined interval of time. It may be understood that the user is alerted as the unacceptable posture is not changed after recommending the change in the posture for the predetermined interval of time.
BRIEF DESCRIPTION OF THE DRAWINGS
[009] The foregoing detailed description of embodiments is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present subject matter, an example of a construction of the present subject matter is provided as figures; however, the invention is not limited to the specific method and system for tracking posture of a user in real time as disclosed in the document and the figures.
[010] The present subject matter is described in detail with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to various features of the present subject matter.
[011] Figure 1 illustrates a network implementation for tracking posture of a user in real time, in accordance with an embodiment of the present subject matter.
[012] Figure 2 illustrates an exemplary representation for tracking posture of a user in real time, in accordance with an embodiment of the present subject matter.
[013] Figure 3 illustrates an exemplary representation for tracking posture of a user in real time, in accordance with an embodiment of the present subject matter.
[014] Figure 4a and Figure 4b illustrate a device for tracking posture of a user in real time, in accordance with an embodiment of the present subject matter.
[015] Figure 5 illustrates an exemplary representation for tracking posture of a user in real time, in accordance with an embodiment of the present subject matter.
[016] Figure 6 illustrates an exemplary representation for tracking posture of a user in real time, in accordance with an embodiment of the present subject matter.
[017] Figure 7 illustrates a method for tracking posture of a user in real time using a device. The method is illustrated in the form of a flow chart, in accordance with an embodiment of the present subject matter.
[018] Figure 8 illustrates a method for tracking posture of a user in real time. The method is illustrated in the form of a flow chart, in accordance with an embodiment of the present subject matter.
[019] The figures depict embodiments of the present disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the disclosure described herein.
DETAILED DESCRIPTION
[020] Some embodiments of this disclosure, illustrating all its features, will now be discussed in detail. The words “receiving”, “detecting”, “sending”, “determining”, and “alerting”, and other forms thereof, are intended to be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present disclosure, the exemplary systems and methods are now described.
[021] The disclosed embodiments are merely exemplary of the disclosure, which may be embodied in various forms. Various modifications to the embodiment will be readily apparent to those skilled in the art and the generic principles herein may be applied to other embodiments. However, one of ordinary skill in the art will readily recognize that the present disclosure is not intended to be limited to the embodiments illustrated but is to be accorded the widest scope consistent with the principles and features described herein.
[022] Conventional solutions for tracking posture of a user in real time comprise various wearable devices required to be worn by the user for the tracking purpose. Further, some other conventional solutions for tracking posture of a user of a display device such as a laptop or a desktop comprise a camera-based process. It may be understood that constantly wearing a tracking device on one’s body or being continuously monitored by a camera placed on a display device may not be a comfortable, feasible, and reliable approach for many users. Therefore, there is a requirement for a sustainable solution for tracking posture of the user without the limitations of the wearable tracking devices and the camera-based monitoring devices. Ensuring proper posture of the users is crucial for prevention and treatment of any upper body ailments related to the neck. The device, the system, and the methods disclosed in the following paragraphs of the present invention are intended to function together for tracking posture of the user in real time with an unconventional approach.
[023] The present subject matter discloses a method and a system for tracking posture of a user in real time. In order to track a posture of the user in real time, a device for tracking the posture is disclosed in the present invention. The device comprises one or more sensors, a receiver, a distance sensor, and a transmitter installed in the device. The sensor in the device is configured to project a Light Amplification by Stimulated Emission of Radiation (LASER) beam simultaneously and separately on the user’s forehead and the user’s chest to generate a depth map. Further, the receiver is configured to receive the depth map, a first reflected LASER beam, and a second reflected LASER beam from the user’s forehead and the user’s chest respectively. The distance sensor is configured to calculate a distance travelled by the first reflected LASER beam and a distance travelled by the second reflected LASER beam. Further, the transmitter is configured to transmit the distance travelled by the first reflected LASER beam and the distance travelled by the second reflected LASER beam to a system for tracking the posture of the user in real time.
[024] In one implementation, the device for tracking posture of the user in real time, transmits the distance travelled by the first reflected LASER beam, the distance travelled by the second reflected LASER beam, and the depth map to a system. The system may receive the depth map, the distance travelled by the first reflected LASER beam, and the distance travelled by the second reflected LASER beam from the device and compute an angular displacement of the user’s neck. Further, the system measures an angle of flexion between a tragus of the user’s ear and a seventh cervical vertebra of the user, determines a strain on the user’s neck, and detects a posture of the user as an acceptable posture or as an unacceptable posture. Upon detecting the unacceptable posture, the system may recommend a change in the posture to the user and alert the user if the posture is not changed in a predetermined interval of time.
[025] While aspects of described system and method for tracking posture of the user in real time may be implemented in any number of different computing systems, environments, and/or configurations, the embodiments are described in the context of the following exemplary system.
[026] Referring now to Figure 1, a network implementation 100 of a device 102 for tracking posture of a user in real time is disclosed. The device 102 sends information to a system 114 upon calculating a distance travelled by a first reflected LASER beam and a distance travelled by a second reflected LASER beam. The user uses one or more display devices 104-1, 104-2, ... 104-N, collectively referred to as display devices 104, hereinafter, or applications residing on the display devices 104 for receiving notifications related to the tracking in real time from the system 114. Further, the system 114 may send a recommendation for change in posture, followed by an alert upon detection of an unacceptable posture of the user in real time.
[027] It may be noted that the device 102 for tracking posture of the user in real time may be accessed by multiple users through one or more display devices 104-1, 104-2... 104-N. In one implementation, the device 102 may be communicatively connected to a cloud-based computing environment. Examples of the display devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, a mobile device, a tablet, and a workstation. The display devices 104 are communicatively coupled to the device 102 through a network 106.
[028] Although the present disclosure is explained considering that the system 114 is implemented for tracking posture of the user in real time, it may be understood that the system 114 may be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a virtual environment, a mainframe computer, a server, a network server, or a cloud-based computing environment. It will be understood that the system 114 may be accessed by multiple users through one or more display devices 104-1, 104-2... 104-N. In one implementation, the system 114 may comprise the cloud-based computing environment in which the user may operate individual computing systems configured to execute remotely located applications. Examples of the display devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, a mobile device, a tablet, and a workstation. The display devices 104 are communicatively coupled to the system 114 for tracking the posture of the user in real time through a network 106.
[029] In one implementation, the network 106 may be a wireless network, a wired network, or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[030] In one embodiment, the device 102 for tracking posture of the user in real time, may include at least one processor 108-A, an input/output (I/O) interface 110-A, and a memory 112-A. The system 114 for tracking posture of the user in real time, may also include at least one processor 108-B, an input/output (I/O) interface 110-B, and a memory 112-B. It may be noted that at least one processor 108-A, 108-B may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, Central Processing Units (CPUs), state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 108-A, 108-B is configured to fetch and execute computer-readable instructions stored in the memory 112-A, 112-B.
[031] The I/O interface 110-A, 110-B may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 110-A may allow the device 102 to interact with the user directly or through the display devices 104. Similarly, the I/O interface 110-B may allow the system 114 to interact with the user directly or through the display devices 104. Further, the I/O interface 110-A, 110-B may enable the device 102 and the system 114 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 110-A, 110-B can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 110-A, 110-B may include one or more ports for connecting a number of devices to one another or to another server.
[032] The memory 112-A, 112-B may include any computer-readable medium or computer program product known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, Solid State Disks (SSD), optical disks, and magnetic tapes. The memory 112-A, 112-B may include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. The memory 112-A, 112-B may include programs or coded instructions that supplement applications and functions of the device 102 and the system 114. In one embodiment, the memory 112-A, 112-B, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the programs or the coded instructions.
[033] As there are various challenges observed in the existing art, the challenges necessitate the need to build the device 102 for tracking the posture of the user in real time. At first, a user may use the display device 104 to access the device 102 via the I/O interface 110-A. The user may register the display devices 104 using the I/O interface 110-A in order to use the device 102. In one aspect, the user may access the I/O interface 110-A of the device 102.
[034] Similarly, the system 114 may also be used for tracking the posture of the user in real time. At first, a user may use the display device 104 to access the system 114 via the I/O interface 110-B. The user may register the display devices 104 using the I/O interface 110-B in order to use the system 114. In one aspect, the user may access the I/O interface 110-B of the system 114. The detailed functioning of the device 102 and the system 114 is described below with the help of figures.
[035] The present subject matter discloses a system and a method for tracking posture of a user in real time. In one implementation, a user may be understood to be using a display device 104. In one example, the user may be a student. In another example, the user may be an employee using the display device such as a desktop or a laptop to work. In yet another example, the user may be a person using the display device for entertainment and recreational purposes.
[036] Further, the user may be provided with a device 102 for tracking the posture of the user in real time. In one example, the device 102 may be mounted on the display device 104.
[037] It may be understood that the device may comprise at least a sensor 116, a receiver 118, a distance sensor 120, and a transmitter 122.
[038] Apart from the above-mentioned, the device 102 may also comprise a housing 124, a LASER opening 128, a hook 130, and a USB port 132. The housing 124 may be understood as a covering structure forming the outermost covering of the device 102. The housing 124 may be made of a non-conductive, temperature stable, and non-reactive material. In one example, the housing 124 may be manufactured from a high-quality plastic material. Further, the LASER opening 128 may be provided as a point for projecting the LASER beam from the device 102. The number of LASER openings may be changed as per the requirement and purpose of the device 102. In one example, the device 102 may comprise a pair of LASER openings 128.
[039] Further, the device 102 may be mounted on to a screen of the display device 104 with the hook 130. The hook 130 may be understood to provide anchoring support to the device 102 while it is mounted on the screen of the display device 104. The hook 130 may also be prepared with a non-conductive, temperature stable, and non-reactive material. In one example, the hook 130 may be manufactured from the high-quality plastic material.
[040] Furthermore, the device 102 may also comprise the USB port 132. The USB port 132 may be used to connect the device 102 to the display device 104 through a cable. In one example, the USB port 132 may also be used to transfer data from the device 102 to the display device 104. In yet another example, the USB port 132 may also serve as a charging point for the device 102.
[041] In one example, one or more of the sensors 116 may be configured to project a Light Amplification by Stimulated Emission of Radiation (LASER) beam. Further, the LASER beam may be simultaneously and separately projected on the user at two or more points. In one example, the LASER beam may be projected at the user’s head and the LASER beam projected at the user’s head may be referred to as Beam 1. Further, the LASER beam may be projected at the user’s chest and the LASER beam projected at the user’s chest may be referred to as Beam 2. In another example, the LASER beam may be projected by multiple sensors 116. In one example, the total number of sensors 116 used may be four. In another example, the total number of sensors 116 used may be six. It may be understood that with an increase in the number of sensors 116 used in the device 102, the accuracy of the depth map being generated by the sensors 116 would increase.
[042] Typically, the depth map may be understood as a representation of an image or an image channel that contains information regarding the distance of a point or object of interest from a viewpoint. In one example, the depth map may be referred to as a depth buffer. In another example, the depth map may be referred to as a Z-buffer. In yet another example, the depth map may be referred to as a Z-depth. It may be understood that the depth map may be generated from one or more LASER beams projected by one or more of the sensors 116. In one example, the depth map may be a 2-dimensional map creating a distance representation from the viewpoint.
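As a rough illustration of the depth-map idea described above, a 2-dimensional array of per-point distances can serve as a minimal Z-buffer. The resolution, the sample values, and the helper names below are arbitrary assumptions for illustration only.

```python
def make_depth_map(distances_mm, width, height):
    """Arrange a flat list of per-point distances (in mm) into a
    2-dimensional depth map (row-major), mimicking a Z-buffer."""
    if len(distances_mm) != width * height:
        raise ValueError("expected width * height distance samples")
    return [distances_mm[r * width:(r + 1) * width] for r in range(height)]

def nearest_point(depth_map):
    """Return (row, col, depth) of the point closest to the viewpoint."""
    best = None
    for r, row in enumerate(depth_map):
        for c, d in enumerate(row):
            if best is None or d < best[2]:
                best = (r, c, d)
    return best
```

With more sensors (and hence more sampled points), the grid becomes denser, which is one way to read the statement that accuracy increases with the number of sensors 116.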
[043] In one example, the sensor 116 used in the device 102 may be an optic sensor, a motor sensor, a proximity sensor, a light sensor, a time-of-flight sensor, or an ultrasonic sensor. In another example, the sensor 116 may include a reflective, a through-beam, or a retro-reflective LASER sensor. In yet another example, the device 102 may comprise two sets of Class 1 eye-safe LASER sensors. The Class 1 eye-safe LASER sensors may be understood to function as Time-of-Flight (TOF) sensors. In yet another example, the Class 1 eye-safe LASER sensors may be replaced with a range of alternative solutions, such as a multi-focal LASER projector of the kind typically used in a face-identifying LASER sensor.
[044] Further, the receiver 118 of the device 102 may be configured to receive a first reflected LASER beam and a second reflected LASER beam. It may be understood that the first reflected LASER beam is reflected from the user's forehead after projection of Beam 1, and the second reflected LASER beam is reflected from the user's chest after projection of Beam 2.
[045] Furthermore, the distance sensor 120 of the device 102 may be configured to calculate a distance travelled by the first reflected LASER beam, referred to as 'd1', and a distance travelled by the second reflected LASER beam, referred to as 'd2'.
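Where the sensors 116 are Time-of-Flight sensors, the distances d1 and d2 can be derived from the round-trip time of each reflected beam. The following Python sketch illustrates the standard ToF relation only; the patent does not disclose a formula, and the function name and sample timing value are illustrative:

```python
# Sketch: deriving a distance from a time-of-flight measurement.
# Assumes the sensor reports the round-trip time of the reflected
# LASER pulse; names and values here are illustrative, not from
# the disclosure.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_time_s: float) -> float:
    """One-way distance to the reflecting surface.

    The beam travels to the target and back, so the one-way
    distance is half of (speed of light x elapsed time).
    """
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A reflection received after ~4 nanoseconds corresponds to a
# target roughly 0.6 m away -- a plausible screen-to-user distance.
d1 = tof_distance(4e-9)
```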
[046] Further, the transmitter 122 may be configured to transmit the distance travelled by the first reflected LASER beam, i.e., d1, and the distance travelled by the second reflected LASER beam, i.e., d2. During operation, the transmitter 122 may transmit d1 and d2 to a system 114.
[047] During operation, the system 114 may receive the distance travelled by the first reflected LASER beam (d1) and the distance travelled by the second reflected LASER beam (d2).
[048] Upon receiving d1 and d2, the system 114 may compute an angular displacement (θ) between d1 and d2. In one example, the system 114 may use an algorithm to compute the angular displacement (θ). In another example, the algorithm may extrapolate d1 and d2 to form a right-angled triangle as illustrated in Figure 2.
[049] Referring now to Figure 2, wherein:

'S' = Shift, i.e., the difference between d1 (the distance travelled by the first reflected LASER beam projected at the user's forehead) and d2 (the distance travelled by the second reflected LASER beam projected at the user's chest);

'θ' = the angular displacement, i.e., the angle formed between d1 and d2;

'116-1' = a point on the device 102 from where the LASER beam is projected at the user's forehead;

'116-2' = a point on the device 102 from where the LASER beam is projected at the user's chest; and

dotted lines '---' = extrapolation lines for completing the right-angled triangle.
[050] Further, the algorithm may compute the shift (S) using the following Equation 1:

Shift (S) = d1 − d2 cos (θ) ... Equation 1
[051] It may be pertinent to note that the Shift (S) calculated by the algorithm of the system 114 from Equation 1 may also be regarded as the linear distance between the user's forehead and the user's chest. Further, statistically, the value of Shift (S) would be close to zero for an ideal posture of the user without any deviation.
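Equation 1 can be illustrated with a short Python sketch; the function name is illustrative and not part of the disclosure:

```python
import math

def compute_shift(d1: float, d2: float, theta_rad: float) -> float:
    """Shift S = d1 - d2*cos(theta), per Equation 1.

    d1 and d2 are the distances travelled by the first and second
    reflected beams; theta is the angular displacement between them.
    """
    return d1 - d2 * math.cos(theta_rad)

# With zero angular displacement and equal distances the shift is
# zero, matching the statement that S is close to zero for an
# ideal posture without any deviation.
s_ideal = compute_shift(60.0, 60.0, 0.0)
```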
[052] Further, after computing the angular displacement (θ) and the Shift (S) from Equation 1 as given above, the system 114 may measure an angle of flexion (D). The angle of flexion (D) may be understood as an angle between the tragus of the user's ear and the seventh cervical vertebra of the user. It may be understood that the angle of flexion (D) may provide an additional angular displacement of the user's head from a mean resting position of the user's head.
[053] Referring now to Figure 3. In one example, Figure 3 may be used to measure the angle of flexion (D), wherein:

'S' = Shift, i.e., the difference between d1 (the distance travelled by the first reflected LASER beam projected at the user's forehead) and d2 (the distance travelled by the second reflected LASER beam projected at the user's chest); S is also the linear distance between the user's forehead and the user's chest;

'D' = the angle of flexion formed between S and P; and

'P' = the average length of the user's neck, generally known to be approximately 11 centimetres (cm).
[054] Further, the algorithm of the system 114 may follow Equation 2, as given below, to determine the angle of flexion (D):

Angle of flexion (D) = arccos (S / P) ... Equation 2
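Equation 2 can likewise be sketched in Python, using the average neck length P = 11 cm given above; the clamping guard is an added assumption to handle measurement noise and is not part of the disclosure:

```python
import math

AVERAGE_NECK_LENGTH_CM = 11.0  # 'P' in the disclosure

def angle_of_flexion(shift_cm: float,
                     neck_length_cm: float = AVERAGE_NECK_LENGTH_CM) -> float:
    """Angle of flexion D = arccos(S / P), in degrees (Equation 2)."""
    ratio = shift_cm / neck_length_cm
    # arccos is only defined on [-1, 1]; clamp to guard against
    # measurement noise (a guard the disclosure does not specify).
    ratio = max(-1.0, min(1.0, ratio))
    return math.degrees(math.acos(ratio))

# A shift equal to the full neck length gives 0 degrees of flexion;
# a zero shift gives 90 degrees.
```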
[055] Subsequently, the system 114 may then use the algorithm to determine a strain on the user's neck in real time based on the angle of flexion (D), the angular displacement (θ), and a historic data stored in the system 114. The historic data may be understood to comprise a demographic data of a plurality of users, a medical history of the plurality of users, and a posture data of the users. For example, the historic data would include the angle of flexion (D) for a user suffering from a head-slouch condition, and a crooked-spine condition. The historic data may also include neck lengths of the plurality of users, which may be used to calculate an average neck length of the users.

[056] In one example, the average neck length may be identified as 11 cm and an average head weight of the user may be identified as 5 kilograms (kg). It may be understood that the strain may be experienced at the base of the user's neck. In one example, the system 114 may identify the strain by plugging D into a standard torque calculation with a weight W (average human head weight = 5 kg).
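The "standard torque calculation" referenced in paragraph [056] is not written out in the disclosure. One common static lever-arm estimate, assuming a moment arm equal to the average neck length, might look as follows; the exact formula and the moment arm are assumptions, not taken from the patent:

```python
import math

AVERAGE_HEAD_MASS_KG = 5.0  # 'W' in the disclosure
G = 9.81                    # gravitational acceleration, m/s^2
MOMENT_ARM_M = 0.11         # assumed lever arm ~ average neck length

def neck_torque(flexion_deg: float,
                mass_kg: float = AVERAGE_HEAD_MASS_KG,
                arm_m: float = MOMENT_ARM_M) -> float:
    """Static torque (N*m) about the base of the neck.

    torque = m * g * arm * sin(D): a standard lever-arm formula.
    The disclosure only says a 'standard torque calculation' is
    used, so this expression is an illustrative assumption.
    """
    return mass_kg * G * arm_m * math.sin(math.radians(flexion_deg))

# Torque is zero with the head balanced over the spine (D = 0)
# and grows as the head tilts further forward.
```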
[057] Further, the system 114 may detect a posture of the user as an acceptable posture, or as an unacceptable posture. It may be understood that the posture may be detected as the acceptable posture when the strain on the user's neck is within or less than a threshold of the strain. In one example, the threshold of the strain is between −2 cm and 2 cm.
[058] Furthermore, the system 114 may recommend a change in the posture of the user upon detecting the unacceptable posture. In one example, the system 114 may present the recommendation to the user on the display device 104, wherein the display device 104 may be a mobile or a laptop device. In another example, the recommendation may be sent in real time as a notification to the user on the display device 104.
[059] Finally, the system 114 may alert the user if the unacceptable posture is maintained for more than a predetermined interval of time. It may be noted that the alert may be sent to the display device 104 of the user in a situation where, even after recommending a change in posture, the unacceptable posture is continued for more than the predetermined interval of time. In one example, the predetermined interval of time given to the user for changing the unacceptable posture to the acceptable posture may be 15 minutes. Upon completion of the 15 minutes, the system 114 may send an alert to the user on the display device 104. The alert may be sent in the form of a notification on the display device 104 connected through the network 106.
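The detect-recommend-alert flow of paragraphs [057] to [059] can be sketched as a small state machine. The class and method names are illustrative; the −2 cm to 2 cm threshold and the 15-minute interval are the example values from the disclosure:

```python
from dataclasses import dataclass
from typing import Optional

SHIFT_THRESHOLD_CM = (-2.0, 2.0)  # acceptable range from the disclosure
ALERT_INTERVAL_S = 15 * 60        # predetermined interval: 15 minutes

@dataclass
class PostureMonitor:
    """Illustrative sketch of the recommend-then-alert behaviour.

    The disclosure describes the behaviour, not an API; this class
    is an assumption for illustration only.
    """
    bad_posture_since: Optional[float] = None  # start of bad-posture run

    def update(self, shift_cm: float, now_s: float) -> str:
        low, high = SHIFT_THRESHOLD_CM
        if low <= shift_cm <= high:
            self.bad_posture_since = None  # posture acceptable: reset timer
            return "ok"
        if self.bad_posture_since is None:
            self.bad_posture_since = now_s
            return "recommend"             # first detection: recommend a change
        if now_s - self.bad_posture_since >= ALERT_INTERVAL_S:
            return "alert"                 # unchanged for 15 minutes: alert
        return "recommend"
```

For example, a shift of 3 cm that persists for 15 minutes would first yield a "recommend" notification and then escalate to an "alert".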
[060] Consider an example of a user, Alpha. Alpha may be a full-time employee working for a company XYZ. It may be understood that, as a part of his job, the user Alpha is required to spend a minimum of 8 hours working on his display device 104, which is a laptop in this case. While working on the display device 104, the posture of the user Alpha may, multiple times, become unacceptable and exceed the threshold of strain allowed for the user's neck. The company XYZ may decide to install a device 102 to track the posture of the user Alpha in real time. Therefore, Alpha may mount the device 102 on the display device 104 and continue working.
[061] The device 102 and a system 114 may continuously monitor the posture of the user Alpha in real time and recommend a change in posture at every instance of detecting an unacceptable posture of the user Alpha. Further, the system 114 may wait for a predetermined period of 15 minutes for the user Alpha to change the unacceptable posture to an acceptable posture.
[062] It may be understood that during tracking, the device 102 and the system 114 may perform a plurality of steps, as explained in earlier paragraphs, for tracking the posture of the user Alpha in real time daily. This may assist the user Alpha in maintaining a good posture throughout the workday and prevent the possibility of upper-body postural disorders.
[063] Referring now to Figures 4a and 4b, the device 102 is illustrated in accordance with an embodiment of the present subject matter. Figure 4a may be understood to be a front view of a prototype of the device 102 for tracking the posture of a user in real time. In an embodiment, as shown in Figure 4a, the device 102 may comprise a housing 124, a LASER opening 128, and a sensor 116. Further, Figure 4b may be understood to be a side view of the prototype of the device 102 for tracking the posture of the user in real time. As shown in Figure 4b, the device 102 may comprise a housing 124, a hook 130, and a USB port 132.
[064] Referring now to Figure 5, an exemplary representation for tracking the posture of a user in real time, in accordance with an embodiment of the present subject matter, is shown. This may be understood as an example of an acceptable posture of the user, where Laser 1 and Laser 2 are the beams projected by a device 102 on the user's forehead and the user's chest, respectively. In one example, the shift of the user as shown in Figure 5 may be within a threshold of −2 cm to 2 cm. Therefore, the posture is an example of an acceptable posture of the user.
[065] Referring now to Figure 6, an exemplary representation for tracking the posture of a user in real time, in accordance with an embodiment of the present subject matter, is shown. This may be understood as an example of an unacceptable posture of the user, where Laser 1 and Laser 2 are the beams projected by a device 102 on the user's forehead and the user's chest, respectively. In one example, the shift of the user as shown in Figure 6 may be more than 3 cm, which is outside a threshold of −2 cm to 2 cm. Therefore, the posture is an example of an unacceptable posture of the user. In such a situation, the system 114 may recommend the user to change the posture and alert the user if the unacceptable posture is maintained for more than a predetermined interval of time. The system 114 may recommend and alert the user in the form of a notification sent on a display device 104. The display device 104 may be a mobile or a laptop of the user.
[066] Referring now to Figure 7, a method 700 for tracking posture of a user in real time using a device is shown, in accordance with an embodiment of the present subject matter. The method 700 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types.
[067] The order in which the method 700 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 700 or alternate methods for tracking posture of the user in real time. Additionally, individual blocks may be deleted from the method 700 without departing from the scope of the subject matter described herein. Furthermore, the method 700 for tracking posture of the user in real time can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below the method 700 may be considered to be implemented in the above-described device 102.
[068] At block 702, a Light Amplification by Stimulated Emission of Radiation (LASER) beam may be projected by one or more sensor 116. The LASER beam may be projected simultaneously and separately on the user’s forehead and the user’s chest. It may be understood that one or more sensor 116 project the LASER beams to generate a depth map.
[069] At block 704, a first reflected LASER beam, a second reflected LASER beam, and the depth map may be received by a receiver 118. It may be noted that the first reflected LASER beam is reflected from the user’s forehead, and the second reflected LASER beam is reflected from the user’s chest.
[070] At block 706, a distance travelled by the first reflected LASER beam, and a distance travelled by the second reflected LASER beam may be calculated by a distance sensor.
[071] At block 708, the distance travelled by the first reflected LASER beam, and the distance travelled by the second reflected LASER beam may be transmitted by a transmitter to a system 114.
[072] Referring now to Figure 8, a method 800 for tracking the posture of a user in real time using the system 114 is shown, in accordance with an embodiment of the present subject matter. The method 800 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types.
[073] The order in which the method 800 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 800 or alternate methods for tracking posture of the user in real time. Additionally, individual blocks may be deleted from the method 800 without departing from the scope of the subject matter described herein. Furthermore, the method 800 for tracking posture in real time can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below the method 800 may be considered to be implemented in the above-described system 114.
[074] At block 802, a first reflected LASER beam, a second reflected LASER beam, and a depth map may be received in real time from a device 102. The first reflected LASER beam may be understood to be reflected from a user’s forehead and the second reflected LASER beam may be reflected from the user’s chest.
[075] At block 804, an angular displacement of the user’s neck may be computed. The angular displacement may be computed based on the distance travelled by the first reflected LASER beam, and the distance travelled by the second reflected LASER beam.
[076] At block 806, an angle of flexion between a tragus of the user’s ear and a seventh cervical vertebra of the user may be measured.
[077] At block 808, a strain on the user's neck may be determined in real time. The strain may be based on the angle of flexion, the angular displacement, and a historic data.

[078] At block 810, a posture of the user may be detected as an acceptable posture, or as an unacceptable posture. The posture may be detected as an acceptable posture when the strain is within or less than a threshold of the strain, and the posture may be detected as an unacceptable posture when the strain is more than the threshold of the strain.
[079] At block 812, a change in the posture of the user may be recommended upon detecting the unacceptable posture.
[080] At block 814, the user may be alerted if the unacceptable posture is maintained for more than a predetermined interval of time. It may be understood that the user may be alerted as the unacceptable posture is not changed after recommending the change in the posture for the predetermined interval of time.
[081] Exemplary embodiments discussed above may provide certain advantages. Though not required to practice aspects of the disclosure, these advantages may include those provided by the following features.
[082] In some embodiments, the system for tracking posture of a user in real time may help in prevention of upper body disorders related to neck and spine for the user by correcting the user’s posture in real time.
[083] In some embodiments, the system may assist in treatment of an upper body disorder due to improper posture of the user by alerting the user in real time of any unacceptable posture.
[084] In some embodiments, the system may help in tracking posture of the user remotely and without requirement of a camera-based monitoring system.
[085] In some embodiments, the system may provide a posture tracking alternative for users who do not prefer wearing a posture tracking device on the body.

[086] In some embodiments, the system may inculcate a habit of maintaining a healthy posture in the user by tracking, recommending changes, and alerting the user of any unacceptable postures in real time.
[087] Although implementations for a method and system for tracking posture of a user in real time have been described in a language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for constructing the method and system for tracking posture of a user in real time.

Claims

We Claim:
1. A device (102) for tracking posture of a user in real time, the device (102) comprising: one or more sensor (116) configured to project a Light Amplification by Stimulated Emission of Radiation (LASER) beam to generate a depth map, wherein the LASER beam is simultaneously and separately projected on a user’s forehead, and the user’s chest; a receiver (118) configured to receive a first reflected LASER beam, a second reflected LASER beam, and the depth map, wherein the first reflected LASER beam is reflected from the user’s forehead, and the second reflected LASER beam is reflected from the user’s chest; a distance sensor (120) configured to calculate a distance travelled by the first reflected LASER beam, and a distance travelled by the second reflected LASER beam; and a transmitter (122) configured to transmit the distance travelled by the first reflected LASER beam, the distance travelled by the second reflected LASER beam, and the depth map to a system (114).
2. The device (102) as claimed in claim 1, wherein the device (102) is non-invasive and mountable, and wherein the sensor is an optic sensor, a motor sensor, a proximity sensor, a light sensor, a time-of-flight sensor, and an ultrasonic sensor.
3. The device (102) as claimed in claim 1, wherein the depth map is specific for the LASER beam projected by the one or more sensor (116), and wherein quality of the depth map increases with increase in number of the sensor (116).
4. The device (102) as claimed in claim 1, wherein four sensors (116) are configured to project the LASER beam separately to generate the depth map.
5. A system (114) for tracking posture of a user in real time, the system (114) comprising: a memory;
a processor coupled to the memory, wherein the processor is configured to execute a set of instructions stored in the memory to: receive a distance travelled by a first reflected LASER beam, a distance travelled by a second reflected LASER beam, and a depth map in real time from a device (102), wherein the first reflected LASER beam is reflected from a user’s forehead, and the second reflected LASER beam is reflected from the user’s chest; compute an angular displacement of the user’s neck, wherein the angular displacement is computed based on the first reflected LASER beam, and the second reflected LASER beam; measure an angle of flexion between a tragus of the user’s ear and a seventh cervical vertebra of the user; determine a strain on the user’s neck in real time based on the angle of flexion, the angular displacement, and a historic data; detect a posture of the user as an acceptable posture, or as an unacceptable posture, wherein the posture is detected as an acceptable posture when the strain is within or less than a threshold of the strain, and wherein the posture is detected as an unacceptable posture when the strain is more than the threshold of the strain; recommend a change in the posture of the user upon detecting the unacceptable posture; and alert the user if the unacceptable posture is maintained for more than a predetermined interval of time, wherein the unacceptable posture is not changed after recommendation for the predetermined interval of time.
6. The system (114) as claimed in claim 5, wherein the historic data comprises a demographic data, a medical history, and a posture data of a plurality of user.
7. A method (700) for tracking posture of a user in real time using a device (102), the method (700) comprises: projecting, by one or more sensor (116), a Light Amplification by Stimulated Emission of Radiation (LASER) beam to generate a depth map, wherein the LASER beam is simultaneously and separately projected on a user’s forehead, and the user’s chest; receiving, by a receiver (118), a first reflected LASER beam, a second reflected LASER beam and the depth map, wherein the first reflected LASER beam is reflected from the user’s forehead, and the second reflected LASER beam is reflected from the user’s chest; calculating, by a distance sensor (120), a distance travelled by the first reflected LASER beam, and a distance travelled by the second reflected LASER beam; and transmitting, by a transmitter (122), the distance travelled by the first reflected LASER beam, the distance travelled by the second reflected LASER beam, and the depth map to a system (114).
8. A method (800) for tracking posture of a user in real time, the method (800) comprises: receiving, by a processor, a distance travelled by a first reflected LASER beam, a distance travelled by a second reflected LASER beam, and a depth map in real time from a device (102), wherein the first reflected LASER beam is reflected from a user’s forehead, and the second reflected LASER beam is reflected from the user’s chest; computing, by the processor, an angular displacement of the user’s neck, wherein the angular displacement is computed based on the first reflected LASER beam, and the second reflected LASER beam; measuring, by the processor, an angle of flexion between a tragus of the user’s ear and a seventh cervical vertebra of the user; determining, by the processor, a strain on the user’s neck in real time based on the angle of flexion, the angular displacement, and a historic data; detecting, by the processor, a posture of the user as an acceptable posture, or as an unacceptable posture, wherein the posture is detected as an acceptable posture when the strain is within or less than a threshold of the strain, and wherein the posture is detected as an unacceptable posture when the strain is more than the threshold of the strain; recommending, by the processor, a change in the posture of the user upon detecting the unacceptable posture; and alerting, by the processor, the user if the unacceptable posture is maintained for more than a predetermined interval of time, wherein the unacceptable posture is not changed after recommendation for the predetermined interval of time.
9. The method (800) as claimed in claim 8, wherein the historic data comprises a demographic data, a medical history, and a posture data of a plurality of user.
PCT/IN2022/050256 2021-12-22 2022-03-17 Posture tracking WO2023119305A1 (en)

Applications Claiming Priority (1)

Application Number: IN202141060052; Priority Date: 2021-12-22

Publications (1)

Publication Number: WO2023119305A1; Publication Date: 2023-06-29