CN111414780B - Real-time intelligent sitting posture distinguishing method, system, equipment and storage medium - Google Patents

Info

Publication number: CN111414780B (application CN201910006352.0A)
Authority: CN (China)
Other versions: CN111414780A (Chinese, zh)
Prior art keywords: current, sitting posture, standard, shoulder, inclination angle
Legal status: Active (granted)
Inventors: 张世芳, 陈超, 夏亮
Original and current assignee: Aspire Technologies Shenzhen Ltd
Application filed by Aspire Technologies Shenzhen Ltd; priority to CN201910006352.0A

Classifications

    • G06V40/103: Static body considered as a whole, e.g. static pedestrian or occupant recognition (under G06V40/10 Human or animal bodies; G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data; G06V Image or video recognition or understanding; G Physics)
    • G06F18/214: Generating training patterns; bootstrap methods, e.g. bagging or boosting (under G06F18/21 Design or setup of recognition systems or techniques; G06F18/20 Analysing; G06F18/00 Pattern recognition; G06F Electric digital data processing)
    • G06F18/24: Classification techniques (under G06F18/20 Analysing; G06F18/00 Pattern recognition)
    • Y02B20/40: Control techniques providing energy savings, e.g. smart controller or presence detection (under Y02B20/00 Energy efficient lighting technologies; Y02B Climate change mitigation technologies related to buildings)


Abstract

The invention discloses a real-time intelligent sitting posture judging method, which comprises the following steps: acquiring a current sitting posture image of a user in real time and identifying human body feature key point data of the user; if the human body feature key point data of the user cannot be identified, the sitting posture is considered abnormal. The human body feature key point data comprise eye coordinates, mouth coordinates, neck coordinates and shoulder coordinates. If the human body feature key point data of the user are identified, current sitting posture data are calculated from the key point data; the current sitting posture data comprise the current head inclination angle, the current shoulder inclination angle, the current neck-to-face height difference and the current shoulder-to-face height difference. The current sitting posture data are compared with standard sitting posture data to judge whether the current sitting posture is abnormal. The invention also discloses a real-time intelligent sitting posture judging system, equipment and a storage medium. The invention relates to the technical field of artificial intelligence and achieves high accuracy in judging sitting postures.

Description

Real-time intelligent sitting posture distinguishing method, system, equipment and storage medium
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a sitting posture real-time intelligent judging method, system, equipment and storage medium.
Background
At present, parents are very concerned about their children's learning sitting posture in the primary school stage, worrying that poor posture will affect the children's healthy growth, and therefore impose strict requirements on the children's sitting posture. In the prior art, a common sitting posture recognition method uses image processing: the center position of the pixel points of the face and shoulders in a picture is computed geometrically, the angle of each pixel point on the face-and-shoulder contour relative to that center is calculated, the calculated angle is subtracted from a preset standard angle, and the sitting posture is considered abnormal if the resulting difference exceeds a threshold. Because the center position is derived from the pixel sets of the face and shoulders before the deviation angle is computed, the accuracy of the face deviation angle obtained by this method is low.
Disclosure of Invention
The present invention aims to solve at least one of the technical problems in the related art to some extent. Therefore, an object of the present invention is to provide a method, a system, a device and a storage medium for real-time intelligent determination of sitting postures, which have high accuracy in determining sitting postures and can determine various abnormal sitting postures.
The technical scheme adopted by the invention is as follows:
in a first aspect, the invention provides a sitting posture real-time intelligent judging method, which comprises the following steps:
acquiring a current sitting posture image of a user in real time, identifying key point data of human body characteristics of the user, and if the key point data of the human body characteristics of the user cannot be identified, considering that the sitting posture is abnormal;
the human body characteristic key point data comprise eye coordinates, mouth coordinates, neck coordinates and shoulder coordinates;
if the human body characteristic key point data of the user is identified, calculating current sitting posture data according to the human body characteristic key point data;
the current sitting posture data comprise a current head inclination angle, a current shoulder inclination angle, a current height difference value between the neck and the face and a current height difference value between the shoulder and the face;
comparing the current sitting posture data with the standard sitting posture data, and judging whether the current sitting posture is abnormal or not.
As a further improvement of the above solution, if the user's human body feature key point data is identified, calculating the current sitting posture data according to the human body feature key point data specifically includes:
calculating the current head inclination angle of the user according to the mouth coordinate and the neck coordinate;
calculating the current shoulder inclination angle of the user according to shoulder coordinates, wherein the shoulder coordinates comprise left shoulder coordinates and right shoulder coordinates;
calculating the height difference between the current neck and face of the user according to the eye coordinates, the mouth coordinates and the neck coordinates;
and calculating the height difference between the current shoulder and the face of the user according to the eye coordinates, the mouth coordinates and the shoulder coordinates.
As a further improvement of the above solution, the comparing the current sitting posture data with the standard sitting posture data, and determining whether the current sitting posture is abnormal specifically includes:
comparing the current head inclination angle of the user with a standard head inclination angle threshold value, and judging whether the head inclination is abnormal or not;
comparing the current shoulder inclination angle of the user with a standard shoulder inclination angle threshold value, and judging whether the shoulder inclination is abnormal or not;
calculating the ratio of the current neck-to-face height difference of the user to the standard neck-to-face height difference as a first ratio, comparing the first ratio with a standard eyes-too-close difference ratio threshold, and judging whether the user's eyes are too close;
and calculating the ratio of the current shoulder-to-face height difference of the user to the standard shoulder-to-face height difference as a second ratio, comparing the second ratio with a standard lying-on-table difference ratio threshold, and judging whether the user is lying on the table.
As a further improvement of the scheme, before the step of collecting the current sitting posture image of the user and identifying the key point data of the human body characteristics of the user in real time, the method further comprises the following steps:
inputting standard sitting posture images and performing big data training through a supervised learning classification algorithm of machine learning to obtain standard sitting posture data, wherein the standard sitting posture data comprise a standard head inclination angle threshold, a standard shoulder inclination angle threshold, a standard eyes-too-close difference ratio threshold and a standard lying-on-table difference ratio threshold.
As a further improvement of the above solution, the method further includes: and when the current sitting posture is abnormal, sending out reminding information in real time.
In a second aspect, the present invention provides a sitting posture real-time intelligent discriminating system, the system comprising:
the acquisition and identification module is used for acquiring the current sitting posture image of the user in real time, identifying key point data of human body characteristics of the user, and if the key point data of the human body characteristics of the user cannot be identified, considering that the sitting posture is abnormal;
the human body characteristic key point data comprise eye coordinates, mouth coordinates, neck coordinates and shoulder coordinates;
the computing module is used for computing current sitting posture data according to the human body characteristic key point data if the human body characteristic key point data of the user are identified;
the current sitting posture data comprise a current head inclination angle, a current shoulder inclination angle, a current height difference value between the neck and the face and a current height difference value between the shoulder and the face;
the comparison and judgment module is used for comparing the current sitting posture data with the standard sitting posture data and judging whether the current sitting posture is abnormal or not.
As a further improvement of the above, the system further comprises:
the learning acquisition module is used for inputting standard sitting posture images and performing big data training through a supervised learning classification algorithm of machine learning to obtain standard sitting posture data, wherein the standard sitting posture data comprise a standard head inclination angle threshold, a standard shoulder inclination angle threshold, a standard eyes-too-close difference ratio threshold and a standard lying-on-table difference ratio threshold.
As a further improvement of the above, the system further comprises:
and the reminding module is used for sending out reminding information in real time when judging that the current sitting posture is abnormal.
In a third aspect, the present invention provides a sitting posture real-time intelligent discriminating apparatus, comprising:
at least one processor, and a memory communicatively coupled to the at least one processor; the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the sitting posture real-time intelligent distinguishing method.
In a fourth aspect, the present invention provides a computer-readable storage medium storing computer-executable instructions for causing a computer to perform the sitting posture real-time intelligent discriminating method as described above.
The beneficial effects of the invention are as follows:
according to the sitting posture real-time intelligent judging method, system, equipment and storage medium, through collecting the current sitting posture image of the user in real time, identifying the key point data of the human body characteristics of the user, calculating the current sitting posture data according to the key point data of the human body characteristics, comparing the current sitting posture data with the standard sitting posture data, judging whether the current sitting posture is abnormal or not, solving the technical problem that the sitting posture judging accuracy is not high due to the fact that only face deviation angles exist in the prior art, realizing the high sitting posture judging accuracy, and being suitable for judging scenes of various abnormal sitting postures.
Drawings
The following is a further description of embodiments of the invention, taken in conjunction with the accompanying drawings:
FIG. 1 is a flow chart of a sitting posture real-time intelligent judging method according to the first embodiment of the invention;
FIG. 2 is a schematic diagram of a human body feature key point distribution;
fig. 3 is a schematic diagram of a sitting posture real-time intelligent distinguishing system according to a second embodiment of the present invention.
Detailed Description
It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other.
Example 1
Fig. 1 is a flow chart of a sitting posture real-time intelligent judging method according to a first embodiment of the present invention, fig. 2 is a distribution diagram of key points of human body characteristics, and in combination with fig. 1 and fig. 2, a sitting posture real-time intelligent judging method includes steps S1 to S3.
S1, acquiring a current sitting posture image of a user in real time, identifying key point data of human body characteristics of the user, and if the key point data of the human body characteristics of the user cannot be identified, considering that the sitting posture is abnormal;
the human body characteristic key point data comprise eye coordinates, mouth coordinates, neck coordinates, shoulder coordinates and the like;
s2, if the human body characteristic key point data of the user are identified, calculating current sitting posture data according to the human body characteristic key point data;
the current sitting posture data comprise a current head inclination angle, a current shoulder inclination angle, a current height difference value between the neck and the face and a current height difference value between the shoulder and the face;
s3, comparing the current sitting posture data with the standard sitting posture data, and judging whether the current sitting posture is abnormal or not.
In this embodiment, before step S1, the method further includes the steps of:
s0, inputting standard sitting posture images, and performing big data training through a supervised learning classification algorithm of machine learning to obtain standard sitting posture data. The standard sitting posture data comprise a standard head inclination angle threshold A, a standard shoulder inclination angle threshold B, a standard eye approach difference value ratio threshold C and a standard groveling table difference value ratio threshold D.
In this embodiment, step S1 specifically includes: collecting sitting posture images of the user in real time, and identifying the human body feature key point data P = (x0, y0) & (x1, y1) & (x2, y2) & … & (x23, y23) & (x24, y24). The eye coordinates include the left eye coordinate (x15, y15) and the right eye coordinate (x16, y16); the mouth coordinate is (x0, y0); the neck coordinate is (x1, y1); the shoulder coordinates include the left shoulder coordinate (x2, y2) and the right shoulder coordinate (x5, y5).
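A minimal sketch of this identification step, assuming a 25-point pose-estimator output indexed as in the patent (0 = mouth, 1 = neck, 2/5 = left/right shoulder, 15/16 = left/right eye); the `extract_feature_points` helper and the sample data are illustrative:

```python
# Indices follow the patent's key point numbering: 0 = mouth, 1 = neck,
# 2 = left shoulder, 5 = right shoulder, 15 = left eye, 16 = right eye.
KEYPOINT_INDICES = {
    "mouth": 0, "neck": 1,
    "left_shoulder": 2, "right_shoulder": 5,
    "left_eye": 15, "right_eye": 16,
}

def extract_feature_points(pose):
    # `pose` is a 25-element list of (x, y) pairs, with None where the
    # detector found nothing. Returning None mirrors the patent's rule that
    # an unidentifiable user means an abnormal sitting posture.
    points = {}
    for name, idx in KEYPOINT_INDICES.items():
        if idx >= len(pose) or pose[idx] is None:
            return None
        points[name] = pose[idx]
    return points

# Example 25-point result; points not used by the method are left at (0, 0).
pose = [(0, 0)] * 25
pose[0] = (100, 200)   # mouth
pose[1] = (100, 230)   # neck
pose[2] = (60, 250)    # left shoulder
pose[5] = (140, 250)   # right shoulder
pose[15] = (90, 170)   # left eye
pose[16] = (110, 170)  # right eye

print(extract_feature_points(pose))
```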
In this embodiment, step S2 specifically includes:
s21, according to the mouth coordinates (x 0 ,y 0 ) And neck coordinates (x 1 ,y 1 ) The current head inclination angle of the user is calculated, the current head inclination angle comprises a current head left inclination angle E1 of the user and a current head right inclination angle E2 of the user, and according to the Euler angle (attitude angle) principle, the head left inclination angle E1 and the head right inclination angle E2 are calculated as follows in combination with the sitting posture characteristic:
s22, calculating the current shoulder inclination angle of the user according to shoulder coordinates, wherein the shoulder coordinates comprise left shoulder coordinates (x 2 ,y 2 ) Coordinate of right shoulder (x) 5 ,y 5 ) The current shoulder inclination angle comprises a current shoulder left inclination angle F1 of a user and a current shoulder right inclination angle F2 of the user, and according to the Euler angle (attitude angle) principle, the shoulder left inclination angle F1 and the shoulder right inclination angle F2 are calculated as follows in combination with the sitting posture characteristic:
s23, according to the eye coordinates (including left eye coordinates (x 15 ,y 15 ) Coordinates of right eye (x) 16 ,y 16 ) Coordinates of mouth (x) 0 ,y 0 ) And neck coordinates (x 1 ,y 1 ) Calculating the difference in height between the current neck and face of the user includes calculating the difference in height between the current neck and left eye (y 1 -y 15 ) The difference in height between the current neck and the right eye (y 1 -y 16 ) And the current neck-to-mouth height difference (y 1 -y 0 )。
S24, calculating the current shoulder-to-face height difference of the user according to the eye coordinates (left eye (x15, y15), right eye (x16, y16)), the mouth coordinate (x0, y0) and the shoulder coordinates (left shoulder (x2, y2), right shoulder (x5, y5)), including the current left-shoulder-to-left-eye height difference (y2 − y15), the current right-shoulder-to-right-eye height difference (y5 − y16), the current left-shoulder-to-mouth height difference (y2 − y0) and the current right-shoulder-to-mouth height difference (y5 − y0).
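Steps S23 and S24 translate directly into the y-coordinate differences named above; a sketch (function names are illustrative):

```python
def neck_face_height_diffs(points):
    # Step S23: (y1 - y15, y1 - y16, y1 - y0), i.e. neck height minus
    # left eye, right eye and mouth heights.
    y_neck = points["neck"][1]
    return (y_neck - points["left_eye"][1],
            y_neck - points["right_eye"][1],
            y_neck - points["mouth"][1])

def shoulder_face_height_diffs(points):
    # Step S24: (y2 - y15, y5 - y16, y2 - y0, y5 - y0), i.e. each
    # shoulder height minus the matching eye height and the mouth height.
    y_left, y_right = points["left_shoulder"][1], points["right_shoulder"][1]
    return (y_left - points["left_eye"][1],
            y_right - points["right_eye"][1],
            y_left - points["mouth"][1],
            y_right - points["mouth"][1])

points = {"mouth": (100, 200), "neck": (100, 230),
          "left_shoulder": (60, 250), "right_shoulder": (140, 250),
          "left_eye": (90, 170), "right_eye": (110, 170)}

print(neck_face_height_diffs(points))      # (60, 60, 30)
print(shoulder_face_height_diffs(points))  # (80, 80, 50, 50)
```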
In this embodiment, step S3 specifically includes:
s31, comparing the current head inclination angle of the user with a standard head angle threshold value, and judging whether the head inclination is abnormal or not;
s32, comparing the current shoulder inclination angle of the user with a standard shoulder inclination angle threshold value, and judging whether the shoulder inclination is abnormal or not;
s33, calculating the ratio of the height difference between the current neck and the face of the user and the height difference between the standard neck and the face of the user, and comparing the first ratio with a standard over-near-with-eyes difference ratio threshold value to judge whether the eyes of the user are too near;
s34, calculating the ratio of the height difference between the current shoulder and the face of the user and the height difference between the standard shoulder and the face of the user, using the ratio as a second ratio, comparing the second ratio with a standard table-lying difference ratio threshold value, and judging whether the user lies down on the table.
In a specific embodiment, step S31 is:
and comparing the calculated left head inclination angle E1 and right head inclination angle E2 of the user with a standard head inclination angle threshold A respectively, if E1 is less than A and E2 is less than A, the head inclination is not abnormal, and otherwise, the head inclination is abnormal.
In a specific embodiment, step S32 is:
and comparing the calculated left shoulder inclination angle F1 and right shoulder inclination angle F2 of the user with a standard shoulder inclination angle threshold B respectively, if F1 is less than B and F2 is less than B, the shoulder inclination is not abnormal, and otherwise, the shoulder inclination is abnormal.
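The comparisons of steps S31 and S32 translate directly to code; the numeric thresholds in the example are placeholders, since A and B come from the training step S0:

```python
def head_tilt_abnormal(e1, e2, threshold_a):
    # Step S31: head tilt is normal only when both E1 < A and E2 < A.
    return not (e1 < threshold_a and e2 < threshold_a)

def shoulder_tilt_abnormal(f1, f2, threshold_b):
    # Step S32: shoulder tilt is normal only when both F1 < B and F2 < B.
    return not (f1 < threshold_b and f2 < threshold_b)

print(head_tilt_abnormal(3.0, 5.0, 10.0))     # False: both below A
print(head_tilt_abnormal(3.0, 15.0, 10.0))    # True: right tilt too large
print(shoulder_tilt_abnormal(2.0, 1.0, 8.0))  # False
```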
In a specific embodiment, step S33 is:
Calculating the ratio of the current neck-to-face height difference of the user to the standard neck-to-face height difference as a first ratio G. The current neck-to-face height difference includes the current neck-to-left-eye height difference (y1 − y15), the current neck-to-right-eye height difference (y1 − y16) and the current neck-to-mouth height difference (y1 − y0); the standard neck-to-face height difference includes the standard neck-to-left-eye height difference (y1′ − y15′), the standard neck-to-right-eye height difference (y1′ − y16′) and the standard neck-to-mouth height difference (y1′ − y0′). The first ratio G is calculated as follows:
And comparing the first ratio G with the standard eyes-too-close difference ratio threshold C: if G is smaller than C, the user's eye distance is normal; otherwise, the user's eyes are too close.
In a specific embodiment, step S34 is:
Calculating the ratio of the current shoulder-to-face height difference of the user to the standard shoulder-to-face height difference as a second ratio H. The current shoulder-to-face height difference includes the current left-shoulder-to-left-eye height difference (y2 − y15), the current right-shoulder-to-right-eye height difference (y5 − y16), the current left-shoulder-to-mouth height difference (y2 − y0) and the current right-shoulder-to-mouth height difference (y5 − y0); the standard shoulder-to-face height difference includes the standard left-shoulder-to-left-eye height difference (y2′ − y15′), the standard right-shoulder-to-right-eye height difference (y5′ − y16′), the standard left-shoulder-to-mouth height difference (y2′ − y0′) and the standard right-shoulder-to-mouth height difference (y5′ − y0′). The second ratio H is calculated as follows:
And comparing the second ratio H with the standard lying-on-table difference ratio threshold D: if H is smaller than D, the user is not lying on the table; otherwise, the user is lying on the table.
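The patent's formula images for G and H are likewise not reproduced in this text, so the sketch below assumes each ratio combines the element-wise ratios of the current differences to the standard ones by averaging; only the final comparisons (G < C, H < D) are stated explicitly in the source:

```python
def combined_ratio(current_diffs, standard_diffs):
    # Assumed combination: average of the element-wise ratios of current
    # differences over standard differences (the patent's exact formula
    # for G and H is not reproduced in the extracted text).
    ratios = [c / s for c, s in zip(current_diffs, standard_diffs)]
    return sum(ratios) / len(ratios)

def eyes_too_close(current, standard, threshold_c):
    # Step S33: the user's eye distance is normal when G < C.
    return combined_ratio(current, standard) >= threshold_c

def lying_on_table(current, standard, threshold_d):
    # Step S34: the user is not lying on the table when H < D.
    return combined_ratio(current, standard) >= threshold_d

print(eyes_too_close((60, 60, 30), (60, 60, 30), 1.2))              # False
print(lying_on_table((120, 120, 100, 100), (80, 80, 50, 50), 1.2))  # True
```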
In this embodiment, the method further includes the steps of:
s4, when the current sitting posture is judged to be abnormal, reminding information is sent out in real time, reminding voice is broadcasted through the intelligent sound box or reminding information is broadcasted in real time through the mobile phone APP.
According to the real-time intelligent sitting posture judging method provided by the invention, the current sitting posture image of the user is collected in real time, the human body feature key point data of the user are identified, the current sitting posture data are calculated from the key point data, and the current sitting posture data are compared with the standard sitting posture data to judge whether the current sitting posture is abnormal. This solves the technical problem in the prior art that the accuracy of sitting posture judgment is low because only the face deviation angle is used, achieves high accuracy in judging sitting postures, and is applicable to judging a variety of abnormal sitting postures.
Example two
Fig. 3 is a schematic diagram of a sitting posture real-time intelligent distinguishing system according to a second embodiment of the present invention, and referring to fig. 3, a sitting posture real-time intelligent distinguishing system includes:
the acquisition and identification module is used for acquiring the current sitting posture image of the user in real time, identifying key point data of human body characteristics of the user, and if the key point data of the human body characteristics of the user cannot be identified, considering that the sitting posture is abnormal;
wherein, the human body characteristic key point data comprise eye coordinates, mouth coordinates, neck coordinates, shoulder coordinates and the like;
the computing module is used for computing current sitting posture data according to the human body characteristic key point data if the human body characteristic key point data of the user are identified;
the current sitting posture data comprise a current head inclination angle, a current shoulder inclination angle, a current height difference value between the neck and the face and a current height difference value between the shoulder and the face;
the comparison and judgment module is used for comparing the current sitting posture data with the standard sitting posture data and judging whether the current sitting posture is abnormal or not.
In this embodiment, the discriminating system further includes:
the learning acquisition module is used for inputting standard sitting posture images and performing big data training through a supervised learning classification algorithm of machine learning to obtain standard sitting posture data; the standard sitting posture data comprise a standard head inclination angle threshold, a standard shoulder inclination angle threshold, a standard eyes-too-close difference ratio threshold and a standard lying-on-table difference ratio threshold.
In this embodiment, the discriminating system further includes:
and the reminding module is used for sending out reminding information in real time when judging that the current sitting posture is abnormal.
The real-time intelligent sitting posture distinguishing system provided by the second embodiment of the invention is used for executing the real-time intelligent sitting posture distinguishing method of the first embodiment; its working principle and beneficial effects correspond one-to-one with those of the method and are therefore not repeated here.
Example III
The invention also provides a real-time intelligent sitting posture judging device, which comprises: at least one processor, and a memory communicatively connected to the at least one processor; the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the real-time intelligent sitting posture judging method according to the first embodiment.
Example IV
The present invention also provides a computer-readable storage medium storing computer-executable instructions for causing a computer to execute the sitting posture real-time intelligent determination method according to the first embodiment.
The invention is suitable for intelligently judging children's sitting postures in real time, helping parents standardize their children's sitting posture during learning and safeguard their healthy growth.
While the preferred embodiment of the present invention has been described in detail, the invention is not limited to the embodiment, and various equivalent modifications and substitutions can be made by one skilled in the art without departing from the spirit of the invention, and these equivalent modifications and substitutions are intended to be included in the scope of the present invention as defined in the appended claims.

Claims (6)

1. The real-time intelligent sitting posture distinguishing method is characterized by comprising the following steps of:
inputting standard sitting posture images and performing big data training through a supervised learning classification algorithm of machine learning to obtain standard sitting posture data, wherein the standard sitting posture data comprise a standard head inclination angle threshold, a standard shoulder inclination angle threshold, a standard eyes-too-close difference ratio threshold and a standard lying-on-table difference ratio threshold;
acquiring a current sitting posture image of a user in real time, identifying key point data of human body characteristics of the user, and if the key point data of the human body characteristics of the user cannot be identified, considering that the sitting posture is abnormal;
the human body characteristic key point data comprise eye coordinates, mouth coordinates, neck coordinates and shoulder coordinates;
if the human body characteristic key point data of the user are identified, calculating current sitting posture data according to the human body characteristic key point data, wherein this step comprises: calculating the current head inclination angle of the user according to the mouth coordinates and the neck coordinates; calculating the current shoulder inclination angle of the user according to the shoulder coordinates, the shoulder coordinates comprising left shoulder coordinates and right shoulder coordinates; calculating the current neck-to-face height difference of the user according to the eye coordinates, the mouth coordinates and the neck coordinates; and calculating the current shoulder-to-face height difference of the user according to the eye coordinates, the mouth coordinates and the shoulder coordinates;
the current sitting posture data comprises the current head inclination angle, the current shoulder inclination angle, the current neck-to-face height difference, and the current shoulder-to-face height difference;
comparing the current sitting posture data with the standard sitting posture data to judge whether the current sitting posture is abnormal, wherein this comparison step comprises:
comparing the current left head inclination angle and the current right head inclination angle of the user respectively with the standard head inclination angle threshold, and judging that there is no head-tilt abnormality when both are smaller than the standard head inclination angle threshold;
comparing the current left shoulder inclination angle and the current right shoulder inclination angle of the user respectively with the standard shoulder inclination angle threshold, and judging that there is no shoulder-tilt abnormality when both are smaller than the standard shoulder inclination angle threshold;
calculating the ratio of the current neck-to-face height difference of the user to the standard neck-to-face height difference as a first ratio, comparing the first ratio with the standard eyes-too-close difference ratio threshold, and judging whether the user's eyes are too close to the desk, wherein the first ratio is calculated as R1 = (y_neck − (y_leye + y_reye + y_mouth)/3) / (Y_neck − (Y_leye + Y_reye + Y_mouth)/3), where y_mouth is the ordinate of the current mouth, y_neck is the ordinate of the current neck, y_leye is the ordinate of the current left eye, y_reye is the ordinate of the current right eye, and Y_mouth, Y_neck, Y_leye and Y_reye are the corresponding standard ordinates;
calculating the ratio of the current shoulder-to-face height difference of the user to the standard shoulder-to-face height difference as a second ratio, comparing the second ratio with the standard lying-on-the-desk difference ratio threshold, and judging whether the user is lying on the desk, wherein the second ratio is calculated as R2 = ((y_lsh + y_rsh)/2 − (y_leye + y_reye + y_mouth)/3) / ((Y_lsh + Y_rsh)/2 − (Y_leye + Y_reye + Y_mouth)/3), where y_lsh is the ordinate of the current left shoulder, y_rsh is the ordinate of the current right shoulder, and Y_lsh and Y_rsh are the corresponding standard ordinates.
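The geometric quantities in claim 1 can be sketched as below. This is an illustrative reconstruction, not the patented implementation: the face ordinate is assumed to be the mean of the two eye ordinates and the mouth ordinate, the shoulder ordinate the mean of the two shoulder ordinates, and all function and key names are hypothetical.

```python
import math

def head_tilt_deg(mouth, neck):
    """Deviation of the mouth-neck segment from vertical, in degrees."""
    dx = abs(mouth[0] - neck[0])
    dy = abs(mouth[1] - neck[1])
    return math.degrees(math.atan2(dx, dy))

def shoulder_tilt_deg(left_shoulder, right_shoulder):
    """Deviation of the shoulder line from horizontal, in degrees."""
    dx = abs(right_shoulder[0] - left_shoulder[0])
    dy = abs(right_shoulder[1] - left_shoulder[1])
    return math.degrees(math.atan2(dy, dx))

def face_y(kp):
    """Assumed face ordinate: mean of the eye and mouth ordinates."""
    return (kp["l_eye"][1] + kp["r_eye"][1] + kp["mouth"][1]) / 3.0

def neck_face_ratio(cur, std):
    """First ratio R1: current neck-to-face height difference over standard."""
    return (cur["neck"][1] - face_y(cur)) / (std["neck"][1] - face_y(std))

def shoulder_face_ratio(cur, std):
    """Second ratio R2: current shoulder-to-face height difference over standard."""
    def diff(kp):
        shoulder_y = (kp["l_shoulder"][1] + kp["r_shoulder"][1]) / 2.0
        return shoulder_y - face_y(kp)
    return diff(cur) / diff(std)
```

Both ratios equal 1.0 when the current keypoints match the standard posture; a markedly smaller value indicates the face has dropped toward the desk.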
2. The real-time intelligent sitting posture discrimination method according to claim 1, further comprising: sending out reminding information in real time when the current sitting posture is judged to be abnormal.
3. A real-time intelligent sitting posture discrimination system, characterized in that the system comprises:
a learning acquisition module, used for inputting standard sitting posture images, performing big-data training through a supervised-learning classification algorithm of machine learning, and obtaining standard sitting posture data, wherein the standard sitting posture data comprises a standard head inclination angle threshold, a standard shoulder inclination angle threshold, a standard eyes-too-close difference ratio threshold, and a standard lying-on-the-desk difference ratio threshold;
an acquisition and identification module, used for acquiring the current sitting posture image of the user in real time and identifying the key point data of the user's human body features, wherein if the human body characteristic key point data of the user cannot be identified, the sitting posture is considered abnormal;
the human body characteristic key point data comprise eye coordinates, mouth coordinates, neck coordinates and shoulder coordinates;
a calculation module, used for calculating current sitting posture data according to the human body characteristic key point data if the human body characteristic key point data of the user are identified, wherein this step comprises: calculating the current head inclination angle of the user according to the mouth coordinates and the neck coordinates; calculating the current shoulder inclination angle of the user according to the shoulder coordinates, the shoulder coordinates comprising left shoulder coordinates and right shoulder coordinates; calculating the current neck-to-face height difference of the user according to the eye coordinates, the mouth coordinates and the neck coordinates; and calculating the current shoulder-to-face height difference of the user according to the eye coordinates, the mouth coordinates and the shoulder coordinates;
the current sitting posture data comprises the current head inclination angle, the current shoulder inclination angle, the current neck-to-face height difference, and the current shoulder-to-face height difference;
a comparison judging module, used for comparing the current sitting posture data with the standard sitting posture data to judge whether the current sitting posture is abnormal, wherein this comparison step comprises:
comparing the current left head inclination angle and the current right head inclination angle of the user respectively with the standard head inclination angle threshold, and judging that there is no head-tilt abnormality when both are smaller than the standard head inclination angle threshold;
comparing the current left shoulder inclination angle and the current right shoulder inclination angle of the user respectively with the standard shoulder inclination angle threshold, and judging that there is no shoulder-tilt abnormality when both are smaller than the standard shoulder inclination angle threshold;
calculating the ratio of the current neck-to-face height difference of the user to the standard neck-to-face height difference as a first ratio, comparing the first ratio with the standard eyes-too-close difference ratio threshold, and judging whether the user's eyes are too close to the desk, wherein the first ratio is calculated as R1 = (y_neck − (y_leye + y_reye + y_mouth)/3) / (Y_neck − (Y_leye + Y_reye + Y_mouth)/3), where y_mouth is the ordinate of the current mouth, y_neck is the ordinate of the current neck, y_leye is the ordinate of the current left eye, y_reye is the ordinate of the current right eye, and Y_mouth, Y_neck, Y_leye and Y_reye are the corresponding standard ordinates;
calculating the ratio of the current shoulder-to-face height difference of the user to the standard shoulder-to-face height difference as a second ratio, comparing the second ratio with the standard lying-on-the-desk difference ratio threshold, and judging whether the user is lying on the desk, wherein the second ratio is calculated as R2 = ((y_lsh + y_rsh)/2 − (y_leye + y_reye + y_mouth)/3) / ((Y_lsh + Y_rsh)/2 − (Y_leye + Y_reye + Y_mouth)/3), where y_lsh is the ordinate of the current left shoulder, y_rsh is the ordinate of the current right shoulder, and Y_lsh and Y_rsh are the corresponding standard ordinates.
4. The real-time intelligent sitting posture discrimination system according to claim 3, wherein the system further comprises:
a reminding module, used for sending out reminding information in real time when the current sitting posture is judged to be abnormal.
5. A real-time intelligent sitting posture discrimination device, characterized by comprising:
at least one processor, and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the real-time intelligent sitting posture discrimination method according to claim 1 or 2.
6. A computer-readable storage medium storing computer-executable instructions for causing a computer to perform the real-time intelligent sitting posture discrimination method according to claim 1 or 2.
CN201910006352.0A 2019-01-04 2019-01-04 Real-time intelligent sitting posture distinguishing method, system, equipment and storage medium Active CN111414780B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910006352.0A CN111414780B (en) 2019-01-04 2019-01-04 Real-time intelligent sitting posture distinguishing method, system, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910006352.0A CN111414780B (en) 2019-01-04 2019-01-04 Real-time intelligent sitting posture distinguishing method, system, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111414780A CN111414780A (en) 2020-07-14
CN111414780B true CN111414780B (en) 2023-08-01

Family

ID=71492572

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910006352.0A Active CN111414780B (en) 2019-01-04 2019-01-04 Real-time intelligent sitting posture distinguishing method, system, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111414780B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111931640B (en) * 2020-08-07 2022-06-10 上海商汤临港智能科技有限公司 Abnormal sitting posture identification method and device, electronic equipment and storage medium
CN112364694B (en) * 2020-10-13 2023-04-18 宁波大学 Human body sitting posture identification method based on key point detection
CN112287795B (en) * 2020-10-22 2023-09-01 北京百度网讯科技有限公司 Abnormal driving gesture detection method, device, equipment, vehicle and medium
CN112617815B (en) * 2020-12-17 2023-05-09 深圳数联天下智能科技有限公司 Sitting posture assessment method, device, computer equipment and storage medium
CN112712053B (en) * 2021-01-14 2024-05-28 深圳数联天下智能科技有限公司 Sitting posture information generation method and device, terminal equipment and storage medium
CN113052097A (en) * 2021-03-31 2021-06-29 开放智能机器(上海)有限公司 Human body sitting posture real-time monitoring system and monitoring method
CN113554609B (en) * 2021-07-19 2022-07-08 同济大学 Neck dystonia identification system based on vision
CN113657271B (en) * 2021-08-17 2023-10-03 上海科技大学 Sitting posture detection method and system combining quantifiable factors and unquantifiable factor judgment
CN113780220A (en) * 2021-09-17 2021-12-10 东胜神州旅游管理有限公司 Child sitting posture detection method and system based on child face recognition
CN114038016A (en) * 2021-11-16 2022-02-11 平安普惠企业管理有限公司 Sitting posture detection method, device, equipment and storage medium
CN115035547A (en) * 2022-05-31 2022-09-09 中国科学院半导体研究所 Sitting posture detection method, device, equipment and computer storage medium
CN115909394B (en) * 2022-10-25 2024-04-05 珠海视熙科技有限公司 Sitting posture identification method and device, intelligent table lamp and computer storage medium
CN116884083B (en) * 2023-06-21 2024-05-28 圣奥科技股份有限公司 Sitting posture detection method, medium and equipment based on key points of human body

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102096801A (en) * 2009-12-14 2011-06-15 北京中星微电子有限公司 Sitting posture detecting method and device
CN103488980A (en) * 2013-10-10 2014-01-01 广东小天才科技有限公司 Camera-based sitting posture judgment method and device
CN104850820B (en) * 2014-02-19 2019-05-31 腾讯科技(深圳)有限公司 A kind of recognition algorithms and device
CN107153829A (en) * 2017-06-09 2017-09-12 南昌大学 Incorrect sitting-pose based reminding method and device based on depth image
CN107392146A (en) * 2017-07-20 2017-11-24 湖南科乐坊教育科技股份有限公司 A kind of child sitting gesture detection method and device
CN107491751B (en) * 2017-08-14 2020-06-09 成都伞森科技有限公司 Sitting posture analysis method and device

Also Published As

Publication number Publication date
CN111414780A (en) 2020-07-14

Similar Documents

Publication Publication Date Title
CN111414780B (en) Real-time intelligent sitting posture distinguishing method, system, equipment and storage medium
US10068128B2 (en) Face key point positioning method and terminal
CN110147744B (en) Face image quality assessment method, device and terminal
CN105718869B (en) The method and apparatus of face face value in a kind of assessment picture
CN103210421B (en) Article detection device and object detecting method
CN105474263B (en) System and method for generating three-dimensional face model
CN103577815B (en) A kind of face alignment method and system
CN105608448B (en) A kind of LBP feature extracting method and device based on face's key point
CN105740779B (en) Method and device for detecting living human face
CN106068514A (en) For identifying the system and method for face in free media
CN108629306A (en) Human posture recognition method and device, electronic equipment, storage medium
CN104573634A (en) Three-dimensional face recognition method
CN103902958A (en) Method for face recognition
KR20170006355A (en) Method of motion vector and feature vector based fake face detection and apparatus for the same
CN112541422A (en) Expression recognition method and device with robust illumination and head posture and storage medium
CN109274883A (en) Posture antidote, device, terminal and storage medium
CN112712053A (en) Sitting posture information generation method and device, terminal equipment and storage medium
CN109117753A (en) Position recognition methods, device, terminal and storage medium
CN111046825A (en) Human body posture recognition method, device and system and computer readable storage medium
CN112101124A (en) Sitting posture detection method and device
CN109344706A (en) It is a kind of can one man operation human body specific positions photo acquisition methods
CN110148092A (en) The analysis method of teenager's sitting posture based on machine vision and emotional state
CN113947742A (en) Person trajectory tracking method and device based on face recognition
CN111898571A (en) Action recognition system and method
CN112069863A (en) Face feature validity determination method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 518000 w601, Shenzhen Hong Kong industry university research base, 015 Gaoxin South 7th Road, high tech Zone community, Yuehai street, Nanshan District, Shenzhen City, Guangdong Province

Applicant after: ASPIRE TECHNOLOGIES (SHENZHEN) LTD.

Address before: 518000 south wing, 6th floor, west block, Shenzhen Hong Kong industry university research base building, South District, high tech Industrial Park, Nanshan District, Shenzhen City, Guangdong Province

Applicant before: ASPIRE TECHNOLOGIES (SHENZHEN) LTD.

GR01 Patent grant