CN115844381A - Human body action recognition method and device, electronic equipment and storage medium - Google Patents

Human body action recognition method and device, electronic equipment and storage medium

Info

Publication number
CN115844381A
Authority
CN
China
Prior art keywords
human body
information
determining
limb
position information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211550077.7A
Other languages
Chinese (zh)
Inventor
吴忠裕
王成
李春光
吕高峰
高亚娟
姜必荣
王珏
储琦玮
许国军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Luxshare Precision Technology Nanjing Co Ltd
Original Assignee
Luxshare Precision Technology Nanjing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Luxshare Precision Technology Nanjing Co Ltd filed Critical Luxshare Precision Technology Nanjing Co Ltd
Priority to CN202211550077.7A priority Critical patent/CN115844381A/en
Publication of CN115844381A publication Critical patent/CN115844381A/en
Pending legal-status Critical Current

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The application discloses a human body action recognition method and device, an electronic device, and a storage medium. The method includes: acquiring spatial position information of at least two locators arranged on a human body, and acquiring limb overturning information of at least one inertial sensor arranged on the human body, where different locators are arranged at different positions on the human body and different inertial sensors are arranged at different positions on the human body; determining spatial pose data of the human body according to the at least two pieces of spatial position information and the limb overturning information; and performing action recognition on the human body according to the spatial pose data. With this method, the human body posture can be determined more directly, efficiently, and accurately, the calculation error is reduced, and the calculation efficiency is improved.

Description

Human body action recognition method and device, electronic equipment and storage medium
Technical Field
The present application relates to the technical field of human motion recognition, and in particular, to a human body motion recognition method and apparatus, an electronic device, and a storage medium.
Background
With the rapid development of science and technology, more and more industries use technological means to assist their work. For example, the film industry increasingly uses CG (computer-generated animation) in production, and virtual anchors have become popular in the live-streaming industry. These applications recognize the body actions of a user through human motion recognition, and the visual effect of simulating human motion is then achieved through post-processing and simulation.
At present, recognition and capture of human motion is mainly realized through machine vision and image processing: a camera captures target points arranged on the human body to record the human motion. However, machine-vision methods depend heavily on the site environment and require a fairly clean background (for example, a green screen), so their universality is poor and they bring extra workload to the staff. In addition, the computation amount of image-processing methods is large, erroneous motion capture occurs easily, and the recognition efficiency is poor.
Disclosure of Invention
The application provides a human body action recognition method and apparatus, an electronic device, and a storage medium, so as to improve the universality, accuracy, and recognition efficiency of human body action recognition.
According to an aspect of the present application, there is provided a human motion recognition method, the method including:
acquiring spatial position information of at least two positioners arranged on a human body, and acquiring limb overturning information of at least one inertial sensor arranged on the human body; wherein, different localizers are arranged at different positions on the human body, and different inertial sensors are arranged at different positions on the human body;
determining the space pose data of the human body according to the at least two pieces of space position information and the limb overturning information;
and according to the spatial pose data, performing action recognition on the human body.
According to another aspect of the present application, there is provided a human body motion recognition apparatus including:
the information acquisition module is used for acquiring the spatial position information of at least two locators arranged on the human body and acquiring the limb overturning information of at least one inertial sensor arranged on the human body; wherein, different localizers are arranged at different positions on the human body, and different inertial sensors are arranged at different positions on the human body;
the pose determining module is used for determining the space pose data of the human body according to the at least two pieces of space position information and the limb overturning information;
and the action recognition module is used for recognizing the actions of the human body according to the spatial pose data.
According to another aspect of the present application, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor, the computer program being executable by the at least one processor to enable the at least one processor to perform the human motion recognition method according to any of the embodiments of the present application.
According to another aspect of the present application, there is provided a computer-readable storage medium storing computer instructions for causing a processor to implement the human body motion recognition method according to any one of the embodiments of the present application when the computer instructions are executed.
According to the technical solution of the embodiments of the present application, the spatial pose data of different parts of the human body are determined through a plurality of locators and a plurality of inertial sensors arranged at different positions on the human body, and the motion of the human body is recognized accordingly. In this way, the pose of each limb can be determined directly from the information acquired by the locator and the inertial sensor arranged on that limb, which is more direct and efficient, and the recognition can be accurate down to each part of the body. This overcomes the very large computation amount required in the prior art, in which the human body pose is computed from inertial sensors and a single locator, thereby reducing the computation error and improving the computation efficiency.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present application, nor do they limit the scope of the present application. Other features of the present application will become apparent from the following description.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained by those skilled in the art based on these drawings without creative efforts.
Fig. 1 is a flowchart of a human body motion recognition method according to an embodiment of the present application;
FIG. 2A is a schematic diagram of an arrangement of a locator, a fixed positioning base station and an inertial sensor according to the second embodiment of the present application;
FIG. 2B is a schematic diagram of arm pose determination according to the second embodiment of the present application;
FIG. 2C is a schematic diagram of another arm pose determination according to the second embodiment of the present application;
fig. 2D is a schematic flowchart of human motion recognition according to the second embodiment of the present application;
fig. 3 is a schematic structural diagram of a human body motion recognition device according to a third embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device implementing the human body motion recognition method according to the embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some of the embodiments of the present application, rather than all of them. All other embodiments obtained by a person skilled in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Moreover, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example one
Fig. 1 is a flowchart of a human body motion recognition method provided in an embodiment of the present application. This embodiment is applicable to recognizing the motion of a human body. The method may be executed by a human body motion recognition apparatus, which may be implemented in the form of hardware and/or software and may be configured in an electronic device. As shown in fig. 1, the method includes:
s110, acquiring spatial position information of at least two locators arranged on a human body, and acquiring limb overturning information of at least one inertial sensor arranged on the human body; wherein, different locators are arranged at different positions on the human body, and different inertial sensors are arranged at different positions on the human body.
The locator may be a device or an instrument for determining its own position in a set space. The set space may be a preset space in which the motion of the human body is recognized. Correspondingly, the spatial position information may be position data acquired by the locator in the set space, and it differs depending on the selected reference frame: if the coordinate data of the locator detected by a navigation satellite is used as the spatial position information, the corresponding reference frame is the Earth itself; if a reference coordinate system in the set space is selected, the corresponding spatial position information should be determined according to that reference coordinate system.
In an alternative embodiment, the acquiring spatial position information of at least two locators arranged on the human body may include: determining the spatial position information of the at least two locators on the human body according to the signal transmission time difference between the at least two locators arranged on the human body and a fixed positioning base station arranged outside the human body.
In practice, UWB (Ultra Wide Band) positioning may be used to determine the spatial position information. UWB positioning requires locators and fixed positioning base stations: the locators are arranged on the human body, and the fixed positioning base stations are arranged in the set space. Optionally, the number of fixed positioning base stations is at least four, and the spatial position information of the at least two locators arranged on the body is determined on the basis of these base stations. It can be understood that each locator communicates with the fixed positioning base stations, and the signal transmission time difference of each communication is obtained. Since the transmission speed of the signal is known (the speed of light C), when a UWB locator sends a signal to a UWB fixed positioning base station and the time difference (duration) until the base station receives it is T, the distance between them is S = C × T. Because there are at least four fixed positioning base stations whose positions are known, with the set space in which all the base stations are located serving as the reference frame, the distances from a locator arranged on the human body to all the positioning base stations can be calculated in this way, and the position of that locator in the set space can then be determined from these distances according to the UWB positioning principle. The specific size of the set space is not limited herein.
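As an illustrative sketch only (not part of the original disclosure), the distance and position calculation described above can be expressed as follows; the function names and the least-squares formulation are assumptions introduced here for illustration.

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # the constant C, in m/s


def tof_distance(time_diff_s: float) -> float:
    """Distance from a UWB locator to a base station: S = C * T."""
    return SPEED_OF_LIGHT * time_diff_s


def locate(base_stations: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """Estimate a locator's 3D position from at least four base stations
    with known positions.  Subtracting the first range equation from the
    others linearizes the problem, which is then solved by least squares."""
    p0, d0 = base_stations[0], distances[0]
    A = 2.0 * (base_stations[1:] - p0)
    b = (d0**2 - distances[1:]**2
         + np.sum(base_stations[1:]**2, axis=1) - np.sum(p0**2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position


# Example: four base stations at different heights around a 5 m x 5 m space
# (coplanar anchors would leave the height ambiguous).
stations = np.array([[0.0, 0.0, 3.0], [5.0, 0.0, 2.5],
                     [0.0, 5.0, 2.8], [5.0, 5.0, 0.5]])
truth = np.array([2.0, 1.5, 1.0])
ranges = np.linalg.norm(stations - truth, axis=1)  # stands in for C * T
print(locate(stations, ranges))  # approximately [2.0, 1.5, 1.0]
```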
In an alternative embodiment, different positioners may be provided at the torso position, upper arm position, lower arm position, thigh position, lower leg position, and foot of the human body, respectively.
For example, different locators may be arranged on the upper torso, lower torso, left upper arm, right upper arm, left lower arm, right lower arm, left thigh, right thigh, left calf, right calf, left foot, and right foot of the human body, respectively. Since the locators are arranged at different limb positions of the human body in advance, each body part can be positioned independently by means of the fixed positioning base stations and the locators. This avoids the negative effects, such as errors, caused in the prior art by positioning the whole human body in space through a single locator only, provides an accurate positional basis for human motion recognition, and thereby helps improve the accuracy of human motion recognition.
In an alternative embodiment, the acquiring of the limb overturning information of the at least one inertial sensor disposed on the human body may include: determining the limb overturning information according to the acceleration information and the angular velocity information of the inertial sensor.
The inertial sensor is used to measure the attitude angle (or angular velocity) and acceleration of the human body; for example, an Inertial Measurement Unit (IMU) may be used. The rotation of the limbs of the human body during movement is recorded by the inertial sensors arranged on the human body. Furthermore, a locator and an inertial sensor are correspondingly arranged at the same position of the human body: the locators can be arranged on the trunk and limbs, and likewise the inertial sensors can be arranged together with the locators, for example adjacently or stacked, at the torso position, upper arm position, lower arm position, thigh position, lower leg position, foot, and the like. It can be understood that the inertial sensors record different limb overturning information for different parts of the human body when the limbs move, so that even if the movement of the human body is very complicated, the rotation of each important part of the body can be recorded. This facilitates accurate recognition of human body motion, down to the specific limb parts involved.
S120, determining the spatial pose data of the human body according to the at least two pieces of spatial position information and the limb overturning information.
It can be understood that, on the basis of the information recorded by the locators and inertial sensors arranged at the respective parts of the human body, the spatial position information and the limb overturning information are integrated and used as the spatial pose data of the human body in the set space.
In an alternative embodiment, the determining the spatial pose data of the human body according to the at least two pieces of spatial position information and the limb overturning information may include: determining structural framework information of the human body according to the at least two pieces of spatial position information; and determining space pose data according to the limb overturning information and the structural frame information.
The structural framework information can be used to quantitatively represent the limb structure of the human body. It will be appreciated that each part of the body is provided with a corresponding locator and inertial sensor, i.e. the position and attitude (rotation) of every part of the body in the set space are recorded. The framework of the human body structure can be determined from the positions of all parts of the human body in the set space, and the pose of the human body in the set space can then be comprehensively determined from the poses of the individual body parts.
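As a minimal sketch (under the assumption of a simple per-part dictionary; the part names and classes below are illustrative, not taken from the original text), the structural frame and the limb overturning information might be merged into spatial pose data as follows.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

Vec3 = Tuple[float, float, float]


@dataclass
class PartPose:
    position: Vec3  # from the locator, in the set space
    rotation: Vec3  # limb overturning angles from the inertial sensor


def build_spatial_pose(frame: Dict[str, Vec3],
                       overturns: Dict[str, Vec3]) -> Dict[str, PartPose]:
    """Combine the structural frame (per-part positions) with the per-part
    overturning information into spatial pose data keyed by body part."""
    return {part: PartPose(frame[part], overturns[part]) for part in frame}


pose = build_spatial_pose(
    {"left_lower_arm": (2.0, 1.5, 1.2)},
    {"left_lower_arm": (0.0, 90.0, 0.0)},
)
print(pose["left_lower_arm"])
```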
S130, performing motion recognition on the human body according to the spatial pose data.
The spatial pose data record the position and posture of each part of the human body in the set space, so the motion of the human body can be recognized from the spatial pose data. Any recognition method in the related art can be adopted to recognize the motion of the human body from the spatial pose data, which is not limited in this embodiment of the present application. In addition, not only static motions can be recognized: spatial pose data that change over time can be determined from the spatial position information and the limb overturning information acquired in real time, so that the motion track of the human body over a period of time can be recognized and dynamic motions can be identified.
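For example, one very simple rule-based recognizer over a time series of such pose data might look like the sketch below. It reuses the PartPose structure sketched above, and the part key and threshold are assumptions, since the embodiment does not prescribe a particular recognition algorithm.

```python
from typing import Dict, List


def detect_arm_raise(frames: List[Dict[str, PartPose]],
                     part: str = "left_lower_arm",
                     rise_m: float = 0.3) -> bool:
    """Flag a dynamic 'arm raise' when the tracked part gains at least
    rise_m metres of height between the first and last frame of the window."""
    if len(frames) < 2:
        return False
    z_start = frames[0][part].position[2]
    z_end = frames[-1][part].position[2]
    return (z_end - z_start) >= rise_m
```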
According to the technical solution of the embodiments of the present application, the spatial pose data of different parts of the human body are determined through a plurality of locators and a plurality of inertial sensors arranged at different positions on the human body, and the motion of the human body is recognized accordingly. In this way, the pose of each limb can be determined directly from the information acquired by the locator and the inertial sensor arranged on that limb, which is more direct and efficient, and the recognition can be accurate down to each part of the body. This overcomes the very large computation amount required in the prior art, in which the human body pose is computed from inertial sensors and a single locator, thereby reducing the computation error and improving the computation efficiency.
Example two
On the basis of the foregoing embodiments, the present application also provides a specific preferred embodiment. The method can comprise the following steps:
Four UWB base stations (corresponding to the fixed positioning base stations in the foregoing embodiment) are arranged in advance in the space where human body recognition is required, and UWB positioning modules (corresponding to the locators in the foregoing embodiment) are arranged at positions on the human body such as the upper torso, lower torso, left upper arm, right upper arm, left lower arm, right lower arm, left thigh, right thigh, left calf, right calf, left foot, and right foot. If the time difference between a signal being sent by a UWB positioning module and being received by a UWB base station is T, and the speed of light is C, the distance from the UWB positioning module to the UWB base station is S = C × T. Because four UWB base stations are fixedly arranged around the human body, the distances S1, S2, S3, and S4 from each UWB positioning module on the human body to the four UWB base stations can be calculated according to this principle, and since the positions of the fixed UWB base stations are known, the position of each UWB positioning module in space can be calculated from these four distances.
The position information of all the UWB positioning modules in the space is then processed by a preset algorithm to obtain the frame of the human body. The preset algorithm may be any human body frame determination method in the related art.
Meanwhile, an inertial sensor is disposed near each UWB positioning module, as shown in fig. 2A, for recording the posture of the corresponding body part. The acceleration information and the angular velocity information of the body part where the sensor is located can be detected by an acceleration sensor and an angular velocity sensor (gyroscope) in the inertial sensor.
For example, suppose the accelerations output along the X, Y, and Z axes of the acceleration sensor in the inertial sensor are A, B, and C, respectively, and the angles of the three axes to the ground are θ, ψ, and φ, respectively; then:
θ = arctan( A / √(B² + C²) )
ψ = arctan( B / √(A² + C²) )
φ = arctan( C / √(A² + B²) )
As shown in fig. 2B, assuming that the arm of the human body is parallel to the ground, A = 0, B = 0, and C = g (the gravitational acceleration), then θ = 0°, ψ = 0°, and φ = 90° can be calculated from the above formulas.
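A quick numeric check of this case, assuming the arctangent tilt formulas written above; the helper below is illustrative and not part of the original text.

```python
import math


def tilt_angles(A: float, B: float, C: float):
    """Angles of the X, Y and Z axes to the ground, in degrees."""
    theta = math.degrees(math.atan2(A, math.hypot(B, C)))
    psi = math.degrees(math.atan2(B, math.hypot(A, C)))
    phi = math.degrees(math.atan2(C, math.hypot(A, B)))
    return theta, psi, phi


g = 9.81  # gravitational acceleration
print(tilt_angles(0.0, 0.0, g))  # (0.0, 0.0, 90.0): arm parallel to the ground
```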
For another example, suppose the angular velocities output about the X, Y, and Z axes of the gyroscope in the inertial sensor are Ax, Bx, and Cx, respectively. The sampling rate of the gyroscope is f, the number of data points output by the gyroscope within time t is K = t × f, and the rotation angles of the gyroscope within time t are θ1, ψ1, and φ1; then:
θ1 = (Ax_1 + Ax_2 + … + Ax_K) ÷ f
ψ1 = (Bx_1 + Bx_2 + … + Bx_K) ÷ f
φ1 = (Cx_1 + Cx_2 + … + Cx_K) ÷ f
where Ax_i, Bx_i, and Cx_i denote the i-th sampled angular velocities.
Assume the sampling rate of the gyroscope is f = 1000 and the arm rotates counterclockwise about the Y-axis for 100 ms at an angular velocity of 900°/s. Within these 100 ms the gyroscope outputs 100 data points.
The rotation angles of the arm are then: θ1 = 0°, ψ1 = 900 × 100 ÷ 1000 = 90°, φ1 = 0°, as shown in fig. 2C.
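The same 90° result can be reproduced by accumulating the sampled angular velocities and dividing by the sampling rate, as in the short sketch below; the helper name is an assumption introduced for illustration.

```python
def integrate_rotation(samples_deg_per_s, f_hz):
    """Rotation angle in degrees: the sum of the K sampled angular
    velocities (deg/s) divided by the sampling rate f (Hz)."""
    return sum(samples_deg_per_s) / f_hz


# 100 ms of counterclockwise rotation about the Y axis at 900 deg/s,
# sampled at f = 1000 Hz, i.e. K = 100 data points.
psi1 = integrate_rotation([900.0] * 100, f_hz=1000)
print(psi1)  # 90.0
```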
As shown in fig. 2D, accurate motion capture of the human body can be achieved by obtaining the spatial position information from the UWB positioning modules and calculating the overturning state (posture) of each limb part through the inertial sensors.
Example three
Fig. 3 is a schematic structural diagram of a human body motion recognition device according to a third embodiment of the present application.
As shown in fig. 3, the apparatus 300 includes:
the information acquisition module 310 is configured to acquire spatial position information of at least two locators disposed on a human body and acquire limb overturning information of at least one inertial sensor disposed on the human body; wherein, different localizers are arranged at different positions on the human body, and different inertial sensors are arranged at different positions on the human body;
a pose determining module 320, configured to determine spatial pose data of the human body according to the at least two pieces of spatial position information and the limb turning information;
and the action recognition module 330 is configured to perform action recognition on the human body according to the spatial pose data.
According to the technical solution of the embodiments of the present application, the spatial pose data of different parts of the human body are determined through a plurality of locators and a plurality of inertial sensors arranged at different positions on the human body, and the motion of the human body is recognized accordingly. In this way, the pose of each limb can be determined directly from the information acquired by the locator and the inertial sensor arranged on that limb, which is more direct and efficient, and the recognition can be accurate down to each part of the body. This overcomes the very large computation amount required in the prior art, in which the human body pose is computed from inertial sensors and a single locator, thereby reducing the computation error and improving the computation efficiency.
In an alternative embodiment, the pose determination module 320 may include:
the human body frame determining unit is used for determining the structural frame information of the human body according to the at least two pieces of spatial position information;
the pose data determining unit is used for determining the spatial pose data according to the limb overturning information and the structural frame information.
In an alternative embodiment, different positioners may be provided at the torso position, upper arm position, lower arm position, thigh position, lower leg position, and foot of the human body, respectively.
In an optional implementation manner, the information obtaining module 310 may be specifically configured to:
determine the limb overturning information according to the acceleration information and the angular velocity information of the inertial sensor.
In an optional implementation manner, the information obtaining module may be further specifically configured to:
determine the spatial position information of the at least two locators on the human body according to the signal transmission time difference between the at least two locators arranged on the human body and a fixed positioning base station arranged outside the human body.
In an alternative embodiment, the number of fixed positioning base stations may be at least four.
In an alternative embodiment, the locator and the inertial sensor are correspondingly disposed at the same position on the human body.
The human body motion recognition device provided by the embodiment of the application can execute the human body motion recognition method provided by any embodiment of the application, and has the corresponding functional modules and beneficial effects of executing each human body motion recognition method.
Example four
FIG. 4 shows a schematic structural diagram of an electronic device 10 that may be used to implement embodiments of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the present application that are described and/or claimed herein.
As shown in fig. 4, the electronic device 10 includes at least one processor 11, and a memory communicatively connected to the at least one processor 11, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, and the like, wherein the memory stores a computer program executable by the at least one processor, and the processor 11 may perform various suitable actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from the storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data necessary for the operation of the electronic apparatus 10 can also be stored. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
A number of components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, or the like; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
Processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, or the like. The processor 11 performs the various methods and processes described above, such as a human motion recognition method.
In some embodiments, the human motion recognition method may be implemented as a computer program tangibly embodied in a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into the RAM 13 and executed by the processor 11, one or more steps of the human motion recognition method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the human action recognition method by any other suitable means (e.g. by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: being implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special-purpose or general-purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for implementing the methods of the present application may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be performed. A computer program can execute entirely on a machine, partly on a machine, as a stand-alone software package partly on a machine and partly on a remote machine or entirely on a remote machine or server.
In the context of this application, a computer readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system and overcomes the defects of high management difficulty and weak service expansibility in traditional physical hosts and VPS (Virtual Private Server) services.
It should be understood that the various forms of flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solution of the present application can be achieved; this is not limited herein.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A human body action recognition method is characterized by comprising the following steps:
acquiring spatial position information of at least two positioners arranged on a human body, and acquiring limb overturning information of at least one inertial sensor arranged on the human body; wherein different positioners are arranged at different positions on the human body, and different inertial sensors are arranged at different positions on the human body;
determining the space pose data of the human body according to at least two pieces of space position information and the limb overturning information;
and according to the space pose data, performing action recognition on the human body.
2. The method according to claim 1, wherein the determining the spatial pose data of the human body according to the at least two spatial position information and the limb overturning information comprises:
determining structural framework information of the human body according to at least two pieces of spatial position information;
and determining the space pose data according to the limb overturning information and the structural frame information.
3. The method according to claim 1 or 2, characterized in that different positioners are provided at the torso position, upper arm position, lower arm position, thigh position, lower leg position and foot of the person, respectively.
4. The method of claim 1, wherein the obtaining of the limb rollover information of at least one inertial sensor disposed on the human body comprises:
and determining the limb overturning information according to the acceleration information and the angular velocity information of the inertial sensor.
5. The method of claim 1, wherein the obtaining spatial position information of at least two locators disposed on a human body comprises:
and determining the spatial position information of the at least two locators on the human body according to the signal transmission time difference between the at least two locators arranged on the human body and a fixed positioning base station arranged outside the human body.
6. The method of claim 5, wherein the number of fixed positioning base stations is at least four.
7. The method of claim 1, wherein the locator and the inertial sensor are correspondingly located on the same location of the human body.
8. A human motion recognition device, comprising:
the information acquisition module is used for acquiring the spatial position information of at least two locators arranged on the human body and acquiring the limb overturning information of at least one inertial sensor arranged on the human body; wherein different positioners are arranged at different positions on the human body, and different inertial sensors are arranged at different positions on the human body;
the pose determining module is used for determining the space pose data of the human body according to at least two pieces of space position information and the limb overturning information;
and the motion recognition module is used for recognizing the motion of the human body according to the space pose data.
9. An electronic device, characterized in that the electronic device comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor, the computer program being executable by the at least one processor to enable the at least one processor to perform the human motion recognition method of any one of claims 1-7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores computer instructions for causing a processor to implement the human motion recognition method of any one of claims 1-7 when executed.
CN202211550077.7A 2022-12-05 2022-12-05 Human body action recognition method and device, electronic equipment and storage medium Pending CN115844381A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211550077.7A CN115844381A (en) 2022-12-05 2022-12-05 Human body action recognition method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211550077.7A CN115844381A (en) 2022-12-05 2022-12-05 Human body action recognition method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115844381A true CN115844381A (en) 2023-03-28

Family

ID=85669979

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211550077.7A Pending CN115844381A (en) 2022-12-05 2022-12-05 Human body action recognition method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115844381A (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110046915A1 (en) * 2007-05-15 2011-02-24 Xsens Holding B.V. Use of positioning aiding system for inertial motion capture
US20110025562A1 (en) * 2009-08-03 2011-02-03 Xsens Technologies, B.V. Tightly Coupled UWB/IMU Pose Estimation System and Method
CN104473618A (en) * 2014-11-27 2015-04-01 曦煌科技(北京)有限公司 Body data acquisition and feedback device and method for virtual reality
US20190056422A1 (en) * 2016-01-12 2019-02-21 Bigmotion Technologies Inc. Systems and methods for human body motion capture
US20180153444A1 (en) * 2016-12-05 2018-06-07 Intel Corporation Body movement tracking
EP3944813A1 (en) * 2020-07-28 2022-02-02 Enari GmbH Method and device for identifying and predicting movements of a body
CN112957033A (en) * 2021-02-01 2021-06-15 山东大学 Human body real-time indoor positioning and motion posture capturing method and system in man-machine cooperation
CN113074726A (en) * 2021-03-16 2021-07-06 深圳市慧鲤科技有限公司 Pose determination method and device, electronic equipment and storage medium
CN113268141A (en) * 2021-05-17 2021-08-17 西南大学 Motion capture method and device based on inertial sensor and fabric electronics
CN113499065A (en) * 2021-07-08 2021-10-15 山东蓓明医疗科技有限公司 Body motion capturing method based on inertial sensor and rehabilitation evaluation system
CN114545327A (en) * 2022-02-07 2022-05-27 四川中电昆辰科技有限公司 Motion state information and UWB fusion positioning method and positioning system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118016035A (en) * 2024-03-28 2024-05-10 深圳市戴乐体感科技有限公司 Drumstick positioning system, method, equipment and medium

Similar Documents

Publication Publication Date Title
US11733398B2 (en) Vehicle positioning method for determining position of vehicle through creating target function for factor graph model
JP6705972B2 (en) Attitude estimation device, attitude estimation method, control program, and recording medium
CN108871311B (en) Pose determination method and device
WO2016198009A1 (en) Heading checking method and apparatus
CN112819860B (en) Visual inertial system initialization method and device, medium and electronic equipment
JP2019078560A (en) Gyro sensor offset correcting device, offset correction program, and pedestrian autonomous navigation device
US10247558B2 (en) Travel direction determination apparatus, map matching apparatus, travel direction determination method, and computer readable medium
CN115844381A (en) Human body action recognition method and device, electronic equipment and storage medium
CN115273071A (en) Object identification method and device, electronic equipment and storage medium
KR20220100813A (en) Automatic driving vehicle registration method and device, electronic equipment and a vehicle
CN113610702B (en) Picture construction method and device, electronic equipment and storage medium
CN114972668A (en) Laser SLAM method and system based on height information
CN116348916A (en) Azimuth tracking for rolling shutter camera
CN111382701B (en) Motion capture method, motion capture device, electronic equipment and computer readable storage medium
US10197402B2 (en) Travel direction information output apparatus, map matching apparatus, travel direction information output method, and computer readable medium
CN113218380B (en) Electronic compass correction method and device, electronic equipment and storage medium
CN115727871A (en) Track quality detection method and device, electronic equipment and storage medium
CN102117199A (en) Method for improving data updating rate of star sensor by using parallel streamline mechanism
CN115097379A (en) Positioning tracking method, device, equipment and storage medium
CN112880675B (en) Pose smoothing method and device for visual positioning, terminal and mobile robot
CN115307641A (en) Robot positioning method, device, robot and storage medium
CN114187509A (en) Object positioning method and device, electronic equipment and storage medium
Yu et al. Fast extrinsic calibration for multiple inertial measurement units in visual-inertial system
CN113375667B (en) Navigation method, device, equipment and storage medium
CN116358606B (en) Initial coarse alignment method, device, equipment and medium of inertial navigation system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination