CN115809679A - Physical fitness test method, electronic device, storage medium, and computer program product - Google Patents


Publication number
CN115809679A
Authority
CN
China
Prior art keywords
graphic code
target
physical
distance
positioning line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210770922.5A
Other languages
Chinese (zh)
Inventor
柳志强
周游
何琦
王炬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Kuangshi Technology Co Ltd
Original Assignee
Beijing Kuangshi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Kuangshi Technology Co Ltd filed Critical Beijing Kuangshi Technology Co Ltd
Priority to CN202210770922.5A priority Critical patent/CN115809679A/en
Publication of CN115809679A publication Critical patent/CN115809679A/en
Pending legal-status Critical Current

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The embodiments of the application provide a physical fitness test method, an electronic device, a storage medium, and a computer program product. The method comprises: acquiring a graphic code image captured by a first camera aimed at a graphic code on the base of a seat body forward flexion device when a physical testing person completes a physical test based on the device, wherein a plurality of graphic codes are arranged on the base along the moving path of a movable member, and the movable member of the device drives the first camera to move as it is pushed by the physical testing person; and determining the physical measurement result of the physical testing person based on the graphic code image. The physical measurement result comprises at least one of a physical test score, position information of the movable member when the test is completed, and the distance by which the movable member was pushed. The image information scanned by the first camera can be acquired and recognized automatically and intelligently, so this measurement approach has small error and high accuracy.

Description

Physical fitness test method, electronic device, storage medium, and computer program product
Technical Field
The present application relates to the field of computer vision, and more particularly, to a fitness test method, an electronic device, a storage medium, and a computer program product.
Background
In recent years, the sit-and-reach (seated forward flexion) has been among the most common sports examination items in middle school entrance examinations. Common seated forward flexion devices fall mainly into two types. The first is traditional equipment: it uses no electricity and has only a scale and a vernier, with no intelligent design; the final score must be read manually, which easily introduces human error. The second is semi-intelligent equipment with an electronic sensor. Its ranging principle is that when the physical testing person pushes the vernier, the moving vernier blocks an infrared grating embedded in the device, and the sensor indicates the pushed distance based on the extent of the grating that is blocked. The second type of device can collect information electronically, but it still has several problems. For example, its data must still be recorded manually and cannot be recognized and acquired automatically, which hinders the storage of, and evidence collection for, the test results. Furthermore, its measurement accuracy is not high enough.
In view of the above, a new seat body forward flexion test technique is needed to solve these problems.
Disclosure of Invention
The present application has been made in view of the above problems. The application provides a fitness test method, electronic equipment, a storage medium and a computer program product.
According to an aspect of the present application, there is provided a fitness test method comprising: acquiring a graphic code image captured by a first camera aimed at a graphic code on the base of a seat body forward flexion device when a physical testing person completes a physical test based on the device, wherein a plurality of graphic codes are arranged on the base along the moving path of a movable member, and the movable member of the device drives the first camera to move as it is pushed by the physical testing person; and determining the physical measurement result of the physical testing person based on the graphic code image, wherein the physical measurement result comprises at least one of a physical test score, position information of the movable member when the test is completed, and the distance by which the movable member was pushed.
According to another aspect of the present application, there is provided an electronic device comprising a processor and a memory, wherein the memory stores computer program instructions for executing the fitness test method when the computer program instructions are executed by the processor.
According to another aspect of the present application, there is provided a storage medium having stored thereon program instructions for executing the fitness test method described above when executed.
According to another aspect of the present application, a computer program product is provided, comprising a computer program for performing the above fitness test method when the computer program is run.
According to the physical fitness test method, the electronic device, the storage medium, and the computer program product of the present application, the performance of a physical testing person can be calculated automatically based on the graphic codes scanned by the first camera. In this way, the image information scanned by the first camera can be acquired and recognized automatically and intelligently without manual recording, so the physical measurement results can easily be stored and used as evidence. In addition, compared with a simple infrared sensing detection approach, this computer-vision-based measurement approach has small error and high accuracy.
Drawings
The above and other objects, features and advantages of the present application will become more apparent by describing in more detail embodiments of the present application with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. In the drawings, like reference numbers generally represent like parts or steps.
FIG. 1 shows a schematic block diagram of an example electronic device for implementing a fitness test method and apparatus according to embodiments of the present application;
fig. 2 shows a schematic view of a sitting body flexion device according to an embodiment of the present application;
FIG. 3 illustrates a schematic view of a portion of a graphical code on a substrate according to one embodiment of the present application;
FIG. 4 shows a schematic top view of a seat body forward flexion testing system according to an embodiment of the present application;
FIG. 5 shows a schematic flow diagram of a fitness test method according to an embodiment of the present application;
FIG. 6 illustrates an exemplary flow chart of a fitness test method according to one embodiment of the present application;
FIG. 7 shows a schematic block diagram of a fitness test device according to an embodiment of the present application;
FIG. 8 shows a schematic block diagram of an electronic device according to one embodiment of the present application.
Detailed Description
In recent years, technical research based on artificial intelligence, such as computer vision, deep learning, machine learning, image processing, and image recognition, has developed actively. Artificial Intelligence (AI) is an emerging science and technology that studies and develops theories, methods, techniques, and application systems for simulating and extending human intelligence. AI is a comprehensive discipline involving technical categories such as chips, big data, cloud computing, the Internet of Things, distributed storage, deep learning, machine learning, and neural networks. Computer vision, an important branch of AI, uses machines to perceive the world; computer vision technologies generally include face recognition, liveness detection, fingerprint recognition and anti-counterfeiting verification, biometric recognition, face detection, pedestrian detection, object detection, pedestrian recognition, image processing, image recognition, image semantic understanding, image retrieval, character recognition, video processing, video content recognition, three-dimensional reconstruction, virtual reality, augmented reality, simultaneous localization and mapping (SLAM), computational photography, and robot navigation and positioning.
With the research and progress of artificial intelligence technology, the technology is applied to many fields, such as safety control, city management, traffic management, building management, park management, face passage, face attendance, logistics management, warehouse management, robots, intelligent marketing, computational photography, mobile phone images, cloud services, smart homes, wearable equipment, unmanned driving, automatic driving, intelligent medical treatment, face payment, face unlocking, fingerprint unlocking, person certificate verification, smart screens, smart televisions, cameras, mobile internet, live webcasts, beauty treatment, medical beauty treatment, intelligent temperature measurement and the like.
In order to make the objects, technical solutions and advantages of the present application more apparent, exemplary embodiments according to the present application will be described in detail below with reference to the accompanying drawings. It should be understood that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and that the present application is not limited by the example embodiments described herein. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the application described in the application without inventive step, shall fall within the scope of protection of the application.
To at least partially solve the above technical problem, embodiments of the present application provide a fitness test method, an electronic device, a storage medium, and a computer program product. According to the physical fitness test method, the pushing of the movable member by the physical testing person can be captured and recognized based on computer vision technology, so the physical measurement results are acquired automatically and intelligently, and the accuracy of the test method is high.
First, an example electronic device 100 for implementing the fitness test method and apparatus according to an embodiment of the present application is described with reference to fig. 1.
As shown in fig. 1, electronic device 100 includes one or more processors 102, one or more memory devices 104. Optionally, the electronic device 100 may also include an input device 106, an output device 108, and an image capture device 110, which may be interconnected via a bus system 112 and/or other form of connection mechanism (not shown). It should be noted that the components and structure of the electronic device 100 shown in fig. 1 are exemplary only, and not limiting, and the electronic device may have other components and structures as desired.
The processor 102 may be implemented in at least one hardware form of a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), or a microprocessor. The processor 102 may be one of, or a combination of several of, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), an Application Specific Integrated Circuit (ASIC), or other forms of processing units having data processing capability and/or instruction execution capability, and may control other components in the electronic device 100 to perform desired functions.
The storage device 104 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM) and/or cache memory. The non-volatile memory may include, for example, Read-Only Memory (ROM), a hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer-readable storage medium and executed by the processor 102 to implement the client functionality (implemented by the processor) of the embodiments of the application described below and/or other desired functionality. Various applications and various data, such as data used and/or generated by the applications, may also be stored in the computer-readable storage medium.
The input device 106 may be a device used by a user to input instructions and may include one or more of a keyboard, a mouse, a microphone, a touch screen, and the like.
The output device 108 may output various information (e.g., images and/or sounds) to an external (e.g., user), and may include one or more of a display, a speaker, etc. Alternatively, the input device 106 and the output device 108 may be integrated together, implemented using the same interactive device (e.g., a touch screen).
The image capture device 110 may capture images and store the captured images in the storage device 104 for use by other components. The image photographing device 110 may be a separate camera or a camera in a mobile terminal, etc. It should be understood that the image capture device 110 is merely an example, and the electronic device 100 may not include the image capture device 110. In this case, other devices having image capturing capabilities may be used to capture an image and transmit the captured image to the electronic device 100.
For example, an example electronic device for implementing the fitness test method and apparatus according to the embodiments of the present application may be implemented on a device such as a personal computer, a terminal device, a time recorder, a panel machine, a camera, or a remote server.
The physical fitness test method described herein is a method based on a sitting body forward flexion device, and for convenience of understanding, the structure and the operation principle of a sitting body forward flexion device according to an embodiment of the present application will be described first. Fig. 2 shows a schematic view of a sitting body flexion device according to an embodiment of the present application. As shown in fig. 2, the seat body forward-bending apparatus 200 may include a movable member 210, a base 220, and a first camera 230. The movable member 210 is disposed on the base 220.
The movable member (i.e., the vernier) 210 is adapted to move, under the push of the physical testing person, in a direction parallel to the extending direction of the base 220. A plurality of graphic codes are arranged on the base 220 along the moving path of the movable member 210, and the graphic codes at different positions represent different physical positions. The lens of the first camera 230 faces the graphic codes on the base 220; the first camera 230 scans the graphic codes on the base 220 and transmits the scanned graphic code information (i.e., the graphic code image) to a processing device. The processing device may determine the position reached by the movable member 210 based on the received graphic code information, and further determine the physical measurement result of the physical testing person.
The movable member 210 may be any suitable object capable of being pushed, and may be implemented in any shape and of any material. By way of example and not limitation, the movable member 210 may be a rectangular plate disposed perpendicular to the base 220 for movement by a physical testing person. For example, a slide rail may be laid in the moving direction of the movable member 210, and the movable member 210 may move along the slide rail.
The substrate 220 may be an object having a certain extending direction, and may be implemented by using any suitable material. By way of example and not limitation, the base 220 may be a flat plate, which may extend in a direction parallel to the moving direction of the movable member 210. The base 220 may be provided with a plurality of graphic codes, and different graphic codes are provided at different positions of the base 220 along the moving direction of the movable member 210. Each of the different graphical codes may represent a physical location. Alternatively, the graphic code may be attached to the substrate 220. The graphic code may be any suitable image representing specific coded information with a specific pattern, shape. The graphic code may be, for example, a two-dimensional barcode (i.e., a two-dimensional code), a one-dimensional barcode, or the like. In one example, the graphical code may directly represent a particular physical location. For example, the information contained in any graphic code is itself the coordinate value of a specific physical location, and the coordinate value of the physical location is obtained after scanning and decoding the graphic code. In another example, the graphical code may indirectly represent a particular physical location. For example, any graphic code contains information that is specific identification information that uniquely corresponds to coordinate values of a specific physical location. In this case, the specific identification information is obtained after scanning and decoding the graphic code, and then the coordinate value of the specific physical position can be obtained after mapping the specific identification information.
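The two encoding schemes described above (a code that directly carries a coordinate value, and a code that carries identification information mapped to a coordinate) can be sketched as follows. This is only an illustrative sketch: the function name, payload formats, and lookup table are assumptions, not part of the application.

```python
# Illustrative sketch (assumed names and values): turning a decoded graphic
# code payload into a physical position, for both encoding schemes above.

# Indirect scheme: identifier -> coordinate lookup table (values assumed).
ID_TO_COORDINATE_CM = {"code_00": -20.0, "code_01": -18.0, "code_02": -16.0}

def position_from_payload(payload: str) -> float:
    """Return the physical position (cm) represented by a decoded graphic code."""
    try:
        # Direct scheme: the payload itself is the coordinate value.
        return float(payload)
    except ValueError:
        # Indirect scheme: the payload is an identifier mapped to a coordinate.
        return ID_TO_COORDINATE_CM[payload]

print(position_from_payload("-20"))      # direct encoding -> -20.0
print(position_from_payload("code_01"))  # indirect encoding -> -18.0
```

Either scheme yields the same downstream interface: a decoded code becomes one coordinate on the base.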
The bottom of the movable member 210 is connected with a first camera 230, the first camera 230 can move along with the movable member 210, and the lens of the first camera 230 faces the graphic code on the base 220. The height d of the first camera 230 itself and the distance h between the first camera and the base 220 may be set as required, and may have any size. For example, the height d of the first camera 230 may be 4 cm, and the distance h from the substrate may be 3.5 cm. The height d of the first camera 230 may be a distance from a first end of the first camera 230 connected with the movable member 210 to a second end of the first camera 230 where a lens is located. The distance between the first camera 230 and the base 220 may be the distance between the second end of the first camera 230 and the plane where the graphic code of the base 220 is located. It should be noted that the arrangement of the movable member 210, the base 220 and the first camera 230 shown in fig. 2 is only an example and not a limitation of the present application, and other reasonable arrangements of these components are possible. For example, the base 220 and the first camera 230 may be disposed at the side of the movable element 210, and the base 220 may be located at the same or a similar horizontal position as the first camera 230, as long as the lens of the first camera 230 can capture the graphic code on the base 220. Of course, the movable member may alternatively be provided on the side of the base 220.
The first camera 230 may scan the graphic code on the base 220 in real time and transmit the scanned graphic code information to the processing device. The processing device may then determine the position currently reached by the movable member 210 based on the currently scanned graphic code information, thereby determining the physical measurement result of the physical testing person. The physical measurement result may include one or more of the following: position information corresponding to the movable member 210, the distance by which the movable member 210 has been pushed, a physical test score, and the like. It can be understood that when the physical testing person has pushed the movable member 210 to the farthest position that can be reached, the position currently reached by the movable member 210, or its moving distance relative to the initial position, indicates the performance of the physical testing person. Therefore, the physical measurement result can be determined by determining the position reached by the movable member. The performance of the physical testing person can be represented directly by the position reached by the movable member 210 or by its moving distance, or can be converted into a score. The processing device that receives and analyzes the graphic code information may be a central processing device in the seat body forward flexion test system described below, or any other suitable processing device, such as a processing device installed together with the first camera 230.
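As a minimal sketch of how the processing device might derive a physical measurement result from the position reached by the movable member: the initial position and the scoring rule below are assumptions for illustration, not the application's actual formula.

```python
# Illustrative sketch: deriving the physical measurement result from the
# farthest position (cm) reached by the movable member 210.
# INITIAL_POSITION_CM and the scoring rule are assumed values.

INITIAL_POSITION_CM = -20.0  # assumed resting coordinate of the movable member

def measurement_result(reached_position_cm: float) -> dict:
    """Return position, pushed distance, and a (hypothetical) score."""
    distance_cm = reached_position_cm - INITIAL_POSITION_CM
    # Hypothetical scoring rule: one point per centimeter past the origin
    # (the foot pedal, coordinate 0), clamped to the range [0, 100].
    score = max(0.0, min(100.0, reached_position_cm))
    return {"position_cm": reached_position_cm,
            "distance_cm": distance_cm,
            "score": score}

print(measurement_result(14.0))
# {'position_cm': 14.0, 'distance_cm': 34.0, 'score': 14.0}
```

Any of the three fields can serve as the reported result, matching the "at least one of" language in the claims.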
The processing means may be implemented in the form of at least one hardware form of a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a microprocessor, which may be one or a combination of several of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), an Application Specific Integrated Circuit (ASIC), or other forms of processing units having data processing capabilities and/or instruction execution capabilities.
Based on the above seat body forward flexion device, when the physical testing person pushes the movable member, the first camera moves with it; the movement of the movable member can be determined by scanning the graphic code information with the first camera, and the physical measurement result of the physical testing person can then be determined. In this way, the image information scanned by the first camera can be acquired and recognized automatically and intelligently without manual recording, so the physical measurement results can easily be stored and used as evidence. In addition, compared with a simple infrared sensing detection approach, this computer-vision-based measurement approach has small error and high accuracy.
Illustratively, the first camera 230 is directly disposed on the movable member 210, and the movable member 210 drives the first camera 230 as it moves; alternatively, the seat body forward flexion device 200 further comprises a carrier (not shown), the movable member 210 and the first camera 230 are respectively disposed on two opposite sides of the carrier, and the first camera 230 is driven to move by the carrier when the movable member 210 moves.
In one example, the first camera 230 is directly disposed on the movable member 210, that is, the first camera 230 and the movable member 210 can be directly connected to each other, so that the movable member 210 drives the first camera 230 to move with it. This direct-connection scheme has a simple device structure and a lower manufacturing cost. In another example, the first camera 230 may be indirectly disposed on the movable member 210, for example by connecting the two through an intermediate carrier. The carrier may be realized in any suitable shape and of any suitable material. For example and without limitation, the carrier may be implemented as a plate, with the movable member 210 disposed above it and the first camera 230 disposed below it. The carrier-connection scheme makes the device easy to upgrade and has a low maintenance cost.
Illustratively, the movable member 210 is disposed above the base 220; alternatively, the movable member 210 is disposed at a side of the base 220. The relative position relation of the movable member 210, the base 220 and the first camera 230 can be set arbitrarily as required, and only the moving direction of the movable member 210 is required to be kept parallel to the extending direction of the base 220, and the first camera 230 can move along with the movable member 210 and can collect graphic codes on the base 220.
The graphic codes on the substrate 220 may be of any suitable shape and size and may be spaced from one another by any suitable amount, which may be set as desired. For example, in the case where the graphic code is a two-dimensional code, the shape thereof may be a rectangle, a square, a circle, or the like. The square two-dimensional code is a two-dimensional code which is convenient to recognize and detect, and an embodiment in which the graphic code is the square two-dimensional code is described below.
Illustratively, the plurality of graphic codes may be arranged at equal intervals; and/or each of the plurality of graphic codes may be a square two-dimensional code, with the interval between adjacent graphic codes equal to an integral multiple of the side length of any one graphic code. The integer multiple may be 1, 2, 3, etc. In one example, the interval between adjacent graphic codes is equal to the side length of any one graphic code.
The side length of the square two-dimensional code may be arbitrary. In one embodiment, the graphic code may be a square two-dimensional code of 1 cm × 1 cm. Assuming the base 220 is a white rectangular substrate of 5 cm × 61 cm, 31 square two-dimensional codes can be attached to it, with an interval of 1 cm between every two adjacent codes. FIG. 3 shows a schematic view of a portion of the graphic codes on the base according to one embodiment of the present application. As shown in fig. 3, the graphic codes are square two-dimensional codes spaced apart from each other by a distance equal to the side length of each code. A square two-dimensional code (e.g., of 1 cm × 1 cm) is attached to the base at regular intervals (e.g., 1 cm). Each two-dimensional code may correspond to an actual physical position. Following the above example, with a white rectangular base of 5 cm × 61 cm carrying 31 square two-dimensional codes, the physical positions of the codes from left to right may be -20, -18, -16, -14, -12, -10, -8, -6, -4, -2, 0, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26, 28, 30, 32, 34, 36, 38, 40, in centimeters, and this equally spaced sequence covers the measuring range on the base 220 of the seat body forward flexion device. Referring back to FIG. 2, the leftmost coordinate of the base 220 is shown as -20 and the rightmost coordinate as 40, also in centimeters. The example shown in fig. 2 uses the position of the foot pedal as the origin of coordinates, i.e., 0. The physical positions corresponding to the 31 two-dimensional codes thus correspond to the 31 coordinates on the base 220 in fig. 2.
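The 31 equally spaced coordinates in the example above can be generated directly, which also confirms the count:

```python
# The example's 31 physical positions: -20 cm to 40 cm in 2 cm steps.
positions_cm = list(range(-20, 41, 2))
print(len(positions_cm))                  # 31
print(positions_cm[0], positions_cm[-1])  # -20 40
```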
According to the specification, two-dimensional codes need to be separated by intervals. If the interval is not an integral multiple of the side length of the two-dimensional code, recognition and distance calculation become more difficult. By setting the interval equal to an integral multiple of the side length, the distance between two-dimensional codes is very easy to recognize and calculate, and thus the moving distance of the movable member 210 is also very easy to calculate. This helps improve the efficiency and accuracy of performance detection.
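The convenience of an integer-multiple interval can be sketched as follows: with 1 cm codes at 1 cm intervals (the example above), the centre-to-centre pitch is exactly 2 cm, so the distance between any two codes follows directly from their indices. The function and constant names are illustrative assumptions.

```python
# Illustrative sketch using the example values: 1 cm square codes separated
# by 1 cm gaps give a centre-to-centre pitch of 2 cm.

SIDE_CM = 1.0     # side length of each square two-dimensional code
GAP_MULTIPLE = 1  # gap = 1 x side length, per the example above
PITCH_CM = SIDE_CM * (1 + GAP_MULTIPLE)  # centre-to-centre pitch: 2 cm

def distance_between_codes(index_a: int, index_b: int) -> float:
    """Distance (cm) between two codes, given their indices along the base."""
    return abs(index_b - index_a) * PITCH_CM

# From the 1st code (-20 cm) to the 18th code (14 cm):
print(distance_between_codes(0, 17))  # 34.0
```

Because the pitch is a fixed constant, no per-code calibration is needed to convert an index difference into a distance.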
Illustratively, the sitting body forward flexion device 200 may further comprise: a resetting device (not shown in the figures). The resetting device may include a motor and a telescopic member, a rotation shaft of the motor is connected with a first end of the telescopic member, a second end of the telescopic member is connected with the movable member 210, and the motor is used to drive the telescopic member to contract when actively rotating.
The telescopic member in the resetting device may be any suitable device capable of extending and contracting, for example a spring or a telescopic rod. In one embodiment, the telescopic member may be a spring, and the motor may optionally be fixed at the initial position of the movable member 210. A first end of the spring is connected to the rotation shaft of the motor in the resetting device, and a second end of the spring is connected to the movable member 210. While the physical testing person pushes the movable member 210 from the initial position to the farthest position, the motor is not powered; the second end of the telescopic member moves away with the movable member 210, the telescopic member is passively stretched, and the rotation shaft of the motor rotates passively as the telescopic member extends. When the physical test is finished and a reset is needed, the motor can be powered on and controlled to run. Its rotation shaft then rotates actively, in the direction opposite to that during the test, driving the telescopic member to contract so that the telescopic member pulls the movable member 210 back to the initial position.
Existing seat body forward flexion devices lack a resetting device and have no active reset function, so the user must manually return the vernier to the initial position. According to the embodiments of the present application, the seat body forward flexion device can be reset automatically by the resetting device, which effectively improves the intelligence of the device and reduces manual effort.
Illustratively, the seat body flexion device 200 may also include a housing. At least a portion of the movable member 210 is exposed to the outside of the housing, and the first camera 230 and the base 220 are hidden inside the housing.
Alternatively, the movable member 210 may be entirely exposed outside the housing, or partially exposed outside the housing and partially hidden inside it. The first camera 230 and the base 220 may both be hidden inside the housing. Illustratively, a first cavity may be formed within the housing; referring back to fig. 2, the first cavity 240 is shown. At least a portion of the movable member 210 may be exposed outside the first cavity 240, and the first camera 230 and the base 220 may be hidden inside the first cavity 240. With this scheme, the physical test score can be measured in a concealed manner, the appearance of the device is clean, and the user experience is better.
Illustratively, a fill light may be disposed in the housing to supplement light for the first camera 230. One or more fill lights may be arranged in the housing, for example in the first cavity 240, so that the captured images are sufficiently clear even in the enclosed environment.
With the housing on the seat body forward flexion device, some components of the device can be hidden inside the housing, which protects those components from damage and also improves the appearance of the device.
Illustratively, the seat body forward flexion device 200 may further include a communication interface, which is respectively connected to the first camera 230 and the processing device, and is configured to transmit the graphic code information scanned by the first camera to the processing device.
Referring back to fig. 2, the communication interface 250 is shown. The communication interface 250 and the first camera 230 may be connected in a wired or wireless manner. The wired manner may be optical fiber, coaxial cable, telephone line, network cable, etc. The wireless manner may be Bluetooth, wireless local area network (WiFi), Near Field Communication (NFC), etc. The communication interface 250 and the processing device may likewise be connected in a wired or wireless manner. The communication interface 250 is used for receiving the graphic code information scanned by the first camera 230 and transmitting it to the processing device.
An existing sitting body forward-bending device is a stand-alone device with no networking capability, and cannot automatically transmit and process ranging information. According to the embodiment of the application, the graphic code information scanned by the first camera can be automatically transmitted to the processing device through the communication interface for processing, which further improves the degree of automation and intelligence of the seat body forward flexion device.
Illustratively, the communication interface 250 and the processing device may be connected in a wireless manner, and the first camera 230 and the communication interface 250 may be connected by a network cable.
Referring to fig. 2, the first camera 230 is shown connected to the communication interface 250 via a network cable. The first camera 230 is close to the communication interface 250 and their positions are relatively fixed, so they can be connected in a wired manner, which offers high transmission efficiency and few transmission errors. The communication interface 250 is connected to the processing device wirelessly, so the communication interface 250 can transmit the graphic code information to the processing device over a wireless link. This scheme places few restrictions on the location of the processing device, which can therefore be arranged at any desired position, for example implemented as a remote processing device such as a cloud server.
Illustratively, the sitting body forward flexion device 200 may comprise a housing, with the network cable hidden inside the housing.
Illustratively, the housing may further have a second cavity, and the network cable may be hidden inside the second cavity.
Referring back to fig. 2, a second cavity 260 is shown. The second cavity 260 may be the same cavity as the first cavity 240, or a part of the first cavity 240. Alternatively, the second cavity 260 and the first cavity 240 may be two mutually independent cavities. The network cable connecting the first camera 230 and the communication interface 250 may be hidden inside the second cavity 260.
Hiding the network cable in the housing prevents the cable from tangling with other components, effectively ensures the safety of the seat body forward flexion device, and improves its appearance.
For example, a mobile charger (not shown) connected to the first camera 230 may be disposed on the movable member 210 and used to supply power to the first camera 230. The seat body forward flexion device may further include a slide rail along which the movable member 210 moves; a wireless charging area is provided at the end of the slide rail, and when the movable member 210 is located at the wireless charging area, the wireless charging area can charge the mobile charger.
In one embodiment, a mobile charger may be disposed within the movable member 210 and connected with the first camera 230 for charging the first camera 230. The movable member 210 is disposed on the slide rail and can move along it. Referring back to fig. 2, the slide rail 270 is shown. By way of example and not limitation, the slide rail 270 may be disposed within the second cavity 260 described above. The slide rail 270 may have a starting end and a terminal end. The starting end of the slide rail 270 is the end closer to the physical testing person during a normal physical test, and the terminal end is the end farther away. A wireless charging area may be provided at the terminal end of the slide rail 270. A wireless charging device is arranged in the wireless charging area, and when the movable member 210 is pushed to the wireless charging area at the terminal end of the slide rail, the mobile charger is charged.
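The position-dependent charging behavior above can be expressed as a one-line predicate. The rail length and the extent of the charging zone below are made-up numbers for illustration; the patent does not specify dimensions.

```python
RAIL_LENGTH_CM = 50.0          # assumed total length of the slide rail
CHARGING_ZONE_START_CM = 47.0  # assumed start of the wireless charging area


def is_charging(position_cm):
    """The mobile charger receives power wirelessly only while the movable
    member sits in the charging zone at the terminal end of the slide rail
    (all dimensions here are illustrative assumptions)."""
    return CHARGING_ZONE_START_CM <= position_cm <= RAIL_LENGTH_CM


mid_rail = is_charging(20.0)  # movable member mid-test: not charging
parked = is_charging(49.0)    # movable member parked at the end: charging
```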
With this charging arrangement, the wireless charging area can charge the mobile charger in time, ensuring that the mobile charger can keep the first camera 230 powered and avoiding the situation where the seat body forward flexion device cannot be used because the first camera 230 was not charged in time.
With wireless charging, the seat body forward flexion device can be designed to be portable, so that it can be carried anywhere and set up at any desired location.
Illustratively, the seat body forward flexion device may further include a fill light for supplementing light for the first camera 230, and the mobile charger may also be connected with the fill light to charge it.
In one embodiment, the seat body forward flexion device may further include, in the housing, one or more fill lights for supplementing light for the first camera. The mobile charger can charge not only the first camera but also the fill lights, ensuring that the fill lights receive power in time and can supplement light for the first camera at any moment.
Illustratively, the seat body forward flexion device may further comprise a slide rail, and the movable member may be slidably connected with it. As described above, the seat body forward flexion device 200 may also include the slide rail 270. The movable member 210 may be slidably connected to the slide rail 270 so as to be able to move along it. Optionally, the movable member 210 may be directly slidably connected to the slide rail 270. Alternatively, the movable member 210 may be connected with the slide rail 270 through a bearing member. In this case, the movable member 210 may be disposed on a bearing member that is directly slidably connected with the slide rail 270, so that the movable member 210 can likewise move along the slide rail 270.
Illustratively, the seat body forward flexion device 200 may further include a bearing member. The movable member 210 and the first camera 230 are respectively disposed on two opposite sides of the bearing member, and the bearing member is connected with the slide rail 270. A damping adjusting member disposed on the bearing member abuts against the slide rail 270, so that the damping between the bearing member and the slide rail can be adjusted by adjusting the position of the damping adjusting member on the bearing member.
Non-standard actions commonly seen in the sitting forward flexion test include pushing the plate with one hand, bending the knees (i.e., bending the legs), and jerking the plate forward in a short burst. Such non-standard actions can make the final test result unreliable and undermine the test standard. One-hand pushes and leg bends can be detected during the fitness test by the non-standard-action determination described below, while a short jerking push can be prevented in advance by improving the mechanical structure of the seat body forward flexion device, namely by increasing the sliding resistance of the movable member so that it cannot be flicked away easily.
In one example, the movable member 210 and the first camera 230 may be respectively disposed on one bearing member. For example, the bearing member may comprise two opposite faces; the movable member 210 may be disposed on the first of the two faces and the first camera 230 on the second. Further, the bearing member may be slidably connected to the slide rail 270. After the bearing member is attached to the slide rail 270, its two opposite faces may partially wrap around the slide rail 270, i.e., the first face may be located on a first side of the slide rail 270, e.g., above it, and the second face on a second side, e.g., below it. Thus, the movable member 210 and the first camera 230 can be located above and below the slide rail 270, respectively (as shown in fig. 2).
A damping adjusting member may be provided at any position of the bearing member, for example on a side of the bearing member parallel to the moving direction of the movable member 210. Illustratively, the damping adjusting member may be a position-adjustable component such as a knob. One end of the damping adjusting member may abut against the slide rail, and the other end can be grasped and rotated. When a user rotates the damping adjusting member, the distance between it and the slide rail changes: the closer it is to the slide rail, the greater the pressure between them, and vice versa. Increasing the pressure between the damping adjusting member and the slide rail increases the damping between the bearing member and the slide rail, and therefore the resistance to movement of the movable member; conversely, decreasing the pressure decreases the damping and thus the resistance. The resistance to movement of the movable member can therefore be conveniently adjusted through the damping adjusting member, and setting it to an appropriate level effectively prevents short jerking pushes.
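The qualitative relationship above (smaller knob-to-rail gap → more pressure → more resistance) can be captured in a toy model. The formula and constants below are purely illustrative; the patent specifies only the monotonic relationship, not any particular law.

```python
def movement_resistance(gap_mm, k=2.0, base=1.0):
    """Toy monotonic model of the damping adjustment: the smaller the gap
    between the damping knob and the slide rail, the larger the contact
    pressure and hence the resistance to movement. `k` and `base` are
    made-up constants, not values from the patent."""
    if gap_mm < 0:
        raise ValueError("gap cannot be negative")
    return base + k / (gap_mm + 1.0)


tight = movement_resistance(0.0)  # knob screwed in against the rail
loose = movement_resistance(9.0)  # knob backed well off the rail
```

With the knob tightened the resistance is highest, which is what prevents the movable member from being flicked away in a short burst.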
A seat body forward flexion testing system according to another embodiment of the present application is described below. The system comprises the seat body forward bending equipment and auxiliary testing equipment. The auxiliary test equipment may include a central processing unit. The central processing device is communicably connected to the first camera 230 and is configured to receive the currently scanned graphic code information by the first camera 230 and determine a measurement result of the person to be measured based on the currently scanned graphic code information.
The central processing device may be implemented in hardware in the form of at least one of a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic array (PLA), or a microprocessor, and may be one of, or a combination of several of, a central processing unit (CPU), a graphics processing unit (GPU), an application-specific integrated circuit (ASIC), or other processing units having data processing capability and/or instruction execution capability.
The central processing device may be connected to the first camera 230 via the communication interface 250 described above. The central processing means may be provided at any suitable location, either integrally with or separately from the seat body flexion device.
As described above, the graphic code corresponds to a specific physical location, so the central processing device can determine the current location of the movable member 210 through the currently scanned graphic code information, and further determine the physical measurement result of the physical measurement person.
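The code-to-position lookup described above can be sketched as follows. The `POS_###` payload format, the 0.5 cm spacing between codes, and the initial position are all illustrative assumptions; the patent does not specify the encoding.

```python
# Hypothetical mapping from decoded graphic-code payloads to physical
# positions on the base (centimetres along the moving path).
CODE_TO_POSITION_CM = {f"POS_{i:03d}": i * 0.5 for i in range(0, 101)}

INITIAL_POSITION_CM = 10.0  # assumed starting position of the movable member


def physical_test_result(decoded_payload):
    """Return (final position, pushed distance) for the code scanned at
    the completion of the test, per the lookup-table assumption above."""
    position = CODE_TO_POSITION_CM[decoded_payload]
    distance_pushed = position - INITIAL_POSITION_CM
    return position, distance_pushed


pos, dist = physical_test_result("POS_070")  # movable member ended at code 70
```

The central processing device would run the equivalent of `physical_test_result` on whatever payload the first camera decoded last.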
Based on the seat body forward bending test system comprising the seat body forward bending equipment, the body test result of the body test person can be automatically determined in a mode that the first camera scans the graphic code. In this way, the image information scanned by the first camera can be automatically and intelligently acquired and identified without manual recording, so that the physical measurement result can be very easily stored and evidence can be obtained. In addition, compared with a simple infrared sensing detection mode, the body measurement mode based on the computer vision technology has small error and high accuracy.
The auxiliary test equipment may also comprise a second camera and/or a face recognition device. The second camera is arranged at a first target distance and a first target height around a target physical measurement area of the seat body forward flexion device and is used for collecting images of the physical testing person at the target physical measurement area; the central processing device is connected with the second camera and judges, based on the images collected by the second camera, whether the physical testing person has committed any violation. The face recognition device is arranged at a second target distance and a second target height around the target physical measurement area and is used for collecting a face image of the physical testing person and recognizing the person's identity based on the face image.
Fig. 4 illustrates a schematic top view of a seat body forward flexion testing system 400 according to one embodiment of the present application. As shown in fig. 4, the seat body forward flexion testing system 400 includes the seat body forward flexion device 200. For ease of understanding, fig. 4 also shows a target physical measurement area 402. The target physical measurement area is the region in which a person may appear when using the sitting body forward flexion device 200. It may be detected and determined in advance, for example at the time the device leaves the factory. The target physical measurement area can be a three-dimensional region with a certain length, width and height, i.e., both the horizontal and the vertical positions that a physical testing person may occupy can be detected and determined in advance.
The second camera may be disposed at a first target distance and a first target height around the target physical measurement area. The distance between the second camera and the target physical measurement area may be measured as the closest horizontal distance between them; that is, saying the second camera is disposed at the first target distance around the target physical measurement area means that this closest horizontal distance equals the first target distance. The height of the second camera may be measured as its height above the ground; that is, saying the second camera is arranged at the first target height means that its height above the ground equals the first target height. For example, the second camera may optionally be disposed on a support 404 at the first target distance around the target physical measurement area, at the first target height on the support 404. The support may be any suitable device capable of providing support, for example an upright pole. The first target distance may be set to any suitable distance as desired, and the first target height to any suitable height. Illustratively, the first target distance may be 1 to 3 meters from the target physical measurement area, and the first target height may be 50 centimeters to 2 meters above the ground; for example, the first target distance may be 1.5 meters and the first target height 70 centimeters. The central processing device may receive the images captured by the second camera, which can acquire images in real time.
The central processing device can judge, from the images collected by the second camera, whether the physical testing person has made the preparation action on the seat cushion, and whether any non-standard action occurs during the physical test. The central processing device may be provided at any suitable place; for example, it may also be provided on the support 404.
The number and positions of the second cameras can be set as needed. For example, there may be one or more second cameras, and when there are several, their positions may be the same or different. The only constraint on their arrangement is that the combined image acquisition ranges of all the second cameras must cover the target physical measurement area: however many second cameras there are and wherever they are placed, together they must be able to capture images of the complete target physical measurement area.
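The coverage constraint above can be checked numerically. This is a deliberately simplified sketch: it treats each camera's field of view as a circular footprint on the ground plane and samples grid points, ignoring 3-D geometry and occlusion, which a real deployment check would have to consider.

```python
def cameras_cover_region(cameras, region, step=0.1):
    """Check that the union of camera footprints covers a rectangular
    target region, by sampling grid points.

    cameras: list of (x, y, radius) circular footprints, in metres
    region:  (x_min, y_min, x_max, y_max) rectangle, in metres
    """
    x_min, y_min, x_max, y_max = region
    x = x_min
    while x <= x_max:
        y = y_min
        while y <= y_max:
            # A sample point must fall inside at least one camera footprint.
            if not any((x - cx) ** 2 + (y - cy) ** 2 <= r ** 2
                       for cx, cy, r in cameras):
                return False
            y += step
        x += step
    return True


# One camera 1.5 m from a 1 m x 1 m region, with a 3 m viewing radius.
covered = cameras_cover_region([(0.0, 0.0, 3.0)], (1.5, -0.5, 2.5, 0.5))
```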
An existing sitting body forward-bending device has no capability of monitoring non-standard actions and cannot capture or judge the non-standard actions of a physical testing person. According to the present application, whether the physical testing person performs non-standard actions can be judged from the images collected by the second camera in the seat body forward flexion testing system, so that real-time, automatic determination of non-standard actions is achieved and the physical testing process is better supervised.
The face recognition device may be disposed at a second target distance and a second target height around the target physical measurement area. The distance between the face recognition device and the target physical measurement area may be measured as the closest horizontal distance between them; that is, saying the face recognition device is disposed at the second target distance around the target physical measurement area means that this closest horizontal distance equals the second target distance. The height of the face recognition device may be measured as its height above the ground; that is, saying the face recognition device is disposed at the second target height means that its height above the ground equals the second target height. Illustratively, the face recognition device may optionally be disposed on the support 404 at the second target distance around the target physical measurement area, at the second target height on the support 404. The second target distance may be set to any suitable distance as desired, and the second target height to any suitable height. Illustratively, the second target distance may be 1 to 3 meters from the target physical measurement area, and the second target height may be 50 centimeters to 2 meters above the ground; for example, the second target distance may be 1.5 meters and the second target height 70 centimeters. The second target distance may be the same as or different from the first target distance, and the second target height may be the same as or different from the first target height.
The face recognition device can perform face recognition on the physical testing person to determine the person's identity. Illustratively, the face recognition device may include one or more of a visible light camera, an infrared camera, a structured light projector, and the like; it may collect one or more of visible light images, infrared images, structured light images, etc. of the person under test, and perform face recognition using these images. Optionally, the face recognition device may further include a liveness detection device, which judges whether the person under test is a live subject based on the collected liveness detection information, so that the face recognition device can determine the person's identity further based on the liveness judgment. Spoofing attacks with non-live faces (e.g., photos or replays) can thus be prevented by the liveness detection.
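The gating logic implied above (an identity is only accepted when the liveness check also passes) can be sketched as follows. The function name, the similarity threshold, and the tuple format are illustrative assumptions, not part of the patent.

```python
def identify_tester(face_match, liveness_result, threshold=0.8):
    """Return a person identity only when both the face match and the
    liveness check succeed; otherwise return None. Threshold and data
    shapes are made-up for illustration.

    face_match: (person_id, similarity_score) or None
    liveness_result: True if the subject was judged to be a live person
    """
    if not liveness_result:
        return None  # reject spoofing attempts (photo, replay) outright
    if face_match is None:
        return None  # no enrolled face matched
    person_id, score = face_match
    return person_id if score >= threshold else None


ok = identify_tester(("student_42", 0.93), liveness_result=True)
spoof = identify_tester(("student_42", 0.93), liveness_result=False)
```

A strong face match presented via a photo still yields no identity, which is exactly the protection the liveness device provides.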
Optionally, the central processing device may be connected to the face recognition device and receive the identity information of the physical testing person from it. The central processing device may further be configured to associate the physical measurement result of the physical testing person with the identity information; for example, it may save the associated result and identity information in a storage device or further upload them to a third-party platform. Associating the physical measurement result with the identity information makes it convenient to look up any person's result later.
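The association-and-lookup step above amounts to a keyed record store. The sketch below is a minimal in-memory stand-in for the storage device or third-party platform mentioned in the text; all names are illustrative.

```python
from dataclasses import dataclass


@dataclass
class TestRecord:
    """Illustrative record linking a tester's identity to the result."""
    person_id: str
    score_cm: float


class ResultStore:
    """Minimal in-memory stand-in for the storage device; a real system
    might persist to a database or upload to a third-party platform."""

    def __init__(self):
        self.records = {}

    def save(self, record):
        # Key the record by identity so results can be looked up later.
        self.records[record.person_id] = record

    def lookup(self, person_id):
        return self.records.get(person_id)


store = ResultStore()
store.save(TestRecord("student_42", 18.5))
```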
An existing seat body forward-bending device has no information input function and cannot automatically acquire the identity information of the physical testing person. According to this embodiment, the face recognition device can recognize the face of the physical testing person to confirm identity information, so the identity information is acquired automatically, the degree of intelligence of the seat body forward flexion test is further improved, and manpower is reduced.
Illustratively, the auxiliary test equipment may further include a support member. The second camera, the face recognition device and the central processing device may all be arranged on the support member.
In an embodiment, the first target distance and the second target distance may be equal. The auxiliary test equipment may also include the support 404, such as an upright pole. The second camera and the face recognition device may both be disposed on the support 404, which may be placed at the first target distance or the second target distance around the target physical measurement area. Optionally, the central processing device may also be provided on the support 404.
Arranging the second camera, the face recognition device and the central processing device all on the same support 404 makes the devices easy to manage, helps improve the degree of integration, and improves the user experience.
Illustratively, the auxiliary test equipment may further include at least one of a first communication interface, a second communication interface, and an output device.
In one embodiment, the auxiliary test equipment may include a first communication interface configured to receive the graphic code information scanned by the first camera; the central processing device is connected to the first communication interface and configured to receive the graphic code information from it.
In the embodiment of the present application, the first communication interface may be a wired or wireless communication interface matched with the communication interface 250 on the seat body forward flexion device. After receiving the graphic code information scanned by the first camera, the first communication interface transmits it to the central processing device connected at its other end.
In one embodiment, the auxiliary test equipment may include a second communication interface, the second communication interface being connected to the central processing device for uploading the physical test results to a third party platform.
In an embodiment, the auxiliary test equipment may further comprise a second communication interface connected with the central processing device. The second communication interface may be a wired or wireless communication interface. For example, it may be a standard network interface (RJ45) connected, through a router and a switch, to a campus network or another third-party platform for uploading scores, and the physical test result may be uploaded to the third-party platform through an Application Programming Interface (API). An API may optionally be provided in the central processing device for uploading first physical testing information of the physical testing person, which may include one or more of: identity information, non-standard action information, physical test results, and the like.
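The "first physical testing information" payload described above might be assembled as follows before being sent over the second communication interface. The field names and JSON format are illustrative assumptions; the patent does not define the third-party API, so no actual network call is shown.

```python
import json


def build_upload_payload(person_id, score_cm, violations):
    """Assemble the first physical-testing information for upload.
    Field names are made-up for illustration; a real third-party
    platform would define its own schema."""
    return {
        "identity": person_id,
        "result_cm": score_cm,
        "non_standard_actions": violations,
    }


payload = build_upload_payload("student_42", 18.5, ["one_hand_push"])
body = json.dumps(payload, sort_keys=True)  # serialized request body
```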
Through the second communication interface, data can be uploaded to the third-party platform and thus saved for subsequent review of the scores.
In one embodiment, the auxiliary test equipment may further include an output device connected with the central processing device and used for outputting physical measurement information (which may be called second physical measurement information) of the physical testing person, the information comprising one or more of: identity information, non-standard action information, physical test results, voice guidance information, and the like.
The output device may be any suitable device capable of outputting information including, but not limited to, a display screen and/or a speaker, etc.
The voice guidance information is used for guiding the physical testing person through the physical test. For example, the voice guidance information may include one or more of: information guiding the person to move in front of the face recognition device for identity recognition; information guiding the person to move to the target physical measurement area; information guiding the person to make the preparation action; information guiding the person to start the physical test; information informing the person that the physical test has ended; and the like.
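The prompts listed above form a natural ordered sequence. The sketch below assumes one plausible ordering and wording; neither is mandated by the text.

```python
# Assumed ordering of the voice prompts listed above (wording is
# illustrative, not from the patent).
GUIDANCE_STEPS = [
    "Please stand in front of the face recognition device.",
    "Please move to the test area.",
    "Please sit down and take the preparation posture.",
    "Please begin the test.",
    "The test is finished.",
]


def guidance(step_index):
    """Return the prompt for a given step, or None once guidance is done."""
    if 0 <= step_index < len(GUIDANCE_STEPS):
        return GUIDANCE_STEPS[step_index]
    return None


first = guidance(0)  # prompt played at the start of the flow
done = guidance(5)   # past the last step: nothing left to say
```

The output device (speaker) would play `guidance(i)` as the central processing device advances the test through its stages.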
By outputting the various kinds of physical measurement information through the output device, the physical testing person or other supervisors can learn the test score in time and be aware of any non-standard actions, and voice guidance can be given so that the person becomes familiar with the test procedure. This scheme improves human-computer interaction and helps improve the user experience.
Illustratively, the seat body forward flexion testing system may further comprise a seat cushion. Figure 4 shows a seat cushion 406. The seat cushion 406 may be used in conjunction with a seat body forward flexion device. Placing the seat cushion 406 in the proper position may also guide the physical testing person to the correct position for physical testing to some extent.
The following describes a fitness test method, an electronic device, a storage medium, and a computer program product according to embodiments of the present application. The physical fitness test method can be a method implemented by the processing device for receiving the graphic code acquired by the first camera. The processing device may be any processing device, such as a central processing device in a seat body forward flexion testing system, or the like.
Next, a fitness test method according to an embodiment of the present application will be described with reference to fig. 5. FIG. 5 shows a schematic flow diagram of a fitness test method 500 according to one embodiment of the present application. As shown in fig. 5, the fitness test method 500 includes steps S510 and S520.
Step S510, acquiring a graphic code image (which may be referred to as a target graphic code image) acquired by the first camera for a graphic code on the base of the seat body forward flexion device when the physical fitness test performed by the physical testing person based on the seat body forward flexion device is completed, where a plurality of graphic codes are arranged on the base along a moving path of the movable member, and the movable member of the seat body forward flexion device drives the first camera to move under the pushing of the physical testing person.
The structure and working principle of each component of the seat body forward flexion device have been described above and are not repeated here. Optionally, during the physical fitness test, the first camera may acquire a series of graphic code images in real time and transmit them to the processing device, and the processing device can determine, from these images, the target graphic code image at the completion of the test. Alternatively, the first camera may acquire a series of graphic code images in real time, itself determine the target graphic code image at the completion of the test, and then transmit only that image to the processing device. As a further alternative, the first camera may collect only the target graphic code image at the completion of the test and transmit it to the processing device.
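Determining the "target" image from a stream of decoded positions requires some completion rule. The sketch below uses one plausible rule (the farthest position held for a few consecutive frames); the patent does not fix a specific rule, so this is an assumption.

```python
def target_position(scanned_positions, hold_frames=3):
    """Pick the completion position from a stream of decoded positions:
    here, the farthest position that was held for `hold_frames`
    consecutive frames (one plausible completion rule, chosen for
    illustration). Returns None if no position was ever held."""
    best = None
    run_value, run_len = None, 0
    for p in scanned_positions:
        if p == run_value:
            run_len += 1
        else:
            run_value, run_len = p, 1  # a new position starts a new run
        if run_len >= hold_frames and (best is None or run_value > best):
            best = run_value
    return best


# Tester pushes out to 35, jitters back once, then holds 35 steadily.
pos = target_position([10, 18, 27, 35, 34, 35, 35, 35])
```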
Step S520: determine the physical measurement result of the physical testing person based on the graphic code image. The physical measurement result includes at least one of a test score, the position of the movable member when the test is completed, and the distance the movable member was pushed.
Based on the graphic code image obtained in step S510, the position reached by the movable member and/or the distance it moved at the completion of the test can be determined, from which the physical measurement result of the physical testing person follows.
With the physical fitness test method above, the score of a physical testing person can be calculated automatically from the graphic codes scanned by the first camera. The image information scanned by the first camera is acquired and recognized automatically and intelligently, without manual recording, so the measurement result is easy to store and to audit later. Moreover, compared with a simple infrared-sensing detection scheme, this computer-vision-based measurement has a smaller error and higher accuracy.
In addition, if coordinate values were printed directly at different physical positions on the base and read with optical character recognition (OCR), a digit only partially visible at the edge of the captured image could easily be misrecognized, introducing errors into the test result. With graphic codes, by contrast, a code that is only partially visible at the image edge simply fails to decode rather than decoding to a wrong value, so errors caused by misread coordinates are avoided and the accuracy of the measurement result is improved.
Illustratively, the fitness test method according to the embodiment of the present application may be implemented in a device, apparatus or system having a memory and a processor.
The physical fitness test method according to the embodiment of the application may be deployed at the image acquisition end, for example at a seat body forward flexion device with an image acquisition function.
Alternatively, the physical fitness test method according to the embodiment of the present application may be deployed in a distributed manner across a server side (or cloud) and a terminal. For example, the graphic code images may be collected at the seat body forward flexion device and transmitted to the server (or cloud), which then computes the test result.
Illustratively, determining the physical measurement result of the physical testing person based on the graphic code image includes: determining the position reached by the movable member and/or the distance it moved, based on the position information corresponding to the positioning line in the graphic code image; and determining the physical measurement result from that position and/or distance. The positioning line is a straight line at a specified position in the frame of the first camera, extending perpendicular to the moving direction of the movable member; alternatively, the positioning line is the line obtained by projecting, into the frame of the first camera, the side of the movable member that faces the physical testing person.
Referring back to fig. 3, an exemplary positioning line 310 is shown. The positioning line may be a virtual line whose position relative to the first camera is fixed, i.e., whose position in the frame captured by the first camera does not change. As the movable member moves, the captured graphic code image changes while the positioning line stays put, so the physical position represented by the graphic code that overlaps the positioning line differs from image to image. Taking the positioning line as the reference, the physical position represented by the overlapping graphic code can be used as the physical position corresponding to the positioning line. It can be understood that, because the positioning line is fixed relative to the frame of the first camera while the first camera moves together with the movable member, the difference between the physical position corresponding to the positioning line in the target graphic code image (captured when the test is completed) and that in the initial graphic code image (captured when the test begins) equals the difference between the physical positions of the movable member at those two moments. That is, the position reached by the movable member and/or the distance it moved at the completion of the physical fitness test can be determined from the image information corresponding to the positioning line at that moment.
In one example, the raw image acquired by the first camera does not contain a positioning line; the line may be superimposed on the image by a processor in the first camera, so that the graphic code images output to the processing device carry the positioning line. In another example, the first camera may output graphic code images without positioning lines, and the processing device superimposes the line on each received image. Of course, neither the first camera nor the processing device need actually superimpose the line, as long as the processing device can determine where the positioning line lies on each graphic code image.
The position of the positioning line may be specified at any suitable location in the frame of the first camera. In a preferred embodiment, the positioning line is the line obtained by projecting, into the frame of the first camera, the side of the movable member that faces the physical testing person. Referring back to fig. 2, that side A' is shown in dashed lines. Extending the side A' and projecting it onto the base, or onto the image captured by the first camera, yields a projection line parallel to the x-axis (the x-axis is defined later), which can serve as the positioning line. Because this is the projection of the surface touched by the hands of the physical testing person, the position and/or distance determined from it is accurate.
It can be understood that, however the positioning line is chosen, the distance between the physical position corresponding to the positioning line in the target graphic code image (at the end of the test) and that in the initial graphic code image (at the start of the test) equals the distance moved by the movable member. Further, when the positioning line is the line (which may be called the projection line) obtained by projecting the side of the movable member facing the physical testing person into the frame of the first camera, the physical position corresponding to the positioning line in the target graphic code image is exactly the position reached by the movable member at the end of the test. When a specific distance exists between the positioning line and the projection line, the position reached by the movable member at the end of the test is obtained by adding that distance to, or subtracting it from, the physical position corresponding to the positioning line in the target graphic code image.
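The relations just stated can be sketched in a few lines of Python. This is a minimal illustration, not part of the patent; the function names and the sign convention for the offset between positioning line and projection line are assumptions.

```python
# Sketch: the moved distance is the difference between the positioning line's
# physical positions at the end and the start of the test; a known offset
# between the positioning line and the projection line of the movable member's
# front face shifts the reached position accordingly.

def moved_distance_cm(line_pos_end_cm: float, line_pos_start_cm: float) -> float:
    """Distance the movable member was pushed, in centimeters."""
    return line_pos_end_cm - line_pos_start_cm

def reached_position_cm(line_pos_end_cm: float, line_offset_cm: float = 0.0) -> float:
    """Position reached by the movable member. line_offset_cm is the signed
    distance from the positioning line to the projection line (0 when the
    positioning line is the projection line itself)."""
    return line_pos_end_cm + line_offset_cm
```

For example, if the positioning line corresponds to 3.0 cm at the start and 18.6 cm at the end, the movable member moved 15.6 cm.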
Illustratively, determining the position reached by the movable member and/or the distance it moved based on the position information corresponding to the positioning line in the graphic code image includes: when the positioning line falls on a graphic code, determining from the graphic code image a target graphic code located before or after the current graphic code, the current graphic code being the one on which the positioning line falls; and determining the position reached and/or the distance moved by the movable member based at least on the current graphic code and the target graphic code.
Note that, on the base, "before" and "after" may be defined by the moving direction of the movable member during the test: a position passed earlier is before, a position passed later is after. Likewise, whether a graphic code lies before or after the current graphic code is defined by the moving direction of the movable member: a graphic code encountered earlier than the current graphic code is before it, and one encountered later is after it.
The current graphic code is the one on which the positioning line falls, and because the line overlaps it, it is recognized somewhat less reliably than the codes on either side. Indirectly calculating the current position of the movable member from a graphic code before or after the current one therefore yields a more accurate measurement.
Illustratively, determining the position reached and/or the distance moved by the movable member based at least on the current graphic code and the target graphic code includes: counting the number of pixels between the positioning line and a target edge of the current graphic code, the target edge being either edge parallel to the positioning line; converting that pixel count into the distance from the positioning line to the target edge; calculating the physical position corresponding to the positioning line based on at least that distance, the interval between the current graphic code and the target graphic code, and the physical position represented by the target graphic code; and determining the position reached and/or the distance moved by the movable member from the physical position corresponding to the positioning line. The target edge may be either of the two edges of the current graphic code parallel to the positioning line, i.e., the preceding or the succeeding edge; "preceding" and "succeeding" are understood as defined above and are not repeated here.
The relationship between the printed size of a graphic code and its actual physical size is known — for example, a square two-dimensional code may measure 1 cm × 1 cm — and the number of pixels each graphic code occupies in the image captured by the first camera is also known. The physical distance represented by each pixel in the graphic code image can therefore be computed, and it is constant. The distance from the positioning line to the target edge can then be obtained from the number of pixels between them. The remaining quantities — the physical position corresponding to each graphic code, the interval between adjacent graphic codes, the side length of each graphic code, and so on — are all known, so the physical position corresponding to the positioning line can be calculated from this information.
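The pixel-to-distance conversion described above can be sketched as follows. The function name and the 1 cm default side length are illustrative assumptions; the text only states that these quantities are known.

```python
def pixels_to_cm(pixel_count: float, code_side_px: float, code_side_cm: float = 1.0) -> float:
    """Convert a pixel count measured along the y-axis of the graphic code
    image into centimeters, using the known physical side length of a
    graphic code and the number of pixels it spans in the image."""
    cm_per_pixel = code_side_cm / code_side_px  # constant for a given setup
    return pixel_count * cm_per_pixel
```

For example, if a 1 cm × 1 cm code spans 100 pixels in the image, then 60 pixels between the positioning line and a target edge correspond to 0.6 cm.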
For accurate determination of the distance, a coordinate system may be defined. For example, a direction parallel to the moving direction of the movable member in the graphic code image is set as the y-axis, and a direction perpendicular to the y-axis is set as the x-axis. In the case where the graphic code is a two-dimensional code, the positions of the graphic code on the x-axis and the y-axis may be identified based on the positioning blocks on the graphic code. It can be understood that when length information such as the interval between graphic codes, the side length of the graphic codes, the distance between the positioning line and the side of the graphic codes, and the like is calculated, the length in the y-axis direction is calculated.
Illustratively, calculating the physical position corresponding to the positioning line based on at least the distance from the positioning line to the target edge, the interval between the current graphic code and the target graphic code, and the physical position represented by the target graphic code includes the following cases. If the target graphic code is before the current graphic code and the target edge is the edge in front of the positioning line: the physical position corresponding to the positioning line is the first sum of the physical position represented by the target graphic code, the first distance, the interval between the two codes, and the distance from the positioning line to the target edge. If the target graphic code is before the current graphic code and the target edge is the edge behind the positioning line: form the second sum of the physical position represented by the target graphic code, the first distance, the interval between the two codes, and the edge length of the current graphic code; the physical position corresponding to the positioning line is the first difference, namely the second sum minus the distance from the positioning line to the target edge. If the target graphic code is after the current graphic code and the target edge is the edge in front of the positioning line: form the second difference, namely the physical position represented by the target graphic code minus the second distance, the interval between the two codes, and the edge length of the current graphic code; the physical position corresponding to the positioning line is the third sum of the second difference and the distance from the positioning line to the target edge. If the target graphic code is after the current graphic code and the target edge is the edge behind the positioning line: the physical position corresponding to the positioning line is the third difference, namely the physical position represented by the target graphic code minus the second distance, the interval between the two codes, and the distance from the positioning line to the target edge. Here the first distance is the distance between the position tick mark corresponding to the physical position represented by a graphic code and the rear edge of that graphic code, and the second distance is the distance between that tick mark and the front edge of that graphic code.
A graphic code represents a specific physical position; that is, it corresponds to a specific scale mark (referred to herein as a position tick mark). Because a graphic code has a certain width, how it is aligned with its position tick mark can be chosen as needed. In one example, one particular edge of each graphic code is aligned with the tick mark. For example, the front edge of the graphic code representing 16 cm may be aligned with the tick mark 16. Referring to fig. 3, the plane A containing the front edge of the graphic code representing 16 cm is shown by a dotted line (the left-right direction in fig. 3 corresponds to the front-back direction on the base, i.e., the y-axis direction), and the value "16" of the corresponding position tick mark is shown above the graphic code; as shown in fig. 3, the front edge A of the graphic code is aligned with the tick mark "16". In other examples, the center line or the rear edge of each graphic code may instead be aligned with the tick mark, and other suitable alignments are equally possible and are not enumerated here. It can be understood that changing the alignment changes the distances between the tick mark and the rear and front edges of the graphic code; once the alignment is fixed, the first distance and the second distance are determined in advance.
The following describes a position determination manner of the movable member of the present application, taking as an example that the front edge of each graphic code is aligned with the corresponding position tick mark. It can be understood that, in that case, the first distance equals the edge length of the two-dimensional code and the second distance equals 0.
In one embodiment, the graphic code is a two-dimensional code and the first camera scans three two-dimensional codes at a time. Referring to fig. 3, three scanned two-dimensional codes C1, C2, and C3 are shown; the current two-dimensional code is C2, in the middle. The code C1, before the current code, can be chosen as the target two-dimensional code. Suppose decoding the target two-dimensional code C1 gives the physical position 16 cm. The number of pixels between the positioning line and the target edge B1 of the current code (the edge closer to C1) is counted and converted to a distance, say 6 mm. Adding the 1 cm interval between adjacent codes and the 1 cm side length of a code (i.e., the first distance) gives the physical position corresponding to the positioning line: 16 + 0.6 + 1 + 1 = 18.6 cm. Similarly, if the target two-dimensional code is instead C3, after the current code, suppose decoding C3 gives 20 cm. The number of pixels between the positioning line and the target edge B2 of the current code (the edge closer to C3) is counted and converted to a distance, say 4 mm. Subtracting from the physical position of C3 the 1 cm interval between adjacent codes and the 4 mm distance from the positioning line to B2 (since the second distance is 0, it is in effect also subtracted) gives 20 − 0.4 − 1 = 18.6 cm. Either way, the physical position corresponding to the positioning line is 18.6 cm, from which the position reached and/or the distance moved by the movable member can be determined.
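The worked example above, for the case where the positioning line falls on a graphic code, can be sketched in Python. The function names are illustrative; as in the example, the front edge of each code is assumed aligned with its position tick mark, so the first distance equals the 1 cm side length and the second distance is 0.

```python
# Assumed geometry from the worked example (not mandated by the patent):
CODE_SIDE_CM = 1.0            # side length of each two-dimensional code
GAP_CM = 1.0                  # interval between adjacent codes
FIRST_DISTANCE = CODE_SIDE_CM # tick mark to rear edge (front edge aligned)
SECOND_DISTANCE = 0.0         # tick mark to front edge

def position_from_front_neighbor(target_tick_cm, line_to_front_edge_cm):
    """Target code lies before the current code; the target edge is the
    current code's front edge (the edge in front of the positioning line)."""
    return target_tick_cm + FIRST_DISTANCE + GAP_CM + line_to_front_edge_cm

def position_from_rear_neighbor(target_tick_cm, line_to_rear_edge_cm):
    """Target code lies after the current code; the target edge is the
    current code's rear edge (the edge behind the positioning line)."""
    return target_tick_cm - SECOND_DISTANCE - GAP_CM - line_to_rear_edge_cm
```

Using the numbers from the example, both directions agree: position_from_front_neighbor(16.0, 0.6) and position_from_rear_neighbor(20.0, 0.4) both evaluate to 18.6 cm.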
Selecting a different graphic code as the target graphic code, or a different edge of the current graphic code as the target edge, leads to a different calculation; with the examples above, a person skilled in the art will understand how the position is determined in each case, so the remaining cases are not detailed here.
Illustratively, determining the position reached by the movable member and/or the distance it moved based on the position information corresponding to the positioning line in the graphic code image includes: when the positioning line does not fall on any graphic code, determining from the graphic code image a target graphic code located before or after the positioning line; and determining the position reached and/or the distance moved by the movable member based on the target graphic code.
When the positioning line does not fall on any graphic code, the position reached and/or the distance moved by the movable member can be determined directly from a target graphic code before or after the positioning line.
Illustratively, determining the position reached and/or the distance moved by the movable member based on the target graphic code comprises: calculating the number of pixels between the positioning line and a target edge of the target graphic code, wherein the target edge is any edge parallel to the positioning line; calculating the distance from the positioning line to the target edge based on the number of pixels between the positioning line and the target edge; calculating a physical position corresponding to the positioning line based on the distance from the positioning line to the target edge and the physical position represented by the target graphic code; based on the physical position to which the location line corresponds, the position reached and/or the distance moved by the movable member is determined.
The target edge is either of the two edges of the target graphic code that are parallel to the positioning line. The calculation of the distance between the positioning line and a target edge, and how that distance is combined with other quantities to determine the physical position corresponding to the positioning line, have been described above and are not repeated here.
Illustratively, calculating the physical position corresponding to the positioning line based on the distance from the positioning line to the target edge and the physical position represented by the target graphic code includes the following cases. If the target graphic code is before the positioning line and the target edge is the edge farther from the line: form the fourth sum of the physical position represented by the target graphic code and the distance from the positioning line to the target edge; the physical position corresponding to the positioning line is the fourth difference, namely the fourth sum minus the second distance. If the target graphic code is before the positioning line and the target edge is the edge closer to the line: the physical position corresponding to the positioning line is the fifth sum of the physical position represented by the target graphic code, the first distance, and the distance from the positioning line to the target edge. If the target graphic code is after the positioning line and the target edge is the edge farther from the line: form the sixth sum of the physical position represented by the target graphic code and the first distance; the physical position corresponding to the positioning line is the fifth difference, namely the sixth sum minus the distance from the positioning line to the target edge. If the target graphic code is after the positioning line and the target edge is the edge closer to the line: the physical position corresponding to the positioning line is the sixth difference, namely the physical position represented by the target graphic code minus the second distance and minus the distance from the positioning line to the target edge. As before, the first distance is the distance between the position tick mark corresponding to the physical position represented by a graphic code and the rear edge of that graphic code, and the second distance is the distance between that tick mark and the front edge of that graphic code.
Another position determination manner of the movable member of the present application is described below, again taking as an example that the front edge of each graphic code is aligned with the corresponding position tick mark. As before, in that case the first distance equals the edge length of the two-dimensional code and the second distance equals 0.
In one embodiment, the graphic code is a two-dimensional code and the first camera scans three two-dimensional codes at a time. Referring to fig. 3, three scanned two-dimensional codes C4, C5, and C6 are shown; in this example, the positioning line 310' does not fall on any of them. The code C4, before the positioning line, can be chosen as the target two-dimensional code. Suppose decoding the target two-dimensional code C4 gives the physical position 30 cm. The number of pixels between the positioning line and the target edge B3 of C4 (the edge closer to the line) is counted and converted to a distance, say 3 mm. Adding the 1 cm side length of the code (i.e., the first distance) gives the physical position corresponding to the positioning line: 30 + 0.3 + 1 = 31.3 cm. Similarly, if the target two-dimensional code is C5, after the positioning line, suppose decoding C5 gives 32 cm. The number of pixels between the positioning line and the target edge B4 of C5 (the edge closer to the line) is counted and converted to a distance, say 7 mm, which is subtracted from the physical position of C5 (since the second distance is 0, it is in effect also subtracted): 32 − 0.7 = 31.3 cm. Either way, the physical position corresponding to the positioning line is 31.3 cm, from which the position reached and/or the distance moved by the movable member can be determined.
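The second worked example, for the case where the positioning line falls in the gap between codes, can be sketched the same way. The function names are illustrative; the same alignment assumption applies (front edge aligned with the tick mark, so the first distance equals the 1 cm side length and the second distance is 0).

```python
# Assumed geometry from the worked example (not mandated by the patent):
CODE_SIDE_CM = 1.0
FIRST_DISTANCE = CODE_SIDE_CM  # tick mark to rear edge
SECOND_DISTANCE = 0.0          # tick mark to front edge

def position_from_code_before_line(target_tick_cm, line_to_rear_edge_cm):
    """Target code lies before the positioning line; the target edge is its
    rear edge (the edge closer to the line)."""
    return target_tick_cm + FIRST_DISTANCE + line_to_rear_edge_cm

def position_from_code_after_line(target_tick_cm, line_to_front_edge_cm):
    """Target code lies after the positioning line; the target edge is its
    front edge (the edge closer to the line)."""
    return target_tick_cm - SECOND_DISTANCE - line_to_front_edge_cm
```

With the example numbers, position_from_code_before_line(30.0, 0.3) and position_from_code_after_line(32.0, 0.7) both evaluate to 31.3 cm, matching the text.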
According to the embodiment of the present application, the first camera captures at least three graphic codes at a time. Capturing at least three codes per frame ensures that a graphic code before or after the current one is always available as the target graphic code for determining the physical position corresponding to the positioning line. As described above, computing the position from a code before or after the current graphic code gives a more accurate result.
According to an embodiment of the application, the method further includes: monitoring in real time whether the graphic code image acquired by the first camera remains unchanged throughout a first target period; and, if it does, determining that the physical testing person has completed the physical fitness test.
The first target period may be set to any suitable duration as needed. If the graphic code image does not change within the first target period, the physical testing person has pushed the movable member to the furthest position they can reach, and it can therefore be determined that the test is complete.
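This completion check can be sketched as a simple polling loop. It is a minimal illustration under the assumption that the processing device can repeatedly query the position decoded from the latest graphic code image; the function names, the 1-second default period, and the polling interval are illustrative choices, not values from the text.

```python
import time

def wait_for_completion(decode_position, target_period_s=1.0, poll_s=0.05):
    """Poll the position decoded from the first camera's graphic code image;
    return it once it has remained unchanged for target_period_s seconds,
    i.e. once the movable member has stopped at its furthest position."""
    last = decode_position()
    stable_since = time.monotonic()
    while True:
        time.sleep(poll_s)
        current = decode_position()
        if current != last:
            # The movable member is still moving: restart the stability timer.
            last = current
            stable_since = time.monotonic()
        elif time.monotonic() - stable_since >= target_period_s:
            return current
```

In practice the same idea could be applied to the raw image (e.g. a frame-difference check) rather than the decoded position.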
According to an embodiment of the application, before the physical measurement result of the physical testing person is determined from the graphic code image, the method further includes: performing distortion correction on the graphic code image.
The graphic code image collected by the first camera may be distorted, for example because of the shooting distance. In that case the distorted image can be rectified using the intrinsic and extrinsic parameters of the first camera, and the position information indicated by the graphic codes is then extracted from the corrected image to determine the corresponding measurement result.
Illustratively, the fitness test method 500 may further include: receiving an image of the physical testing person acquired by a second camera, the second camera being arranged at a first target distance and a first target height from the target body measurement area of the seat body forward flexion device and used to capture images of the person in that area; performing human posture recognition on the image acquired by the second camera to determine the positions of a plurality of key points; and judging, based on the positions of the key points, whether the physical testing person performs irregular actions.
In one embodiment, the second camera is fixed at a first target distance and a first target height from the target body measurement area of the seat body forward flexion device. Its arrangement and image acquisition have been described above and are not repeated here.
With this scheme, whether the physical testing person performs irregular actions can be judged from the images collected by the second camera, enabling real-time, automatic detection of irregular actions and better supervision of the testing process.
In one embodiment, human posture recognition is first performed on the image collected by the second camera to establish the posture of the physical testing person during the test. For example and without limitation, the image may be preprocessed (e.g., scaling, color-space conversion) and posture recognition performed on the preprocessed image. Posture recognition may be implemented with any existing or future human skeleton detection model: such a model detects the positions of the relevant skeletal points, i.e., the key points. The detected key points may include at least 12 primary key points — left foot, right foot, left knee, right knee, left hip, right hip, left hand, right hand, left elbow, right elbow, left shoulder, and right shoulder — and the model may optionally detect any number of additional key points.
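For illustration, the 12 primary key points above can be represented as a simple constant together with a completeness check. The names and the dictionary layout are assumptions; the text fixes only the set of body parts, not any data format or model.

```python
# The 12 primary key points named in the text (names are illustrative).
PRIMARY_KEYPOINTS = [
    'left_foot', 'right_foot', 'left_knee', 'right_knee',
    'left_hip', 'right_hip', 'left_hand', 'right_hand',
    'left_elbow', 'right_elbow', 'left_shoulder', 'right_shoulder',
]

def has_all_primary_keypoints(detected):
    """Check that a detection result (name -> (x, y) pixel tuple, or None
    when a point was not detected) covers all 12 primary key points."""
    return all(detected.get(name) is not None for name in PRIMARY_KEYPOINTS)
```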
The scheme of judging whether the posture constitutes an irregular action from the positions of human key points is simple to implement, gives accurate posture judgments, and facilitates fast, real-time supervision of the test process.
Illustratively, determining whether the physical testing person has performed an irregular action based on the locations of the plurality of key points includes: determining that the physical testing person has performed the irregular one-handed push action if a first condition is met, the first condition comprising one or more of: the key points of either arm are missing; the distance between the left-hand key point and the right-hand key point on a first coordinate axis is greater than a first target distance; the distance between the left-hand key point and the right-hand key point on a second coordinate axis is smaller than a second target distance; the distance between the left-elbow key point and the right-elbow key point on the second coordinate axis is smaller than a third target distance. The first coordinate axis is parallel to the moving direction of the movable member, the second coordinate axis is perpendicular to the first coordinate axis, and the plane containing the two axes is parallel to the horizontal plane.
The first condition may include: the key points of either arm are missing. Either arm includes the limb from the shoulder to the hand on that side, so the key points of an arm are the shoulder, elbow, and hand key points of that side. Whether the key points of both arms appear simultaneously can be checked; if the key points of only one arm appear, the irregular one-handed push can be determined directly. If key points appear on both arms, the first condition may further include one or more of the following judgments. First, the distance between the left-hand key point and the right-hand key point on the first coordinate axis is greater than the first target distance. The first coordinate axis is parallel to the moving direction of the movable member and may be denoted the y-axis. The first target distance may be any suitable size, which is not limited in this application. In one example, the first target distance is 50 pixels: if the distance between the left-hand and right-hand key points on the y-axis is greater than 50 pixels, the irregular one-handed push can be determined. Second, the distance between the left-hand key point and the right-hand key point on the second coordinate axis is smaller than the second target distance. The second coordinate axis is perpendicular to the first and may be denoted the x-axis. The second target distance may be any suitable size, which is not limited in this application. In one example, the second target distance is 100 pixels: if the distance between the left-hand and right-hand key points on the x-axis is less than 100 pixels, the irregular one-handed push can be determined.
Third, the distance between the left-elbow key point and the right-elbow key point on the second coordinate axis is smaller than the third target distance. The third target distance may be any suitable size, which is not limited in this application. In one example, assuming the third target distance is also 100 pixels, if the distance between the left-elbow and right-elbow key points on the x-axis is less than 100 pixels, the irregular one-handed push can be determined.
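The missing-arm check and the three distance checks above can be sketched as follows. The threshold values (50, 100, 100 pixels) follow the examples in the text; the key-point dictionary layout and the function name are illustrative assumptions.

```python
def one_handed_push(kps, d1=50, d2=100, d3=100):
    """Return True if the irregular one-handed push is detected.

    kps maps key-point names to (x, y) pixel coordinates, or None when
    a point was not detected. Per the text, the y-axis (index 1) is
    parallel to the movable member's travel and the x-axis (index 0)
    is perpendicular to it; d1/d2/d3 are the first/second/third
    target distances.
    """
    arm_parts = ("shoulder", "elbow", "hand")
    for side in ("left", "right"):
        if any(kps.get(f"{side}_{p}") is None for p in arm_parts):
            return True  # key points of one arm are missing
    lh, rh = kps["left_hand"], kps["right_hand"]
    le, re = kps["left_elbow"], kps["right_elbow"]
    if abs(lh[1] - rh[1]) > d1:   # hands far apart along the travel axis
        return True
    if abs(lh[0] - rh[0]) < d2:   # hands too close across the travel axis
        return True
    if abs(le[0] - re[0]) < d3:   # elbows too close across the travel axis
        return True
    return False
```

A frame passes only when both arms are fully detected, the hands are roughly level along the travel direction, and hands and elbows are spread apart across it.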
In this technical scheme, whether an irregular one-handed push has occurred is judged from the distances between the arm key points on the first and second coordinate axes, which can improve the accuracy of irregular-action judgment.
Illustratively, determining whether the physical testing person has performed an irregular action based on the locations of the plurality of key points includes: determining that the physical testing person has performed the irregular leg-bending action if a second condition is met, the second condition comprising one or more of: the first included angle is larger than a first target included angle; the second included angle is larger than a second target included angle. The first included angle is the angle between the line from the left-hip key point to the left-knee key point and the line from the left-foot key point to the left-knee key point; the second included angle is the angle between the line from the right-hip key point to the right-knee key point and the line from the right-foot key point to the right-knee key point.
The first target included angle and the second target included angle can each be set to any suitable size as needed, which is not limited in this application. Either of them may be, for example, 5°, 8°, 10°, or 15°, and the two may be the same or different.
In one embodiment, the key-point coordinates returned by the human skeleton detection model may be parsed. For example, the coordinates of the left-hip, left-knee, and left-foot key points may be obtained and used to build a skeletal model connecting these points. Similarly, the coordinates of the right-hip, right-knee, and right-foot key points may be obtained and used to build a skeletal model of the right leg in the same way. Based on the established models, the first included angle between the line from the left-hip key point to the left-knee key point and the line from the left-foot key point to the left-knee key point, and the second included angle between the corresponding right-leg lines, can be calculated. If the first included angle is larger than the first target included angle, or the second included angle is larger than the second target included angle, the physical testing person has bent a leg and the action can be judged a violation; otherwise, it can be judged compliant.
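A possible implementation of the angle check is sketched below. One point of interpretation: target angles of 5° to 15° only make sense if the "included angle" is measured as the deviation from a straight (collinear) hip-knee-foot line, so this sketch computes 180° minus the raw angle at the knee. That reading, like the function and parameter names, is an assumption.

```python
import math

def bend_angle(hip, knee, foot):
    """Deviation, in degrees, of the hip-knee-foot line from straight.

    Returns 0 when the three key points are collinear (leg straight)
    and grows as the knee bends.
    """
    v1 = (hip[0] - knee[0], hip[1] - knee[1])
    v2 = (foot[0] - knee[0], foot[1] - knee[1])
    cos = ((v1[0] * v2[0] + v1[1] * v2[1])
           / (math.hypot(*v1) * math.hypot(*v2)))
    knee_angle = math.degrees(math.acos(max(-1.0, min(1.0, cos))))
    return 180.0 - knee_angle

def leg_bend_violation(kps, target_deg=8.0):
    """True if either leg's bend exceeds the target included angle."""
    return any(
        bend_angle(kps[f"{s}_hip"], kps[f"{s}_knee"], kps[f"{s}_foot"]) > target_deg
        for s in ("left", "right")
    )
```

Clamping the cosine before `acos` guards against floating-point values marginally outside [-1, 1] for nearly collinear points.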
In this technical scheme, the leg angle can be computed from the skeleton-point coordinates to judge whether the physical testing person has performed the irregular leg-bending action, so that the validity of the test result is supervised and the accuracy of the test is improved.
Illustratively, the physical fitness test method may further include: outputting alarm information and/or judging the test result invalid when it is determined that the physical testing person has performed an irregular action.
If it is determined that the physical testing person has performed an irregular action, the result can be judged invalid. At this point, optionally, the invalidity of the result, and the reason for it, can be announced by voice. In addition, the invalid result may optionally be uploaded to a third-party platform, such as a campus network or a school education network. Optionally, alarm information may also be output when an irregular action is detected. The alarm information may include one or more of video, image, voice, light, and so on, and may be output through an output device, which may include, but is not limited to, one or more of a display screen, a speaker, a warning light, and the like.
In this way, when the physical testing person performs an irregular action, alarm information can be output in time to prompt the person or a supervisor to repeat the test or take other measures, and the result can be promptly judged invalid.
Illustratively, the fitness test method 500 may further include: acquiring identity information of the physical testing person; and, after the test is completed, associating the test execution information of the physical testing person with their identity information and storing both in a storage device, where the test execution information includes one or more of the following items: irregular-action information; the test result; videos or images of the physical testing person collected during the test; videos or images acquired for the graphic codes during the test; and images captured of the physical testing person when irregular actions occurred.
The identity information of the physical testing person may be obtained in any manner. In one example, it may be recognized automatically by a face recognition device. In another example, it may be entered by a user through an input device; the user can be any person, such as the physical testing person or a supervisor.
The test result of the physical testing person can be represented by the physical position of the movable member at the farthest point to which it was pushed. After the test is completed, if the physical testing person performed irregular actions, the irregular-action information can be recorded; it may include the category of the irregular action (e.g., whether it was a one-handed push or a leg bend) and/or the time at which it occurred. In addition, images of the physical testing person at the moment of the irregular action may optionally be captured by the second camera, yielding one or more snapshot images, which are then stored. Furthermore, regardless of whether irregular actions occurred, the second camera can record the physical testing person throughout the test, producing a video of the whole process, or capture the person at random or periodic intervals, producing one or more still images. Likewise, a video or one or more still images of the graphic codes can be acquired by the first camera during the test, and these can also be stored.
The irregular-action information, the test result, the videos or images of the physical testing person collected during the test, the videos or images acquired for the graphic codes, and the images captured when irregular actions occurred can all be regarded as test execution information. Storing this information in association with the identity information makes it convenient to read and review at any later time.
Illustratively, the method further includes: receiving identity information of a target physical testing person; searching the storage device based on that identity information; and outputting at least a part of the retrieved test execution information associated with the identity information of the target physical testing person.
The identity information of the target physical testing person can be acquired automatically by face recognition or entered by a user. When the test execution information is output, at least a part of it can be output, and the specific information output can be configured as needed. For example, to review the irregular actions of the target person, only the irregular-action information and the snapshot images captured when the irregular actions occurred may be output.
With this scheme, when the physical testing person objects to the final result or to an irregular-action judgment, evidence can be obtained by replaying the stored images or videos.
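The association between identity information and test execution information described above can be sketched as a simple keyed store. The record fields and function names are illustrative assumptions, not the application's actual data model.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class TestExecutionRecord:
    """Per-person execution info, stored keyed by identity."""
    score: Optional[float] = None
    irregular_actions: List[str] = field(default_factory=list)  # e.g. "one-handed push at 12.3 s"
    media_paths: List[str] = field(default_factory=list)        # test footage, graphic-code footage, snapshots

_records: Dict[str, TestExecutionRecord] = {}

def save_record(identity: str, record: TestExecutionRecord) -> None:
    """Associate execution info with an identity after the test."""
    _records[identity] = record

def find_record(identity: str) -> Optional[TestExecutionRecord]:
    """Look up stored execution info for later review or evidence."""
    return _records.get(identity)
```

In a deployment the dictionary would be replaced by the storage device (database, file store), but the associate-then-retrieve shape stays the same.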
Illustratively, the fitness test method 500 may further include: receiving identity information of the physical testing person output by a face recognition device, where the face recognition device is arranged at a second target distance and a second target height around the target test area of the seat body forward-flexion device and is used for acquiring a face image of the physical testing person and recognizing their identity from it; outputting guidance information instructing the physical testing person to move to the target test area; and judging, based on the image acquired by the second camera, whether the physical testing person has made the preparatory action, and outputting a test-start instruction once they have, where the second camera is arranged at the first target distance and the first target height around the target test area and is used for acquiring images of the physical testing person within it.
In one embodiment, at the beginning of the fitness test, the physical testing person may walk to the face recognition device for face recognition. The device collects their face image and determines their identity from it. If the current person is confirmed as a person of known identity, a voice announcement such as "xxx (the person's name), registration successful" can be made through a speaker on the face recognition device. The person may then optionally be further guided by voice, through the same speaker, to move to the target test area and to make the preparatory action. Once the current physical testing person has made the preparatory action, a test-start instruction can be announced through the speaker to indicate that the seated forward-bend test may begin. Although the output of guidance and instruction information is described above using a speaker on the face recognition device as an example, this is only an example: such information may be output by any suitable output device, which may be integrated into the face recognition device or independent of it.
For example, after the test of the physical testing person is completed, their test result may further be associated with their identity information; the associated result and identity information may be saved in a storage device or uploaded to a third-party platform. Associating the result with the identity information makes it convenient to check any person's result later.
In this technical scheme, face recognition is performed on the physical testing person to confirm their identity, and guidance information is output to lead them through the test, which raises the degree of intelligence of the fitness test and improves the user experience.
Illustratively, determining whether the physical testing person has made the preparatory action based on the image captured by the second camera includes: judging whether the left arm is straight based on the coordinates of the left-hand, left-elbow, and left-shoulder key points; judging whether the right arm is straight based on the coordinates of the right-hand, right-elbow, and right-shoulder key points; judging whether the left leg is straight based on the coordinates of the left-foot, left-knee, and left-hip key points; judging whether the right leg is straight based on the coordinates of the right-foot, right-knee, and right-hip key points; judging whether the left hand rests on the movable member based on the coordinates of the left-hand key point; judging whether the right hand rests on the movable member based on the coordinates of the right-hand key point; judging whether the left foot rests on the foot pedal based on the coordinates of the left-foot key point; judging whether the right foot rests on the foot pedal based on the coordinates of the right-foot key point; and determining that the physical testing person has made the preparatory action when both arms and both legs are straight, both hands rest on the movable member, and both feet rest on the foot pedal.
Judging whether the left arm is straight can be done in a manner similar to judging whether the left or right leg is straight. For example, it may be judged whether a third included angle, between the line from the left-shoulder key point to the left-elbow key point and the line from the left-hand key point to the left-elbow key point, is greater than a third target included angle; if so, the left arm is judged not straight, and otherwise it is judged straight. Similarly, it may be judged whether a fourth included angle, between the line from the right-shoulder key point to the right-elbow key point and the line from the right-hand key point to the right-elbow key point, is greater than a fourth target included angle; if so, the right arm is judged not straight, and otherwise it is judged straight. The third and fourth target included angles may be set to any suitable size as needed, which is not limited in this application, and they may be the same or different. Whether the left and right legs are straight can be judged based on the first and second included angles described above, which is not repeated here.
Furthermore, whether the left and right hands rest on the movable member and whether the left and right feet rest on the foot pedal can also be detected. Illustratively, the regions in which the movable member and the foot pedal are located may be detected by object detection techniques. The distance between the coordinates of the left-hand key point and the region of the movable member can then be determined; if it is less than a certain threshold, the left hand can be judged to rest on the movable member. Similarly, the distance between the coordinates of the right-hand key point and the region of the movable member can be determined; if it is less than a certain threshold, the right hand can be judged to rest on the movable member. Likewise, the distances between the left-foot and right-foot key points and the region of the foot pedal can be determined; if such a distance is less than a certain threshold, the corresponding foot can be judged to rest on the foot pedal.
When both arms and both legs are straight, both hands rest on the movable member, and both feet rest on the foot pedal, it is determined that the physical testing person has made the preparatory action.
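Combining the straightness and placement checks, the preparatory-action judgment might look like the sketch below. The proximity threshold, the (xmin, ymin, xmax, ymax) box format for the detected regions, and the use of a deviation-from-straight angle for the limb checks are assumptions for illustration.

```python
import math

def _deviation(a, b, c):
    """Degrees by which points a-b-c deviate from a straight line at b."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    cos = ((v1[0] * v2[0] + v1[1] * v2[1])
           / (math.hypot(*v1) * math.hypot(*v2)))
    return 180.0 - math.degrees(math.acos(max(-1.0, min(1.0, cos))))

def _near(box, pt, tol):
    """True if pt lies within tol pixels of box = (xmin, ymin, xmax, ymax)."""
    dx = max(box[0] - pt[0], 0.0, pt[0] - box[2])
    dy = max(box[1] - pt[1], 0.0, pt[1] - box[3])
    return math.hypot(dx, dy) <= tol

def preparatory_action(kps, plate_box, pedal_box, straight_deg=8.0, tol=30.0):
    """True once all four limbs are straight, both hands rest on the
    movable member's detected region, and both feet rest on the pedal's."""
    limbs = [("shoulder", "elbow", "hand"), ("hip", "knee", "foot")]
    for side in ("left", "right"):
        for a, b, c in limbs:
            if _deviation(kps[f"{side}_{a}"], kps[f"{side}_{b}"],
                          kps[f"{side}_{c}"]) > straight_deg:
                return False
    return (all(_near(plate_box, kps[f"{s}_hand"], tol) for s in ("left", "right"))
            and all(_near(pedal_box, kps[f"{s}_foot"], tol) for s in ("left", "right")))
```

The plate and pedal boxes would come from the object detection step described above; the test-start instruction is issued only once this function returns True.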
In this technical scheme, whether the physical testing person has made the preparatory action, and hence whether the fitness test can begin, is judged from a plurality of key points, which raises the degree of intelligence of the fitness test.
Illustratively, the fitness test method 500 further includes: judging in real time whether the graphic code image acquired by the first camera remains unchanged over a second target time period; and, if it does, sending a reset instruction to the motor in the reset device to set the motor running. The reset device may include a telescopic member and a motor: the rotating shaft of the motor is connected to a first end of the telescopic member, a second end of the telescopic member is connected to the movable member, and the motor drives the telescopic member to contract when it actively rotates.
The second target time period may be the same as or different from the first target time period. In one example the two are the same: when it is judged in real time that the graphic code image acquired by the first camera has not changed over this period, it can be determined that the fitness test of the physical testing person is complete and an automatic reset can be performed.
In one embodiment, whether the graphic code scanned by the first camera remains unchanged over the second target time period is judged in real time. The second target time period may be any suitable length, which is not limited in this application; for example, it may be 3 seconds. If the scanned graphic code changes within 3 seconds, the physical testing person has not finished the test. If it does not change within 3 seconds, the test is finished; at this point a reset instruction can be sent to the motor in the reset device to drive the telescopic member and restore the movable member to its initial position.
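The unchanged-code watchdog can be sketched as below. `read_code` and `send_reset` are stand-ins for the device's actual camera-decoding and motor-control interfaces, and the timing parameters are illustrative.

```python
import time

def watch_and_reset(read_code, send_reset, hold_seconds=3.0, poll_seconds=0.2):
    """Poll the decoded graphic code; once it has stayed unchanged for
    hold_seconds (the second target time period), issue the reset
    instruction and return the final code, which marks the farthest
    position the movable member reached."""
    last = read_code()
    unchanged_since = time.monotonic()
    while True:
        code = read_code()
        now = time.monotonic()
        if code != last:
            last, unchanged_since = code, now  # still being pushed
        elif now - unchanged_since >= hold_seconds:
            send_reset()  # motor contracts the telescopic member
            return last
        time.sleep(poll_seconds)
```

Using `time.monotonic()` rather than wall-clock time keeps the interval measurement immune to system clock adjustments.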
With the reset device, the seat body forward-flexion equipment needs no manual reset and can reset automatically, which further raises its degree of intelligence and frees up manpower.
An exemplary process of the fitness test method is briefly described below. FIG. 6 illustrates an exemplary flow chart of a fitness test method according to one embodiment of the present application. Referring to fig. 6, the process of the fitness test method is generally as follows:
1. After hearing the examiner call the roll, the examinee can answer and then walk to the face recognition device in the seat body forward-flexion test system for face recognition.
2. After the test system identifies the examinee, it can announce that the examinee's name has been successfully registered, then guide the examinee by voice to the seated forward-bend exercise area (i.e., the target test area) and instruct them to make the preparatory action.
3. After the second camera of the test system detects that the examinee has made the preparatory action, the system can announce by voice "XXX, please begin the test"; on hearing permission to begin, the examinee starts the test and pushes the movable member of the seat body forward-flexion device according to the detailed requirements of the seated forward-bend action.
4. If irregular actions such as leg bending or a one-handed push occur during the test movement, the test system can recognize the examinee's action and judge it a violation.
5. A valid score is reported, the video and/or images collected by the cameras (the second camera and/or the first camera) are uploaded to the campus network, and the examinee's score is announced by voice. If an irregular action occurred, the invalidity of the score and the reason for it are announced by voice, and the result is likewise uploaded to the third-party platform.
6. After the result is announced, the examinee is asked by voice to leave the examination area.
While the physical ability testing method 500 provided in the embodiments of the present application is mainly described in conjunction with the seat body forward-flexion device 200, it should be noted that this is merely an example: the method 500 is not limited to the seat body forward-flexion device shown in fig. 2 and can also be applied to other, similar testing devices that perform physical tests based on the graphic-code principle.
According to another aspect of the present application, there is provided a physical fitness test device. Figure 7 shows a schematic block diagram of a fitness test device 700 according to one embodiment of the present application.
As shown in fig. 7, the fitness test device 700 according to the embodiment of the present application includes an acquisition module 710 and a determination module 720. The modules may perform the steps of the fitness test method described above with respect to fig. 5. Only the main functions of the components of the physical fitness test device 700 will be described below, and the details that have been described above will be omitted.
The obtaining module 710 is configured to obtain a graphic code image acquired by the first camera for a graphic code on a base of the seat body forward flexion equipment when a physical performance test performed by the physical testing personnel based on the seat body forward flexion equipment is completed, wherein a plurality of graphic codes are arranged on the base along a moving path of the movable member, and the movable member of the seat body forward flexion equipment drives the first camera to move under the pushing of the physical testing personnel. The obtaining module 710 may be implemented by the processor 102 in the electronic device shown in fig. 1 executing program instructions stored in the storage 104.
The determining module 720 is used for determining the physical measurement result of the physical measurement person based on the graphic code image; the physical test result comprises at least one of a physical test score, position information corresponding to the movable piece when the physical test is completed, and distance information of the movable piece pushed. The determination module 720 may be implemented by the processor 102 in the electronic device shown in fig. 1 executing program instructions stored in the storage 104.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
FIG. 8 shows a schematic block diagram of an electronic device 800 according to one embodiment of the present application. Electronic device 800 includes memory 810 and processor 820.
The memory 810 stores computer program instructions for implementing the respective steps in the fitness test method according to an embodiment of the present application.
The processor 820 is used for executing the computer program instructions stored in the memory 810 to perform the corresponding steps of the fitness test method according to the embodiment of the present application.
In one embodiment, the computer program instructions, when executed by the processor 820, are for performing the steps of: acquiring a graphic code image acquired by a first camera aiming at a graphic code on a base body of seat body forward-bending equipment when a physical testing of a physical testing person based on the seat body forward-bending equipment is finished, wherein a plurality of graphic codes are arranged on the base body along a moving path of a movable piece, and a movable piece of the seat body forward-bending equipment drives the first camera to move under the pushing of the physical testing person; determining the physical measurement result of the physical measurement personnel based on the graphic code image; the physical test result comprises at least one of a physical test score, position information corresponding to the movable piece when the physical test is completed, and distance information of the movable piece pushed.
Illustratively, the electronic device 800 may further include an image capture device 830. The image capturing device 830 is used for capturing graphic codes. That is, the image capture device 830 may be a first camera. The image capture device 830 is optional, and where the electronic device 800 includes the image capture device 830, the electronic device 800 may be the seat body flexion device itself. The electronic device 800 may not include the image capturing device 830, in which case the electronic device 800 may be a device other than a seat body forward flexion device.
Furthermore, according to an embodiment of the present application, there is also provided a storage medium, on which program instructions are stored, and when the program instructions are executed by a computer or a processor, the program instructions are used to execute corresponding steps of the fitness test method according to the embodiment of the present application, and are used to implement corresponding modules in the fitness test device according to the embodiment of the present application. The storage medium may include, for example, a memory card of a smart phone, a storage component of a tablet computer, a hard disk of a personal computer, a Read Only Memory (ROM), an Erasable Programmable Read Only Memory (EPROM), a portable compact disc read only memory (CD-ROM), a USB memory, or any combination of the above storage media.
In one embodiment, the program instructions, when executed by a computer or processor, may cause the computer or processor to implement the respective functional modules of the fitness test apparatus according to the embodiments of the present application, and/or to perform the fitness test method according to the embodiments of the present application.
In one embodiment, the program instructions, when executed, perform the following steps: acquiring a graphic code image captured by a first camera of a graphic code on a base of a sit-and-reach device when a test subject completes a fitness test on the device, wherein a plurality of graphic codes are arranged on the base along the moving path of a movable member, and the movable member of the sit-and-reach device, when pushed by the test subject, carries the first camera with it; and determining a test result of the test subject based on the graphic code image, the test result including at least one of a test score, position information of the movable member when the test is completed, and information on the distance the movable member was pushed.
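As a minimal illustration of the two claimed steps, the sketch below maps a physical position already decoded from the graphic code image to the three components of the test result; the decoding step that produces `reached_position_cm` is assumed to have run beforehand, and all names are illustrative assumptions rather than the patent's implementation:

```python
def fitness_result(reached_position_cm, start_position_cm):
    """Build the claimed test result from the physical position decoded
    out of the graphic code image. In a sit-and-reach test the score is
    conventionally the signed distance reached past the start mark."""
    pushed = reached_position_cm - start_position_cm
    return {
        "score_cm": pushed,                    # test score
        "position_cm": reached_position_cm,    # position of the movable member
        "distance_pushed_cm": pushed,          # distance the member was pushed
    }
```
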
Furthermore, according to an embodiment of the present application, there is also provided a computer program product comprising a computer program that, when run, executes the fitness test method 500 described above.
The modules in the electronic device according to the embodiments of the present application may be implemented by a processor of the electronic device that implements the fitness test running computer program instructions stored in a memory, or by a computer running computer instructions stored in a computer-readable storage medium of a computer program product according to the embodiments of the present application.
Furthermore, according to an embodiment of the present application, there is also provided a computer program that, when run, executes the fitness test method 500 described above.
Although the example embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the above-described example embodiments are merely illustrative and are not intended to limit the scope of the present application thereto. Various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the scope or spirit of the present application. All such changes and modifications are intended to be included within the scope of the present application as claimed in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the device embodiments described above are merely illustrative: the division into units is only a division by logical function, and an actual implementation may use a different division; for example, multiple units or components may be combined or integrated into another device, and some features may be omitted or not executed.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the description of exemplary embodiments of the present application, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where such features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Moreover, those skilled in the art will appreciate that while some embodiments described herein include some features that are included in other embodiments but not others, combinations of features of different embodiments are meant to be within the scope of the application and to form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the present application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some of the modules in the fitness test device according to embodiments of the present application. The present application may also be embodied as apparatus programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present application may be stored on a computer readable medium or may be in the form of one or more signals. Such a signal may be downloaded from an internet website, or provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, and so on does not indicate any ordering; these words may be interpreted as names.
The above description covers only specific embodiments of the present application; the protection scope of the present application is not limited thereto. Any person skilled in the art can readily conceive of changes or substitutions within the technical scope disclosed in the present application, and all such changes or substitutions fall within the protection scope of the present application. The protection scope of the present application shall be subject to the protection scope of the claims.

Claims (18)

1. A fitness test method, comprising:
acquiring a graphic code image captured by a first camera of a graphic code on a base of a sit-and-reach device when a test subject completes a fitness test on the sit-and-reach device, wherein a plurality of graphic codes are arranged on the base along a moving path of a movable member, and the movable member of the sit-and-reach device, when pushed by the test subject, carries the first camera with it; and
determining a test result of the test subject based on the graphic code image, wherein the test result comprises at least one of a test score, position information of the movable member when the fitness test is completed, and information on the distance the movable member was pushed.
2. The method of claim 1, wherein determining the test result of the test subject based on the graphic code image comprises:
determining the position reached and/or the distance moved by the movable member based on position information of a positioning line in the graphic code image; and
determining the test result based on the position reached and/or the distance moved by the movable member;
wherein the positioning line is a straight line at a specified position in the frame of the first camera, extending perpendicular to the moving direction of the movable member; alternatively, the positioning line is determined by projecting, onto the frame of the first camera, the side surface of the movable member that faces the test subject.
3. The method of claim 2, wherein determining the position reached and/or the distance moved by the movable member based on the position information of the positioning line in the graphic code image comprises:
in the case where the positioning line falls on a graphic code, determining, from the graphic code image, a target graphic code located before or after the current graphic code, the current graphic code being the graphic code on which the positioning line falls; and
determining the position reached and/or the distance moved by the movable member based at least on the current graphic code and the target graphic code.
4. The method of claim 3, wherein determining the position reached and/or the distance moved by the movable member based at least on the current graphic code and the target graphic code comprises:
counting the number of pixels between the positioning line and a target edge of the current graphic code, the target edge being any edge parallel to the positioning line;
calculating the distance from the positioning line to the target edge based on the number of pixels between them;
calculating the physical position corresponding to the positioning line based at least on the distance from the positioning line to the target edge, the interval between the current graphic code and the target graphic code, and the physical position represented by the target graphic code; and
determining the position reached and/or the distance moved by the movable member based on the physical position corresponding to the positioning line.
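The pixel-to-distance conversion recited in claim 4 can be sketched as follows. The scale factor is derived from a graphic code of known physical size; all names and values are illustrative assumptions, not taken from the patent:

```python
def pixels_to_distance(pixel_count, code_side_px, code_side_mm):
    """Convert a pixel count in the image to a physical distance, using a
    graphic code of known physical side length as the scale reference."""
    mm_per_pixel = code_side_mm / code_side_px
    return pixel_count * mm_per_pixel


def line_to_edge_distance_mm(line_x_px, edge_x_px, code_side_px, code_side_mm):
    """Distance from the positioning line to a target edge of the current
    graphic code (both given as pixel columns in the camera frame)."""
    return pixels_to_distance(abs(line_x_px - edge_x_px),
                              code_side_px, code_side_mm)
```

Because the code itself provides the scale reference, the conversion stays valid even if the camera-to-base distance differs between devices.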
5. The method of claim 4, wherein calculating the physical position corresponding to the positioning line based at least on the distance from the positioning line to the target edge, the interval between the current graphic code and the target graphic code, and the physical position represented by the target graphic code comprises:
in the case where the target graphic code is located before the current graphic code and the target edge is the edge in front of the positioning line, determining a first sum of the physical position represented by the target graphic code, a first distance, the interval between the current graphic code and the target graphic code, and the distance from the positioning line to the target edge, and taking the first sum as the physical position corresponding to the positioning line;
in the case where the target graphic code is located before the current graphic code and the target edge is the edge behind the positioning line, determining a second sum of the physical position represented by the target graphic code, the first distance, the interval between the current graphic code and the target graphic code, and the side length of the current graphic code, determining a first difference between the second sum and the distance from the positioning line to the target edge, and taking the first difference as the physical position corresponding to the positioning line;
in the case where the target graphic code is located after the current graphic code and the target edge is the edge in front of the positioning line, determining a second difference obtained by subtracting, from the physical position represented by the target graphic code, a second distance, the interval between the current graphic code and the target graphic code, and the side length of the current graphic code, determining a third sum of the second difference and the distance from the positioning line to the target edge, and taking the third sum as the physical position corresponding to the positioning line;
in the case where the target graphic code is located after the current graphic code and the target edge is the edge behind the positioning line, determining a third difference obtained by subtracting, from the physical position represented by the target graphic code, the second distance, the interval between the current graphic code and the target graphic code, and the distance from the positioning line to the target edge, and taking the third difference as the physical position corresponding to the positioning line;
wherein the first distance is the distance between the position scale mark corresponding to the physical position represented by any graphic code and the rear edge of that graphic code, and the second distance is the distance between the position scale mark corresponding to the physical position represented by any graphic code and the front edge of that graphic code.
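The four cases of claim 5 reduce to one signed formula each. The sketch below uses assumed argument names for the quantities the claim defines (`first_dist`, `second_dist`, the inter-code `interval`, the code `side_len`); it illustrates the arithmetic only and is not the patent's implementation:

```python
def position_on_code(target_pos, d, interval, side_len,
                     first_dist, second_dist,
                     target_before, edge_before_line):
    """Physical position of the positioning line when it falls on the
    current graphic code (claim 5). `target_pos` is the physical position
    represented by the target graphic code; `d` is the distance from the
    positioning line to the chosen target edge of the current code."""
    if target_before and edge_before_line:
        # case 1: first sum
        return target_pos + first_dist + interval + d
    if target_before and not edge_before_line:
        # case 2: second sum minus d
        return target_pos + first_dist + interval + side_len - d
    if not target_before and edge_before_line:
        # case 3: second difference plus d
        return target_pos - second_dist - interval - side_len + d
    # case 4: third difference
    return target_pos - second_dist - interval - d
```

A useful sanity check is that measuring against the front edge and against the rear edge of the same code must yield the same position, since the two distances differ by exactly the side length.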
6. The method of claim 2, wherein determining the position reached and/or the distance moved by the movable member based on the position information of the positioning line in the graphic code image comprises:
in the case where the positioning line does not fall on any graphic code, determining, from the graphic code image, a target graphic code located before or after the positioning line; and
determining the position reached and/or the distance moved by the movable member based on the target graphic code.
7. The method of claim 6, wherein determining the position reached and/or the distance moved by the movable member based on the target graphic code comprises:
counting the number of pixels between the positioning line and a target edge of the target graphic code, the target edge being any edge parallel to the positioning line;
calculating the distance from the positioning line to the target edge based on the number of pixels between them;
calculating the physical position corresponding to the positioning line based on the distance from the positioning line to the target edge and the physical position represented by the target graphic code; and
determining the position reached and/or the distance moved by the movable member based on the physical position corresponding to the positioning line.
8. The method of claim 7, wherein calculating the physical position corresponding to the positioning line based on the distance from the positioning line to the target edge and the physical position represented by the target graphic code comprises:
in the case where the target graphic code is located before the positioning line and the target edge is the edge farther from the positioning line, determining a fourth sum of the physical position represented by the target graphic code and the distance from the positioning line to the target edge, determining a fourth difference between the fourth sum and a second distance, and taking the fourth difference as the physical position corresponding to the positioning line;
in the case where the target graphic code is located before the positioning line and the target edge is the edge closer to the positioning line, determining a fifth sum of the physical position represented by the target graphic code, a first distance, and the distance from the positioning line to the target edge, and taking the fifth sum as the physical position corresponding to the positioning line;
in the case where the target graphic code is located after the positioning line and the target edge is the edge farther from the positioning line, determining a sixth sum of the physical position represented by the target graphic code and the first distance, determining a fifth difference between the sixth sum and the distance from the positioning line to the target edge, and taking the fifth difference as the physical position corresponding to the positioning line;
in the case where the target graphic code is located after the positioning line and the target edge is the edge closer to the positioning line, determining a sixth difference obtained by subtracting, from the physical position represented by the target graphic code, the second distance and the distance from the positioning line to the target edge, and taking the sixth difference as the physical position corresponding to the positioning line;
wherein the first distance is the distance between the position scale mark corresponding to the physical position represented by any graphic code and the rear edge of that graphic code, and the second distance is the distance between the position scale mark corresponding to the physical position represented by any graphic code and the front edge of that graphic code.
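When the positioning line lies in the gap between codes, claim 8's four cases can be sketched the same way. The sketch assumes the scale mark lies inside the code, so the side length equals `first_dist + second_dist`; the argument names are assumptions:

```python
def position_off_code(target_pos, d, first_dist, second_dist,
                      target_before, edge_farther):
    """Physical position of the positioning line when it falls between
    graphic codes (claim 8). `d` is the distance from the positioning
    line to the chosen target edge of the target graphic code."""
    if target_before and edge_farther:
        # fourth sum minus the second distance
        return target_pos + d - second_dist
    if target_before and not edge_farther:
        # fifth sum
        return target_pos + first_dist + d
    if not target_before and edge_farther:
        # sixth sum minus d
        return target_pos + first_dist - d
    # sixth difference
    return target_pos - second_dist - d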
9. The method of any one of claims 1 to 8, wherein the first camera captures at least three graphic codes in each frame.
10. The method of any one of claims 1 to 8, wherein the method further comprises:
determining, in real time, whether the graphic code image captured by the first camera has remained unchanged for a first target period; and
if the graphic code image captured by the first camera has not changed within the first target period, determining that the fitness test of the test subject is complete.
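The completion check of claim 10 can be sketched as a small stateful detector. This sketch compares exact frame bytes; a real implementation would compare frames with some tolerance to survive sensor noise. All names and the use of a hash-free byte comparison are illustrative assumptions:

```python
import time


class CompletionDetector:
    """Declare the fitness test complete once the captured graphic code
    image has not changed for `hold_seconds` (claim 10)."""

    def __init__(self, hold_seconds):
        self.hold_seconds = hold_seconds
        self._last = None    # last distinct frame seen
        self._since = None   # time that frame first appeared

    def update(self, frame_bytes, now=None):
        """Feed the latest frame; return True once the test is complete."""
        now = time.monotonic() if now is None else now
        if frame_bytes != self._last:
            # frame changed: restart the unchanged-for timer
            self._last, self._since = frame_bytes, now
            return False
        return (now - self._since) >= self.hold_seconds
```
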
11. The method of any one of claims 1 to 10, wherein, before determining the test result of the test subject based on the graphic code image, the method further comprises:
performing distortion correction on the graphic code image.
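The distortion correction of claim 11 typically inverts a lens-distortion model. As a sketch of the idea only, the function below iteratively inverts a one-parameter radial model x_d = x_u(1 + k1·r_u²) for a single normalized image point; a real system would use a full camera model (for example, OpenCV's calibration and undistortion routines). All names are assumptions:

```python
def undistort_point(xd, yd, k1, iterations=10):
    """Iteratively recover the undistorted normalized point (xu, yu)
    from a radially distorted point (xd, yd) via fixed-point iteration
    on x_u = x_d / (1 + k1 * r_u^2)."""
    xu, yu = xd, yd  # initial guess: the distorted point itself
    for _ in range(iterations):
        r2 = xu * xu + yu * yu
        factor = 1.0 + k1 * r2
        xu, yu = xd / factor, yd / factor
    return xu, yu
```

For small distortion coefficients the iteration converges quickly; with `k1 = 0` it is the identity.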
12. The method of any one of claims 1 to 11, wherein the method further comprises:
receiving an image of the test subject captured by a second camera, wherein the second camera is arranged at a first target distance and a first target height around a target test area of the sit-and-reach device and is used to capture images of the test subject within the target test area;
performing human posture recognition on the image captured by the second camera to determine the positions of a plurality of key points; and
determining, based on the positions of the plurality of key points, whether the test subject has performed an irregular action.
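The irregular-action check of claim 12 can be sketched as a geometric rule on the recognized key points. In a sit-and-reach test a bent knee is a typical irregular action; the hip-knee-ankle key points and the 160-degree straightness threshold below are illustrative assumptions, not values from the patent:

```python
import math


def angle(a, b, c):
    """Angle at b (degrees) formed by points a-b-c, each an (x, y) tuple."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    # clamp to acos's domain to guard against rounding error
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))


def knees_bent(hip, knee, ankle, min_straight_deg=160.0):
    """Flag a bent knee when the hip-knee-ankle angle drops below a
    straightness threshold (an assumed irregular-action rule)."""
    return angle(hip, knee, ankle) < min_straight_deg
```
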
13. The method of any one of claims 1 to 12, wherein the method further comprises:
acquiring identity information of the test subject; and
after the fitness test of the test subject is completed, associating test execution information of the test subject with the identity information of the test subject and storing it in a storage device, wherein the test execution information comprises one or more of the following: information on irregular actions; the test result; videos or images of the test subject captured during the test; videos or images of the graphic codes captured during the test; and images snapshotted of the test subject when an irregular action occurred.
14. The method of any one of claims 1 to 13, wherein the method further comprises:
receiving identity information of the test subject output by a face recognition apparatus, wherein the face recognition apparatus is arranged at a second target distance and a second target height around a target test area of the sit-and-reach device and is used to capture a face image of the test subject and recognize the identity of the test subject based on the face image;
outputting guidance information to direct the test subject to the target test area; and
determining, based on an image captured by a second camera, whether the test subject has made a preparation action, and outputting a prompt to begin the test when the preparation action is made, wherein the second camera is arranged at a first target distance and a first target height around the target test area and is used to capture images of the test subject within the target test area.
15. The method of any one of claims 1 to 14, wherein the method further comprises:
determining, in real time, whether the graphic code image captured by the first camera has remained unchanged for a second target period; and
if the graphic code image captured by the first camera has not changed within the second target period, sending a reset instruction to a motor in a reset device to control the motor to run,
wherein the reset device comprises a telescopic member and the motor, a rotating shaft of the motor is connected to a first end of the telescopic member, a second end of the telescopic member is connected to the movable member, and the motor, when actively rotating, drives the telescopic member to contract.
16. An electronic device comprising a processor and a memory, wherein the memory stores computer program instructions that, when executed by the processor, perform the fitness test method of any one of claims 1 to 15.
17. A storage medium on which program instructions are stored, the program instructions, when executed, performing the fitness test method of any one of claims 1 to 15.
18. A computer program product comprising a computer program that, when run, performs the fitness test method of any one of claims 1 to 15.
CN202210770922.5A 2022-06-30 2022-06-30 Physical fitness test method, electronic device, storage medium, and computer program product Pending CN115809679A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210770922.5A CN115809679A (en) 2022-06-30 2022-06-30 Physical fitness test method, electronic device, storage medium, and computer program product


Publications (1)

Publication Number Publication Date
CN115809679A true CN115809679A (en) 2023-03-17

Family

ID=85481120

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210770922.5A Pending CN115809679A (en) 2022-06-30 2022-06-30 Physical fitness test method, electronic device, storage medium, and computer program product

Country Status (1)

Country Link
CN (1) CN115809679A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116994339A (en) * 2023-09-27 2023-11-03 成都谷帝科技有限公司 Method and system for sitting body forward-bending test based on image processing
CN116994339B (en) * 2023-09-27 2024-01-23 成都谷帝科技有限公司 Method and system for sitting body forward-bending test based on image processing


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination