KR102025595B1 - Method for recognizing user motion and motion recognition apparatus using the same - Google Patents

Method for recognizing user motion and motion recognition apparatus using the same

Info

Publication number
KR102025595B1
Authority
KR
South Korea
Prior art keywords
motion
user
sensor
relative rotation
recognition
Prior art date
Application number
KR1020140143931A
Other languages
Korean (ko)
Other versions
KR20160047710A (en)
Inventor
김지훈
Original Assignee
SK Telecom Co., Ltd. (에스케이텔레콤 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SK Telecom Co., Ltd.
Priority to KR1020140143931A
Publication of KR20160047710A
Application granted
Publication of KR102025595B1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04B: TRANSMISSION
    • H04B 1/00: Details of transmission systems, not covered by a single one of groups H04B 3/00-H04B 13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B 1/38: Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B 1/40: Circuits
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01P: MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P 15/00: Measuring acceleration; Measuring deceleration; Measuring shock, i.e. sudden change of acceleration
    • G01P 15/18: Measuring acceleration; Measuring deceleration; Measuring shock, i.e. sudden change of acceleration in two or more dimensions
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01P: MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P 3/00: Measuring linear or angular speed; Measuring differences of linear or angular speeds
    • G01P 3/42: Devices characterised by the use of electric or magnetic means
    • G01P 3/44: Devices characterised by the use of electric or magnetic means for measuring angular speed
    • G01P 3/46: Devices characterised by the use of electric or magnetic means for measuring angular speed by measuring amplitude of generated current or voltage
    • G01P 3/465: Devices characterised by the use of electric or magnetic means for measuring angular speed by measuring amplitude of generated current or voltage by using dynamo-electric tachometers or electric generators

Abstract

The present invention relates to a method for recognizing a user's motion and a motion recognition apparatus using the same. A relative rotation angle and a relative rotation angular velocity are calculated from sensor values collected as the user's motion, divided into a plurality of detailed motions, changes, and motion recognition is performed by comparing them with the values of a motion management table. As a result, a defined motion can be recognized whenever it occurs, regardless of the user's environment, giving the motion recognition expandability. In addition, the acceleration sensor and the gyro sensor can be used at the same time to improve the accuracy of sensor measurement and motion recognition, and data on various user motions can be defined as metrics that distinguish motions for each user. Furthermore, because the sensor values used for motion recognition are expressed as relative changes rather than absolute values, the range of applied values is small, and accurate motion recognition is possible even with simple calculations.

Description

Method for recognizing user motion and motion recognition apparatus using the same

The present invention relates to a method for recognizing a user motion and a motion recognition apparatus using the same, and more particularly, to a user motion recognition method that calculates a relative rotation angle and a relative rotation angular velocity based on sensor values collected as the user's motion, divided into a plurality of detailed motions, changes, and performs motion recognition by comparing them with the values of a motion management table, and to a motion recognition apparatus using the same.

Recently, with the introduction of open operating systems, smartphones that combine the high-performance features of personal computers (PCs) with mobile phones have become popular, and various attempts have been made to take advantage of their capabilities.

In particular, with the development of micro-fabrication technology, advanced sensors have become smaller and cheaper, so more sensors can be mounted on smartphones, and many intelligent applications that use them, such as augmented reality and 3D games, are being developed.

In addition, the sensors mounted on smartphones are expected to evolve from devices that sense the surrounding environment into intelligent sensors that take into account changes in the user's body and emotional state, and to play a key role in interacting with humans. As such, intelligent applications that utilize sensors are expected to grow further.

Sensors mounted on a smartphone include a camera (image) sensor, an acoustic sensor, a proximity sensor, an illumination sensor, a gravity sensor, a GPS (Global Positioning System) sensor, an acceleration sensor, a gyro sensor, a geomagnetic sensor, and the like.

Among them, the camera (image) sensor detects light and converts its intensity into digital image data, and can be used for face recognition and the like. The acoustic sensor converts sound into an electrical signal and can be used for voice-recognition-based services. The proximity sensor determines the presence or absence of a nearby object not by mechanical contact but in a non-contact manner, and is usually used to turn off the screen automatically when the smartphone is held close to the face or placed in a pocket during a call.

In addition, the illuminance sensor detects the ambient brightness and is usually used to raise the screen brightness in bright places and lower it in dark places, reducing the power consumption of the mobile terminal and the user's eye fatigue. The gravity sensor detects the direction of gravity acting on an object and is used to determine the display orientation (landscape or portrait) of the smartphone and automatically rotate the screen.

In addition, the GPS (Global Positioning System) sensor collects the time and location of an object through a satellite positioning system and is used in a variety of location-based services. The acceleration sensor detects dynamic force changes such as changes in an object's speed per unit time and impacts; recently, 3-axis accelerometers using MEMS (Micro Electro Mechanical Systems) technology have been widely used, and they can detect object movements such as tilt changes and shaking. The geomagnetic sensor detects the azimuth angle like a compass by sensing the flow of the earth's magnetic field. The gyro sensor detects the inertial force of an object as an electrical signal, mainly the rotation angle, and can directly sense height, rotation, and inclination, which makes precise motion recognition possible.

However, user motion recognition using the proximity and illuminance sensors relies on a simple recognition method, so while its recognition error is not large, it is limited. In addition, when the terminal itself detects movement using the acceleration sensor and the gyro sensor, the recognition rate drops unless a range of values is clearly specified, and the range of motions that can be recognized is narrow, so such recognition lacks scalability and can handle only simple motions.

Therefore, in order to recognize a motion using the sensors mounted in the terminal, specific and clear recognition values and ranges must be specified; when those values and ranges are determined only from absolute rotation angle values, recognition is possible only when the terminal is in a limited set of states. Accordingly, there is a need for a method and application that accurately recognize a specified motion in various environments, regardless of the state in which the terminal is placed, thereby providing scalability and compatibility.

Korean Laid-Open Patent Publication No. 10-2010-0081552, published July 19, 2010 (Title: Motion detection device and method of a portable terminal)

To solve this conventional problem, an object of the present invention is to provide a user motion recognition method, and a motion recognition apparatus using the same, which calculate a relative rotation angle and a relative rotation angular velocity based on sensor values collected as the user's motion, divided into a plurality of detailed motions, changes, check the amount of change against a preset motion management table based on the priority of the three axes for each motion, and, if the error range is satisfied, sequentially recognize the next detailed motion among the detailed motions.

A user motion recognition method according to an embodiment of the present invention for achieving the above object includes: calculating an absolute rotation angle and an absolute rotation angular velocity based on sensor values collected as the user's motion, divided into a plurality of detailed motions, changes; calculating a relative rotation angle and a relative rotation angular velocity by checking the amount of change of the calculated absolute rotation angle and absolute rotation angular velocity; checking the amount of change of the relative rotation angle and relative rotation angular velocity against a preset motion management table based on the priority of the three axes for each motion; and, if the check satisfies the error range, sequentially recognizing the next detailed motion among the detailed motions and, when motion recognition is finally completed, providing a motion recognition result.

In addition, in the user motion recognition method according to the present invention, the calculating step calculates at least one of the direction, magnitude, and rotational speed by checking the sensor values collected according to the change of the user's motion in the motion recognition standby mode.

Further, in the user motion recognition method according to the present invention, the calculating step calculates the relative rotation angular velocity by computing the change in rotational angular velocity, relative to the peak moment of the motion recognition waiting time, among the values detected by the sensor while the user moves.

In addition, in the user motion recognition method according to the present invention, the motion management table includes information on at least one of a relative rotation angle and its tolerance range, a relative rotation angular velocity and its tolerance range, a recognition time, and a priority for each of the three axes of each motion.

In addition, the user motion recognition method according to the present invention further includes, before the calculating step, a motion recognition waiting step in which, to perform initial motion recognition, a predetermined range for the change of the sensor values is specified and the motion recognition standby mode is entered when a specific sensor value satisfies the specified range.

Further, the user motion recognition method according to the present invention further includes, before the motion recognition step, setting a priority for each axis whose rotation or movement is essential to each detailed motion unit and defining it in the motion management table, so that the relative rotation angle and its tolerance range and the relative rotation angular velocity and its tolerance range in the motion management table can be compared.

Further, in the user motion recognition method according to the present invention, the motion recognition step checks the priority of the three axes set for each detailed motion in the motion management table, and when the amount of change of the relative rotation angle and relative rotation angular velocity of the axes corresponding to the priority is satisfied, recognizes the following detailed motion to determine the user's motion.

A motion recognition apparatus according to an exemplary embodiment of the present invention includes: a sensor value collection module that periodically collects sensor values of one or more sensors that change according to the user's motion, divided into a plurality of detailed motions; a calculation module that calculates an absolute rotation angle and an absolute rotation angular velocity based on the sensor values collected through the sensor value collection module, and calculates a relative rotation angle and a relative rotation angular velocity by checking their amount of change; and a motion recognition module that checks the amount of change of the relative rotation angle and relative rotation angular velocity against a preset motion management table based on the priority of the three axes for each motion, sequentially recognizes the next detailed motion if the error range is satisfied, and provides a motion recognition result when motion recognition is finally completed.

According to the present invention, a defined motion can be recognized whenever it occurs, regardless of the user's environment, giving the motion recognition expandability.

In addition, the acceleration sensor and the gyro sensor can be utilized at the same time to improve the accuracy of sensor measurement and motion recognition.

In addition, data on various user motions can be defined as metrics that distinguish motions for each user.

In addition, since the sensor values for motion recognition are calculated as relative change values rather than absolute values, the range of applied values is small, and accurate motion recognition is possible even with simple calculations.

In addition, by applying a per-axis priority to the essential confirmation elements required for motion recognition, flexibility for complex motions can be achieved.

In addition, since only a relative change value needs to be transmitted when interworking with a wearable device or another motion recognition device, the range of values to be expressed is small, unnecessary data traffic is avoided, and only a small memory space is used, resulting in high compatibility.

FIG. 1 is a diagram illustrating a reference coordinate system and motion information for recognizing a user motion according to an exemplary embodiment of the present invention.
FIG. 2 is a block diagram showing the configuration of a motion recognition apparatus according to the present invention.
FIG. 3 is a flowchart illustrating a user motion recognition method according to an exemplary embodiment of the present invention.
FIG. 4 is a diagram illustrating an example of a motion management table set according to the user motion recognition method of the present invention.

Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings. However, in the following description and the accompanying drawings, detailed descriptions of well-known functions or configurations that may obscure the subject matter of the present invention will be omitted. In addition, it should be noted that like elements are denoted by the same reference numerals as much as possible throughout the drawings.

The terms and words used in the specification and claims below should not be construed as limited to their ordinary or dictionary meanings; rather, based on the principle that the inventors may appropriately define terms to describe their own invention in the best way, they should be interpreted as meanings and concepts in accordance with the technical spirit of the present invention. Therefore, the embodiments described in this specification and the configurations shown in the drawings are merely the most preferred embodiments of the present invention and do not represent all of its technical ideas, and it should be understood that various equivalents and modifications could have existed at the time of filing.

In addition, terms including ordinal numbers, such as first and second, are used to describe various components, but only to distinguish one component from another, not to limit them. For example, without departing from the scope of the present invention, a second component may be referred to as a first component, and similarly, a first component may also be referred to as a second component.

In addition, when a component is referred to as being "connected" or "coupled" to another component, it may be connected or coupled logically or physically. In other words, a component may be directly connected or coupled to another component, but it should be understood that other components may exist in between, so the connection or coupling may also be indirect.

In addition, the terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. The terms "comprises" or "having" indicate the presence of the features, numbers, steps, operations, components, parts, or combinations thereof described in the specification, and do not exclude in advance the possibility of the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.

In addition, although the user motion recognition method and motion recognition apparatus according to the present invention may be applied to various fields, such as user motion recognition and robot control, they will be described below using a user terminal (motion recognition apparatus) as an example. In particular, the user terminal according to an embodiment of the present invention will be described with a mobile communication terminal as a representative example, but the user terminal is not limited to mobile communication terminals and can be applied to various terminals such as information communication devices, multimedia terminals, wired terminals, fixed terminals, and IP (Internet Protocol) terminals. The terminal may also be a mobile terminal with various mobile communication specifications, such as a mobile phone, a portable multimedia player (PMP), a mobile internet device (MID), a smartphone, a desktop, a tablet computer, a notebook, a netbook, or another information communication device.

Motion recognition in the user terminal is used to control a specific function of the user terminal by detecting a predetermined specific movement (a flip, a shake in a specific direction, drawing a specific pattern, a user approach). The movement of the user terminal may be represented by azimuth, pitch, and roll in the three-dimensional rectangular coordinate system shown in FIG. 1.

That is, as illustrated in FIG. 1, when the horizontal direction of the user terminal (motion recognition apparatus) is referred to as the X axis, the vertical direction as the Y axis, and the width direction as the Z axis, the azimuth is the direction about the Z axis (e.g., east, west, south, north), expressed as 0° to 360° or -180° to 180°; the pitch is rotation about the horizontal axis, expressed as -90° to 90° depending on how the user terminal is standing; and the roll is rotation about the vertical axis, expressed as -180° to 180° depending on the orientation of the user terminal.
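The angle ranges above can be sketched as small normalization helpers. This is an illustration only; the function names are assumptions, and the patent does not prescribe any particular wrapping scheme.

```python
def wrap_azimuth(deg):
    """Wrap an azimuth angle into the [0, 360) range described above."""
    return deg % 360.0

def wrap_roll(deg):
    """Wrap a roll angle into the [-180, 180) range."""
    return (deg + 180.0) % 360.0 - 180.0

def clamp_pitch(deg):
    """Clamp a pitch angle to the representable [-90, 90] range."""
    return max(-90.0, min(90.0, deg))

assert wrap_azimuth(-90.0) == 270.0
assert wrap_roll(270.0) == -90.0
assert clamp_pitch(120.0) == 90.0
```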

Of course, in the recognition of the user's motion, the reference coordinate system and the expression method of the movement may be different, and the above definition is merely an example.

The present invention recognizes the motion of the user terminal by dividing the motion into detailed motions. To perform this function, the motion recognition apparatus 10 according to the present invention may be configured as shown in FIG. 2.

FIG. 2 is a block diagram illustrating the motion recognition apparatus 10 according to the present invention. Referring to the figure, the motion recognition apparatus 10 according to the present invention may include a sensor value collection module 100, a calculation module 200, a motion recognition module 300, a storage module 400, and a sensor module 500.

The sensor value collection module 100 collects sensor values detected by the plurality of sensor modules 500. In this case, the sensor value collection module 100 may collect sensor values from one or more of the sensor modules 500 at regular intervals, and the collected sensor values may have different units depending on the sensor. For example, the sensor value of the illuminance sensor indicates the amount of illuminance (lux); the sensor value of the proximity sensor indicates the distance and/or proximity to a nearby object; the sensor value of the 3-axis acceleration sensor indicates the acceleration in the directions of the three axes (X, Y, Z); the sensor value of the 3-axis gyro sensor indicates the angular velocity in the directions of the three axes (X, Y, Z); and the sensor value of the geomagnetic sensor indicates the direction.
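A periodic sample as the collection module might hand it to the calculation module can be sketched as follows. The class and field names are assumptions for illustration, not part of the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SensorSample:
    """One periodically collected reading; units differ per sensor as noted above."""
    t_ms: int                              # collection timestamp (ms)
    accel: Tuple[float, float, float]      # 3-axis acceleration (m/s^2), X/Y/Z
    gyro: Tuple[float, float, float]       # 3-axis angular velocity (deg/s), X/Y/Z
    lux: Optional[float] = None            # illuminance (lux), if collected
    proximity_cm: Optional[float] = None   # distance to a nearby object (cm)

# A device lying flat at rest would read roughly this:
sample = SensorSample(t_ms=0, accel=(0.0, 0.0, 9.8), gyro=(0.0, 0.0, 0.0))
```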

The calculation module 200 converts the one or more sensor values collected by the sensor value collection module 100 into motion recognition information that the motion recognition module 300 can use. For example, the calculation module 200 may calculate the rotation angle and/or angular velocity of the motion information described in FIG. 1, that is, the azimuth, the pitch, and the roll, from the collected sensor values through rotation vector and rotation matrix calculations. In addition, the calculation module 200 may calculate an illuminance measurement value or an illuminance change amount from the sensor value of the illuminance sensor, and the proximity count and proximity distance from the sensing value of the proximity sensor.

In particular, the calculation module 200 according to the present invention calculates the absolute rotation angle and the absolute rotation angular velocity based on the sensor values periodically collected from one or more sensors that change according to the user's motion, divided into a plurality of detailed motions, and calculates the relative rotation angle and the relative rotation angular velocity by checking the amount of change of the calculated absolute rotation angle and absolute rotation angular velocity. Here, the calculation module 200 checks the sensor values collected according to the change of the user's motion in the motion recognition standby mode, calculates at least one of direction, magnitude, and rotational speed, and calculates the relative rotation angular velocity by computing the change in rotational angular velocity, relative to the peak moment of the motion recognition waiting time, among the values detected by the sensor while the user moves. For example, the calculation module 200 may selectively calculate only the information needed among the direction, magnitude, and rotational speed of the collected sensor values, but it may also apply all of this information for a better determination of the amount of change.
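The relative-change computation described above can be sketched as follows: the relative rotation angle is the per-axis change from the stationary baseline captured when the standby mode began, and the relative rotation angular velocity here is taken as the peak per-axis rate over the motion window. The function names and the peak-rate interpretation are assumptions for illustration.

```python
def relative_rotation(baseline_deg, current_deg):
    """Per-axis relative rotation angle: delta from the stationary baseline."""
    return tuple(c - b for b, c in zip(baseline_deg, current_deg))

def relative_angular_velocity(angles_deg, times_s):
    """Peak per-axis angular velocity (deg/s) over a sequence of samples."""
    peaks = [0.0, 0.0, 0.0]
    for (a0, t0), (a1, t1) in zip(zip(angles_deg, times_s),
                                  zip(angles_deg[1:], times_s[1:])):
        dt = t1 - t0
        for axis in range(3):
            rate = abs(a1[axis] - a0[axis]) / dt
            peaks[axis] = max(peaks[axis], rate)
    return tuple(peaks)

# A device rotated 30 degrees about X from its baseline:
assert relative_rotation((0.0, 0.0, 0.0), (30.0, -10.0, 5.0)) == (30.0, -10.0, 5.0)
```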

The motion recognition module 300 compares the relative rotation angle and relative rotation angular velocity calculated by the calculation module 200 with the error ranges of the values of the preset motion management table. When the comparison result satisfies the error range, the motion recognition module 300 sequentially recognizes the next detailed motion among the detailed motions. That is, the motion recognition module 300 checks the amount of change of the relative rotation angle and relative rotation angular velocity against the preset motion management table based on the priority of the three axes for each motion, and if the error range is satisfied, sequentially recognizes the next detailed motion among the detailed motions. Here, the motion recognition module 300 checks the priority of the three axes set for each detailed motion in the motion management table, and if the amount of change of the relative rotation angle and relative rotation angular velocity of the axes corresponding to the priority is satisfied, recognizes the following detailed motion to determine the user's motion.
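The sequential matching step can be sketched as below: each detailed motion lists, per prioritized axis, an expected relative rotation angle and angular velocity with tolerances, and recognition advances to the next detailed motion only when the prioritized axes fall within their error ranges. All names and the table layout here are assumptions; the patent does not give a concrete schema.

```python
def within(value, expected, tol):
    """True if a measured value falls inside the error range of the table entry."""
    return abs(value - expected) <= tol

def match_detail(detail, rel_angle, rel_velocity):
    """Check one detailed motion's entry, visiting axes in priority order."""
    for axis in detail["priority"]:
        exp = detail["axes"][axis]
        if not within(rel_angle[axis], exp["angle"], exp["angle_tol"]):
            return False
        if not within(rel_velocity[axis], exp["velocity"], exp["velocity_tol"]):
            return False
    return True

def recognize(details, observations):
    """Advance through the detailed motions; all must match for recognition."""
    step = 0
    for rel_angle, rel_velocity in observations:
        if step < len(details) and match_detail(details[step], rel_angle, rel_velocity):
            step += 1
    return step == len(details)

# Hypothetical two-stage motion: rotate ~45 degrees about X, then to ~90 degrees.
details = [
    {"priority": [0], "axes": {0: {"angle": 45.0, "angle_tol": 15.0,
                                   "velocity": 90.0, "velocity_tol": 60.0}}},
    {"priority": [0], "axes": {0: {"angle": 90.0, "angle_tol": 15.0,
                                   "velocity": 90.0, "velocity_tol": 60.0}}},
]
observations = [((40.0, 0.0, 0.0), (100.0, 0.0, 0.0)),
                ((85.0, 0.0, 0.0), (80.0, 0.0, 0.0))]
assert recognize(details, observations)
```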

Finally, when motion recognition is completed, the motion recognition module 300 provides the result of motion recognition to the user. Before performing the first motion recognition, the motion recognition module 300 designates a predetermined range for the change of the sensor values in order to detect a user motion, and enters the motion recognition standby mode when a specific sensor value satisfies the specified range.

Meanwhile, in order to compare the relative rotation angle and its tolerance range and the relative rotation angular velocity and its tolerance range in the motion management table, the motion recognition module 300 sets a priority for each axis whose rotation or movement is essential to each detailed motion unit, and defines it in the motion management table.

The storage module 400 is a device for storing data; it includes a main memory and an auxiliary memory and stores the application programs required for the functional operation of the motion recognition apparatus 10. The storage module 400 includes a motion management table 401. The motion management table 401 includes, for the three axes of each motion, information about the relative angle magnitude and its tolerance range, the rotational angular velocity magnitude and its tolerance range, the recognition time, and the priority of each axis. Here, the motion management table 401 is defined by setting priorities for the axes whose rotation or movement is essential to each detailed motion unit, so that the angle magnitude and tolerance range and the rotational angular velocity magnitude and tolerance range can be compared.
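One possible entry of the motion management table, mirroring the fields listed above (per-axis relative angle and tolerance, angular velocity and tolerance, recognition time, and axis priority), might look like this. The schema, motion name, and all numeric values are assumptions for illustration; FIG. 4 of the patent shows the actual example.

```python
# Hypothetical motion-management-table entry; one list element per detailed motion.
MOTION_TABLE = {
    "flip_over": [
        {
            "angle_deg":    {"x": 180.0, "y": 0.0,  "z": 0.0},   # relative angle
            "angle_tol":    {"x": 20.0,  "y": 30.0, "z": 30.0},  # tolerance range
            "velocity_dps": {"x": 360.0, "y": 0.0,  "z": 0.0},   # relative angular velocity
            "velocity_tol": {"x": 240.0, "y": 90.0, "z": 90.0},  # tolerance range
            "recognition_time_ms": 800,                          # time limit for this step
            "axis_priority": ["x", "y", "z"],                    # essential axes first
        },
    ],
}

entry = MOTION_TABLE["flip_over"][0]
assert entry["axis_priority"][0] == "x"
```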

The sensor module 500 collects sensor values for detecting changes in position and motion through an acceleration sensor and a gyro sensor, and provides raw data for the X, Y, and Z axes as well as information about azimuth, roll, and pitch.

In particular, the acceleration sensor can measure the gravitational acceleration acting on the device and thereby determine how the motion recognition apparatus 10 is placed. For example, if the motion recognition apparatus 10 is lying flat on the ground, the Z axis is affected by gravitational acceleration and outputs a value of about 1 g (≈ 9.8 m/s²). Using this characteristic, the absolute angles of the pitch (rotation angle about the X axis) and the roll (rotation angle about the Y axis) of the motion recognition apparatus 10 may be determined.
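Deriving the absolute pitch and roll from the measured gravity vector can be sketched with `atan2`, as below. Sign conventions for pitch and roll vary between devices; this follows one common convention and is an illustration, not the patent's formula.

```python
import math

def pitch_roll_from_accel(ax, ay, az):
    """Absolute pitch/roll (degrees) from a 3-axis gravity reading in m/s^2."""
    pitch = math.degrees(math.atan2(-ay, math.sqrt(ax * ax + az * az)))
    roll = math.degrees(math.atan2(ax, az))
    return pitch, roll

# Lying flat: gravity is entirely on the Z axis, so pitch and roll are ~0.
pitch, roll = pitch_roll_from_accel(0.0, 0.0, 9.8)
assert abs(pitch) < 1e-9 and abs(roll) < 1e-9
```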

Meanwhile, the gyro sensor measures the rotational angular velocity of the motion recognition apparatus 10 about the same three axes (X, Y, Z) as the acceleration sensor. Through the gyro sensor, it is possible to determine in which direction and at what speed the motion recognition apparatus 10 rotates. The gyro sensor outputs a value that converges to 0 when the motion recognition apparatus 10 does not move, and expresses the degree of rotation about each axis as a value when it rotates. Through this, it may be determined whether the motion recognition apparatus 10 has actually rotated.
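The "actually rotated" test implied above can be sketched as a simple magnitude threshold: since the gyro output converges to roughly zero at rest, readings above a small noise floor indicate real rotation. The threshold value here is an assumed figure, not one given in the patent.

```python
def is_rotating(gyro_dps, noise_floor_dps=1.0):
    """True if the gyro reading's magnitude exceeds the assumed noise floor (deg/s)."""
    wx, wy, wz = gyro_dps
    magnitude = (wx * wx + wy * wy + wz * wz) ** 0.5
    return magnitude > noise_floor_dps

assert not is_rotating((0.02, -0.01, 0.03))   # resting sensor noise
assert is_rotating((0.0, 45.0, 0.0))          # clear rotation about the Y axis
```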

Meanwhile, the motion recognition apparatus 10 according to an embodiment of the present invention may include an input unit, a display unit, and a communication unit as necessary. In particular, the input unit (not shown) receives various information such as numeric and character information, sets various functions, and forwards signals input in relation to the function control of the motion recognition apparatus 10 to the calculation module 200 or the motion recognition module 300. The display unit (not shown) displays information about the operation states and results generated while the motion recognition apparatus 10 performs its functions, and may also display the menu of the motion recognition apparatus 10 and user data input by the user. The communication unit (not shown) transmits and receives data to and from other devices and servers through various communication networks.

In particular, the motion recognition apparatus 10 according to the present invention measures acceleration and gyro sensor values for motion recognition and, rather than determining the rotation angle of the motion recognition apparatus 10 only as an absolute value, measures the relative angle change magnitude and rotational speed according to the position change of the motion recognition apparatus 10 in real time. By accurately recognizing a defined motion regardless of the state of the terminal, it can provide scalability and compatibility for the motion recognition it offers.

In addition, the motion recognition apparatus 10 divides the entire motion into detailed units of the same continuous motion and manages them, in order to support compound and complex motion recognition instead of only simple motions. The motion recognition apparatus 10 also defines a start time point for motion recognition and calculates the rotation angle of the stationary state of the motion recognition apparatus 10 measured at that start time. Afterwards, when the user performs a motion, the motion recognition apparatus 10 measures the sensor value of each axis of the gyro sensor and the acceleration sensor, and then calculates the relative change (delta) of the rotation angle from the stationary state.

When the relative change value within a predetermined time satisfies the condition of a given detailed unit, the motion recognition apparatus 10 switches to the next detailed recognition stage and continues to measure the relative change through each sensor value to determine whether the motion is still in progress. The motion recognition apparatus 10 measures the relative change at each stage, considers the motion recognized only when all conditions are finally satisfied, and informs the user that motion recognition is completed.

In addition, the memory mounted in the motion recognition apparatus 10 according to an embodiment of the present invention stores information in the device. In one embodiment, the memory is a computer-readable medium. In one implementation the memory may be a volatile memory unit, and in other implementations it may be a nonvolatile memory unit. In one embodiment, the storage device is a computer-readable medium. In various implementations, the storage device may include, for example, a hard disk device, an optical disk device, or some other mass storage device.

Although the specification and drawings describe exemplary device configurations, the functional operations and subject matter implementations described herein may be embodied in other types of digital electronic circuitry, or in computer software, firmware, or hardware including the structures disclosed herein and their structural equivalents, or in a combination of one or more of these. Implementations of the subject matter described herein may be realized as one or more computer program products, that is, one or more modules of computer program instructions encoded on a tangible program storage medium for execution by, or to control the operation of, an apparatus according to the invention. The computer-readable medium may be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of these.

A process of recognizing a user motion according to the above-described embodiment of the present invention will be described in more detail with reference to FIGS. 3 and 4.

FIG. 3 is a flowchart illustrating a user motion recognition method according to an exemplary embodiment of the present invention, and FIG. 4 is a diagram illustrating an example of a motion management table set according to the user motion recognition method of the present invention.

Referring to FIGS. 3 and 4, the motion recognition apparatus 10 for recognizing a user's motion according to the present invention performs an idle mode in step S11. Here, the idle mode means that the motion recognition apparatus 10 is in an idle state without being used for communication. Before performing the first motion recognition, the motion recognition apparatus 10 designates a predetermined range for the change of the sensor values in order to detect the user's motion, and performs the recognition standby mode when a specific sensor value satisfies the designated range. That is, the motion recognition apparatus 10 calculates the absolute pitch and roll angles at which the apparatus is currently positioned, regardless of any predefined angle. In this case, the motion recognition apparatus 10 checks the current position for a predetermined time to determine whether the motion recognition apparatus 10 is stable, with almost no movement.
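As a rough illustration of this standby check, the absolute pitch and roll can be derived from the accelerometer's gravity reading, and the device treated as stable when those angles barely change over the observation window. The function names, the axis convention, and the 2° tolerance below are assumptions made for the sketch, not values taken from the patent:

```python
import math

def pitch_roll_from_accel(ax, ay, az):
    """Absolute pitch/roll (degrees) from the gravity direction, computed
    regardless of any predefined reference angle. Axis convention is one
    common choice and is assumed here, not specified by the patent."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def is_stationary(samples, angle_tol_deg=2.0):
    """Enter the recognition standby mode only when pitch and roll stay
    within a small band over the observation window (almost no movement)."""
    angles = [pitch_roll_from_accel(*s) for s in samples]
    pitches = [a[0] for a in angles]
    rolls = [a[1] for a in angles]
    return (max(pitches) - min(pitches) <= angle_tol_deg
            and max(rolls) - min(rolls) <= angle_tol_deg)
```

A device lying flat and untouched (acceleration only along z) would pass this check, while a sample window containing a large tilt would not.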

The motion recognition apparatus 10 checks whether a user motion is detected in step S13. Here, user motion detection may be confirmed through the sensor values of one or more sensors that change according to a user motion divided into a plurality of detailed motions. When the user's motion is detected, the motion recognition apparatus 10 collects the sensor values generated by the user's motion in step S15.

In step S17, the motion recognition apparatus 10 calculates an absolute rotation angle and an absolute rotational angular velocity based on the collected sensor values, and checks the amount of change in the calculated absolute rotation angle and absolute rotational angular velocity to calculate the relative rotation angle and the relative rotational angular velocity. That is, the motion recognition apparatus 10 calculates the relative rotation angle and rotational angular velocity magnitudes based on the sensor values collected according to the change of the user's motion divided into a plurality of detailed motions.

At this time, the motion recognition apparatus 10 checks the sensor values collected according to the change of the user's motion in the motion recognition standby mode and calculates at least one of direction, magnitude, and rotational speed. In particular, to obtain the relative rotational angular velocity, the motion recognition apparatus 10 calculates the change of the rotational angular velocity at the peak instant, among the values detected by the sensor while the user moves, with respect to the motion recognition standby period. For example, the motion recognition apparatus 10 may selectively use only some of the direction, magnitude, and rotational speed of the collected sensor values in its calculation, but may apply all of this information to determine the amount of change more accurately.

That is, after determining in the motion recognition standby mode that there is almost no movement, the motion recognition apparatus 10 checks the direction, magnitude, speed, and so on when movement of the motion recognition apparatus 10 is detected, and calculates the rotation change. Here, the point at which the calculation is completed is the point at which the motion recognition apparatus 10 stops while maintaining the same direction, which coincides with the end of a defined detailed motion unit.

In this case, although the rotation angle of the motion recognition apparatus 10 could be expressed as an absolute value, the apparatus instead calculates, without storing absolute values, how much rotation has occurred from the value measured in the recognition standby mode, as the delta value Ad = |A(t+1) − A(t)|, where t denotes the recognition standby mode and t+1 denotes the first detailed motion unit. In addition, the angular velocity Vd is obtained from the values measured by the gyro sensor during the time the motion recognition apparatus 10 moves, and is calculated as the change of the angular velocity value at the peak instant among the values output by the gyro sensor during that period.
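The delta computation described above can be sketched as follows, using the notation of the text (Ad, Vd, t, t+1). The function itself and its argument layout are illustrative assumptions, and the peak term is simplified to the largest-magnitude gyro reading during the motion:

```python
def relative_rotation(angle_standby, angle_after, gyro_samples):
    """Ad = |A(t+1) - A(t)|: relative rotation from the standby angle A(t)
    to the angle A(t+1) at the end of the first detailed motion unit,
    without keeping absolute angles beyond the standby reference.
    Vd: the angular velocity at the peak instant, taken from the gyro
    readings collected while the device was moving (simplified here to
    the peak-magnitude sample)."""
    ad = abs(angle_after - angle_standby)
    vd = max(gyro_samples, key=abs)  # peak angular velocity during the motion
    return ad, vd
```

For instance, a device that rotates from a 10° standby attitude to 55°, with gyro readings peaking at 8.5, yields Ad = 45.0 and Vd = 8.5.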

In step S19, the motion recognition apparatus 10 compares the calculated relative rotation angle and relative rotational angular velocity against the values of the motion management table within an error range. Here, the motion management table includes, for each motion, information on at least one of the relative rotation angle and its allowable error range, the relative rotational angular velocity and its allowable error range, the recognition time, and the priority of each axis. That is, in order to compare the relative angle magnitude and its tolerance range, and the rotational angular velocity and its tolerance range, against the information in the motion management table, the motion recognition apparatus 10 sets and defines in the motion management table the priority of the axes that must rotate or move in each detailed motion unit. After calculating the absolute rotation angle and absolute rotational angular velocity, the motion recognition apparatus 10 compares the relative rotation angle magnitude (delta) and the relative rotational angular velocity calculated from the change of motion with the values of the motion management table. The motion management table manages the defined relative rotation angle delta and its tolerance margin, the relative rotational angular velocity delta and its tolerance margin, and the min/max recognition time of each detailed unit. These values may be defined for each axis of the motion recognition apparatus 10.

Looking at the motion management table of FIG. 4 in detail, the table defines conditions, such as the magnitude of the relative value change at each step, for recognition of predefined motions. The conditions largely consist of the relative rotation angle change magnitude and the relative rotational angular velocity magnitude for motion recognition, the tolerance ranges for the rotation angle change and the rotational angular velocity magnitude, the min/max time for motion recognition, and the mandatory-condition flag for each axis, arranged as a table for managing motion recognition. In addition, the motion management table provides a means of efficiently managing a plurality of independent motions and the detailed angles and times within each motion.

Meanwhile, for each delta and margin, the priority of each axis must be defined by specifying the priority of the axes of the motion recognition apparatus 10 that need to rotate or move in the detailed motion unit. For example, M1 is the first priority and M2 is the second priority. However, these priorities are set arbitrarily and may vary depending on the motion definition.
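One way to picture the motion management table entries described here — per-axis delta and margin, velocity and margin, min/max recognition time, and axis priority (M1, M2, …) — is a small data structure such as the following sketch. Names and field layout are illustrative assumptions, not the patent's actual table format:

```python
from dataclasses import dataclass

@dataclass
class AxisCondition:
    delta_deg: float        # defined relative rotation angle (delta)
    margin_deg: float       # allowable error range for the delta
    velocity: float         # defined relative rotational angular velocity
    velocity_margin: float  # allowable error range for the velocity
    priority: int           # 1 = checked first (e.g. M1), 2 = next (M2), ...

@dataclass
class DetailedUnit:
    axes: dict              # axis name ('x'/'y'/'z') -> AxisCondition
    min_time_s: float       # min recognition time for this detailed unit
    max_time_s: float       # max recognition time for this detailed unit

def unit_matches(unit, measured, elapsed_s):
    """Check one detailed unit: the min/max time window first, then each
    axis in priority order against its delta and velocity tolerances."""
    if not (unit.min_time_s <= elapsed_s <= unit.max_time_s):
        return False
    for axis, cond in sorted(unit.axes.items(), key=lambda kv: kv[1].priority):
        ad, vd = measured[axis]  # (relative angle, relative angular velocity)
        if abs(ad - cond.delta_deg) > cond.margin_deg:
            return False
        if abs(vd - cond.velocity) > cond.velocity_margin:
            return False
    return True
```

A unit defined as a 90°±10° rotation at velocity 50±20 on the x-axis within 0.1–1.0 s would match a measurement of (85°, 45) after 0.5 s, but not one of (60°, 45), and not one arriving after 2 s.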

That is, with respect to the relative rotation angle and the relative rotational angular velocity, the motion recognition apparatus 10 checks the amount of change based on the priority of the three axes defined for each motion in the preset motion management table and, when the error range is satisfied, sequentially recognizes the next of the detailed motions. Here, the motion recognition apparatus 10 checks the priority of the three axes set for each detailed motion in the motion management table, and when the amounts of change of the relative rotation angle and relative rotational angular velocity of the axes corresponding to the priorities are satisfied, recognizes the next detailed motion to determine the user's motion.

In step S21, the motion recognition apparatus 10 checks whether the comparison result satisfies the error range. If the error range is satisfied, the motion recognition apparatus 10 determines in step S23 whether the current motion recognition step is the final one. In the final motion recognition step, the motion recognition apparatus 10 provides the motion recognition result to the user in step S25. When the conditions of all detailed motion units are satisfied, the motion recognition apparatus 10 notifies the user that motion recognition is finally complete and returns to the motion recognition standby mode to prepare for new motion recognition. On the other hand, if the error range is not satisfied, the apparatus may switch to the idle state. In other words, when the value defined in the motion management table is satisfied within the error range, the motion recognition apparatus 10 determines that the first detailed motion unit condition is satisfied and enters the next motion recognition step.

Meanwhile, when a next motion recognition step exists, the motion recognition apparatus 10 enters the next detailed motion recognition step in step S27 and performs motion recognition sequentially. That is, the motion recognition apparatus 10 repeats the process of calculating the relative rotation angle and the relative rotational angular velocity.
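The sequential step-through of detailed motion units in steps S19–S27 can be condensed into a sketch like the one below, where each detailed unit is reduced to a single (target delta, margin) condition for brevity. The function name and data shapes are assumptions for illustration:

```python
def recognize_motion(unit_conditions, observed):
    """Sequentially match observed relative deltas against the detailed
    units of one defined motion.
    unit_conditions: per detailed unit, a (target_delta, margin) pair.
    observed: measured relative deltas per motion segment, in time order."""
    step = 0
    for delta in observed:
        target, margin = unit_conditions[step]
        if abs(delta - target) > margin:
            return False           # outside the error range -> back to idle
        step += 1
        if step == len(unit_conditions):
            return True            # final detailed unit satisfied: notify user
    return False                   # motion ended before all units matched
```

A two-unit motion defined as 90°±10° followed by 45°±5° is recognized for the observation [88°, 43°], but rejected if the second segment measures 60°, or if the motion stops after the first segment.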

Through this, the present invention has expandability in that it can recognize a defined motion whenever it occurs, regardless of the user or environment. In addition, the acceleration sensor and the gyro sensor can be used at the same time to improve the accuracy of sensor measurement and motion recognition. Data on various user motions can also be defined as a metric to distinguish motions for each user. Furthermore, because the sensor values for motion recognition are processed as relative change values rather than absolute values, the applied values are small, and accurate motion recognition is possible with simple calculations. It is also possible to handle complex motions flexibly by applying a per-axis priority to the essential confirmation elements required for motion recognition. In addition, since only a relative change value needs to be transmitted when interworking with a wearable device or another motion recognition device, the range of values to be expressed is small, so no unnecessary data traffic is generated and only a small memory space is used, thereby achieving high compatibility.

Computer-readable media suitable for storing computer program instructions and data include, for example, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as compact disc read-only memory (CD-ROM) and digital video discs (DVD); magneto-optical media such as floptical disks; and semiconductor memories such as read-only memory (ROM), random access memory (RAM), flash memory, erasable programmable ROM (EPROM), and electrically erasable programmable ROM (EEPROM). The processor and memory can be supplemented by, or integrated with, special-purpose logic circuitry.
Examples of program instructions include not only machine code such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter. Such hardware devices may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.

Although this specification contains numerous specific implementation details, these should not be construed as limitations on the scope of any invention or of the claims, but rather as descriptions of features that may be specific to particular embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments may also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Furthermore, although features may be described as operating in certain combinations and even initially claimed as such, one or more features from a claimed combination may in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or a variation of a subcombination.

Likewise, although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of the various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.

Meanwhile, the embodiments of the present invention disclosed in the specification and drawings merely present specific examples for clarity and are not intended to limit the scope of the present invention. It is apparent to those skilled in the art that modifications other than the embodiments disclosed herein can be carried out based on the technical idea of the present invention.

The present invention calculates the relative rotation angle and the relative rotational angular velocity based on the sensor values collected according to the change of the user's motion divided into a plurality of detailed motions, compares them with the values of a preset motion management table, and, when the error range is satisfied, sequentially recognizes the next of the detailed motions; when motion recognition is finally completed, it provides the motion recognition result. Accordingly, the present invention has expandability in that it can recognize a defined motion whenever it occurs, regardless of the user or environment. In addition, the acceleration sensor and the gyro sensor can be used at the same time to improve the accuracy of sensor measurement and motion recognition. Data on various user motions can also be defined as a metric to distinguish motions for each user. Furthermore, because the sensor values for motion recognition are processed as relative change values rather than absolute values, the applied values are small, and accurate motion recognition is possible with simple calculations. It is also possible to handle complex motions flexibly by applying a per-axis priority to the essential confirmation elements required for motion recognition. In addition, since only a relative change value needs to be transmitted when interworking with a wearable device or another motion recognition device, the range of values to be expressed is small, so no unnecessary data traffic is generated and only a small memory space is used, thereby achieving high compatibility. The invention thus has industrial applicability, as it not only has sufficient potential for marketing or business but is also practically realizable.

10: motion recognition device 100: sensor value collection module
200: operation module 300: motion recognition module
400: storage module 500: sensor module
401: operation management table

Claims (8)

Storing, by a terminal, a motion management table in which the motion of the terminal according to a user's motion is divided into detailed motions and the error ranges of the relative rotation angles and relative rotational angular velocities corresponding to the detailed motions are stored;
Calculating an absolute rotation angle and an absolute rotational angular velocity based on a sensor value obtained when movement occurs, with respect to the position of the terminal in the stationary state;
Calculating a relative rotation angle with respect to the movement of the terminal, and a relative rotational angular velocity, by checking the amount of change of the calculated absolute rotation angle and absolute rotational angular velocity and calculating from that amount of change the change in rotational angular velocity among the values sensed by the sensor during the time the terminal moves according to the user's motion; and
Determining a plurality of detailed motions by checking, with respect to the relative rotation angle and the relative rotational angular velocity, the amount of change based on a priority of three axes predefined for each detailed motion in the motion management table, and determining a specific detailed motion, while sensor signals are collected by the movement of the terminal, when the checked amount of change is within the error range of the relative rotation angle and relative rotational angular velocity of that specific detailed motion defined in the motion management table; and
A motion recognition step of displaying a motion recognition result on the terminal when motion recognition is finally completed;
A user motion recognition method comprising the same.
The method of claim 1, wherein the calculating
Comprises detecting at least one of a direction, a magnitude, and a rotational speed by checking the sensor values collected according to the change in the movement of the terminal caused by the user's motion in a motion recognition standby mode.
The method of claim 1, wherein the relative rotational angular velocity
Is calculated as the change in rotational angular velocity with respect to the peak instant of the motion recognition standby period, in the user motion recognition method using a sensor.
The method of claim 1, wherein the operation management table
Includes, for each motion, information on at least one of a relative rotation angle and its allowable error range, a relative rotational angular velocity and its allowable error range, a recognition time, and a per-axis priority.
The method of claim 1, wherein prior to said calculating:
A motion recognition standby step of designating, first, a predetermined range for the change of the sensor value in order to perform motion recognition corresponding to the movement of the terminal, and performing the motion recognition standby mode when a specific sensor value satisfies the designated range;
The user motion recognition method further comprising the same.
The method of claim 1, wherein before the gesture recognition step,
Setting and defining in the motion management table a priority for each axis that needs to rotate or move in a detailed motion unit, in order to compare the relative rotation angle and its allowable error range, and the relative rotational angular velocity and its allowable error range, among the information in the motion management table;
The user motion recognition method further comprising the same.
The method of claim 1, wherein the gesture recognition step is
Comprises checking the priority of the three axes set for each detailed motion in the motion management table and, when the amounts of change of the relative rotation angle and the relative rotational angular velocity of the axes corresponding to the priorities are satisfied, recognizing the next detailed motion to determine the user's motion.
A storage module configured to classify the motion of the terminal according to a user's motion into detailed motions and to store a motion management table storing error ranges of relative rotational angles and relative rotational angular velocities corresponding to the detailed motions;
A sensor value collection module for periodically collecting sensor values of one or more sensors that change according to the movement of the terminal;
A calculation module configured to calculate an absolute rotation angle and an absolute rotational angular velocity based on a sensor value obtained when movement occurs, with respect to the position of the terminal in the stationary state, and to calculate a relative rotation angle with respect to the movement of the terminal, and a relative rotational angular velocity, by checking the amount of change of the calculated absolute rotation angle and absolute rotational angular velocity and calculating from that amount of change the change in rotational angular velocity among the values detected by the sensor during the time the terminal moves according to the user's motion; and
A motion recognition module configured to check, with respect to the relative rotation angle and the relative rotational angular velocity calculated by the calculation module, the amount of change based on the priority of the three axes predefined for each detailed motion in the motion management table, to determine a specific detailed motion, while sensor signals are collected by the movement of the terminal, when the checked amount of change is within the error range of the relative rotation angle and relative rotational angular velocity of that specific detailed motion defined in the motion management table, thereby determining a plurality of detailed motions, and to display a motion recognition result on the terminal when motion recognition is finally completed;
A motion recognition apparatus comprising the same.
KR1020140143931A 2014-10-23 2014-10-23 Method for recognizing user motion and motion recognition apparatus using the same KR102025595B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020140143931A KR102025595B1 (en) 2014-10-23 2014-10-23 Method for recognizing user motion and motion recognition apparatus using the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020140143931A KR102025595B1 (en) 2014-10-23 2014-10-23 Method for recognizing user motion and motion recognition apparatus using the same

Publications (2)

Publication Number Publication Date
KR20160047710A KR20160047710A (en) 2016-05-03
KR102025595B1 true KR102025595B1 (en) 2019-11-04

Family

ID=56022417

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020140143931A KR102025595B1 (en) 2014-10-23 2014-10-23 Method for recognizing user motion and motion recognition apparatus using the same

Country Status (1)

Country Link
KR (1) KR102025595B1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102095392B1 (en) * 2019-01-18 2020-03-31 국방과학연구소 Communication device control system and thereof method for control
KR102321931B1 (en) * 2021-05-07 2021-11-09 휴텍 주식회사 Game control method according to motion of console game machine

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101511160B1 (en) 2009-01-06 2015-04-13 삼성전자주식회사 Charge pump circuit and voltage converting apparatus using the same
KR101640072B1 (en) * 2010-03-31 2016-07-15 삼성전자주식회사 Apparatus and method for revising value of accelerometer sensor in portalble terminal
KR101859771B1 (en) * 2011-11-16 2018-05-21 삼성전자주식회사 Terminal device for correcting gyro-sensor sensing value and accelation sensor sensing value and method for controlling thereof
KR20130097284A (en) * 2012-02-24 2013-09-03 김철환 Mobile terminal performing action recognition and method thereof

Also Published As

Publication number Publication date
KR20160047710A (en) 2016-05-03

Similar Documents

Publication Publication Date Title
US11500536B2 (en) Neural network system for gesture, wear, activity, or carry detection on a wearable or mobile device
WO2016153612A1 (en) Facilitating dynamic detection and intelligent use of segmentation on flexible display screens
US20170256096A1 (en) Intelligent object sizing and placement in a augmented / virtual reality environment
JP5937076B2 (en) Method and apparatus for gesture-based user input detection in a mobile device
US20160372083A1 (en) Facilitating increased user experience and efficient power performance using intelligent segmentation on flexible display screens
CN107003821B (en) Facilitating improved viewing capabilities for glass displays
KR20130140408A (en) Flexible portable device
KR101678292B1 (en) METHOD, SYSTEM AND COMPUTER-READABLE RECORDING MEDIUM FOR CONTROLLING IoT(Internet of Things) DEVICE USING A WEARABLE DEVICE
US20130226505A1 (en) Dual Accelerometer Plus Magnetometer Body Rotation Rate Sensor-Gyrometer
EP3195092A1 (en) Facilitating dynamic eye torsion-based eye tracking on computing devices
US11670056B2 (en) 6-DoF tracking using visual cues
KR101685388B1 (en) Method and apparatus for recognizing motion using a plurality of sensors
US20140047259A1 (en) Methods and Apparatus for Mobile Device Power Management Using Accelerometer Data
KR102025595B1 (en) Method for recognizing user motion and motion recognition apparatus using the same
CN116348916A (en) Azimuth tracking for rolling shutter camera
KR102058158B1 (en) Method for processing sensor value to motion recognition and apparatus using the same
WO2019127509A1 (en) Entity globe having touch function, display terminal, and map displaying method
KR101886033B1 (en) Method for establishing user definition motion and motion recognition apparatus using the same
KR102081966B1 (en) Apparatus for motion recognition based on context awareness and storage medium therefor
KR102061567B1 (en) Method for providing standby mode for motion recognition and motion recognition apparatus using the same
KR101900754B1 (en) Method for establishing user definition motion and motion recognition apparatus using the same
KR101934930B1 (en) Apparatus and storage medium for mutually complementary motion recognition
KR102058894B1 (en) motion recognition apparatus based on prediction and storage medium therefor
KR101703645B1 (en) Method and apparatus for recognizing motion with adjustable sensitivity
Kokaji et al. User Interface Input by Device Movement

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right