CN106919958B - Human body finger action recognition method based on smart watch

Human body finger action recognition method based on smart watch

Info

Publication number
CN106919958B
CN106919958B
Authority
CN
China
Prior art keywords
finger
action
motion
intelligent watch
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710168313.1A
Other languages
Chinese (zh)
Other versions
CN106919958A (en)
Inventor
王佳昊
钱立权
谢樱姿
唐诗超
卫玄龙
臧雅萱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN201710168313.1A priority Critical patent/CN106919958B/en
Publication of CN106919958A publication Critical patent/CN106919958A/en
Application granted granted Critical
Publication of CN106919958B publication Critical patent/CN106919958B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/24: Classification techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214: Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014: Hand-worn input/output arrangements, e.g. data gloves
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses a method for recognizing fine finger actions based on a smart watch. The method addresses the misoperation and poor user experience caused by the small screen and limited buttons of a smart watch, which make the device inconvenient to operate. The method specifically comprises: 1. segmenting the raw finger-action acceleration data and accurately extracting finger-action segments; 2. extracting only the feature values of the finger-action acceleration signal to accurately recognize the finger action, thereby reducing system overhead; 3. controlling and operating the smart watch according to the recognition result. Fine human finger motion data are acquired through the acceleration sensor built into the smart watch; by making full use of the finger-motion characteristic data and the watch's data-processing capability, the smart watch can be conveniently controlled and operated by recognizing fine finger actions, which greatly simplifies human-computer interaction and enhances the practical value of the smart watch.

Description

Human body finger action recognition method based on smart watch
Technical Field
The invention belongs to the technical field of communications and electronics, and particularly relates to a fine human finger motion recognition method based on a smart watch.
Background
In recent years, intelligent wearable devices have developed rapidly, and consumer wearables represented by smart watches and smart bracelets have become widely popular. However, because a smart watch has a small screen and only a few buttons, information interaction and everyday use are inconvenient.
Meanwhile, existing research on human activity recognition with smart watches focuses mainly on simple lower-limb behaviors, such as step counting; the capability of the smart hardware is not fully exploited, and user engagement remains low.
If a smart watch could distinguish different subtle finger actions, it would bring considerable commercial value, enrich the watch's functions, and simplify human-computer interaction. Early research on fine human finger movements relied mainly on special sensors and dedicated devices, such as muscle electrical signal (electromyography) sensors and custom gloves; although these have a strong ability to detect finger muscle movements, they require additional hardware and increase cost. At the same time, most current smart watches are equipped with only a single acceleration sensor, so studying the recognition of fine human finger motion based on a single acceleration sensor is highly worthwhile.
The invention provides a human finger action recognition method based on a smart watch. It runs on a general-purpose smart watch, adds no hardware cost, brings more intelligent applications to the watch, and improves the convenience of human-watch interaction.
Disclosure of Invention
The invention provides a fine human finger action recognition method based on a smart watch, intended to solve problems such as misoperation during information interaction between the user and the smart watch, and the high cost of finger-action recognition methods that depend on special equipment.
In order to accurately recognize human finger actions and operate the smart watch correctly, the invention uses the built-in acceleration sensor and the computing capability of a general-purpose smart watch to acquire and process data, and designs a human finger action recognition method that accurately recognizes finger actions and then operates and controls the smart watch. The method mainly comprises the following features:
(1) An algorithm for accurately extracting finger action segments. While the smart watch collects fine human finger actions, a data preprocessing algorithm filters the noise out of the raw data to obtain the acceleration signal data generated by valid finger actions. A data-sequence segmentation and action-segment extraction algorithm then divides the continuous acceleration signal sequence generated by finger actions into equal-length data segments, and acceleration variance and mean thresholds are used to filter out and extract the segments that contain a complete finger action, so that finger-action segment extraction is automated and the machine actively recognizes the action.
(2) A method for selecting feature values of finger-motion acceleration. The finger-action segments extracted by the smart watch are still in the raw form in which they were collected, so feature extraction is applied to obtain the acceleration feature values within each action window. A feature-selection algorithm then chooses, from all acceleration feature values of the finger motion, the subset with the best ability to represent the data, reducing system overhead.
(3) A scheme for controlling and operating the smart watch according to the recognized finger action. A finger-motion classification model is built by combining a classification algorithm with a large training set of finger-motion samples, and the model is stored in the smart-watch application. The selected feature values are used as the input to this classification model, so that the smart watch correctly understands and recognizes different finger motions and obtains a recognition result. The smart watch responds to the recognition result by performing the corresponding operation and control, achieving the goal of operating the smart watch with human finger actions.
Drawings
FIG. 1 is a schematic diagram of the overall flow of the system of the present invention
FIG. 2 is a schematic diagram of the smart watch collecting finger movements
FIG. 3 is a flow chart of action segment extraction in the present invention
FIG. 4 is a flow chart of feature value selection in the present invention
FIG. 5 is a schematic diagram of operating the smart watch in the present invention
Detailed Description
The present invention may be better understood, and its numerous objects, features, and advantages made apparent to those skilled in the art by referencing the accompanying drawings. It is to be expressly noted that in the following description, a detailed description of known functions and designs will be omitted when it may obscure the subject matter of the present invention.
The method comprises an algorithm for accurately extracting finger action segments from the finger action signals collected by the acceleration sensor of the smart watch, a method for selecting acceleration feature values of finger action segments, and a scheme for controlling and operating the smart watch according to the recognized finger actions. The overall flow of the system is shown in FIG. 1; each part is described in detail below.
(1) Algorithm for accurately extracting finger action segments
1) Collecting raw data of continuous finger movements
Fine human finger motion data are acquired mainly through the three-axis acceleration sensor of the smart watch worn on the wrist, which captures the acceleration produced at the moment of motion. Different fine finger motions produce different sensor variation amplitudes: the fingers are connected to tendon tissue at the wrist, so different actions generate different accelerations that are fed back to the smart watch. The acceleration sensor provides acceleration vector data in the three fixed dimensions x, y and z; the relationship between these axes and the watch screen is shown in FIG. 2.
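The description only specifies that the watch's built-in three-axis accelerometer supplies x, y and z acceleration values; the short Python sketch below shows one possible way to buffer such samples for the later processing steps. The sampling rate and buffer layout are assumptions, not taken from the patent.

```python
import numpy as np

# Hypothetical sampling rate; the patent does not specify one.
SAMPLE_RATE_HZ = 50

def buffer_samples(sample_stream, seconds=10):
    """Collect raw (ax, ay, az) tuples from the watch accelerometer
    into an (n, 3) array for the preprocessing stages that follow."""
    n = SAMPLE_RATE_HZ * seconds
    samples = []
    for ax, ay, az in sample_stream:
        samples.append((ax, ay, az))
        if len(samples) >= n:
            break
    return np.asarray(samples, dtype=float)  # shape (n, 3): columns x, y, z
```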
2) Data preprocessing (noise reduction and filtering)
The acquired data are inevitably affected by noise, mainly from two sources: first, the accelerometer built into the smart watch is affected by gravity, so its output contains a gravitational acceleration component in addition to the component generated by motion; second, the human arm and hand exhibit involuntary small tremors, and the watch may be worn at different angles and orientations, all of which interfere with fine finger-motion recognition. To obtain more stable and usable data features, the raw data therefore need some preprocessing, mainly noise reduction and filtering, before action segments and feature values are extracted.
i. Noise reduction
In order to measure the acceleration actually produced by the finger movement, the contribution of gravity must be removed from the accelerometer data. Because the gravitational component is the low-frequency part of the signal, a low-pass filter can be used to isolate gravity: the gravity signal and the motion-generated acceleration signal on each axis are separated from the raw accelerometer output, finally yielding the motion acceleration data produced by the finger action.
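A minimal sketch of the gravity separation described above, assuming a first-order exponential low-pass filter with smoothing factor `alpha`; the patent only says a low-pass filter is used, so the specific filter design and its parameter are assumptions.

```python
import numpy as np

def remove_gravity(acc, alpha=0.8):
    """Split raw accelerometer output into a low-frequency gravity estimate
    and the motion-induced (linear) acceleration on each axis.

    acc: (n, 3) array of raw x, y, z accelerometer samples.
    alpha: smoothing factor of the exponential low-pass filter (assumed value).
    """
    acc = np.asarray(acc, dtype=float)
    gravity = np.zeros_like(acc)
    gravity[0] = acc[0]
    for i in range(1, len(acc)):
        # Low-pass: the gravity estimate follows only the slow component.
        gravity[i] = alpha * gravity[i - 1] + (1 - alpha) * acc[i]
    linear = acc - gravity  # motion acceleration generated by the finger
    return gravity, linear
```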
ii. Filtering
To denoise the motion acceleration signal, the sum of squares of the motion acceleration on the three axes is computed as Acc_square = ax² + ay² + az². Acc_square is the squared magnitude of the acceleration vector in the three-dimensional space of the smart watch: it reflects only the change in acceleration, removes the influence of the angle and orientation in which the watch is worn, and captures the feedback strength of the wrist tendons for different finger actions. Because the arm and fingers tremble slightly and involuntarily during acquisition, the output still contains burrs and jitter from the remaining noise, so a moving-average filter is applied to Acc_square to smooth the motion acceleration and filter out burrs and random signals; the filtered data are denoted Acc_Filter.
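A sketch of this magnitude-and-smoothing step: Acc_square = ax² + ay² + az² is computed per sample and then passed through a moving-average filter to obtain Acc_Filter. The window length `w` is an assumed parameter, not given in the patent.

```python
import numpy as np

def acc_square(linear):
    """Sum of squared motion accelerations per sample: ax^2 + ay^2 + az^2."""
    return np.sum(np.asarray(linear, dtype=float) ** 2, axis=1)

def moving_average(signal, w=5):
    """Moving-average filter used to smooth Acc_square into Acc_Filter,
    suppressing burrs and random spikes caused by involuntary hand tremor.
    w is an assumed window length."""
    kernel = np.ones(w) / w
    return np.convolve(signal, kernel, mode="same")

# Usage: acc_filter = moving_average(acc_square(linear))
```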
3) Accurate extraction of single finger motion segment
i. Sliding window segmentation
Even after preprocessing, the continuous raw data sequence collected from the acceleration sensor still cannot directly represent human behaviour, so the data form must be transformed again and the sequence segmented to extract fine finger-action segments. The invention uses a sliding-window segmentation method with window size N and coverage rate Cov to divide the fine finger-motion data sequence into data segments.
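The patent parameterizes the segmentation by a window size N and a coverage rate Cov; the sketch below realizes this with a step of N·(1 − Cov) samples, which is one natural reading of those parameters, and the numeric defaults are placeholders.

```python
def sliding_windows(acc_filter, n=64, cov=0.5):
    """Split the continuous Acc_Filter sequence into equal-length segments
    D1, D2, ..., Dn of size n with coverage (overlap) rate cov.
    n and cov are assumed example values."""
    step = max(1, int(n * (1 - cov)))
    segments = []
    for start in range(0, len(acc_filter) - n + 1, step):
        segments.append(acc_filter[start:start + n])
    return segments
```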
ii. Extracting finger movement data segments
The data segments {D1, D2, D3, ..., Dn} produced by the sliding-window segmentation still contain static segments or segments disturbed by other motions, so the proper finger-motion data segments Di must be extracted. The invention uses an acceleration variance threshold VT and a mean threshold MT within the sliding window as the concrete measures of data fluctuation: when the acceleration variance Va and the mean Ma of all sampling points within the window satisfy Va > VT and Ma > MT, the window is taken as a fine finger-motion segment Di that meets the condition. The flow of precisely extracting finger-motion segments is shown in FIG. 3.
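A minimal sketch of the threshold test described above; the numeric values of VT and MT are placeholders, since the patent defines the thresholds but does not give concrete values.

```python
import numpy as np

def extract_action_segments(segments, v_t=0.05, m_t=0.5):
    """Keep only the windows whose acceleration variance Va and mean Ma both
    exceed the thresholds, i.e. Va > V_T and Ma > M_T; the remaining windows
    are treated as static or interfering segments and discarded."""
    actions = []
    for seg in segments:
        seg = np.asarray(seg, dtype=float)
        va, ma = np.var(seg), np.mean(seg)
        if va > v_t and ma > m_t:
            actions.append(seg)
    return actions
```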
(2) Method for selecting acceleration feature values of finger action segments
1) Feature extraction
The processed sensor data are still in the form in which they were collected, so the behavioural characteristics within each action window must be obtained through feature extraction before the classification model can accurately judge the behaviour category. Eight time-domain features (mean, variance, standard deviation, skewness, kurtosis, root mean square, maximum, and peak-to-valley range) are extracted from each of four channels: the three axes of the acceleration data (X, Y, Z) and the Acc_Filter data, giving 32 feature values in total for characterizing and classifying the action.
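The sketch below computes the eight time-domain features named above for one channel and assembles the 8 × 4 = 32-value feature vector per window; it is only an illustration of the stated feature set, using SciPy's sample skewness and kurtosis.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def channel_features(x):
    """The 8 time-domain features listed in the description, for one channel."""
    x = np.asarray(x, dtype=float)
    return [
        np.mean(x),                # mean
        np.var(x),                 # variance
        np.std(x),                 # standard deviation
        skew(x),                   # skewness
        kurtosis(x),               # kurtosis
        np.sqrt(np.mean(x ** 2)),  # root mean square
        np.max(x),                 # maximum
        np.max(x) - np.min(x),     # peak-to-valley range
    ]

def window_features(xyz_window, acc_filter_window):
    """8 features x 4 channels (X, Y, Z, Acc_Filter) = 32-value feature vector."""
    feats = []
    for axis in range(3):
        feats += channel_features(xyz_window[:, axis])
    feats += channel_features(acc_filter_window)
    return np.asarray(feats)
```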
2) Selecting the optimal characteristic value
To assess the usefulness of these feature values, feature evaluation based on information gain is used. The change in information before and after a data set is partitioned is called the information gain; by computing the information gain obtained when the data set is partitioned on each feature value, the features with the highest information gain are the best choices. The feature-selection process is shown in FIG. 4.
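A sketch of information-gain scoring for a continuous feature, assuming the feature is discretized into equal-width bins before the split; the bin count is an assumption, as the patent does not describe how continuous features are partitioned.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label vector."""
    _, counts = np.unique(np.asarray(labels), return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature_column, labels, bins=10):
    """Information gain of one feature: label entropy minus the conditional
    entropy of the labels after splitting the data set on the binned feature."""
    labels = np.asarray(labels)
    feature_column = np.asarray(feature_column, dtype=float)
    base = entropy(labels)
    edges = np.histogram_bin_edges(feature_column, bins=bins)
    binned = np.digitize(feature_column, edges)
    cond = 0.0
    for b in np.unique(binned):
        mask = binned == b
        cond += mask.mean() * entropy(labels[mask])
    return base - cond

# Rank features and keep those with the highest gain:
# gains = [information_gain(X[:, j], y) for j in range(X.shape[1])]
```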
(3) Scheme for controlling and operating the smart watch according to recognized finger actions
1) Classification recognition model construction
To recognize fine human finger motions, a classifier model is first constructed to classify finger actions. After a large training set of finger-motion samples has been processed and feature values have been selected, a well-performing finger-recognition classification model is built by combining them with a classification algorithm, and the model is stored on the smart watch.
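The patent only speaks of "a classification algorithm" without naming one; the sketch below uses a random forest from scikit-learn purely as an illustrative stand-in for that step, trained on the selected feature vectors of labelled finger-action windows.

```python
from sklearn.ensemble import RandomForestClassifier

def build_finger_classifier(X_train, y_train):
    """Train a classifier on the selected feature vectors of labelled
    finger-action windows. The patent does not name a specific algorithm;
    a random forest is used here only as an example choice. The trained
    model would then be stored in the smart-watch application."""
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)
    return clf

# At run time: predicted_action = clf.predict([feature_vector])[0]
```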
2) Operating and controlling the smart watch
The smart watch correctly identifies the type of finger motion and obtains a recognition result, which is sent to the smart-watch operation module; that module receives the finger-motion recognition result, responds to it, and controls the operation of the smart watch. The operation is shown in FIG. 5.
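One possible shape of the operation module is a simple dispatch table from recognized action labels to watch operations, sketched below; the action names and the watch methods are hypothetical, since the patent does not enumerate the concrete finger actions or the operations they trigger.

```python
def make_dispatcher(watch):
    """Map recognized finger-action labels to watch operations.
    Labels and watch methods here are hypothetical examples."""
    actions = {
        "snap": watch.open_menu,
        "double_tap": watch.confirm,
        "swipe_left": watch.previous_screen,
    }

    def on_recognized(label):
        handler = actions.get(label)
        if handler is not None:
            handler()  # respond to the recognition result
    return on_recognized
```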
The finger-action recognition and application method in the smart watch has the following advantages:
1) Accurate extraction of finger motion segments
The raw human finger-action data acquired by the smart watch form a continuous sequence disturbed by external noise. Using the accurate finger-action segment extraction algorithm and data preprocessing techniques, the raw hand data are denoised and filtered to obtain a continuous sequence of valid data. The data-sequence segmentation and action-segment extraction algorithm then divides this continuous sequence into equal-length data segments, and threshold comparison is used to accurately extract the finger-action segments.
2) Accurate understanding and recognition of finger actions with low equipment requirements
The invention accurately recognizes human finger actions based only on the single acceleration sensor built into a general-purpose smart watch. Compared with earlier finger-action recognition that relied on customized equipment, this reduces cost, simplifies human-computer interaction, and enriches the functions of the smart watch.
3) Operating the smart watch through finger actions, with convenient interaction
While wearing the smart watch, the user can control and operate it through human finger actions alone, without using the tiny screen and buttons.
Although the embodiments of the invention have been described above to help those skilled in the art understand it, it should be understood that the invention is not limited to the scope of these embodiments. To those skilled in the art, various changes will be apparent as long as they remain within the spirit and scope of the invention as defined by the appended claims, and all inventions making use of the inventive concept are protected.

Claims (5)

1. A human finger action recognition method based on a smart watch, characterized in that fine human finger action data are collected by the acceleration sensor built into the smart watch, fine finger-action segments and action feature values are accurately extracted, and the smart watch is controlled and operated by recognizing the fine finger actions;
an algorithm for accurately extracting finger action segments is used by the smart watch to preprocess the data, obtain the start and end points of a finger action by segmentation with a sliding-window segmentation method, and then extract the fine finger-action segment using the acceleration variance threshold VT and the mean threshold MT within the sliding window as the measurement criteria;
a method for selecting feature values of the finger-action acceleration signal is used to obtain only the action feature values of the acceleration signal generated by finger actions, reducing the system overhead of the smart watch; these action feature values are used as the input of a classification model to construct a classifier that recognizes fine finger actions;
the smart-watch control and operation scheme comprises a classification recognition model module, a finger-action recognition module and a smart-watch operation module; the classification recognition model module is used to construct the finger-action classification model and to classify and recognize finger actions, and the smart watch responds to the recognition result by performing the corresponding operation.
2. The human finger action recognition method based on a smart watch as claimed in claim 1, wherein the algorithm for accurately extracting finger action segments further comprises a data preprocessing algorithm and a data-sequence segmentation and action-segment extraction algorithm; the data preprocessing algorithm preprocesses the raw finger-action acceleration signals collected by the smart watch to obtain valid data, and the data-sequence segmentation algorithm obtains the start and end points of the finger action, after which the finger-action segments are extracted.
3. The human finger action recognition method based on a smart watch as claimed in claim 2, wherein the data-sequence segmentation and action-segment extraction algorithm further comprises a sliding-window segmentation method that cuts the motion acceleration to obtain the start and end points of a finger motion, and an algorithm that uses thresholds to obtain the finger-action segments;
the sliding-window segmentation method segments the motion acceleration data sequence into a number of data segments of equal length, which consist of meaningless action segments and finger-action segments;
the algorithm that uses thresholds to obtain the finger-action segments filters out the meaningless action segments and extracts the finger-action segments.
4. The human finger action recognition method based on a smart watch as claimed in claim 1, wherein the method for selecting feature values of the finger-action acceleration signal further comprises a feature-extraction module and a feature-selection algorithm, which extract the feature values of the finger-action acceleration signal and select the feature values with excellent data-representation capability as the input of the classification model used to construct the classifier that recognizes fine finger actions, thereby reducing system overhead.
5. The human finger action recognition method based on a smart watch as claimed in claim 1, wherein the finger-action recognition module is used to correctly recognize the finger-action category, obtain the finger-action result and send it to the smart-watch operation module;
and the smart-watch operation module is used to receive the finger-action recognition result, respond to it, and control the operation of the smart watch.
CN201710168313.1A 2017-03-21 2017-03-21 Human body finger action recognition method based on smart watch Active CN106919958B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710168313.1A CN106919958B (en) 2017-03-21 2017-03-21 Human body finger action recognition method based on smart watch

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710168313.1A CN106919958B (en) 2017-03-21 2017-03-21 Human body finger action recognition method based on smart watch

Publications (2)

Publication Number Publication Date
CN106919958A CN106919958A (en) 2017-07-04
CN106919958B true CN106919958B (en) 2020-06-26

Family

ID=59460665

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710168313.1A Active CN106919958B (en) 2017-03-21 2017-03-21 Human body finger action recognition method based on smart watch

Country Status (1)

Country Link
CN (1) CN106919958B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI665535B (en) * 2018-01-19 2019-07-11 巨擘科技股份有限公司 Function selection device and operating method thereof
CN108898062B (en) * 2018-05-31 2021-12-10 电子科技大学 Hand motion recognition method based on improved signal segment extraction algorithm
CN109508677B (en) * 2018-11-15 2021-07-13 电子科技大学 Improved CNN network-based recognition method for handwriting actions in air
WO2020140266A1 (en) 2019-01-04 2020-07-09 深圳大学 Interaction method and interactive system for smart watch
TWI689859B (en) * 2019-03-19 2020-04-01 國立臺灣科技大學 System for recognizing user gestures according to mechanomyogram detected from user's wrist and method thereof
CN110069199B (en) * 2019-03-29 2022-01-11 中国科学技术大学 Skin type finger gesture recognition method based on smart watch
CN110674683B (en) * 2019-08-15 2022-07-22 深圳供电局有限公司 Robot hand motion recognition method and system
CN111166340B (en) * 2019-12-31 2022-11-18 石家庄学院 Human body posture real-time identification method based on self-adaptive acceleration signal segmentation
CN111563468B (en) * 2020-05-13 2023-04-07 电子科技大学 Driver abnormal behavior detection method based on attention of neural network
CN112784731A (en) * 2021-01-20 2021-05-11 深圳市科思创动科技有限公司 Method for detecting physiological indexes of driver and establishing model
CN112817450A (en) * 2021-01-28 2021-05-18 Oppo广东移动通信有限公司 Action recognition method and device, electronic equipment and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9904360B2 (en) * 2013-11-15 2018-02-27 Kopin Corporation Head tracking based gesture control techniques for head mounted displays
CN103885584B (en) * 2014-01-06 2016-11-09 北京奇虎科技有限公司 Intelligent watch
CN104267807A (en) * 2014-09-16 2015-01-07 华东理工大学 Hand action mechanomyography based man-machine interaction method and interaction system
CN105068650B (en) * 2015-08-11 2018-05-25 王燕军 A kind of real time motion capture, the game bracelet of interaction
CN106468945A (en) * 2015-08-20 2017-03-01 上海汽车集团股份有限公司 Wearable device and its control method
CN106406571B (en) * 2016-08-31 2019-06-11 广州视源电子科技股份有限公司 Interactive identification method, interactive identification equipment and intelligent wearable device

Also Published As

Publication number Publication date
CN106919958A (en) 2017-07-04

Similar Documents

Publication Publication Date Title
CN106919958B (en) Human body finger action recognition method based on smart watch
US10061389B2 (en) Gesture recognition system and gesture recognition method
He et al. Activity recognition from acceleration data based on discrete consine transform and SVM
WO2018040757A1 (en) Wearable device and method of using same to monitor motion state
US9299248B2 (en) Method and apparatus for analyzing capacitive EMG and IMU sensor signals for gesture control
CN110069199B (en) Skin type finger gesture recognition method based on smart watch
Uddin et al. Random forests based recognition of human activities and postural transitions on smartphone
CN108371545B (en) Human body arm action sensing method based on Doppler radar
US10521018B2 (en) Human body-based interaction method and interaction apparatus
CN108703824B (en) Bionic hand control system and control method based on myoelectricity bracelet
CN107300971A (en) The intelligent input method and system propagated based on osteoacusis vibration signal
CN113609976B (en) Direction-sensitive multi-gesture recognition system and method based on WiFi equipment
Ahmed et al. An approach to classify human activities in real-time from smartphone sensor data
CN106228200A (en) A kind of action identification method not relying on action message collecting device
CN108985157A (en) A kind of gesture identification method and device
Carfi et al. Online human gesture recognition using recurrent neural networks and wearable sensors
CN110412566A (en) A kind of fine granularity human arm motion's recognition methods based on Doppler radar time and frequency domain characteristics
CN112464738A (en) Improved naive Bayes algorithm user behavior identification method based on mobile phone sensor
CN110866468A (en) Gesture recognition system and method based on passive RFID
Sheng et al. An adaptive time window method for human activity recognition
CN111700718A (en) Holding posture identifying method, holding posture identifying device, artificial limb and readable storage medium
Zheng et al. L-sign: Large-vocabulary sign gestures recognition system
CN113051972A (en) Gesture recognition system based on WiFi
CN112069483A (en) User identification and authentication method of intelligent wearable device
CN109620241B (en) Wearable device and motion monitoring method based on same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant