WO2017113389A1 - Wearable human-machine interaction apparatus, and human-machine interaction system and method - Google Patents


Info

Publication number
WO2017113389A1
Authority
WO
WIPO (PCT)
Prior art keywords
magnetic field
human
command
data
computer interaction
Prior art date
Application number
PCT/CN2015/100310
Other languages
French (fr)
Chinese (zh)
Inventor
王子健
黄立明
卓越
Original Assignee
西门子公司 (Siemens)
王子健
黄立明
卓越
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 西门子公司 (Siemens), 王子健, 黄立明 and 卓越
Priority to PCT/CN2015/100310 priority Critical patent/WO2017113389A1/en
Publication of WO2017113389A1 publication Critical patent/WO2017113389A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Definitions

  • the invention relates to the field of intelligent control technology, in particular to a wearable human-computer interaction device, a human-computer interaction system and a human-computer interaction method.
  • Another known human-computer interaction device uses an accelerometer and an artificial neural network to recognize an operator's arm posture, thereby controlling a robot.
  • However, the accelerometer must be kept in a horizontal position, and only simple actions can be identified, such as upward, downward, leftward and rightward movements and clockwise or counterclockwise rotation; complex commands cannot be implemented.
  • In another device, the operator's arm is fitted with myoelectric sensors, and the human-machine interaction device judges the action made by the operator from the sensed myoelectric signals, for example clenching a fist, extending a finger, or patting the palm.
  • However, the myoelectric sensors are cumbersome to wear, needing to be wrapped around the operator's entire arm, and they are expensive and of limited sensing accuracy.
  • Moreover, when the operator outputs instructions by clenching a fist or extending a finger, the number of distinguishable instructions is small, making it difficult to execute complicated commands in practice.
  • The object of the present invention is to provide a wearable human-machine interaction device, a human-computer interaction system and a human-computer interaction method which achieve higher control precision and support the input of both simple and complicated control commands.
  • The invention provides a human-computer interaction system, comprising: an accelerometer for sensing the acceleration of a command input object along three mutually perpendicular axes; a magnetic field meter for sensing the strength of the earth's magnetic field along the three mutually perpendicular axes of the command input object; a gyroscope for sensing the angular velocity of the command input object along the three mutually perpendicular axes; and a controller that receives the acceleration, the sensed earth magnetic field strength, the angular velocity, and the true earth magnetic field strength at the location of the command input object, and determines whether the sensed earth magnetic field strength is normal. If the earth magnetic field strength is normal, the controller processes the acceleration data and the magnetic field strength data sensed by the magnetic field meter, extracts features, recognizes the control command input by the command input object, and outputs the identified control command; if the earth magnetic field strength is abnormal, the controller instead processes the acceleration data and the angular velocity data, extracts features, recognizes the control command input by the command input object, and outputs it.
  • the controller includes:
  • a data preprocessing module that receives the acceleration sensed by the accelerometer, the earth magnetic field strength sensed by the magnetic field meter, the angular velocity sensed by the gyroscope, and the true earth magnetic field strength at the location of the command input object; performs denoising and correction processing on the acceleration, the sensed earth magnetic field strength, and the angular velocity; and compares the sensed earth magnetic field strength with the true earth magnetic field strength to determine whether the earth's magnetic field is disturbed;
  • a feature extraction module that receives the acceleration, the sensed magnetic field strength, and the angular velocity data processed by the data preprocessing module, and extracts feature data therefrom;
  • an instruction recognition module that receives the feature data extracted by the feature extraction module, identifies the command gesture action made by the command input object according to the feature data, and outputs the control instruction corresponding to that command gesture action.
  • the feature data is trajectory data or acceleration change data.
  • the feature extraction module receives the data output by the data pre-processing module, obtains three-dimensional posture information from it, and calculates from the three-dimensional posture information and the acceleration data the trajectory data corresponding to the command gesture action in the earth coordinate system; it then defines a virtual operation plane, projects the trajectory data onto that plane, and extracts feature data from the projected data.
  • the instruction recognition module inputs the trajectory feature data into a command gesture recognition model obtained by offline training to recognize the command gesture action, and then searches a command gesture and control instruction association database to acquire the control instruction corresponding to the recognized command gesture action.
  • the human-computer interaction system further includes:
  • a memory in which the command recognition model and the command gesture and control instruction association database are stored; after power-on, the command recognition model and the command gesture and control instruction association database are loaded into the controller.
  • the human-computer interaction system further includes:
  • a temperature sensor for sensing the ambient temperature at the location of the command input object and transmitting it to the data pre-processing module, which can correct the acceleration sensed by the accelerometer, the magnetic field strength sensed by the magnetic field meter, and the angular velocity sensed by the gyroscope according to the ambient temperature.
  • the human-computer interaction system further includes:
  • a feedback module and a pointing device control module: the control command output by the instruction recognition module is transmitted to a controlled device, the feedback module receives the information fed back by the controlled device and determines whether the control command was received or executed correctly, and the pointing device control module controls the pointing device to issue a corresponding indication according to that determination.
  • the indicating device includes at least one of a vibration motor and an indicator light.
  • the command input object is a human hand.
  • the present invention also provides a wearable human-machine interaction device comprising the human-computer interaction system according to any of the above.
  • the invention further provides a human-computer interaction method, characterized in that the human-computer interaction method comprises the following steps:
  • the acceleration and angular velocity data are processed, features are extracted, the control command input by the command input object is recognized, and the recognized control command is output.
  • the acceleration, the earth magnetic field strength, and the angular velocity are denoised and corrected.
  • the human-computer interaction method further includes the following steps:
  • the control command is transmitted to a controlled device, and the controlled device feeds back information about the execution of the control command;
  • the controller receives the acceleration of the command input object sensed by the accelerometer, the earth magnetic field strength sensed by the magnetic field meter, the angular velocity of the command input object sensed by the gyroscope, and the true earth magnetic field strength at the location of the command input object, and determines whether the sensed earth magnetic field strength is normal. When the sensed earth magnetic field strength is normal, the controller processes the acceleration data and the sensed earth magnetic field strength data, extracts features, and recognizes the control command input by the command input object; when the sensed earth magnetic field strength is abnormal, the controller instead processes the acceleration and angular velocity data. In this way, the recognition of control commands is not disturbed by magnetic interference.
  • control accuracy of the human-computer interaction system of the present invention is not affected by external noise or ambient light.
  • the wearable human-machine interaction device of the present invention can be worn directly on the operator's wrist; the operator can input commands simply by waving the arm, without touching the control panel. Even when the control panel is far away, control commands can still be input and the normal operation of the device controlled.
  • FIG. 1 is a schematic diagram of functions of a human-machine interaction system according to an embodiment of the present invention.
  • FIG. 2 is a schematic structural diagram of the human-machine interaction system of FIG. 1.
  • FIG. 3 is a startup flowchart of the human-machine interaction system shown in FIG. 2.
  • FIG. 4 is a flowchart of a command gesture motion pattern recognition thread in a human-computer interaction method according to an embodiment of the present invention.
  • FIG. 5 is a flow chart of gesture recognition in the human-computer interaction method shown in FIG. 4.
  • FIG. 6 is a flow chart of instruction feedback in the human-computer interaction method shown in FIG. 4.
  • FIG. 7 is a flow chart of an offline training process for the command recognition model in the human-computer interaction method shown in FIG.
  • FIG. 1 is a schematic diagram of functions of a human-machine interaction system according to an embodiment of the present invention.
  • FIG. 2 is a schematic structural diagram of the human-machine interaction system of FIG. 1.
  • the human-machine interaction system 10 of the present embodiment includes an accelerometer 12, a magnetic field meter 13, a gyroscope 14 and a controller 15, wherein the accelerometer 12, the magnetic field meter 13 and the gyroscope 14 are all electrically connected to the controller 15.
  • the human-computer interaction system 10 can sense the global attitude in real time, has high gesture recognition accuracy, and can be used to input commands and control mechanical motion.
  • the accelerometer 12 is a three-axis accelerometer that senses the acceleration of the command input object along three mutually perpendicular axes (e.g., the X, Y and Z axes) and transmits the sensed acceleration to the controller 15.
  • the magnetic field meter 13 is a three-axis magnetometer that senses the strength of the earth's magnetic field along the three mutually perpendicular axes of the command input object and transmits the sensed earth's magnetic field strength to the controller 15.
  • the gyroscope 14 is a three-axis gyroscope that senses the angular velocity of the command input object in the three mutually perpendicular axis directions and transmits the sensed angular velocity to the controller 15.
  • the acceleration, the strength of the earth magnetic field, and the angular velocity are all vectors.
  • in this embodiment, the command input object is a human hand, but it is not limited thereto.
  • Determining the posture of the human hand requires two kinds of information: the strength of the earth's magnetic field, and the acceleration of the human hand.
  • both sets of information are vectors in the X, Y and Z directions, and together they determine the three-dimensional posture.
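As a sketch of how these two vectors determine a three-dimensional posture, the following TRIAD-style construction builds a rotation matrix from the gravity and magnetic field vectors. This is illustrative, not from the patent: it assumes a north-east-down earth frame and a hand at rest, and the function names are made up.

```python
import math

def orientation_from_vectors(accel, mag):
    """Illustrative pose estimate: gravity fixes "down", the horizontal
    component of the magnetic field fixes "north"."""
    def norm(v):
        m = math.sqrt(sum(c * c for c in v))
        return [c / m for c in v]

    def cross(a, b):
        return [a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0]]

    down = norm(accel)             # accelerometer senses gravity at rest
    east = norm(cross(down, mag))  # down x field points east
    north = cross(east, down)      # completes the right-handed frame
    return [north, east, down]     # rows: north, east, down axes
```

With the sensors aligned to the earth frame (gravity on Z, field in the X-Z plane), the result is the identity rotation, as expected.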
  • Normally, the data sensed by the magnetic field meter 13 is relatively accurate and can be used for recognizing the posture of the human hand.
  • However, the strength of the earth's magnetic field can be affected by the environment; for example, near a stainless steel door frame the field is disturbed. In that case the strength sensed by the magnetic field meter 13 is inaccurate, affecting the subsequent recognition of the correct posture of the human hand.
  • In that case, the gyroscope 14 is used instead of the magnetic field meter 13 to compensate for the error in the earth's magnetic field, and the data sensed by the gyroscope 14 is used for recognizing the posture of the human hand.
  • However, when the gyroscope 14 is used for a long time its accuracy degrades and a drift error accumulates. Therefore, when the earth's magnetic field returns to normal, the system switches back to the magnetic field meter 13, i.e., the data sensed by the magnetic field meter 13 is again used for recognizing the posture of the human hand.
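A minimal sketch of this source-selection rule, assuming the comparison is made on the field magnitude; the function name and the tolerance value are assumptions, not from the patent:

```python
def select_attitude_source(sensed_field, true_field, tolerance=5.0):
    """Fall back to the gyroscope while the sensed earth-field magnitude
    deviates from the reference by more than the tolerance, and use the
    magnetometer again as soon as the field is normal (limiting the
    accumulation of gyroscope drift)."""
    if abs(sensed_field - true_field) <= tolerance:
        return "magnetometer"
    return "gyroscope"
```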
  • the controller 15 receives the acceleration, the sensed earth magnetic field strength, the angular velocity, and the true earth magnetic field strength at the position of the human hand, and determines whether the sensed earth magnetic field strength is normal. If the earth magnetic field strength is normal, the controller 15 processes the acceleration data and the earth magnetic field strength data sensed by the magnetic field meter, extracts features, recognizes the control command input by the human hand, and outputs the recognized control command; if the earth magnetic field strength is abnormal, the controller 15 processes the acceleration and angular velocity data instead, extracts features, recognizes the control command input by the human hand, and outputs the recognized control command.
  • the true earth magnetic field strength is offline-measured data giving the real field strength at each latitude and longitude on earth; it may be preset in the controller 15, or stored in the memory and loaded when used.
  • the controller 15 includes a data pre-processing module 152, a feature extraction module 153, and an instruction recognition module 154.
  • the data pre-processing module 152 communicates with the accelerometer 12, the magnetic field meter 13 and the gyroscope 14; it receives the acceleration sensed by the accelerometer 12, the earth magnetic field strength sensed by the magnetic field meter 13, the angular velocity sensed by the gyroscope 14, and the true earth magnetic field strength at the position of the human hand; it denoises and corrects the acceleration, the sensed earth magnetic field strength and the angular velocity; and it compares the earth magnetic field strength sensed by the magnetic field meter 13 with the true earth magnetic field strength to determine whether the earth's magnetic field is disturbed.
  • the correction processing mainly performs noise filtering, singular point removal, temperature error compensation, and the like on the acceleration, the earth magnetic field strength sensed by the magnetic field meter 13 and the angular velocity.
  • determining whether the earth's magnetic field is disturbed is done by comparing the strength sensed by the magnetic field meter 13 with the true earth magnetic field strength: if the error between the two is less than or equal to a preset threshold, the earth's magnetic field is judged not to be disturbed; if the error is greater than the preset threshold, the earth's magnetic field is judged to be disturbed.
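This check can be sketched as follows. The patent does not specify the error metric; here the field magnitude is computed from the three axis components, and the threshold units are assumed to match the field units:

```python
import math

def field_disturbed(sensed_xyz, true_magnitude, threshold):
    """Judge disturbance by comparing the magnitude of the sensed
    3-axis earth-field vector against the known true magnitude."""
    sensed_magnitude = math.sqrt(sum(c * c for c in sensed_xyz))
    # Disturbed when the error exceeds the preset threshold.
    return abs(sensed_magnitude - true_magnitude) > threshold
```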
  • the feature extraction module 153 receives the acceleration processed by the data preprocessing module 152, the earth magnetic field strength sensed by the magnetic field meter 13, and the angular velocity data information, and extracts feature data therefrom.
  • the feature extraction module 153 receives the data output by the data preprocessing module 152 and obtains three-dimensional posture information from it; from the three-dimensional posture information and the acceleration data, the trajectory data corresponding to the command gesture action in the earth coordinate system can be calculated (through a second integration). A virtual operation plane is then defined (so that all motion trajectory data lie close to this plane), the trajectory data are projected onto the virtual operation plane, and feature data are extracted from the projected data.
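The second integration and the plane projection can be sketched as below. This is a simplified illustration: rectangular integration with a fixed time step, a plane through the origin, and made-up function names; a real implementation would also handle drift and plane fitting.

```python
def integrate_trajectory(accels, dt):
    """Double-integrate earth-frame acceleration samples into positions."""
    vel = [0.0, 0.0, 0.0]
    pos = [0.0, 0.0, 0.0]
    traj = [tuple(pos)]
    for a in accels:
        for i in range(3):
            vel[i] += a[i] * dt    # first integration: velocity
            pos[i] += vel[i] * dt  # second integration: position
        traj.append(tuple(pos))
    return traj

def project_to_plane(points, normal):
    """Project each trajectory point onto the virtual operation plane
    through the origin with the given unit normal."""
    out = []
    for p in points:
        d = sum(p[i] * normal[i] for i in range(3))
        out.append(tuple(p[i] - d * normal[i] for i in range(3)))
    return out
```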
  • From these feature data the command gesture action can be recognized; the virtual operation plane is a two-dimensional plane.
  • This two-dimensional plane is not necessarily perpendicular to the horizontal plane; that is, a command written by a person is nominally three-dimensional, but generally lies approximately on a two-dimensional plane.
  • When the earth's magnetic field is not disturbed, the three-dimensional attitude information is obtained from the acceleration data sensed by the accelerometer 12 and the earth magnetic field strength data sensed by the magnetic field meter 13.
  • When the earth's magnetic field is disturbed, the three-dimensional attitude information is obtained from the acceleration data sensed by the accelerometer 12 and the angular velocity data sensed by the gyroscope 14, but the invention is not limited thereto.
  • the feature extraction module 153 can also extract the feature data corresponding to the three-dimensional command gesture action.
  • the command recognition module 154 is electrically connected to the feature extraction module 153, receives the feature data extracted by the feature extraction module 153, and inputs the feature data into the command gesture recognition model obtained by offline training, thereby identifying the command gesture action issued by the command input object.
  • it then searches the command gesture and control instruction association database to acquire the control command corresponding to the command gesture action, and transmits the control command to the controlled device.
  • the feature data is a set of data used to represent a motion trajectory mode, which may be trajectory data, acceleration change data, or other representations.
  • the instruction recognition module 154 inputs the trajectory feature data into the command gesture recognition model obtained by offline training and recognizes the command gesture action, for example determining from the trajectory feature data whether the command gesture action is “S” or “A”. The instruction recognition module 154 then looks the command gesture action up in the command gesture and control instruction association database, thereby obtaining the control command, and transmits the control command to the controlled device.
  • the command gesture and the control instruction associated database may be provided with a preset table between the command gesture action and the control command, and different command gesture actions correspond to different control commands.
  • the command gesture recognition model is used to identify the command gesture action; it is obtained by offline training on collected data and is loaded into the controller 15 for real-time operation. The instruction recognition module 154 inputs the feature data extracted by the feature extraction module 153 into the command gesture recognition model to identify the command gesture action. After the command gesture action is identified, the corresponding control command can be looked up in the command gesture and control instruction association database, which is defined manually; for example, the command gesture action “A” may correspond to the control command “move to the left” or “move to the right”.
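The association database amounts to a lookup table. The patent's only concrete example is the gesture “A” mapping to a move command, so the other entries below are made up for illustration:

```python
# Hypothetical command gesture -> control command association database.
# Only the "A" example appears in the text; the rest are illustrative.
GESTURE_COMMANDS = {
    "A": "move to the left",
    "S": "stop",
    "L": "lower",
}

def lookup_command(gesture):
    """Return the associated control command, or None when the
    recognized gesture has no entry in the database."""
    return GESTURE_COMMANDS.get(gesture)
```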
  • the recognition of the command gesture action may be based on the trajectory feature data, or on other methods, such as the change of acceleration direction along the motion trajectory; for example, for the command gesture action “L” the acceleration direction is first downward and then rightward. In that case, the information input into the command recognition model is no longer trajectory information but acceleration change information.
  • the human-computer interaction system 10 further includes a temperature sensor 16 and a pointing device 17.
  • the temperature sensor 16 senses the ambient temperature at the location of the human hand and transmits it to the data pre-processing module 152 of the controller 15; according to the ambient temperature sensed by the temperature sensor 16, the data pre-processing module 152 can correct the acceleration sensed by the accelerometer 12, the earth magnetic field strength sensed by the magnetic field meter 13, and the angular velocity sensed by the gyroscope 14.
  • The temperature sensor 16 senses the ambient temperature for calibration and drift compensation.
  • The sensing accuracy of the accelerometer 12, the magnetic field meter 13 and the gyroscope 14 is affected by temperature, so they need to be pre-calibrated or dynamically calibrated. The accelerometer 12, the magnetic field meter 13 and the gyroscope 14 are calibrated before leaving the factory, which provides the deviation relationship between the measured value and the true value of each sensor under different temperature conditions; the measured values can then be calibrated according to the temperature sensed by the temperature sensor 16, so that they are closer to the true values.
  • For example, suppose the measured value of the accelerometer 12 is 10 m/s² while the true value is 9.5 m/s²; the error of 0.5 m/s² between the measured value and the true value can be eliminated by calibration, that is, the measured value is calibrated to obtain a corrected measurement equal to the true value.
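The factory deviation relationship can be applied as in the sketch below. The table values, linear interpolation, and function name are assumptions, chosen so that the 10 m/s² reading in the example above corrects to 9.5 m/s²:

```python
def calibrate(measured, temp, deviation_table):
    """Correct a sensor reading using a factory table that maps
    temperature -> (measured - true) offset; interpolate linearly
    between table points and clamp outside the characterised range."""
    temps = sorted(deviation_table)
    if temp <= temps[0]:
        offset = deviation_table[temps[0]]
    elif temp >= temps[-1]:
        offset = deviation_table[temps[-1]]
    else:
        for lo, hi in zip(temps, temps[1:]):
            if lo <= temp <= hi:
                f = (temp - lo) / (hi - lo)
                offset = (1 - f) * deviation_table[lo] + f * deviation_table[hi]
                break
    return measured - offset
```

With an illustrative table `{0.0: 0.0, 40.0: 1.0}` (offset in m/s²), a reading of 10 m/s² at 20 °C is corrected by 0.5 m/s² to 9.5 m/s².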
  • After the control command is sent to the controlled device, the operator needs to know whether the control command has been received, or whether it has been executed correctly after reception; the corresponding reception and execution information is indicated by the pointing device 17.
  • The pointing device 17 means the operator does not have to constantly observe the operating panel; even when far from the panel, the operator can still know whether the control command has been received or executed.
  • the pointing device 17 includes a vibration motor 172 and an indicator light 173.
  • the controller 15 further includes a feedback module 155, a first pointing device control module 156 and a second pointing device control module 157. The control command output by the instruction recognition module 154 is transmitted to the controlled device, and the feedback module 155 receives the information fed back by the controlled device and determines whether the control command has been received or executed correctly.
  • the first indicating device control module 156 is electrically connected to the vibration motor 172 and the feedback module 155, and the second indicating device control module 157 is electrically connected to the indicator light 173 and the feedback module 155.
  • the first pointing device control module 156 and the second pointing device control module 157 control the pointing device 17 to issue a corresponding indication based on whether the control command determined by the feedback module 155 is received or correctly executed.
  • the vibration mode of the vibration motor 172 and the indication mode of the indicator light 173 can be set arbitrarily according to actual needs. For example, when the control command is executed correctly, the first pointing device control module 156 controls the vibration motor 172 to issue two long, spaced vibrations; when the control command is executed incorrectly, it controls the vibration motor 172 to vibrate continuously.
  • the indicator light 173 can be set in three colors: red, yellow and green. Red indicates that an error occurred in executing the control command and serves as an alarm signal; green indicates that the control command was executed correctly; yellow indicates that the control command was not received.
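This colour scheme is a simple status mapping; a sketch, with illustrative status names not taken from the patent:

```python
def indicator_color(status):
    """Map the feedback result to the indicator light colour described
    above: green = executed correctly, red = execution error (alarm),
    yellow = command not received."""
    return {"executed": "green",
            "error": "red",
            "not_received": "yellow"}.get(status)
```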
  • the indicating device 17 includes at least one of the vibration motor 172 and the indicator light 173; that is, the vibration motor 172 and the indicator light 173 can be used simultaneously, or only one of them can be used. When the vibration motor 172 is used, the first pointing device control module 156 is correspondingly used; when the indicator light 173 is used, the second pointing device control module 157 is correspondingly used.
  • the indicating device 17 can also be a code table, through which a number of feedback status codes can be set; for example, 01 indicates success, 02 indicates an error, and 03 indicates an alarm.
  • the human-computer interaction system 10 further includes a memory 18 and a power source 19. The memory 18 is connected to the controller 15, and the controller 15 can obtain stored information from the memory 18; for example, the command recognition model and the command gesture and control instruction association database can be stored in the memory 18 and loaded into the controller 15 after power-on. The vibration mode of the vibration motor 172 and the indication mode of the indicator light 173 can also be stored in the memory 18; after the controller 15 reads them from the memory 18, it can perform control in the corresponding mode.
  • the power source 19 supplies power to the entire human-machine interaction system 10. As shown in FIG. 2, the power source 19 is electrically connected to the controller 15, and power for the accelerometer 12, the magnetic field meter 13, the gyroscope 14, the temperature sensor 16 and the pointing device 17 is supplied through the pins of the controller 15.
  • Alternatively, the accelerometer 12, the magnetic field meter 13, the gyroscope 14, the temperature sensor 16 and the pointing device 17 may be powered directly from leads of the power source 19.
  • the command recognition model may also be preset in the controller 15, and the controller 15 may communicate with the accelerometer 12, the magnetic field meter 13, the gyroscope 14, the temperature sensor 16, the pointing device 17 and the memory 18 via Bluetooth Low Energy or Bluetooth 4.0, but the communication method is not limited thereto.
  • the present invention also provides a wearable human-machine interaction device comprising the above-described human-machine interaction system 10.
  • the wearable human-machine interaction device can be made into a wristband worn directly on the operator's wrist. In some working environments, for example when carrying heavy objects or protecting against radiation, the operator needs to wear gloves, and once gloves are worn the control panel cannot be operated directly by hand; the wearable human-machine interaction device enables the input of control commands and the control of normal device operation without touching the control panel.
  • FIG. 3 is a startup flowchart of the human-machine interaction system shown in FIG. 2. Before the human-computer interaction system 10 is in normal use, the human-computer interaction system 10 performs the steps shown in FIG. 3:
  • Step S22, hardware initialization; for example, initialization of communication interfaces such as IIC (Inter-Integrated Circuit), SPI (Serial Peripheral Interface) and GPIO (General Purpose Input/Output), and initialization of the accelerometer 12, the magnetic field meter 13 and the gyroscope 14;
  • the IIC, SPI, GPIO, accelerometer 12, magnetic field meter 13 and gyroscope 14 are powered on, ready for normal operation;
  • Step S23 defining global variables; this step stores the global variables of the software runtime, for example, a buffer of recent sensor data, the earth coordinate system (x, y, z), and the virtual operation plane coordinate system (x, y, z);
  • Step S24 parameter initialization; for example, loading the command recognition model, the command gesture and control instruction association database, the vibration modes of the vibration motor 172, and the indication modes of the indicator light 173; the command gesture and control instruction association database may include a preset table of command gesture actions and control instructions;
  • Step S25 starting the real-time data processing thread; for example, the feature extraction module 153 and the instruction recognition module 154 are activated: the feature extraction module 153 extracts trajectory feature data, and the instruction recognition module 154 inputs the extracted trajectory feature data into the command recognition model, identifies the command gesture action, and obtains the corresponding control instruction by searching the command gesture and control instruction association database.
  • The feature extraction module 153 can also extract other types of feature data.
  • the human-computer interaction method of the present invention includes the following steps:
  • the data information of the acceleration and the angular velocity is processed, the control command input by the command input object is extracted and recognized, and the recognized control command is output.
  • the human-computer interaction method of the present invention further includes the following steps:
  • De-noising and correction processing is performed on the acceleration, the earth magnetic field strength, and the angular velocity.
  • the control command is transmitted to a controlled device, and the controlled device feeds back information about the execution of the control command;
  • the pointing device 17 issues a corresponding indication according to whether the control command was judged to have been received or correctly executed.
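The de-noising and correction processing mentioned in the steps above can be sketched as follows. This is an illustrative Python sketch, not part of the patent: the moving-average window and the constant-bias correction are assumptions, standing in for whatever filter and calibration the data preprocessing module 152 actually applies.

```python
def denoise(samples, window=3):
    """Smooth a 1-D sample sequence with a centered moving average
    (an assumed, minimal stand-in for the de-noising step)."""
    n = len(samples)
    out = []
    for i in range(n):
        lo = max(0, i - window // 2)
        hi = min(n, i + window // 2 + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

def correct_bias(samples, bias):
    """Remove a previously calibrated constant sensor offset
    (an assumed, minimal stand-in for the correction step)."""
    return [s - bias for s in samples]
```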
  • FIG. 4 is a flowchart of a command gesture motion pattern recognition thread in a human-computer interaction method according to an embodiment of the present invention.
  • the command gesture motion pattern recognition thread includes the following steps:
  • Step S32 reconstructing the motion trajectory
  • The feature extraction module 153 receives the acceleration, the magnetic field strength sensed by the magnetic field meter 13, and the angular velocity data processed by the data preprocessing module 152, and obtains three-dimensional posture information from them. Based on the three-dimensional posture information, the acceleration data of the command gesture motion is converted to the earth coordinate system and then integrated twice (the first integration yields velocity data, the second yields trajectory data), thereby obtaining the movement trajectory data of the command gesture action;
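The double integration in step S32 — earth-frame acceleration integrated once to velocity and a second time to the movement trajectory — can be sketched with a cumulative sum. This is a hypothetical illustration assuming a constant sample interval `dt` and zero initial velocity and position, not the patent's actual implementation.

```python
def integrate_trajectory(accel_earth, dt):
    """Integrate earth-frame acceleration twice: the first integration
    yields velocity, the second yields the movement trajectory.

    accel_earth: iterable of (ax, ay, az) samples in the earth frame.
    dt: assumed constant sampling interval in seconds.
    """
    velocity, position = [], []
    v = [0.0, 0.0, 0.0]
    p = [0.0, 0.0, 0.0]
    for a in accel_earth:
        v = [vi + ai * dt for vi, ai in zip(v, a)]   # first integration
        p = [pi + vi * dt for pi, vi in zip(p, v)]   # second integration
        velocity.append(v)
        position.append(p)
    return velocity, position
```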
  • Step S33 determining whether a segmentation of the motion trajectory is found;
  • The feature extraction module 153 determines from the movement trajectory data whether a segmentation of the motion trajectory can be found; if yes, step S34 is performed, if not, step S32 is performed again. The movement trajectory data obtained in step S32 may correspond to a set of multiple command gesture actions; for example, when the two command gestures "S" and "A" are input in succession, the segmentation point between the two trajectories must be found so that the trajectory data can be split into two groups, one per command gesture action.
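One simple way to find the segmentation point between two consecutive gestures such as "S" and "A" is to look for a pause, i.e. a run of samples where the hand speed stays near zero. The function below is an illustrative sketch (the speed threshold and minimum gap length are assumed values), not the segmentation criterion defined by the patent.

```python
def segment_by_pause(speeds, threshold=0.05, min_gap=2):
    """Split a trajectory into segments of sample indices wherever the
    hand speed stays below `threshold` for at least `min_gap` samples."""
    segments, current = [], []
    still = 0
    for i, s in enumerate(speeds):
        if s < threshold:
            still += 1
            if still >= min_gap and current:
                segments.append(current)  # a pause ends the current segment
                current = []
        else:
            still = 0
            current.append(i)
    if current:
        segments.append(current)
    return segments
```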
  • Step S34 defining a virtual operation plane; for each set of trajectory data obtained in step S33, the feature extraction module 153 defines a corresponding virtual operation plane such that most of that set of trajectory data lies on, or very close to, the plane;
  • Step S35 extracting feature data; the feature extraction module 153 projects each set of trajectory data (corresponding to one gesture command) onto its virtual operation plane and extracts feature data from the projected trajectory for command gesture recognition. The extracted feature data may be motion trajectory feature data, acceleration change data, or the like.
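Defining a virtual operation plane that most of a trajectory lies close to (step S34), and projecting the trajectory onto it (step S35), can be sketched with a least-squares plane fit: the plane normal is the direction of least variance of the points. This NumPy sketch illustrates the idea only; the function names and the PCA approach are assumptions, not the patent's stated method.

```python
import numpy as np

def fit_virtual_plane(points):
    """Fit a plane through 3-D trajectory points: the normal is the
    least-variance direction (last right singular vector of the
    centered point cloud)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return centroid, normal

def project_to_plane(points, centroid, normal):
    """Project each 3-D point onto the fitted plane."""
    pts = np.asarray(points, dtype=float)
    d = (pts - centroid) @ normal          # signed distance to the plane
    return pts - np.outer(d, normal)
```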
  • Step S36 identifying the command gesture action; the instruction recognition module 154 identifies the command gesture action according to the feature data extracted by the feature extraction module 153: the feature data extracted in S35 is input to the command recognition model obtained by offline training, which outputs the corresponding command gesture action, for example recognizing it as "S";
  • Step S37 searching the command gesture and control instruction association database and determining whether the recognized command gesture action matches a control command; if yes, step S38 is executed, if not, step S39. The instruction recognition module 154 determines whether the command gesture action corresponds to a control instruction in the association database and, if so, obtains that control instruction;
  • Step S38 issuing a control instruction
  • Step S39 caching the command gesture action; the human-computer interaction system 10 can support the input of complex control commands. The command gesture action "S" can represent a single control command, and "S" can also be combined with other letters to represent a control instruction; for example, the command gesture action "SA" represents one control command in which "S" is one segment and "A" is another. When "S" is recognized but no control command corresponding to "S" alone is found, "S" is cached first; in the next round of recognition, if "A" is recognized, the control command corresponding to "SA" is found. After step S39, step S32 is executed again.
  • FIG. 5 is a flow chart of gesture recognition in the human-computer interaction method shown in FIG. 4.
  • Gesture recognition extracts feature data in order to identify command gesture actions, and comprises the following steps:
  • Step S42 data preprocessing; the data preprocessing module 152 performs denoising and correction processing on the acceleration sensed by the accelerometer 12, the intensity of the earth magnetic field sensed by the magnetic field meter 13, and the angular velocity sensed by the gyroscope 14.
  • Step S43 determining whether the earth's magnetic field is disturbed; if so, step S44 is executed, if not, step S45. The true earth magnetic field strength for each latitude and longitude may be preset in the controller 15 or stored in the memory 18, and the data pre-processing module 152 compares the earth magnetic field strength sensed by the magnetic field meter 13 with the true earth magnetic field strength: if the error between the two is less than or equal to a preset threshold, the earth's magnetic field is judged not to be disturbed; if the error is greater than the preset threshold, the earth's magnetic field is judged to be disturbed;
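The comparison in step S43 can be sketched as a magnitude check against the preset reference field for the current latitude and longitude. The threshold value and units (microtesla) are assumptions for illustration; the patent only requires some preset threshold.

```python
def magnetic_field_disturbed(sensed_xyz, reference_magnitude, threshold=5.0):
    """Return True if the sensed field deviates from the reference
    earth-field magnitude by more than the preset threshold (assumed uT)."""
    magnitude = sum(c * c for c in sensed_xyz) ** 0.5
    return abs(magnitude - reference_magnitude) > threshold
```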
  • Step S44 recognizing the command gesture action according to the data sensed by the accelerometer 12 and the gyroscope 14;
  • Step S45 identifying the command gesture action according to the data sensed by the accelerometer 12 and the magnetic field meter 13;
  • Step S46 outputting the gesture recognition information; the data pre-processing module 152 transmits the pre-processed acceleration, the magnetic field strength sensed by the magnetic field meter 13, and the angular velocity data to the feature extraction module 153, which extracts the feature data used to recognize the command gesture action.
  • Gesture recognition is a continuous loop: once the gesture at one moment is recognized, the process returns to step S42 and the gesture at the next moment is recognized, and so on.
  • FIG. 6 is a flow chart of instruction feedback in the human-computer interaction method shown in FIG. 4.
  • the instruction feedback in the human-computer interaction method includes the following steps:
  • Step S52 parsing the feedback instruction; the feedback module 155 receives the information fed back by the controlled device, determines whether the control instruction was correctly executed, and transmits the analysis result to the first pointing device control module 156 and the second pointing device control module 157;
  • Step S53 determining the vibration mode; the first pointing device control module 156 determines, according to the analysis result of the feedback module 155, in which mode the vibration motor 172 should vibrate; for example, when the control command is correctly executed, the vibration motor 172 issues two long vibrations separated by an interval, and when the control command is executed incorrectly, the vibration motor 172 vibrates frequently and continuously;
  • Step S54 controlling the vibration motor 172 to vibrate;
  • the first pointing device control module 156 controls the vibration motor 172 to vibrate in the determined vibration mode;
  • Step S55 determining the indication mode of the indicator light 173; the second pointing device control module 157 determines, according to the analysis result of the feedback module 155, in which mode the indicator light 173 should indicate; for example, the red light indicates an error in executing the control instruction, the green light indicates that the control command was executed correctly, and the yellow light indicates that the control command was not received;
  • Step S56 controlling the indicator light 173; the second pointing device control module 157 controls the indicator light 173 to illuminate in the determined indication mode.
  • Step S52 is then performed again, so that the process cycles sequentially.
  • step S53 and step S55 may be performed simultaneously, and step S54 and step S56 may be performed simultaneously, but not limited thereto.
  • step S53 and step S54 may be omitted, or step S55 and step S56 may be omitted.
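The mode selection of steps S53 and S55 amounts to a lookup from the parsed feedback result to a vibration mode and an indicator-light mode. The sketch below follows the examples given above; the mode names and result labels are illustrative assumptions.

```python
# Assumed labels for the parsed feedback result and the output modes.
VIBRATION_MODES = {
    "ok":         "two_long_pulses",       # command correctly executed
    "exec_error": "frequent_continuous",   # command executed incorrectly
}
INDICATOR_MODES = {
    "ok":           "green",
    "exec_error":   "red",
    "not_received": "yellow",
}

def feedback_actions(result):
    """Return the (vibration mode, indicator mode) pair for a parsed
    feedback result; None where no mode is defined for that result."""
    return VIBRATION_MODES.get(result), INDICATOR_MODES.get(result)
```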
  • The command recognition model determines which command gesture action a motion represents, and needs to be trained in advance to meet complex and changeable usage requirements. After a command gesture action is recognized, it is compared with the information in the command gesture and control instruction association database, so that the control command represented by the gesture can be determined; the gesture-to-command associations in the database can be set arbitrarily according to the actual situation.
  • the formation process of the command recognition model includes the following steps:
  • Step S62 creating a new action mode
  • Step S63 collecting sensor data corresponding to a single command gesture action; the accelerometer 12 senses the acceleration of the human hand, the magnetic field meter 13 senses the earth magnetic field strength at the position of the human hand, and the gyroscope 14 senses the angular velocity of the human hand; the corresponding sensor data is collected multiple times.
  • Step S64 data preprocessing and feature data extraction; the data preprocessing module 152 performs denoising and correction processing on the acceleration of the human hand sensed by the accelerometer 12, the earth magnetic field strength at the position of the human hand sensed by the magnetic field meter 13, and the angular velocity of the human hand sensed by the gyroscope 14. The feature extraction module 153 receives the processed acceleration, earth magnetic field strength, and angular velocity information and extracts the feature data, which may be trajectory data or acceleration change data. In detail, the feature extraction module 153 receives the data information output by the data preprocessing module 152 and obtains three-dimensional posture information from it; from the three-dimensional posture information and the acceleration data, the trajectory data corresponding to the command gesture action in the earth coordinate system can be calculated (by double integration). A virtual operation plane is then defined (so that all the trajectory data lies close to the plane), the trajectory data is projected onto the virtual operation plane, and the feature data is extracted from the projected data.
  • Step S65 command gesture recognition model training; the command gesture recognition model is trained based on the feature data extracted from the single command gesture action, for example using a hidden Markov process or a similar method;
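The description mentions hidden-Markov-style methods for step S65. As a much simpler stand-in that shows the same train-then-recognize structure, the sketch below averages the feature vectors collected for each gesture into a template and classifies new feature vectors by nearest template. It is purely illustrative and not the patent's model; all names are assumptions.

```python
def train_templates(samples_by_gesture):
    """Train one template per gesture by averaging its feature vectors.
    samples_by_gesture: e.g. {"S": [vec, vec, ...], "A": [vec, ...]}"""
    templates = {}
    for gesture, vectors in samples_by_gesture.items():
        n = len(vectors)
        templates[gesture] = [sum(col) / n for col in zip(*vectors)]
    return templates

def recognize(feature_vec, templates):
    """Classify a feature vector by Euclidean distance to each template."""
    def dist(template):
        return sum((a - b) ** 2 for a, b in zip(feature_vec, template)) ** 0.5
    return min(templates, key=lambda g: dist(templates[g]))
```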
  • Step S66 determining whether another command gesture model needs to be added; if so, step S62 is performed again, if not, step S67 is performed. The human-computer interaction system 10 can support the input of complicated control instructions; for example, the recognition model for "S" is trained first, and to obtain the recognition model for "A", step S62 is executed again. That is, if a command corresponds to multiple command gesture actions, data for the next command gesture action continues to be collected and the corresponding command gesture recognition model is trained;
  • Step S67 combining the command gesture recognition models; for example, the recognition model for "S" and the recognition model for "A" are combined into the recognition model "SA", supporting the recognition of complex command gestures;
  • Step S68 associating the command gesture action with the control instruction; setting, in the command gesture and control instruction association database, which control instruction each command gesture action represents;
  • Step S69 forming the command recognition model and the command gesture and control instruction association database; the trained command recognition model and the association database are stored.
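The storage in step S69 and the reload at startup (step S24, parameter initialization) can be sketched as a simple serialization round-trip. The JSON format, file name, and function names are assumptions for illustration only.

```python
import json
import os
import tempfile

def save_association_db(db, path):
    """Persist the gesture-to-command association database (step S69)."""
    with open(path, "w") as f:
        json.dump(db, f)

def load_association_db(path):
    """Load the association database back at startup (step S24)."""
    with open(path) as f:
        return json.load(f)
```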
  • the wearable human-machine interaction device, human-computer interaction system and method of the present invention have at least the following advantages:
  • the controller can receive the acceleration of the command input object sensed by the accelerometer, the earth magnetic field strength sensed by the magnetic field meter, the angular velocity of the command input object sensed by the gyroscope, and the true earth magnetic field strength at the position of the command input object, and judge whether the sensed earth magnetic field strength is normal; control commands are then recognized from the acceleration and magnetic field data when the field is normal, or from the acceleration and angular velocity data when it is not, so that high control precision is maintained even when the magnetic field is disturbed, and simple or complex control instructions are supported.
  • the control precision of the human-computer interaction system of the present invention is not affected by external noise or ambient light.
  • the wearable human-machine interaction device of the present invention can be worn directly on the operator's wrist, and the operator can realize control by simply waving an arm, without touching the control panel; even when far from the control panel, control commands can still be input and the normal operation of the equipment controlled.
  • the human-computer interaction system further includes a temperature sensor and a data pre-processing module; the temperature sensor senses the ambient temperature, and the data pre-processing module can correct the acceleration sensed by the accelerometer, the earth magnetic field strength sensed by the magnetometer, and the angular velocity sensed by the gyroscope according to that ambient temperature, which helps improve control precision.
  • the human-computer interaction system is pre-configured with a command gesture and control instruction association database; when a command gesture action is recognized, it is looked up in this database to determine which control command it represents, so that even complicated command gesture actions can easily be matched to their control commands.
  • the human-computer interaction system can support the input of complex control commands: when a command gesture action includes multiple segments, the recognized partial segments are cached first, and once the remaining segments are recognized, all segments are combined to find the control command corresponding to the command gesture action.
  • the command recognition model can be formed by offline training, and the command gesture actions and control commands can be set arbitrarily according to actual needs; their correspondence can adapt to complex and varied usage requirements, making the application convenient and flexible.


Abstract

A wearable human-machine interaction apparatus, a human-machine interaction system (10) and a human-machine interaction method. The human-machine interaction system (10) comprises an accelerometer (12), a magnetometer (13), a gyroscope (14) and a controller (15). If the earth's magnetic field intensity sensed by the magnetometer (13) is normal, the controller (15) processes the data information of the acceleration sensed by the accelerometer (12) and the magnetic field intensity sensed by the magnetometer (13), extracts and recognizes the control instruction input by an instruction input object, and then outputs the recognized control instruction. If the sensed magnetic field intensity is abnormal, the controller (15) instead processes the acceleration and angular velocity information, extracts and recognizes the control instruction input by the instruction input object, and then outputs the recognized control instruction. By means of the wearable human-machine interaction apparatus and the human-machine interaction system and method, high control precision can be achieved, and the input of simple or complex control instructions is supported.

Description

Wearable human-computer interaction device, human-computer interaction system and method

Technical Field

The invention relates to the field of intelligent control technology, and in particular to a wearable human-computer interaction device, a human-computer interaction system, and a human-computer interaction method.

Background Art

In recent years, with the development of technology, human-computer interaction has become more and more widely used. There are many existing ways of human-computer interaction. For example, one of them realizes human-computer interaction through sound and image; however, sound is easily disturbed by external noise and images are easily affected by ambient light, resulting in poor human-computer interaction.

Another human-computer interaction device uses an accelerometer and an artificial neural network to recognize the operator's arm posture and thereby control a robot. The accelerometer must be placed in a horizontal position and can only recognize simple actions, such as moving up, down, left, or right and rotating clockwise or counterclockwise; complex commands cannot be executed.

In yet another human-computer interaction device, the operator must wear myoelectric sensors attached to the arm, and the device judges the operator's actions, for example clenching a fist, extending a finger, or clapping, from the sensed myoelectric signals. However, myoelectric sensors are complicated to wear, since they must wrap around the operator's entire arm, and they are expensive and have poor sensing accuracy. In addition, since the operator outputs instructions by making a fist or extending fingers, only a few types of instructions can be output, which makes it difficult to meet the need to execute complex commands in practice.

Summary of the Invention

In view of this, the object of the present invention is to provide a wearable human-computer interaction device, a human-computer interaction system, and a human-computer interaction method that achieve high control precision and support the input of simple or complex control instructions.
The invention provides a human-computer interaction system comprising: an accelerometer for sensing the acceleration of a command input object along three mutually perpendicular axes; a magnetic field meter for sensing the earth magnetic field strength at the position of the command input object along three mutually perpendicular axes; a gyroscope for sensing the angular velocity of the command input object along three mutually perpendicular axes; and a controller that receives the data information of the acceleration, the earth magnetic field strength, the angular velocity, and the true earth magnetic field strength at the position of the command input object, and judges whether the earth magnetic field strength is normal. If the earth magnetic field strength is normal, the controller processes the data information of the acceleration and of the earth magnetic field strength sensed by the magnetic field meter, extracts and recognizes the control command input by the command input object, and outputs the recognized control command; if the earth magnetic field strength is not normal, the controller processes the data information of the acceleration and the angular velocity, extracts and recognizes the control command input by the command input object, and outputs the recognized control command.

In an illustrative embodiment of the human-computer interaction system, the controller includes:

a data preprocessing module that receives in real time the acceleration sensed by the accelerometer, the earth magnetic field strength sensed by the magnetic field meter, the angular velocity sensed by the gyroscope, and the true earth magnetic field strength at the position of the command input object; performs denoising and correction processing on the acceleration, the sensed earth magnetic field strength, and the angular velocity; and compares the sensed earth magnetic field strength with the true earth magnetic field strength to judge whether the earth's magnetic field is disturbed;

a feature extraction module that receives the acceleration, the earth magnetic field strength sensed by the magnetic field meter, and the angular velocity data processed by the data preprocessing module, and extracts feature data from them;

an instruction recognition module that receives the feature data extracted by the feature extraction module, recognizes the command gesture action made by the command input object according to the feature data, and outputs the control instruction corresponding to the command gesture action.

In an illustrative embodiment of the human-computer interaction system, the feature data is trajectory data or acceleration change data.

In an illustrative embodiment of the human-computer interaction system, the feature extraction module receives the data information output by the data preprocessing module, obtains three-dimensional posture information from it, calculates the trajectory data corresponding to the command gesture action in the earth coordinate system from the three-dimensional posture information and the acceleration data, defines a virtual operation plane, projects the trajectory data onto the virtual operation plane, and extracts feature data from the projected data.

In an illustrative embodiment of the human-computer interaction system, the instruction recognition module inputs the trajectory feature data into a command gesture recognition model obtained by offline training to recognize the command gesture action, and also searches the command gesture and control instruction association database to obtain the control instruction corresponding to the command gesture action.
In an illustrative embodiment, the human-computer interaction system further includes:

a memory in which the command recognition model and the command gesture and control instruction association database are stored; after power-on, the command recognition model and the association database are loaded into the controller.

In an illustrative embodiment, the human-computer interaction system further includes:

a temperature sensor for sensing the ambient temperature at the position of the command input object and transmitting it to the data preprocessing module, which can correct the acceleration sensed by the accelerometer, the earth magnetic field strength sensed by the magnetic field meter, and the angular velocity sensed by the gyroscope according to the ambient temperature.

In an illustrative embodiment, the human-computer interaction system further includes:

a feedback module: the control instruction output by the instruction recognition module is transmitted to a controlled device, and the feedback module receives the information fed back by the controlled device and judges whether the control instruction was received or correctly executed;

a pointing device;

a pointing device control module that controls the pointing device to issue a corresponding indication according to whether the feedback module judged the control instruction to have been received or correctly executed.

In an illustrative embodiment of the human-computer interaction system, the pointing device includes at least one of a vibration motor and an indicator light.

In an illustrative embodiment of the human-computer interaction system, the command input object is a human hand.
The present invention also provides a wearable human-computer interaction device comprising the human-computer interaction system of any of the above embodiments.

The invention further provides a human-computer interaction method comprising the following steps:

sensing the acceleration of a command input object along three mutually perpendicular axes, the earth magnetic field strength at the position of the command input object along three mutually perpendicular axes, and the angular velocity of the command input object along three mutually perpendicular axes;

comparing the sensed earth magnetic field strength with the true earth magnetic field strength at the position of the command input object to judge whether the earth magnetic field strength is normal;

when the earth magnetic field strength is normal, processing the data information of the acceleration and the earth magnetic field strength, extracting and recognizing the control command input by the command input object, and outputting the recognized control command;

when the earth magnetic field strength is not normal, processing the data information of the acceleration and the angular velocity, extracting and recognizing the control command input by the command input object, and outputting the recognized control command.

In an illustrative embodiment of the human-computer interaction method, denoising and correction processing is performed on the acceleration, the earth magnetic field strength, and the angular velocity.

In an illustrative embodiment, the human-computer interaction method further includes the following step:

sensing the ambient temperature at the position of the command input object and correcting the acceleration, the earth magnetic field strength, and the angular velocity according to the ambient temperature.

In an illustrative embodiment, the human-computer interaction method further includes the following steps:

transmitting the control command to a controlled device, which feeds back information on the execution of the control command;

judging whether the control command was received or correctly executed;

controlling a pointing device to issue a corresponding indication according to the result of that judgment.
As can be seen from the above solutions, in the wearable human-machine interaction apparatus, human-machine interaction system, and method of the present invention, the controller receives data on the acceleration of the command input object sensed by the accelerometer, the earth magnetic field strength sensed by the magnetometer, the angular velocity of the command input object sensed by the gyroscope, and the true earth magnetic field strength at the location of the command input object, and determines whether the earth magnetic field strength sensed by the magnetometer is normal. When the sensed earth magnetic field strength is normal, the controller processes the data of the acceleration and the sensed earth magnetic field strength to extract and recognize the control command input by the command input object; when the sensed earth magnetic field strength is abnormal, the controller instead processes the data of the acceleration and the angular velocity of the command input object. This recognition scheme maintains high control accuracy even when the magnetic field is disturbed, and supports the input of both simple and complex control commands. The control accuracy of the human-machine interaction system of the present invention is not affected by ambient noise or ambient light. Moreover, the wearable human-machine interaction apparatus of the present invention can be worn directly on the operator's wrist, so the operator can input commands with simple arm movements without touching a control panel; even when far from the control panel, the operator can still input control commands and keep the equipment running normally.
BRIEF DESCRIPTION OF THE DRAWINGS
Preferred embodiments of the present invention are described in detail below with reference to the accompanying drawings, so that the above and other features and advantages of the present invention will become clearer to those of ordinary skill in the art. In the drawings:
FIG. 1 is a functional schematic diagram of a human-machine interaction system according to an embodiment of the present invention.
FIG. 2 is a schematic architecture diagram of the human-machine interaction system of FIG. 1.
FIG. 3 is a startup flowchart of the human-machine interaction system shown in FIG. 2.
FIG. 4 is a flowchart of a command gesture motion pattern recognition thread in a human-computer interaction method according to an embodiment of the present invention.
FIG. 5 is a flowchart of posture recognition in the human-computer interaction method shown in FIG. 4.
FIG. 6 is a flowchart of command feedback in the human-computer interaction method shown in FIG. 4.
FIG. 7 is a flowchart of an offline training process for the command recognition model used in the human-computer interaction method shown in FIG. 4.
DETAILED DESCRIPTION
In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to embodiments.
FIG. 1 is a functional schematic diagram of a human-machine interaction system according to an embodiment of the present invention, and FIG. 2 is a schematic architecture diagram of the human-machine interaction system of FIG. 1. Referring to FIG. 1 and FIG. 2, the human-machine interaction system 10 of this embodiment includes an accelerometer 12, a magnetometer 13, a gyroscope 14, and a controller 15, where the accelerometer 12, the magnetometer 13, and the gyroscope 14 are all electrically connected to the controller 15. The human-machine interaction system 10 can sense the global posture in real time, has high gesture recognition accuracy, and can be used to input commands and control mechanical motion.
Specifically, the accelerometer 12 is a three-axis accelerometer that senses the acceleration of the command input object along three mutually perpendicular axes (e.g., the X, Y, and Z axes) and transmits the sensed acceleration to the controller 15. The magnetometer 13 is a three-axis magnetometer that senses the earth magnetic field strength along the three mutually perpendicular axes and transmits it to the controller 15. The gyroscope 14 is a three-axis gyroscope that senses the angular velocity of the command input object about the three mutually perpendicular axes and transmits it to the controller 15. The acceleration, the earth magnetic field strength, and the angular velocity are all vectors. In this embodiment, the command input object is a human hand, but the invention is not limited thereto.
Determining the posture of a human hand requires two kinds of information: the earth magnetic field strength and the acceleration of the hand. Both are vectors with X, Y, and Z components, and together they yield the three-dimensional posture of the hand. When the earth's magnetic field is undisturbed by the environment, the data sensed by the magnetometer 13 is fairly accurate and can be used for hand posture recognition. However, the earth magnetic field strength can be affected by the environment; for example, near a stainless steel door frame the earth's magnetic field is disturbed. In that case the earth magnetic field strength sensed by the magnetometer 13 is inaccurate, which impairs subsequent hand posture recognition. When the earth's magnetic field is disturbed, the gyroscope 14 is used instead of the magnetometer 13; that is, the gyroscope 14 compensates for the magnetic field error, and its data is used for hand posture recognition. When the gyroscope 14 runs for a long time, its accuracy degrades due to drift error, so once the earth's magnetic field returns to normal, the system must switch back to magnetometer mode, i.e., the data sensed by the magnetometer 13 is again used for hand posture recognition.
The controller 15 receives the data of the acceleration, the earth magnetic field strength, the angular velocity, and the true earth magnetic field strength at the location of the hand, and determines whether the sensed earth magnetic field strength is normal. If the sensed earth magnetic field strength is normal, the controller 15 processes the data of the acceleration and the earth magnetic field strength sensed by the magnetometer to extract and recognize the control command input by the hand, and then outputs the recognized control command; if it is abnormal, the controller 15 instead processes the data of the acceleration and the angular velocity, and then outputs the recognized control command.
The true earth magnetic field strength is data measured offline, namely the true earth magnetic field strength at each latitude and longitude on the earth. It may be preset in the controller 15, or stored in a memory and loaded when used.
The controller 15 includes a data preprocessing module 152, a feature extraction module 153, and a command recognition module 154. The data preprocessing module 152 communicates with the accelerometer 12, the magnetometer 13, and the gyroscope 14, receiving in real time the acceleration sensed by the accelerometer 12, the earth magnetic field strength sensed by the magnetometer 13, the angular velocity sensed by the gyroscope 14, and the true earth magnetic field strength at the location of the hand. It denoises and corrects the acceleration, the sensed earth magnetic field strength, and the angular velocity, and compares the sensed earth magnetic field strength with the true earth magnetic field strength to determine whether the earth's magnetic field is disturbed. The correction processing mainly includes noise filtering, singular-point (outlier) removal, and temperature error compensation of the acceleration, the sensed earth magnetic field strength, and the angular velocity.
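The preprocessing operations named here (noise filtering, singular-point removal) could look like the following sketch. The moving-average filter, the 3-sigma outlier rule, and all parameter values are illustrative assumptions; the patent names the operations but not the algorithms.

```python
import numpy as np

def preprocess(samples, window=5, outlier_sigma=3.0):
    """Remove singular points (samples more than `outlier_sigma` standard
    deviations from the mean on any axis), then smooth each axis with a
    moving average. `samples` is an (N, 3) sequence of sensor vectors."""
    x = np.asarray(samples, dtype=float)
    mu, sigma = x.mean(axis=0), x.std(axis=0)
    keep = np.all(np.abs(x - mu) <= outlier_sigma * sigma + 1e-12, axis=1)
    x = x[keep]
    kernel = np.ones(window) / window
    # Smooth each of the three axes independently.
    return np.column_stack([np.convolve(x[:, i], kernel, mode="same")
                            for i in range(x.shape[1])])
```

A real implementation would tune the window length and outlier threshold per sensor; these values are placeholders.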
Specifically, whether the earth's magnetic field is disturbed is determined by comparing the earth magnetic field strength sensed by the magnetometer 13 with the true earth magnetic field strength: if the error between the two is less than or equal to a preset threshold, the earth's magnetic field is judged to be undisturbed; if the error is greater than the preset threshold, the earth's magnetic field is judged to be disturbed.
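The threshold comparison and the resulting sensor selection described above can be sketched as follows. This is a minimal illustration: the vector-norm error metric, the threshold value, and the function names are assumptions, not taken from the patent.

```python
import numpy as np

def magnetic_field_disturbed(sensed_b, true_b, threshold=5.0):
    """Compare the sensed field vector with the offline reference field for
    the current latitude/longitude. The threshold unit and value (here an
    arbitrary 5.0) would be chosen per deployment."""
    error = np.linalg.norm(np.asarray(sensed_b) - np.asarray(true_b))
    return error > threshold

def select_attitude_source(sensed_b, true_b):
    """Magnetometer + accelerometer when the field is clean; gyroscope +
    accelerometer otherwise, since the gyroscope drifts over time."""
    if magnetic_field_disturbed(sensed_b, true_b):
        return "gyroscope"
    return "magnetometer"
```

Because gyroscope drift accumulates, a real system would switch back to the magnetometer branch as soon as this check passes again, as the text describes.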
The feature extraction module 153 receives the acceleration, the earth magnetic field strength sensed by the magnetometer 13, and the angular velocity processed by the data preprocessing module 152, and extracts feature data from them.
Specifically, the feature extraction module 153 receives the data output by the data preprocessing module 152 and obtains three-dimensional posture information from it. From the three-dimensional posture information and the acceleration data, the trajectory of the command gesture in the earth coordinate system is computed (by double integration). A virtual operation plane can then be defined (such that all trajectory points lie close to the plane), the trajectory data is projected onto the virtual operation plane, feature data is extracted from the projected data, and the feature data is input into the command gesture recognition model obtained by offline training to recognize the command gesture. The operation plane is a two-dimensional plane: when writing a command, a person generally moves on a two-dimensional plane, but that plane is not necessarily perpendicular to the horizontal plane. That is, in general a written command is three-dimensional but lies roughly on a two-dimensional plane.
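The virtual-operation-plane step can be illustrated with a least-squares plane fit. The SVD-based method and the helper name are assumptions; the patent states only that a plane close to all trajectory points is chosen, not how it is fitted.

```python
import numpy as np

def project_to_operation_plane(trajectory):
    """Fit the best-fitting (least-squares) plane to a 3-D trajectory and
    return 2-D coordinates within that plane. The first two right-singular
    vectors of the centered points span the plane; the third is the plane
    normal (the direction of least variance)."""
    pts = np.asarray(trajectory, dtype=float)
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:2].T
```

For points that already lie in a plane, this projection preserves in-plane distances exactly, so the gesture shape is not distorted.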
It should be noted that when the earth's magnetic field is undisturbed, the three-dimensional posture information is obtained from the acceleration data sensed by the accelerometer 12 and the earth magnetic field strength data sensed by the magnetometer 13; when the earth's magnetic field is disturbed, it is obtained from the acceleration data sensed by the accelerometer 12 and the angular velocity data sensed by the gyroscope 14, but the invention is not limited thereto. If the written command is distributed in three-dimensional space, the feature extraction module 153 can likewise extract feature data corresponding to the three-dimensional command gesture.
The command recognition module 154 is electrically connected to the feature extraction module 153. It receives the feature data extracted by the feature extraction module 153 and inputs it into the command gesture recognition model obtained by offline training to recognize the command gesture made by the command input object; it then looks up the gesture-to-command association database to obtain the control command corresponding to the recognized gesture, and transmits the control command to the controlled device. The feature data is a set of data representing a motion trajectory pattern; it may be trajectory data, acceleration change data, or another representation.
Specifically, in this embodiment, the command recognition module 154 inputs the trajectory feature data into the command gesture recognition model obtained by offline training to recognize the command gesture, for example, recognizing from the trajectory feature data whether the gesture is "S" or "A". The command recognition module 154 then determines whether the recognized gesture corresponds to a control command in the association database, thereby obtaining the control command, and transmits it to the controlled device. The association database may contain a preset table mapping command gestures to control commands, with different gestures corresponding to different control commands.
It should be noted that the command gesture model is used to recognize command gestures; it is obtained by offline training on collected data and is loaded into the controller 15 at run time. The command recognition module 154 inputs the feature data extracted by the feature extraction module 153 into the model to recognize the gesture. Once a gesture is recognized, the corresponding control command can be looked up in the gesture-to-command association database, which is defined manually; for example, the gesture "A" may correspond to the control command "move left" or "move right".
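A manually defined association database of this kind can be as simple as a lookup table. The sketch below reuses the "A" example from the text; the other entries and the helper name are illustrative assumptions.

```python
# Hypothetical gesture-to-command association table. Only the "A" mapping
# is taken from the text; the other entries are examples.
GESTURE_COMMANDS = {
    "A": "move left",
    "S": "stop",
    "L": "move right",
}

def command_for_gesture(gesture):
    """Return the control command for a recognized gesture, or None when
    the gesture has no entry in the association database."""
    return GESTURE_COMMANDS.get(gesture)
```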
In addition, command gesture recognition may be based on trajectory feature data or on other methods, for example, on the direction-change information of the acceleration along the motion trajectory: for the gesture "L", the direction of the acceleration is first downward and then to the right. Accordingly, the input to the command recognition model is then no longer trajectory information but acceleration change information.
Further, the human-machine interaction system 10 also includes a temperature sensor 16 and an indicating device 17. The temperature sensor 16 senses the ambient temperature at the location of the hand and transmits it to the data preprocessing module 152 of the controller 15, which can correct the acceleration sensed by the accelerometer 12, the earth magnetic field strength sensed by the magnetometer 13, and the angular velocity sensed by the gyroscope 14 according to the sensed ambient temperature.
The temperature sensor 16 senses the ambient temperature for calibration and error correction. The sensing accuracy of the accelerometer 12, the magnetometer 13, and the gyroscope 14 is all affected by temperature, so pre-calibration or dynamic calibration is needed. The accelerometer 12, the magnetometer 13, and the gyroscope 14 undergo calibration tests before leaving the factory, which establish the deviation between measured and true values at different temperatures; the measured values can then be calibrated according to the temperature sensed by the temperature sensor 16, bringing them closer to the true values. Taking the accelerometer 12 as an example: at 10 °C the accelerometer 12 reads 10 m/s² while the true value is 9 m/s², an error of 1 m/s²; at 20 °C it reads 10 m/s² while the true value is 9.5 m/s², an error of 0.5 m/s². Calibration removes this error, i.e., the measured value is corrected to obtain a calibrated value equal to the true value.
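The factory calibration and temperature correction described here can be sketched as follows, reusing the numeric example from the text. The table-based representation and the linear interpolation between calibration points are assumptions; the patent does not specify the correction method.

```python
# Hypothetical per-device calibration table: temperature (degrees C) -> bias.
# The values mirror the example in the text: the accelerometer reads
# 1.0 m/s^2 high at 10 degrees C and 0.5 m/s^2 high at 20 degrees C.
CAL_TABLE = {10.0: 1.0, 20.0: 0.5}

def temperature_corrected(measured, temp_c):
    """Subtract the bias interpolated linearly between the nearest
    calibration points; temperatures outside the table are clamped."""
    temps = sorted(CAL_TABLE)
    lo = max((t for t in temps if t <= temp_c), default=temps[0])
    hi = min((t for t in temps if t >= temp_c), default=temps[-1])
    if lo == hi:
        bias = CAL_TABLE[lo]
    else:
        frac = (temp_c - lo) / (hi - lo)
        bias = CAL_TABLE[lo] + frac * (CAL_TABLE[hi] - CAL_TABLE[lo])
    return measured - bias
```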
After a control command is sent to the controlled device, the operator needs to know whether the command was received and, if received, whether it was executed correctly; the corresponding reception and execution information is indicated by the indicating device 17. With the indicating device 17 the operator does not need to keep watching the operation panel, and even when far from the panel the operator can still learn whether a control command has been received or executed.
In this embodiment, the indicating device 17 includes a vibration motor 172 and an indicator light 173. Correspondingly, the controller 15 further includes a feedback module 155, a first indicating-device control module 156, and a second indicating-device control module 157. The control command output by the command recognition module 154 is transmitted to the controlled device, and the feedback module 155 receives the information fed back by the controlled device and determines whether the control command was received or correctly executed. The first indicating-device control module 156 is electrically connected to the vibration motor 172 and the feedback module 155, and the second indicating-device control module 157 is electrically connected to the indicator light 173 and the feedback module 155. The first and second indicating-device control modules 156, 157 control the indicating device 17 to issue the corresponding indication according to the feedback module 155's determination.
The vibration mode of the vibration motor 172 and the indication mode of the indicator light 173 can be set arbitrarily according to actual needs. For example, when a control command is executed correctly, the first indicating-device control module 156 controls the vibration motor 172 to emit two vibrations with a long interval between them; when execution of a control command fails, it controls the vibration motor 172 to vibrate frequently and continuously.
Likewise, the indicator light 173 may be set to three colors: red, yellow, and green, where red indicates an execution error (an alarm signal), green indicates that the control command was executed correctly, and yellow indicates that the control command was not received. The indicating device 17 includes at least one of the vibration motor 172 and the indicator light 173; that is, they may be used together or individually. When the vibration motor 172 is used, the first indicating-device control module 156 is used correspondingly; when the indicator light 173 is used, the second indicating-device control module 157 is used correspondingly.
The indicating device 17 may also be a code table, through which multiple feedback status codes can be set; for example, 01 indicates success, 02 indicates an error, and 03 indicates an alarm.
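The status codes and indicator semantics given in the text can be combined into a small dispatch sketch. The exact pairing of codes to LED colors and vibration patterns is illustrative; the patent leaves these modes configurable.

```python
# Feedback status codes from the text: 01 success, 02 error, 03 alarm.
STATUS_CODES = {"01": "success", "02": "error", "03": "alarm"}

def indicate(status_code):
    """Return (LED color, vibration pattern) for a feedback status code.
    Green = executed correctly, red = error/alarm, yellow = not received;
    an unknown code is treated as 'command not received'."""
    meaning = STATUS_CODES.get(status_code, "not received")
    if meaning == "success":
        return "green", "two long-interval pulses"
    if meaning == "not received":
        return "yellow", "frequent continuous"
    return "red", "frequent continuous"
```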
The human-machine interaction system 10 further includes a memory 18 and a power supply 19. The memory 18 can be connected to the controller 15, and the controller 15 can obtain stored information from the memory 18. For example, the command recognition model and the gesture-to-command association database may be stored in the memory 18 and loaded into the controller 15 after power-up; the vibration mode of the vibration motor 172 and the indication mode of the indicator light 173 may also be stored in the memory 18, and after reading them the controller 15 performs control according to the corresponding mode. The power supply 19 powers the entire human-machine interaction system 10. As shown in FIG. 2, the power supply 19 is electrically connected to the controller 15, and the accelerometer 12, the magnetometer 13, the gyroscope 14, the temperature sensor 16, and the indicating device 17 are powered from the pins of the controller 15. In other embodiments, they may instead be powered directly from leads of the power supply 19.
It should be noted that the command recognition model may also be preset in the controller 15, and the controller 15 may communicate with the accelerometer 12, the magnetometer 13, the gyroscope 14, the temperature sensor 16, the indicating device 17, and the memory 18 via Bluetooth Low Energy (Bluetooth 4.0), but the communication method is not limited thereto.
The present invention further provides a wearable human-machine interaction apparatus that includes the human-machine interaction system 10 described above. The wearable apparatus may be made as a wristband that the operator wears directly on the wrist. In some working environments, such as when carrying heavy objects or guarding against radiation, the operator must wear gloves and therefore cannot operate a control panel directly by hand; with the wearable human-machine interaction apparatus, control commands can be input and the equipment kept running normally without touching the control panel.
FIG. 3 is a startup flowchart of the human-machine interaction system shown in FIG. 2. Before normal use, the human-machine interaction system 10 performs the steps shown in FIG. 3:
Step S21: start;
Step S22: hardware initialization, for example, initialization of communication interfaces such as IIC (Inter-Integrated Circuit), SPI (Serial Peripheral Interface), and GPIO (General Purpose Input Output), and initialization of the accelerometer 12, the magnetometer 13, and the gyroscope 14; the IIC, SPI, GPIO, accelerometer 12, magnetometer 13, and gyroscope 14 are powered on in preparation for normal operation;
Step S23: define global variables; this step stores the global variables used by the software at run time, for example, buffering sensor data over a period of time, storing the earth coordinate system (x, y, z), and storing the virtual operation plane coordinate system (x, y, z);
Step S24: parameter initialization, for example, loading the command recognition model, the gesture-to-command association database, the vibration mode of the vibration motor 172, and the indication mode of the indicator light 173; the association database may include a preset table mapping command gestures to control commands;
Step S25: start the real-time data processing threads, for example, starting the feature extraction module 153 and the command recognition module 154; the feature extraction module 153 extracts trajectory feature data, and the command recognition module 154 inputs the extracted trajectory feature data into the command recognition model, recognizes the command gesture, and obtains the corresponding control command by looking it up in the association database. In other embodiments, the feature extraction module 153 may also extract other types of feature data.
Referring again to FIG. 1, once the human-machine interaction system 10 has started, the steps of the human-computer interaction method can be executed. The human-computer interaction method of the present invention includes the following steps:
感测指令输入物沿着三个相互垂直的轴线方向上的加速度、所述指令输入物所在位置沿着三个相互垂直的轴线方向上的地球磁场强度、所述指令输入物沿着三个相互垂直的轴线方向上的角速度;Sensing the acceleration of the command input in three mutually perpendicular axis directions, the position of the command input object along three mutually perpendicular axes, the command input along three mutual Angular velocity in the direction of the vertical axis;
将所述地球磁场强度和所述指令输入物所在位置的真实地球磁场强度进行比较,判断所述地球磁场强度是否正常; Comparing the strength of the earth magnetic field with the true earth magnetic field strength of the position where the command input object is located, and determining whether the earth magnetic field strength is normal;
当所述地球磁场强度正常时,对所述加速度及所述地球磁场强度的数据信息进行处理、提取并识别所述指令输入物所输入的控制指令,再将识别出的控制指令输出;When the strength of the earth magnetic field is normal, processing, extracting and identifying the control information input by the command input object, and outputting the recognized control command;
若所述地球磁场强度不正常时,对所述加速度及所述角速度的数据信息进行处理、提取并识别所述指令输入物所输入的控制指令,再将识别出的控制指令输出。If the strength of the earth magnetic field is abnormal, the data information of the acceleration and the angular velocity is processed, extracted, and the control command input by the command input object is recognized, and the recognized control command is output.
In addition to the above steps, the human-computer interaction method of the present invention further includes the following steps:
performing denoising and correction on the acceleration, the earth magnetic field strength, and the angular velocity;
sensing the ambient temperature at the location of the command input object, and correcting the acceleration, the earth magnetic field strength, and the angular velocity according to the ambient temperature;
transmitting the control instruction to a controlled device, which feeds back information on how the control instruction was executed;
determining whether the control instruction was received or correctly executed;
controlling the indicating device 17 to issue a corresponding indication according to the result of determining whether the control instruction was received or correctly executed.
More specifically, FIG. 4 is a flowchart of the command-gesture motion-pattern recognition thread in the human-computer interaction method according to one embodiment of the present invention. Referring to FIG. 4, FIG. 1, and FIG. 2, the command-gesture motion-pattern recognition thread includes the following steps:
Step S31: start;
Step S32: reconstruct the motion trajectory. The feature extraction module 153 receives the acceleration, the earth magnetic field strength sensed by the magnetometer 13, and the angular velocity, all processed by the data preprocessing module 152, and derives three-dimensional attitude information from these data. Based on the attitude information, the acceleration data of the command gesture are transformed into the earth coordinate system and then integrated twice (the first integration yields velocity data, the second yields trajectory data), producing the movement trajectory data of the command gesture;
Step S33: determine whether a segmentation point of the motion trajectory is found. The feature extraction module 153 checks the movement trajectory data for a segmentation point; if one is found, step S34 is executed, otherwise step S32 is executed. Step S32 yields the movement trajectory data of the command gesture, but the input may be a group of several command gestures, for example the two gestures "S" and "A" entered in succession. In that case the boundary between the "S" and "A" trajectories must be found so that the trajectory data can be split into two groups, one per command gesture;
Step S34: define a virtual operation plane. For each group of trajectory data obtained in step S33, the feature extraction module 153 defines a corresponding virtual operation plane such that most of that group's trajectory data lie on, or very close to, the plane;
Step S35: extract feature data. The feature extraction module 153 projects each group of trajectory data (corresponding to one gesture command) onto its virtual operation plane and extracts feature data from the projected trajectory for gesture recognition. The extracted feature data may be trajectory features, acceleration-change data, and so on;
Step S36: recognize the command gesture. The instruction recognition module 154 recognizes the command gesture from the feature data extracted by the feature extraction module 153, feeding the features from step S35 into the command recognition model obtained by offline training and outputting the corresponding gesture, for example recognizing the gesture as "S";
Step S37: look up the command-gesture/control-instruction association database and determine whether the recognized gesture matches a control instruction; if so, execute step S38, otherwise execute step S39. The instruction recognition module 154 checks whether the gesture corresponds to a control instruction in the association database and, if so, obtains that control instruction;
Step S38: issue the control instruction;
Step S39: buffer the command gesture. The human-computer interaction system 10 supports complex control-instruction input: the gesture "S" may represent a control instruction on its own, or it may combine with other letters, as in "SA", to represent one. In the latter case "S" is one segment and "A" another; when "S" is recognized but no matching control instruction is found, "S" is buffered first, and if "A" is recognized in the next round, the control instruction corresponding to "SA" is retrieved. After step S39, step S32 is executed again.
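Steps S32, S34, and S35 can be sketched as follows: twice-integrate earth-frame acceleration to get a trajectory, then fit and project onto a virtual operation plane. This is a minimal sketch under simplifying assumptions: the input samples are taken to be already rotated into the earth frame with gravity removed, and the least-squares plane fit via SVD is one reasonable reading of "a plane such that most trajectory data lie close to it", not necessarily the patented method.

```python
import numpy as np

def reconstruct_trajectory(accel_earth, dt):
    """Step S32: first integration gives velocity, second gives the
    gesture trajectory (rectangular-rule integration for brevity)."""
    velocity = np.cumsum(accel_earth * dt, axis=0)
    trajectory = np.cumsum(velocity * dt, axis=0)
    return trajectory

def project_onto_virtual_plane(points):
    """Steps S34/S35: fit a plane through the 3-D trajectory points by
    SVD of the centered point cloud, then express the points in the two
    in-plane principal directions (an N x 2 'handwriting' trace)."""
    centroid = points.mean(axis=0)
    # Rows of vt are principal directions; the last one is the normal.
    _, _, vt = np.linalg.svd(points - centroid)
    basis = vt[:2]                        # two in-plane axes
    return (points - centroid) @ basis.T  # N x 2 projected trajectory
```

In practice the double integration drifts quadratically with time, which is one reason the thread segments the input into short per-gesture strokes before projecting and extracting features.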
FIG. 5 is a flowchart of attitude recognition in the human-computer interaction method shown in FIG. 4. Referring to FIG. 5, FIG. 1, and FIG. 2, attitude recognition serves feature data extraction and hence command-gesture recognition, and includes the following steps:
Step S41: start;
Step S42: data preprocessing. The data preprocessing module 152 denoises and corrects the acceleration sensed by the accelerometer 12, the earth magnetic field strength sensed by the magnetometer 13, and the angular velocity sensed by the gyroscope 14;
Step S43: determine whether the earth's magnetic field is disturbed; if so, execute step S44, otherwise step S45. The true earth magnetic field strength for each latitude and longitude can be preset in the controller 15 or stored in the memory 18. The data preprocessing module 152 compares the field strength sensed by the magnetometer 13 with the true field strength: if their difference is less than or equal to a preset threshold, the field is judged undisturbed; if the difference exceeds the threshold, the field is judged disturbed;
Step S44: recognize the command gesture from the data sensed by the accelerometer 12 and the gyroscope 14;
Step S45: recognize the command gesture from the data sensed by the accelerometer 12 and the magnetometer 13;
Step S46: output the attitude recognition information. Through the attitude recognition flow, the data preprocessing module 152 passes the preprocessed acceleration, the earth magnetic field strength sensed by the magnetometer 13, and the angular velocity data to the feature extraction module 153, which extracts feature data for gesture recognition. Attitude recognition runs in a continuous loop: once the attitude at one instant is recognized, the flow returns to step S42 to recognize the attitude at the next instant, and so on.
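The preprocessing of step S42 and the temperature correction described earlier can be sketched as below. The patent does not fix a denoising method or a bias model, so the moving-average filter and the linear temperature-bias model here are stand-ins; all coefficients are illustrative.

```python
import numpy as np

def denoise(samples, window=5):
    """Moving-average filter over a 1-D stream of sensor samples --
    one simple choice for the denoising step (S42)."""
    kernel = np.ones(window) / window
    return np.convolve(samples, kernel, mode="same")

def temperature_correct(samples, temp_c, bias_per_deg=0.001, ref_temp_c=25.0):
    """Remove an assumed linear temperature-dependent sensor bias,
    using the ambient temperature from the temperature sensor."""
    return samples - bias_per_deg * (temp_c - ref_temp_c)
```

MEMS accelerometers, magnetometers, and gyroscopes all exhibit temperature-dependent bias, which is why the ambient-temperature reading feeds into the correction of all three sensor streams.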
FIG. 6 is a flowchart of instruction feedback in the human-computer interaction method shown in FIG. 4. Referring to FIG. 6, FIG. 1, and FIG. 2, instruction feedback in the human-computer interaction method includes the following steps:
Step S51: start;
Step S52: parse the feedback. The feedback module 155 receives the information fed back by the controlled device, determines whether the control instruction was correctly executed, and passes the analysis result to the first indicating-device control module 156 and the second indicating-device control module 157;
Step S53: determine the vibration mode. The first indicating-device control module 156 determines from the analysis result of the feedback module 155 in which mode the vibration motor 172 should vibrate; for example, when the control instruction was correctly executed, the vibration motor 172 emits two vibrations with a long interval between them, and when execution failed, it vibrates frequently and continuously;
Step S54: drive the vibration motor 172. The first indicating-device control module 156 makes the vibration motor 172 vibrate in the determined mode;
Step S55: determine the indication mode of the indicator light 173. The second indicating-device control module 157 determines from the analysis result of the feedback module 155 how the indicator light 173 should indicate; for example, red indicates an error in executing the control instruction, green indicates correct execution, and yellow indicates that the control instruction was not received;
Step S56: light the indicator 173. The second indicating-device control module 157 makes the indicator light 173 illuminate in the determined indication mode.
It should be noted that after steps S54 and S56 have been executed, the flow returns to step S52 and loops. Steps S53 and S55 may be executed simultaneously, as may steps S54 and S56, but the method is not limited to this: in other embodiments, steps S53 and S54 may be omitted, or steps S55 and S56 may be omitted.
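The feedback-to-indication mapping above amounts to a small lookup table. A minimal sketch, with the pattern encodings and outcome names chosen here for illustration only:

```python
# Map feedback outcomes to indication patterns, mirroring the examples
# in steps S53 and S55: two spaced vibrations / continuous vibration,
# and green / red / yellow LED states.
INDICATION_MODES = {
    "executed":     {"vibration": "two_long_interval", "led": "green"},
    "exec_error":   {"vibration": "continuous",        "led": "red"},
    "not_received": {"vibration": None,                "led": "yellow"},
}

def indicate(feedback_outcome):
    """Return the (vibration, led) settings for one parsed feedback
    message; on the device, module 156 would drive motor 172 and
    module 157 would drive indicator 173 with these settings."""
    mode = INDICATION_MODES[feedback_outcome]
    return mode["vibration"], mode["led"]

assert indicate("exec_error") == ("continuous", "red")
```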
FIG. 7 is a flowchart of the offline training process that produces the command recognition model used in the human-computer interaction method of FIG. 4. Even when different operators input the same character, the starting stroke, inclination, and curvature may differ, so which control instruction each command gesture in the model library represents must be trained offline in advance to accommodate complex and varied usage. After a command gesture is recognized, it is compared against the information in the command-gesture/control-instruction association database to determine which control instruction it represents; the command gestures in the database can be set arbitrarily according to actual needs.
The formation of the command recognition model includes the following steps:
Step S61: start;
Step S62: create a new motion pattern;
Step S63: collect sensor data for a single command gesture. The accelerometer 12 senses the acceleration of the hand, the magnetometer 13 senses the earth magnetic field strength at the hand's location, and the gyroscope 14 senses the angular velocity of the hand; for one and the same command gesture, the corresponding sensor data are collected multiple times;
Step S64: data preprocessing and feature extraction. The data preprocessing module 152 denoises and corrects the hand acceleration sensed by the accelerometer 12, the field strength sensed by the magnetometer 13, and the angular velocity sensed by the gyroscope 14. The feature extraction module 153 receives the processed acceleration, field strength, and angular velocity and extracts feature data, which may be trajectory data, acceleration-change data, and so on. As before, the extraction proceeds as follows: the module receives the output of the data preprocessing module 152, derives three-dimensional attitude information, computes from the attitude and acceleration data the trajectory of the command gesture in the earth coordinate system (by double integration), defines a virtual operation plane (such that all trajectory data lie close to it), projects the trajectory data onto the plane, and extracts feature data from the projected data;
Step S65: train the command-gesture recognition model from the feature data extracted for the single command gesture; training methods such as hidden Markov models may be used;
Step S66: determine whether another command-gesture model needs to be added; if so, execute step S62 again, otherwise execute step S67. The human-computer interaction system 10 supports complex control-instruction input: when "SA" represents one recognition model, the model for "S" is trained first, and step S62 is executed again to train the model for "A". In other words, if an instruction corresponds to several command gestures, data collection continues for the next gesture and the corresponding model is trained;
Step S67: combine the command-gesture recognition models; for example, the models for "S" and "A" are combined into the model "SA", supporting recognition of complex command gestures;
Step S68: associate command gestures with control instructions; the command-gesture/control-instruction association database records which control instruction each command gesture represents;
Step S69: form the command recognition model and the command-gesture/control-instruction association database, and store the trained command recognition model together with the database.
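The training loop of steps S62 through S69 can be illustrated end to end. The patent trains hidden Markov models (step S65); the nearest-template classifier below is a deliberately simpler stand-in that only shows the shape of the flow, and the gesture names, feature vectors, and control-instruction strings are all invented for the example.

```python
import numpy as np

def train_models(samples_by_gesture):
    """Steps S63-S65 (simplified): average the feature vectors collected
    repeatedly for each gesture into one template per gesture."""
    return {g: np.mean(feats, axis=0)
            for g, feats in samples_by_gesture.items()}

def recognize(models, features):
    """Return the gesture whose template is nearest the new features."""
    return min(models, key=lambda g: np.linalg.norm(models[g] - features))

# Step S68: command-gesture/control-instruction association database
# (contents arbitrary, set according to actual needs).
COMMANDS = {"S": "START_PUMP", "SA": "STOP_LINE_A"}

models = train_models({
    "S": [np.array([1.0, 0.0]), np.array([0.9, 0.1])],
    "A": [np.array([0.0, 1.0]), np.array([0.1, 0.9])],
})
assert recognize(models, np.array([0.95, 0.05])) == "S"
assert COMMANDS["S"] == "START_PUMP"
```

A hidden Markov model replaces the template average with per-gesture state-sequence statistics, which tolerates the stroke-order and timing variation between operators that motivates offline training in the first place.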
The wearable human-machine interaction apparatus, human-computer interaction system, and method of the present invention have at least the following advantages:
1. In the wearable human-machine interaction apparatus, human-computer interaction system, and method of the present invention, the controller receives the acceleration of the command input object sensed by the accelerometer, the earth magnetic field strength sensed by the magnetometer, the angular velocity of the command input object sensed by the gyroscope, and the true earth magnetic field strength at the location of the command input object, and determines whether the sensed field strength is normal. When the sensed field strength is normal, the controller processes the acceleration and field-strength data to extract and recognize the control instruction input by the command input object; when it is abnormal, the controller instead processes the acceleration and angular-velocity data. Recognizing control instructions in this way retains high control accuracy even when the magnetic field is disturbed, and supports the input of both simple and complex control instructions. The control accuracy of the human-computer interaction system of the present invention is unaffected by outside noise or ambient light. In addition, the wearable human-machine interaction apparatus of the present invention can be worn directly on the operator's wrist: control is achieved by a simple wave of the arm, without touching a control panel, so control instructions can still be input, and the equipment kept running normally, even when the operator is far from the panel.
2. In one embodiment of the wearable human-machine interaction apparatus, human-computer interaction system, and method of the present invention, the human-computer interaction system further includes a temperature sensor and a data preprocessing module. The temperature sensor senses the ambient temperature for calibration and correction: the data preprocessing module corrects the acceleration sensed by the accelerometer, the earth magnetic field strength sensed by the magnetometer, and the angular velocity sensed by the gyroscope according to the sensed ambient temperature, which helps improve control accuracy.
3. In one embodiment of the wearable human-machine interaction apparatus, human-computer interaction system, and method of the present invention, the system is preconfigured with a command-gesture/control-instruction association database. After a command gesture is recognized, it is looked up in the database to determine which control instruction it represents, so even a fairly complex command gesture maps readily to its corresponding control instruction.
4. In one embodiment of the wearable human-machine interaction apparatus, human-computer interaction system, and method of the present invention, the system supports complex control-instruction input: when a command gesture comprises several segments, the segments recognized so far are buffered first, and once the remaining segments are recognized, all segments are combined to find the control instruction corresponding to the command gesture.
5. In one embodiment of the wearable human-machine interaction apparatus, human-computer interaction system, and method of the present invention, the command recognition model is formed by offline training; the command gestures, the control instructions, and the correspondence between them can be set arbitrarily according to actual needs, accommodating complex and varied usage conveniently and flexibly.
The above are merely preferred embodiments of the present invention and are not intended to limit it; any modification, equivalent substitution, or improvement made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (15)

  1. A human-computer interaction system (10), characterized in that the human-computer interaction system (10) comprises:
    an accelerometer (12) for sensing the acceleration of a command input object along three mutually perpendicular axes;
    a magnetometer (13) for sensing the earth's magnetic field strength at the location of the command input object along three mutually perpendicular axes;
    a gyroscope (14) for sensing the angular velocity of the command input object about three mutually perpendicular axes;
    a controller (15) that receives the data of the acceleration, the earth magnetic field strength, the angular velocity, and the true earth magnetic field strength at the location of the command input object, and determines whether the earth magnetic field strength is normal; if the earth magnetic field strength is normal, the controller (15) processes the data of the acceleration and the field strength sensed by the magnetometer to extract and recognize the control instruction input by the command input object, and then outputs the recognized control instruction; if the earth magnetic field strength is abnormal, the controller (15) processes the data of the acceleration and the angular velocity to extract and recognize the control instruction input by the command input object, and then outputs the recognized control instruction.
  2. The human-computer interaction system (10) according to claim 1, characterized in that the controller (15) comprises:
    a data preprocessing module (152) that receives in real time the acceleration sensed by the accelerometer (12), the earth magnetic field strength sensed by the magnetometer (13), the angular velocity sensed by the gyroscope (14), and the true earth magnetic field strength at the location of the command input object, denoises and corrects the acceleration, the field strength sensed by the magnetometer (13), and the angular velocity, and compares the field strength sensed by the magnetometer (13) with the true earth magnetic field strength to determine whether the earth's magnetic field is disturbed;
    a feature extraction module (153) that receives the acceleration, the field strength sensed by the magnetometer (13), and the angular velocity processed by the data preprocessing module (152), and extracts feature data therefrom;
    an instruction recognition module (154) that receives the feature data extracted by the feature extraction module (153), recognizes from the feature data the command gesture performed by the command input object, and outputs the control instruction corresponding to the command gesture.
  3. The human-computer interaction system (10) according to claim 2, characterized in that the feature data are trajectory data or acceleration-change data.
  4. The human-computer interaction system (10) according to claim 2, characterized in that the feature extraction module (153) receives the data output by the data preprocessing module (152), derives three-dimensional attitude information therefrom, computes from the attitude information and the acceleration data the trajectory corresponding to the command gesture in the earth coordinate system, defines a virtual operation plane, projects the trajectory data onto the virtual operation plane, and extracts feature data from the projected data.
  5. The human-computer interaction system (10) according to claim 2, characterized in that the instruction recognition module (154) inputs trajectory feature data into a command-gesture recognition model obtained by offline training to recognize the command gesture, and is further configured to look up a command-gesture/control-instruction association database to obtain the control instruction corresponding to the command gesture.
  6. The human-computer interaction system (10) according to claim 5, characterized in that the human-computer interaction system further comprises:
    a memory (18) in which the command recognition model and the command-gesture/control-instruction association database are stored; after power-on, the command recognition model and the association database are loaded into the controller (15).
  7. The human-computer interaction system (10) according to claim 2, characterized in that the human-computer interaction system further comprises:
    a temperature sensor (16) for sensing the ambient temperature at the location of the command input object and transmitting it to the data preprocessing module (152), which can correct the acceleration sensed by the accelerometer (12), the field strength sensed by the magnetometer (13), and the angular velocity sensed by the gyroscope (14) according to the ambient temperature.
  8. The human-computer interaction system (10) according to claim 1, characterized in that the human-computer interaction system further comprises:
    a feedback module (155); the control instruction output by the instruction recognition module (154) is transmitted to a controlled device, and the feedback module (155) receives the information fed back by the controlled device and determines whether the control instruction was received or correctly executed;
    an indicating device (17);
    an indicating-device control module (156, 157) that controls the indicating device (17) to issue a corresponding indication according to the feedback module's (155) determination of whether the control instruction was received or correctly executed.
  9. The human-computer interaction system (10) according to claim 8, characterized in that the indicating device (17) comprises at least one of a vibration motor (172) and an indicator light (173).
  10. The human-computer interaction system (10) according to claim 1, characterized in that the command input object is a human hand.
  11. A wearable human-machine interaction apparatus, characterized in that the wearable human-machine interaction apparatus comprises the human-computer interaction system according to any one of claims 1 to 10.
  12. A human-machine interaction method, characterized in that it comprises the following steps:
    sensing the acceleration of a command input object along three mutually perpendicular axes, the earth magnetic field strength at the location of the command input object along three mutually perpendicular axes, and the angular velocity of the command input object about three mutually perpendicular axes;
    comparing the sensed earth magnetic field strength with the true earth magnetic field strength at the location of the command input object, to determine whether the earth magnetic field strength is normal;
    when the earth magnetic field strength is normal, processing the data of the acceleration and the earth magnetic field strength, extracting and recognizing the control command input by the command input object, and outputting the recognized control command;
    when the earth magnetic field strength is abnormal, processing the data of the acceleration and the angular velocity, extracting and recognizing the control command input by the command input object, and outputting the recognized control command.
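The sensor-selection logic of claim 12 — check whether the measured geomagnetic field matches the expected local field, and fall back from magnetometer to gyroscope data when it does not — can be sketched as follows. The function name, the tolerance value, and the fusion labels are illustrative assumptions, not taken from the source.

```python
import math

def select_fusion_inputs(accel, mag, gyro, true_field_uT, tolerance_uT=10.0):
    """Choose which sensor data to fuse for gesture recognition.

    Hypothetical sketch of the method of claim 12; the tolerance is
    illustrative, not a value from the patent.

    accel, mag, gyro: (x, y, z) readings along three mutually perpendicular axes.
    true_field_uT: known local geomagnetic field magnitude in microtesla.
    """
    measured_uT = math.sqrt(sum(c * c for c in mag))
    if abs(measured_uT - true_field_uT) <= tolerance_uT:
        # Magnetic field undisturbed: fuse acceleration + magnetic field data.
        return ("accel+mag", accel, mag)
    # Magnetic field disturbed (e.g. by nearby ferrous machinery):
    # fall back to acceleration + angular velocity data.
    return ("accel+gyro", accel, gyro)
```

A reading of (30, 0, 40) µT has magnitude 50 µT, so with a true local field of 50 µT it is treated as normal; a reading of (100, 0, 0) µT deviates by 50 µT and triggers the gyroscope fallback.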
  13. The human-machine interaction method of claim 12, characterized in that it further comprises the following step:
    performing de-noising and correction processing on the acceleration, the earth magnetic field strength, and the angular velocity.
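The patent does not specify which de-noising filter is used in claim 13; a minimal sketch with a simple moving average over a sampled axis, purely for illustration, could look like this:

```python
def moving_average(samples, window=5):
    """De-noise one sensor axis with a simple moving average.

    Illustrative only: the patent names de-noising as a step but does not
    disclose the filter; the window size here is an assumption.
    """
    if window < 1 or len(samples) < window:
        return list(samples)  # too few samples to filter; pass through
    return [
        sum(samples[i:i + window]) / window
        for i in range(len(samples) - window + 1)
    ]
```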
  14. The human-machine interaction method of claim 13, characterized in that it further comprises the following step:
    sensing the ambient temperature at the location of the command input object, and correcting the acceleration, the earth magnetic field strength, and the angular velocity according to the ambient temperature.
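Claim 14 corrects the sensor readings using the ambient temperature but does not disclose a correction model. One common approach, shown here only as an assumption, is a linear bias-and-scale compensation around a reference temperature, with coefficients obtained from per-device calibration:

```python
def temperature_correct(reading, ambient_c, bias_coeff, scale_coeff, ref_c=25.0):
    """Linear temperature compensation for one sensor axis.

    Hypothetical model, not from the patent: bias_coeff is the bias drift
    per degree Celsius, scale_coeff the sensitivity drift per degree
    Celsius, both measured during calibration at reference temperature ref_c.
    """
    dt = ambient_c - ref_c
    # Remove the temperature-induced bias, then undo the sensitivity drift.
    return (reading - bias_coeff * dt) / (1.0 + scale_coeff * dt)
```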
  15. The human-machine interaction method of claim 13, characterized in that it further comprises the following steps:
    transmitting the control command to a controlled device, the controlled device feeding back information on the execution status of the control command;
    determining whether the control command has been received or correctly executed;
    controlling an indicating device (17) to issue a corresponding indication according to the result of the determination of whether the control command has been received or correctly executed.
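The feedback loop of claim 15 maps the controlled device's response to an indication on the indicating device (17), such as the indicator light or vibration motor of claim 9. A minimal sketch, in which the feedback dictionary keys and indication values are assumptions rather than details from the source:

```python
from enum import Enum

class Indication(Enum):
    OK = "steady indicator light"   # command received and correctly executed
    RETRY = "vibration"             # command lost or execution failed

def indicate(feedback):
    """Map device feedback to an indication (hypothetical sketch of claim 15).

    feedback: dict with boolean keys "received" and "executed_ok";
    these key names are illustrative assumptions.
    """
    if feedback.get("received") and feedback.get("executed_ok"):
        return Indication.OK
    return Indication.RETRY
```

In a wearable device, the RETRY indication tells the user to repeat the gesture without needing to look at the controlled machine.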
PCT/CN2015/100310 2015-12-31 2015-12-31 Wearable human-machine interaction apparatus, and human-machine interaction system and method WO2017113389A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2015/100310 WO2017113389A1 (en) 2015-12-31 2015-12-31 Wearable human-machine interaction apparatus, and human-machine interaction system and method

Publications (1)

Publication Number Publication Date
WO2017113389A1 true WO2017113389A1 (en) 2017-07-06

Family

ID=59224371

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/100310 WO2017113389A1 (en) 2015-12-31 2015-12-31 Wearable human-machine interaction apparatus, and human-machine interaction system and method

Country Status (1)

Country Link
WO (1) WO2017113389A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102662495A (en) * 2012-03-20 2012-09-12 苏州佳世达光电有限公司 Coordinate sensing system, coordinate sensing method and display system
WO2014062906A1 (en) * 2012-10-19 2014-04-24 Interphase Corporation Motion compensation in an interactive display system
CN104266648A (en) * 2014-09-16 2015-01-07 南京诺导电子科技有限公司 Indoor location system based on Android platform MARG sensor
CN104331223A (en) * 2014-10-11 2015-02-04 广东小天才科技有限公司 Intelligent setting method and intelligent setting device for function setting of intelligent wearable device
CN104516529A (en) * 2013-09-28 2015-04-15 南京专创知识产权服务有限公司 Remote control equipment based on magnetic fields, gyroscope and acceleration sensor

Similar Documents

Publication Publication Date Title
US11009941B2 (en) Calibration of measurement units in alignment with a skeleton model to control a computer system
US10860091B2 (en) Motion predictions of overlapping kinematic chains of a skeleton model used to control a computer system
US11474593B2 (en) Tracking user movements to control a skeleton model in a computer system
US10976863B1 (en) Calibration of inertial measurement units in alignment with a skeleton model to control a computer system based on determination of orientation of an inertial measurement unit from an image of a portion of a user
US9221170B2 (en) Method and apparatus for controlling a robotic device via wearable sensors
KR102347067B1 (en) Portable device for controlling external apparatus via gesture and operating method for same
US11079860B2 (en) Kinematic chain motion predictions using results from multiple approaches combined via an artificial neural network
US11175729B2 (en) Orientation determination based on both images and inertial measurement units
US11237632B2 (en) Ring device having an antenna, a touch pad, and/or a charging pad to control a computing device based on user motions
CN102024316B (en) Wireless intelligent sensing method, device and system
EP3725217A1 (en) Electronic device and method for measuring heart rate
US20210068674A1 (en) Track user movements and biological responses in generating inputs for computer systems
WO2020009715A2 (en) Tracking user movements to control a skeleton model in a computer system
CN108279773B (en) Data glove based on MARG sensor and magnetic field positioning technology
Li et al. Real-time hand gesture tracking for human–computer interface based on multi-sensor data fusion
WO2019173678A1 (en) Optimal hand pose tracking using a flexible electronics-based sensing glove and machine learning
KR101341481B1 (en) System for controlling robot based on motion recognition and method thereby
CN108098760A (en) A kind of new biped robot&#39;s traveling control device and method
JP2017191426A (en) Input device, input control method, computer program, and storage medium
WO2019083406A1 (en) Method of producing a virtual reality glove (embodiments)
CN113056315B (en) Information processing apparatus, information processing method, and program
WO2017113389A1 (en) Wearable human-machine interaction apparatus, and human-machine interaction system and method
CN109901439A (en) A kind of walking robot control device and method
TWI662439B (en) Virtual space positioning method and apparatus
US10156907B2 (en) Device for analyzing the movement of a moving element and associated method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 15912013
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 15912013
    Country of ref document: EP
    Kind code of ref document: A1