WO2017008182A1 - Control system for intelligent robot - Google Patents
- Publication number
- WO2017008182A1 (PCT/CN2015/000814)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- microprocessor
- control system
- data
- gesture
- control signal
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
Definitions
- the invention relates to an intelligent robot, in particular to an intelligent robot control system.
- the traditional scheme is wireless remote control: the robot is operated by transmitting key codes wirelessly or via buttons on the robot's surface.
- the robot can also be controlled by voice commands.
- wireless remote controls and buttons are physical, mechanical control methods, which brings reliability problems.
- the voice control method suffers from a low recognition rate and environmental noise interference, and depends heavily on the environment.
- the technical problem to be solved by the present invention is to provide a control system for an intelligent robot with good reliability and strong anti-interference ability.
- the technical solution adopted by the present invention is an intelligent robot control system including a microprocessor, a control signal input device and a plurality of joint servo mechanisms, and an output terminal of the control signal input device is connected to the microprocessor.
- the plurality of control signal output terminals of the microprocessor are respectively connected to the joint servo mechanism, wherein the control signal input device is an optical sensor.
- the optical sensor comprises an infrared demodulation receiver and a plurality of pairs of infrared emitter tubes, and the signal output of the infrared demodulation receiver is connected to the microprocessor; the two emitters of each pair are arranged apart and transmit different data; the microprocessor determines the gesture issuing the control signal from the order in which the data from a pair of emitters arrives.
- the optical sensor is a gesture recognition smart sensor.
- the processing of the sampled optical sensor data by the microprocessor includes first applying offset correction to the sampled data and then adaptive filtering to obtain valid data; gesture vector data is then derived from the valid data, and processing the gesture vector data finally yields the recognized gesture result event.
- the microprocessor's processing of the sampled optical sensor data also includes applying the recognized gesture result event in the APK.
- the control system of the intelligent robot of the invention adopts a recognition system and a proximity sensing system composed of optical sensors; it has a longer life than a typical mechanical control system, better reliability than a voice recognition control system, high accuracy, and strong anti-interference capability.
- FIG. 1 is a block diagram of a control system of an intelligent robot according to an embodiment of the present invention.
- FIG. 2 is a block diagram of a control system of an intelligent robot according to an embodiment of the present invention.
- FIG. 3 is a functional block diagram of a driving layer of an intelligent robot according to an embodiment of the present invention.
- FIG. 4 is a functional block diagram of an application layer of an intelligent robot according to an embodiment of the present invention.
- FIG. 5 is a functional block diagram of the application layer reading and writing the TMG399X in an intelligent robot according to an embodiment of the present invention.
- FIG. 6 is a flow chart of an intelligent robot drive implementation code according to an embodiment of the present invention.
- FIG. 7 is a functional block diagram of an intelligent robot application implementation code according to an embodiment of the present invention.
- the control system of the intelligent robot of Embodiment 1 of the present invention includes a microprocessor, a control signal input device and a plurality of joint servo mechanisms.
- the control signal input device is an optical sensor, and the output end of the optical sensor is connected to the microprocessor, and the plurality of control signal output ends of the microprocessor are respectively connected to the joint servo mechanism.
- the optical sensor includes an infrared demodulation receiver and two pairs of infrared emitter tubes, and the signal output of the infrared demodulation receiver is connected to the microprocessor; the two emitters of the first pair are arranged apart in the horizontal direction. The emitters transmit data modulated on a 38 kHz carrier, with each emitter transmitting different data, so a 38 kHz remote-control infrared receiver can demodulate the data of each emitter.
- the microprocessor (MCU) determines the order in which each emitter's data arrives, and from that can determine the direction of the user's hand waving horizontally in front of the robot; once the infrared receiver demodulates the waveform of the user's wave, various gestures can be identified.
- to distinguish more gesture directions, the two emitters of the second pair of infrared emitter tubes are arranged apart in the vertical direction, so the direction of the user's hand waving vertically in front of the robot can also be determined, thereby identifying the gesture that issues the control signal.
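The arrival-order decision described above can be sketched as follows. This is an illustrative model only, not code from the patent; the emitter identifiers, timestamps, and the tie threshold are assumptions.

```python
# Illustrative sketch of the arrival-order gesture logic: each IR emitter in
# a pair sends a distinct code on the 38 kHz carrier, the receiver demodulates
# it, and the MCU compares when each code was first seen.

def classify_swipe(arrivals, pair):
    """arrivals: dict mapping emitter id -> first-arrival time (seconds).
    pair: (emitter_a, emitter_b), arranged left/right or top/bottom."""
    a, b = pair
    if a not in arrivals or b not in arrivals:
        return None                      # hand did not cross both beams
    dt = arrivals[b] - arrivals[a]
    if abs(dt) < 1e-3:                   # too close to call (assumed threshold)
        return None
    return "a_to_b" if dt > 0 else "b_to_a"

# Horizontal pair (left emitter 'L', right emitter 'R'):
wave = {"L": 0.000, "R": 0.040}          # right beam broken 40 ms later
print(classify_swipe(wave, ("L", "R")))  # left-to-right swipe -> 'a_to_b'
```

With a second, vertically arranged pair, the same comparison distinguishes up/down swipes, so four swipe directions can be separated.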
- the control system of the intelligent robot of Embodiment 2 of the present invention is shown in FIG. 2 to FIG. 4; unlike Embodiment 1, the optical sensor is a TMG399x from ams (Austria Microsystems).
- the robot in this embodiment is 40 to 50 cm tall, and a single TMG399X-series transceiver sensor can be placed in the robot's mouth and connected to the robot's hardware system through a PCBA and an FPC.
- the optical sensor TMG399x communicates with the Android system through an I2C (IIC) interface and an interrupt-capable GPIO; after the Android system reads the sensor data, the Android driver-layer algorithm computes and recognizes a series of gestures: top-to-bottom, left-to-right, long press, and so on.
- the single-chip TMG399X-series transceiver sensor from ams supports up to 16 gesture postures, obstacle or proximity sensing, static heart-rate detection, and more.
- the signals output by the TMG399X transceiver sensor are connected to the Android system's processor through one I2C interface and one interrupt IO line.
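The I2C-plus-interrupt data path can be sketched as below. The register addresses and the bus object are invented placeholders, not the real TMG399X register map, which the patent does not reproduce.

```python
# Sketch of the interrupt-driven read path: when the sensor asserts its
# interrupt line, the host drains the gesture FIFO over I2C.
# FakeBus stands in for a real I2C bus; the register addresses below are
# invented placeholders, NOT the actual TMG399X register map.

FIFO_LEVEL_REG = 0xAE     # assumed: how many datasets are queued
GESTURE_FIFO_REG = 0xFC   # assumed: FIFO data register

class FakeBus:
    def __init__(self, queued):
        self.queued = list(queued)
    def read_byte(self, addr, reg):
        return len(self.queued) if reg == FIFO_LEVEL_REG else 0
    def read_block(self, addr, reg, n):
        out, self.queued = self.queued[:n], self.queued[n:]
        return out

def drain_fifo(bus, addr=0x39):   # 0x39: typical address for this sensor family
    """Called from the interrupt handler: read all queued gesture datasets."""
    level = bus.read_byte(addr, FIFO_LEVEL_REG)
    return bus.read_block(addr, GESTURE_FIFO_REG, level)

bus = FakeBus([(10, 12, 9, 11), (14, 13, 12, 15)])  # (up, down, left, right)
print(drain_fifo(bus))  # both queued datasets, leaving the FIFO empty
```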
- the Android system processes the obtained raw sample data at two levels to obtain the gesture result: one is the driver level and the other is the application level.
- FIG. 3 is the functional block diagram of the driver layer: the proximity raw data and gesture raw data collected at the driver level are first offset-corrected and then processed by an adaptive filter function to obtain valid data; the valid gesture data is processed into gesture vector data, and processing the gesture vector data finally yields the various recognized gesture result events.
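The driver-level pipeline just described (offset correction, adaptive filtering, gesture vector, gesture event) can be sketched as follows. The concrete correction, filter, and classification rules below are simplified stand-ins; the patent does not disclose the actual algorithms.

```python
# Simplified stand-in for the driver-layer pipeline of Fig. 3:
# raw samples -> offset correction -> filtering -> gesture vector -> event.
# Samples are (up, down, left, right) photodiode counts per time step.

def offset_correct(samples, offsets):
    return [tuple(c - o for c, o in zip(s, offsets)) for s in samples]

def smooth(samples, k=2):
    # crude moving-average stand-in for the "adaptive filter" stage
    out = []
    for i in range(len(samples)):
        window = samples[max(0, i - k + 1):i + 1]
        out.append(tuple(sum(ch) / len(window) for ch in zip(*window)))
    return out

def gesture_vector(samples):
    # net (up-down, left-right) asymmetry accumulated over the burst
    ud = sum(s[0] - s[1] for s in samples)
    lr = sum(s[2] - s[3] for s in samples)
    return (ud, lr)

def classify(vec):
    ud, lr = vec
    if abs(ud) >= abs(lr):
        return "top_to_bottom" if ud > 0 else "bottom_to_top"
    return "left_to_right" if lr > 0 else "right_to_left"

raw = [(8, 2, 5, 5), (9, 3, 5, 5), (4, 8, 5, 5)]
valid = smooth(offset_correct(raw, (1, 1, 1, 1)))
print(classify(gesture_vector(valid)))  # -> top_to_bottom
```

Each function maps to one block of FIG. 3; a real driver would calibrate the offsets and tune the filter to the sensor.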
- in Embodiment 3 (FIG. 5 and FIG. 6), when the optical sensor detects a gesture event, it notifies the driver via the interrupt IO; the driver reads the raw data from the optical sensor over I2C and then sends a message to the AMS algorithm service program.
- the service program calls tmg3993x_get_data_buf to fetch the raw data from the driver layer and computes the result; it then calls the tmg3993x_report_ges_store function to return the result to the driver. After the driver receives the computed result, it reports the result to the JNI layer using input_report_abs.
- various peripherals can be configured for the APP to control the robot, for example: robot motion, Bluetooth phone control, starting voice interaction, and the like.
- one of these, the Bluetooth phone control application, is described in detail in Embodiment 4 of the present invention.
- the robot runs the Android operating system and needs to control the robot's devices, for example controlling the Bluetooth phone from the gesture recognition result.
- an Android application (APK) cannot operate a device directly; it must go through an intermediate layer called JNI to call driver-layer functions and use device features.
- the control flow chart is shown in Figure 6:
- gesture A, from left to right, is defined as answering the call
- gesture B, from right to left, is defined as rejecting the call.
- when a call comes in, the bottom layer first starts the gesture recognition module to obtain the raw data
- the JNI layer periodically reads the driver-layer data, analyzes it to obtain the gesture result, and passes the gesture result to the Java layer as the parameter of the notityByJNI callback
- the application layer then broadcasts the result.
- by registering for and listening to the broadcast, the application can handle the corresponding business logic after receiving the broadcast event.
- AT commands are used to answer and reject the Bluetooth call.
- gesture A is associated with the AT command that answers the Bluetooth call
- gesture B is associated with the AT command that rejects the call.
- in this way, gestures can be associated with specific behaviors; by analogy, other robot behaviors can be associated with corresponding gestures.
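The gesture-to-action association of Embodiment 4 can be sketched as a dispatch table. ATA and AT+CHUP are the standard Bluetooth Hands-Free Profile commands for answering and hanging up/rejecting a call; the dispatch code itself is an illustration, not the patent's Java/JNI implementation, and `send_at_command` is a stub.

```python
# Illustrative dispatch from recognized gestures to Bluetooth HFP AT commands.
# ATA answers an incoming call; AT+CHUP hangs up / rejects it (HFP spec).
# send_at_command is a stub standing in for the real Bluetooth link.

GESTURE_ACTIONS = {
    "left_to_right": "ATA",       # gesture A: answer the call
    "right_to_left": "AT+CHUP",   # gesture B: reject the call
}

sent = []

def send_at_command(cmd):
    sent.append(cmd)              # a real system would write to the HFP link

def on_gesture_broadcast(gesture):
    cmd = GESTURE_ACTIONS.get(gesture)
    if cmd is not None:
        send_at_command(cmd)

on_gesture_broadcast("left_to_right")
on_gesture_broadcast("wave_up")       # unmapped gestures are ignored
print(sent)                           # ['ATA']
```

In the real system the gesture would arrive in a registered BroadcastReceiver in the APK, and the AT command would be written to the Bluetooth HFP connection.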
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
Abstract
A control system for an intelligent robot comprising a microprocessor, a control signal input device and a plurality of joint servo mechanisms, wherein an output end of the control signal input device is connected to the microprocessor, a plurality of control signal output ends of the microprocessor are connected to the joint servo mechanisms respectively, and the control signal input device is an optical sensor. Since a recognition system and a proximity sensing system composed of optical sensors are used, the service life of the control system is longer than that of an ordinary mechanical control system, its reliability is higher than that of a voice recognition control system, its accuracy is high, and its anti-interference capability is strong.
Description
The invention relates to intelligent robots, and in particular to a control system for an intelligent robot.
There are many ways to control a robot. The traditional scheme is wireless remote control: the robot is operated by transmitting key codes wirelessly or via buttons on the robot's surface. Robots can also be controlled by voice commands. Wireless remote controls and buttons are physical, mechanical control methods, which brings reliability problems; the voice control method suffers from a low recognition rate and environmental noise interference, and depends heavily on the environment.
[Summary of the Invention]
The technical problem to be solved by the present invention is to provide a control system for an intelligent robot with good reliability and strong anti-interference capability.
To solve the above technical problem, the technical solution adopted by the present invention is a control system for an intelligent robot, comprising a microprocessor, a control signal input device and a plurality of joint servo mechanisms; the output terminal of the control signal input device is connected to the microprocessor, and the plurality of control signal output terminals of the microprocessor are respectively connected to the joint servo mechanisms. The system is characterized in that the control signal input device is an optical sensor.
In the above control system, the optical sensor comprises an infrared demodulation receiver and a plurality of pairs of infrared emitter tubes, and the signal output of the infrared demodulation receiver is connected to the microprocessor; the two emitters of each pair are arranged apart and transmit different data; the microprocessor determines the gesture issuing the control signal from the order in which the data from a pair of emitters arrives.
In the above control system, the optical sensor is a gesture recognition smart sensor.
In the above control system, the processing of the sampled optical sensor data by the microprocessor includes first applying offset correction to the sampled data and then adaptive filtering to obtain valid data; gesture vector data is then derived from the valid data, and processing the gesture vector data finally yields the recognized gesture result event.
In the above control system, the microprocessor's processing of the sampled optical sensor data also includes applying the recognized gesture result event in the APK.
The control system of the intelligent robot of the invention adopts a recognition system and a proximity sensing system composed of optical sensors; it has a longer life than a typical mechanical control system, better reliability than a voice recognition control system, high accuracy, and strong anti-interference capability.
The present invention is described in further detail below with reference to the drawings and specific embodiments.
FIG. 1 is a block diagram of a control system of an intelligent robot according to an embodiment of the present invention.
FIG. 2 is a block diagram of a control system of an intelligent robot according to an embodiment of the present invention.
FIG. 3 is a functional block diagram of the driver layer of an intelligent robot according to an embodiment of the present invention.
FIG. 4 is a functional block diagram of the application layer of an intelligent robot according to an embodiment of the present invention.
FIG. 5 is a functional block diagram of the application layer reading and writing the TMG399X in an intelligent robot according to an embodiment of the present invention.
FIG. 6 is a flow chart of the drive implementation code of an intelligent robot according to an embodiment of the present invention.
FIG. 7 is a functional block diagram of the application implementation code of an intelligent robot according to an embodiment of the present invention.
As shown in FIG. 1, the control system of the intelligent robot of Embodiment 1 of the present invention includes a microprocessor, a control signal input device and a plurality of joint servo mechanisms.
The control signal input device is an optical sensor whose output is connected to the microprocessor, and the plurality of control signal output terminals of the microprocessor are respectively connected to the joint servo mechanisms.
The optical sensor includes an infrared demodulation receiver and two pairs of infrared emitter tubes, and the signal output of the infrared demodulation receiver is connected to the microprocessor; the two emitters of the first pair are arranged apart in the horizontal direction. The emitters transmit data modulated on a 38 kHz carrier, with each emitter transmitting different data, so a 38 kHz remote-control infrared receiver can demodulate the data of each emitter. The microprocessor (MCU) determines the order in which each emitter's data arrives, and from that can determine the direction of the user's hand waving horizontally in front of the robot; once the infrared receiver demodulates the waveform of the user's wave, various gestures can be identified.
To distinguish more gesture directions, the two emitters of the second pair of infrared emitter tubes are arranged apart in the vertical direction, so the direction of the user's hand waving vertically in front of the robot can also be determined, thereby identifying the gesture that issues the control signal.
The control system of the intelligent robot of Embodiment 2 of the present invention is shown in FIG. 2 to FIG. 4; unlike Embodiment 1, the optical sensor is a TMG399x from ams (Austria Microsystems).
The robot in this embodiment is 40 to 50 cm tall, and a single TMG399X-series transceiver sensor can be placed in the robot's mouth and connected to the robot's hardware system through a PCBA and an FPC. The optical sensor TMG399x communicates with the Android system through an I2C (IIC) interface and an interrupt-capable GPIO; after the Android system reads the sensor data, the Android driver-layer algorithm computes and recognizes a series of gestures: top-to-bottom, left-to-right, long press, and so on.
The single-chip TMG399X-series transceiver sensor from ams supports up to 16 gesture postures, obstacle or proximity sensing, static heart-rate detection, and more.
The signals output by the TMG399X transceiver sensor are connected to the Android system's processor through one I2C interface and one interrupt IO line. The Android system processes the obtained raw sample data at two levels to obtain the gesture result: one is the driver level and the other is the application level. FIG. 3 is the functional block diagram of the driver layer: the proximity raw data and gesture raw data collected at the driver level are first offset-corrected and then processed by an adaptive filter function to obtain valid data; the valid gesture data is processed into gesture vector data, and processing the gesture vector data finally yields the various recognized gesture result events.
The principle of the control system of the intelligent robot of Embodiment 3 of the present invention is shown in FIG. 5 and FIG. 6. When the optical sensor detects a gesture event, it notifies the driver via the interrupt IO; the driver reads the raw data from the optical sensor over I2C and then sends a message to the AMS algorithm service program. The service program calls tmg3993x_get_data_buf to fetch the raw data from the driver layer and computes the result; it then calls the tmg3993x_report_ges_store function to return the result to the driver. After the driver receives the computed result, it reports the result to the JNI layer using input_report_abs.
Various peripherals can be configured for the APP to control the robot, for example: robot motion, Bluetooth phone control, starting voice interaction, and the like. One of these, the Bluetooth phone control application, is described in detail in Embodiment 4 of the present invention.
The robot runs the Android operating system and needs to control the robot's devices, for example controlling the Bluetooth phone from the gesture recognition result. An Android application (APK) cannot operate a device directly; it must go through an intermediate layer called JNI to call driver-layer functions and use device features. The control flow chart is shown in FIG. 6:
In the present embodiment, gesture A, from left to right, is defined as answering the call, and gesture B, from right to left, is defined as rejecting the call. When a call comes in, the bottom layer first starts the gesture recognition module to obtain the raw data; the JNI layer periodically reads the driver-layer data, analyzes it to obtain the gesture result, and passes the gesture result to the Java layer as the parameter of the notityByJNI callback; the application layer then broadcasts the result. By registering for and listening to the broadcast, the application can handle the corresponding business logic after receiving the broadcast event, for example using AT commands to answer or reject the Bluetooth call. Gesture A is associated with the AT command that answers the Bluetooth call, and gesture B with the AT command that rejects it. In this way, gestures can be associated with specific behaviors; by analogy, other robot behaviors can be associated with corresponding gestures.
Claims (5)
- A control system for an intelligent robot, comprising a microprocessor, a control signal input device and a plurality of joint servo mechanisms, wherein the output terminal of the control signal input device is connected to the microprocessor and the plurality of control signal output terminals of the microprocessor are respectively connected to the joint servo mechanisms, characterized in that the control signal input device is an optical sensor.
- The control system according to claim 1, characterized in that the optical sensor comprises an infrared demodulation receiver and a plurality of pairs of infrared emitter tubes, the signal output of the infrared demodulation receiver being connected to the microprocessor; the two emitters of each pair are arranged apart and transmit different data; and the microprocessor determines the gesture issuing the control signal from the order in which the data from a pair of emitters arrives.
- The control system according to claim 1, characterized in that the optical sensor is a gesture recognition smart sensor.
- The control system according to claim 3, characterized in that the processing of the sampled optical sensor data by the microprocessor comprises first applying offset correction to the sampled data and then adaptive filtering to obtain valid data; gesture vector data is then derived from the valid data, and processing the gesture vector data finally yields the recognized gesture result event.
- The control system according to claim 4, characterized in that the processing of the sampled optical sensor data by the microprocessor includes applying the recognized gesture result event in the APK.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510417305.7A CN104959984A (en) | 2015-07-15 | 2015-07-15 | Control system of intelligent robot |
CN201510417305.7 | 2015-07-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017008182A1 true WO2017008182A1 (en) | 2017-01-19 |
Family
ID=54214154
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2015/000814 WO2017008182A1 (en) | 2015-07-15 | 2015-11-24 | Control system for intelligent robot |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN104959984A (en) |
WO (1) | WO2017008182A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108381511A (en) * | 2018-04-28 | 2018-08-10 | 刘宇栋 | Gesture control mobile platform based on induction remote control gloves |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007268696A (en) * | 2006-03-31 | 2007-10-18 | Fujitsu Ltd | Interactive robot system |
CN102601801A (en) * | 2012-03-23 | 2012-07-25 | 冠礼控制科技(上海)有限公司 | Single-shaft rotary mechanical arm |
JP2012161851A (en) * | 2011-02-03 | 2012-08-30 | Advanced Telecommunication Research Institute International | Robot system and space formation recognizing device used in the same |
CN103009388A (en) * | 2012-11-05 | 2013-04-03 | 肖林 | Light wave transmitter as well as robot track locating system and robot track locating method |
CN104182049A (en) * | 2014-08-28 | 2014-12-03 | 华南理工大学广州学院 | Non-contact type infrared two-dimensional gesture detection and recognition device and method |
CN204209691U (en) * | 2014-10-09 | 2015-03-18 | 夏彤宇 | The card machine robot of the self adaptation position of human body utilizing biological information of human body to control |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100351720C (en) * | 2002-05-05 | 2007-11-28 | 旋永南 | Operation and control system and mehod |
GB0311177D0 (en) * | 2003-05-15 | 2003-06-18 | Qinetiq Ltd | Non contact human-computer interface |
CN102299990A (en) * | 2010-06-22 | 2011-12-28 | 希姆通信息技术(上海)有限公司 | Gesture control cellphone |
CN102012740B (en) * | 2010-11-15 | 2015-10-21 | 中国科学院深圳先进技术研究院 | Man-machine interaction method and system |
CN201955771U (en) * | 2010-11-15 | 2011-08-31 | 中国科学院深圳先进技术研究院 | Human-computer interaction system |
CN103112007B (en) * | 2013-02-06 | 2015-10-28 | 华南理工大学 | Based on the man-machine interaction method of hybrid sensor |
EP2790093B1 (en) * | 2013-04-09 | 2020-06-03 | ams AG | Method for gesture detection, optical sensor circuit, in particular an optical sensor circuit for gesture detection, and optical sensor arrangement for gesture detection |
- 2015-07-15: CN CN201510417305.7A patent/CN104959984A/en (active, Pending)
- 2015-11-24: WO PCT/CN2015/000814 patent/WO2017008182A1/en (active, Application Filing)
Also Published As
Publication number | Publication date |
---|---|
CN104959984A (en) | 2015-10-07 |
Legal Events
- 121 (Ep): the EPO has been informed by WIPO that EP was designated in this application. Ref document number: 15897912; Country of ref document: EP; Kind code of ref document: A1
- NENP: Non-entry into the national phase. Ref country code: DE
- 122 (Ep): PCT application non-entry in European phase. Ref document number: 15897912; Country of ref document: EP; Kind code of ref document: A1