NL2020224B1 - Intelligent Robot - Google Patents
- Publication number
- NL2020224B1
- Authority
- NL
- Netherlands
- Prior art keywords
- unit
- gesture
- expression
- module
- data
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/0005—Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
- Toys (AREA)
Abstract
The invention belongs to the field of robots, and in particular relates to an intelligent robot. Known intelligent robots cannot automatically adjust their height based on human height, accurately identify human expressions and gestures, or automatically match appropriate expressions and gestures for interaction. The invention provides an intelligent robot comprising a bottom base and a lower torso welded on top of the bottom base, wherein an upper torso is formed directly above the lower torso; the lower torso is mounted with a human sensing unit by bolts; a first placement cavity is formed in the lower torso; and the bottom inner wall of the first placement cavity is mounted with a first push rod motor. The invention can automatically adjust its height based on human height so as to accurately identify human expressions and gestures, and can automatically match appropriate expressions and gestures for interaction. The invention offers high intelligence, a simple structure and convenient usage.
Description
FIELD
The present invention relates to the technical field of robots, in particular to an intelligent robot.
BACKGROUND
As technology develops, intelligent robots have received growing attention, research and development; as they quickly become part of our work and daily life, with increasingly widespread applications, ever higher requirements are placed on them.
Patent 201510955745.8 discloses an intelligent robot intended to improve on the intelligence of robots in the prior art. However, its intelligence remains limited: it cannot automatically adjust its height based on human height, accurately identify human expressions and gestures, or automatically match appropriate expressions and gestures for interaction.
Patent 201510339278.6 discloses an intelligent robot capable of simulating human walking, attracting the attention of children, and freely avoiding obstacles and walking within a certain range; in addition, it can play learning files to raise children's interest in learning. However, its intelligence is likewise limited: it cannot automatically adjust its height based on human height, accurately identify human expressions and gestures, or automatically match appropriate expressions and gestures for interaction.
SUMMARY
The present invention provides an intelligent robot to solve the problem that robots in the prior art are of limited intelligence: they cannot automatically adjust their height based on human height, accurately identify human expressions and gestures, or automatically match appropriate expressions and gestures for interaction.
To achieve the above object, the present invention provides the following technical scheme:
An intelligent robot comprises a bottom base and a lower torso welded on the top of the bottom base; an upper torso is formed directly above the lower torso; the lower torso is mounted with a human sensing unit by bolts; a first placement cavity is formed in the lower torso; the bottom inner wall of the first placement cavity is mounted with a first push rod motor by bolts; the output shaft of the first push rod motor is welded to the bottom of the upper torso; the upper torso is mounted with a gesture identification unit by bolts; an arm is flexibly mounted on each side of the upper torso; a top base is mounted directly above the upper torso; a second placement cavity is formed in the upper torso; the bottom inner wall of the second placement cavity is mounted with a second push rod motor by bolts; the output shaft of the second push rod motor is welded to the bottom of the top base; a head is flexibly arranged on the top of the top base, wherein the head is mounted with an expression identification unit and a display unit by bolts;
The human sensing unit, the gesture identification unit and the expression identification unit form a sensing identification module; the sensing identification module is connected to a matching module and a data processing module respectively; the matching module is connected to the multiple databases, a retrieving module and the data processing module respectively; the retrieving module is connected to the multiple databases, an execution module and the data processing module respectively; the data processing module is connected to a driver module and the multiple databases respectively; the driver module is connected to the first push rod motor and the second push rod motor respectively; and the execution module is connected to the arm and the display unit respectively.
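The module interconnections described above amount to a simple data-flow pipeline: sensing identification feeds matching, matching feeds retrieving, and retrieving feeds execution. A minimal sketch follows; all function names, the dictionary-based input, and the string response format are illustrative assumptions, since the patent specifies only the connections, not an implementation.

```python
# Minimal data-flow sketch of the sensing -> matching -> retrieving ->
# execution arrangement. Names and data shapes are assumed, not from the patent.

def sensing_identification(frame):
    """Stand-in for the gesture/expression identification units:
    return a label for the observed input frame."""
    return frame.get("expression", "neutral")

def matching(label, library):
    """Matching module: check the identification result against the library."""
    return label if label in library else None

def retrieving(match, corresponding_library):
    """Retrieving module: fetch the response datum for a matched label."""
    return corresponding_library.get(match)

def execution(response):
    """Execution module: report what the display unit/arm would enact."""
    return f"display:{response}" if response else "display:idle"

def interact(frame, library, corresponding_library):
    """Run one full sensing -> matching -> retrieving -> execution cycle."""
    label = sensing_identification(frame)
    match = matching(label, library)
    response = retrieving(match, corresponding_library)
    return execution(response)
```

With a toy library, `interact({"expression": "smile"}, {"smile"}, {"smile": "smile_back"})` yields an executed response, while an unrecognized input falls through to the idle case.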
Preferably, a first through hole connected to the first placement cavity is formed on the top of the lower torso, and the output shaft of the first push rod motor is mounted in the first through hole in a sliding manner.
Preferably, a second through hole connected to the second placement cavity is formed on the top of the upper torso, and the output shaft of the second push rod motor is mounted in the second through hole in a sliding manner.
Preferably, the human sensing unit is used for human sensing, and sends signals to the data processing module; the gesture identification unit is used for gesture identification, and transmits identification results to the matching module; and the expression identification unit is used for expression identification, and transmits identification results to the matching module.
Preferably, the matching module comprises an expression matching unit and a gesture matching unit, wherein, the expression matching unit and the gesture matching unit are connected to the expression identification unit and the gesture identification unit respectively; the expression matching unit is used for matching expression data in the multiple databases based on identification results of the expression identification unit, and transmits matching results to the retrieving module; and the gesture matching unit is used for matching gesture data in the multiple databases based on identification results of the gesture identification unit, and transmits matching results to the retrieving module.
Preferably, the retrieving module comprises an expression retrieve unit and a gesture retrieve unit, wherein, the expression retrieve unit and the gesture retrieve unit are connected to the expression matching unit and the gesture matching unit respectively; the expression retrieve unit is used for retrieving expression data in the multiple databases based on matching results of the expression matching unit, and transmitting retrieved expression data to the execution module; the gesture retrieve unit is used for retrieving gesture data in the multiple databases based on matching results of the gesture matching unit, and transmitting the retrieved gesture data to the execution module.
Preferably, the execution module comprises an expression executing unit and a gesture executing unit, wherein, the expression executing unit and the gesture executing unit are connected to the expression retrieve unit and the gesture retrieve unit respectively, and the expression executing unit and the gesture executing unit are connected to the display unit and the arm respectively; the expression executing unit is used for controlling the display unit to simulate corresponding expression based on expression data retrieved by the expression retrieve unit; and the gesture executing unit is used for controlling the arm to simulate corresponding gesture based on gesture data retrieved by the gesture retrieve unit.
Preferably, the driver module comprises a driving circuit, a first switch circuit and a second switch circuit, wherein, the driving circuit, the first switch circuit and the second switch circuit are connected to the data processing module; the first switch circuit and the second switch circuit are connected to the first push rod motor and the second push rod motor; and the driving circuit is used for driving the first push rod motor and the second push rod motor for operation.
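The driver module above pairs one shared driving circuit with two switch circuits, one per push rod motor. A sketch of that gating arrangement is below; the class name, the boolean-switch abstraction, and the numeric signal are assumptions made for illustration only.

```python
class DriverModule:
    """Illustrative stand-in for the driver module: a shared driving
    signal gated by two switch circuits, one per push rod motor.
    The boolean-switch model is an assumption, not the patent's circuit."""

    def __init__(self):
        self.first_switch = False   # gates the first push rod motor (torso)
        self.second_switch = False  # gates the second push rod motor (head)

    def drive(self, signal):
        """Route the driving signal to whichever motors are switched on."""
        return {
            "first_motor": signal if self.first_switch else 0,
            "second_motor": signal if self.second_switch else 0,
        }
```

Closing only the first switch, for example, sends the driving signal to the first push rod motor while the second remains idle.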
Preferably, the multiple databases comprise a corresponding expression library, an expression library, a corresponding gesture library and a gesture library, wherein the expression library and the gesture library are connected to the matching module; the corresponding expression library and the corresponding gesture library are connected to the retrieving module; expression data in the corresponding expression library correspond to expression data in the expression library; and gesture data in the corresponding gesture library correspond to gesture data in the gesture library.
Preferably, the data processing module is used for controlling operation of the driver module based on the sensing signals of the human sensing unit, and the data processing module is used for driving and controlling the sensing identification module, the matching module, the retrieving module and the execution module.
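One way the data processing module could translate a sensed human height into commands for the two push rod motors is sketched below. The split of the required rise between the two rods, and every numeric parameter, are assumptions for illustration; the patent does not give a control law.

```python
def push_rod_extensions(human_height_cm, base_sensor_height_cm=120.0,
                        max_extension_cm=30.0):
    """Split the required rise between the first (upper torso) and second
    (top base/head) push rod motors so the identification units face the
    sensed person. All numeric parameters are illustrative assumptions."""
    required = max(0.0, human_height_cm - base_sensor_height_cm)
    first = min(required, max_extension_cm)            # first rod extends up to its limit
    second = min(required - first, max_extension_cm)   # second rod takes the remainder
    return first, second
```

For a sensed height of 170 cm under these assumed parameters, the first rod extends fully (30 cm) and the second supplies the remaining 20 cm; a person shorter than the base sensor height requires no extension at all.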
Compared with the prior art, the present invention has the advantages that:
1. Through the human sensing unit, the data processing module, the driver module, the first push rod motor and the second push rod motor, the height of the gesture identification unit and the expression identification unit can be automatically adjusted so as to accurately identify human expressions and gestures;
2. Through the gesture identification unit, the expression identification unit, the matching module, the retrieving module and the execution module, appropriate expressions and gestures can be automatically matched for interaction, realizing high intelligence.
The present invention can automatically adjust its height based on human height so as to accurately identify human expressions and gestures, and can automatically match appropriate expressions and gestures for interaction. It has the advantages of high intelligence, a simple structure and convenient usage.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a structural diagram of the intelligent robot according to the present invention;
FIG. 2 is a sectional structural diagram of the intelligent robot according to the present invention;
FIG. 3 is a block diagram for working principles of the intelligent robot according to the present invention;
FIG. 4 is a block diagram for working principles of the sensing identification module of the intelligent robot according to the present invention;
FIG. 5 is a block diagram for working principles of the matching module of the intelligent robot according to the present invention;
FIG. 6 is a block diagram for working principles of the retrieving module of the intelligent robot according to the present invention;
FIG. 7 is a block diagram for working principles of the execution module of the intelligent robot according to the present invention;
FIG. 8 is a block diagram for working principles of the driver module of the intelligent robot according to the present invention;
FIG. 9 is a block diagram for working principles of the multiple databases of the intelligent robot according to the present invention.
In the drawings: 1 bottom base, 2 lower torso, 3 upper torso, 4 first placement cavity, 5 first push rod motor, 6 first through hole, 7 top base, 8 second placement cavity, 9 second push rod motor, 10 second through hole, 11 gesture identification unit, 12 head, 13 expression identification unit.
Claims (10)
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710007531.7A CN106737745A (en) | 2017-01-05 | 2017-01-05 | Intelligent robot |
Publications (2)
Publication Number | Publication Date |
---|---|
NL2020224A NL2020224A (en) | 2018-07-23 |
NL2020224B1 true NL2020224B1 (en) | 2018-10-10 |
Family
ID=58950318
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
NL2020224A NL2020224B1 (en) | 2017-01-05 | 2018-01-02 | Intelligent Robot |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN106737745A (en) |
NL (1) | NL2020224B1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108406782A (en) * | 2018-05-29 | 2018-08-17 | 朱晓丹 | A kind of financial counseling intelligent robot easy to use |
CN109920347B (en) * | 2019-03-05 | 2020-12-04 | 重庆大学 | Motion or expression simulation device and method based on magnetic liquid |
CN114260916B (en) * | 2022-01-05 | 2024-02-27 | 森家展览展示如皋有限公司 | Interactive exhibition intelligent robot |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6604021B2 (en) * | 2001-06-21 | 2003-08-05 | Advanced Telecommunications Research Institute International | Communication robot |
US9014848B2 (en) * | 2010-05-20 | 2015-04-21 | Irobot Corporation | Mobile robot system |
CN202315292U (en) * | 2011-11-11 | 2012-07-11 | 山东科技大学 | Comprehensive greeting robot based on smart phone interaction |
EP2933067B1 (en) * | 2014-04-17 | 2019-09-18 | Softbank Robotics Europe | Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method |
FR3021891A1 (en) * | 2014-06-05 | 2015-12-11 | Aldebaran Robotics | DEVICE FOR REMOVABLE PREPOSITIONING AND FASTENING OF ARTICULATED MEMBERS OF A HUMANOID ROBOT |
CN104102346A (en) * | 2014-07-01 | 2014-10-15 | 华中科技大学 | Household information acquisition and user emotion recognition equipment and working method thereof |
CN105563493A (en) * | 2016-02-01 | 2016-05-11 | 昆山市工业技术研究院有限责任公司 | Height and direction adaptive service robot and adaptive method |
CN205594506U (en) * | 2016-04-12 | 2016-09-21 | 精效新软新技术(北京)有限公司 | Human -computer interaction device among intelligence work systems |
CN205651333U (en) * | 2016-04-21 | 2016-10-19 | 深圳市笑泽子智能机器人有限公司 | Guest -meeting robot |
- 2017
  - 2017-01-05: CN application CN201710007531.7A (published as CN106737745A), status: pending
- 2018
  - 2018-01-02: NL application NL2020224A (granted as NL2020224B1), status: IP right cessation
Also Published As
Publication number | Publication date |
---|---|
NL2020224A (en) | 2018-07-23 |
CN106737745A (en) | 2017-05-31 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
MM | Lapsed because of non-payment of the annual fee |
Effective date: 20210201 |