CN107291811B - Perception- and cognition-enhanced robot system based on cloud knowledge fusion - Google Patents

Perception- and cognition-enhanced robot system based on cloud knowledge fusion

Info

Publication number
CN107291811B
CN107291811B (application CN201710353598.6A / CN201710353598A)
Authority
CN
China
Prior art keywords
behavior
entity
library
extensive
task
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710353598.6A
Other languages
Chinese (zh)
Other versions
CN107291811A (en)
Inventor
李石坚
杨莎
陈昕伟
潘纲
吴朝晖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201710353598.6A priority Critical patent/CN107291811B/en
Publication of CN107291811A publication Critical patent/CN107291811A/en
Application granted granted Critical
Publication of CN107291811B publication Critical patent/CN107291811B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 — Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 — Information retrieval of unstructured textual data
    • G06F16/36 — Creation of semantic tools, e.g. ontology or thesauri
    • G06F16/367 — Ontology

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a perception- and cognition-enhanced robot system based on cloud knowledge fusion, belonging to the field of artificial intelligence. The robot system includes a robot, a processor, and a perception unit, local storage unit, and cloud unit communicatively connected to the processor. The local storage unit stores a local behavior library, and the cloud unit fuses a cloud behavior library. The processor receives a service request, detects and locates the target object corresponding to the service request based on the environment information acquired by the perception unit, obtains from the behavior library the goal task behavior suited to the service request and the target object, and controls the robot to execute the goal task behavior on the target object. The processor also generalizes the actions and entities in the behavior library to construct a generalized behavior library. The robot system can not only acquire external information through the perception unit, but also has the ability to learn by generalization and can provide multiple types of services to the user based on the task behaviors in the local and cloud behavior libraries.

Description

Perception- and cognition-enhanced robot system based on cloud knowledge fusion
Technical field
The invention belongs to the technical field of intelligent service robots and specifically relates to a perception- and cognition-enhanced robot system based on cloud knowledge fusion.
Background technique
With the rapid development of modern science and technology, rising living standards, and growing demands on the service industry, service robots have been developed and applied rapidly. By purpose, service robots can be broadly divided into domestic service robots, professional service robots, and entertainment service robots. Current service robots mainly provide services for specific scenarios. For example, population aging is now severe and the number of people with disabilities remains high, so household robots are needed to provide daily-life care for the elderly and the physically disabled, and cleaning robots are needed for household cleaning and tidying.
To provide intelligent services, a robot needs comprehensive perception of the external environment, that is, it must automatically acquire information through various perception channels, which requires sensing capability. Because different robots have different mechanical structures, their motion trajectories and behaviors differ when executing the same task; therefore, to transfer one robot's service behavior to other robots, a robot must be able to learn and generalize, automatically adapting a learned service behavior to its own structure. At present, the service types offered by different robots are single-purpose and target particular aspects; as people's lives change, the demand for more diverse service types keeps growing, which requires robots to fuse multiple service types so as to adapt to changing environments. Fusing the various capabilities of robots in the cloud is a trend in robot development: the computing and storage capacity of the cloud makes robots more intelligent, and the cloud allows robots in different fields and different locations to communicate with each other and share resources.
Summary of the invention
The present invention provides a perception- and cognition-enhanced robot system based on cloud knowledge fusion, enabling the robot to perceive its surroundings and provide services corresponding to a service request.
To achieve the above goal, the robot system provided by the invention includes a robot, a processor, and a perception unit, local storage unit, and cloud unit communicatively connected to the processor. The perception unit includes a depth camera for acquiring information about the robot's surroundings; the local storage unit stores a local behavior library, and the cloud unit fuses a cloud behavior library. A behavior library is a set of task behaviors and entities: a behavior is the combination of actions the robot performs while executing a task, and an entity is the object acted upon. The processor is configured to: generalize the actions and entities in the behavior library to construct a generalized behavior library; receive a service request; detect and locate the target object corresponding to the service request based on the environment information acquired by the perception unit; obtain from the behavior library the goal task behavior suited to the service request and the target object; and control the robot to execute the goal task behavior on the target object.
By providing the perception unit, the robot gains the corresponding sensing capability and can acquire external information. By providing a local behavior library and a cloud behavior library communicatively connected to the processor, the robot can not only provide its built-in services but also draw on the service types stored in the cloud behavior library, thereby offering the services of other types of robots. By generalizing the behavior actions and entities in the behavior library, the robot system gains cognitive ability, that is, the ability to learn through generalization.
In a specific scheme, the step of detecting and locating the target object corresponding to the service request based on the environment information acquired by the perception unit includes: receiving the image acquired by the depth camera; detecting and locating the objects in the image with the YOLO algorithm to obtain the target object and its coordinates in the depth-camera coordinate system; and converting, via calibration, the target object's coordinates in the depth-camera coordinate system into coordinates in the robot coordinate system.
In another specific scheme, the step of obtaining from the behavior library the goal task behavior suited to the service request and the target object includes: judging whether a task behavior in the local behavior library is applicable; if no task behavior in the local behavior library is applicable, judging whether a task behavior in the cloud behavior library is applicable, and if one is, updating the local behavior library; and if no task behavior in the cloud behavior library is applicable, constructing the goal task behavior and updating both the local and cloud behavior libraries. The local behavior library is used preferentially to improve service speed.
In a more specific scheme, the step of judging whether a task behavior in a behavior library is applicable includes: if the goal task behavior does not exist in the library, generalizing the entities of the goal task behavior into a generalized entity list with WordNet; if the library contains task behaviors whose generalized entity lists intersect the goal task behavior's generalized entity list, computing with WordNet the distance between the goal task behavior's entities and the generalized entity lists of those task behaviors and selecting the task behavior with the smallest distance; and if the minimum distance is below a preset distance threshold, the behavior is applicable and the minimum-distance task behavior is adjusted according to the service request to obtain the goal task behavior. Through generalization, the robot can apply a service behavior to different entities and adjust the behavior actions to the actual situation, adapting to a changing service environment.
In a further specific scheme, the step of adjusting the minimum-distance task behavior according to the service request includes: adjusting the actions of the minimum-distance task behavior according to the entity attributes of the service request and the goal service behavior and according to the grasp angle, grasp mode, and movement path the robot uses when grabbing objects.
In another more specific scheme, after the goal task behavior is constructed, its actions and entities are generalized to construct a generalized behavior, and both the goal task behavior and the generalized behavior are added to the behavior library. By continually adding newly constructed task behaviors and their generalizations to the library while executing service requests, the robot keeps extending its task behaviors, and through the cloud unit other robots are spared from rebuilding the same task behavior or constructing similar service behaviors from scratch.
In another specific scheme, the step of generalizing the actions and entities in the behavior library to construct the generalized behavior library includes: generalizing the entities in the behavior library into generalized entity lists with WordNet, and generalizing the actions in the behavior library into generalized action lists according to the path planning, grasp mode, and grasp angle the robot uses when grabbing objects, thereby forming the robot's generalized behavior library.
In a preferred scheme, generalizing an entity with WordNet includes: setting the part of speech of the entity to noun and obtaining the synsets the entity represents in WordNet; inputting a sentence that contains the entity and expresses its meaning, performing word-sense disambiguation with the Lesk algorithm, and obtaining from the synsets the synset that represents the entity's meaning; obtaining the corresponding generalized synsets with WordNet's hypernyms function; and selecting the generalization height according to the context to obtain the entity's generalization result. Because the behavior library includes the generalized behavior library, task behaviors can be extended continuously, giving the robot the learning ability to adapt to different service behaviors.
In another specific scheme, the cloud unit also fuses a knowledge graph comprising ConceptNet and WordNet, and the cloud unit is updated as follows: traverse the knowledge graphs of the entities in each robot's local behavior library, and if an entity's knowledge graph is absent from the cloud unit, add it to the cloud unit; traverse the task behaviors in each robot's local behavior library, and if the cloud behavior library has no behavior with the same concept as a given task behavior, add that task behavior to the cloud unit; if the cloud behavior library does have a same-concept behavior, compute the shortest distance between each entity of the same-concept behavior and each entity of the task behavior, and if the shortest distance exceeds a preset entity-distance threshold, add the task behavior to the cloud unit; if the shortest distance does not exceed the threshold, compute the minimum edit distance between the action sequences of the same-concept behavior and the task behavior, and if it exceeds a preset edit-distance threshold, add the behavior to the cloud unit. Through these steps, new task behaviors can be added during updates while duplicate task behaviors and nearly identical service behaviors are avoided.
In another specific scheme, the cloud unit also fuses a knowledge graph comprising ConceptNet and WordNet, and based on the cloud unit the processor is configured to: select a target entity with ConceptNet from the entities the perception unit obtains from the environment, and if the perceived entity differs from the entity the goal task behavior is to operate on, use the ternary relations in ConceptNet to obtain an entity that satisfies the goal task behavior's operation object; if the cloud behavior library contains a behavior with the same concept as the goal task behavior, compute the entity distance between the goal task behavior's entities and the same-concept behavior's entities, and if it is below the preset entity-distance threshold, adjust the actions of the same-concept behavior according to the entity attributes in the service request and the goal task behavior and according to the robot's grasp angle, grasp mode, and movement path, thereby obtaining the goal task behavior; if the cloud behavior library has no same-concept behavior, construct the goal task behavior, generalize it into its generalized behavior, and add both to the cloud behavior library.
Brief description of the drawings
Fig. 1 is a structural block diagram of an embodiment of the invention;
Fig. 2 is a schematic diagram of action generalization in the embodiment;
Fig. 3 is the workflow of the control method in the embodiment;
Fig. 4 is the workflow of the step of judging whether a task behavior in a behavior library is applicable;
Fig. 5 shows the grasp modes of the gripper in the embodiment, where 5(a) is the basic mode, 5(b) the wide mode, 5(c) the pinch mode, and 5(d) the scissor mode.
Detailed description of embodiments
The invention is further described below with reference to the embodiment and the accompanying drawings.
Embodiment
Referring to Fig. 1, the robot system 1 of this embodiment includes a robot 10, a processor 11, and a perception unit 12, cloud unit 13, and local storage unit 14 communicatively connected to processor 11.
Processor 11 sends control signals to robot 10 over a communication link to make it execute actions. A communication connection is a data-transfer connection over a communication link, that is, one or more data circuits configured to transfer data among processor 11, robot 10, perception unit 12, cloud unit 13, and local storage unit 14, including but not limited to electrical lines, optical lines, radio links, and combinations thereof.
Robots differ in form, including humanoid robots, animal-shaped robots, and arm-type robots. In this embodiment, robot 10 is an arm-type robot consisting of a mechanical arm and a gripper: the arm is an EPSON C4 six-degree-of-freedom manipulator and the gripper is a ROBOTIQ three-finger gripper.
Processor 11 receives and processes the data transmitted by each unit and generates the control information for robot 10; it may be a single processor or a set of processors distributed across different locations.
Perception unit 12 includes a depth camera for visually perceiving the surroundings of robot 10 and for detecting and locating the objects robot 10 is to operate on; other kinds of sensors, such as pressure sensors and acoustic sensors, can also be added to give the robot system the corresponding sensing capabilities. In this embodiment, the depth camera is a Microsoft Kinect camera placed in front of the workbench of robot 10 to capture the workbench scene, that is, the working scene of robot 10.
During operation, processor 11 obtains a picture of the workbench scene through the Kinect camera, detects the objects in the picture with the YOLO algorithm, and locates the coordinates of the target object on the workbench in the robot coordinate system. In this embodiment, YOLO uses the Microsoft COCO dataset, which contains the 80 object classes listed in Table 1 below. When one of these 80 classes is detected on the workbench, the object's two-dimensional coordinates in the Kinect image are obtained; the correspondence between Kinect's two-dimensional and three-dimensional coordinates yields the object's three-dimensional coordinates in the Kinect coordinate system; and the rotation matrix and translation vector that convert between the Kinect coordinate system and the robot coordinate system then transform the target object's three-dimensional coordinates into the robot coordinate system.
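As an illustration of the coordinate pipeline just described, the following sketch back-projects a detected pixel into the Kinect frame and applies a calibrated rotation and translation to obtain robot-frame coordinates. It assumes a pinhole camera model; the function names, intrinsic parameters, and the R/t values shown are placeholders, not calibration results from this embodiment.

```python
import numpy as np

def pixel_to_camera(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a detected pixel (u, v) and its depth into the Kinect frame."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

def camera_to_robot(p_cam, R, t):
    """Transform a Kinect-frame point into the robot frame with calibrated R and t."""
    return R @ p_cam + t

# Placeholder calibration values, for illustration only.
R = np.eye(3)
t = np.array([0.40, 0.0, 0.75])
p_robot = camera_to_robot(pixel_to_camera(320, 240, 0.85, 525.0, 525.0, 319.5, 239.5), R, t)
```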
Table 1. Names of the 80 object classes in the Microsoft COCO dataset
The local behavior library is stored in local storage unit 14. The behavior library integrates the action sequences executed by robot 10 and the objects those actions operate on, that is, it is the set of task behaviors and their action objects. An action object is represented as an entity: the object or target the robot manipulates, which may be an object, a location, or a person. To describe an entity's state, attributes, and other information, an entity is represented in the behavior library by the five-tuple ε:
ε=< ID, N, P, S, I >
where ID is the entity's number, which uniquely identifies it and distinguishes it from other entities; N is the entity's name, indicating its meaning; P is the entity's position, that is, its three-dimensional coordinates in the robot coordinate system; and S is the entity's state, describing the state of the interaction between robot and entity. In this embodiment, when the entity is an object, three states are defined: Hold, Target, and Free. Hold means the entity is currently being held by the arm, Target means it is currently the arm's target entity, and Free means it is currently idle, that is, the manipulator performs no operation on it. I is the entity's supplementary information; for an object it can describe shape, size, color, and so on, and if the entity has no supplementary information, I is empty.
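A minimal sketch of this five-tuple follows; the Python field names are assumptions, only the tuple layout and the three states Hold/Target/Free come from the text.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional, Tuple

class EntityState(Enum):
    HOLD = "Hold"      # currently held by the arm
    TARGET = "Target"  # current target of the arm
    FREE = "Free"      # idle, no operation in progress

@dataclass
class Entity:
    id: int                                # ID: unique number of the entity
    name: str                              # N: name / meaning of the entity
    position: Tuple[float, float, float]   # P: 3D coordinates in the robot frame
    state: EntityState = EntityState.FREE  # S: interaction state
    info: Optional[dict] = None            # I: shape, size, color, ... or None
```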
An action in the behavior library is an atomic behavior of the robot while executing a task, that is, a behavior is the combination of actions the robot performs during the task. An action is represented in the behavior library by the four-tuple Action:
Action=< ID, N, E, I >
where ID is the action's number, which uniquely identifies it and distinguishes it from other actions; N is the action's name, indicating its meaning; and E is the action's execution, specifying which interface the robot should call to carry out the action. In this embodiment, according to the structure and hardware design of the manipulator, the actions the manipulator can perform are divided into grasping, releasing, and moving, denoted grasp, release, and move, which correspond to the manipulator grasping an entity, releasing it, and moving an object from one position to another. I is the action's supplementary information.
The execution process of a behavior in the behavior library is realized with a deterministic finite automaton (DFA; for the precise definition see the Master's thesis "Task learning and planning for a perception-enhanced robotic arm", Wang Shugang, Zhejiang University, August 2016). The behavior library contains the combination of all the robot's actions and is defined as:
ActionLibrary = <Action_1, Action_2, ..., Action_i, ..., Action_n>
where Action_i denotes the i-th task behavior in the library and n is the total number of task behaviors. The states of the DFA are the set of states of all entities in the library's task behaviors; the alphabet of the DFA is the character set of state transitions in the behavior library; the transition function of the DFA maps one state to another as the robot executes a task; the start state of the DFA is the start state of each behavior; and the accepting state of the DFA is the final state of the task behavior.
A task is a piece of work the robot is asked to complete, and a task behavior is the set of behavior actions and behavior entities the robot must execute to complete a given task. The i-th task behavior is defined as:
Action_i = DFA_i & <obj_1, obj_2, ..., obj_m>
where DFA_i is the DFA that executes the task behavior, obj_m denotes an entity participating in the task behavior, and m is the number of participating entities.
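The action four-tuple and the task-behavior definition above can be sketched as follows; reducing DFA_i to an ordered action sequence is a simplification made only for this sketch, and the field names are assumptions.

```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional

class Execution(Enum):   # E: which manipulator interface the action calls
    GRASP = "grasp"
    RELEASE = "release"
    MOVE = "move"

@dataclass
class Action:
    id: int                      # ID: unique number of the action
    name: str                    # N: name / meaning of the action
    execution: Execution         # E: execution interface
    info: Optional[dict] = None  # I: grasp mode, grasp angle, path, ...

@dataclass
class TaskBehavior:
    name: str
    actions: List[Action]        # stands in for DFA_i of the behavior
    entities: List["Entity"]     # obj_1 ... obj_m; Entity as in the previous sketch
```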
By constructing the behavior library, the robot can directly invoke a task behavior from the library and execute it on the target object; that task behavior is the goal task behavior, which fulfils a given service request. During construction of the behavior library, the entities and actions in it are generalized. Generalizing an entity means making the task behavior built for a specific entity applicable to semantically higher-level entities, moving from the specific to the general; generalizing an action means classifying behavior actions, assigning each behavior in the library to the class it belongs to. This allows the robot to execute task behaviors on unknown target objects according to the generalized behavior library, that is, it lets the robot's service behaviors fit more objects and expands the robot's service range.
When generalizing an entity in the behavior library, the Lesk algorithm is used to disambiguate among the multiple synsets a word represents in WordNet, yielding the precise synset, and the hypernym relations in WordNet are then used to obtain the generalization result objGeneraList from that synset:
objGeneraList = [hypernym_1, hypernym_2, ..., hypernym_k]
where hypernym denotes a hypernym of the entity and k is the number of hypernyms in the list.
The specific generalization steps are as follows:
(1) Set the entity's part of speech to noun and obtain the synsets the entity represents in WordNet;
(2) Input a sentence that contains the entity and expresses its meaning, perform word-sense disambiguation with the Lesk algorithm, and obtain from synsets the synset that represents the entity's meaning;
(3) Obtain the generalized synsets corresponding to the entity's synset with WordNet's hypernyms function;
(4) Select the generalization height in the generalized synsets according to the context to obtain the entity's generalization result.
That is, entity generalization uses the hypernym-hyponym relations of nouns to extend a task behavior's action object from the current entity to similar entities, expanding the types of objects the task behavior can operate on. The robot thus only needs to learn and construct one task behavior for a family of similar objects, which effectively improves its learning ability.
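A minimal sketch of steps (1) through (4), assuming NLTK's WordNet interface and its Lesk implementation; the default height of 4 follows the embodiment below, while the function name and the fallback to the first noun sense are assumptions.

```python
from nltk.corpus import wordnet as wn
from nltk.wsd import lesk

def generalize_entity(entity: str, sentence: str, height: int = 4):
    tokens = sentence.split()
    # (1) + (2): restrict to noun senses and disambiguate with the Lesk algorithm
    sense = lesk(tokens, entity, pos=wn.NOUN)
    if sense is None:
        noun_senses = wn.synsets(entity, pos=wn.NOUN)
        if not noun_senses:
            return []
        sense = noun_senses[0]
    # (3) + (4): climb the hypernym chain up to the chosen generalization height
    hypernyms, current = [], sense
    for _ in range(height):
        parents = current.hypernyms()
        if not parents:
            break
        current = parents[0]
        hypernyms.append(current.name())
    return hypernyms

# e.g. generalize_entity("cup", "pour the water into the cup") returns a list of
# progressively more general noun synsets for the disambiguated sense of "cup".
```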
Referring to Fig. 2, when generalizing the actions in the behavior library, all of the robot's actions are divided into three elementary action classes: grasp, release, and move; every specific action the robot performs can be generalized into one of these three. For grasp-class actions, different grasp modes and grasp angles are set according to the position and shape of the object to be grasped. A release-class action directly releases the grasped object; it involves no private attributes and can be completed by direct execution. A move-class action moves the robot from one position to another and involves path planning, so path planning is defined as a private attribute of the move class. When executing a specific action, the service robot selects a suitable action class for the entity's attributes and then concretizes the class, that is, the generalized action is instantiated into a specific action using the path planning, grasp mode, and grasp angle required for grabbing the specific object. The grasp mode describes the bending state of the gripper's fingers and the relative positions of the fingers while grasping an object. Taking the three-finger gripper of this embodiment as an example, its grasp modes are classified as the basic mode, the wide mode, the pinch mode, and the scissor mode, as shown in Fig. 5. As Fig. 5(a) shows, in the basic mode fingers 1, 2, and 3 curl inward around a cylindrical object, with finger 1 on one side of the cylinder and the other two fingers on the opposite side. As Fig. 5(b) shows, in the wide mode fingers 1, 2, and 3 clamp an object while kept straight, with roughly equal angles between adjacent fingers. As Fig. 5(c) shows, in the pinch mode fingers 1, 2, and 3 curl inward and pinch the object with their fingertips. As Fig. 5(d) shows, in the scissor mode finger 1 and fingers 2 and 3 clamp the object from opposite sides. The robot's grasp angle is selected within the range from -180 to 180 degrees according to the structural characteristics of the gripper.
That is, action generalization decomposes all actions into grasp-class, release-class, and move-class actions. A grasp-class action is abstracted into a grasp-angle function and a grasp-mode function with parameters: the grasp-angle parameter ranges from -180 to 180 degrees and the grasp-mode parameter is the number of one of the modes described above; a move-class action is abstracted into a path-planning function with parameters. The shape of the object detected by the perception unit determines the grasp mode and grasp angle needed to grab it, thereby concretizing the grasp mode and grasp angle in the behavior action. For example, grabbing a horizontally placed thick pipe and a vertically placed thick pipe both use the basic mode, but the former is grasped at 0 degrees and the latter at 90 degrees (clockwise angles taken as positive); grabbing a vertically placed thick pipe and a vertically placed thin rod both use a 90-degree grasp angle, but the former uses the basic mode and the latter the pinch mode.
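A hedged sketch of how a generalized grasp is concretized follows; the thick-pipe and thin-rod rules mirror the examples in the preceding paragraph, while the default branch and the shape/orientation labels are assumptions made only for illustration.

```python
from enum import Enum

class GraspMode(Enum):
    BASIC = 1    # fingers curl around a cylindrical object
    WIDE = 2     # straight fingers, roughly equal spacing
    PINCH = 3    # fingertip pinch
    SCISSOR = 4  # one finger opposes the other two

def concretize_grasp(shape: str, orientation: str):
    """Return (grasp mode, grasp angle in degrees within [-180, 180])."""
    if shape == "thick_pipe":
        return GraspMode.BASIC, (0 if orientation == "horizontal" else 90)
    if shape == "thin_rod" and orientation == "vertical":
        return GraspMode.PINCH, 90
    return GraspMode.WIDE, 0  # fallback chosen for illustration only
```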
After the objects and actions in the behavior library have been generalized, each task behavior in the library can be expressed as:
GeneraAction_i = DFA_i & <objGeneraList_1, objGeneraList_2, ..., objGeneraList_m>
where DFA_i is the DFA of the task behavior, its actions are the generalized actions, and objGeneraList_m is the generalization list of each entity.
In this embodiment, the local behavior library collects the four task behaviors listed in Table 2. Each task behavior record includes its number, the name of the behavior, the entities involved in the behavior, and the numbers corresponding to those entities in the behavior library; when a behavior involves several entities, different numbers distinguish them.
Table 2. The local behavior library
The generalization height for entities is 4, and the generalization results are shown in Table 3 below. According to the entity generalization lists, the task behaviors in the behavior library can be applied to the generalized entities.
Table 3. Generalization lists of the entities in the behavior library
Cloud unit 13 fuses a knowledge graph and the cloud behavior library, where the cloud behavior library comprises the behavior libraries and generalized behavior libraries of different robots. The knowledge graph consists of ConceptNet and WordNet: ConceptNet describes the ternary relations between entities, and WordNet describes the semantic relations between entities.
Because cloud unit 13 fuses the knowledge graph and the cloud behavior library, and thus fuses different behaviors together, a robot does not have to re-learn behavior actions: it can directly select the service behavior for the relevant scenario from the behaviors stored in cloud unit 13. For example, when a robot moves from an entertainment-and-leisure scenario to an elderly-care scenario, it only needs to fetch the behaviors for that scenario from the fused cloud data, without being taught again how to provide specific behaviors for the elderly. The behaviors fused in the cloud save robot learning costs while expanding the robot's service range.
The steps by which robot system 1 controls robot 10 are shown in Fig. 3 (a condensed code sketch of this loop follows the step list):
Step S11: processor 11 receives the service request entered by the user through an input device such as a touch screen, keyboard, or microphone and obtains the goal task behavior B to be executed; for example, the spoken service request "pour water" entered through the microphone requires the goal task behavior "pour water" to be executed.
Step S12: processor 11 controls the Kinect camera in perception unit 12 to detect and locate the target objects, obtaining their three-dimensional coordinates in the robot coordinate system; for example, the target objects "Bottle" and "Cup" involved in the "pour water" service behavior must be detected and located.
Step S13: judge whether a task behavior B_j in the local behavior library is applicable to the service request, that is, whether a task behavior B_j corresponding to the goal task behavior B exists; if it exists, go to step S18, otherwise go to step S14.
Step S14: judge whether a task behavior B_i in the cloud behavior library is applicable to the service request, that is, whether a task behavior B_i corresponding to the goal task behavior B exists; if it exists, go to step S17, otherwise go to step S15.
Step S15: construct the goal task behavior B through manual input or through learning (for example with the learning approach given in the Master's thesis "Task learning and planning for a perception-enhanced robotic arm"), and go to step S16.
Step S16: generalize the entities and actions in the newly constructed goal task behavior B to obtain the entity generalization lists and the action generalization lists, which together constitute the generalized behavior of the service behavior, and go to step S17.
Step S17: update the local behavior library and/or the cloud behavior library, then go to step S18. If a task behavior in the cloud behavior library is applicable, the local behavior library is updated with the task behavior obtained in step S14; if the goal task behavior B was constructed, both the cloud and local behavior libraries are updated with the goal task behavior B obtained in step S15 and the generalized behavior obtained in step S16.
Step S18: execute the goal task behavior from the behavior library.
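The S11 through S18 loop can be condensed into the sketch below; all function and object names are hypothetical, and only the decision order (local library, then cloud library, then construct and generalize) comes from the steps above.

```python
def handle_service_request(request, perception, robot, local_lib, cloud_lib,
                           parse_goal, match, construct, generalize):
    goal = parse_goal(request)                       # S11: goal task behavior B
    targets = perception.detect_and_locate(goal)     # S12: YOLO + Kinect positioning

    behavior = match(local_lib, goal)                # S13: try the local library first
    if behavior is None:
        behavior = match(cloud_lib, goal)            # S14: then try the cloud library
        if behavior is not None:
            local_lib.update(behavior)               # S17: cache the cloud behavior locally
        else:
            behavior = construct(goal)               # S15: manual input or learning
            generalized = generalize(behavior)       # S16: entity/action generalization
            local_lib.update(behavior, generalized)  # S17: update both libraries
            cloud_lib.update(behavior, generalized)
    robot.execute(behavior, targets)                 # S18: execute on the target object
    return behavior
```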
As shown in Fig. 4, the specific steps for judging, in steps S13 and S14, whether the behavior library contains a task behavior applicable to the service request are as follows (a sketch of the distance computation follows these steps):
Step S21: judge whether the goal task behavior B exists in the behavior library; if it exists, go to step S26, otherwise go to step S22.
Step S22: use WordNet to generalize the objects in the goal task behavior B into the generalized object list L of the task behavior, applying the Lesk algorithm to disambiguate the objects in B during generalization, and go to step S23.
Step S23: traverse each task behavior B_k in the behavior library; if the objects in the generalized object list L intersect the generalized object list L_k of task behavior B_k, go to step S24; if no such behavior exists, go to step S27.
Step S24: use WordNet to compute the distance between the objects of the goal task behavior B and the generalized object lists of the behaviors B_k, select the behavior with the smallest distance (the minimum-distance task behavior), and go to step S25.
Step S25: judge whether the distance of the minimum-distance task behavior is below the preset distance threshold; if so, go to step S26, otherwise go to step S27.
Step S26: a task behavior adapted to the goal service behavior B exists in the behavior library, that is, a task behavior in the library is applicable, and the minimum-distance task behavior is adjusted according to the service request.
The adjustment of the minimum-distance task behavior is as follows:
According to the specific service request of the goal task behavior and its entity attributes, the actions of the minimum-distance task behavior are adjusted according to the grasp angle, grasp mode, and movement path of the arm and gripper to constitute the goal task behavior, that is, the corresponding grasp-mode number and grasp-angle value are chosen to concretize the grasp-mode and grasp-angle functions.
Step S27: no task behavior adapted to the goal task behavior B exists in the behavior library, that is, no task behavior in the library is applicable.
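The distance test in steps S24 and S25 can be sketched as below, assuming the WordNet path distance through the nearest common hypernym (NLTK's shortest_path_distance); the threshold value, the entity_names attribute, and the use of the first noun sense are assumptions.

```python
from nltk.corpus import wordnet as wn

def entity_distance(word_a: str, word_b: str) -> float:
    sa = wn.synsets(word_a, pos=wn.NOUN)
    sb = wn.synsets(word_b, pos=wn.NOUN)
    if not sa or not sb:
        return float("inf")
    d = sa[0].shortest_path_distance(sb[0])  # path length via the closest common ancestor
    return float("inf") if d is None else float(d)

def nearest_behavior(goal_entities, candidate_behaviors, threshold=4.0):
    """S24: pick the minimum-distance behavior; S25: accept it only under the threshold."""
    best, best_d = None, float("inf")
    for behavior in candidate_behaviors:
        d = min(entity_distance(g, e)
                for g in goal_entities for e in behavior.entity_names)
        if d < best_d:
            best, best_d = behavior, d
    return best if best_d < threshold else None  # None corresponds to step S27
```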
During operation, the cloud behavior library is updated as follows (a sketch of the edit-distance test follows these steps):
(1) Traverse the knowledge graphs of the entities in each robot's local behavior library; if an entity's knowledge graph does not exist in the cloud unit, add that entity's knowledge graph directly to the cloud unit.
(2) Traverse the task behaviors B in each robot's local behavior library and search the cloud for a behavior with the same concept as task behavior B. If none exists, add task behavior B directly to the cloud behavior library; if one exists, go to step (3). A same-concept behavior satisfies two conditions: (a) the verbs of the behavior actions are identical, and (b) the distance between the behavior entities is below the preset distance threshold. Concretely, the concept of a behavior is expressed as BehaviorConcept = <verb, entity>; for example, the behavior "eating apple" is expressed as eatingApple = <eat, apple>. Whether two behaviors share a concept is decided by comparing the verbs and the entities: in the algorithm, whether the verbs express the same concept is decided by exact string matching, and the similarity of the entities is decided by the shortest distance between the entities of the two behaviors. The shortest distance between two entities is the distance through their nearest common hypernym. If the verbs of the two behavior concepts differ, or if the shortest distance between the entities of the two behaviors exceeds the preset distance threshold, the concepts of the two behaviors are considered different.
(3) Compute the shortest distance EntityDistance between each entity of task behavior B and each entity of its same-concept behavior; if the shortest distance exceeds the entity-distance threshold, task behavior B differs from its same-concept behavior, so add task behavior B to the cloud behavior library; otherwise go to step (4).
(4) Compute the minimum edit distance BehaviorDistance between the action sequences of task behavior B and the same-concept behavior. If BehaviorDistance exceeds the preset edit-distance threshold BehaviorDistanceThreshold, the execution sequences of the two behaviors differ considerably, so add task behavior B to the cloud; otherwise task behavior B and the same-concept behavior are considered similar and task behavior B need not be added to the cloud behavior library.
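Steps (3) and (4) can be sketched as below; the entity-distance function is passed in as a parameter because it stands in for the hypernym-path distance of step (3), and both threshold values are placeholders rather than values from the patent.

```python
def action_sequence_edit_distance(seq_a, seq_b):
    """Levenshtein distance between two behaviors' action-name sequences (step (4))."""
    m, n = len(seq_a), len(seq_b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if seq_a[i - 1] == seq_b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[m][n]

def should_add_to_cloud(new_b, same_concept_b, entity_dist_fn,
                        entity_threshold=4, edit_threshold=2):
    # step (3): entity distance between the two behaviors
    if entity_dist_fn(new_b, same_concept_b) > entity_threshold:
        return True
    # step (4): compare the execution order of the two behaviors
    d = action_sequence_edit_distance(new_b.action_names, same_concept_b.action_names)
    return d > edit_threshold
```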
The control process of the robot system based on the cloud unit is as follows:
(1) Obtain the entities in the environment through visual perception and select the target entity with ConceptNet. If the perceived entity differs from the object the goal task behavior B is to operate on, use ternary relations in ConceptNet such as IsA, RelatedTo, and UsedFor to obtain an entity that satisfies the operation object of goal task behavior B, that is, the target object (see the ConceptNet query sketch after these steps). Then search the cloud behavior library for a same-concept behavior sharing the concept of goal task behavior B. If a same-concept behavior exists, compute the distance between the entities of goal task behavior B and the entities of the same-concept behavior; if the distance is below the preset entity-distance threshold EntityDistanceThreshold, the robot adjusts the actions of the same-concept behavior according to the specific service request of goal task behavior B and its entity attributes and according to the grasp angle, grasp mode, and movement path of the arm and gripper, thereby obtaining goal task behavior B, and then goes to step (3). If the task behaviors in the cloud behavior library all differ in concept from goal task behavior B, go to step (2).
(2) Construct goal task behavior B and add the constructed goal task behavior B and its generalized behavior to the cloud behavior library.
(3) Execute goal task behavior B on the target object.
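A sketch of the ConceptNet lookup in step (1) follows, using the public ConceptNet 5 web API; the relation filtering and result handling are assumptions, and in a deployment the knowledge graph fused in the cloud unit would be queried instead of the public endpoint.

```python
import requests

def related_entities(perceived: str, relation: str = "RelatedTo", limit: int = 10):
    """Query ConceptNet for terms linked to the perceived entity by one relation."""
    resp = requests.get(
        "https://api.conceptnet.io/query",
        params={"node": f"/c/en/{perceived}", "rel": f"/r/{relation}", "limit": limit},
        timeout=10,
    )
    labels = []
    for edge in resp.json().get("edges", []):
        for side in ("start", "end"):
            label = edge[side]["label"]
            if label.lower() != perceived.lower():
                labels.append(label)
    return labels

# e.g. related_entities("bottle", "UsedFor") can suggest candidate operation objects
# when the perceived entity differs from the one the goal task behavior expects.
```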

Claims (9)

1. A perception- and cognition-enhanced robot system based on cloud knowledge fusion, characterized by comprising a robot, a processor, and a perception unit, local storage unit, and cloud unit communicatively connected to the processor;
the perception unit comprises a depth camera for acquiring information about the robot's surroundings; the local storage unit stores a local behavior library and the cloud unit fuses a cloud behavior library; a behavior library is a set of task behaviors and entities, a behavior being the combination of actions the robot performs while executing a task and an entity being the object acted upon;
the processor is configured to: generalize the actions and entities in the behavior library to construct a generalized behavior library; receive a service request; detect and locate the target object corresponding to the service request based on the environment information acquired by the perception unit; obtain from the behavior library the goal task behavior suited to the service request and the target object; and control the robot to execute the goal task behavior on the target object;
the step of obtaining from the behavior library the goal task behavior suited to the service request and the target object comprises:
judging whether a task behavior in the local behavior library is applicable;
if no task behavior in the local behavior library is applicable, judging whether a task behavior in the cloud behavior library is applicable, and if one is applicable, updating the local behavior library;
if no task behavior in the cloud behavior library is applicable, constructing the goal task behavior and updating the local behavior library and the cloud behavior library.
2. The robot system according to claim 1, characterized in that the step of detecting and locating the target object corresponding to the service request based on the environment information acquired by the perception unit comprises:
receiving the image acquired by the depth camera;
detecting and locating the objects in the image with the YOLO algorithm to obtain the target object and its coordinates in the depth-camera coordinate system;
converting, via calibration, the coordinates of the target object in the depth-camera coordinate system into coordinates in the robot coordinate system.
3. The robot system according to claim 1, characterized in that the step of judging whether a task behavior in a behavior library is applicable comprises:
if the goal task behavior does not exist in the behavior library, generalizing the entities of the goal task behavior into a generalized entity list with WordNet;
if the behavior library contains task behaviors whose generalized entity lists intersect the generalized entity list of the goal task behavior, computing with WordNet the distance between the entities of the goal task behavior and the generalized entity lists of those task behaviors, and obtaining the task behavior with the smallest distance;
if the minimum distance is below a preset distance threshold, the task behavior is applicable, and the minimum-distance task behavior is adjusted according to the service request to obtain the goal task behavior.
4. The robot system according to claim 3, characterized in that the step of adjusting the minimum-distance task behavior according to the service request comprises:
adjusting the actions of the minimum-distance task behavior according to the entity attributes of the service request and the goal service behavior and according to the grasp angle, grasp mode, and movement path used by the robot when grabbing objects.
5. The robot system according to claim 3, characterized in that:
after the goal task behavior is constructed, its actions and entities are generalized to construct a generalized behavior, and the goal task behavior and the generalized behavior are added to the behavior library.
6. The robot system according to claim 1, characterized in that the step of generalizing the actions and entities in the behavior library to construct the generalized behavior library comprises:
generalizing the entities in the behavior library into generalized entity lists with WordNet, and generalizing the actions in the behavior library into generalized action lists according to the path planning, grasp mode, and grasp angle used by the robot when grabbing objects, thereby forming the robot's generalized behavior library.
7. The robot system according to any one of claims 3 to 6, characterized in that the step of generalizing an entity with WordNet comprises:
setting the part of speech of the entity to noun and obtaining the synsets the entity represents in WordNet;
inputting a sentence that contains the entity and expresses its semantics, performing word-sense disambiguation with the Lesk algorithm, and obtaining from the synsets the synset representing the entity's meaning;
obtaining the corresponding generalized synsets with WordNet's hypernyms function;
selecting the generalization height in the generalized synsets according to the context to obtain the generalization result of the entity.
8. The robot system according to claim 1, characterized in that the cloud unit also fuses a knowledge graph comprising ConceptNet and WordNet, and the step of updating the cloud unit is as follows:
traversing the knowledge graphs of the entities in each robot's local behavior library, and if the knowledge graph of an entity does not exist in the cloud unit, adding the knowledge graph of that entity to the cloud unit;
traversing the task behaviors in each robot's local behavior library, and if no behavior with the same concept as a given task behavior exists in the cloud behavior library, adding that task behavior to the cloud unit;
if a behavior with the same concept as the task behavior exists in the cloud behavior library, computing the shortest distance between each entity of the same-concept behavior and each entity of the task behavior, and if the shortest distance exceeds a preset entity-distance threshold, adding the task behavior to the cloud unit;
if the shortest distance does not exceed the preset entity-distance threshold, computing the minimum edit distance between the action sequences of the same-concept behavior and the task behavior, and if the minimum edit distance exceeds a preset edit-distance threshold, adding the behavior to the cloud unit.
9. The robot system according to claim 1, characterized in that the cloud unit also fuses a knowledge graph comprising ConceptNet and WordNet, and based on the cloud unit the processor is configured to:
select a target entity with ConceptNet from the entities the perception unit obtains from the environment, and if the perceived entity differs from the entity the goal task behavior is to operate on, use the ternary relations in ConceptNet to obtain an entity satisfying the operation object of the goal task behavior;
if a behavior with the same concept as the goal task behavior exists in the cloud behavior library, compute the entity distance between the entities of the goal task behavior and the same-concept behavior, and if the entity distance is below the preset entity-distance threshold, adjust the actions of the same-concept behavior according to the entity attributes in the service request and the goal task behavior and according to the robot's grasp angle, grasp mode, and movement path, thereby obtaining the goal task behavior;
if no behavior with the same concept as the goal task behavior exists in the cloud behavior library, construct the goal task behavior, generalize it into its generalized behavior, and add the goal task behavior and its generalized behavior to the cloud behavior library.
CN201710353598.6A 2017-05-18 2017-05-18 Perception- and cognition-enhanced robot system based on cloud knowledge fusion Active CN107291811B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710353598.6A CN107291811B (en) 2017-05-18 2017-05-18 Perception- and cognition-enhanced robot system based on cloud knowledge fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710353598.6A CN107291811B (en) 2017-05-18 2017-05-18 Perception- and cognition-enhanced robot system based on cloud knowledge fusion

Publications (2)

Publication Number Publication Date
CN107291811A CN107291811A (en) 2017-10-24
CN107291811B true CN107291811B (en) 2019-11-29

Family

ID=60094139

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710353598.6A Active CN107291811B (en) 2017-05-18 2017-05-18 Perception- and cognition-enhanced robot system based on cloud knowledge fusion

Country Status (1)

Country Link
CN (1) CN107291811B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108297116A (en) * 2018-02-24 2018-07-20 上海理工大学 A kind of intelligence machine human hand system
CN108711172B (en) * 2018-04-24 2020-07-03 中国海洋大学 Unmanned aerial vehicle identification and positioning method based on fine-grained classification
CN108748149B (en) * 2018-06-04 2021-05-28 上海理工大学 Non-calibration mechanical arm grabbing method based on deep learning in complex environment
CN109465834A (en) * 2019-01-04 2019-03-15 北京邮电大学 A kind of mechanical arm fast worktodo planing method based on planning knowledge base
CN110263342A (en) * 2019-06-20 2019-09-20 北京百度网讯科技有限公司 Method for digging and device, the electronic equipment of the hyponymy of entity
CN111424380B (en) * 2020-03-31 2021-04-30 山东大学 Robot sewing system and method based on skill learning and generalization
CN111975783B (en) * 2020-08-31 2021-09-03 广东工业大学 Robot grabbing detection method and system
CN112085122B (en) * 2020-09-21 2024-03-15 中国科学院上海微***与信息技术研究所 Ontology-based semi-supervised image scene semantic deepening method
CN113925742B (en) * 2021-10-20 2022-06-21 南通大学 Control method and control system of target-driven upper limb exoskeleton rehabilitation robot
CN117950481A (en) * 2022-10-17 2024-04-30 中国电信股份有限公司 Interactive information generation method, device and system, electronic equipment and medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101782976A (en) * 2010-01-15 2010-07-21 南京邮电大学 Automatic selection method for machine learning in cloud computing environment
CN105078449A (en) * 2015-08-24 2015-11-25 华南理工大学 Senile dementia monitoring system based on healthy service robot
CN105643607A (en) * 2016-04-08 2016-06-08 深圳市中科智敏机器人科技有限公司 Intelligent industrial robot with sensing and cognitive abilities
CN105729468A (en) * 2016-01-27 2016-07-06 浙江大学 Enhanced robot workbench based on multiple depth cameras
CN106142087A (en) * 2016-08-10 2016-11-23 东北大学 A kind of intelligent robot system based on cloud computing and control method thereof

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101782976A (en) * 2010-01-15 2010-07-21 南京邮电大学 Automatic selection method for machine learning in cloud computing environment
CN105078449A (en) * 2015-08-24 2015-11-25 华南理工大学 Senile dementia monitoring system based on healthy service robot
CN105729468A (en) * 2016-01-27 2016-07-06 浙江大学 Enhanced robot workbench based on multiple depth cameras
CN105643607A (en) * 2016-04-08 2016-06-08 深圳市中科智敏机器人科技有限公司 Intelligent industrial robot with sensing and cognitive abilities
CN106142087A (en) * 2016-08-10 2016-11-23 东北大学 A kind of intelligent robot system based on cloud computing and control method thereof

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Wisanu Jitviriya et al., "Development of Cloud Action for Seamless Robot Using Backpropagation Neural Network," The 2017 International Conference on Artificial Life and Robotics (ICAROB 2017), 2017. *
杨建华 et al., "基于贡献模型的多机器人多目标观测方法" [Multi-robot multi-target observation method based on a contribution model], 模式识别与人工智能 (Pattern Recognition and Artificial Intelligence), vol. 28, no. 4, 2015. *
杨莎 et al., "感认知增强的智能机械手***" [Perception- and cognition-enhanced intelligent manipulator system], 浙江大学学报(工学版) (Journal of Zhejiang University, Engineering Science), vol. 50, no. 6, 2016. *
王淑刚, "面向感知增强机械手臂的任务学习与规划" [Task learning and planning for a perception-enhanced robotic arm], 中国优秀硕士学位论文全文数据库信息科技辑(月刊), no. 7, 2016. *

Also Published As

Publication number Publication date
CN107291811A (en) 2017-10-24

Similar Documents

Publication Publication Date Title
CN107291811B (en) Perception- and cognition-enhanced robot system based on cloud knowledge fusion
Rusu et al. Sim-to-real robot learning from pixels with progressive nets
CN107214701B (en) A kind of livewire work mechanical arm automatic obstacle avoiding paths planning method based on movement primitive library
US20190378019A1 (en) Systems and methods enabling online one-shot learning and generalization by intelligent systems of task-relevant features and transfer to a cohort of intelligent systems
Alonso et al. Current research trends in robot grasping and bin picking
CN109960880A (en) A kind of industrial robot obstacle-avoiding route planning method based on machine learning
CN109782600A (en) A method of autonomous mobile robot navigation system is established by virtual environment
Chen et al. Deep reinforcement learning based trajectory planning under uncertain constraints
CN109483534A (en) A kind of grasping body methods, devices and systems
Auerbach et al. Dynamic resolution in the co-evolution of morphology and control
Valarezo Anazco et al. Natural object manipulation using anthropomorphic robotic hand through deep reinforcement learning and deep grasping probability network
Raessa et al. Teaching a robot to use electric tools with regrasp planning
Sanfilippo et al. A universal control architecture for maritime cranes and robots using genetic algorithms as a possible mapping approach
CN109693234A (en) Robot tumble prediction technique, device, terminal device and computer storage medium
Marchese A directional diffusion algorithm on cellular automata for robot path-planning
WO2022191414A1 (en) Parameterized waypoint generation on dynamically parented non-static objects for robotic autonomous tasks
Takamatsu et al. Learning-from-observation system considering hardware-level reusability
Pocius et al. Communicating robot goals via haptic feedback in manipulation tasks
Vanc et al. Communicating human intent to a robotic companion by multi-type gesture sentences
Crenganis et al. Inverse kinematics of a 7 DOF manipulator using adaptive neuro-fuzzy inference systems
Wendemuth et al. Towards cognitive systems for assisted cooperative processes of goal finding and strategy change
Chen et al. Kinova gemini: Interactive robot grasping with visual reasoning and conversational AI
Atanasyan et al. An architecture for ar-based human-machine interaction with application to an autonomous mobile robot platform
Jensen Co-creative Robotic Design Processes in Architecture
Sanchez et al. Towards advanced robotic manipulation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant