CN110223212A - Dispatch control method and system for a transportation robot - Google Patents

Dispatch control method and system for a transportation robot

Info

Publication number
CN110223212A
Authority
CN
China
Prior art keywords
transmission device
target
state information
target transmission
itself
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910533614.9A
Other languages
Chinese (zh)
Other versions
CN110223212B (en)
Inventor
Zhang Lei (张雷)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Noah Wood Robot Technology Co ltd
Original Assignee
Shanghai Wood Wood Robot Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Wood Wood Robot Technology Co Ltd
Priority to CN201910533614.9A
Publication of CN110223212A
Application granted
Publication of CN110223212B
Legal status: Active
Anticipated expiration

Classifications

    • G05D1/0231: Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means
    • G05D1/0242: Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G05D1/0257: Control of position or course in two dimensions, specially adapted to land vehicles, using a radar
    • G05D1/0287: Control of position or course in two dimensions, specially adapted to land vehicles, involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G06Q50/40: Business processes related to the transportation industry

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Business, Economics & Management (AREA)
  • Economics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention provides a dispatch control method and system for a transportation robot. The method comprises: detecting the robot's own working state to obtain body state information; capturing a target image that contains a transmission device and an indicator light arranged at the transmission device to display the working state of the transmission device; analyzing the target image to obtain the working state information of the transmission device; determining, from the body state information and the working state information, the target transmission device to dock with and the conveying task type; obtaining the spatial position of the target transmission device and navigating to the position of the target transmission device; and docking with the target transmission device according to the conveying task type to complete the loading or unloading of the cargo. The whole process completes transport scheduling without any participation by a dispatching party, which removes the delays that such participation introduces, improves docking efficiency and cargo conveying efficiency, and allows the system to be applied more widely.

Description

Dispatch control method and system for a transportation robot
Technical field
The present invention relates to the field of transport control technology, and in particular to a dispatch control method and system for a transportation robot.
Background art
There is a wide variety of logistics transportation systems today, each with its own advantages. With the progress and development of artificial-intelligence technology, logistics transport performed by transportation robots has become increasingly popular.
A problem with existing logistics transportation systems is that, when a transportation robot docks with a transmission device, a dispatching party such as a server must take part in unified scheduling: the dispatching party monitors the status information of every transportation robot and every transmission device, decides from that information which transmission device the current robot should dock with, and, once the cargo has been completely transferred onto the transmission device, commands the current transportation robot to leave the device it docked with. The entire logistics transport process therefore requires the participation of the dispatching party from start to finish.
In the prior art, scheduling throughout the whole process requires the participation of a dispatching party before transport can be completed, which lowers cargo conveying efficiency and docking efficiency and therefore limits how widely such systems can be applied; solving this problem is an urgent need.
Summary of the invention
The object of the present invention is to provide a dispatch control method and system for a transportation robot in which the whole process of transport scheduling is completed without any participation by a dispatching party, so that the delays introduced by such participation are removed, docking efficiency and cargo conveying efficiency are improved, and the system can be applied more widely.
The technical solution provided by the invention is as follows:
The present invention provides a dispatch control method for a transportation robot, comprising the steps of:
detecting the robot's own working state to obtain body state information;
capturing a target image, the target image containing a transmission device and an indicator light that is arranged at the transmission device and displays the working state of the transmission device;
analyzing the target image to obtain the working state information of the transmission device;
determining, from the body state information and the working state information, the target transmission device to dock with and the conveying task type;
obtaining the spatial position of the target transmission device and navigating to the position of the target transmission device;
docking with the target transmission device according to the conveying task type to complete the loading or unloading of the cargo.
Further, analyzing the target image to obtain the working state information of the transmission device specifically comprises the steps of:
performing indicator-light state recognition on the target image with a preset neural network model, using a visual detection algorithm;
obtaining, from the indicator-light state recognition result, the working state information of the transmission device;
wherein the indicator-light state includes an on/off state and a shape-and-color state, and the shape-and-color state includes the indicator-light color state and the indicator-light shape state.
Further, after analyzing the target image to obtain the working state information of the transmission device, and before determining the target transmission device and the conveying task type from the body state information and the working state information, the method comprises the step of:
sharing with the other transportation robots the working state information of the transmission devices that each robot has analyzed.
Further, determining the target transmission device and the conveying task type from the body state information and the working state information specifically comprises the steps of:
judging from the body state information whether the robot is in a to-be-loaded state or a to-be-unloaded state;
when the robot is in the to-be-loaded state, determining the transmission device whose working state information is the to-be-sent state as the target transmission device, and determining the conveying task type as cargo reception;
when the robot is in the to-be-unloaded state, determining the transmission device whose working state information is the to-be-received state as the target transmission device, and determining the conveying task type as cargo delivery.
Further, the method comprises the steps of:
when the analysis of the body state information and the working state information yields at least two candidate transmission devices that match the robot's own working state, calculating the distance between the robot and each candidate transmission device;
comparing all the distance values and determining the candidate transmission device with the smallest distance value as the target transmission device.
Further, obtaining the spatial position of the target transmission device and navigating to the position of the target transmission device specifically comprises the steps of:
detecting and identifying, by a visual detection algorithm, at least four target semantic points of the target transmission device in the target image, a target semantic point being a fixed, highly recognizable point on the target transmission device;
calculating the first spatial position of the target transmission device from the dimension information of the target transmission device;
navigating to the position of the target transmission device according to the first spatial position.
Further, obtaining the spatial position of the target transmission device and navigating to the position of the target transmission device specifically comprises the steps of:
emitting a detection laser towards the support legs of the target transmission device and obtaining the laser coordinates of each support leg in the laser coordinate system;
calculating the second spatial position of the target transmission device from the laser coordinates;
navigating to the position of the target transmission device according to the second spatial position.
The present invention also provides a dispatch control system for transportation robots, comprising several transportation robots and transmission devices, each transmission device being equipped with an indicator light that displays its own working state, and each transportation robot comprising an image acquisition module, a processing module, a detection module, an analysis module, a control module and an execution module, wherein:
the detection module detects the robot's own working state to obtain body state information;
the image acquisition module captures a target image containing a transmission device and the indicator light arranged at the transmission device;
the processing module is connected to the image acquisition module and analyzes the target image to obtain the working state information of the transmission device;
the analysis module is connected to the processing module and the detection module and determines, from the body state information and the working state information, the target transmission device to dock with and the conveying task type;
the control module is connected to the analysis module, obtains the spatial position of the target transmission device and navigates to the position of the target transmission device;
the execution module is connected to the analysis module and, after the robot has moved to the position of the target transmission device, docks with the target transmission device according to the conveying task type to complete the loading or unloading of the cargo.
Further, the processing module comprises a first image recognition unit and a first processing unit:
the first image recognition unit performs indicator-light state recognition on the target image with a preset neural network model, using a visual detection algorithm;
the first processing unit is connected to the first image recognition unit and obtains the working state information of the transmission device from the indicator-light state recognition result;
wherein the indicator-light state includes an on/off state and a shape-and-color state, and the shape-and-color state includes the indicator-light color state and the indicator-light shape state.
Further, each transportation robot also comprises a wireless communication module;
the wireless communication module connects to the wireless communication modules of the other transportation robots to share the working state information of the transmission devices that each robot has analyzed.
Further, the analysis module comprises a judging unit and a first determination unit:
the judging unit judges from the body state information whether the robot is in a to-be-loaded state or a to-be-unloaded state;
the first determination unit is connected to the judging unit; when the robot is in the to-be-loaded state it determines the transmission device whose working state information is the to-be-sent state as the target transmission device and sets the conveying task type to cargo reception, and when the robot is in the to-be-unloaded state it determines the transmission device whose working state information is the to-be-received state as the target transmission device and sets the conveying task type to cargo delivery.
Further, the analysis module also comprises a second processing unit and a second determination unit:
the second processing unit, when the analysis of the body state information and the working state information yields at least two candidate transmission devices that match the robot's own working state, calculates the distance between the robot and each candidate transmission device and compares all the distance values;
the second determination unit is connected to the second processing unit and determines the candidate transmission device with the smallest distance value as the target transmission device.
Further, the control module comprises a second image recognition unit, a third processing unit and a first navigation-movement unit:
the second image recognition unit detects and identifies, by a visual detection algorithm, at least four target semantic points of the target transmission device in the target image, a target semantic point being a fixed, highly recognizable point on the target transmission device;
the third processing unit is connected to the second image recognition unit and calculates the first spatial position of the target transmission device from the dimension information of the target transmission device;
the first navigation-movement unit is connected to the third processing unit and navigates to the position of the target transmission device according to the first spatial position.
Further, the control module comprises a laser detection unit, a fourth processing unit and a second navigation-movement unit:
the laser detection unit emits a detection laser towards the support legs of the target transmission device and obtains the laser coordinates of each support leg in the laser coordinate system;
the fourth processing unit is connected to the laser detection unit and calculates the second spatial position of the target transmission device from the laser coordinates;
the second navigation-movement unit is connected to the fourth processing unit and navigates to the position of the target transmission device according to the second spatial position.
With the dispatch control method and system for a transportation robot provided by the invention, transport scheduling is completed throughout without any participation by a dispatching party, which removes the delays that such participation introduces, improves docking efficiency and cargo conveying efficiency, and allows the system to be applied more widely.
Brief description of the drawings
In the following, preferred embodiments are described with reference to the drawings in a clear and easily understandable manner, to further explain the above characteristics, technical features, advantages and implementation of the dispatch control method and system for a transportation robot.
Fig. 1 is a flowchart of one embodiment of the dispatch control method for a transportation robot according to the present invention;
Fig. 2 is a structural schematic diagram of the transmission device in the present invention;
Fig. 3 is a schematic diagram of the laser coordinate system and the world coordinate system in the present invention;
Fig. 4 is a structural schematic diagram of one embodiment of the dispatch control system for transportation robots according to the present invention.
Detailed description of the embodiments
In order to explain the embodiments of the invention, or the technical solutions in the prior art, more clearly, specific embodiments of the invention are described below with reference to the drawings. The drawings described below are obviously only some embodiments of the invention; a person of ordinary skill in the art can obtain other drawings, and other embodiments, from them without creative effort.
For simplicity, each figure shows only schematically the parts relevant to the invention, and the figures do not represent the actual structure of the product. In addition, where several components in a figure have the same structure or function, only one of them is drawn or labeled, again for simplicity. Herein, "one" does not only mean "only this one" but can also mean "more than one".
In one embodiment of the invention, as shown in Fig. 1 and Fig. 2, a dispatch control method for a transportation robot 1 comprises:
S100: detecting the robot's own working state to obtain body state information;
S200: capturing a target image, the target image containing a transmission device 2 and an indicator light 21 that is arranged at the transmission device 2 and displays the working state of the transmission device 2;
S300: analyzing the target image to obtain the working state information of the transmission device 2;
Specifically, the transportation robot 1 queries its own working state to obtain body state information. The body state information of the transportation robot 1 includes: a to-be-loaded state, in which the robot is idle and waits for a transmission device to unload cargo onto it; a to-be-unloaded state, in which the robot is idle and carries cargo that waits to be unloaded onto a transmission device; a cargo-loading execution state, in which the robot is docked with a transmission device and cargo is being loaded; or a cargo-unloading execution state, in which the robot is docked with a transmission device and cargo is being unloaded. In addition, the body state information may also include available-resource information derived from the robot's own system resources (including battery and CPU resources), i.e. remaining-battery information, remaining-CPU information and the like.
An image acquisition module 11 is provided on the transportation robot 1 and is mounted at a fixed position on the front side of the robot body, so that it can capture images within the shooting range in front of the robot. The image acquisition module 11 includes a camera, a video camera, a depth camera or the like. The transportation robot 1 controls the image acquisition module 11 to capture a target image that contains a transmission device 2 and the indicator light 21 arranged at the transmission device 2 to display the working state of the transmission device 2. Before the target image is formally captured, the lens is adjusted so that an out-of-focus image does not degrade the quality of the captured target image. The target image may be a still picture or an image frame obtained by shot segmentation of a video. After the transportation robot 1 has acquired the target image, it pre-processes the image; image pre-processing includes grayscale conversion, binarization, filtering and the like, which are prior art and are not repeated here. The pre-processed target image is then analyzed to obtain the working state information of each transmission device 2. The working state information of a transmission device includes: a to-be-received state, in which the transmission device 2 is idle and waits for a transportation robot 1 to unload cargo onto it; a to-be-sent state, in which the transmission device 2 is idle and holds cargo that waits to be unloaded onto a transportation robot 1; a cargo-loading execution state, in which the transmission device is docked with a transportation robot 1 and cargo is being loaded; or a cargo-unloading execution state, in which the transmission device is docked with a transportation robot 1 and cargo is being unloaded.
S400: determining, from the body state information and the working state information, the target transmission device 2 to dock with and the conveying task type;
S500: obtaining the spatial position of the target transmission device 2 and navigating to the position of the target transmission device 2;
S600: docking with the target transmission device 2 according to the conveying task type to complete the loading or unloading of the cargo.
Specifically, from the body state information it has acquired and the working state information it has analyzed, each transportation robot 1 determines its conveying task type and its docking target, i.e. its target transmission device 2. The transportation robot 1 then locates its own position and obtains the spatial position of the target transmission device 2, autonomously plans a path from these two positions to generate a movement route, and moves along the generated route to the position of the target transmission device 2. After the transportation robot 1 has reached the position of the target transmission device 2, the two exchange information: the transportation robot 1 sends a loading/unloading-ready trigger signal to the target transmission device 2, so that the transportation robot 1 and the target transmission device 2 dock with each other and complete the cargo-loading or cargo-unloading docking operation between them.
In this embodiment it must be decided whether the transportation robot 1 is to receive cargo from a transmission device 2 or to deliver cargo to a transmission device 2, so each transportation robot 1 obtains its own body state information. Since a transmission device 2 may be in either a cargo-delivery state or a cargo-reception state, and since application scenarios such as hospitals, logistics warehouses, supermarkets or libraries may contain several transmission devices 2 whose placement may be relatively concentrated, the working state information of every transmission device 2 must be obtained by analysis. From its own body state information and the working state information of the transmission devices 2, the transportation robot 1 itself matches and finds the target transmission device 2 to dock with, formulates and generates the corresponding cargo transport task, locates the spatial position of that target transmission device 2, and navigates to it to carry out the cargo loading or unloading docking operation. The invention combines the transportation robot 1 and the transmission device 2 organically: the docking object is determined autonomously, the robot navigates autonomously to the destination to complete the loading or unloading of cargo, the two complement each other, the overall efficiency rises, and the system reaches a wider range of applications. The docking problem between the transportation robot 1 and the transmission device 2 is solved effectively, and because the whole process completes transport scheduling without the participation of a dispatching party, the delays that such participation introduces are removed, docking efficiency and cargo conveying efficiency improve, and the applicability of the system increases.
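For illustration only, the self-scheduling cycle of steps S100-S600 can be sketched in Python as below; the state names, method names and robot interface are assumptions made for this sketch and are not defined by the patent.

```python
from enum import Enum, auto

class RobotState(Enum):
    TO_BE_LOADED = auto()    # idle, waiting for a transmission device to load cargo onto the robot
    TO_BE_UNLOADED = auto()  # idle, carrying cargo that waits to be handed to a transmission device
    LOADING = auto()         # docked, cargo loading in progress
    UNLOADING = auto()       # docked, cargo unloading in progress

class DeviceState(Enum):
    TO_BE_SENT = auto()      # transmission device holds cargo waiting to be sent to a robot
    TO_BE_RECEIVED = auto()  # transmission device is free and waiting to receive cargo
    EXECUTING = auto()       # transmission device is currently loading or unloading

def dispatch_cycle(robot):
    """One pass of the self-scheduling loop, following steps S100-S600."""
    body_state = robot.detect_own_state()                          # S100
    frame = robot.capture_image()                                  # S200
    device_states = robot.analyze_indicator_lights(frame)          # S300
    target, task = robot.match_target(body_state, device_states)   # S400
    if target is None:
        return                                                     # no matching device this cycle
    position = robot.locate(target)                                # S500 (vision- or laser-based)
    robot.navigate_to(position)
    robot.dock_and_transfer(target, task)                          # S600: load or unload the cargo
```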
Based on the previous embodiment, analyzing the target image to obtain the working state information of the transmission device 2 specifically comprises the steps of:
S210: performing indicator-light state recognition on the target image with a preset neural network model, using a visual detection algorithm;
S220: obtaining the working state information of the transmission device 2 from the indicator-light state recognition result;
wherein the indicator-light state includes an on/off state and a shape-and-color state, and the shape-and-color state includes the indicator-light color state and the indicator-light shape state.
Specifically, after the transportation robot 1 has captured the target image in real time, the image is pre-processed, and indicator-light state recognition is performed on the pre-processed part of the target image that contains the indicator light 21, yielding an indicator-light state recognition result. Preferably, because the external environment strongly affects the color of the indicator light 21 (image pre-processing may even filter the color of the indicator light 21 out as noise), the indicator light 21 shows the current working state of the transmission device 2 not only through different color states but also, as an effective supplement, through different shapes of the indicator light 21. This excludes the interference of external factors such as illumination and of factors such as fading of the indicator-light color, so that the working state information of the transmission device 2 can be recognized accurately under a variety of external and internal influences. The transportation robot 1 can then, on the basis of more accurate and reliable working state information combined with its body state information, find the target transmission device 2 that matches its own working state more precisely, which raises the accuracy with which the robot autonomously schedules and completes cargo docking; because docking accuracy improves, the probability of mistaken docking and loading/unloading operations drops, which in turn indirectly improves docking efficiency and cargo conveying efficiency. Typically, the indicator-light color states include red, green, yellow, blue and so on; the indicator-light shape states include round, square, triangular and so on; and the on/off states are the lit state and the unlit state of the indicator light 21. Indicator-light state recognition is performed with an existing visual detection algorithm: for example, the preset neural network model may take Faster R-CNN as its basic structure with MobileNetV2 as the back-end network, and this preset neural network model identifies and localizes the indicator light 21 in the target image and then recognizes its state. The preset neural network model may equally take R-CNN (or SPP-NET, Fast R-CNN, YOLO, SSD) as its basic structure with MobileNetV2 (or MobileNetV1) as the back-end network.
In this embodiment, the preset neural network model, driven by the visual detection algorithm, localizes the indicator light 21 and recognizes its state, so that the indicator-light state recognition result is obtained effectively, quickly and accurately and the detection performance is good. The transportation robot 1 can then derive the working state information of the transmission device 2 from that result, match and search for the target transmission device 2 by itself from the body state information and the working state information, formulate and generate the corresponding cargo transport task, and complete the loading or unloading operation precisely and reliably. The transportation robot 1 moves towards the position of the target transmission device 2, docks with it on arrival, and completes the loading or unloading of the cargo without human participation, which lowers cost and raises cargo conveying efficiency.
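A minimal sketch of the indicator-light recognition step is given below, assuming a detector custom-trained on indicator-light classes; torchvision's stock MobileNetV3-backed Faster R-CNN is used only as a stand-in for the MobileNetV2 back end named in the text, and the label map and weights path are hypothetical.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor

# Hypothetical label map for a custom-trained detector: one class per light color / on-off state.
CLASSES = {1: ("red", "on"), 2: ("red", "off"), 3: ("green", "on"), 4: ("green", "off")}

def load_detector(weights_path, num_classes=len(CLASSES) + 1):
    # Stand-in backbone: torchvision ships a MobileNetV3 variant, not the MobileNetV2 named in the text.
    model = torchvision.models.detection.fasterrcnn_mobilenet_v3_large_fpn(num_classes=num_classes)
    model.load_state_dict(torch.load(weights_path, map_location="cpu"))
    return model.eval()

@torch.no_grad()
def recognize_indicator_lights(model, bgr_frame, score_threshold=0.7):
    """Return (color, on/off) pairs for every indicator light detected in one camera frame."""
    tensor = to_tensor(bgr_frame[:, :, ::-1].copy())   # BGR numpy frame (e.g. from OpenCV) -> RGB tensor in [0, 1]
    detections = model([tensor])[0]                     # dict with "boxes", "labels", "scores"
    lights = []
    for label, score in zip(detections["labels"].tolist(), detections["scores"].tolist()):
        if score >= score_threshold and label in CLASSES:
            lights.append(CLASSES[label])
    return lights
```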
In another embodiment of the invention, a dispatch control method for a transportation robot 1 comprises:
S100: detecting the robot's own working state to obtain body state information;
S200: capturing a target image, the target image containing a transmission device 2 and an indicator light 21 that is arranged at the transmission device 2 and displays the working state of the transmission device 2;
S300: analyzing the target image to obtain the working state information of the transmission device 2;
S301: sharing with the other transportation robots 1 the working state information of the transmission devices 2 that each robot has analyzed;
S400: determining, from the body state information and the working state information, the target transmission device 2 to dock with and the conveying task type;
S500: obtaining the spatial position of the target transmission device 2 and navigating to the position of the target transmission device 2;
S600: docking with the target transmission device 2 according to the conveying task type to complete the loading or unloading of the cargo.
Specifically, the parts that are the same as in the previous embodiment are not repeated here. Compared with the previous embodiment, every robot exchanges information with the other robots in addition to working on its own, so that the working state information of the transmission devices 2 analyzed by each robot is shared: within the preset scene area the transportation robots 1 share with one another the working state information each of them has obtained. Once this sharing of working state information between the transportation robots 1 is achieved, each transportation robot 1 can capture fewer target images of the transmission devices 2 and their indicator lights 21 around itself, the probability that several robots repeatedly analyze the working state of the same transmission device 2 at the same moment drops, the amount of unproductive work falls, and the system resources wasted on unproductive work are reduced. Moreover, because the working state information is shared among the transportation robots 1, the real-time working states of all transmission devices 2 in a large application scene are tracked, each robot's blind-spot rate for working state information falls, the docking success rate between the transportation robots 1 and the transmission devices 2 in the scene rises, the idle rate of robots and devices in the overall system falls, and docking efficiency and cargo conveying efficiency are indirectly improved. An illustrative sketch of such state sharing is given after this paragraph.
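The sketch below shows one possible way to broadcast and merge the shared working state information; the UDP broadcast transport, port number and message format are assumptions made for illustration, not part of the patent.

```python
import json
import socket
import time

BROADCAST_PORT = 47001   # hypothetical port, chosen only for this sketch

def broadcast_device_states(robot_id, device_states):
    """Send this robot's latest view of every transmission device to the other robots."""
    payload = json.dumps({
        "robot": robot_id,
        "timestamp": time.time(),
        "devices": device_states,          # e.g. {"device_3": "TO_BE_SENT"}
    }).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(payload, ("255.255.255.255", BROADCAST_PORT))

def merge_shared_states(local_view, payload):
    """Keep the newest report per transmission device so each robot's blind-spot rate shrinks."""
    ts = payload["timestamp"]
    for device, state in payload["devices"].items():
        if device not in local_view or ts > local_view[device][1]:
            local_view[device] = (state, ts)
    return local_view
```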
Based on the previous embodiments, S400 (determining, from the body state information and the working state information, the target transmission device 2 to dock with and the conveying task type) specifically comprises the steps of:
S410: judging from the body state information whether the robot is in a to-be-loaded state or a to-be-unloaded state;
S420: when the robot is in the to-be-loaded state, determining the transmission device 2 whose working state information is the to-be-sent state as the target transmission device 2, and determining the conveying task type as cargo reception;
S430: when the robot is in the to-be-unloaded state, determining the transmission device 2 whose working state information is the to-be-received state as the target transmission device 2, and determining the conveying task type as cargo delivery.
Specifically, the transportation robot 1 analyzes its own body state information and judges whether it is currently in the to-be-loaded state. If the transportation robot 1 is currently in the to-be-loaded state, the judging stops; otherwise it judges whether it is currently in the to-be-unloaded state. The order may of course be reversed: first judge whether the robot is currently in the to-be-unloaded state, stop if it is, and otherwise judge whether it is in the to-be-loaded state.
Once the current transportation robot 1 has judged that it is in the to-be-loaded state, it examines the working state information of all transmission devices 2 and checks, one at a time, whether each transmission device 2 is currently in the to-be-sent state. If the current transmission device 2 is in the to-be-sent state, the judging stops, that transmission device 2 is determined as the target transmission device 2, and the conveying task type of the transportation robot 1 is determined as cargo reception. Otherwise the robot moves on to the next transmission device 2 and continues judging until it has determined one transmission device 2 as the target transmission device 2.
Similarly, when the current transportation robot 1 judges that it is in the to-be-unloaded state, it examines the working state information of all transmission devices 2 and checks whether each transmission device 2 is currently in the to-be-received state. If the current transmission device 2 is in the to-be-received state, the judging stops, that transmission device 2 is determined as the target transmission device 2, and the conveying task type of the transportation robot 1 is determined as cargo delivery. Otherwise the robot moves on to the next transmission device 2 and continues judging until it has determined one transmission device 2 as the target transmission device 2.
For example, as shown in Fig. 2, several indicator lights 21 may be arranged transversely or longitudinally. According to the arrangement direction, the indicator lights 21 of different colors (or different shapes) correspond to different lit regions in the target image, which makes color recognition (or shape recognition) of the indicator lights 21 easier. Suppose two indicator lights 21 are arranged side by side, a red indicator light 21 (or a round indicator light 21) and a green indicator light 21 (or a square indicator light 21), and suppose that when the red (round) light is on and the green (square) light is off, the transmission device 2 is in the to-be-sent state, and otherwise the transmission device 2 is in the to-be-received state. The transportation robot 1 captures the target image and performs color recognition (or shape recognition) and on/off recognition of the indicator lights 21 on it. If the red (round) indicator light 21 is on, the robot recognizes a lit red (round) light, determines the transmission device 2 whose red (round) light is lit as the target transmission device 2 to dock with, and determines its conveying task type as cargo reception. If the green (square) indicator light 21 is on, the robot recognizes a lit green (square) light, determines the transmission device 2 whose green (square) light is lit as the target transmission device 2 to dock with, and determines its conveying task type as cargo delivery.
In this embodiment, the transportation robot 1 can, from its own body state information and the working state information of the transmission devices 2, autonomously formulate and generate the conveying task, so that the transportation robot 1 and the transmission device 2 interact, dock and complete the cargo dispatch and transport work. The whole process needs no dispatching party, the delays such participation introduces are removed, docking efficiency and cargo conveying efficiency improve, and the system can be applied more widely.
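The state-matching rule of steps S410-S430 can be sketched as a small Python function; the state and task-type names are assumptions introduced for the sketch.

```python
def match_target(robot_state, device_states):
    """Pick the transmission device whose state complements the robot's own state (S410-S430).

    device_states maps a device id to "TO_BE_SENT" or "TO_BE_RECEIVED"; the state
    names are assumptions made for this sketch.
    """
    if robot_state == "TO_BE_LOADED":
        wanted, task = "TO_BE_SENT", "cargo_reception"
    elif robot_state == "TO_BE_UNLOADED":
        wanted, task = "TO_BE_RECEIVED", "cargo_delivery"
    else:
        return None, None            # robot is already executing a transfer
    for device_id, state in device_states.items():
        if state == wanted:
            return device_id, task   # stop at the first matching device, as in S420/S430
    return None, None
```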
Based on the previous embodiments, the method further comprises the steps of:
S401: when the analysis of the body state information and the working state information yields at least two candidate transmission devices 2 that match the robot's own working state, calculating the distance between the robot and each candidate transmission device 2;
S402: comparing all the distance values and determining the candidate transmission device 2 with the smallest distance value as the target transmission device 2.
Specifically, if the application scene contains several transmission devices 2, the transportation robot 1 may, from its body state information and the working state information of the transmission devices 2, find at least two candidate transmission devices 2 that match its own cargo loading/unloading state: when the transportation robot 1 is in the to-be-loaded state there may be at least two transmission devices 2 in the to-be-sent state, and when it is in the to-be-unloaded state there may be at least two transmission devices 2 in the to-be-received state. In that case the transportation robot 1 can communicate with every candidate transmission device 2 that matches its own working state, detect the signal strength of each communication link, and calculate from the signal strength the distance between itself and each candidate transmission device 2. Alternatively, the robot can emit a detection signal (such as a laser or infrared signal) towards each candidate transmission device 2 and, after receiving the reflected detection signal, calculate the distance between the transportation robot 1 and each candidate transmission device 2 from the emission time, the reception time and the propagation speed of the detection signal. In short, whichever way it is obtained, any method of calculating the distance between the transportation robot 1 and each candidate transmission device 2 falls within the scope of the invention. After calculating the distances to all candidate transmission devices 2, the transportation robot 1 compares all the distance values and takes the candidate transmission device 2 with the smallest distance value as the target transmission device 2.
This embodiment prevents the transportation robot 1 from freezing when it judges that there are at least two candidate transmission devices 2 and cannot choose the target transmission device 2 with which to dock and complete the loading or unloading. With the above approach, when at least two candidate transmission devices 2 exist, the robot autonomously selects the candidate with the smallest distance value as the target transmission device 2, which guarantees that each transportation robot 1 transports cargo over the shortest distance throughout the process and so raises cargo conveying efficiency.
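A minimal sketch of the tie-breaking rule of S401/S402, assuming planar positions are already available for every candidate:

```python
import math

def pick_nearest_candidate(robot_xy, candidates):
    """candidates maps a device id to an (x, y) position; return the id of the closest one.

    The distance values may equally come from radio signal strength or from laser/infrared
    time of flight, as the text allows; planar Euclidean distance keeps the sketch short.
    """
    if not candidates:
        return None
    return min(
        candidates,
        key=lambda dev: math.hypot(candidates[dev][0] - robot_xy[0],
                                   candidates[dev][1] - robot_xy[1]),
    )
```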
Based on the previous embodiments, S500 (obtaining the spatial position of the target transmission device 2 and navigating to the position of the target transmission device 2) specifically comprises the steps of:
S510: detecting and identifying, by a visual detection algorithm, at least four target semantic points 22 of the target transmission device 2 in the target image, the target semantic points 22 being vertices of the largest contour of the target transmission device 2 and being non-coplanar;
S520: calculating the first spatial position of the target transmission device 2 from the dimension information of the target transmission device 2;
S530: navigating to the position of the target transmission device 2 according to the first spatial position.
Specifically, as shown in Fig. 2, after the transportation robot 1 has determined the target transmission device 2, it performs image recognition on the target image corresponding to the determined target transmission device 2. Because in a free scene the angle at which the target transmission device 2 appears in the captured target image may vary, the visual detection algorithm first identifies the target transmission device 2 and then regresses and identifies its target semantic points 22. A semantic point is a describable, fixed and highly recognizable point on the transmission device 2; it can describe where the transmission device 2 is located in the application scene, and a target semantic point is simply a semantic point on the target transmission device 2. Because the transmission device 2 does not change, the spatial positions of its semantic points relative to the application scene are fixed, and because semantic points are highly recognizable they are easy to regress and identify in subsequent image recognition compared with other points on the transmission device 2. The more semantic points there are, and the more accurate they are (i.e. the closer they lie to the vertices of the largest contour of the transmission device 2), the more accurately the transportation robot 1 can calculate the first spatial position of the target transmission device 2.
After the transportation robot 1 has captured the target image, an existing visual detection algorithm localizes and identifies the target transmission device 2 and then localizes and identifies its at least four target semantic points 22; the regression of the target semantic points 22 is similar to the regression of facial landmarks on a human face. For example, the second preset neural network model may take Faster R-CNN as its basic structure with MobileNetV2 as the back-end network; this second preset neural network model identifies and localizes the target transmission device 2 in the target image and then localizes the at least four target semantic points 22. The second preset neural network model may equally take R-CNN (or SPP-NET, Fast R-CNN, YOLO, SSD) as its basic structure with MobileNetV2 (or MobileNetV1) as the back-end network. Because the physical dimensions of the target transmission device 2 are known, the camera calibration result can be used with the EPnP algorithm to compute directly the spatial coordinates of the at least four target semantic points 22 relative to the image acquisition module 11, i.e. the spatial coordinates of the target transmission device 2 relative to the image acquisition module 11. Since the image acquisition module 11 is mounted at a fixed position on the transportation robot 1, the spatial coordinates of the transportation robot 1 relative to the image acquisition module 11 can also be computed; and since the pixel coordinates at which the target transmission device 2 is imaged by the image acquisition module 11 are known, the conversion between the world coordinate system and the camera coordinate system yields the spatial coordinates of the target transmission device 2 in the world coordinate system. Because the origin of the world coordinate system is known, the first spatial position of the target transmission device 2 in the application scene is obtained.
Through this embodiment, the constructed preset neural network model performs cascaded regression through its basic structure and back-end network: it first performs a coarse regression to identify and localize the target object, the target transmission device 2, and then a fine regression to localize the target semantic points 22. This coarse-to-fine cascade of regressors effectively prevents over-fitting and greatly improves both localization speed and localization quality.
The training of the first preset neural network model and the second preset neural network model is prior art. As an example, a target frame for the target transmission device 2 and four target semantic points 22 are defined in advance, training sample images annotated with the target transmission device 2 and the four defined target semantic points 22 are obtained, the second preset neural network model is built and trained with these sample images, and recognition is then performed with the trained second preset neural network model. The first preset neural network model is built and trained in the same way and is not described in detail here.
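For illustration, the EPnP pose computation of step S520 can be sketched with OpenCV as below; the function name and the assumption that the semantic-point coordinates are expressed in the device's own frame are introduced only for this sketch.

```python
import cv2
import numpy as np

def locate_device_epnp(object_points, image_points, camera_matrix, dist_coeffs):
    """Recover the pose of the target transmission device in the camera frame.

    object_points: (N, 3) coordinates of the semantic points in the device's own frame,
    known from its dimensions; image_points: (N, 2) pixel coordinates regressed by the
    detector, with N >= 4.  Returns a 4x4 camera-from-device homogeneous transform.
    """
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(object_points, dtype=np.float64),
        np.asarray(image_points, dtype=np.float64),
        camera_matrix, dist_coeffs, flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        raise RuntimeError("EPnP failed; check the semantic point correspondences")
    rotation, _ = cv2.Rodrigues(rvec)
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = tvec.ravel()
    return pose
```

Chaining this camera-from-device transform with the fixed mounting transform of the image acquisition module 11 and the robot's own world pose then yields the first spatial position of the target transmission device 2 in the world coordinate system, as described above.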
Based on the previous embodiments, S500 (obtaining the spatial position of the target transmission device 2 and navigating to the position of the target transmission device 2) specifically comprises the steps of:
S540: emitting a detection laser towards the support legs 23 of the target transmission device 2 and obtaining the laser coordinates of each support leg 23 in the laser coordinate system;
S550: calculating the second spatial position of the target transmission device 2 from the laser coordinates;
S560: navigating to the position of the target transmission device 2 according to the second spatial position.
Specifically, as shown in Fig. 2, a laser transceiver mounted on the transportation robot 1 emits a detection laser towards the support legs 23 of the target transmission device 2. As shown in Fig. 3, Ow-XwYwZw is defined as the world coordinate system and Oc-XcYcZc as the laser coordinate system, with the laser emission direction as the x-axis and the laser scanning direction as the y-axis; the x-axis and y-axis form the scanning plane, and the z-axis is perpendicular to the scanning plane. If P is the center point of any support leg 23 of the target transmission device 2, the transportation robot 1 can calculate the spatial coordinates of that support-leg center point in the world coordinate system. The relationship between the laser coordinate system and the world coordinate system is [Xw, Yw, Zw, 1]^T = R_T [m, n, 1]^T.
Here R_T is the transformation matrix, composed of the laser rotation matrix and the laser translation matrix; [m, n, 1] are the homogeneous laser coordinates of the laser point p' corresponding to point P in the laser coordinate system, and [Xw, Yw, Zw, 1] are the homogeneous world coordinates of point P in the world coordinate system. The laser rotation matrix and the laser translation matrix can be computed from several pairs of laser coordinates and world coordinates; this is prior art and is not repeated here. The world coordinate system is constructed only to describe the spatial positions of the laser transceiver and the target transmission device 2 more conveniently. Because the laser transceiver is fixed on the transportation robot 1, the spatial position of the transportation robot 1 in the application scene is known, and the laser coordinates of the center point 231 of each support leg 23 of the target transmission device 2 in the laser coordinate system are known, the conversion between the world coordinate system and the laser coordinate system yields the spatial coordinates of the center point 231 of each support leg 23 of the target transmission device 2 in the world coordinate system; and because the origin of the world coordinate system is known, the second spatial position of the center points 231 of the support legs 23 of the target transmission device 2 in the application scene is obtained.
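A minimal numpy sketch of this coordinate conversion follows; it rewrites the rotation/translation relation above as a single 4x4 homogeneous transform, which is an equivalent formulation assumed only for the sketch.

```python
import numpy as np

def laser_point_to_world(T_world_from_laser, m, n):
    """Map a scan point (m, n) in the laser scanning plane to world coordinates.

    T_world_from_laser is a 4x4 homogeneous transform obtained by chaining the robot's
    world pose with the fixed mounting pose of the laser unit; it plays the role of the
    rotation/translation relation in the text, written here as one matrix.
    """
    p_laser = np.array([m, n, 0.0, 1.0])   # z = 0 because the point lies in the scan plane
    return (T_world_from_laser @ p_laser)[:3]

def support_leg_center_world(T_world_from_laser, leg_scan_points):
    """Average the scan points that hit one support leg 23 to estimate its center point 231."""
    pts = np.array([laser_point_to_world(T_world_from_laser, m, n) for m, n in leg_scan_points])
    return pts.mean(axis=0)
```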
Through this embodiment, positioning with a laser allows precise positioning to be carried out in any place and in any environment; the approach is highly adaptable and highly accurate.
In one embodiment of the invention, as shown in Fig. 4, a dispatch control system for transportation robots 1 comprises several transportation robots 1 and transmission devices 2; each transmission device 2 is equipped with an indicator light 21 that displays its own working state; and each transportation robot 1 comprises an image acquisition module 11, a processing module 13, a detection module 12, an analysis module 14, a control module 15 and an execution module 16.
The detection module 12 detects the robot's own working state to obtain body state information.
The image acquisition module 11 captures a target image containing a transmission device 2 and the indicator light 21 arranged at the transmission device 2.
The processing module 13 is connected to the image acquisition module 11 and analyzes the target image to obtain the working state information of the transmission device 2.
The analysis module 14 is connected to the processing module 13 and the detection module 12 and determines, from the body state information and the working state information, the target transmission device 2 to dock with and the conveying task type.
The control module 15 is connected to the analysis module 14, obtains the spatial position of the target transmission device 2 and navigates to the position of the target transmission device 2.
The execution module 16 is connected to the analysis module 14 and, after the robot has moved to the position of the target transmission device 2, docks with the target transmission device 2 according to the conveying task type to complete the loading or unloading of the cargo.
Specifically, this embodiment is the apparatus embodiment corresponding to the method embodiment described above; for its specific effects, reference is made to the method embodiment, and the description is not repeated here.
Based on the previous embodiment, the processing module 13 comprises a first image recognition unit and a first processing unit.
The first image recognition unit performs indicator-light state recognition on the target image with a preset neural network model, using a visual detection algorithm.
The first processing unit is connected to the first image recognition unit and obtains the working state information of the transmission device 2 from the indicator-light state recognition result.
The indicator-light state includes an on/off state and a shape-and-color state; the shape-and-color state includes the indicator-light color state and the indicator-light shape state.
Specifically, this embodiment is the apparatus embodiment corresponding to the method embodiment described above; for its specific effects, reference is made to the method embodiment, and the description is not repeated here.
Based on the previous embodiment, each transport robot 1 further comprises a wireless communication module.
The wireless communication module is connected to the wireless communication modules of the other transport robots 1 and is configured to transmit and share the work state information of the transmission devices 2 obtained by each robot's own analysis.
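A possible way to merge the shared observations is sketched below: each robot broadcasts the conveyor states it has analyzed and merges what it receives, keeping the most recent entry per conveyor. The message format and field names are assumptions for the example only.

import time

def merge_observations(local, received):
    """Merge two {conveyor_id: (state, timestamp)} maps; the newest entry wins."""
    merged = dict(local)
    for conveyor_id, (state, ts) in received.items():
        if conveyor_id not in merged or ts > merged[conveyor_id][1]:
            merged[conveyor_id] = (state, ts)
    return merged

now = time.time()
own   = {"conveyor_A": ("to_be_sent", now)}
other = {"conveyor_A": ("to_be_received", now - 5.0),   # older entry, ignored
         "conveyor_B": ("to_be_received", now)}
print(merge_observations(own, other))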
Based on the previous embodiment, the analysis module 14 comprises: a judging unit and a first determination unit.
The judging unit is configured to judge, from the body state information, whether the robot itself is in a to-be-loaded state or a to-be-unloaded state.
The first determination unit is connected to the judging unit and is configured to: when the robot is in the to-be-loaded state, determine that the transmission device 2 whose work state information indicates a to-be-sent state is the target transmission device 2 and that the conveying task type is a cargo receiving type; and when the robot is in the to-be-unloaded state, determine that the transmission device 2 whose work state information indicates a to-be-received state is the target transmission device 2 and that the conveying task type is a cargo delivery type.
Specifically, this embodiment is the apparatus embodiment corresponding to the method embodiment described above; for its specific effects, refer to the method embodiment, which is not repeated here.
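For illustration, the decision logic of the judging unit and the first determination unit could be expressed as follows; the state and task names are assumptions matching the terminology above, not prescribed identifiers.

def determine_target(body_state, conveyor_states):
    """conveyor_states: {conveyor_id: work_state}; returns (target_id, task_type)."""
    if body_state == "to_be_loaded":
        wanted, task = "to_be_sent", "receive_cargo"      # load from a sending conveyor
    elif body_state == "to_be_unloaded":
        wanted, task = "to_be_received", "deliver_cargo"  # unload onto a receiving conveyor
    else:
        return None, None
    for conveyor_id, state in conveyor_states.items():
        if state == wanted:
            return conveyor_id, task
    return None, None

print(determine_target("to_be_loaded",
                       {"conveyor_A": "to_be_sent", "conveyor_B": "to_be_received"}))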
Based on the previous embodiment, the analysis module 14 further comprises: a second processing unit and a second determination unit.
The second processing unit is configured to, when at least two candidate transmission devices 2 matching the robot's own working state are found from the body state information and the work state information, calculate the distance between the robot and each candidate transmission device 2 and compare all the distance values.
The second determination unit is connected to the second processing unit and is configured to determine that the candidate transmission device 2 with the smallest distance value is the target transmission device 2.
Specifically, this embodiment is the apparatus embodiment corresponding to the method embodiment described above; for its specific effects, refer to the method embodiment, which is not repeated here.
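A minimal sketch of the nearest-candidate selection follows; the coordinates and identifiers are illustrative only.

import math

def nearest_candidate(robot_pos, candidates):
    """candidates: {conveyor_id: (x, y)}; returns the id with the minimum distance."""
    return min(candidates,
               key=lambda cid: math.dist(robot_pos, candidates[cid]))

print(nearest_candidate((0.0, 0.0),
                        {"conveyor_A": (3.0, 4.0),    # 5.0 m away
                         "conveyor_B": (1.0, 1.0)}))  # about 1.41 m away -> selected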
Based on the previous embodiment, the control module 15 comprises: a second image recognition unit, a third processing unit and a first navigation movement unit.
The second image recognition unit is configured to identify, by visual detection, the target semantic points 22 of the target transmission device 2 in the target image; a target semantic point 22 is a point on the target transmission device 2 whose position is fixed and which is highly recognizable.
The third processing unit is connected to the second image recognition unit and is configured to calculate the first spatial position of the target transmission device 2 from the dimension information of the target transmission device 2.
The first navigation movement unit is connected to the third processing unit and is configured to navigate the robot to the location of the target transmission device 2 according to the first spatial position.
Specifically, this embodiment is the apparatus embodiment corresponding to the method embodiment described above; for its specific effects, refer to the method embodiment, which is not repeated here.
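By way of illustration, a first spatial position could be estimated from two semantic points with a pinhole camera model: the known real spacing between the points (the conveyor's dimension information) and their pixel spacing give the depth, and the pixel offset from the image center gives the lateral position. The camera parameters, point spacing, and function name below are assumptions; the patent does not prescribe this particular formula.

def first_spatial_position(p_left, p_right, real_spacing_m,
                           focal_px=800.0, cx=640.0):
    """p_left/p_right: pixel (u, v) of two semantic points; returns (x_lateral, z_depth)."""
    pixel_spacing = abs(p_right[0] - p_left[0])
    z = focal_px * real_spacing_m / pixel_spacing   # depth along the optical axis
    u_mid = 0.5 * (p_left[0] + p_right[0])
    x = (u_mid - cx) * z / focal_px                 # lateral offset of the midpoint
    return x, z

# Two semantic points 0.8 m apart on the conveyor, seen 200 px apart in the image:
print(first_spatial_position((540.0, 300.0), (740.0, 300.0), 0.8))   # -> (0.0, 3.2)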
Based on the previous embodiment, the control module 15 comprises: a laser detection unit, a fourth processing unit and a second navigation movement unit.
The laser detection unit is configured to emit detection laser toward the support legs 23 of the target transmission device 2 and to obtain the laser coordinates of each support leg 23 in the laser coordinate system.
The fourth processing unit is connected to the laser detection unit and is configured to calculate the second spatial position of the target transmission device 2 from the laser coordinates.
The second navigation movement unit is connected to the fourth processing unit and is configured to navigate the robot to the location of the target transmission device 2 according to the second spatial position.
Specifically, this embodiment is the apparatus embodiment corresponding to the method embodiment described above; for its specific effects, refer to the method embodiment, which is not repeated here.
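By way of illustration, the sketch below averages the detected leg coordinates to estimate the conveyor center in the laser frame and takes a heading from one pair of legs. The leg layout, the heading convention, and the function name are assumptions, not the patented computation; the resulting point would still be converted to the world frame as described earlier.

import math

def second_spatial_position(leg_points):
    """leg_points: list of (x, y) leg centers in the laser frame -> (cx, cy, yaw)."""
    cx = sum(p[0] for p in leg_points) / len(leg_points)
    cy = sum(p[1] for p in leg_points) / len(leg_points)
    # Heading estimated from the first two legs (assumed to be the front pair).
    (x1, y1), (x2, y2) = leg_points[0], leg_points[1]
    yaw = math.atan2(y2 - y1, x2 - x1)
    return cx, cy, yaw

# Four legs of a roughly 1.0 m x 0.6 m conveyor seen in the laser frame:
legs = [(2.0, 0.3), (2.0, -0.3), (3.0, 0.3), (3.0, -0.3)]
print(second_spatial_position(legs))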
It should be noted that the above embodiments can be freely combined as needed. The above are only preferred embodiments of the present invention. It should be pointed out that those skilled in the art can make several improvements and modifications without departing from the principle of the present invention, and such improvements and modifications shall also be regarded as falling within the protection scope of the present invention.

Claims (14)

1. A dispatch control method for a transport robot, characterized by comprising the steps of:
detecting the robot's own working state to obtain body state information;
capturing a target image, the target image including a transmission device and an indicator light which is provided at the transmission device and used to indicate the working state of the transmission device;
analyzing the target image to obtain work state information of the transmission device;
determining, from the body state information and the work state information, a target transmission device to dock with and a conveying task type;
obtaining the spatial position of the target transmission device, and navigating to the location of the target transmission device; and
docking with the target transmission device according to the conveying task type to complete the loading or unloading of cargo.
2. The dispatch control method for a transport robot according to claim 1, characterized in that the step of analyzing the target image to obtain the work state information of the transmission device specifically comprises the steps of:
performing indicator light state recognition on the target image by a visual detection algorithm, using a preset neural network model; and
obtaining, from the indicator light state recognition result, the work state information of the transmission device;
wherein the indicator light state includes an on/off state and a shape-and-color state, and the shape-and-color state includes an indicator light color state and an indicator light shape state.
3. The dispatch control method for a transport robot according to claim 1, characterized in that, after the target image is analyzed to obtain the work state information of the transmission device and before the target transmission device to dock with and the conveying task type are determined from the body state information and the work state information, the method comprises the step of:
transmitting to and sharing with the other transport robots the work state information of the transmission devices obtained by each robot's own analysis.
4. The dispatch control method for a transport robot according to claim 1, characterized in that the step of determining, from the body state information and the work state information, the target transmission device to dock with and the conveying task type specifically comprises the steps of:
judging, from the body state information, whether the robot itself is in a to-be-loaded state or a to-be-unloaded state;
when the robot is in the to-be-loaded state, determining that the transmission device whose work state information indicates a to-be-sent state is the target transmission device, and that the conveying task type is a cargo receiving type; and
when the robot is in the to-be-unloaded state, determining that the transmission device whose work state information indicates a to-be-received state is the target transmission device, and that the conveying task type is a cargo delivery type.
5. The dispatch control method for a transport robot according to any one of claims 1-4, characterized by further comprising the steps of:
when at least two candidate transmission devices matching the robot's own working state are found from the body state information and the work state information, calculating the distance between the robot and each candidate transmission device; and
comparing all the distance values and determining that the candidate transmission device with the smallest distance value is the target transmission device.
6. The dispatch control method for a transport robot according to any one of claims 1-4, characterized in that the step of obtaining the spatial position of the target transmission device and navigating to the location of the target transmission device specifically comprises the steps of:
identifying, by visual detection, the target semantic points of the target transmission device in the target image, a target semantic point being a point on the target transmission device whose position is fixed and which is highly recognizable;
calculating a first spatial position of the target transmission device from the dimension information of the target transmission device; and
navigating to the location of the target transmission device according to the first spatial position.
7. The dispatch control method for a transport robot according to any one of claims 1-4, characterized in that the step of obtaining the spatial position of the target transmission device and navigating to the location of the target transmission device specifically comprises the steps of:
emitting detection laser toward the support legs of the target transmission device and obtaining the laser coordinates of each support leg in the laser coordinate system;
calculating a second spatial position of the target transmission device from the laser coordinates; and
navigating to the location of the target transmission device according to the second spatial position.
8. A dispatch control system for transport robots, characterized by comprising: several transport robots and a transmission device, the transmission device being provided with an indicator light for showing its own working state, and each transport robot comprising: an image capture module, a processing module, a detection module, an analysis module, a control module and an execution module;
the detection module is configured to detect the robot's own working state to obtain body state information;
the image capture module is configured to capture a target image, the target image including the transmission device and the indicator light provided at the transmission device;
the processing module is connected to the image capture module and is configured to analyze the target image to obtain the work state information of the transmission device;
the analysis module is connected to the processing module and to the detection module, and is configured to determine, from the body state information and the work state information, the target transmission device to dock with and the conveying task type;
the control module is connected to the analysis module and is configured to obtain the spatial position of the target transmission device and to navigate the robot to the location of the target transmission device; and
the execution module is connected to the analysis module and is configured to dock with the target transmission device according to the conveying task type, after the robot has moved to the location of the target transmission device, so as to complete the loading or unloading of cargo.
9. The dispatch control system for transport robots according to claim 8, characterized in that the processing module comprises: a first image recognition unit and a first processing unit;
the first image recognition unit is configured to perform indicator light state recognition on the target image by a visual detection algorithm, using a preset neural network model;
the first processing unit is connected to the first image recognition unit and is configured to obtain, from the indicator light state recognition result, the work state information of the transmission device;
wherein the indicator light state includes an on/off state and a shape-and-color state, and the shape-and-color state includes an indicator light color state and an indicator light shape state.
10. The dispatch control system for transport robots according to claim 8, characterized in that each transport robot further comprises a wireless communication module;
the wireless communication module is connected to the wireless communication modules of the other transport robots and is configured to transmit and share the work state information of the transmission devices obtained by each robot's own analysis.
11. The dispatch control system for transport robots according to claim 8, characterized in that the analysis module comprises: a judging unit and a first determination unit;
the judging unit is configured to judge, from the body state information, whether the robot itself is in a to-be-loaded state or a to-be-unloaded state;
the first determination unit is connected to the judging unit and is configured to: when the robot is in the to-be-loaded state, determine that the transmission device whose work state information indicates a to-be-sent state is the target transmission device and that the conveying task type is a cargo receiving type; and when the robot is in the to-be-unloaded state, determine that the transmission device whose work state information indicates a to-be-received state is the target transmission device and that the conveying task type is a cargo delivery type.
12. The dispatch control system for transport robots according to any one of claims 8-11, characterized in that the analysis module further comprises: a second processing unit and a second determination unit;
the second processing unit is configured to, when at least two candidate transmission devices matching the robot's own working state are found from the body state information and the work state information, calculate the distance between the robot and each candidate transmission device and compare all the distance values;
the second determination unit is connected to the second processing unit and is configured to determine that the candidate transmission device with the smallest distance value is the target transmission device.
13. The dispatch control system for transport robots according to any one of claims 8-11, characterized in that the control module comprises: a second image recognition unit, a third processing unit and a first navigation movement unit;
the second image recognition unit is configured to identify, by visual detection, the target semantic points of the target transmission device in the target image, a target semantic point being a point on the target transmission device whose position is fixed and which is highly recognizable;
the third processing unit is connected to the second image recognition unit and is configured to calculate a first spatial position of the target transmission device from the dimension information of the target transmission device;
the first navigation movement unit is connected to the third processing unit and is configured to navigate the robot to the location of the target transmission device according to the first spatial position.
14. The dispatch control system for transport robots according to any one of claims 8-11, characterized in that the control module comprises: a laser detection unit, a fourth processing unit and a second navigation movement unit;
the laser detection unit is configured to emit detection laser toward the support legs of the target transmission device and to obtain the laser coordinates of each support leg in the laser coordinate system;
the fourth processing unit is connected to the laser detection unit and is configured to calculate a second spatial position of the target transmission device from the laser coordinates;
the second navigation movement unit is connected to the fourth processing unit and is configured to navigate the robot to the location of the target transmission device according to the second spatial position.
CN201910533614.9A 2019-06-20 2019-06-20 Dispatching control method and system for transport robot Active CN110223212B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910533614.9A CN110223212B (en) 2019-06-20 2019-06-20 Dispatching control method and system for transport robot

Publications (2)

Publication Number Publication Date
CN110223212A true CN110223212A (en) 2019-09-10
CN110223212B CN110223212B (en) 2021-05-18

Family

ID=67814027

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910533614.9A Active CN110223212B (en) 2019-06-20 2019-06-20 Dispatching control method and system for transport robot

Country Status (1)

Country Link
CN (1) CN110223212B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6167607B1 (en) * 1981-05-11 2001-01-02 Great Lakes Intellectual Property Vision target based assembly
US20130184849A1 (en) * 2012-01-04 2013-07-18 Globalfoundries Singapore Pte. Ltd. Efficient transfer of materials in manufacturing
CN107003662A (en) * 2014-11-11 2017-08-01 X开发有限责任公司 Position control robot cluster with visual information exchange
CN105292892A (en) * 2015-11-11 2016-02-03 江苏汇博机器人技术有限公司 Automatic storage system of industrial robot
CN109154825A (en) * 2016-07-28 2019-01-04 X开发有限责任公司 inventory management
CN106526534A (en) * 2016-10-17 2017-03-22 南京理工大学 Device and method for automatic sorting carrying of articles based on radio navigation through moving trolley
CN109308072A (en) * 2017-07-28 2019-02-05 杭州海康机器人技术有限公司 The Transmission Connection method and AGV of automated guided vehicle AGV
CN108891830A (en) * 2018-06-05 2018-11-27 广州市远能物流自动化设备科技有限公司 A kind of dispatch control method and automated guided vehicle of automated guided vehicle
CN109341689A (en) * 2018-09-12 2019-02-15 北京工业大学 Vision navigation method of mobile robot based on deep learning

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110705453A (en) * 2019-09-29 2020-01-17 中国科学技术大学 Real-time fatigue driving detection method
CN110780651A (en) * 2019-11-01 2020-02-11 四川长虹电器股份有限公司 AGV dispatching system and method
CN113075923A (en) * 2019-12-18 2021-07-06 财团法人工业技术研究院 Mobile carrier and state estimation and sensing fusion switching method thereof
CN113075923B (en) * 2019-12-18 2024-04-12 财团法人工业技术研究院 Mobile carrier and state estimation and sensing fusion switching method thereof
CN113052189A (en) * 2021-03-30 2021-06-29 电子科技大学 Improved MobileNet V3 feature extraction network
CN113052189B (en) * 2021-03-30 2022-04-29 电子科技大学 Improved MobileNet V3 feature extraction network
CN116341884A (en) * 2023-05-31 2023-06-27 佳都科技集团股份有限公司 Data processing method and system for task emergency assignment

Also Published As

Publication number Publication date
CN110223212B (en) 2021-05-18

Similar Documents

Publication Publication Date Title
CN110223212A (en) A kind of dispatch control method and system of transportation robot
CN111496770A (en) Intelligent carrying mechanical arm system based on 3D vision and deep learning and use method
CN112256018A (en) Robot scheduling processing method, device, equipment and storage medium
CN103196362B (en) A kind of system of the three-dimensional position for definite relative checkout gear of emitter
CN112149555A (en) Multi-storage AGV tracking method based on global vision
CN110262507A (en) A kind of camera array robot localization method and device based on 5G communication
US20040125985A1 (en) Chassis alignment system
US8861790B2 (en) System and method for guiding a mobile device
CN106647738A (en) Method and system for determining docking path of automated guided vehicle, and automated guided vehicle
CN112184765B (en) Autonomous tracking method for underwater vehicle
CN108038861A (en) A kind of multi-robot Cooperation method for sorting, system and device
CN110515378A (en) A kind of intelligent Target searching method applied to unmanned boat
CN108318858A (en) A kind of system for monitoring the position of luggage
JP3583450B2 (en) Automatic three-dimensional object inspection apparatus and method
CN113284178A (en) Object stacking method and device, computing equipment and computer storage medium
CN109571408A (en) The angle calibration system method and storage medium of a kind of robot, stock container
CN112091925A (en) Material handling system and material handling method based on machine vision
CN115892823A (en) Material storage and matching inspection integrated system and method
CN113601501B (en) Flexible operation method and device for robot and robot
EP1337454B1 (en) Chassis alignment system
US11797906B2 (en) State estimation and sensor fusion switching methods for autonomous vehicles
CN108600963A (en) A method of the position for monitoring luggage
CN111056197B (en) Automatic container transferring method based on local positioning system
CN112847374B (en) Parabolic-object receiving robot system
CN113064441A (en) Unmanned aerial vehicle parking method and device, storage medium and unmanned aerial vehicle nest

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 200335 402 rooms, No. 33, No. 33, Guang Shun Road, Shanghai

Applicant after: Shanghai zhihuilin Medical Technology Co.,Ltd.

Address before: 200335 402 rooms, No. 33, No. 33, Guang Shun Road, Shanghai

Applicant before: Shanghai Zhihui Medical Technology Co.,Ltd.

Address after: 200335 402 rooms, No. 33, No. 33, Guang Shun Road, Shanghai

Applicant after: Shanghai Zhihui Medical Technology Co.,Ltd.

Address before: 200335 402 rooms, No. 33, No. 33, Guang Shun Road, Shanghai

Applicant before: SHANGHAI MROBOT TECHNOLOGY Co.,Ltd.

GR01 Patent grant
CP03 Change of name, title or address

Address after: 202150 room 205, zone W, second floor, building 3, No. 8, Xiushan Road, Chengqiao Town, Chongming District, Shanghai (Shanghai Chongming Industrial Park)

Patentee after: Shanghai Noah Wood Robot Technology Co.,Ltd.

Address before: 200335 402 rooms, No. 33, No. 33, Guang Shun Road, Shanghai

Patentee before: Shanghai zhihuilin Medical Technology Co.,Ltd.