CN205301998U - Indoor positioning system of a food delivery robot fusing vision and ranging

Indoor positioning system of a food delivery robot fusing vision and ranging

Info

Publication number
CN205301998U
CN205301998U (application CN201520964727.1U)
Authority
CN
China
Prior art keywords
controller
vision
sensor
robot
delivery robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201520964727.1U
Other languages
Chinese (zh)
Inventor
张二云
孟勃
牛冠冲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CHANGCHUN YAOGUANG TECHNOLOGY CO., LTD.
Original Assignee
Shanghai Novelor Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Novelor Intelligent Technology Co Ltd
Priority to CN201520964727.1U
Application granted
Publication of CN205301998U
Legal status: Active (current)
Anticipated expiration


Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

An indoor positioning system of a food delivery robot fusing vision and ranging belongs to the technical field of robot positioning and comprises a vision sensor, a laser ranging sensor, a drive shaft, and a controller. The inputs of the vision sensor and the laser ranging sensor are both connected to the output of the drive shaft, and the outputs of the vision sensor and the laser ranging sensor are connected to the controller; the output of the controller is connected to the input of the drive shaft, and a signal processing unit and a storage unit are arranged inside the controller. Without modifying the dining facilities, the utility model can complete the positioning of the robot in the restaurant through simple calibration means, which allows the robot to be deployed quickly while also reducing production cost, and favors the large-scale adoption of robots carrying this indoor positioning system.

Description

Indoor positioning system of a meal delivery robot fusing vision and ranging
Technical field
This utility model belongs to the technical field of robot positioning, and in particular relates to an indoor positioning system for a meal delivery robot.
Background technology
With rising labor costs and advancing technology, robotics is increasingly applied in daily life, and indoor robots account for a large share of these applications. A mobile robot working indoors must know both its own coordinates and the coordinates of its target point, and for this it relies entirely on a positioning system. Indoor spaces are generally not open areas and may contain people and other objects moving in real time, which places high demands on a robot's indoor positioning system.
At present, many indoor positioning systems for robots have appeared. Some lay magnetic markers on the floor; such systems require the floor to be modified, the workload is relatively large, the robot can only walk along prescribed routes like a train, and flexibility is poor. Others fuse a laser radar sensor with an IMU; although such systems are much more flexible than magnetic-marker systems, high-precision IMUs are complex and serve only in military systems, while civilian IMUs have ordinary precision and accumulate error, so accuracy degrades after long-term use. No sufficiently good indoor positioning system for meal delivery robots has yet appeared, so a novel technical solution is urgently needed in the prior art.
Utility model content
The technical problem to be solved by this utility model is to provide an indoor positioning system of a meal delivery robot fusing vision and ranging. Without modifying the dining facilities, the robot can be positioned in the restaurant through simple calibration means, which allows the robot to be deployed quickly while also reducing production cost and favors the large-scale adoption of robots carrying this indoor positioning system.
The indoor positioning system of a meal delivery robot fusing vision and ranging is characterized in that it comprises a vision sensor, a laser ranging sensor, a drive shaft, and a controller. The inputs of the vision sensor and the laser ranging sensor are both connected to the output of the drive shaft, and the outputs of the vision sensor and the laser ranging sensor are connected to the controller; the output of the controller is connected to the input of the drive shaft, and a signal processing unit and a storage unit are arranged inside the controller.
The image signal collected by the vision sensor and the distance signal collected by the laser ranging sensor are processed by the signal processing unit arranged inside the controller.
The laser ranging sensor is a one-dimensional laser ranging sensor.
Through the above design, this utility model brings the following beneficial effect: an indoor positioning system and positioning method of a meal delivery robot fusing vision and ranging, which, without modifying the dining facilities, can position the robot in the restaurant through simple calibration means, allows rapid deployment, reduces production cost, and favors the large-scale adoption of robots carrying this indoor positioning system.
Further beneficial effects of this utility model are:
1. The laser ranging sensor is a one-dimensional laser ranging sensor, which responds quickly and costs little; the rotating scan that combines the laser ranging sensor with the vision sensor converts one-dimensional distances into points in a two-dimensional plane and is simple to operate.
2. Vision techniques are mature and reliable, and the rapid development of microelectronics has substantially reduced the cost of digital signal processors with visual processing capability, making it feasible to process visual signals on a mobile platform.
3. When the positioning system of the meal delivery robot scans, landmarks that already exist indoors are used as reference points, so no large resources need to be invested in modifying the environment.
4. A single scan can record the reference points completely and thereby compute coordinates, greatly improving deployment speed.
5. Distance signals are cross-checked against visual signals, which solves the problem that the ranging sensor of an indoor positioning system is easily misled by interference from moving objects.
Brief description of the drawings
This utility model is further described below with reference to the drawings and specific embodiments:
Fig. 1 is a schematic diagram of the indoor positioning system of a meal delivery robot fusing vision and ranging according to this utility model.
Fig. 2 is a schematic diagram of the calibrated coordinate points of the restaurant in the indoor positioning system of a meal delivery robot fusing vision and ranging according to this utility model.
In the figures: 1 - vision sensor, 2 - laser ranging sensor, 3 - drive shaft, 4 - controller, 5 - reference point A, 6 - reference point B, 7 - reference point C, 8 - meal delivery robot.
Detailed description of the invention
An indoor positioning system of a meal delivery robot fusing vision and ranging, as shown in Fig. 1, comprises a vision sensor 1, a laser ranging sensor 2, a drive shaft 3, and a controller 4. The inputs of the vision sensor 1 and the laser ranging sensor 2 are both connected to the output of the drive shaft 3, and the outputs of the vision sensor 1 and the laser ranging sensor 2 are connected to the controller 4; the output of the controller 4 is connected to the input of the drive shaft 3, and a signal processing unit and a storage unit are arranged inside the controller 4.
The indoor positioning method of the meal delivery robot fusing vision and ranging, as shown in Fig. 2, comprises the following steps:
Step 1: Start the meal delivery robot and, at the same time, the positioning system described above. The laser ranging sensor and the vision sensor rotate uniformly under the drive of the drive shaft. During one scan the controller records the distances from the meal delivery robot to the surrounding obstacles while capturing images of those obstacles; the image signal of each obstacle is processed and sent to the controller, which screens and records feature images and establishes a database associating image features with image distances.
Step 2: The controller screens the data stored in its database from Step 1 and selects feature points A, B, and C as reference points A, B, and C, whose distances to the meal delivery robot are a, b, and c respectively. The controller automatically stores the distances a, b, and c of reference points A, B, and C relative to the meal delivery robot, completing the calibration of the positioning system's coordinates relative to reference points A, B, and C.
Step 3: Repeat Steps 1 and 2 to calibrate points throughout the restaurant interior, build a data point cloud of the restaurant's indoor coordinates, and complete, in the controller, the calibration of the coordinate points of indoor obstacles in the restaurant.
Step 4: Based on the data point cloud built in Step 3, start the meal delivery robot for restaurant service. When, during operation, the rotating vision sensor and laser ranging sensor scan the surrounding obstacles again, the distances and image information of the surrounding obstacles are collected anew and compared, by fuzzy processing, with the reference point information stored in the controller. If both the image information and the distance information match the stored information, positioning succeeds, and the robot walks along the path from the data gathered by the controller.
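Steps 1 and 2 can be sketched as building a feature-to-distance database and then keeping the most distinctive features as reference points. The tuple layout and the distinctiveness score below are assumptions for illustration; the patent does not specify its data structures:

```python
def build_reference_db(scan):
    """scan: list of (feature_id, distinctiveness, distance_m) tuples
    gathered in one rotation. Returns the full feature -> distance
    database and the three most distinctive features, standing in for
    reference points A, B, C."""
    db = {fid: dist for fid, _, dist in scan}
    refs = sorted(scan, key=lambda rec: rec[1], reverse=True)[:3]
    return db, [(fid, dist) for fid, _, dist in refs]

scan = [("door", 0.9, 4.2), ("plant", 0.4, 1.1),
        ("poster", 0.8, 2.5), ("pillar", 0.7, 3.3)]
db, refs = build_reference_db(scan)
```

In a real system the distinctiveness score would come from the vision pipeline (contour and color saliency, per the description below), not be supplied by hand.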
The drive shaft 3 drives the rotatable vision sensor 1 and laser ranging sensor 2 for scanning. During a scan, the vision sensor 1 collects images and the laser ranging sensor 2 measures the distance between the robot and obstacles. The vision sensor 1 produces an image signal and the laser ranging sensor 2 a distance signal; the controller 4 controls the operation of the drive shaft 3 and processes the signals from the laser ranging sensor 2 and the vision sensor 1.
The working process of the utility model is as follows. Before positioning begins, the system is first calibrated once. The meal delivery robot 8 scans: the laser ranging sensor 2 and the vision sensor 1 rotate uniformly under the drive of the drive shaft 3. During the scan, the controller 4 records the distances to the surrounding obstacles while capturing their images; after the controller 4 processes the image of each obstacle, obvious features are filtered out and recorded, and the controller establishes a one-to-one correspondence between image features and image distances. From these, the system selects three visually easily distinguishable points with clear contours and obvious color features as reference point A 5, reference point B 6, and reference point C 7; the distances from the meal delivery robot 8 to them are a, b, and c respectively. This completes the calibration of the coordinates of the meal delivery robot 8 relative to reference points A 5, B 6, and C 7. The scan is repeated in this way to calibrate the indoor points, and finally the controller's strong data processing capability is relied upon to build a point cloud of the indoor coordinates.
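The patent does not state how coordinates are computed from the three reference distances a, b, and c; one standard way, assuming the reference points' coordinates are known from calibration, is planar trilateration. A hedged sketch:

```python
import math

def trilaterate(p1, p2, p3, d1, d2, d3):
    """Recover (x, y) from distances d1, d2, d3 to three known points.
    Subtracting pairs of circle equations (x-xi)^2 + (y-yi)^2 = di^2
    cancels the quadratic terms and leaves a 2x2 linear system."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a, b = 2 * (x2 - x1), 2 * (y2 - y1)
    c = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d, e = 2 * (x3 - x2), 2 * (y3 - y2)
    f = d2**2 - d3**2 - x2**2 + x3**2 - y2**2 + y3**2
    den = a * e - b * d  # zero when the three points are collinear
    return (c * e - b * f) / den, (a * f - c * d) / den

# Robot at (1, 1) relative to reference points at (0,0), (4,0), (0,3).
x, y = trilaterate((0, 0), (4, 0), (0, 3),
                   math.sqrt(2), math.sqrt(10), math.sqrt(5))
```

Note the denominator vanishes when the reference points are collinear, which is one reason the system's choice of three well-separated landmarks matters.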
The positioning process is as follows. When the meal delivery robot 8, during operation, scans the surrounding obstacles again through the rotating vision sensor 1 and laser ranging sensor 2, it collects their distances and image information anew. The information of reference point A 5, reference point B 6, and reference point C 7 is compared, by fuzzy processing, with the information stored in the controller 4. If both the image information and the distance information match the stored information, i.e. the distances from the positioning system to reference points A 5, B 6, and C 7 are a, b, and c respectively, then the meal delivery robot 8 is located at the point previously fixed by reference points A 5, B 6, and C 7, and positioning succeeds.
By establishing reference points and strengthening the fuzzy processing, the system can achieve more accurate positioning.
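The patent leaves "fuzzy processing" unspecified; one plausible minimal reading is a tolerance-based comparison of the measured reference distances against the stored values. The tolerance value here is an assumption:

```python
def distances_match(measured, stored, rel_tol=0.05):
    """Loose ('fuzzy') match: every measured reference distance must
    lie within a relative tolerance of its stored counterpart."""
    return all(abs(m - s) <= rel_tol * s for m, s in zip(measured, stored))

ok = distances_match([1.01, 2.04, 2.95], [1.0, 2.0, 3.0])   # small drift
bad = distances_match([1.5, 2.0, 3.0], [1.0, 2.0, 3.0])     # gross mismatch
```

A production system would likely combine this with an image-feature similarity score before declaring a positioning success, as the description requires both signals to agree.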
When a moving obstacle enters the field of view, the positioning system can combine the vision sensor 1 with the laser ranging sensor 2 to reject the dynamic object. The method is as follows: when the vision sensor 1 captures an uncalibrated image and the corresponding distance data of the laser ranging sensor 2 differ between two readings, the controller 4 subtracts its own coordinates; if the distance of the uncalibrated image relative to the meal delivery robot 8 still differs, the uncalibrated image is a moving obstacle.
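The rejection rule just described can be sketched as follows; the tolerance and argument names are assumptions, not from the patent. An unrecognized image whose range, after compensating for the robot's own motion, still differs between two scans is flagged as moving and excluded from localization:

```python
def is_moving_obstacle(image_calibrated, range1, range2, tol=0.1):
    """Flag an object as a moving obstacle when its image is not in the
    calibrated database and its robot-relative range differs between two
    scans (ranges assumed already corrected for the robot's own
    displacement, i.e. 'after subtracting self coordinates')."""
    return (not image_calibrated) and abs(range1 - range2) > tol

moving = is_moving_obstacle(False, 2.0, 2.6)    # range changed 0.6 m
static = is_moving_obstacle(False, 2.0, 2.05)   # within tolerance
```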

Claims (3)

1. An indoor positioning system of a meal delivery robot fusing vision and ranging, characterized in that it comprises a vision sensor (1), a laser ranging sensor (2), a drive shaft (3), and a controller (4); the inputs of the vision sensor (1) and the laser ranging sensor (2) are both connected to the output of the drive shaft (3), and the outputs of the vision sensor (1) and the laser ranging sensor (2) are connected to the controller (4); the output of the controller (4) is connected to the input of the drive shaft (3), and a signal processing unit and a storage unit are arranged inside the controller (4).
2. The indoor positioning system of a meal delivery robot fusing vision and ranging according to claim 1, characterized in that the image signal collected by the vision sensor (1) and the distance signal collected by the laser ranging sensor (2) are processed by the signal processing unit arranged inside the controller (4).
3. The indoor positioning system of a meal delivery robot fusing vision and ranging according to claim 1, characterized in that the laser ranging sensor (2) is a one-dimensional laser ranging sensor.
CN201520964727.1U 2015-11-27 2015-11-27 Indoor positioning system of a food delivery robot fusing vision and ranging Active CN205301998U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201520964727.1U CN205301998U (en) 2015-11-27 2015-11-27 Indoor positioning system of a food delivery robot fusing vision and ranging

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201520964727.1U CN205301998U (en) 2015-11-27 2015-11-27 Indoor positioning system of a food delivery robot fusing vision and ranging

Publications (1)

Publication Number Publication Date
CN205301998U true CN205301998U (en) 2016-06-08

Family

ID=56462968

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201520964727.1U Active CN205301998U (en) 2015-11-27 2015-11-27 Indoor positioning system of a food delivery robot fusing vision and ranging

Country Status (1)

Country Link
CN (1) CN205301998U (en)


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105467994A (en) * 2015-11-27 2016-04-06 长春诺惟拉智能科技有限责任公司 Vision and ranging fusion-based food delivery robot indoor positioning system and positioning method
CN105467994B (en) * 2015-11-27 2019-01-18 长春瑶光科技有限公司 Indoor positioning method for a meal delivery robot fusing vision and ranging
CN107422735A (en) * 2017-07-29 2017-12-01 深圳力子机器人有限公司 A kind of trackless navigation AGV laser and visual signature hybrid navigation method

Similar Documents

Publication Publication Date Title
CN105467994B (en) Indoor positioning method for a meal delivery robot fusing vision and ranging
CN106199626B (en) Based on the indoor three-dimensional point cloud map generation system and method for swinging laser radar
CN103729883B (en) A kind of three-dimensional environment information gathering and reconfiguration system and method
CN103512579B (en) A kind of map constructing method based on thermal infrared video camera and laser range finder
CN102508257B (en) Vehicle-mounted mobile mapping device
CN109460029A (en) Livestock and poultry cultivation place inspection mobile platform and its control method
CN102425991B (en) Automation storage yard laser measurement device and application method thereof
CN112836737A (en) Roadside combined sensing equipment online calibration method based on vehicle-road data fusion
CN109737981B (en) Unmanned vehicle target searching device and method based on multiple sensors
CN106289285A (en) Map and construction method are scouted by a kind of robot associating scene
CN108536145A (en) A kind of robot system intelligently followed using machine vision and operation method
CN103941746A (en) System and method for processing unmanned aerial vehicle polling image
CN103605978A (en) Urban illegal building identification system and method based on three-dimensional live-action data
CN107991662A (en) A kind of 3D laser and 2D imaging synchronous scanning device and its scan method
CN103175485A (en) Method for visually calibrating aircraft turbine engine blade repair robot
CN111426302B (en) Unmanned aerial vehicle high accuracy oblique photography measurement system
CN105203023A (en) One-stop calibration method for arrangement parameters of vehicle-mounted three-dimensional laser scanning system
CN208027170U (en) A kind of power-line patrolling unmanned plane and system
CN205301998U (en) Indoor positioning system of a food delivery robot fusing vision and ranging
CN206649347U (en) A kind of application deployment system based on unmanned vehicle
CN116560357A (en) Tunnel inspection robot system based on SLAM and inspection control method
CN1605830A (en) Large-scale three dimensional shape and appearance measuring and splicing method without being based on adhesive mark
CN202382708U (en) Automatic storage yard laser measuring device
CN117274378A (en) Indoor positioning system and method based on AI vision fusion three-dimensional scene
CN116958083A (en) Motor car bottom bolt looseness detection method based on robot self-adaptive pose adjustment

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20160627

Address after: 130000, 1303B-3 room 14, 2499 Wei Shan Road, Changchun hi tech Zone, Jilin, China

Patentee after: CHANGCHUN YAOGUANG TECHNOLOGY CO., LTD.

Address before: 511, room 130000, ferry innovation factory, 246 revision Road, Jilin, Changchun

Patentee before: SHANGHAI NOVELOR INTELLIGENT TECHNOLOGY CO., LTD.