CN109240297A - Autonomous navigation robot based on vision following - Google Patents

Autonomous navigation robot based on vision following

Info

Publication number
CN109240297A
CN109240297A
Authority
CN
China
Prior art keywords
robot
module
tracking target
image
main control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811126556.XA
Other languages
Chinese (zh)
Inventor
麦贵林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shendong Science And Technology (Shanghai) Co Ltd
Original Assignee
Shendong Science And Technology (Shanghai) Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shendong Science And Technology (Shanghai) Co Ltd
Priority to CN201811126556.XA
Publication of CN109240297A
Legal status: Pending (current)

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0223 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0234 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons
    • G05D1/0236 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons in combination with a laser
    • G05D1/0242 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/028 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses an autonomous navigation robot based on vision following, comprising a binocular vision module, a main control module, a motor drive module and an upper computer module. The binocular vision module outputs the depth image and RGB image of the tracking target. The main control module runs the KCF tracking algorithm to obtain the real-time position of the tracking target and its distance from the robot, and drives the robot toward the position of the tracking target by controlling the motor drive module. The motor drive module receives the modulation duty cycle output by the main control module and converts it into the voltage applied to the motor, so that motion control of the robot is achieved by adjusting the motor speed. The upper computer module displays the status information of the robot, and also receives control instructions input by the user to control the operating state of the robot. The invention requires only one camera, and the tracking target can be chosen at will, which greatly expands the application space of the robot.

Description

Autonomous navigation robot based on vision following
Technical field
The present invention relates to the field of robotics, and in particular to an autonomous navigation robot based on vision following, which realizes autonomous navigation of the robot through vision following technology, without manual operation.
Background technique
In recent years, service robot technology has developed rapidly, and its application scenarios in daily life are increasing. The autonomous following robot is one kind of service robot, involving technologies from many fields such as motor control, data fusion and image processing. At present, in the field of autonomous following robots, tracking is mostly realized using RF (radio frequency) modules, which require the tracked target to carry a signal source; this limits the application scenarios of autonomous following.
Summary of the invention
In view of the above problems in the prior art, the present invention provides an autonomous navigation robot based on vision following, which requires only one camera and allows the tracking target to be chosen at will, greatly expanding the application space of the robot.

The autonomous navigation robot comprises a binocular vision module, a main control module, a motor drive module and an upper computer module; the binocular vision module and the motor drive module are each electrically connected to the main control module, and the upper computer module and the main control module are communicatively connected.

The binocular vision module is used to detect the tracking target and to output the depth image and RGB image of the tracking target. The main control module runs the KCF tracking algorithm on the depth image and RGB image output by the binocular vision module to obtain the real-time position of the tracking target and its distance from the robot, and, after obtaining them, drives the robot toward the position of the tracking target by controlling the motor drive module.

The motor drive module receives the modulation duty cycle output by the main control module and converts it into the voltage applied to the motor, so that motion control of the robot is achieved by adjusting the motor speed. The upper computer module displays the status information of the robot; it also receives the control instructions input by the user and, according to the received instructions, controls the operating state of the robot through the main control module.
Optionally, the status information of the robot includes the robot's battery level, speed, and the image detected by the robot.
Further, the binocular vision module comprises a color camera, an infrared projector and an infrared camera. The color camera obtains the RGB image of the tracking target. The infrared projector emits a laser, which passes through an infrared filter to project a near-infrared spectrum. When the near-infrared light strikes the tracking target, it is distorted and forms speckle points; the infrared camera collects these speckle points to obtain the speckle image of the tracking target, and transmits the speckle image to the processing chip PS1080, which processes the speckle image to obtain the depth image of the tracking target.
Further, to guarantee the accuracy of the depth image, the infrared camera is calibrated according to a preset calibration method.

The preset calibration method is as follows: within the range between the infrared camera and the tracking target, a reference plane is taken every preset distance, and the first speckle image on each reference plane is recorded, yielding several first speckle images corresponding to the multiple reference planes.

When measuring the actual position of the tracking target, the currently acquired second speckle image of the tracking target is ANDed with each of the first speckle images. Where two images coincide well, a peak is generated at the corresponding position in the resulting image. The obtained peaks are superimposed, and the depth data of each point in the depth image corresponding to the tracking target is then generated by interpolation.
Further, the motor drive module comprises four MOSFETs electrically connected to the motor, and the four MOSFETs form an H-bridge circuit for driving the motor.
Further, the main control module includes a PID controller. After the main control module obtains the real-time position of the tracking target and its distance from the robot, the PID controller outputs to the motor drive module the modulation duty cycle used to adjust the motor speed.
Optionally, the upper computer module is connected to the main control module via WIFI.
The present invention designs a robot that navigates autonomously using vision following technology, primarily to solve the problem that existing target-following robots realized with RF modules require the tracked target to carry a signal source, which limits the application scenarios of autonomous following. By using vision following, the robot of the invention requires only one camera, and the tracking target can be chosen at will, which greatly expands the application space of the robot. It can be applied to fields such as supermarket shopping guidance, elderly care and target tracking.
Detailed description of the invention
Fig. 1 is a structural diagram of the autonomous navigation robot based on vision following provided by an embodiment of the present invention.
Description of symbols:
101: binocular vision module, 102: main control module, 103: motor drive module,
104: upper computer module.
Specific embodiment
To make the objectives, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Referring to Fig. 1, which shows the structure of the autonomous navigation robot based on vision following in this embodiment: the robot comprises a binocular vision module 101, a main control module 102, a motor drive module 103 and an upper computer module 104. The binocular vision module 101 and the motor drive module 103 are each electrically connected to the main control module 102, and the upper computer module 104 and the main control module 102 communicate via WIFI.
The binocular vision module 101 detects the tracking target and outputs the depth image and RGB image of the target. The depth image contains distance information of the scene in front of the camera, while the RGB image provides the visual content captured by the camera; combining the two yields the distance between the tracking target and the robot as well as the target's offset in the image. The main control module 102 is a small computer; after the binocular vision module 101 outputs the depth image and RGB image of the tracking target, the main control module 102 runs the KCF tracking algorithm to obtain the real-time position of the tracking target and its distance from the robot.
The KCF tracking algorithm proceeds as follows: first, the tracking-area image is initialized; the position frame of the area image is then cyclically shifted, and after aligning the target this yields multiple samples, from which a classifier is trained. Once the classifier is trained, the next frame is taken and a prediction region is generated, which is an image region taken near the target's previous region. The prediction region is sampled, the samples are cyclically shifted, and after target alignment several candidate image regions are obtained. The previously trained classifier computes a response for each candidate region; the region with the maximum response is the actual position of the target in this frame. This process is then repeated: retrain, then detect again.
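As a concrete illustration of this loop, the following is a minimal sketch using OpenCV's built-in KCF tracker (cv2.TrackerKCF_create, available with opencv-contrib-python), which encapsulates the classifier training and cyclic-shift response search described above; the camera index and window names are illustrative, not taken from the patent.

```python
import cv2

cap = cv2.VideoCapture(0)                      # RGB stream from the vision module
ok, frame = cap.read()
roi = cv2.selectROI("select target", frame)    # user draws the initial tracking area
tracker = cv2.TrackerKCF_create()              # trains the initial classifier on the ROI
tracker.init(frame, roi)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, box = tracker.update(frame)         # response search over the prediction region
    if found:
        x, y, w, h = (int(v) for v in box)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) == 27:                   # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

The horizontal offset of the tracked box from the image centre, combined with the depth value inside the box, gives the position and distance that are fed to the controller described next.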
Further, after obtaining the real-time position of the tracking target and its distance from the robot, the main control module 102 outputs a modulation duty cycle to the motor drive module 103 through its PID controller. After receiving the modulation duty cycle output by the main control module 102, the motor drive module 103 converts it into the voltage applied to the motor, so that motion control of the robot is achieved by adjusting the motor speed: the robot is driven toward the target position, realizing the tracking of the target.
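The patent does not give the controller's gains or exact form; a minimal distance-keeping PID sketch could look as follows, where the gains, the 1 m setpoint and the clamping of the output to a [0, 1] duty cycle are assumptions for illustration.

```python
class PID:
    """Maps the distance error to the modulation duty cycle sent to the driver."""
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint        # desired following distance in metres
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measured, dt):
        error = measured - self.setpoint
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return min(max(out, 0.0), 1.0)  # clamp to a valid PWM duty cycle

pid = PID(kp=0.8, ki=0.05, kd=0.1, setpoint=1.0)   # keep roughly 1 m behind the target
duty = pid.update(measured=1.6, dt=0.05)           # e.g. target currently 1.6 m away
```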
The upper computer module 104 displays the status information of the robot, including information such as the robot's battery level, speed and the image it detects. It also presents an operation and control interface to the user, receives the control instructions input by the user and, according to the received instructions, controls operations of the robot such as moving forward, backward, left and right, and emergency stop, through the main control module 102.
Specifically, the binocular vision module 101 comprises a color camera, an infrared projector and an infrared camera. The color camera obtains the RGB image of the tracking target. The infrared projector emits a laser, which passes through an infrared filter to project a near-infrared spectrum. When the near-infrared light strikes the tracking target, it is distorted and forms speckle points; the infrared camera collects these speckle points to obtain the speckle image of the tracking target, and transmits the speckle image to the processing chip PS1080, which processes the speckle image to obtain the depth image of the tracking target.
Further, to guarantee the accuracy of the depth image, the infrared camera needs to be calibrated according to a preset calibration method, which is as follows: within the range between the infrared camera and the tracking target, a reference plane is taken every preset distance, and the first speckle image on each reference plane is recorded, yielding several first speckle images corresponding to the multiple reference planes. When measuring the actual position of the tracking target, the currently acquired second speckle image of the tracking target is ANDed with each of the first speckle images; where two images coincide well, a peak is generated at the corresponding position in the resulting image. The obtained peaks are superimposed, and the depth data of each point in the depth image corresponding to the tracking target is then generated by interpolation.
For example, if the distance range between the tracking target and the camera is 0.8 m to 3.5 m and a reference plane is taken every 1 cm, then 270 speckle images are calibrated in total. When actually measuring the position of the tracking target, the currently acquired speckle image is ANDed with each of these 270 images; where two images coincide well, a peak is generated at the corresponding position. The obtained peaks are superimposed, and the depth data of each point in the depth image is generated by interpolation.
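A minimal sketch of this reference-plane matching follows, assuming binary speckle patches stored as numpy arrays and a whole-patch match score; the argmax with parabolic interpolation over the per-plane scores is a simplification of the patent's AND-superimpose-interpolate step, and all names are illustrative.

```python
import numpy as np

def depth_from_speckle(current, references, depths):
    """current: HxW binary (0/1) speckle patch; references: N calibrated HxW
    patches recorded at the distances listed in `depths` (metres)."""
    # AND each reference with the current patch and count coincident speckles
    scores = np.array([np.sum(current & ref) for ref in references], float)
    k = int(np.argmax(scores))                 # best-matching reference plane
    if 0 < k < len(scores) - 1:
        s0, s1, s2 = scores[k - 1], scores[k], scores[k + 1]
        denom = s0 - 2.0 * s1 + s2
        if denom != 0.0:                       # parabolic peak interpolation
            step = depths[1] - depths[0]
            return depths[k] + 0.5 * (s0 - s2) / denom * step
    return depths[k]

# 270 reference planes from 0.8 m to 3.5 m in 1 cm steps, as in the example above
depths = np.arange(0.8, 3.5, 0.01)
```

In a full implementation this matching would be repeated per local window to fill every pixel of the depth image; the sketch returns a single interpolated depth for one patch.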
The motor drive module 103 comprises four MOSFETs electrically connected to the motor, which here is a DC motor; the four MOSFETs form an H-bridge circuit for driving the motor. The motor runs when a diagonal pair of MOSFETs on the H-bridge is turned on, and by selecting which diagonal pair conducts, the direction of the current through the motor can be changed, thereby controlling the motor's direction of rotation. The module receives the modulation duty cycle output by the main control module 102 and converts it into the voltage applied to the motor, thereby adjusting the motor speed and in turn controlling the motion of the robot.
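The following sketch shows how such a driver might be commanded; the pwm_a/pwm_b objects and their set_duty method are hypothetical stand-ins for whatever PWM peripheral switches the two diagonal MOSFET pairs, and are not specified by the patent.

```python
def drive_motor(pwm_a, pwm_b, command):
    """command in [-1.0, 1.0]: the sign selects which diagonal MOSFET pair of
    the H-bridge conducts (motor direction), the magnitude is the duty cycle."""
    duty = min(abs(command), 1.0)
    if command >= 0:
        pwm_b.set_duty(0.0)      # reverse diagonal pair fully off
        pwm_a.set_duty(duty)     # forward diagonal pair switched at `duty`
    else:
        pwm_a.set_duty(0.0)
        pwm_b.set_duty(duty)     # reverse diagonal pair switched at `duty`

# With supply voltage Vs, the average voltage across the motor is duty * Vs,
# so varying the duty cycle varies the motor speed as described above.
```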
The upper computer module 104 can be an Android APP developed in the JAVA language. It has two main tasks: first, reading the images acquired by the binocular vision module 101; second, communicating with the main control module 102. To keep the user interface fluid, the handling of each event must be completed within a short time, otherwise the interface stutters; since image processing is relatively time-consuming, a separate thread is opened for it.
After the robot compresses the images and transmits them back over WIFI, the image-reading thread of the upper computer module 104 decodes them and displays them on the interface. The user can then view the robot's camera image in real time, and can also manually select a tracking area on the image; once the tracking area is selected, the robot automatically tracks the moving object in the selected area. In addition, to allow the working state of the robot to be checked, the upper computer module 104 also reads in real time the status information sent back by the robot over WIFI, such as battery level and speed, and displays it on the interface. To guard against accidents, the operation interface of the upper computer module 104 also provides an emergency stop button; clicking this button immediately stops the robot's motion.
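As a rough sketch of this host-side design, the image-reading thread might look like the following, written in Python rather than the patent's JAVA; the host address, port and the 4-byte length-prefixed JPEG framing are assumptions, since the patent does not specify the transport format.

```python
import socket
import struct
import threading

import cv2
import numpy as np

def image_reader(host, port, on_frame):
    """Runs in its own thread so UI event handling stays responsive."""
    sock = socket.create_connection((host, port))
    while True:
        header = sock.recv(4)                  # 4-byte big-endian frame length
        if len(header) < 4:
            break
        (length,) = struct.unpack(">I", header)
        buf = b""
        while len(buf) < length:               # read the full compressed frame
            chunk = sock.recv(length - len(buf))
            if not chunk:
                return
            buf += chunk
        frame = cv2.imdecode(np.frombuffer(buf, np.uint8), cv2.IMREAD_COLOR)
        on_frame(frame)                        # hand the decoded image to the UI

t = threading.Thread(target=image_reader,
                     args=("192.168.1.10", 8000, lambda f: None), daemon=True)
t.start()
```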
This embodiment designs a robot that navigates autonomously using vision following technology, primarily to solve the problem that existing target-following robots realized with RF modules require the tracked target to carry a signal source, which limits the application scenarios of autonomous following. By using vision following, the robot of this embodiment requires only one camera, and the tracking target can be chosen at will, which greatly expands the application space of the robot. It can be applied to fields such as supermarket shopping guidance, elderly care and target tracking.
It should be noted that in the embodiments of the present invention, the terms "first" and "second" appear only to distinguish technical nouns and for convenience of description, and should not be construed as limiting the embodiments. The terms "include" and "comprise", or any other variants thereof, are intended to cover non-exclusive inclusion, so that a process, method, article or terminal device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or terminal device. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article or terminal device that includes the element.
The above are only preferred embodiments of the present invention and are not intended to limit the invention; various modifications and variations will occur to those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present invention shall be included in the protection scope of the present invention.

Claims (7)

1. An autonomous navigation robot based on vision following, characterized in that the autonomous navigation robot comprises a binocular vision module, a main control module, a motor drive module and an upper computer module; wherein the binocular vision module and the motor drive module are each electrically connected to the main control module, and the upper computer module is communicatively connected to the main control module;
the binocular vision module is used to detect the tracking target and to output the depth image and RGB image of the tracking target; the main control module is used to run the KCF tracking algorithm on the depth image and RGB image output by the binocular vision module to obtain the real-time position of the tracking target and its distance from the robot, and, after obtaining the real-time position of the tracking target and its distance from the robot, to drive the robot toward the position of the tracking target by controlling the motor drive module;
the motor drive module is used to receive the modulation duty cycle output by the main control module and to convert the received modulation duty cycle into the voltage applied to the motor, so that motion control of the robot is achieved by adjusting the speed of the motor; the upper computer module is used to display the status information of the robot, and is also used to receive the control instructions input by the user and, according to the received control instructions, to control the operating state of the robot through the main control module.
2. The autonomous navigation robot based on vision following according to claim 1, characterized in that the status information of the robot includes the robot's battery level, speed, and the image detected by the robot.
3. The autonomous navigation robot based on vision following according to claim 1, characterized in that the binocular vision module comprises a color camera, an infrared projector and an infrared camera;
wherein the color camera is used to obtain the RGB image of the tracking target; the infrared projector is used to emit a laser, which passes through an infrared filter to project a near-infrared spectrum; the infrared camera is used to collect the speckle points formed by distortion of the infrared light when the near-infrared light strikes the tracking target, thereby obtaining the speckle image of the tracking target, and to transmit the obtained speckle image to the processing chip PS1080, which processes the speckle image to obtain the depth image of the tracking target.
4. The autonomous navigation robot based on vision following according to claim 1, characterized in that, to guarantee the accuracy of the depth image, the infrared camera is calibrated according to a preset calibration method;
the preset calibration method is as follows: within the range between the infrared camera and the tracking target, a reference plane is taken every preset distance, and the first speckle image on each reference plane is recorded, yielding several first speckle images corresponding to the multiple reference planes;
when measuring the actual position of the tracking target, the currently acquired second speckle image of the tracking target is ANDed with each of the first speckle images; where two images coincide well, a peak is generated at the corresponding position in the resulting image; the obtained peaks are superimposed, and the depth data of each point in the depth image corresponding to the tracking target is then generated by interpolation.
5. The autonomous navigation robot based on vision following according to claim 1, characterized in that the motor drive module comprises four MOSFETs electrically connected to the motor, and the four MOSFETs form an H-bridge circuit for driving the motor.
6. The autonomous navigation robot based on vision following according to claim 1, characterized in that the main control module includes a PID controller; the PID controller is used, after the main control module obtains the real-time position of the tracking target and its distance from the robot, to output to the motor drive module the modulation duty cycle for adjusting the motor speed.
7. The autonomous navigation robot based on vision following according to claim 1, characterized in that the upper computer module is connected to the main control module via WIFI.
CN201811126556.XA 2018-09-26 2018-09-26 Autonomous navigation robot based on vision following Pending CN109240297A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811126556.XA 2018-09-26 2018-09-26 Autonomous navigation robot based on vision following

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811126556.XA 2018-09-26 2018-09-26 Autonomous navigation robot based on vision following

Publications (1)

Publication Number Publication Date
CN109240297A 2019-01-18

Family

ID=65057743

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811126556.XA 2018-09-26 2018-09-26 Autonomous navigation robot based on vision following Pending

Country Status (1)

Country Link
CN (1) CN109240297A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110147096A (en) * 2019-04-01 2019-08-20 江苏大学 Multi-machine coordinated control method based on vision following
CN110515384A (en) * 2019-09-09 2019-11-29 深圳市三宝创新智能有限公司 Human body following method and robot based on visual markers
CN111208831A (en) * 2020-02-24 2020-05-29 吉林大学 Unmanned carrying trolley based on computer vision
CN112019682A (en) * 2019-05-30 2020-12-01 精工爱普生株式会社 Display method, display device and information system
CN112223278A (en) * 2020-09-09 2021-01-15 山东省科学院自动化研究所 Detection robot following method and system based on depth visual information
TWI742644B (en) * 2020-05-06 2021-10-11 東元電機股份有限公司 Following mobile platform and method thereof
CN115639819A (en) * 2022-10-17 2023-01-24 郑州大学 Automatic following robot with visual and depth information fused

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019239A (en) * 2012-11-27 2013-04-03 江苏大学 Trajectory tracking sliding mode control system and control method for spraying mobile robot
CN103885449A (en) * 2014-04-04 2014-06-25 辽宁工程技术大学 Intelligent visual tracking wheeled robot based on multiple sensors and control method thereof
CN105678288A (en) * 2016-03-04 2016-06-15 北京邮电大学 Target tracking method and device
WO2018048353A1 (en) * 2016-09-09 2018-03-15 Nanyang Technological University Simultaneous localization and mapping methods and apparatus
CN106863324A (en) * 2017-03-07 2017-06-20 东莞理工学院 Vision-based service robot platform
CN107784663A (en) * 2017-11-14 2018-03-09 哈尔滨工业大学深圳研究生院 Correlation filtering tracking method and device based on depth information

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
杨智婷: "Research on robust tracking algorithms for moving targets based on RGB-D", China Master's Theses Full-text Database, Information Science and Technology *
王篷金: "Research on Kinect data restoration methods and their application in stereoscopic video", China Master's Theses Full-text Database, Information Science and Technology *

Similar Documents

Publication Publication Date Title
CN109240297A (en) Autonomous navigation robot based on vision following
CN105744163B (en) Video camera and image capture method with focus tracking based on depth information
CN107301377B (en) Face and pedestrian sensing system based on depth camera
CN109443199B (en) 3D information measuring system based on intelligent light source
EP1610548A1 (en) Projector with automatic focus adjustment
US20110025830A1 (en) Methods, systems, and computer-readable storage media for generating stereoscopic content via depth map creation
CN102785249A (en) Robot control system, robot system and program
JP2007058222A (en) System and method for producing magnified images of microscope slide
CN109394168B (en) Iris information measuring system based on light control
US9591228B2 (en) Method for the localization of a tool in a workplace, corresponding system and computer program product
CN103371795A (en) Image processing apparatus and image processing method
CN107079106A (en) Focusing method and device, image capturing method and device and camera system
Burner et al. Evimo2: an event camera dataset for motion segmentation, optical flow, structure from motion, and visual inertial odometry in indoor scenes with monocular or stereo algorithms
CN105338246B (en) Lens devices and imaging device including the lens devices
CN112437908A (en) Depth sensing robot hand-eye camera using structured light
CN116297531B (en) Machine vision detection method, system, medium and equipment
US11803101B2 (en) Method for setting the focus of a film camera
CN208820949U (en) Laser animation projection device
CN109394170B (en) Reflection-free iris information measuring system
Pang et al. Generation of high speed CMOS multiplier-accumulators
CN109309825A (en) Laser animation projection device and control method
Bichsel et al. Automatic interpretation of human head movements
JP6028876B1 (en) Lens unit, imaging device, and control method
CN105196297B (en) Intelligent icon coloring system and workflow thereof
CN211557400U (en) System for testing camera dynamic shooting ambiguity

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20190118