CN108694725A - Robot dynamic tracking method and system based on visual saliency - Google Patents
Robot dynamic tracking method and system based on visual saliency Download PDF Info
- Publication number
- CN108694725A CN108694725A CN201810456224.1A CN201810456224A CN108694725A CN 108694725 A CN108694725 A CN 108694725A CN 201810456224 A CN201810456224 A CN 201810456224A CN 108694725 A CN108694725 A CN 108694725A
- Authority
- CN
- China
- Prior art keywords
- visual feature
- speed
- point
- mobile robot
- feature point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/207—Analysis of motion for motion estimation over a hierarchy of resolutions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/46—Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
- G06V10/462—Salient features, e.g. scale invariant feature transforms [SIFT]
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
- Image Analysis (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention discloses a robot dynamic tracking method based on visual saliency. The steps include: a mobile robot moves along a preset path and captures its surroundings with two cameras to obtain two channels of video information; a digital signal processor in a digital signal processing system performs feature extraction on the two video channels, stereo-matches the extracted feature points, and estimates the speed and direction of motion of each feature point; based on the estimated speed and direction of motion, the feature points are preprocessed to eliminate static feature points; saliency is computed for the preprocessed feature points, and the target dynamic object information is segmented out; the mobile robot is then controlled according to the target dynamic object information to track and follow the target dynamic object. The invention also discloses a robot dynamic tracking system based on visual saliency. The technical solution of the invention enables a mobile robot to detect and track dynamic objects more efficiently and accurately.
Description
Technical field
The invention belongs to the field of robot vision detection, and in particular relates to a robot dynamic tracking method and system based on visual saliency.
Background technology
Currently, well-known robot dynamic object detection and tracking techniques include background subtraction, optical flow, and grid-based methods, but these methods have many defects when applied to robots. For example, they typically require a static camera, or their detection and tracking targets all observed dynamic objects, whereas in some robot applications not every dynamic object needs to be tracked. Methods based on visual saliency are better suited to this problem. Visual saliency methods imitate the mechanism of human visual attention: the most salient (unusual) features tend to attract attention. Such methods usually represent an object of attention by computing a saliency value at each position, and are therefore well suited to a robot using its onboard cameras to detect and track dynamic objects in its vicinity. In addition, existing tracking systems are implemented on general-purpose computer systems, which are bulky, so embedded applications are restricted.
Summary of the invention
In view of the drawbacks of the prior art, the object of the invention is to provide a robot dynamic tracking method and system based on visual saliency which, by combining a saliency-based visual detection method with a digital signal processing (DSP) system, enables a mobile robot to detect and track dynamic objects more efficiently and accurately.
The technical scheme of the invention is as follows:
A robot dynamic tracking method based on visual saliency comprises the following steps:
S1. A mobile robot moves along a preset path and captures its surroundings with two cameras to obtain two channels of video information; the two cameras are mounted on the mobile robot.
S2. The mobile robot transmits the two channels of video information to a digital signal processing system.
S3. A digital signal processor in the digital signal processing system performs feature extraction on the two video channels, applies a stereo matching algorithm to the extracted feature points to obtain visual feature points, and stores the visual feature points in a data buffer unit.
S4. The digital signal processor uses a Multi-RANSAC algorithm to estimate the speed and direction of motion of each visual feature point, and preprocesses the visual feature points according to the estimated speed and direction of motion; the preprocessing eliminates static visual feature points.
S5. The digital signal processor performs a saliency calculation on the remaining preprocessed visual feature points and segments out the target dynamic object information.
S6. The digital signal processor sends instructions to the mobile robot according to the target dynamic object information; the instructions control the mobile robot to track the target dynamic object.
Further, the specific steps of step S4 are:
S41. The digital signal processor uses the Multi-RANSAC algorithm to estimate the speed v of each visual feature point, and obtains the direction of motion of the visual feature point using formula (1):

θ = arctan(vy / vx)    (1)

where vx, vy are the velocity components of the visual feature point's speed v along the x- and y-axes.
S42. The digital signal processor preprocesses the visual feature points using formula (2) to eliminate static visual feature points:

|v̄ − vR| ≤ δ1 and |θ̄ − θR| ≤ δ2    (2)

where vR and θR are the speed and direction of motion of the mobile robot; v̄ and θ̄ are the average speed and direction of the visual feature point; δ1 and δ2 are the speed and direction thresholds, respectively. A static visual feature point is a feature point that satisfies formula (2).
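The formula images are not reproduced in this text; the sketch below implements the natural reading of the symbol definitions for formulas (1) and (2): direction from the velocity components via an arctangent, and a static-point test comparing a point's average speed and direction against the robot's own within the thresholds δ1 and δ2. The specific threshold values in the usage are illustrative.

```python
import math

def motion_direction(vx, vy):
    # Formula (1) as read from its symbol definitions: the direction of a
    # feature point from its velocity components. atan2 is used rather than
    # a bare arctan so all four quadrants are handled.
    return math.atan2(vy, vx)

def is_static(v_mean, theta_mean, v_robot, theta_robot, delta1, delta2):
    # Formula (2) as read from its symbol definitions: a feature point is
    # treated as static when both its average speed and its direction agree
    # with the robot's own motion within the thresholds delta1 and delta2.
    return (abs(v_mean - v_robot) <= delta1 and
            abs(theta_mean - theta_robot) <= delta2)
```

A point failing `is_static` survives the preprocessing and goes on to the saliency stage.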
Further, in step S3 the stereo matching algorithm is a dynamic programming algorithm.
Further, the specific steps of step S5 are:
S51. The digital signal processor performs a saliency calculation on the preprocessed visual feature points.
S52. The digital signal processor selects, according to a preset saliency threshold, the visual feature points whose saliency values exceed the threshold, and segments out the target dynamic object information using the Mean-Shift algorithm.
Further, in step S51 the digital signal processor computes the saliency of the preprocessed visual feature points with a formula in which kp is a normalization coefficient; r is the distance between the mobile robot and the visual feature point; rx and ry are the components of r along the x- and y-axes; vix and viy are the components of the estimated speed of the i-th visual feature point along the x- and y-axes; and vRx and vRy are the components of the mobile robot's speed along the x- and y-axes. (The formula itself is not reproduced in this text.)
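Because the saliency formula itself is missing from this text, the following is only a hypothetical score built from the listed symbols: a point moving differently from the robot (large relative velocity) and close to it (small distance r) scores high, scaled by the normalization coefficient kp. The exact functional form in the patent may well differ.

```python
import math

def saliency_score(kp, rx, ry, vix, viy, vrx, vry):
    # Hypothetical saliency: relative speed of feature point i with respect
    # to the robot, attenuated with distance r = |(rx, ry)|. Built only
    # from the symbols the patent lists; the true formula may differ.
    r = math.hypot(rx, ry)
    relative_speed = math.hypot(vix - vrx, viy - vry)
    return kp * relative_speed / (1.0 + r)
```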
A robot dynamic tracking system based on visual saliency comprises a mobile robot and a digital signal processing system. The mobile robot comprises:
a communication module, for establishing a connection with the digital signal processing system and transmitting data and instructions;
a robot control unit, for controlling the movement of the robot;
a motor driver, for driving the mobile robot to move according to the commands of the robot control unit;
two cameras, for capturing the surroundings and obtaining video information.
The digital signal processing system is an embedded system comprising:
a communication module, for establishing a connection with the mobile robot and transmitting data and instructions;
a digital signal processor, for performing feature extraction on the video information; applying a stereo matching algorithm to the extracted feature points to obtain visual feature points; estimating the speed and direction of motion of the visual feature points using a Multi-RANSAC algorithm; preprocessing the visual feature points according to the estimated speed and direction of motion to eliminate static visual feature points; performing a saliency calculation on the preprocessed visual feature points and segmenting out the target dynamic object information; and generating and sending instructions according to the target dynamic object information, the instructions controlling the robot control unit to drive the motor driver so that the mobile robot tracks the target dynamic object;
a data buffer unit, for storing the visual feature points.
Further, the digital signal processor uses the Multi-RANSAC (Multi-Random Sample Consensus) algorithm to estimate the speed v of each visual feature point, and obtains the direction of motion of the visual feature point using formula (1):

θ = arctan(vy / vx)    (1)

where vx, vy are the velocity components of the visual feature point's speed v along the x- and y-axes.
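A minimal sketch of the Multi-RANSAC idea applied to feature-point velocities: one RANSAC pass finds the velocity with the largest consensus set, and repeating the pass after removing inliers recovers several independent motions. The tolerance, iteration count, and minimal-inlier cutoff are illustrative choices, not values from the patent.

```python
import random

def ransac_velocity(velocities, tol=0.1, iters=100, seed=0):
    # One RANSAC pass: sample a candidate velocity, count the points that
    # agree with it within tol, keep the candidate with the most support.
    rng = random.Random(seed)
    best, best_inliers = None, []
    for _ in range(iters):
        vx, vy = rng.choice(velocities)
        inliers = [(x, y) for (x, y) in velocities
                   if abs(x - vx) <= tol and abs(y - vy) <= tol]
        if len(inliers) > len(best_inliers):
            best, best_inliers = (vx, vy), inliers
    return best, best_inliers

def multi_ransac_velocities(velocities, tol=0.1, min_inliers=3):
    # Multi-RANSAC sketch: fit a model, remove its inliers, and repeat, so
    # that several independently moving groups of points are recovered.
    remaining = list(velocities)
    models = []
    while len(remaining) >= min_inliers:
        model, inliers = ransac_velocity(remaining, tol)
        if len(inliers) < min_inliers:
            break
        models.append(model)
        remaining = [v for v in remaining if v not in inliers]
    return models
```

On a mix of two coherent motions plus an outlier, the sketch recovers both motions and discards the outlier group as too small.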
Further, the digital signal processor preprocesses the visual feature points using formula (2) to eliminate static visual feature points:

|v̄ − vR| ≤ δ1 and |θ̄ − θR| ≤ δ2    (2)

where vR and θR are the speed and direction of motion of the mobile robot; v̄ and θ̄ are the average speed and direction of the visual feature point; δ1 and δ2 are the speed and direction thresholds, respectively.
Further, the digital signal processor computes the saliency of the preprocessed visual feature points with a formula in which kp is a normalization coefficient; r is the distance between the mobile robot and the visual feature point; rx and ry are the components of r along the x- and y-axes; vix and viy are the components of the speed of the i-th visual feature point along the x- and y-axes; and vRx and vRy are the components of the mobile robot's speed along the x- and y-axes. (The formula itself is not reproduced in this text.)
Further, the digital signal processor selects, according to a preset saliency threshold, the visual feature points whose saliency values exceed the threshold, and segments out the target dynamic object information using the Mean-Shift algorithm.
Compared with the prior art, the advantageous technical effects of the invention are as follows:
The invention provides a robot dynamic tracking method and system based on visual saliency. By combining a saliency-based visual detection method with an embedded DSP system, a mobile robot can detect and track dynamic objects more efficiently and accurately; at the same time, because the system is small and easily embedded in other systems, its versatility and ease of use are increased.
Description of the drawings
Fig. 1 is a flow diagram of a robot dynamic tracking method based on visual saliency according to Embodiment 1 of the present invention.
Fig. 2 is a structural diagram of a robot dynamic tracking system based on visual saliency according to Embodiment 2 of the present invention.
Detailed description of the embodiments
In order to fully understand the objects, features, and effects of the present invention, several preferred embodiments of the invention are described below with reference to the accompanying drawings.
Embodiment 1
As shown in Fig. 1, Embodiment 1 discloses a robot dynamic tracking method based on visual saliency, comprising the following steps:
S1. A mobile robot moves along a preset path and captures its surroundings with two cameras to obtain two channels of video information; the two cameras are mounted on the mobile robot.
S2. The mobile robot transmits the two channels of video information to the digital signal processing system.
S3. The digital signal processor in the digital signal processing system performs feature extraction on the two video channels, applies a stereo matching algorithm to the extracted feature points to obtain visual feature points, and stores the visual feature points in the data buffer unit.
Specifically, the stereo matching algorithm of step S3 is a dynamic programming algorithm.
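The patent names dynamic programming as the stereo matching algorithm but gives no further details, so the following is a classic scanline sketch under that assumption: matched pixels pay their absolute intensity difference, a skipped (occluded) pixel pays a fixed penalty, and the minimum-cost alignment of one left/right scanline pair is recovered by backtracking. The occlusion penalty value is an illustrative choice.

```python
def dp_scanline_match(left, right, occlusion=2.0):
    # Dynamic-programming stereo matching on one pair of scanlines: an
    # edit-distance-style alignment of left against right intensities.
    n, m = len(left), len(right)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            if i > 0 and j > 0:  # match left[i-1] with right[j-1]
                c = cost[i - 1][j - 1] + abs(left[i - 1] - right[j - 1])
                cost[i][j] = min(cost[i][j], c)
            if i > 0:            # left pixel occluded
                cost[i][j] = min(cost[i][j], cost[i - 1][j] + occlusion)
            if j > 0:            # right pixel occluded
                cost[i][j] = min(cost[i][j], cost[i][j - 1] + occlusion)
    # Backtrack to recover the matched pixel pairs (i, j).
    i, j, pairs = n, m, []
    while i > 0 or j > 0:
        if (i > 0 and j > 0 and
                cost[i][j] == cost[i - 1][j - 1] + abs(left[i - 1] - right[j - 1])):
            pairs.append((i - 1, j - 1))
            i, j = i - 1, j - 1
        elif i > 0 and cost[i][j] == cost[i - 1][j] + occlusion:
            i -= 1
        else:
            j -= 1
    pairs.reverse()
    return pairs
```

The index difference i − j of each matched pair is the disparity used to obtain depth for the corresponding feature point.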
S4. The digital signal processor uses the Multi-RANSAC algorithm to estimate the speed and direction of motion of each visual feature point, and preprocesses the visual feature points according to the estimated speed and direction of motion; the preprocessing eliminates static visual feature points.
Specifically, the steps of step S4 are:
S41. The digital signal processor uses the Multi-RANSAC algorithm to estimate the speed v of each visual feature point, and obtains the direction of motion using formula (1):

θ = arctan(vy / vx)    (1)

where vx, vy are the velocity components of the visual feature point's speed v along the x- and y-axes.
S42. The digital signal processor preprocesses the visual feature points using formula (2) to eliminate static visual feature points:

|v̄ − vR| ≤ δ1 and |θ̄ − θR| ≤ δ2    (2)

where vR and θR are the speed and direction of motion of the mobile robot; v̄ and θ̄ are the average speed and direction of the visual feature point; δ1 and δ2 are the speed and direction thresholds, respectively; a static visual feature point is a feature point that satisfies formula (2).
S5. The digital signal processor performs a saliency calculation on the remaining preprocessed visual feature points and segments out the target dynamic object information.
Specifically, the steps of step S5 are:
S51. The digital signal processor performs a saliency calculation on the preprocessed visual feature points.
S52. The digital signal processor selects, according to a preset saliency threshold, the visual feature points whose saliency values exceed the threshold, and segments out the target dynamic object information using the Mean-Shift algorithm.
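The Mean-Shift segmentation of step S52 can be sketched with the plain flat-kernel form of the algorithm: starting from a salient feature point, the window center repeatedly moves to the mean of the feature points inside the window until it converges on a local density mode, which serves as the target location. The window radius and iteration limits below are illustrative choices, not values from the patent.

```python
import math

def mean_shift_mode(points, start, radius=1.0, iters=50, eps=1e-6):
    # Flat-kernel Mean-Shift: move the window center to the mean of the
    # points inside it until the shift falls below eps (a local mode).
    cx, cy = start
    for _ in range(iters):
        inside = [(x, y) for (x, y) in points
                  if math.hypot(x - cx, y - cy) <= radius]
        if not inside:
            break
        nx = sum(x for x, _ in inside) / len(inside)
        ny = sum(y for _, y in inside) / len(inside)
        if math.hypot(nx - cx, ny - cy) < eps:
            break
        cx, cy = nx, ny
    return cx, cy
```

Seeded near a cluster of salient points, the window settles on that cluster's center and ignores the other cluster outside the radius.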
Specifically, in step S51 the digital signal processor computes the saliency of the preprocessed visual feature points with a formula in which kp is a normalization coefficient; r is the distance between the mobile robot and the visual feature point; rx and ry are the components of r along the x- and y-axes; vix and viy are the components of the estimated speed of the i-th visual feature point along the x- and y-axes; and vRx and vRy are the components of the mobile robot's speed along the x- and y-axes. (The formula itself is not reproduced in this text.)
S6. The digital signal processor sends instructions to the mobile robot according to the target dynamic object information; the instructions control the mobile robot to track the target dynamic object.
Embodiment 2
As shown in Fig. 2, this embodiment discloses a robot dynamic tracking system based on visual saliency, comprising a mobile robot and a digital signal processing system. The mobile robot comprises:
communication module 1, for establishing a connection with the digital signal processing system and transmitting data and instructions;
robot control unit 2, for controlling the movement of the robot;
motor driver 3, for driving the mobile robot to move according to the commands of the robot control unit;
two cameras 4, for capturing the surroundings and obtaining video information.
The digital signal processing system is an embedded system comprising:
communication module 5, for establishing a connection with the mobile robot and transmitting data and instructions;
digital signal processor 6, for performing feature extraction on the video information; applying a stereo matching algorithm to the extracted feature points to obtain visual feature points; estimating the speed and direction of motion of the visual feature points using the Multi-RANSAC algorithm; preprocessing the visual feature points according to the estimated speed and direction of motion to eliminate static visual feature points; performing a saliency calculation on the preprocessed visual feature points and segmenting out the target dynamic object information; and generating and sending instructions according to the target dynamic object information, the instructions controlling the robot control unit to drive the motor driver so that the mobile robot tracks the target dynamic object;
a data buffer unit, for storing the visual feature points.
Specifically, digital signal processor 6 uses the Multi-RANSAC (Multi-Random Sample Consensus) algorithm to estimate the speed v of each visual feature point, and obtains the direction of motion using formula (1):

θ = arctan(vy / vx)    (1)

where vx, vy are the velocity components of the visual feature point's speed v along the x- and y-axes.
Specifically, digital signal processor 6 preprocesses the visual feature points using formula (2) to eliminate static visual feature points:

|v̄ − vR| ≤ δ1 and |θ̄ − θR| ≤ δ2    (2)

where vR and θR are the speed and direction of motion of the mobile robot; v̄ and θ̄ are the average speed and direction of the visual feature point; δ1 and δ2 are the speed and direction thresholds, respectively.
Specifically, digital signal processor 6 computes the saliency of the preprocessed visual feature points with a formula in which kp is a normalization coefficient; r is the distance between the mobile robot and the visual feature point; rx and ry are the components of r along the x- and y-axes; vix and viy are the components of the speed of the i-th visual feature point along the x- and y-axes; and vRx and vRy are the components of the mobile robot's speed along the x- and y-axes. (The formula itself is not reproduced in this text.)
Specifically, digital signal processor 6 selects, according to a preset saliency threshold, the visual feature points whose saliency values exceed the threshold, and segments out the target dynamic object information using the Mean-Shift algorithm.
Through the robot dynamic tracking method and system based on visual saliency provided by the invention, which combine a saliency-based visual detection method with an embedded DSP system, a mobile robot can detect and track dynamic objects more efficiently and accurately; at the same time, because the system is small and easily embedded in other systems, its versatility and ease of use are increased.
The preferred embodiments of the present invention have been described in detail above. It should be understood that those skilled in the art can make many modifications and variations according to the concept of the present invention without creative work. Therefore, any technical solution that a person skilled in the art can obtain through logical analysis, reasoning, or limited experimentation on the basis of the prior art and according to the concept of the present invention shall fall within the scope of protection determined by the claims.
Claims (10)
1. A robot dynamic tracking method based on visual saliency, characterized by comprising the following steps:
S1. A mobile robot moves along a preset path and captures its surroundings with two cameras to obtain two channels of video information; the two cameras are mounted on the mobile robot;
S2. The mobile robot transmits the two channels of video information to a digital signal processing system;
S3. A digital signal processor in the digital signal processing system performs feature extraction on the two video channels, applies a stereo matching algorithm to the extracted feature points to obtain visual feature points, and stores the visual feature points in a data buffer unit;
S4. The digital signal processor uses a Multi-RANSAC algorithm to estimate the speed and direction of motion of the visual feature points, and preprocesses the visual feature points according to the estimated speed and direction of motion; the preprocessing eliminates static visual feature points;
S5. The digital signal processor performs a saliency calculation on the remaining preprocessed visual feature points and segments out the target dynamic object information;
S6. The digital signal processor sends instructions to the mobile robot according to the target dynamic object information; the instructions control the mobile robot to track the target dynamic object.
2. The robot dynamic tracking method based on visual saliency of claim 1, characterized in that the specific steps of step S4 are:
S41. The digital signal processor uses the Multi-RANSAC algorithm to estimate the speed v of each visual feature point, and obtains the direction of motion of the visual feature point using formula (1):

θ = arctan(vy / vx)    (1)

where vx, vy are the velocity components of the visual feature point's speed v along the x- and y-axes;
S42. The digital signal processor preprocesses the visual feature points using formula (2) to eliminate static visual feature points:

|v̄ − vR| ≤ δ1 and |θ̄ − θR| ≤ δ2    (2)

where vR and θR are the speed and direction of motion of the mobile robot; v̄ and θ̄ are the average speed and direction of the visual feature point; δ1 and δ2 are the speed and direction thresholds, respectively; a static visual feature point is a feature point that satisfies formula (2).
3. The robot dynamic tracking method based on visual saliency of claim 1, characterized in that in step S3 the stereo matching algorithm is a dynamic programming algorithm.
4. The robot dynamic tracking method based on visual saliency of claim 1 or 2, characterized in that the specific steps of step S5 are:
S51. The digital signal processor performs a saliency calculation on the preprocessed visual feature points;
S52. The digital signal processor selects, according to a preset saliency threshold, the visual feature points whose saliency values exceed the threshold, and segments out the target dynamic object information using the Mean-Shift algorithm.
5. The robot dynamic tracking method based on visual saliency of claim 4, characterized in that in step S51 the digital signal processor computes the saliency of the preprocessed visual feature points with a formula in which kp is a normalization coefficient; r is the distance between the mobile robot and the visual feature point; rx and ry are the components of r along the x- and y-axes; vix and viy are the components of the speed of the i-th visual feature point along the x- and y-axes; and vRx and vRy are the components of the mobile robot's speed along the x- and y-axes.
6. A robot dynamic tracking system based on visual saliency, characterized by comprising a mobile robot and a digital signal processing system; the mobile robot comprising:
a communication module, for establishing a connection with the digital signal processing system and transmitting data and instructions;
a robot control unit, for controlling the movement of the robot;
a motor driver, for driving the mobile robot to move according to the commands of the robot control unit;
two cameras, for capturing the surroundings and obtaining video information;
the digital signal processing system being an embedded system comprising:
a communication module, for establishing a connection with the mobile robot and transmitting data and instructions;
a digital signal processor, for performing feature extraction on the video information, applying a stereo matching algorithm to the extracted feature points to obtain visual feature points, estimating the speed and direction of motion of the visual feature points using a Multi-RANSAC algorithm, preprocessing the visual feature points according to the estimated speed and direction of motion to eliminate static visual feature points, performing a saliency calculation on the preprocessed visual feature points and segmenting out the target dynamic object information, and generating and sending instructions according to the target dynamic object information, the instructions controlling the robot control unit to drive the motor driver so that the mobile robot tracks the target dynamic object;
a data buffer unit, for storing the visual feature points.
7. The robot dynamic tracking system based on visual saliency of claim 6, characterized in that the digital signal processor uses the Multi-RANSAC (Multi-Random Sample Consensus) algorithm to estimate the speed v of each visual feature point, and obtains the direction of motion of the visual feature point using formula (1):

θ = arctan(vy / vx)    (1)

where vx, vy are the velocity components of the visual feature point's speed v along the x- and y-axes.
8. The robot dynamic tracking system based on visual saliency of claim 6 or 7, characterized in that the digital signal processor preprocesses the visual feature points using formula (2) to eliminate static visual feature points:

|v̄ − vR| ≤ δ1 and |θ̄ − θR| ≤ δ2    (2)

where vR and θR are the speed and direction of motion of the mobile robot; v̄ and θ̄ are the average speed and direction of the visual feature point; δ1 and δ2 are the speed and direction thresholds, respectively.
9. The robot dynamic tracking system based on visual saliency of claim 6, characterized in that the digital signal processor computes the saliency of the preprocessed feature points with a formula in which kp is a normalization coefficient; r is the distance between the mobile robot and the visual feature point; rx and ry are the components of r along the x- and y-axes; vix and viy are the components of the speed of the i-th visual feature point along the x- and y-axes; and vRx and vRy are the components of the mobile robot's speed along the x- and y-axes.
10. The robot dynamic tracking system based on visual saliency of claim 9, characterized in that the digital signal processor selects, according to a preset saliency threshold, the visual feature points whose saliency values exceed the threshold, and segments out the target dynamic object information using the Mean-Shift algorithm.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810456224.1A CN108694725A (en) | 2018-05-14 | 2018-05-14 | Robot dynamic tracking method and system based on visual saliency |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810456224.1A CN108694725A (en) | 2018-05-14 | 2018-05-14 | Robot dynamic tracking method and system based on visual saliency |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108694725A true CN108694725A (en) | 2018-10-23 |
Family
ID=63847405
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810456224.1A Pending CN108694725A (en) | Robot dynamic tracking method and system based on visual saliency | 2018-05-14 | 2018-05-14 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108694725A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110244746A (en) * | 2019-07-11 | 2019-09-17 | 肇庆学院 | Robot dynamic barrier avoiding method and system based on visual attention |
CN110415273A (en) * | 2019-07-29 | 2019-11-05 | 肇庆学院 | Efficient robot motion tracking method and system based on visual saliency |
CN115601432A (en) * | 2022-11-08 | 2023-01-13 | 肇庆学院 | Robot position optimal estimation method and system based on FPGA |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104751466A (en) * | 2015-04-01 | 2015-07-01 | 电子科技大学 | Deformable object tracking algorithm based on visual salience and system thereof |
WO2015163830A1 (en) * | 2014-04-22 | 2015-10-29 | Aselsan Elektronik Sanayi Ve Ticaret Anonim Sirketi | Target localization and size estimation via multiple model learning in visual tracking |
CN105893957A (en) * | 2016-03-30 | 2016-08-24 | 上海交通大学 | Vision-based method for recognizing and tracking ships on a lake surface |
-
2018
- 2018-05-14 CN CN201810456224.1A patent/CN108694725A/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015163830A1 (en) * | 2014-04-22 | 2015-10-29 | Aselsan Elektronik Sanayi Ve Ticaret Anonim Sirketi | Target localization and size estimation via multiple model learning in visual tracking |
CN104751466A (en) * | 2015-04-01 | 2015-07-01 | 电子科技大学 | Deform able object tracking algorithm based on visual salience and system thereof |
CN105893957A (en) * | 2016-03-30 | 2016-08-24 | 上海交通大学 | Vision-based method for recognizing and tracking ships on a lake surface |
Non-Patent Citations (5)
Title |
---|
BINGHUA GUO et al., "Visual Saliency-Based Motion Detection Technique for Mobile Robots", International Journal of Robotics & Automation * |
He Xiao, "Vision-based target tracking research and its implementation in mobile robots", China Master's Theses Full-text Database, Information Science and Technology series * |
Li Zhulin et al., Image Stereo Matching Technology and Its Development and Application, 31 July 2007 * |
Fan Xiangmeng et al., "Mean-shift tracking algorithm based on visual saliency", Transducer and Microsystem Technologies * |
Guo Binghua et al., "Dynamic environment modeling for mobile robots based on visual saliency", Control Theory & Applications * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110244746A (en) * | 2019-07-11 | 2019-09-17 | 肇庆学院 | A robot dynamic obstacle avoidance method and system based on visual attention |
CN110244746B (en) * | 2019-07-11 | 2020-02-18 | 肇庆学院 | Robot dynamic obstacle avoidance method and system based on visual attention |
CN110415273A (en) * | 2019-07-29 | 2019-11-05 | 肇庆学院 | An efficient robot motion tracking method and system based on visual saliency |
CN115601432A (en) * | 2022-11-08 | 2023-01-13 | 肇庆学院 | Robot position optimal estimation method and system based on FPGA |
CN115601432B (en) * | 2022-11-08 | 2023-05-30 | 肇庆学院 | Robot position optimal estimation method and system based on FPGA |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Reid et al. | Active tracking of foveated feature clusters using affine structure | |
Lou et al. | 3-D model-based vehicle tracking | |
CN103268616A (en) | Multi-feature multi-sensor method for mobile robot to track moving body | |
Lee et al. | Ground-moving-platform-based human tracking using visual SLAM and constrained multiple kernels | |
CN108694725A (en) | A robot dynamic tracking method and system based on visual saliency | |
CN112785628B (en) | Track prediction method and system based on panoramic view angle detection tracking | |
CN103886325A (en) | Block-based circulant matrix video tracking method | |
CN103150737A (en) | Real-time space target feature point tracking method suitable for space tethered robot | |
CN107097256B (en) | Vision-based model-free target tracking method for a nonholonomic mobile robot in polar coordinates | |
CN112861808B (en) | Dynamic gesture recognition method, device, computer equipment and readable storage medium | |
CN106887012A (en) | A fast adaptive multiscale target tracking method based on circulant matrices | |
Mi et al. | A system for an anticipative front human following robot | |
Chen et al. | Real-time Visual Object Tracking via CamShift-Based Robust Framework. | |
Lai et al. | A survey of deep learning application in dynamic visual SLAM | |
CN109903309A (en) | A robot motion information estimation method based on angular optical flow | |
Zhang et al. | Event-based circular detection for AUV docking based on spiking neural network | |
Königs et al. | Fast visual people tracking using a feature-based people detector | |
Chai et al. | 3D gesture recognition method based on faster R-CNN network | |
Zhou et al. | SURF feature detection method used in object tracking | |
Xu et al. | Research on the method of 3D registration technology | |
Gunawan et al. | Geometric deep particle filter for motorcycle tracking: development of intelligent traffic system in Jakarta | |
CN110244746A (en) | A robot dynamic obstacle avoidance method and system based on visual attention | |
Pei et al. | Dynamic SLAM system using histogram-based outlier score to improve anomaly detection | |
CN117576165B (en) | Ship multi-target tracking method and device, electronic equipment and storage medium | |
Feng et al. | Initialization of 3D Human Hand Model and Its Applications in Human Hand Tracking. |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | PB01 | Publication | |
 | SE01 | Entry into force of request for substantive examination | |
 | RJ01 | Rejection of invention patent application after publication | Application publication date: 20181023 |