CN108491758A - Track detection method and robot - Google Patents

Track detection method and robot

Info

Publication number
CN108491758A
CN108491758A (application CN201810126117.2A)
Authority
CN
China
Prior art keywords
robot
image
mentioned
track
video stream
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810126117.2A
Other languages
Chinese (zh)
Other versions
CN108491758B (en)
Inventor
冯平
卢思岑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Rui Ling Innovation Technology Development Co Ltd
Original Assignee
Shenzhen Rui Ling Innovation Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Rui Ling Innovation Technology Development Co Ltd filed Critical Shenzhen Rui Ling Innovation Technology Development Co Ltd
Priority to CN201810126117.2A priority Critical patent/CN108491758B/en
Publication of CN108491758A publication Critical patent/CN108491758A/en
Application granted granted Critical
Publication of CN108491758B publication Critical patent/CN108491758B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a track detection method and a robot. The track detection method includes: the robot acquires a live video stream of a track; based on the live video stream, the position of the robot is determined by visual positioning; a traveling scheme of the robot is determined according to the position of the robot; the robot is controlled to travel according to the traveling scheme; each frame image of the live video stream is checked for a target image, where the target image is an image showing a fault region of the track; if a target image exists in the frame images of the live video stream, the target image is uploaded to a server, so that the server performs a secondary analysis on the target image and, based on the result of the secondary analysis, pushes the target image to the corresponding client for display. The scheme of the invention enables automatic detection of the infrastructure and infrastructure equipment of rail transit and helps ensure the safe operation of a rail transit system.

Description

Track detection method and robot
Technical field
The invention belongs to the field of robotics, and in particular relates to a track detection method and a robot.
Background technology
Over the past decade, the development of rail transit has been concentrated in first-tier cities and provincial capitals, and it is now spreading to second- and third-tier cities. Safety is the first requirement of rail transit. To ensure the safety of rail transit, its infrastructure and infrastructure equipment follow a schedule of 48-hour inspections, monthly tests, semi-annual tests, annual tests, and major overhauls every five years. Some of these checks, such as the 48-hour inspection, rely almost entirely on manual work. Traditional manual inspection not only consumes large amounts of manpower and material resources, but also suffers from false detections, missed detections, and safety risks to the inspectors.
Invention content
In view of this, the present invention provides a track detection method and a robot that enable automatic detection of the infrastructure and infrastructure equipment of rail transit and help ensure the safe operation of a rail transit system.
A first aspect of the present invention provides a track detection method. The track detection method includes:
the robot acquires a live video stream of a track;
based on the live video stream, the position of the robot is determined by visual positioning;
a traveling scheme of the robot is determined according to the position of the robot;
the robot is controlled to travel according to the traveling scheme;
each frame image of the live video stream is checked for a target image, where the target image is an image showing a fault region of the track;
if a target image exists in the frame images of the live video stream, the target image is uploaded to a server, so that the server performs a secondary analysis on the target image and, based on the result of the secondary analysis, pushes the target image to the corresponding client for display.
A second aspect of the present invention provides a robot applied in the field of rail transit. The robot includes:
a video acquisition module, configured to acquire a live video stream of a track;
a visual positioning module, configured to determine the position of the robot by visual positioning based on the live video stream;
a scheme determination module, configured to determine a traveling scheme of the robot according to the position of the robot;
a traveling control module, configured to control the robot to travel according to the traveling scheme;
a fault detection module, configured to detect whether a target image exists in the frame images of the live video stream, where the target image is an image showing a fault region of the track;
an information uploading module, configured to, when a target image exists in the frame images of the live video stream, upload the target image to a server, so that the server performs a secondary analysis on the target image and, based on the result of the secondary analysis, pushes the target image to the corresponding client for display.
Therefore, in the scheme of the present invention, the robot first acquires a live video stream of the track; based on the live video stream, the position of the robot is determined by visual positioning; the traveling scheme of the robot is then determined according to that position, and the robot is controlled to travel according to the traveling scheme. Each frame image of the live video stream is checked for a target image, where the target image is an image showing a fault region of the track; when a target image exists in the frame images of the live video stream, it is uploaded to the server, so that the server performs a secondary analysis on it and, based on the result of the secondary analysis, pushes the target image to the corresponding client for display. The scheme of the present invention hands the daily inspection of the track over to the robot. Because the robot quickly identifies fault regions during the inspection, the probability of false detections and missed detections is reduced, automatic detection of the infrastructure and infrastructure equipment of rail transit is achieved, and the safe operation of the rail transit system is ensured.
Description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed in the embodiments or in the description of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of the track detection method provided by an embodiment of the present invention;
Fig. 2(a) is a structural diagram of the track detection system provided by an embodiment of the present invention;
Fig. 2(b) is a schematic flowchart of a specific implementation of step 105 in the embodiment shown in Fig. 1;
Fig. 3 is a schematic flowchart of a specific implementation of step 102 in the embodiment shown in Fig. 1;
Fig. 4(a) is a schematic diagram of the coordinate system in step 301 of the embodiment shown in Fig. 3;
Fig. 4(b) is a schematic diagram of the center line of the image and the center line of the track in the image in step 303 of the embodiment shown in Fig. 3;
Fig. 5 is a schematic flowchart of a specific implementation of step 103 in the embodiment shown in Fig. 1;
Fig. 6 is a structural diagram of the robot provided by an embodiment of the present invention.
Specific implementation mode
In the following description, specific details such as particular system structures and techniques are set forth for purposes of illustration rather than limitation, in order to provide a thorough understanding of the embodiments of the present invention. However, it will be apparent to those skilled in the art that the present invention can also be implemented in other embodiments without these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so that unnecessary detail does not obscure the description of the invention.
To illustrate the technical solution of the present invention, specific embodiments are described below.
Embodiment one
Fig. 1 shows the implementation flow of the track detection method provided by Embodiment 1 of the present invention, detailed as follows:
In step 101, the robot acquires a live video stream of the track;
In this embodiment of the present invention, when starting a robot applied in the field of rail transit, maintenance personnel first place the robot at a track position; for example, a rail-running robot is mounted on the rails of the track, or a flying robot is released near the track. After starting, the robot first sets the frame rate of its image acquisition device (such as a camera) according to the sampling theorem, and then begins to acquire the live video stream of the track. Specifically, the robot is equipped with multiple cameras: cameras mounted at different positions on the robot capture track images from different angles, and different camera types capture live video streams of different video types, including but not limited to infrared video, grayscale video, and multi-channel ultra-high-definition video, which is not limited here. Specifically, the multi-channel ultra-high-definition images in the multi-channel ultra-high-definition video are used to assess faults such as bridge cracks and foreign objects on the catenary, while the infrared images in the infrared video are used to assess faults such as water seepage in tunnels and bridges and catenary failures; that is, different types of video are used to assess different types of faults. Optionally, after the robot has collected the various types of live video streams of the track, it can check whether each frame image in the live video stream meets a preset image quality condition; if a multi-channel ultra-high-definition image and/or infrared image in the live video stream does not meet the image quality condition, that image is removed from the live video stream, so that poor-quality images do not affect the analysis of the live video stream by the robot and the server, reducing the probability of erroneous analysis results.
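As one illustration of the optional quality filtering described above, the sketch below drops frames whose sharpness or brightness falls outside configurable bounds. It is a minimal example rather than the patented method: the Laplacian-variance sharpness measure, the brightness bounds, and all threshold values are assumptions chosen for illustration.

```python
import cv2
import numpy as np

def frame_meets_quality(frame, min_sharpness=80.0, brightness_range=(40, 220)):
    """Return True if a BGR frame looks usable for later fault analysis.

    Sharpness is estimated with the variance of the Laplacian (a low value
    usually means motion blur or defocus); brightness is the mean gray level.
    Both criteria and thresholds are illustrative assumptions.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    brightness = float(np.mean(gray))
    return sharpness >= min_sharpness and brightness_range[0] <= brightness <= brightness_range[1]

def filter_stream(frames):
    """Keep only the frames that pass the quality check."""
    return [f for f in frames if frame_meets_quality(f)]
```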
In step 102, based on the live video stream, the position of the robot is determined by visual positioning;
In this embodiment of the present invention, a real-time image can be obtained from the live video stream; specifically, the most recently acquired frame can be regarded as the real-time image. After starting, the robot can determine its own position by visual positioning based on the real-time image, thereby locating itself. Optionally, a GPS module or another positioning module can be carried inside the robot; the GPS module or other positioning module provides a coarse position of the robot, and visual positioning on the real-time track image collected by the robot then provides a precise position. In other words, coarse positioning narrows down the small area in which the robot may be located, and precise positioning is performed within that small area, reducing the computational load of precise positioning. Since the position of a track does not change once it has been laid, i.e. the track position is fixed, an environment map of the track can be obtained in advance; the robot is trained on this environment map and then restarted. After the robot starts and acquires a real-time image of the track, its position can be determined through image processing operations such as feature point extraction on the track image. Of course, depending on the type of robot, the position of the robot can also be determined in other ways, which is not limited here.
In step 103, the traveling scheme of the robot is determined according to the position of the robot;
In this embodiment of the present invention, the traveling scheme of the robot can be further determined from the current position of the robot determined in step 102. Specifically, the traveling route of the robot can be planned from the route of the track and the current position of the robot: when the current position of the robot coincides with the route of the track, the robot is controlled to follow the route of the track; when the current position of the robot deviates from the track, the robot is first controlled to travel to a position overlapping the route of the track and then to follow the route of the track. Of course, the traveling scheme of the robot can also be determined in other ways, which is not limited here.
In step 104, the robot is controlled to travel according to the traveling scheme;
In this embodiment of the present invention, the robot can travel according to the traveling scheme for a period of time and, during travel, continuously update its traveling scheme through steps 101 to 103.
In step 105, each frame image of the live video stream is checked for a target image;
In this embodiment of the present invention, the embedded hardware carried by the robot performs image processing on the live video stream to detect whether a target image exists in the frame images of the live video stream, where the target image is an image showing a fault region of the track. Since a track is composed of two parallel rails and sleepers laid at equal intervals, pictures of a normal track taken under normal conditions look very similar; that is, if the live video stream captures a normal track, the similarity between adjacent frame images in the stream should be high. A sudden change in the picture is therefore likely to mean that a fault has occurred on the track, and this can be used to detect whether a target image exists in the frame images. For example, if a foreign object appears at point A on the track while everything else is normal, then while the live video stream is being captured, the images containing point A will differ significantly from the images that do not contain point A. In other words, in the live video stream, track images with a foreign object necessarily differ noticeably from track images without one, so whether a foreign object is present on the track can be judged from the similarity of adjacent frame images. Of course, other obstacles, such as large cracks, can also be detected from the similarity of adjacent frame images, which is not limited here.
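A minimal sketch of the adjacent-frame comparison described above, assuming a grayscale-histogram correlation as the similarity measure; the measure and the 0.9 threshold are illustrative assumptions, not values taken from the patent.

```python
import cv2

def frames_similar(frame_a, frame_b, threshold=0.9):
    """Compare two BGR frames with a grayscale-histogram correlation score."""
    hists = []
    for frame in (frame_a, frame_b):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        hist = cv2.calcHist([gray], [0], None, [64], [0, 256])
        cv2.normalize(hist, hist)
        hists.append(hist)
    score = cv2.compareHist(hists[0], hists[1], cv2.HISTCMP_CORREL)
    return score >= threshold

def find_suspect_frames(frames):
    """Indices of frames whose appearance changes abruptly from the previous frame."""
    return [i for i in range(1, len(frames))
            if not frames_similar(frames[i - 1], frames[i])]
```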
In step 106, if a target image exists in the frame images of the live video stream, the target image is uploaded to a server.
In this embodiment of the present invention, the robot, a server, a personal computer (PC) and/or a mobile terminal can form a track detection system. The server can further include a data management platform and a database, where the database stores the various kinds of information obtained by the robot, and the data management platform provides an interactive interface for displaying, on the server side, the information obtained by the robot; that is, maintenance personnel can interact with the robot through the data management platform. Fig. 2(a) shows the structure of this track detection system. In the track detection system, once a target image is detected, the target image is uploaded to the server over a network, so that the server performs a secondary analysis on the target image and notifies the maintenance personnel of the result. Specifically, the PC and the mobile terminal can run a client of the track detection system; after the client logs in, the target image can be pushed, based on the result of the secondary analysis, to the corresponding client for display, for example on the client of the PC and/or the client of the mobile terminal, which is not limited here. The client can also access the data management platform and the database (i.e. access the server). The network can be a Wireless Fidelity (WiFi) network or a mobile data network, such as a General Packet Radio Service (GPRS) network, a third-generation mobile communication network (3G) or a fourth-generation mobile communication network (4G), which is not limited here. Since the robot may make mistakes when detecting target images in the live video stream, for example when strong illumination blurs the acquired image or produces indeterminate regions that mislead the robot's judgement, so that an image without a fault region is uploaded to the server as a target image, the server performs a secondary analysis on the target image and stores it in the database in order to improve the accuracy of fault detection. If the result of the secondary analysis confirms that a fault region exists in the target image, i.e. a fault exists on the track shown in the target image, the relevant information of the target image can be displayed on the data management platform of the server. Further, if the secondary analysis confirms the fault region, the server can also judge, based on the result, whether someone needs to be sent to repair the fault; if so, the relevant information of the target image is displayed on the client of the pre-bound mobile terminal and/or PC. Specifically, if the track is a subway track, the subway line, the fault type indicated by the target image, the fault severity, the acquisition time, and the acquisition location can be displayed on the client of the mobile terminal and/or PC through the data management platform; other relevant information of the target image can of course also be displayed on the mobile terminal and/or the data management platform, which is not limited here. The acquisition location indicates the specific position on the track where the fault occurred. Since several robots may be operating on each section of track, the data management platform may need to display the information of two or more target images within a period of time. The relevant information of multiple target images can therefore be sorted by acquisition time from earliest to latest, by fault severity from most severe to least severe, or by the distance between the acquisition location and the local machine (i.e. the mobile terminal and/or PC displaying the information) from nearest to farthest, and displayed in the sorted order.
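The sorting options just described can be expressed compactly; the record fields and the severity scale below are assumptions introduced for illustration only.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TargetImageRecord:
    image_id: str
    acquired_at: datetime
    severity: int          # assumed scale: larger means more severe
    distance_km: float     # distance from the viewing client to the acquisition location

def sort_records(records, mode="time"):
    """Sort fault records for display: by time (oldest first), severity
    (most severe first), or distance (nearest first)."""
    keys = {
        "time": lambda r: r.acquired_at,
        "severity": lambda r: -r.severity,
        "distance": lambda r: r.distance_km,
    }
    return sorted(records, key=keys[mode])
```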
Optionally, in order to improve the accuracy of target image detection, Fig. 2(b) shows a specific implementation flow of step 105, detailed as follows:
In step 201, binarization is performed on each frame image of the live video stream;
In this embodiment of the present invention, a binarization threshold is set first; specifically, the bimodal method, the maximum between-class variance method (Otsu's method), the maximum entropy thresholding method or an iterative method can be used to set the binarization threshold, and each frame image of the live video stream is then binarized with the set threshold.
In step 202, the region of interest of each binarized frame image is extracted;
In this embodiment of the present invention, the region of interest (ROI) is an image region selected from the image and is the focus of the image analysis. Specifically, the region of interest of each binarized frame image can be extracted with preset operators and functions.
In step 203, a denoising operation is performed on the region of interest of each frame image based on a morphological algorithm;
In this embodiment of the present invention, denoising the region of interest of each frame image specifically combines opening and closing operations to form a morphological filter; in this process, suitable structuring elements must be chosen to achieve a good denoising effect.
In step 204, for each denoised frame image, the image-feature difference between the region of interest of the image and the region of interest of an adjacent frame image is calculated;
In this embodiment of the present invention, the adjacent frame image is the image adjacent to the current image in the live video stream. For example, for the first frame image in the live video stream, the adjacent image is the following frame; for the last frame image in the live video stream, the adjacent image is the preceding frame. For an intermediate frame image in the live video stream, the adjacent image is the preceding or the following frame; for example, the image-feature difference between the region of interest of the intermediate frame and that of the preceding frame can be calculated first, and if the image quality of the preceding frame is poor, the difference between the region of interest of the intermediate frame and that of the following frame can be calculated instead. The image features include but are not limited to: color features, texture features, shape features, and spatial-relationship features.
In step 205, if the image-feature difference between the region of interest of the image and the region of interest of the adjacent frame image exceeds a preset image-feature difference range, the image and/or the adjacent frame image is determined to be a target image.
In this embodiment of the present invention, the preset image-feature difference range can be adjusted according to the ambient lighting conditions at shooting time, which is not limited here.
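To make the flow of steps 201 to 205 concrete, the sketch below strings together Otsu binarization, a fixed ROI crop, morphological opening and closing, and a feature comparison between adjacent frames. The ROI coordinates, the choice of Otsu's method, the 5x5 structuring element, and the use of a histogram distance as the "image-feature difference" are all assumptions made for illustration.

```python
import cv2
import numpy as np

KERNEL = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))

def preprocess(frame, roi=(200, 0, 880, 720)):
    """Binarize (Otsu), crop an assumed ROI (x, y, w, h), then open + close to denoise."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    x, y, w, h = roi
    region = binary[y:y + h, x:x + w]
    region = cv2.morphologyEx(region, cv2.MORPH_OPEN, KERNEL)
    region = cv2.morphologyEx(region, cv2.MORPH_CLOSE, KERNEL)
    return region

def feature_difference(roi_a, roi_b):
    """Illustrative 'image-feature difference': chi-square distance of histograms."""
    ha = cv2.calcHist([roi_a], [0], None, [16], [0, 256])
    hb = cv2.calcHist([roi_b], [0], None, [16], [0, 256])
    cv2.normalize(ha, ha)
    cv2.normalize(hb, hb)
    return cv2.compareHist(ha, hb, cv2.HISTCMP_CHISQR)

def detect_target_frames(frames, max_difference=0.5):
    """Return indices of frames whose ROI differs too much from the previous frame."""
    rois = [preprocess(f) for f in frames]
    return [i for i in range(1, len(rois))
            if feature_difference(rois[i - 1], rois[i]) > max_difference]
```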
Optionally, the track detection method further includes:
during the travel of the robot, obtaining the distance between the robot and an obstacle in the robot's environment, based on a distance sensor carried by the robot;
if the distance is not greater than a preset distance threshold, adjusting the traveling scheme so that the robot moves away from the obstacle, or suspending the travel of the robot.
Depending on the type of the robot, distance sensors can be mounted at different positions on the robot. In one application scenario, the robot is a flying robot; distance sensors can then be mounted on all sides of the robot (front, back, left, right, above, and below) to avoid hitting obstacles that may appear in the air along the way. In another application scenario, the robot is a rail-running robot; distance sensors can then be mounted at the front and rear of the robot to avoid hitting obstacles that may appear near the track when moving forward or backward. Further, when the distance measured by any distance sensor carried by the robot is not greater than the preset distance threshold, the traveling scheme can be adjusted, or the travel of the robot can be suspended.
In one application scenario, the robot is a flying robot. When the distance measured by a distance sensor in any direction (front, back, left, right, above, or below) is not greater than the preset distance threshold, the measured distance is transmitted to the active safety control board of the robot, and the active safety control board controls the robot to travel in the direction opposite to the obstacle (the direction of the triggering sensor can be taken as the direction of the obstacle) until the distance measured by that sensor exceeds the preset distance threshold by a certain proportion (for example 20%), after which the traveling scheme is adjusted based on the obstacle position and travel continues. Note that while the robot is being controlled to travel away from the obstacle, the values measured by the other sensors must also remain within their normal ranges. Further, the active safety control board can also be connected to an accelerometer, a barometer, an optical-flow sensor and/or one or more ultrasonic sensors, so that it can not only measure obstacles in the environment but also obtain the robot's own flight state. The active safety control board can thus be combined with flight control to achieve intelligent obstacle avoidance. When the active safety control board detects disturbances in the data measured by several types of sensors, it must control the robot to perform an emergency braking: specifically, the robot executes a descent command, and during the descent, when the data measured by the distance sensor mounted below the robot is less than one meter, the flight motors of the robot are shut down to land. If, while executing the descent command, the data measured by the distance sensor below the robot is still found to be disordered, the flight motors are shut down immediately and the flying robot is allowed to fall freely, protected by a passive safety guard mounted on the outside of the flying robot. The passive safety guard is mainly used for passive protection, preventing injury to pedestrians or passing vehicles when the flying robot loses control due to unexpected factors. Specifically, the passive safety guard is designed with the SOLIDWORKS design tool and analyzed by finite-element analysis with the ANSYS analysis tool, so that the guard is optimized to minimize its mass (i.e. weight) while guaranteeing structural strength.
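A compact sketch of the avoidance and emergency-braking decision logic described above, under stated assumptions: the 20% clearance margin and the one-meter landing cutoff follow the text, while the threshold value, the data structures, and the notion of a "disturbed" reading are illustrative.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    direction: str      # e.g. "front", "below"
    distance_m: float
    disturbed: bool     # assumed flag: reading is implausible or jumping

def avoidance_action(readings, threshold_m=2.0, margin=0.2):
    """Decide what the active safety control board should do this cycle."""
    if sum(r.disturbed for r in readings) >= 2:
        return ("emergency_brake", None)          # multiple disturbed sensors: descend
    for r in readings:
        if r.distance_m <= threshold_m:
            # retreat away from the obstacle until distance > threshold * (1 + margin)
            return ("retreat", r.direction)
    return ("continue", None)

def descend_step(below_reading):
    """During emergency braking: cut the motors below one meter or if data stays disordered."""
    if below_reading.disturbed or below_reading.distance_m < 1.0:
        return "shut_down_motors"
    return "keep_descending"
```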
The robot can also provide a remote monitoring function: the data collected by each sensor of the robot is sent to the server, together with the real-time track images collected by the robot; the server fuses and organizes the sensor data and sends the fused data to the pre-bound mobile terminal for display. When abnormal phenomena such as disorder appear in the fused data, the mobile terminal can issue a reminder through vibration, text and/or a ringtone to prompt the maintenance personnel to pay attention to the abnormality; when the robot performs an emergency braking, the mobile terminal can likewise issue a reminder through vibration, text and/or a ringtone to prompt the maintenance personnel to go to the place of the emergency braking to recover and service the robot. Of course, the maintenance personnel can also actively review the fused data received on the mobile terminal to judge whether the robot is flying smoothly; when the maintenance personnel consider that there is a problem with the current travel of the robot, they can send a pause instruction to the robot through the client or the server, and after receiving the pause instruction sent by the client or the server, the track robot can suspend its flight based on the pause instruction. The process of suspending the flight of the robot can follow the emergency-braking process described above.
In another application scenario, the robot is a rail-running robot. When the distance measured by the front distance sensor is not greater than the preset distance threshold (here, "front" is relative to the traveling direction of the rail-running robot), the measured distance is transmitted to the active safety control board of the robot, and the active safety control board suspends the travel of the robot. Moreover, since the obstacle may be an animal, the robot can also be controlled to emit an alarm sound to drive away animals near the robot. Further, the robot can be equipped with a manipulator and a collection basket; combining these with the real-time track images it collects, the robot can further determine whether the obstacle falls within the range the manipulator can grasp, and if so, grasp the obstacle into the collection basket with the manipulator to clear it. Optionally, once the distance measured by the distance sensor exceeds the preset distance threshold, the obstacle can be considered gone and the robot can be controlled to continue traveling. Further, the active safety control board can also be connected to an accelerometer, a barometer, an optical-flow sensor and/or one or more ultrasonic sensors, so that it can not only measure obstacles but also obtain the detailed traveling state of the robot. Intelligent obstacle avoidance can thus be achieved through the active safety control board.
The robot can also provide a remote monitoring function: the data collected by each sensor of the robot is sent to the server, together with the real-time track images collected by the robot; the server fuses and organizes the sensor data and sends the fused data to the pre-bound mobile terminal for display. When abnormal phenomena such as disorder appear in the fused data, the mobile terminal can issue a reminder through vibration, text and/or a ringtone to prompt the maintenance personnel to pay attention to the abnormality. Since the fused data includes the real-time track images, the maintenance personnel can also independently review the current real-time images to judge whether the robot is traveling stably; when the maintenance personnel consider that there is a problem with the current travel of the robot, they can send a pause instruction to the robot through the client or the server, and after receiving the pause instruction sent by the client or the server, the track robot can suspend its travel based on the pause instruction, so that the maintenance personnel can reach the robot's position as soon as possible to recover it.
Optionally, when the robot is a flying robot, Fig. 3 shows a specific implementation flow of step 102, detailed as follows:
In step 301, a real-time image is obtained based on the live video stream;
In this embodiment of the present invention, the most recently acquired frame image can be regarded as the real-time image.
In step 302, a coordinate system is initialized;
In this embodiment of the present invention, the x-axis of the coordinate system is the extension direction of the track, the y-axis of the coordinate system is the laying direction of the sleepers, and the z-axis of the coordinate system is the direction perpendicular to the track plane. Fig. 4(a) shows a schematic diagram of such a coordinate system: as can be seen from Fig. 4(a), the x coordinate represents the distance the robot has traveled along the track, the y coordinate represents the distance of the robot from the track center line, and the z coordinate represents the flying height of the robot relative to the track plane.
In step 303, the robot receives a radio-frequency identification tag carrying location information sent by a radio-frequency device, and performs x-axis positioning according to the radio-frequency identification tag;
In this embodiment of the present invention, the radio-frequency devices are installed in advance along the route of the track; for example, a radio-frequency device can be installed every preset number of kilometers (for example, every three kilometers). When the robot passes the location of such a device, it can determine its own x coordinate, i.e. the distance it has traveled along the track, from the radio-frequency identification tag sent by the device. Alternatively, the sleepers in the track can be identified from the real-time images collected by the camera mounted on the underside of the robot, and the number of sleepers the robot has passed during travel can be counted; since the sleeper spacing is fixed, the distance traveled along the track can be obtained by multiplying the counted number of sleepers by the sleeper spacing. Optionally, the two methods can be combined: the traveled distance is first calculated from the sleeper count and then corrected with the traveled distance given by the radio-frequency identification tag carrying location information, so that the x-axis positioning result is more accurate.
In step 304, the robot performs y-axis positioning according to the relative position of the center line of the real-time image and the center line of the track in the real-time image;
In this embodiment of the present invention, the real-time image in this step is acquired by the camera mounted on the underside of the robot. Theoretically, when the robot moves along the extension direction of the track and is located directly above the track, the center line of the track coincides with the center line of the image, as shown in Fig. 4(b). Therefore, the distance from the center line of the real-time image to the center line of the track in the real-time image can be calculated, and multiplying this distance by an appropriate scale factor gives the distance of the robot from the track center line.
In step 305, the robot acquires its flying height through a carried distance sensor and performs z-axis positioning according to the flying height;
In this embodiment of the present invention, the robot acquires its distance to the track plane through the distance sensor mounted on its underside; this distance is the flying height, and z-axis positioning is performed according to the flying height.
In step 306, the coordinates of the robot in the coordinate system are determined according to the result of the x-axis positioning, the result of the y-axis positioning, and the result of the z-axis positioning.
In this embodiment of the present invention, the mileage of the robot along the track, its distance from the track center line, and its flying height, i.e. the coordinates of the robot in the coordinate system, are known from the results of the x-axis, y-axis, and z-axis positioning.
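The three positioning steps can be summarized as below, assuming a sleeper spacing of 0.6 m, an image scale factor in meters per pixel, and a simple weighted correction of the sleeper-count distance by the RFID distance; these values and the combination rule are illustrative, not specified by the patent.

```python
def x_position(sleeper_count, rfid_distance_m=None,
               sleeper_spacing_m=0.6, rfid_weight=0.7):
    """Distance traveled along the track (x): the sleeper count gives an estimate;
    an RFID tag with a known location corrects it when available."""
    estimate = sleeper_count * sleeper_spacing_m
    if rfid_distance_m is None:
        return estimate
    return rfid_weight * rfid_distance_m + (1 - rfid_weight) * estimate

def y_position(image_center_px, track_center_px, meters_per_pixel):
    """Offset from the track center line (y): pixel offset of the track
    center line from the image center line, converted by a scale factor."""
    return (track_center_px - image_center_px) * meters_per_pixel

def robot_coordinates(sleeper_count, rfid_distance_m,
                      image_center_px, track_center_px,
                      meters_per_pixel, height_sensor_m):
    """Assemble (x, y, z): x from sleepers/RFID, y from the image, z from the rangefinder."""
    x = x_position(sleeper_count, rfid_distance_m)
    y = y_position(image_center_px, track_center_px, meters_per_pixel)
    z = height_sensor_m
    return (x, y, z)
```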
Optionally, when the robot is a flying robot, Fig. 5 shows a specific implementation flow of step 103, detailed as follows:
In step 501, a real-time image is obtained based on the live video stream;
In this embodiment of the present invention, the most recently acquired frame image can be regarded as the real-time image.
In step 502, binarization is performed on the real-time image;
In this embodiment of the present invention, a binarization threshold is set first; specifically, the bimodal method, the maximum between-class variance method (Otsu's method), the maximum entropy thresholding method or an iterative method can be used to set the binarization threshold, and the real-time image is then binarized with the set threshold.
In step 503, the region of interest of the binarized real-time image is extracted;
In this embodiment of the present invention, the region of interest of the binarized real-time image can be extracted with preset operators and functions.
In step 504, longitudinal rectangular regions in the region of interest are extracted based on a morphological algorithm;
In this embodiment of the present invention, the longitudinal rectangular regions contain the images of the two rails of the track;
In step 505, the positions of the two rails of the track are determined in the longitudinal rectangular regions by line detection;
In this embodiment of the present invention, the Hough transform can be used for line detection, or the LSD (Line Segment Detector) can be used, which is not limited here.
In step 506, the positions of the two rails are fitted to obtain the center line of the two rails;
In step 507, the traveling scheme of the robot is determined based on the center line of the two rails and the position of the robot.
In this embodiment of the present invention, after the center line of the two rails is obtained, the robot can, starting from its own position, first move to a point on the center line of the two rails and then take the center line of the two rails as its traveling scheme; alternatively, a route parallel to the center line of the two rails can be determined directly from the position of the robot and used as the traveling scheme, which is not limited here.
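A sketch of steps 502 to 507 under assumptions: Otsu binarization, Hough line detection with illustrative parameter values, and a center line obtained by averaging the two outermost near-vertical line positions; none of these parameter choices come from the patent.

```python
import cv2
import numpy as np

def detect_rail_centerline(frame):
    """Binarize, detect near-vertical lines, and return the mean x of the two
    rails as a crude center line, or None if fewer than two rails are found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    edges = cv2.Canny(binary, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=120, maxLineGap=10)
    if lines is None:
        return None
    # keep roughly vertical segments (rails run along the image's long axis)
    rail_x = []
    for x1, y1, x2, y2 in lines[:, 0]:
        if abs(x2 - x1) < 0.2 * abs(y2 - y1) + 1:
            rail_x.append((x1 + x2) / 2.0)
    if len(rail_x) < 2:
        return None
    left, right = min(rail_x), max(rail_x)
    return (left + right) / 2.0   # x position of the track center line in pixels
```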
Optionally, when the robot is a rail-running robot, it travels automatically on the rails, so the y-axis and z-axis values are fixed. The sleepers can therefore be identified and counted directly from the real-time images collected during travel; the number of sleepers the robot has passed is multiplied by the sleeper spacing to obtain the distance traveled along the track. Optionally, the traveled distance can first be calculated from the sleeper count and then corrected with the traveled distance given by the radio-frequency identification tag carrying location information, so that the x-axis positioning result is more accurate.
Optionally, when the robot is a rail-running robot, the track detection method further includes:
during the travel of the robot, tapping a bolt of the track with the manipulator;
collecting the audio produced when the robot taps the bolt of the track;
performing audio analysis on the audio, and judging whether the bolt has loosened based on the result of the audio analysis;
if the judgement result is that the bolt has loosened, sending the judgement result to the mobile terminal and/or the server, so as to notify the maintenance personnel to service the bolt.
The rail-running robot is also equipped with a manipulator. During travel, when the robot finds a bolt installed on the track through analysis of the real-time images, it can pause, locate the bolt, control the manipulator to tap the bolt based on the determined bolt position, and turn on its recording function to collect the audio produced when the bolt is tapped. When a bolt is loose, the timbre of the tap changes, i.e. the higher-harmonic components of the collected audio and the amplitude ratios of those components differ from the normal case. By comparing the collected audio with a preset audio waveform, it can be judged whether the bolt has loosened: if the similarity between the collected audio and the preset waveform is high, the judgement result is that the bolt is not loose and the robot can continue traveling; if the similarity is low, i.e. below a preset similarity threshold, the judgement result is that the bolt has loosened, and the judgement result can be sent to the mobile terminal and/or the server so as to notify the maintenance personnel to service the bolt.
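One way to realize the waveform comparison described above is to compare magnitude spectra with a cosine similarity, as sketched below; the spectral comparison and the 0.8 similarity threshold are assumptions for illustration only.

```python
import numpy as np

def spectrum(audio, n_fft=4096):
    """Normalized magnitude spectrum of a mono audio signal."""
    mag = np.abs(np.fft.rfft(audio, n=n_fft))
    norm = np.linalg.norm(mag)
    return mag / norm if norm > 0 else mag

def bolt_is_loose(tap_audio, reference_audio, similarity_threshold=0.8):
    """Compare the tap recording against a reference recording of a tight bolt.

    A loose bolt shifts the harmonic content, lowering the cosine similarity
    between the two spectra below the (assumed) threshold.
    """
    similarity = float(np.dot(spectrum(tap_audio), spectrum(reference_audio)))
    return similarity < similarity_threshold
```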
Therefore, through this embodiment of the present invention, the robot travels intelligently along the track and performs intelligent detection according to the real-time images of the track it collects, quickly finds track regions suspected of faults, and reports the faults to the server, so that the server can perform a secondary analysis and promptly notify the maintenance personnel based on the result of the secondary analysis. In this process, manual track inspection is no longer needed; the daily inspection of the track is handed over to the robot. Because the robot quickly identifies fault regions during the inspection, the probability of false detections and missed detections is reduced, automatic detection of the infrastructure and infrastructure equipment of rail transit is achieved, and the safe operation of the rail transit system is ensured.
It should be understood that the sequence numbers of the steps in the above embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic and does not constitute any limitation on the implementation of the embodiments of the present invention.
Embodiment two
Fig. 6 shows a structural block diagram of the robot provided by Embodiment 2 of the present invention; for ease of description, only the parts relevant to this embodiment are shown. The robot 6 includes: a video acquisition module 61, a visual positioning module 62, a scheme determination module 63, a traveling control module 64, a fault detection module 65, and an information uploading module 66.
The video acquisition module 61 is configured to acquire a live video stream of the track;
the visual positioning module 62 is configured to determine the position of the robot by visual positioning based on the live video stream;
the scheme determination module 63 is configured to determine the traveling scheme of the robot according to the position of the robot;
the traveling control module 64 is configured to control the robot to travel according to the traveling scheme;
the fault detection module 65 is configured to detect whether a target image exists in the frame images of the live video stream, where the target image is an image showing a fault region of the track;
the information uploading module 66 is configured to, when a target image exists in the frame images of the live video stream, upload the target image to the server, so that the server performs a secondary analysis on the target image and, based on the result of the secondary analysis, pushes the target image to the corresponding client for display.
Optionally, the fault detection module 65 includes:
a first binarization unit, configured to perform binarization on each frame image of the live video stream;
a first extraction unit, configured to extract the region of interest of each binarized frame image;
a denoising unit, configured to perform a denoising operation on the region of interest of each frame image based on a morphological algorithm;
a difference calculation unit, configured to, for each denoised frame image, calculate the image-feature difference between the region of interest of the image and the region of interest of an adjacent frame image, where the adjacent frame image is the image adjacent to the image in the live video stream;
a target image determination unit, configured to, if the image-feature difference between the region of interest of the image and the region of interest of the adjacent frame image exceeds a preset image-feature difference range, determine the image and/or the adjacent frame image as a target image.
Optionally, the robot 6 further includes:
an obstacle detection module, configured to, during the travel of the robot, obtain the distance between the robot and an obstacle in the robot's environment based on a distance sensor carried by the robot;
a scheme adjustment module, configured to, if the distance is not greater than a preset distance threshold, adjust the traveling scheme, or suspend the travel of the robot.
Optionally, the robot 6 further includes:
an instruction receiving module, configured to, during the travel of the robot, if the track robot receives a pause instruction sent by a mobile terminal, suspend the travel of the robot based on the pause instruction.
Optionally, the video acquisition module 61 includes:
a multi-channel ultra-high-definition image acquisition unit, configured to capture multi-channel ultra-high-definition images of the track in real time;
an infrared image acquisition unit, configured to capture infrared images of the track in real time;
an image quality detection unit, configured to detect whether the multi-channel ultra-high-definition images and the infrared images meet a preset image quality condition;
an image culling unit, configured to, when there is a multi-channel ultra-high-definition image and/or an infrared image that does not meet the image quality condition, remove the multi-channel ultra-high-definition image and/or infrared image that does not meet the image quality condition.
Optionally, the information uploading module 66 includes:
an acquisition time obtaining unit, configured to obtain the acquisition time of the target image;
an acquisition location obtaining unit, configured to obtain the acquisition location of the target image;
an image information uploading unit, configured to upload the target image, the acquisition time of the target image, and the acquisition location of the target image to the server.
Optionally, when the robot is a flying robot, the visual positioning module 62 includes:
a real-time image obtaining unit, configured to obtain a real-time image based on the live video stream;
an initialization unit, configured to initialize a coordinate system, where the x-axis of the coordinate system is the extension direction of the track, the y-axis of the coordinate system is the laying direction of the sleepers, and the z-axis of the coordinate system is the direction perpendicular to the track plane;
an x-axis positioning unit, configured to receive a radio-frequency identification tag carrying location information sent by a radio-frequency device and perform x-axis positioning according to the radio-frequency identification tag, where the radio-frequency devices are installed in advance along the route of the track;
a y-axis positioning unit, configured to perform y-axis positioning according to the relative position of the center line of the real-time image and the center line of the track in the real-time image;
a z-axis positioning unit, configured to acquire the flying height of the robot through a carried distance sensor and perform z-axis positioning according to the flying height;
a coordinate determination unit, configured to determine the coordinates of the robot in the coordinate system according to the result of the x-axis positioning, the result of the y-axis positioning, and the result of the z-axis positioning.
Optionally, when the robot is a flying robot, the scheme determination module 63 includes:
a real-time image obtaining unit, configured to obtain a real-time image based on the live video stream;
a second binarization unit, configured to perform binarization on the real-time image;
a second extraction unit, configured to extract the region of interest of the binarized real-time image;
a third extraction unit, configured to extract longitudinal rectangular regions in the region of interest based on a morphological algorithm;
a line detection unit, configured to determine the positions of the two rails of the track in the longitudinal rectangular regions by line detection;
a fitting unit, configured to fit the positions of the two rails to obtain the center line of the two rails;
a traveling scheme determination unit, configured to determine the traveling scheme of the robot based on the center line of the two rails and the position of the robot.
Optionally, when the robot is a rail-running robot, the robot is equipped with a manipulator, and the robot further includes:
a tapping module, configured to tap a bolt of the track with the manipulator during travel;
an audio collection module, configured to collect the audio produced when the robot taps the bolt of the track;
an audio analysis module, configured to perform audio analysis on the audio and judge whether the bolt has loosened based on the result of the audio analysis;
a bolt looseness determination module, configured to, when the judgement result is that the bolt has loosened, send the judgement result to the mobile terminal and/or the server, so as to notify the maintenance personnel to service the bolt.
Therefore through the embodiment of the present invention, robot can be realized according to the live video stream of collected track Intelligence based on track is advanced and intelligent measurement operation, is quickly found out the doubtful orbital region to break down, and will be in the failure It reports to server so that server can carry out secondary analysis, and be matched above-mentioned target image based on the result of secondary analysis It is shown to corresponding client, to notify service personnel in time.In the process, it is no longer necessary to inspection manually is carried out to track, and It is to give robot to execute the Daily Round Check operation of track, by robot, quickly there are failures for identification during inspection False retrieval is reduced, the possibility that missing inspection occurs in region, realizes that the automation to the infrastructure and infrastructure device of rail traffic is examined It surveys, ensures Rail Transit System safe operation.
It is apparent to those skilled in the art that for convenience of description and succinctly, only with above-mentioned each work( Can unit, module division progress for example, in practical application, can be as needed and by above-mentioned function distribution by different Functional unit, module complete, i.e., the internal structure of above-mentioned robot is divided into different functional units or module, with complete with The all or part of function of upper description.Each functional unit, module in embodiment can be integrated in a processing unit, Can be that each unit, module physically exist alone, can also two or more modules, unit be integrated in one unit In, the form that hardware had both may be used in above-mentioned integrated module, unit is realized, software function module, unit can also be used Form is realized.In addition, the specific name of each functional unit, module is also only to facilitate mutually differentiation, is not limited to this The protection domain of application.The specific work process of unit in above system, module can refer to pair in preceding method embodiment Process is answered, details are not described herein.
In the above-described embodiments, the description of each embodiment has its own emphasis; for parts that are not described or recorded in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the units and algorithm steps described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and design constraints of the technical solution. Skilled professionals may use different methods to implement the described functions for each specific application, but such implementations should not be considered to go beyond the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed robot and track detection method may be implemented in other ways. For example, the robot described above is merely schematic: the division of the above-mentioned modules or units is only a division by logical function, and there may be other ways of division in actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
The embodiments described above are intended merely to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions recorded in the foregoing embodiments, or make equivalent replacements of some of their technical features; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and should all be included within the protection scope of the present invention.

Claims (10)

1. A track detection method, characterized in that the track detection method comprises:
collecting, by a robot, a live video stream of a track;
determining the position of the robot by visual positioning based on the live video stream;
determining a traveling scheme of the robot according to the position of the robot;
controlling the robot to travel according to the traveling scheme;
detecting, in each frame image of the live video stream, whether a target image exists, wherein the target image is an image showing a fault region of the track;
if the target image exists among the frame images of the live video stream, uploading the target image to a server, so that the server performs a secondary analysis on the target image and, based on the result of the secondary analysis, matches the target image to a corresponding client for display.
2. The track detection method according to claim 1, characterized in that detecting, in each frame image of the live video stream, whether a target image exists comprises:
performing binarization processing on each frame image of the live video stream;
extracting a region of interest of each frame image after the binarization processing;
performing a denoising operation on the region of interest of each frame image based on a morphology algorithm;
for each frame image after denoising, calculating the image feature difference between the region of interest of the image and the region of interest of an adjacent frame image, wherein the adjacent frame image is an image adjacent to the image in the live video stream;
if the image feature difference between the region of interest of the image and the region of interest of the adjacent frame image exceeds a preset image feature difference range, determining the image and/or the adjacent frame image as a target image (an illustrative sketch of this detection step is given after the claims).
3. The track detection method according to claim 1, characterized in that the track detection method further comprises:
during the traveling of the robot, obtaining, based on a range sensor carried by the robot, the distance between the robot and an obstacle in the environment where the robot is located;
if the distance is not greater than a preset distance threshold, adjusting the traveling scheme, or pausing the traveling of the robot.
4. The track detection method according to claim 1, characterized in that the track detection method further comprises:
during the traveling of the robot, if the robot receives a pause instruction sent by a mobile terminal, pausing the traveling of the robot based on the pause instruction.
5. The track detection method according to claim 1, characterized in that the robot collecting a live video stream of the track comprises:
the robot capturing, in real time, multi-channel ultra-high-definition images of the track and infrared images of the track;
detecting whether the multi-channel ultra-high-definition images and the infrared images reach a preset picture quality condition;
if there are multi-channel ultra-high-definition images and/or infrared images that do not reach the picture quality condition, rejecting the multi-channel ultra-high-definition images and/or infrared images that do not reach the picture quality condition.
6. The track detection method according to claim 1, characterized in that uploading the target image to the server comprises:
obtaining the collection time and collection location of the target image;
uploading the target image, the collection time of the target image and the collection location of the target image to the server.
7. The track detection method according to any one of claims 1 to 6, characterized in that the robot is a flying robot, and determining the position of the robot by visual positioning based on the live video stream comprises:
obtaining a real-time image based on the live video stream;
initializing a coordinate system, wherein the x-axis of the coordinate system is the extension direction of the track, the y-axis of the coordinate system is the laying direction of the sleepers, and the z-axis of the coordinate system is perpendicular to the track laying plane;
the robot receiving a radio-frequency identification tag carrying position information sent by a radio-frequency device, and performing x-axis positioning according to the radio-frequency identification tag, wherein the radio-frequency device is pre-installed along the route of the track;
the robot performing y-axis positioning according to the relative position between the center line of the real-time image and the center line of the track in the real-time image;
the robot collecting the flight height of the robot through a carried range sensor, and performing z-axis positioning according to the flight height;
determining the coordinates of the robot in the coordinate system according to the result of the x-axis positioning, the result of the y-axis positioning and the result of the z-axis positioning (an illustrative sketch of this three-axis positioning is given after the claims).
8. The track detection method according to any one of claims 1 to 6, characterized in that the robot is a flying robot, and determining the traveling scheme of the robot according to the position of the robot comprises:
obtaining a real-time image based on the live video stream;
performing binarization processing on the real-time image;
extracting a region of interest of the real-time image after the binarization processing;
extracting a longitudinal rectangular region in the region of interest based on a morphology algorithm;
determining, by straight-line detection, the positions of the two rails of the track within the longitudinal rectangular region;
fitting the positions of the two rails to obtain the center line of the two rails;
determining the traveling scheme of the robot based on the center line of the two rails and the position of the robot.
9. The track detection method according to any one of claims 1 to 6, characterized in that the robot is a rail-running robot, the robot is equipped with a manipulator, and the track detection method further comprises:
during the traveling of the robot, tapping a bolt of the track with the manipulator;
collecting the audio produced when the robot taps the bolt of the track;
performing audio analysis on the audio, and judging, based on the result of the audio analysis, whether the bolt has loosened;
if the judgement result is that the bolt has loosened, sending the judgement result to a mobile terminal and/or the server, so as to notify service personnel to overhaul the bolt.
10. A robot, characterized in that the robot is applied in the field of rail transit, and the robot comprises:
a video collection module, used for the robot to collect a live video stream of a track;
a visual positioning module, configured to determine the position of the robot by visual positioning based on the live video stream;
a scheme determination module, configured to determine a traveling scheme of the robot according to the position of the robot;
a traveling control module, configured to control the robot to travel according to the traveling scheme;
a fault detection module, configured to detect, in each frame image of the live video stream, whether a target image exists, wherein the target image is an image showing a fault region of the track;
an information uploading module, configured to, when the target image exists among the frame images of the live video stream, upload the target image to a server, so that the server performs a secondary analysis on the target image and, based on the result of the secondary analysis, matches the target image to a corresponding client for display.
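The detection step of claim 2 and the three-axis positioning of claim 7 can be made concrete with two short sketches. Neither sketch is taken from the patent: the choice of image feature (a normalized histogram of the region of interest), the cropping rule, the thresholds, the calibration constant and all function and parameter names are assumptions introduced only for illustration.

    import cv2
    import numpy as np

    def find_target_frames(frames, diff_threshold=0.15):
        """Sketch of claim 2: binarize each frame, denoise its region of
        interest morphologically, and flag frames whose ROI feature differs
        too much from the adjacent (previous) frame."""
        kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
        targets, prev_hist = [], None
        for idx, frame in enumerate(frames):
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            _, binary = cv2.threshold(gray, 0, 255,
                                      cv2.THRESH_BINARY + cv2.THRESH_OTSU)
            # Assumed region of interest: the lower half of the image,
            # where the track lies
            roi = binary[binary.shape[0] // 2:, :]
            # Morphological opening as the denoising operation
            roi = cv2.morphologyEx(roi, cv2.MORPH_OPEN, kernel)
            # Assumed image feature: normalized intensity histogram of the ROI
            hist = cv2.normalize(cv2.calcHist([roi], [0], None, [16], [0, 256]),
                                 None).flatten()
            if prev_hist is not None:
                diff = cv2.compareHist(prev_hist, hist,
                                       cv2.HISTCMP_BHATTACHARYYA)
                if diff > diff_threshold:       # feature difference out of range
                    targets.append(idx)         # candidate fault-region image
            prev_hist = hist
        return targets

A similarly hedged sketch of the positioning in claim 7, assuming the track center line has already been detected in the real-time image and that the pixel-to-metre scale of the camera at the current flight height is known from calibration:

    def locate_flying_robot(rfid_tag_position_x, image_width_px,
                            track_centerline_x_px, flight_height_m,
                            metres_per_pixel):
        """Sketch of claim 7: x from the last RFID tag passed, y from the
        offset between the image center line and the detected track center
        line, z from the onboard range sensor."""
        # x-axis: along the track, read from the position stored in the RFID tag
        x = rfid_tag_position_x
        # y-axis: lateral offset of the robot relative to the track center line
        y = (image_width_px / 2.0 - track_centerline_x_px) * metres_per_pixel
        # z-axis: height above the track laying plane from the range sensor
        z = flight_height_m
        return x, y, z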
CN201810126117.2A 2018-02-08 2018-02-08 Track detection method and robot Expired - Fee Related CN108491758B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810126117.2A CN108491758B (en) 2018-02-08 2018-02-08 Track detection method and robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810126117.2A CN108491758B (en) 2018-02-08 2018-02-08 Track detection method and robot

Publications (2)

Publication Number Publication Date
CN108491758A true CN108491758A (en) 2018-09-04
CN108491758B CN108491758B (en) 2020-11-20

Family

ID=63339994

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810126117.2A Expired - Fee Related CN108491758B (en) 2018-02-08 2018-02-08 Track detection method and robot

Country Status (1)

Country Link
CN (1) CN108491758B (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101519981A (en) * 2009-03-19 2009-09-02 重庆大学 Mine locomotive anti-collision early warning system based on monocular vision and early warning method thereof
CN202178515U (en) * 2011-07-30 2012-03-28 山东鲁能智能技术有限公司 Transformer station intelligent robot inspection system
CN102490764A (en) * 2011-12-13 2012-06-13 天津卓朗科技发展有限公司 Automatic detection method of track turnout notch
CN104331910A (en) * 2014-11-24 2015-02-04 沈阳建筑大学 Track obstacle detection system based on machine vision
CN104796664A (en) * 2015-03-26 2015-07-22 成都市斯达鑫辉视讯科技有限公司 Video monitoring device
CN105700532A (en) * 2016-04-19 2016-06-22 长沙理工大学 Vision-based transformer substation inspection robot navigation positioning control method
CN106341661A (en) * 2016-09-13 2017-01-18 深圳市大道智创科技有限公司 Patrol robot
CN106428558A (en) * 2016-11-28 2017-02-22 北京交通大学 Rail comprehensive inspection method based on air-rail double-purpose unmanned aerial vehicle
CN106444588A (en) * 2016-11-30 2017-02-22 国家电网公司 Inspection system and inspection method of valve hall robot based on video monitoring linkage system
CN107071344A (en) * 2017-01-22 2017-08-18 深圳英飞拓科技股份有限公司 A kind of large-scale distributed monitor video data processing method and device
CN206653397U (en) * 2017-03-29 2017-11-21 张胜雷 The towed electricity of hanger rail leads to integrated piping lane environmental monitoring intelligent miniature robot
CN107084754A (en) * 2017-04-27 2017-08-22 深圳万发创新进出口贸易有限公司 A kind of transformer fault detection device
CN107433952A (en) * 2017-05-12 2017-12-05 北京瑞途科技有限公司 A kind of intelligent inspection robot

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LIU FEI: "Research on a Substation Fixed-Track Autonomous Inspection Robot ***", China Master's Theses Full-Text Database (Electronic Journal), Engineering Science and Technology II *

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109580137A (en) * 2018-11-29 2019-04-05 东南大学 A kind of bridge structure displacement influence line measurement method based on computer vision technique
CN111380537A (en) * 2018-12-27 2020-07-07 奥迪股份公司 Method and device for positioning target position in navigation map
CN111668925A (en) * 2019-03-05 2020-09-15 特变电工智能电气有限责任公司 Transformer inspection tour inspection device based on intelligent vision
CN112444519B (en) * 2019-08-30 2022-07-15 比亚迪股份有限公司 Vehicle fault detection device and method
CN112444519A (en) * 2019-08-30 2021-03-05 比亚迪股份有限公司 Vehicle fault detection device and method
CN111024431A (en) * 2019-12-26 2020-04-17 江西交通职业技术学院 Bridge rapid detection vehicle based on multi-sensor unmanned driving
CN111024431B (en) * 2019-12-26 2022-03-11 江西交通职业技术学院 Bridge rapid detection vehicle based on multi-sensor unmanned driving
CN111077159A (en) * 2019-12-31 2020-04-28 北京京天威科技发展有限公司 Fault detection method, system, equipment and readable medium for track circuit box
CN112014848A (en) * 2020-02-11 2020-12-01 深圳技术大学 Sleeper positioning method, sleeper positioning device and electronic equipment
CN112014848B (en) * 2020-02-11 2023-06-23 深圳技术大学 Sleeper positioning method, sleeper positioning device and electronic equipment
CN111672045B (en) * 2020-05-21 2021-11-30 国网湖南省电力有限公司 Fire-fighting robot, fire-fighting system and fire-fighting control method
CN111672045A (en) * 2020-05-21 2020-09-18 国网湖南省电力有限公司 Fire-fighting robot, fire-fighting system and fire-fighting control method
CN112508911A (en) * 2020-12-03 2021-03-16 合肥科大智能机器人技术有限公司 Rail joint touch net suspension support component crack detection system based on inspection robot and detection method thereof
CN113111704A (en) * 2021-03-02 2021-07-13 郑州大学 Airport pavement disease and foreign matter detection method and system based on deep learning
CN113085923B (en) * 2021-04-15 2022-01-25 北京智川科技发展有限公司 Track detection method and device, automatic track detection vehicle and storage medium
CN113085923A (en) * 2021-04-15 2021-07-09 北京智川科技发展有限公司 Track detection method and device, automatic track detection vehicle and storage medium
CN114821165A (en) * 2022-04-19 2022-07-29 北京运达华开科技有限公司 Track detection image acquisition and analysis method
CN114973694A (en) * 2022-05-19 2022-08-30 杭州中威电子股份有限公司 Tunnel traffic flow monitoring system and method based on inspection robot
CN114973694B (en) * 2022-05-19 2024-05-24 杭州中威电子股份有限公司 Tunnel traffic flow monitoring system and method based on inspection robot
CN115601719A (en) * 2022-12-13 2023-01-13 中铁十二局集团有限公司(Cn) Climbing robot and method for detecting invasion of foreign objects in subway tunnel
CN115760989A (en) * 2023-01-10 2023-03-07 西安华创马科智能控制***有限公司 Hydraulic support robot track alignment method and device
CN116596731A (en) * 2023-05-25 2023-08-15 北京贝能达信息技术股份有限公司 Rail transit intelligent operation and maintenance big data management method and system

Also Published As

Publication number Publication date
CN108491758B (en) 2020-11-20

Similar Documents

Publication Publication Date Title
CN108491758A (en) A kind of track detection method and robot
WO2022037278A1 (en) Substation inspection robot system based on artificial intelligence
KR101949525B1 (en) Safety management system using unmanned detector
CN108957240A (en) Electric network fault is remotely located method and system
CN110492607A (en) A kind of intelligent substation condition monitoring system based on ubiquitous electric power Internet of Things
CN108494097A (en) Robot inspection system for substation
CN108956640A (en) Vehicle-mounted detection apparatus and detection method suitable for distribution line inspection
CN104951775A (en) Video technology based secure and smart recognition method for railway crossing protection zone
CN102915638A (en) Surveillance video-based intelligent parking lot management system
CN108710827B (en) A kind of micro- police service inspection in community and information automatic analysis system and method
CN103729908A (en) Intelligent inspection device of railway tunnel and application method thereof
CN109208468B (en) Automatic bridge inspection system based on video analysis
CN110874866B (en) Video-based three-dimensional monitoring method and system for transformer substation
CN108008666A One kind building neural network and its method of work
CN106444684A (en) Electrical equipment operating environment data acquisition and telecontrol comprehensive linkage device and use method
CN109905667A (en) A kind of video monitoring system suitable for large span special construction bridge
CN115661964A (en) Intelligent inspection system for comprehensive pipe gallery
CN205827660U (en) A kind of open type parking ground parking management equipment
KR101989376B1 (en) Integrated track circuit total monitoring system
CN113014870B (en) Subway gate passage ticket evasion identification method based on passenger posture rapid estimation
CN113691778A (en) Panoramic station patrol system for urban rail transit station
CN213518003U (en) A patrol and examine robot and system of patrolling and examining for airport pavement
CN109934161A (en) Vehicle identification and detection method and system based on convolutional neural network
CN112532927A (en) Intelligent safety management and control system for construction site
CN107991953A One kind building central system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20201120