CN113657331A - Warning line infrared induction identification method and device, computer equipment and storage medium - Google Patents

Warning line infrared induction identification method and device, computer equipment and storage medium

Info

Publication number
CN113657331A
CN113657331A (application CN202110978216.5A)
Authority
CN
China
Prior art keywords
warning line
region
line
robot
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110978216.5A
Other languages
Chinese (zh)
Inventor
伍志峰
涂志伟
施健
王一科
贾林
涂静一
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Kewei Robot Technology Co ltd
Original Assignee
Shenzhen Kewei Robot Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Kewei Robot Technology Co ltd filed Critical Shenzhen Kewei Robot Technology Co ltd
Priority to CN202110978216.5A
Publication of CN113657331A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0242 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257 - Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/181 - Segmentation; Edge detection involving edge growing; involving edge linking
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10028 - Range image; Depth image; 3D point clouds
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10048 - Infrared image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Electromagnetism (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

An embodiment of the invention discloses a warning line infrared induction identification method and device, computer equipment, and a storage medium. The method comprises the following steps: acquiring an image containing a warning line captured by a depth camera to obtain a binarized image and depth map information; determining the region where the warning line is located from the binarized image according to the design rule of the warning line; determining radar information of the region where the warning line is located according to that region and the depth map information; acquiring the time interval between image acquisition and image processing; determining the distance between the robot and the warning line according to the current moving speed of the robot, the time interval, and the radar information of the region where the warning line is located; and generating a control signal according to the distance between the robot and the warning line and sending the control signal to the robot so that the robot performs the corresponding operation. By implementing the method provided by this embodiment, the robot can accurately identify warning lines on the ground, avoid them more flexibly, and prevent safety accidents.

Description

Warning line infrared induction identification method and device, computer equipment and storage medium
Technical Field
The invention relates to robots, and in particular to a warning line infrared induction identification method and device, computer equipment, and a storage medium.
Background
With the development of AI (Artificial Intelligence), robots are widely used in many areas of daily life: delivery robots realize unmanned delivery, and floor-sweeping robots clean floors without human intervention, so making robots work safely and efficiently has become a topic of general concern.
In daily life, warning lines are laid on the ground around dangerous areas to remind people to watch their step and to serve as a warning. A robot uses laser radar to perceive the environment, plan its path, and intelligently avoid obstacles. However, when the robot's localization is lost, it may move into a fall-hazard area such as a stairwell opening or the edge of a step. Because of the limitations of laser radar, the robot cannot accurately identify warning lines on the ground and cannot perceive fall-hazard areas on its path in advance, which may cause the robot to fall and lead to a safety accident.
Therefore, it is necessary to design a new method that enables the robot to accurately identify warning lines on the ground, so that it can avoid them more flexibly and prevent safety accidents.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a warning line infrared induction identification method and device, computer equipment, and a storage medium.
In order to achieve this purpose, the invention adopts the following technical solution. The warning line infrared induction identification method comprises the following steps:
acquiring an image containing a warning line captured by a depth camera to obtain a binarized image and depth map information;
determining the region where the warning line is located from the binarized image according to the design rule of the warning line;
determining radar information of the region where the warning line is located according to that region and the depth map information;
acquiring the time interval between image acquisition and image processing;
determining the distance between the robot and the warning line according to the current moving speed of the robot, the time interval, and the radar information of the region where the warning line is located;
and generating a control signal according to the distance between the robot and the warning line and sending the control signal to the robot so that the robot performs the corresponding operation.
The further technical scheme is as follows: the design rule of the guard line comprises the following steps:
after the depth camera collects a guard line yellow area, the distance between the outlines of the guard line yellow area in binary image imaging is 25 pixels to 55 pixels;
the number of the outlines of the yellow regions of the warning lines is at least six.
The further technical scheme is as follows: the yellow area of the warning line is made of adhesive materials capable of reflecting infrared rays.
The further technical scheme is as follows: the determining the region where the warning line is located from the binarized image according to the design rule of the warning line includes:
deleting the content of the distribution characteristic rule which does not meet the number and the interval of the outlines of the guard line yellow areas in the binary image to obtain the outlines of the guard line yellow areas;
storing the outline of the warning line yellow area in a container;
and determining a circumscribed rectangle frame of the outline of the yellow region of the warning line to obtain the region where the warning line is located.
The further technical scheme is as follows: the determining of the circumscribed rectangle frame of the outline of the guard line yellow area to obtain the area where the guard line is located includes:
and drawing a circumscribed rectangle frame of the outline of the yellow region of the warning line by using a retangle function of OpenCv to obtain the region where the warning line is located.
The further technical scheme is as follows: the determining the radar information of the region where the warning line is located according to the region where the warning line is located and the depth map information includes:
converting the depth map information into a three-dimensional point cloud;
converting the three-dimensional point cloud into radar data;
and determining the radar information of the region where the warning line is located according to the region where the warning line is located and the radar data.
The invention also provides a warning line infrared induction recognition device, which comprises:
an image acquisition unit, configured to acquire an image containing a warning line captured by the depth camera to obtain a binarized image and depth map information;
a region determining unit, configured to determine the region where the warning line is located from the binarized image according to the design rule of the warning line;
a radar information determining unit, configured to determine radar information of the region where the warning line is located according to that region and the depth map information;
an interval acquisition unit, configured to acquire the time interval between image acquisition and image processing;
a distance determining unit, configured to determine the distance between the robot and the warning line according to the current moving speed of the robot, the time interval, and the radar information of the region where the warning line is located;
and a signal generating unit, configured to generate a control signal according to the distance between the robot and the warning line and send the control signal to the robot so that the robot performs the corresponding operation.
The further technical scheme is as follows: the region determining unit includes:
the deleting subunit is used for deleting the content of the distribution characteristic rule which does not meet the number and the interval of the outlines of the guard line yellow areas in the binary image so as to obtain the outlines of the guard line yellow areas;
a storage subunit, configured to store the outline of the warning line yellow region in a container;
and the circumscribed rectangle frame determining subunit is used for determining the circumscribed rectangle frame of the outline of the guard line yellow area so as to obtain the area where the guard line is located.
The invention also provides a computer device, which comprises a memory and a processor; a computer program is stored in the memory, and the processor implements the above method when executing the computer program.
The invention also provides a storage medium storing a computer program which, when executed by a processor, implements the method described above.
Compared with the prior art, the invention has the following beneficial effects: the invention obtains a binarized image containing the warning line and the corresponding depth map information, determines the region where the warning line is located according to the design rule of the warning line, determines the radar information of that region from the depth map information, determines the distance between the robot and the warning line by combining the current moving speed of the robot, the time interval, and the radar information of the region where the warning line is located, and generates a corresponding control signal, so that the robot can accurately identify warning lines on the ground, avoid them more flexibly, and prevent safety accidents.
The invention is further described below with reference to the accompanying drawings and specific embodiments.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic view of an application scenario of a warning line infrared sensing identification method according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a warning line infrared induction identification method according to an embodiment of the present invention;
fig. 3 is a schematic sub-flow chart of a warning line infrared induction identification method according to an embodiment of the present invention;
fig. 4 is a schematic sub-flow chart of a warning line infrared induction identification method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a warning line provided by an embodiment of the present invention;
fig. 6 is a schematic block diagram of a warning line infrared induction recognition apparatus provided by an embodiment of the present invention;
fig. 7 is a schematic block diagram of an area determination unit of a warning line infrared induction recognition apparatus provided by the embodiment of the present invention;
fig. 8 is a schematic block diagram of a radar information determination unit of the warning line infrared induction recognition apparatus provided by the embodiment of the present invention;
FIG. 9 is a schematic block diagram of a computer device provided by an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Referring to fig. 1 and fig. 2, fig. 1 is a schematic view of an application scenario of the warning line infrared sensing identification method according to an embodiment of the present invention, and fig. 2 is a schematic flow chart of the method. The warning line infrared induction identification method is applied to a server. The server can be mounted on the robot as its controller; alternatively, the server can be arranged independently of the robot and exchange data with the robot and the depth camera.
As shown in fig. 2, the method includes the following steps S110 to S160.
S110, acquiring an image containing a warning line captured by the depth camera to obtain a binarized image and depth map information.
In this embodiment, the binarized image is the result of image binarization, which sets the gray value of every pixel to either 0 or 255 so that the whole image presents an obvious black-and-white effect. Specifically, the initial image collected by the depth camera is not a binarized image; the binarized image is formed after the initial image is binarized.
Depth map information refers to an image with depth information.
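As a minimal illustration of step S110 (an editorial sketch, not part of the original disclosure), the binarization of the camera's infrared frame could look as follows in Python with OpenCV; the threshold value and the assumption that the camera delivers an 8-bit infrared intensity frame as a NumPy array are hypothetical.
```python
import cv2
import numpy as np

def binarize_ir_frame(ir_frame: np.ndarray, thresh: int = 200) -> np.ndarray:
    """Binarize an 8-bit infrared frame from the depth camera.

    Pixels brighter than `thresh` (strong infrared reflections such as the
    warning-line material) become 255; everything else becomes 0.
    """
    if ir_frame.ndim == 3:  # reduce to a single channel if needed
        ir_frame = cv2.cvtColor(ir_frame, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(ir_frame, thresh, 255, cv2.THRESH_BINARY)
    return binary
```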
S120, determining the region where the warning line is located from the binarized image according to the design rule of the warning line.
In this embodiment, the region where the warning line is located refers to the position of the warning line within the binarized image.
Referring to fig. 5, the design rule of the warning line includes:
after the depth camera captures the yellow regions of the warning line, the spacing between the contours of the yellow regions in the binarized image is 25 to 55 pixels;
the number of contours of the yellow regions of the warning line is at least six.
The yellow regions of the warning line are made of a sticker material capable of reflecting infrared rays.
In any environment, the depth camera emits infrared light toward the ground, and the light is reflected back when it strikes the reflective warning line. A depth camera equipped with an optical filter blocks visible light and receives only the infrared light. Because the warning line material reflects infrared light while other objects on the ground do not, the position of the warning line on the ground can be detected from the image information collected by the depth camera, which provides important data support for calculating the distance from the robot to the warning line.
The warning line uses a customized pattern of black dots on a yellow background; the total length is 1 m, the diameter of each black dot is 2.5 cm, and the spacing between adjacent dot centers is 2.5 cm.
When the depth camera starts detection at a distance of 1 m from the warning line, the spacing between the contours of the yellow regions of the warning line in the binarized image is 25 to 55 pixels.
At least 6 yellow-region contours can be detected along the 1 m warning line.
The yellow regions of the warning line are made of an infrared-reflective material with color number C0 M26 Y100 K0. The selected material and color can still be detected quickly under both dark and strongly lit conditions; they reflect light easily, reduce interference, and improve the recognition accuracy.
In an embodiment, referring to fig. 3, the step S120 may include steps S121 to S123.
S121, deleting, from the binarized image, content that does not satisfy the distribution rule for the number and spacing of the contours of the yellow regions of the warning line, so as to obtain the contours of the yellow regions of the warning line.
In this embodiment, the contours of the yellow regions of the warning line means the contour content belonging only to the yellow regions of the warning line.
Specifically, after the depth camera illuminates the infrared-reflective yellow regions of the warning line, objects on the ground that do not conform to the distribution rule for the number and spacing of the yellow-region contours are removed from the binarized image, leaving only the contours of the yellow regions of the warning line.
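A possible realization of this filtering step (an editorial sketch under assumed parameters, not taken from the original disclosure) keeps only contours whose neighbour spacing and count satisfy the design rule of 25 to 55 pixels and at least six contours:
```python
import cv2
import numpy as np

def extract_warning_line_contours(binary: np.ndarray,
                                  min_gap: float = 25.0, max_gap: float = 55.0,
                                  min_count: int = 6):
    """Keep only contours whose spacing matches the warning-line rule.

    Contours are ordered left to right; a contour survives when the distance
    to at least one neighbour lies within [min_gap, max_gap] pixels, and the
    result is accepted only if at least `min_count` contours remain.
    Assumes OpenCV 4.x, where findContours returns (contours, hierarchy).
    """
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    items = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            items.append((c, (m["m10"] / m["m00"], m["m01"] / m["m00"])))
    items.sort(key=lambda item: item[1][0])  # left-to-right by centroid x

    kept = []
    for i, (c, (cx, cy)) in enumerate(items):
        neighbours = [items[j][1] for j in (i - 1, i + 1) if 0 <= j < len(items)]
        if any(min_gap <= np.hypot(cx - nx, cy - ny) <= max_gap
               for nx, ny in neighbours):
            kept.append(c)
    return kept if len(kept) >= min_count else []
```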
S122, storing the contours of the yellow regions of the warning line in a container.
S123, determining a circumscribed rectangular frame of the contours of the yellow regions of the warning line to obtain the region where the warning line is located.
In this embodiment, a rectangle circumscribing the contours of the yellow regions of the warning line is drawn by using the rectangle function of OpenCV, so as to obtain the region where the warning line is located.
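For illustration only (an editorial sketch, not the patent's reference implementation), the circumscribed rectangle of the surviving contours can be obtained by merging their bounding boxes and drawing the result with cv2.rectangle:
```python
import cv2
import numpy as np

def warning_line_region(binary: np.ndarray, contours):
    """Merge the bounding boxes of the warning-line contours into one
    circumscribed rectangle (x, y, w, h) and draw it for visualization."""
    if not contours:
        return None, binary
    boxes = [cv2.boundingRect(c) for c in contours]
    x1 = min(b[0] for b in boxes)
    y1 = min(b[1] for b in boxes)
    x2 = max(b[0] + b[2] for b in boxes)
    y2 = max(b[1] + b[3] for b in boxes)
    vis = cv2.cvtColor(binary, cv2.COLOR_GRAY2BGR)
    cv2.rectangle(vis, (x1, y1), (x2, y2), (0, 0, 255), 2)
    return (x1, y1, x2 - x1, y2 - y1), vis
```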
After the region where the warning line is located is found, the distance moved from the start of recognition to the capture of the warning line region, and the current distance between the robot and the warning line, are calculated from the time interval between the two collected image frames and the moving speed of the robot, so that the robot can brake or drive around the obstacle in time.
S130, determining radar information of the region where the warning line is located according to that region and the depth map information.
In this embodiment, the radar information of the region where the warning line is located refers to the actual position of the warning line, that is, its specific position in the radar data.
In an embodiment, referring to fig. 4, the step S130 may include steps S131 to S133.
S131, converting the depth map information into a three-dimensional point cloud.
In this embodiment, the three-dimensional point cloud is formed by mapping each pixel of the depth map to a corresponding three-dimensional point.
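A standard pinhole back-projection suffices for this conversion; the sketch below is editorial and assumes the camera intrinsics (fx, fy, cx, cy) are known and the depth map is given in metres:
```python
import numpy as np

def depth_to_point_cloud(depth_m: np.ndarray,
                         fx: float, fy: float,
                         cx: float, cy: float) -> np.ndarray:
    """Back-project a depth image (metres) into an N x 3 point cloud.

    Each pixel (u, v) with depth z maps to camera coordinates
    X = (u - cx) * z / fx, Y = (v - cy) * z / fy, Z = z.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    points = np.stack([x, y, depth_m], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth reading
```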
S132, converting the three-dimensional point cloud into radar data.
In this embodiment, each three-dimensional point is converted into one radar measurement, and together these measurements form the complete radar data.
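One way to turn the point cloud into lidar-like radar data (an editorial sketch; the number of beams is an assumption) is to project every point onto the horizontal plane and keep the nearest range in each angular bin:
```python
import numpy as np

def points_to_scan(points: np.ndarray, num_beams: int = 360) -> np.ndarray:
    """Collapse an N x 3 point cloud into a planar, lidar-like range scan.

    Each point is projected onto the X-Z plane; for every angular bin the
    smallest range is kept, mimicking a 2-D laser scan."""
    angles = np.arctan2(points[:, 0], points[:, 2])   # bearing in the plane
    ranges = np.hypot(points[:, 0], points[:, 2])     # planar distance
    bins = ((angles + np.pi) / (2 * np.pi) * num_beams).astype(int) % num_beams
    scan = np.full(num_beams, np.inf)
    np.minimum.at(scan, bins, ranges)                 # nearest return per beam
    return scan
```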
S133, determining radar information of the region where the warning line is located according to that region and the radar data.
The radar data are laid out in an image, and the region where the warning line is located is laid out in a corresponding image; since the points of the two images correspond one to one, the radar data of the region where the warning line is located can be obtained from the radar data and that region.
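Because the binarized image and the depth map share the same pixel grid, the radar information of the warning-line region can be read out directly from the depth values inside the rectangle found in S123; the helper below is an editorial sketch under that assumption:
```python
import numpy as np

def region_range(depth_m: np.ndarray, region: tuple) -> float:
    """Return the closest valid depth inside the warning-line rectangle.

    `region` is the (x, y, w, h) rectangle from the binarized image; the
    same rectangle is cut out of the depth map, and the smallest non-zero
    depth is taken as the range from the camera to the warning line."""
    x, y, w, h = region
    patch = depth_m[y:y + h, x:x + w]
    valid = patch[patch > 0]        # ignore missing depth readings
    return float(valid.min()) if valid.size else float("inf")
```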
S140, acquiring the time interval between image acquisition and image processing.
S150, determining the distance between the robot and the warning line according to the current moving speed of the robot, the time interval, and the radar information of the region where the warning line is located.
In this embodiment, the distance travelled by the robot is obtained as the product of its current moving speed and the time interval; combining this distance with the radar information of the region where the warning line is located gives the current distance between the robot and the warning line.
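In other words, the measured range is corrected by the distance the robot covered while the frame was being processed; a short editorial sketch of this correction:
```python
def distance_to_warning_line(measured_range_m: float,
                             speed_m_s: float,
                             interval_s: float) -> float:
    """d = measured range - v * dt, clamped at zero.

    `interval_s` is the time between image acquisition and the end of
    image processing; during this time the robot keeps moving toward
    the warning line at `speed_m_s`."""
    return max(measured_range_m - speed_m_s * interval_s, 0.0)
```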
S160, generating a control signal according to the distance between the robot and the warning line, and sending the control signal to the robot so that the robot performs the corresponding operation.
The depth camera can acquire a binarized image of the warning line region as well as its depth map information; the depth map is converted into a three-dimensional point cloud, the point cloud is converted into radar data, and the radar information of the warning line region is obtained. At the same time, the distance moved from the start of recognition to the capture of the warning line region, and the distance between the robot and the warning line, are calculated from the time interval between acquiring and processing two image frames combined with the current moving speed of the robot. Based on the processed information, an instruction is sent to the motor drive to decelerate and brake or to change the direction of the robot's wheels so as to bypass the warning line.
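A simple way to map this distance to a motor command (an editorial sketch; the thresholds and command names are assumptions, not values from the disclosure):
```python
def make_control_signal(distance_m: float,
                        stop_dist: float = 0.3,
                        slow_dist: float = 1.0) -> str:
    """Map the distance to the warning line to a coarse motor command.

    Brake when very close, slow down and start a detour when approaching,
    otherwise keep the current course."""
    if distance_m <= stop_dist:
        return "BRAKE"
    if distance_m <= slow_dist:
        return "SLOW_AND_DETOUR"
    return "KEEP_COURSE"
```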
The warning line pattern was selected from tests of various materials in different scenes and, combined with the method of this embodiment, offers good compatibility and practicability. Under the Ubuntu 16.04 ROS system, the binarized images collected by the depth camera are processed based on infrared reflection sensing and the visual images of the depth camera, so warning lines on the ground are identified efficiently and accurately, and the robot can conveniently bypass the dangerous areas near the warning lines, thereby avoiding safety accidents.
According to the warning line infrared induction identification method described above, a binarized image containing the warning line and the corresponding depth map information are obtained, the region where the warning line is located is determined according to the design rule of the warning line, the radar information of that region is determined from the depth map information, the distance between the robot and the warning line is determined from the current moving speed of the robot, the time interval, and the radar information of the region where the warning line is located, and a corresponding control signal is generated, so that the robot can accurately identify warning lines on the ground, avoid them more flexibly, and prevent safety accidents.
Fig. 6 is a schematic block diagram of a warning line infrared sensing identification apparatus 300 according to an embodiment of the present invention. As shown in fig. 6, the present invention also provides a warning line infrared induction recognition apparatus 300 corresponding to the above warning line infrared induction recognition method. The warning line infrared induction recognition apparatus 300 includes a unit for performing the above-described warning line infrared induction recognition method, and the apparatus may be configured in a server. Specifically, referring to fig. 6, the warning line infrared sensing identification apparatus 300 includes an image acquisition unit 301, a region determination unit 302, a radar information determination unit 303, an interval acquisition unit 304, a distance determination unit 305, and a signal generation unit 306.
An image acquisition unit 301, configured to acquire an image with a warning line captured by a depth camera to obtain a binarized image and depth map information; a region determining unit 302, configured to determine a region where a guard line is located from the binarized image according to a design rule of the guard line; a radar information determining unit 303, configured to determine, according to the area where the warning line is located, radar information of the area where the warning line is located in combination with the depth map information; an interval acquisition unit 304, configured to acquire a time interval between image acquisition and image processing; a distance determining unit 305, configured to determine a distance between the robot and the warning line according to the current moving speed of the robot, the time interval, and radar information of the area where the warning line is located; and a signal generating unit 306, configured to generate a control signal according to a distance between the robot and the warning line, and send the control signal to the robot, so that the robot performs a corresponding operation.
In one embodiment, as shown in fig. 7, the region determination unit 302 includes a deletion subunit 3021, a storage subunit 3022, and a circumscribed rectangular frame determination subunit 3023.
A deleting subunit 3021, configured to delete, from the binarized image, content that does not satisfy the distribution rule for the number and spacing of the contours of the yellow regions of the warning line, so as to obtain the contours of the yellow regions of the warning line; a storage subunit 3022, configured to store the contours of the yellow regions of the warning line in a container; a circumscribed rectangular frame determining subunit 3023, configured to determine the circumscribed rectangular frame of the contours of the yellow regions of the warning line so as to obtain the region where the warning line is located.
In an embodiment, the circumscribed rectangular frame determining subunit 3023 is configured to draw the circumscribed rectangular frame of the contours of the yellow regions of the warning line by using the rectangle function of OpenCV, so as to obtain the region where the warning line is located.
In one embodiment, as shown in fig. 8, the radar information determination unit 303 includes a first conversion sub-unit 3031, a second conversion sub-unit 3032, and an information determination sub-unit 3033.
A first converting subunit 3031, configured to convert the depth map information into a three-dimensional point cloud; a second converting subunit 3032, configured to convert the three-dimensional point cloud into radar data; and an information determining subunit 3033, configured to determine, according to the region where the warning line is located and the radar data, radar information of the region where the warning line is located.
It should be noted that, as can be clearly understood by those skilled in the art, the specific implementation process of the warning line infrared sensing identification apparatus 300 and each unit may refer to the corresponding description in the foregoing method embodiment, and for convenience and brevity of description, no further description is provided herein.
The above-described guard line infrared induction recognition apparatus 300 may be implemented in the form of a computer program that can be run on a computer device as shown in fig. 9.
Referring to fig. 9, fig. 9 is a schematic block diagram of a computer device according to an embodiment of the present application. The computer device 500 may be a server, wherein the server may be an independent server or a server cluster composed of a plurality of servers.
Referring to fig. 9, the computer device 500 includes a processor 502, memory, and a network interface 505 connected by a system bus 501, where the memory may include a non-volatile storage medium 503 and an internal memory 504.
The non-volatile storage medium 503 may store an operating system 5031 and a computer program 5032. The computer program 5032 includes program instructions that, when executed, cause the processor 502 to perform the warning line infrared sensing identification method.
The processor 502 is used to provide computing and control capabilities to support the operation of the overall computer device 500.
The internal memory 504 provides an environment for the operation of the computer program 5032 in the non-volatile storage medium 503, and when the computer program 5032 is executed by the processor 502, the processor 502 can be enabled to execute a warning line infrared sensing identification method.
The network interface 505 is used for network communication with other devices. Those skilled in the art will appreciate that the configuration shown in fig. 9 is a block diagram of only a portion of the configuration associated with the present application and does not constitute a limitation of the computer device 500 to which the present application may be applied; a particular computer device 500 may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
Wherein the processor 502 is configured to run the computer program 5032 stored in the memory to implement the following steps:
acquiring an image containing a warning line captured by a depth camera to obtain a binarized image and depth map information; determining the region where the warning line is located from the binarized image according to the design rule of the warning line; determining radar information of the region where the warning line is located according to that region and the depth map information; acquiring the time interval between image acquisition and image processing; determining the distance between the robot and the warning line according to the current moving speed of the robot, the time interval, and the radar information of the region where the warning line is located; and generating a control signal according to the distance between the robot and the warning line and sending the control signal to the robot so that the robot performs the corresponding operation.
The design rule of the warning line comprises: after the depth camera captures the yellow regions of the warning line, the spacing between the contours of the yellow regions in the binarized image is 25 to 55 pixels; the number of contours of the yellow regions of the warning line is at least six.
The yellow regions of the warning line are made of a sticker material capable of reflecting infrared rays.
In an embodiment, when implementing the step of determining the region where the warning line is located from the binarized image according to the design rule of the warning line, the processor 502 specifically implements the following steps:
deleting, from the binarized image, content that does not satisfy the distribution rule for the number and spacing of the contours of the yellow regions of the warning line, so as to obtain the contours of the yellow regions of the warning line; storing the contours of the yellow regions of the warning line in a container; and determining a circumscribed rectangular frame of the contours of the yellow regions of the warning line to obtain the region where the warning line is located.
In an embodiment, when implementing the step of determining the circumscribed rectangular frame of the contours of the yellow regions of the warning line to obtain the region where the warning line is located, the processor 502 specifically implements the following step:
drawing a circumscribed rectangular frame around the contours of the yellow regions of the warning line by using the rectangle function of OpenCV, so as to obtain the region where the warning line is located.
In an embodiment, when implementing the step of determining radar information of the region where the warning line is located according to that region and the depth map information, the processor 502 specifically implements the following steps:
converting the depth map information into a three-dimensional point cloud; converting the three-dimensional point cloud into radar data; and determining the radar information of the region where the warning line is located according to that region and the radar data.
It should be understood that in the embodiment of the present Application, the Processor 502 may be a Central Processing Unit (CPU), and the Processor 502 may also be other general-purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, and the like. Wherein a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
It will be understood by those skilled in the art that all or part of the flow of the method implementing the above embodiments may be implemented by a computer program instructing associated hardware. The computer program includes program instructions, and the computer program may be stored in a storage medium, which is a computer-readable storage medium. The program instructions are executed by at least one processor in the computer system to implement the flow steps of the embodiments of the method described above.
Accordingly, the present invention also provides a storage medium. The storage medium may be a computer-readable storage medium. The storage medium stores a computer program, wherein the computer program, when executed by a processor, causes the processor to perform the steps of:
acquiring an image containing a warning line captured by a depth camera to obtain a binarized image and depth map information; determining the region where the warning line is located from the binarized image according to the design rule of the warning line; determining radar information of the region where the warning line is located according to that region and the depth map information; acquiring the time interval between image acquisition and image processing; determining the distance between the robot and the warning line according to the current moving speed of the robot, the time interval, and the radar information of the region where the warning line is located; and generating a control signal according to the distance between the robot and the warning line and sending the control signal to the robot so that the robot performs the corresponding operation.
The design rule of the warning line comprises: after the depth camera captures the yellow regions of the warning line, the spacing between the contours of the yellow regions in the binarized image is 25 to 55 pixels; the number of contours of the yellow regions of the warning line is at least six.
The yellow regions of the warning line are made of a sticker material capable of reflecting infrared rays.
In an embodiment, when the processor executes the computer program to implement the step of determining the region where the warning line is located from the binarized image according to the design rule of the warning line, the following steps are specifically implemented:
deleting, from the binarized image, content that does not satisfy the distribution rule for the number and spacing of the contours of the yellow regions of the warning line, so as to obtain the contours of the yellow regions of the warning line; storing the contours of the yellow regions of the warning line in a container; and determining a circumscribed rectangular frame of the contours of the yellow regions of the warning line to obtain the region where the warning line is located.
In an embodiment, when the processor executes the computer program to implement the step of determining the circumscribed rectangular frame of the contours of the yellow regions of the warning line to obtain the region where the warning line is located, the following step is specifically implemented:
drawing a circumscribed rectangular frame around the contours of the yellow regions of the warning line by using the rectangle function of OpenCV, so as to obtain the region where the warning line is located.
In an embodiment, when the processor executes the computer program to implement the step of determining radar information of the region where the warning line is located according to that region and the depth map information, the following steps are specifically implemented:
converting the depth map information into a three-dimensional point cloud; converting the three-dimensional point cloud into radar data; and determining the radar information of the region where the warning line is located according to that region and the radar data.
The storage medium may be a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a magnetic disk, or an optical disk, all of which are computer-readable storage media capable of storing the computer program.
Those of ordinary skill in the art will appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of both, and that the components and steps of the examples have been described generally in terms of their functionality in the foregoing description in order to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative. For example, the division of each unit is only one logic function division, and there may be another division manner in actual implementation. For example, various elements or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented.
The steps in the method of the embodiment of the invention can be sequentially adjusted, combined and deleted according to actual needs. The units in the device of the embodiment of the invention can be merged, divided and deleted according to actual needs. In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a storage medium. Based on such understanding, the technical solution of the present invention essentially or partially contributes to the prior art, or all or part of the technical solution can be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a terminal, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention.
While the invention has been described with reference to specific embodiments, the invention is not limited thereto, and various equivalent modifications and substitutions can be easily made by those skilled in the art within the technical scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. The warning line infrared induction identification method is characterized by comprising the following steps:
acquiring an image containing a warning line captured by a depth camera to obtain a binarized image and depth map information;
determining the region where the warning line is located from the binarized image according to the design rule of the warning line;
determining radar information of the region where the warning line is located according to that region and the depth map information;
acquiring the time interval between image acquisition and image processing;
determining the distance between the robot and the warning line according to the current moving speed of the robot, the time interval, and the radar information of the region where the warning line is located;
and generating a control signal according to the distance between the robot and the warning line and sending the control signal to the robot so that the robot performs the corresponding operation.
2. The warning line infrared induction identification method according to claim 1, wherein the design rule of the warning line includes:
after the depth camera captures the yellow regions of the warning line, the spacing between the contours of the yellow regions in the binarized image is 25 to 55 pixels;
the number of contours of the yellow regions of the warning line is at least six.
3. The warning line infrared induction identification method according to claim 2, wherein the yellow regions of the warning line are made of a sticker material capable of reflecting infrared rays.
4. The warning line infrared induction identification method according to claim 1, wherein the determining the region where the warning line is located from the binarized image according to the design rule of the warning line comprises:
deleting, from the binarized image, content that does not satisfy the distribution rule for the number and spacing of the contours of the yellow regions of the warning line, so as to obtain the contours of the yellow regions of the warning line;
storing the contours of the yellow regions of the warning line in a container;
and determining a circumscribed rectangular frame of the contours of the yellow regions of the warning line to obtain the region where the warning line is located.
5. The warning line infrared induction identification method according to claim 4, wherein the determining a circumscribed rectangular frame of the contours of the yellow regions of the warning line to obtain the region where the warning line is located comprises:
drawing a circumscribed rectangular frame around the contours of the yellow regions of the warning line by using the rectangle function of OpenCV, so as to obtain the region where the warning line is located.
6. The warning line infrared induction identification method according to claim 1, wherein the determining radar information of the region where the warning line is located according to that region and the depth map information comprises:
converting the depth map information into a three-dimensional point cloud;
converting the three-dimensional point cloud into radar data;
and determining the radar information of the region where the warning line is located according to the region where the warning line is located and the radar data.
7. Warning line infrared induction recognition device, its characterized in that includes:
the image acquisition unit is used for acquiring an image with a warning line shot by the depth camera to obtain a binary image and depth map information;
the region determining unit is used for determining a region where the warning line is located from the binarized image according to the design rule of the warning line;
the radar information determining unit is used for determining radar information of the area where the warning line is located according to the area where the warning line is located and the depth map information;
the interval acquisition unit is used for acquiring the time interval between image acquisition and image processing;
the distance determining unit is used for determining the distance between the robot and the warning line according to the current moving speed of the robot, the time interval and radar information of the region where the warning line is located;
and the signal generating unit is used for generating a control signal according to the distance between the robot and the warning line and sending the control signal to the robot so as to enable the robot to perform corresponding operation.
8. The warning line infrared induction recognition device according to claim 7, wherein the region determining unit comprises:
a deleting subunit, configured to delete, from the binarized image, content that does not satisfy the distribution rule for the number and spacing of the contours of the yellow regions of the warning line, so as to obtain the contours of the yellow regions of the warning line;
a storage subunit, configured to store the contours of the yellow regions of the warning line in a container;
and a circumscribed rectangular frame determining subunit, configured to determine the circumscribed rectangular frame of the contours of the yellow regions of the warning line so as to obtain the region where the warning line is located.
9. A computer device, characterized in that the computer device comprises a memory, on which a computer program is stored, and a processor, which when executing the computer program implements the method according to any of claims 1 to 6.
10. A storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 6.
CN202110978216.5A 2021-08-23 2021-08-23 Warning line infrared induction identification method and device, computer equipment and storage medium Pending CN113657331A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110978216.5A CN113657331A (en) 2021-08-23 2021-08-23 Warning line infrared induction identification method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110978216.5A CN113657331A (en) 2021-08-23 2021-08-23 Warning line infrared induction identification method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113657331A true CN113657331A (en) 2021-11-16

Family

ID=78492773

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110978216.5A Pending CN113657331A (en) 2021-08-23 2021-08-23 Warning line infrared induction identification method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113657331A (en)


Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0848199A (en) * 1994-08-09 1996-02-20 Hitachi Ltd Obstacle alarm system
KR101489836B1 (en) * 2013-09-13 2015-02-04 자동차부품연구원 Pedestrian detecting and collision avoiding apparatus and method thereof
US20170368686A1 (en) * 2016-06-28 2017-12-28 Qihan Technology Co., Ltd. Method and device for automatic obstacle avoidance of robot
CN107139666A (en) * 2017-05-19 2017-09-08 四川宝天智控***有限公司 Obstacle detouring identifying system and method
CN108152808A (en) * 2017-11-23 2018-06-12 安徽四创电子股份有限公司 A kind of circumference intelligent predicting method for early warning based on millimetre-wave radar
CN108805906A (en) * 2018-05-25 2018-11-13 哈尔滨工业大学 A kind of moving obstacle detection and localization method based on depth map
CN208801979U (en) * 2018-08-08 2019-04-30 上海工程技术大学 A kind of novel platform safe-guard line
CN109829386A (en) * 2019-01-04 2019-05-31 清华大学 Intelligent vehicle based on Multi-source Information Fusion can traffic areas detection method
CN110146865A (en) * 2019-05-31 2019-08-20 阿里巴巴集团控股有限公司 Target identification method and device for radar image
CN110733039A (en) * 2019-10-10 2020-01-31 南京驭行科技有限公司 Automatic robot driving method based on VFH + and vision auxiliary decision
CN111191600A (en) * 2019-12-30 2020-05-22 深圳元戎启行科技有限公司 Obstacle detection method, obstacle detection device, computer device, and storage medium
CN111812634A (en) * 2020-06-05 2020-10-23 森思泰克河北科技有限公司 Method, device and system for monitoring warning line target
CN112650300A (en) * 2021-01-07 2021-04-13 深圳市君航品牌策划管理有限公司 Unmanned aerial vehicle obstacle avoidance method and device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115373407A (en) * 2022-10-26 2022-11-22 北京云迹科技股份有限公司 Method and device for robot to automatically avoid safety warning line

Similar Documents

Publication Publication Date Title
CN106951847B (en) Obstacle detection method, apparatus, device and storage medium
KR102109941B1 (en) Method and Apparatus for Vehicle Detection Using Lidar Sensor and Camera
CN106485233A (en) Drivable region detection method, device and electronic equipment
CN110852312B (en) Cliff detection method, mobile robot control method, and mobile robot
EP3709134A1 (en) Tool and method for annotating a human pose in 3d point cloud data
JP4118452B2 (en) Object recognition device
CN110147698A (en) System and method for lane detection
CN106503653A (en) Area marking method, device and electronic equipment
JP5023186B2 (en) Object motion detection system based on combination of 3D warping technique and proper object motion (POM) detection
US20150169980A1 (en) Object recognition device
JP2023500994A (en) Obstacle recognition method, device, autonomous mobile device and storage medium
EP3951645A1 (en) Method and apparatus for detecting state of holding steering wheel by hands
CN106573588A (en) Drive assist device, drive assist method, and program
CN109635816A (en) Lane line generation method, device, equipment and storage medium
CN110033621A (en) A kind of hazardous vehicles detection method, apparatus and system
KR101667835B1 (en) Object localization using vertical symmetry
CN111368612A (en) Overman detection system, personnel detection method and electronic equipment
CN113657331A (en) Warning line infrared induction identification method and device, computer equipment and storage medium
CN112741555A (en) Cleaning method, system and cleaning equipment
CN113806464A (en) Road tooth determining method, device, equipment and storage medium
Lion et al. Smart speed bump detection and estimation with kinect
KR101236234B1 (en) Detection system of road line using both laser sensor and camera
JP2014056295A (en) Vehicle periphery monitoring equipment
JP7043787B2 (en) Object detection system
CN113053127B (en) Intelligent real-time state detection system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination