CN112817302B - Safety control method, system, equipment and readable medium for industrial robot - Google Patents

Safety control method, system, equipment and readable medium for industrial robot

Info

Publication number
CN112817302B
CN112817302B (Application CN201911124556.0A)
Authority
CN
China
Prior art keywords
industrial robot
target object
motion
safety protection
safety
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911124556.0A
Other languages
Chinese (zh)
Other versions
CN112817302A (en)
Inventor
吴曼玲
刘景亚
刘向东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CISDI Engineering Co Ltd
CISDI Research and Development Co Ltd
Original Assignee
CISDI Engineering Co Ltd
CISDI Research and Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CISDI Engineering Co Ltd, CISDI Research and Development Co Ltd filed Critical CISDI Engineering Co Ltd
Priority to CN201911124556.0A priority Critical patent/CN112817302B/en
Publication of CN112817302A publication Critical patent/CN112817302A/en
Application granted granted Critical
Publication of CN112817302B publication Critical patent/CN112817302B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/029Location-based management or tracking services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30241Trajectory
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Electromagnetism (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a safety control method, system, device and readable medium for an industrial robot. The method comprises: processing a collected work area image of the industrial robot to obtain a safety protection area of the industrial robot; acquiring position parameters of a target object and predicting the target object's motion trajectory from those parameters; and controlling the motion of the industrial robot according to the safety protection area and the predicted trajectory. The danger area of the industrial robot, i.e. the safety protection area, is thus calculated dynamically, and the robot's action is planned intelligently according to the motion state of the target object, reducing the shutdown losses that target-object intrusion causes under conventional schemes.

Description

Safety control method, system, equipment and readable medium for industrial robot
Technical Field
The invention relates to the field of robot safety control, and in particular to a safety control method, system, device and readable medium for an industrial robot.
Background
The robot technology is widely applied to the fields of automobile and automobile part manufacturing industry, heavy machinery, aerospace, ships, chemical industry, electronic industry and the like.
In factories, robots are gradually replacing people in repetitive and high-risk work, but because robot safety protection measures are often inadequate and safety interlocks are missing, accidents in which a robot injures a person still occur from time to time. Existing robot safety protection measures mainly comprise safety fences, safety light curtains (gratings), safety mats and the like; once a sensor-triggered signal is transmitted to the robot controller, the robot stops moving. Although this approach can suppress robot safety accidents, the danger area is fixed and the control strategy is inflexible, so the robot stops work whenever a target object enters the predetermined area. This greatly affects production in the process industry, where any stoppage brings large economic losses. A more intelligent and flexible safety detection and control method is therefore necessary for the operation of industrial robots in the process industry.
Disclosure of Invention
In view of the above-mentioned drawbacks of the prior art, it is an object of the present invention to provide a safety control method, system, device and readable medium for an industrial robot, which solve the problem of inflexible robot safety monitoring and control.
To achieve the above and other related objects, the present invention provides a safety control method of an industrial robot, including:
processing the collected working area image of the industrial robot, detecting whether a target enters a monitoring area, and if the target enters the monitoring area, acquiring a safety protection area of the industrial robot;
acquiring a position parameter of a target object, and predicting a motion track of the target object according to the position parameter;
and controlling the industrial robot to move according to the safety protection area and the motion track of the target object.
Optionally, the safety control method of the industrial robot includes:
detecting a target object in the working area image;
judging whether the target object enters a monitoring area or not;
and if the target object enters the monitoring area, obtaining the safety protection area of the industrial robot.
Optionally, the step of obtaining a safety zone of the industrial robot includes:
collecting the spatial position coordinates of the joint point and the end tool point of the industrial robot;
performing two-dimensional coordinate conversion on the spatial position coordinates to obtain two-dimensional coordinate points on a motion plane of the industrial robot;
and acquiring the safety protection area according to the two-dimensional coordinate points of the joint and the two-dimensional coordinate points of the tail end tool point.
Optionally, the step of obtaining a safety zone of the industrial robot comprises:
connecting the joint two-dimensional coordinate points of each joint point at a plurality of moments with the two-dimensional coordinate points of the tail end tool point to obtain corresponding projection lines at each moment;
performing line-surface conversion on the projection lines at each moment according to a preset proportion to obtain corresponding projection surfaces at each moment;
and overlapping the projection surfaces at all times to obtain the safety protection area.
Optionally, the method includes: acquiring the position parameters of the target object at different moments by using an imaging system.
Optionally, the step of obtaining the position parameter of the target object includes:
processing the plurality of groups of position parameters through a track prediction model to obtain a target object motion track;
and predicting the position parameters of the target object at a specific moment according to the motion trail of the target object.
Optionally, the step of obtaining the position parameter of the target object includes:
judging whether the position parameters of the target object at a specific moment fall into the safety protection area or not;
if yes, the target object is avoided by controlling the motion state of the industrial robot, wherein the motion state comprises the following steps: speed of movement, direction of movement.
The present invention also provides a safety control system of an industrial robot, comprising:
the image acquisition unit is used for acquiring a working area image of the industrial robot;
the processing unit is used for processing the working area image, judging whether a target object intrudes into the monitoring area or not, and calculating the safety protection area of the industrial robot if the target object intrudes into the monitoring area;
the position prediction unit is used for acquiring position parameters of the target object and predicting the motion track of the target object according to the position parameters;
and the motion decision unit controls the industrial robot to move according to the safety protection area and the motion track of the target object.
The present invention also provides an apparatus comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform one or more of the safety control methods described above.
The present invention also provides one or more machine-readable media having instructions stored thereon that, when executed by one or more processors, cause an apparatus to perform one or more of the safety control methods described above.
As described above, the safety control method, system, device and readable medium for an industrial robot provided by the present invention process the collected work area image of the industrial robot to obtain the robot's safety protection area; acquire the position parameters of a target object and predict its motion trajectory from those parameters; and control the motion of the industrial robot according to the safety protection area and the predicted trajectory of the target object. The danger area of the industrial robot, i.e. the safety protection area, is thus calculated dynamically, and the robot's action is planned intelligently according to the motion state of the target object, reducing the shutdown losses caused by target-object intrusion under conventional schemes.
Drawings
Fig. 1 is a schematic method flow diagram of a safety control method for an industrial robot according to an embodiment.
Fig. 2 is a block diagram of a safety control system of an industrial robot according to an embodiment.
Fig. 3 is a schematic diagram of a hardware structure of a terminal device according to an embodiment.
Fig. 4 is a schematic diagram of a hardware structure of a terminal device according to another embodiment.
Description of the element reference
Image acquisition unit 10, processing unit 20, position prediction unit 30, motion decision unit 40
Detailed Description
The embodiments of the present invention are described below by way of specific examples, and those skilled in the art can easily understand other advantages and effects of the present invention from the disclosure of this specification. The invention may also be implemented or applied through other different embodiments, and the details of this specification may be modified or changed from various viewpoints and applications without departing from the spirit and scope of the invention. It should be noted that the features of the following embodiments and examples may be combined with each other in the absence of conflict.
It should be noted that, with reference to fig. 1 to 4, the drawings provided in the following embodiments only illustrate the basic idea of the invention schematically: they show only the components related to the invention rather than the number, shape and size of the components in an actual implementation. The type, number and proportion of the components in an actual implementation may be changed arbitrarily, and the component layout may be more complicated.
Referring to fig. 1, the present invention provides a safety control method for an industrial robot, including:
s10: processing the acquired work area image of the industrial robot, and if a target object enters a monitoring area, acquiring a safety protection area of the industrial robot;
s20: acquiring a position parameter of a target object, and predicting a motion track of the target object according to the position parameter;
s30: and controlling the industrial robot to move according to the motion tracks of the safety protection area and the target object.
In some embodiments, a safety control method of an industrial robot includes:
detecting a target object in the working area image;
judging whether the target object enters a monitoring area or not;
and if the target object enters the monitoring area, determining a robot safety protection area according to the current motion state of the robot.
It will be appreciated that the work area image of the industrial robot may be acquired at fixed intervals by the image acquisition unit 10, which is installed at suitable positions around the robot's active area. Target-object detection on the work area image can be realized through image background modeling or image foreground detection techniques; once a target object is detected entering the monitoring area, calculation of the industrial robot's safety protection area begins. In general, the safety protection area is relatively close to the industrial robot, while the monitoring area adjoins the safety protection area and, relative to the protection area, lies farther from the robot.
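As a concrete illustration of the detection step just described, the sketch below flags a moving target with background modeling and tests whether it has entered the monitoring area. It is a minimal sketch assuming OpenCV 4.x; the polygon coordinates, subtractor parameters and blob-size threshold are illustrative values, not figures from the patent.

```python
# Illustrative intrusion check: background modeling + monitoring-area polygon test.
# All numeric parameters here are assumptions chosen for the example.
import cv2
import numpy as np

monitor_polygon = np.array([[100, 100], [540, 100], [540, 380], [100, 380]],
                           dtype=np.int32).reshape(-1, 1, 2)
bg_subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

def target_in_monitoring_area(frame) -> bool:
    mask = bg_subtractor.apply(frame)                        # foreground (moving) pixels
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) < 500:                         # ignore small noise blobs
            continue
        x, y, w, h = cv2.boundingRect(c)
        centroid = (float(x + w / 2), float(y + h / 2))
        # >= 0 means the centroid lies inside or on the monitoring-area polygon
        if cv2.pointPolygonTest(monitor_polygon, centroid, False) >= 0:
            return True
    return False
```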
In some embodiments, the step of obtaining a safety zone of the industrial robot comprises:
collecting the spatial position coordinates of the joint point and the end tool point of the industrial robot;
performing two-dimensional coordinate conversion on the spatial position coordinates to obtain two-dimensional coordinate points on a motion plane of the industrial robot;
and acquiring the safety protection area according to the two-dimensional coordinate points of the joint and the two-dimensional coordinate points of the tail end tool point.
In some embodiments, the step of obtaining a safety zone of the industrial robot further comprises:
connecting the joint two-dimensional coordinate points of each joint point at a plurality of moments with the two-dimensional coordinate points of the tail end tool point to obtain corresponding projection lines at each moment;
performing line-surface conversion on the projection lines at each moment according to a preset proportion to obtain corresponding projection surfaces at each moment;
and overlapping the projection surfaces at all times to obtain the safety protection area.
It is understood that the calculation of the safety zone may be performed by the following steps:
carrying out simulation operation on the motion path of the industrial robot in a computing system to obtain the motion coordinate of each joint point of the robot at each moment;
recording the moment at which the target object enters the monitoring area as t1, and obtaining at that moment the spatial position coordinates (x1, y1, z1) of the first joint point, (x3, y3, z3) of the third joint point, (x6, y6, z6) of the sixth joint point, and the end tool point (TCP) coordinates (xtcp, ytcp, ztcp);
projecting the spatial position coordinates of the first, third and sixth joint points and of the TCP onto the XY plane to obtain the corresponding two-dimensional coordinate points (x1, y1), (x3, y3), (x6, y6) and (xtcp, ytcp); connecting the four two-dimensional coordinate points to obtain a projection line, and expanding the projection line by a certain width, i.e. widening it by a certain multiple such as 20-30 times, to obtain the projection surface S1 at that moment;
repeating the steps at the next moment to obtain the area S2 of the projection surface at the next moment;
and superposing the S1 and the S2 to obtain the safety protection area.
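The projection, line-to-surface widening and superposition described in the steps above could be prototyped as in the following sketch. It assumes the shapely geometry library; the joint and TCP coordinates and the widening width are illustrative values only, and `buffer` stands in for the "widening by a certain multiple" operation.

```python
# Minimal sketch of the safety-protection-area computation: project joints and TCP
# to the XY plane, connect them into a projection line, widen it into a surface,
# and superpose the surfaces obtained at successive moments.
from shapely.geometry import LineString
from shapely.ops import unary_union

def projection_surface(joints_3d, tcp_3d, width):
    """Line-to-surface conversion for one moment."""
    pts_2d = [(x, y) for x, y, _z in joints_3d] + [(tcp_3d[0], tcp_3d[1])]
    return LineString(pts_2d).buffer(width)        # widened projection line (e.g. S1)

def safety_protection_area(samples, width=0.25):
    """Superpose the projection surfaces of all sampled moments."""
    return unary_union([projection_surface(j, t, width) for j, t in samples])

# e.g. joints 1, 3, 6 plus the TCP at two moments t1 and t2 (illustrative numbers)
t1 = ([(0.0, 0.0, 0.4), (0.3, 0.1, 0.8), (0.6, 0.2, 0.9)], (0.8, 0.25, 0.7))
t2 = ([(0.0, 0.0, 0.4), (0.3, 0.2, 0.8), (0.6, 0.4, 0.9)], (0.8, 0.50, 0.7))
zone = safety_protection_area([t1, t2])            # union of S1 and S2
```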
In some embodiments, the safety control method of an industrial robot further includes: and acquiring the position parameters of the target object at different moments.
In some embodiments, the step of obtaining the position parameter of the target object comprises:
processing the plurality of groups of position parameters through a track prediction model to obtain a target object motion track;
and predicting the position parameters of the target object at a specific moment according to the motion trail of the target object.
In some embodiments, the position parameter of the target object at a specific moment may be predicted by establishing a motion-trajectory prediction model; a grey prediction method may be adopted to predict the trajectory and the position parameters, which may specifically be completed by the following steps.
The x-coordinate column vector taken from the target object's motion positions (x, y, z) at a number of historical moments is:
X^{(0)} = (x^{(0)}(1), x^{(0)}(2), \ldots, x^{(0)}(n))^{T}, \quad n \ge 4 \qquad (1)
where x^{(0)}(n) is the position value at the n-th historical moment, the superscript (0) in formula (1) indicates raw data, n is the index of the raw data, and T denotes matrix transposition.
Accumulating the raw data sequence gives:
X^{(1)} = (x^{(1)}(1), x^{(1)}(2), \ldots, x^{(1)}(n))^{T}, \quad n \ge 4 \qquad (2)
where
x^{(1)}(k) = \sum_{i=1}^{k} x^{(0)}(i), \quad k = 1, 2, \ldots, n
and the superscript (1) indicates one accumulation generation.
The generated accumulated data are fitted and approximated by a linear dynamic model of the form:
dx^{(1)}/dt + a x^{(1)} = u \qquad (3)
where a and u are the parameters to be obtained through modeling: a is the development coefficient and u is the grey action quantity.
The solution of equation (3) is:
x^{(1)}(t) = (x^{(1)}(t_0) - u/a) e^{-a(t - t_0)} + u/a \qquad (4)
where t denotes a certain time and t_0 is the initial time.
Sampling x^{(1)}(t) at equal intervals gives the discrete values:
\hat{x}^{(1)}(k+1) = (x^{(0)}(1) - u/a) e^{-ak} + u/a \qquad (5)
where k in formula (5) is the k-th discrete point.
The coefficients a and u of the differential equation can be found by the least-squares method, whose vector form is:
[a, u]^{T} = (B^{T} B)^{-1} B^{T} Y \qquad (6)
Y = [x^{(0)}(2), x^{(0)}(3), \ldots, x^{(0)}(n)]^{T} \qquad (7)
B is the (n-1) x 2 matrix whose k-th row is [\, -\tfrac{1}{2}(x^{(1)}(k) + x^{(1)}(k+1)), \; 1 \,], \quad k = 1, 2, \ldots, n-1 \qquad (8)
where x^{(1)} in formula (8) is the first-order accumulated data generated from the raw data.
Substituting the estimated a and u into formula (5) yields
\hat{x}^{(1)}(k+1) \qquad (9)
the first-order accumulated data calculated from the differential equation.
According to
\hat{x}^{(0)}(k+1) = \hat{x}^{(1)}(k+1) - \hat{x}^{(1)}(k)
the accumulation is reversed (inverse accumulated generation), and the predicted data \hat{x}^{(0)} are calculated.
The above grey-prediction formulas predict the future x coordinate; it can be understood that the future y and z coordinates can be predicted by repeating the same steps, so that the position parameter of the target object at a specific moment, i.e. its specific position coordinates at a certain future time, can be obtained.
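Equations (1) to (9) above follow the standard GM(1,1) grey model, so the prediction can be prototyped compactly; the sketch below uses NumPy, and the sample history values are invented for illustration.

```python
# GM(1,1) grey prediction of a single coordinate series, mirroring eqs. (1)-(9).
import numpy as np

def gm11_predict(x0, steps_ahead=1):
    x0 = np.asarray(x0, dtype=float)
    n = x0.size
    assert n >= 4, "grey prediction needs at least 4 historical points (eq. 1)"
    x1 = np.cumsum(x0)                                    # accumulated generation, eq. (2)
    B = np.column_stack((-0.5 * (x1[:-1] + x1[1:]),       # construction of B, eq. (8)
                         np.ones(n - 1)))
    Y = x0[1:]                                            # eq. (7)
    a, u = np.linalg.lstsq(B, Y, rcond=None)[0]           # least-squares estimate, eq. (6)
    k = np.arange(n + steps_ahead)
    x1_hat = (x0[0] - u / a) * np.exp(-a * k) + u / a     # eqs. (5)/(9)
    x0_hat = np.diff(x1_hat)                              # inverse accumulated generation
    return x0_hat[n - 1:]                                 # predicted future raw values

# predict the next x coordinate from five historical positions (illustrative data);
# the same routine would be run again for the y and z coordinate series
history_x = [1.20, 1.35, 1.52, 1.71, 1.93]
print(gm11_predict(history_x, steps_ahead=1))
```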
In some embodiments, the step of obtaining the position parameter of the target object comprises:
judging whether the position parameters of the target object at a specific moment fall into the safety protection area or not;
if yes, the target object is avoided by controlling the motion state of the industrial robot, wherein the motion state comprises the following steps: speed of movement, direction of movement.
It can be understood that if the position parameter of the target object at a specific moment falls within the safety protection area, the motion state of the industrial robot is changed to avoid the loss that a collision between the industrial robot and the target object would cause. In some embodiments, the motion state may be stationary or moving; for example, when the target object is moving too fast, the industrial robot may be controlled to stop at a position away from the target object's trajectory. The motion state of the industrial robot can thus be adjusted flexibly and intelligently to avoid unnecessary property loss.
In some embodiments, avoiding the target object by controlling the motion state of the industrial robot further comprises determining the current motion speed of the industrial robot: if the motion speed of the industrial robot is greater than the upper-limit speed, the industrial robot moves in the opposite direction; if the motion speed is greater than the lower-limit speed and less than the upper-limit speed, the industrial robot stops; and if the motion speed is less than the lower-limit speed, the motion speed of the industrial robot is reduced. The upper-limit and lower-limit speeds may be set according to the type and application of the robot, for example an upper limit of 200 mm/s and a lower limit of 50 mm/s, without limitation here.
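The speed-threshold decision just described could look like the following sketch. The `robot` controller interface is a hypothetical abstraction, the safety zone is assumed to be a shapely polygon as in the earlier sketch, and the 200 mm/s and 50 mm/s thresholds simply reuse the example values given in the text.

```python
# Avoidance decision: change the robot's motion state only when the predicted
# target position falls inside the safety protection area.
from shapely.geometry import Point

UPPER_SPEED_MM_S = 200.0   # example upper-limit speed from the text
LOWER_SPEED_MM_S = 50.0    # example lower-limit speed from the text

def avoid_target(robot, safety_zone, predicted_xy):
    if not safety_zone.contains(Point(predicted_xy)):
        return                            # prediction stays outside: keep working
    speed = robot.current_speed_mm_s()    # hypothetical controller query
    if speed > UPPER_SPEED_MM_S:
        robot.reverse_direction()         # too fast: move in the opposite direction
    elif speed > LOWER_SPEED_MM_S:
        robot.stop()                      # moderate speed: stop the motion
    else:
        robot.slow_down()                 # already slow: reduce the motion speed
```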
Referring to fig. 2, the present invention further provides a safety control system for an industrial robot, including:
the image acquisition unit 10 is used for acquiring a working area image of the industrial robot;
the processing unit 20 is used for processing the working area image, judging whether a target object enters a monitoring area or not, and if the target object enters the monitoring area, acquiring a safety protection area of the industrial robot;
the position prediction unit 30 is configured to acquire a position parameter of a target object, and predict a motion trajectory of the target object according to the position parameter;
and the motion decision unit 40 is used for controlling the motion of the industrial robot according to the safety protection area and the motion track of the target object.
The present invention also provides an apparatus comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform one or more of the safety control methods described above.
The present invention also provides one or more machine-readable media having instructions stored thereon that, when executed by one or more processors, cause an apparatus to perform one or more of the safety control methods described above.
As described above, the safety control method, system, device and readable medium for an industrial robot provided by the present invention process the collected work area image of the industrial robot to obtain the robot's safety protection area; acquire the position parameters of a target object and predict its motion trajectory from those parameters; and control the motion of the industrial robot according to the safety protection area and the predicted trajectory of the target object. The danger area of the industrial robot, i.e. the safety protection area, is thus calculated dynamically, and the robot's action is planned intelligently according to the motion state of the target object, reducing the shutdown losses caused by target-object intrusion under conventional schemes.
An embodiment of the present application further provides an apparatus, which may include: one or more processors; and one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the method of fig. 1. In practical applications, the device may be used as a terminal device or as a server. Examples of the terminal device include: a smart phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, a vehicle-mounted computer, a desktop computer, a set-top box, a smart television, a wearable device, and the like.
The present application further provides a non-transitory readable storage medium, where one or more modules (programs) are stored in the storage medium, and when the one or more modules are applied to a device, the device may execute instructions (instructions) of steps included in the method in fig. 1 of the present application.
Fig. 3 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application. As shown, the terminal device may include: an input device 1100, a first processor 1101, an output device 1102, a first memory 1103, and at least one communication bus 1104. The communication bus 1104 is used to implement communication connections between the elements. The first memory 1103 may include a high-speed RAM memory, and may also include a non-volatile storage NVM, such as at least one disk memory, and the first memory 1103 may store various programs for performing various processing functions and implementing the method steps of the present embodiment.
In some embodiments, the first processor 1101 may be, for example, a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components, and the first processor 1101 is coupled to the input device 1100 and the output device 1102 through a wired or wireless connection.
In some embodiments, the input device 1100 may include a variety of input devices, such as at least one of a user-oriented user interface, a device-oriented device interface, a software-programmable interface, a camera, and a sensor. In some embodiments, the device-oriented device interface may be a wired interface for data transmission between devices, or may be a hardware plug-in interface (e.g., USB interface, serial port, etc.) for data transmission between devices; in some embodiments, the user-oriented user interface may be, for example, user-oriented control keys, a voice input device for receiving voice input, and a touch-sensitive device (e.g., a touch screen with touch-sensitive functionality, a touch pad, etc.) for receiving user touch input; in some embodiments, the programmable interface of the software may be, for example, an entry for a user to edit or modify a program, such as an input pin interface or an input interface of a chip; the output devices 1102 may include output devices such as a display, audio, and the like.
In this embodiment, the processor of the terminal device includes a function for executing each module of the speech recognition apparatus in each device, and specific functions and technical effects may refer to the above embodiments, which are not described herein again.
Fig. 4 is a schematic hardware structure diagram of a terminal device according to an embodiment of the present application. Fig. 4 is a specific embodiment of fig. 3 in an implementation process. As shown, the terminal device of the present embodiment may include a second processor 1201 and a second memory 1202.
The second processor 1201 executes the computer program code stored in the second memory 1202 to implement the method of fig. 1 in the above embodiment.
The second memory 1202 is configured to store various types of data to support operations at the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, such as messages, pictures, videos, and so forth. The second memory 1202 may include a Random Access Memory (RAM) and may also include a non-volatile memory (non-volatile memory), such as at least one disk memory.
Optionally, a second processor 1201 is provided in the processing assembly 1200. The terminal device may further include: communication component 1203, power component 1204, multimedia component 1205, speech component 1206, input/output interfaces 1207, and/or sensor component 1208. The specific components included in the terminal device are set according to actual requirements, which is not limited in this embodiment.
The processing component 1200 generally controls the overall operation of the terminal device. The processing assembly 1200 may comprise one or more second processors 1201 to execute instructions to perform all or part of the steps of the data processing method described above. Further, the processing component 1200 can include one or more modules that facilitate interaction between the processing component 1200 and other components. For example, the processing component 1200 can include a multimedia module to facilitate interaction between the multimedia component 1205 and the processing component 1200.
The power supply component 1204 provides power to the various components of the terminal device. The power components 1204 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the terminal device.
The multimedia components 1205 include a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the display screen includes a touch panel, the display screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The voice component 1206 is configured to output and/or input voice signals. For example, the voice component 1206 includes a Microphone (MIC) configured to receive external voice signals when the terminal device is in an operational mode, such as a voice recognition mode. The received speech signal may further be stored in the second memory 1202 or transmitted via the communication component 1203. In some embodiments, the speech component 1206 further comprises a speaker for outputting speech signals.
The input/output interface 1207 provides an interface between the processing component 1200 and peripheral interface modules, which can be click wheels, buttons, and the like. These buttons may include, but are not limited to: a volume button, a start button, and a lock button.
The sensor component 1208 includes one or more sensors for providing various aspects of status assessment for the terminal device. For example, the sensor component 1208 may detect an open/closed state of the terminal device, relative positioning of the components, presence or absence of user contact with the terminal device. The sensor assembly 1208 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor assembly 1208 may also include a camera or the like.
The communication component 1203 is configured to facilitate wired or wireless communication between the terminal device and other devices. The terminal device may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot for inserting a SIM card, so that the terminal device can log onto a GPRS network and establish communication with the server via the Internet.
As can be seen from the above, the communication component 1203, the voice component 1206, the input/output interface 1207 and the sensor component 1208 referred to in the embodiment of fig. 4 can be implemented as the input device in the embodiment of fig. 3.
The foregoing embodiments are merely illustrative of the principles and utilities of the present invention and are not intended to limit the invention. Any person skilled in the art can modify or change the above-mentioned embodiments without departing from the spirit and scope of the present invention. Accordingly, it is intended that all equivalent modifications or changes which can be made by those skilled in the art without departing from the spirit and technical spirit of the present invention be covered by the claims of the present invention.

Claims (8)

1. A safety control method of an industrial robot, characterized by comprising:
processing the collected work area image of the industrial robot to obtain a safety protection area of the industrial robot;
acquiring position parameters of a target object, and predicting the motion track of the target object according to the position parameters;
controlling the industrial robot to move according to the safety protection area and the motion track of the target object;
the step of obtaining a safety zone of the industrial robot comprises:
collecting the spatial position coordinates of the joint point and the end tool point of the industrial robot;
performing two-dimensional coordinate conversion on the spatial position coordinates to obtain two-dimensional coordinate points on a motion plane of the industrial robot;
acquiring the safety protection area according to the two-dimensional coordinate point;
connecting the joint two-dimensional coordinate points of each joint point at a plurality of moments with the two-dimensional coordinate points of the tail end tool point to obtain corresponding projection lines at each moment;
performing line-surface conversion on the projection lines at each moment according to a preset proportion to obtain corresponding projection surfaces at each moment;
and overlapping the projection surfaces at all times to obtain the safety protection area.
2. A safety control method of an industrial robot according to claim 1, characterized by comprising:
detecting a target object in the working area image;
judging whether the target object enters a monitoring area or not;
and if the target object enters the monitoring area, obtaining the safety protection area of the industrial robot.
3. A safety control method of an industrial robot according to claim 1, characterized by comprising: and acquiring the position parameters of the target object at different moments.
4. A safety control method of an industrial robot according to claim 1, characterized in that the step of obtaining a position parameter of the object comprises:
processing the plurality of groups of position parameters through a track prediction model to obtain a target object motion track;
and predicting the position parameters of the target object at a specific moment according to the motion trail of the target object.
5. The safety control method of an industrial robot according to any of claims 1 or 4, characterized in that the step of obtaining the position parameter of the object comprises:
judging whether the position parameters of the target object at a specific moment fall into the safety protection area or not;
if yes, the target object is avoided by controlling the motion state of the industrial robot, wherein the motion state comprises the following steps: speed of movement, direction of movement.
6. A safety control system of an industrial robot, comprising:
the image acquisition unit is used for acquiring a working area image of the industrial robot;
the processing unit is used for processing the working area image, judging whether a target object intrudes into the monitoring area or not, and if so, acquiring a safety protection area of the industrial robot;
the position prediction unit is used for acquiring position parameters of a target object and predicting the motion track of the target object according to the position parameters;
the motion decision unit controls the industrial robot to move according to the safety protection area and the motion track of the target object;
the step of obtaining a safety zone of the industrial robot comprises:
collecting space position coordinates of a joint point and a terminal tool point of the industrial robot;
performing two-dimensional coordinate conversion on the spatial position coordinates to obtain two-dimensional coordinate points on a motion plane of the industrial robot;
acquiring the safety protection area according to the two-dimensional coordinate point;
connecting the joint two-dimensional coordinate points of each joint point at a plurality of moments with the two-dimensional coordinate points of the tail end tool point to obtain corresponding projection lines at each moment;
performing line-surface conversion on the projection lines at each moment according to a preset proportion to obtain corresponding projection surfaces at each moment;
and superposing the projection surfaces at all times to obtain the safety protection area.
7. An apparatus, comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the method recited by one or more of claims 1-5.
8. A machine-readable medium having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform the method of one or more of claims 1-5.
CN201911124556.0A 2019-11-18 2019-11-18 Safety control method, system, equipment and readable medium for industrial robot Active CN112817302B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911124556.0A CN112817302B (en) 2019-11-18 2019-11-18 Safety control method, system, equipment and readable medium for industrial robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911124556.0A CN112817302B (en) 2019-11-18 2019-11-18 Safety control method, system, equipment and readable medium for industrial robot

Publications (2)

Publication Number Publication Date
CN112817302A CN112817302A (en) 2021-05-18
CN112817302B true CN112817302B (en) 2023-04-07

Family

ID=75852081

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911124556.0A Active CN112817302B (en) 2019-11-18 2019-11-18 Safety control method, system, equipment and readable medium for industrial robot

Country Status (1)

Country Link
CN (1) CN112817302B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114389861B (en) * 2021-12-24 2023-03-03 北京科技大学 Mechanical arm safety detection method and system based on EtherCAT automation

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2380709A2 (en) * 2010-04-22 2011-10-26 Sick AG 3D safety device and method for securing and operating at least one machine

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102323822B (en) * 2011-05-09 2013-07-03 无锡引域智能机器人有限公司 Method for preventing industrial robot from colliding with worker
CN104723350B (en) * 2015-03-16 2016-07-20 珠海格力电器股份有限公司 Industrial robot safety intelligent control method and system
CN106598046B (en) * 2016-11-29 2020-07-10 北京儒博科技有限公司 Robot avoidance control method and device
CN106956261A (en) * 2017-04-11 2017-07-18 华南理工大学 A kind of man-machine interaction mechanical arm system and method with security identification zone
DE102018214439A1 (en) * 2017-11-17 2019-05-23 Volkswagen Aktiengesellschaft Method and device for securing a working area of a robot during a use phase
CN109500811A (en) * 2018-11-13 2019-03-22 华南理工大学 A method of the mankind are actively avoided towards man-machine co-melting robot
CN109352658B (en) * 2018-12-04 2024-02-23 中冶赛迪工程技术股份有限公司 Industrial robot positioning control method, system and computer readable storage medium

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2380709A2 (en) * 2010-04-22 2011-10-26 Sick AG 3D safety device and method for securing and operating at least one machine

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
孟得山; 王学谦; 梁斌; 梁建. Safety configuration optimization of flexible-joint manipulators based on a head-collision equivalent model. 机器人 (Robot). 2017, (04), full text. *
邹玉静; 闵华松; 陈友东. Intelligent obstacle-avoidance trajectory planning and simulation for a hybrid palletizing robot. 计算机仿真 (Computer Simulation). 2013, (07), full text. *

Also Published As

Publication number Publication date
CN112817302A (en) 2021-05-18

Similar Documents

Publication Publication Date Title
US9237315B2 (en) Intrusion detection with directional sensing
CN103257812B (en) A kind of method adjusting display output and electronic equipment
TW201543342A (en) Electronic apparatus and drawing method and computer products thereof
CN104360816A (en) Screen capture method and system
Memarzadeh et al. Real-time and automated recognition and 2D tracking of construction workers and equipment from site video streams
CN102945557A (en) Vector site map drawing method based on mobile terminal
CN103076945B (en) Touch screen type interface of electronic equipment edit methods and device
CN104021004B (en) A kind of method and system that icon unit is operated
CN103412720A (en) Method and device for processing touch-control input signals
CN104035714B (en) Event processing method, device and equipment based on Android system
CN112817302B (en) Safety control method, system, equipment and readable medium for industrial robot
CN103092518A (en) Moving cloud desktop accurate touch method based on remote desktop protocol (RDP)
CN103414829A (en) Method, device and terminal device for controlling screen contents
CN104899361A (en) Remote control method and apparatus
CN110765629A (en) Method, system and equipment for calculating reflow zone
CN104699365A (en) Portable electronic device and interface display method thereof
CN105741046A (en) Information management model generation and system as well as information processing method and system
US20160004379A1 (en) Manipulation input device, portable information terminal, method for control of manipulation input device, and recording medium
CN105278669A (en) Mobile terminal control method and mobile terminal
CN105549822A (en) Icon moving method and mobile terminal
Ganjefar et al. Behavior of Smith predictor in teleoperation systems with modeling and delay time errors
CN103105957A (en) Display method and electronic equipment
CN114740854A (en) Robot obstacle avoidance control method and device
US20170262708A1 (en) Surveillance camera image pan tilt roll zoom (PTRZ) control from a 3D touch user interface apparatus and method of operation
Carton et al. Using penalized spline regression to calculate mean trajectories including confidence intervals of human motion data

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant