WO2022190548A1 - Object detection system and control device - Google Patents

Object detection system and control device

Info

Publication number
WO2022190548A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging means
object detection
distance information
output
imaging
Prior art date
Application number
PCT/JP2021/047130
Other languages
French (fr)
Japanese (ja)
Inventor
将己 藤田
Original Assignee
OMRON Corporation
Priority date
Filing date
Publication date
Application filed by OMRON Corporation
Publication of WO2022190548A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/06: Safety devices
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00: Measuring distances in line of sight; Optical rangefinders
    • G01C3/02: Details
    • G01C3/06: Use of electric means to obtain final indication
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87: Combinations of systems using electromagnetic waves other than radio waves

Definitions

  • The present invention relates to object detection systems and control devices.
  • Based on safety standards, a virtual protection area is set near a danger source such as a robot; when an object such as a human body enters the protection area, the danger source is driven at low speed or stopped.
  • An area sensor, for example, is used for object detection.
  • The protection area is set in consideration of the time required from object detection to completion of deceleration or stopping of the danger source. That is, the size of the protection area is determined by the safe distance from the danger source (the distance that guarantees that the danger source completes deceleration and stopping before an intruding object reaches it). The safe distance, in turn, is determined in consideration of the response speed of the sensor and the operating speed and braking performance of the danger source. From the viewpoint of productivity, the protection area should be as narrow as possible; from the viewpoint of safety, the shorter the time from an object actually entering the protection area to its detection, the better. An object entering the protection area must therefore be detected quickly.
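For orientation, machine-safety standards typically compute such a safe distance with a formula of the form S = K × T + C (e.g., ISO 13855, where K is the approach speed and C an intrusion allowance). The sketch below is an illustrative calculation only; the constants are common textbook values, not figures taken from this publication.

```python
# Illustrative safe-distance calculation in the style of ISO 13855
# (S = K * T + C). All numeric values are textbook assumptions, not
# values taken from this publication.

def safe_distance_mm(approach_speed_mm_s: float,
                     sensor_response_s: float,
                     machine_stop_s: float,
                     intrusion_allowance_mm: float) -> float:
    """Minimum distance the protection area boundary must keep from
    the danger source so it can stop before the intruder arrives."""
    total_time_s = sensor_response_s + machine_stop_s  # T = t1 + t2
    return approach_speed_mm_s * total_time_s + intrusion_allowance_mm

# Halving the sensor response time (which is what the staggered
# two-sensor scheme described below effectively achieves) directly
# shrinks the required safe distance, and hence the protection area.
print(safe_distance_mm(1600, 0.132, 0.300, 850))  # one sensor,   ~1541 mm
print(safe_distance_mm(1600, 0.066, 0.300, 850))  # two staggered, ~1436 mm
```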
  • The inventors are considering adopting a three-dimensional distance sensor as the area sensor in order to detect objects in three-dimensional space with high accuracy.
  • With such a three-dimensional distance sensor, a certain amount of processing time is needed to acquire the three-dimensional distance information, so the above problem becomes more pronounced.
  • Patent Document 1 discloses a technique for improving the temporal resolution of detection by using two sensors, such as magnetic sensors or torque sensors, in combination.
  • However, the device of Patent Document 1 does not relate to area sensors.
  • An object of the present invention is to provide a technique for increasing the response speed of object detection.
  • To achieve this object, the present invention adopts the following configuration.
  • A first aspect of the present invention provides an object detection system comprising: a plurality of imaging means that periodically output distance information; an object detection means that detects an object based on the distance information output by the plurality of imaging means; an optical system for optically overlapping at least part of the measurable regions of the plurality of imaging means with each other; and a control means that controls the output timings of the distance information by the plurality of imaging means so that they are shifted from each other.
  • The "imaging means" is, for example, a TOF (Time of Flight) sensor.
  • The "measurable area" is, for example, the three-dimensional space within the field of view (angle-of-view range) of each imaging means, although its depth may be limited to the range over which an accurate distance can be measured.
  • With this configuration, the output timings of the distance information from the plurality of imaging means are shifted from each other, so distance information is obtained at staggered timings and the apparent response speed increases. The response speed of object detection can therefore be raised within the region (overlapping space) where the measurable regions of the plurality of imaging means overlap.
  • The configuration of the optical system is arbitrary. For example, the optical system may have a lens that condenses incident light and a reflecting member that reflects the condensed light and guides it to each of the plurality of imaging means.
  • Alternatively, the optical system may have a lens that condenses incident light and a beam splitter that splits the condensed light and guides it to each of the plurality of imaging means.
  • The output cycle of the distance information may be common to all the imaging means, and the control means may control the plurality of imaging means so that the distance information is output at equal time intervals.
  • Specifically, when the number of imaging means is n (n being an integer of 2 or more) and the output cycle is T, the control means may control the n imaging means so that the distance information is output at time intervals of T/n.
  • For example, when two imaging means are used, the output timings of the distance information from the two are controlled to alternate. The response speed can thereby be increased efficiently, as the scheduling sketch below illustrates.
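As a concrete illustration of this T/n scheduling, the sketch below staggers the start of n free-running sensors. The `trigger()` method is a hypothetical stand-in for whatever starts one imaging-plus-computation frame on a real sensor; it is not an API defined in this publication.

```python
import threading

def start_staggered(sensors, output_cycle_s: float) -> None:
    """Start n sensors with offsets of i * T / n so that, once running,
    their distance outputs arrive at equal intervals of T / n.

    Each element of `sensors` is assumed to expose a hypothetical
    trigger() method that starts the sensor's periodic frame loop.
    """
    n = len(sensors)
    for i, sensor in enumerate(sensors):
        offset_s = i * output_cycle_s / n  # sensor i starts at i * T / n
        threading.Timer(offset_s, sensor.trigger).start()

# With two sensors and T = 66 ms, the second sensor starts 33 ms after
# the first, so results alternate at 33 ms intervals in the overlap.
```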
  • The system may further have setting means for setting a virtual protection area for object detection in the overlapping space where the measurable regions of the plurality of imaging means overlap. A high response speed can thereby be secured in the protection area.
  • The imaging means may be a TOF sensor.
  • A second aspect of the present invention provides a control device for controlling operation in a safety control system comprising a work machine, a plurality of imaging means each of which periodically captures images at a predetermined time interval, an optical system for optically overlapping at least part of the measurable regions of the plurality of imaging means with each other, and an object detection means that detects an object based on the distance information output from the plurality of imaging means, the control device comprising: an imaging control means that controls the plurality of imaging means so that their imaging timings are shifted from each other by a time shorter than the time interval; and a safety control means that restricts the operation of the work machine based on the detection result of the object detection means.
  • A "work machine" is, for example, a robot or manufacturing equipment.
  • The "imaging means" is, for example, a TOF (Time of Flight) sensor.
  • With this configuration, the plurality of imaging means are controlled so that their imaging timings are shifted from each other, so images from the individual imaging means are obtained at time intervals shorter than the imaging cycle, raising the apparent imaging rate. The response speed of object detection and safety control can therefore be increased.
  • According to the present invention, the response speed of object detection can be increased.
  • FIG. 1 is a schematic diagram of an object detection system according to one embodiment of the present invention.
  • FIG. 2 is a schematic diagram showing the measurable area of one three-dimensional distance sensor.
  • FIG. 3 is a block diagram of an object detection system.
  • FIG. 4 is a timing chart showing measurement operations of two three-dimensional distance sensors.
  • FIG. 5 is a schematic diagram showing a configuration example of an optical system.
  • FIG. 6 is a schematic diagram showing a configuration example of an optical system.
  • FIG. 7 is a flowchart showing sensor installation processing and protection area setting processing.
  • FIG. 8 is a diagram showing an example of a luminance image displayed on the screen.
  • FIG. 9 is a diagram showing a display example of the protection area.
  • FIG. 1 is a schematic diagram of an object detection system 100 according to one embodiment of the invention.
  • FIG. 2 is a schematic diagram showing the measurable area of one three-dimensional distance sensor.
  • FIG. 3 is a block diagram of the object detection system 100.
  • FIG. 4 is a timing chart showing measurement operations of two three-dimensional distance sensors 10A and 10B (hereinafter referred to as sensors 10A and 10B).
  • The sensors 10A and 10B have a common configuration.
  • As shown in FIG. 2, the three-dimensional shape of the substantial measurable area RA of the sensor 10A is, as an example, roughly a truncated quadrangular pyramid, because within the field of view of the sensor 10A the range over which an accurate distance can be measured is limited.
  • The three-dimensional shape of the substantial measurable area RB of the sensor 10B is the same as that of the measurable area RA.
  • As shown in FIG. 1, the measurable area RA of the sensor 10A and the measurable area RB of the sensor 10B overlap at least partially.
  • The three-dimensional region where the measurable areas RA and RB overlap is the overlapping space Rx.
  • The protection area 41 is preferably set inside the overlapping space Rx.
  • The protection area 41 is a virtual three-dimensional region for object detection, set in the vicinity of or around a work machine such as a robot or manufacturing equipment (hereinafter also referred to as a "danger source").
  • The protection area 41 is defined according to safety standards, taking the operating range of the danger source into account. For example, when an object such as a human body enters the protection area 41, safety control such as driving the danger source at low speed or stopping it is performed.
  • As shown in FIG. 3, the sensors 10A and 10B each include a light emitting unit 11, a light receiving unit 12, and a computing unit 13.
  • The light emitting unit 11 emits light (for example, infrared light), and the light receiving unit 12 receives the reflected light.
  • As an example, the sensors 10A and 10B are TOF sensors, which acquire a range image from the time of flight (TOF) of light.
  • For example, an indirect TOF sensor, which estimates the time difference from the phase difference between the projected light and the reflected light, is employed.
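For reference, an indirect (continuous-wave) TOF sensor of this kind recovers distance from the measured phase shift roughly as follows. This is the textbook relation shown as a sketch; it is not the actual computation performed by the computing unit 13.

```python
import math

C_M_S = 299_792_458.0  # speed of light [m/s]

def itof_distance_m(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Distance from the phase difference between projected and
    reflected light: d = c * dphi / (4 * pi * f_mod)."""
    return C_M_S * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# A pi/2 phase shift at 20 MHz modulation corresponds to about 1.87 m.
# The unambiguous range at 20 MHz is c / (2 * f_mod) = 7.5 m, which is
# one reason the accurately measurable depth range is limited.
print(itof_distance_m(math.pi / 2, 20e6))
```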
  • The sensors 10A and 10B periodically output three-dimensional distance information and luminance information as measurement results. The measurement results are supplied to the control unit 30 via the sensor I/F 14 in the sensor control unit 50.
  • The sensors 10A and 10B are controlled by the control unit 30 via the sensor I/F 14.
  • As shown in FIG. 4, the sensors 10A and 10B each periodically repeat an imaging process performed by the light emitting unit 11 and the light receiving unit 12 and an arithmetic process (including data transfer) performed by the computing unit 13.
  • One imaging process and one arithmetic process are performed within the frame period T1.
  • The frame period T1 is one cycle (the output cycle).
  • In the imaging process, in addition to light emission by the light emitting unit 11, exposure, readout of the charges received by the light receiving unit 12, and the like are executed. These operations may be performed multiple times within one imaging process.
  • In the arithmetic process, time is estimated indirectly from the measured amount of light, distance information is generated from the estimated time, and so on. After the arithmetic process, the measurement results (distance information and luminance information) are output.
  • The control unit 30, which is an example of the control means, controls the sensors 10A and 10B so that their measurement-result output timings are shifted from each other.
  • As an example, the control unit 30 controls the sensors 10A and 10B so that their measurement results are output at equal time intervals.
  • With two sensors, the period T2, which is the interval between an output from the sensor 10A and an output from the sensor 10B, is set to T1/2. More generally, with n sensors (n being an integer of 2 or more), the sensors should be controlled so that measurement results are output at time intervals of T1/n.
  • In the parts of the measurable areas RA and RB outside the overlapping space Rx, a measurement result is obtained from only one of the sensors 10A and 10B, so the acquisition cycle is the frame period T1.
  • In the overlapping space Rx, measurement results are obtained from both sensors 10A and 10B, i.e., at the period T2, so the response speed is apparently doubled.
  • In other words, the sensor 10B captures images while the sensor 10A is performing its arithmetic process, which substantially raises the frame rate. By setting the protection area 41 so that it is contained in the overlapping space Rx, the response speed within the protection area 41 can be reliably increased.
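The gain can also be expressed as a worst-case latency bound. The sketch below assumes ideal T1/n staggering and that one full frame (imaging plus computation) elapses between an exposure and its output; both are simplifying assumptions, not figures from this publication.

```python
def worst_case_latency_s(frame_period_s: float, n_sensors: int) -> float:
    """Upper bound on the wait from an object appearing in the
    overlapping space to the first output whose exposure saw it:
    up to T/n until the next exposure starts, plus ~T for that
    frame's imaging and computation to finish."""
    return frame_period_s / n_sensors + frame_period_s

# One sensor at T1 = 66 ms: up to ~132 ms.
# Two interleaved sensors:  up to ~99 ms, since a fresh exposure
# begins at most T1/2 after the object appears.
print(worst_case_latency_s(0.066, 1), worst_case_latency_s(0.066, 2))
```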
  • The optical system 20 is a member for optically overlapping at least part of the measurable areas RA and RB of the sensors 10A and 10B with each other.
  • The measurable areas RA and RB need not match perfectly, but a wide overlapping space Rx is preferable, so the optical system 20 should be designed so that the measurable areas (fields of view) of the sensors overlap as much as possible. In practice, for example, the measurable regions of the sensors should overlap by 80% or more, preferably 90% or more, and more preferably substantially 100%; a rough way to estimate this overlap is sketched below.
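Whether a given installation reaches such an overlap ratio can be estimated by approximating each measurable area with an axis-aligned bounding box. Real measurable areas are truncated pyramids, so a production check would need proper frustum geometry; this is a simplified illustration.

```python
def box_overlap_fraction(a_min, a_max, b_min, b_max) -> float:
    """Overlap volume of two axis-aligned boxes as a fraction of the
    smaller box. Each argument is an (x, y, z) corner tuple."""
    inter_vol = vol_a = vol_b = 1.0
    for lo_a, hi_a, lo_b, hi_b in zip(a_min, a_max, b_min, b_max):
        inter_vol *= max(0.0, min(hi_a, hi_b) - max(lo_a, lo_b))
        vol_a *= hi_a - lo_a
        vol_b *= hi_b - lo_b
    return inter_vol / min(vol_a, vol_b)

# 0.95 here: a 10 cm lateral misalignment of otherwise identical
# 2 m boxes still satisfies the "90% or more" guideline above.
print(box_overlap_fraction((0, 0, 0), (2, 2, 2), (0.1, 0, 0), (2.1, 2, 2)))
```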
  • The structure and optical components of the optical system 20 may be selected according to the characteristics of the measurement light used by the sensors 10A and 10B; for example, a visible-light optical system for sensors using visible light, or an infrared optical system for sensors using infrared light.
  • The control unit 30, as an example of the object detection means, detects an object based on the distance information output by the sensors 10A and 10B, which are an example of the imaging means. When either or both of the sensors 10A and 10B obtain distance information indicating a distance that falls within the measurable area, the control unit 30 can determine (detect) that there is an object in the measurable area RA or RB. Likewise, when either or both of the sensors obtain distance information indicating a distance that falls within the overlapping space Rx, the control unit 30 can determine that there is an object in the overlapping space Rx.
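In outline, this judgment amounts to back-projecting each depth pixel into a 3D point and testing it against the region of interest. The sketch below assumes a pinhole camera model with known intrinsics (fx, fy, cx, cy) and an axis-aligned region; the publication does not specify the actual detection computation.

```python
def object_in_region(depth_m, intrinsics, region_min, region_max) -> bool:
    """Return True if any valid depth pixel back-projects into the
    axis-aligned region (e.g., the protection area 41).

    depth_m[v][u] is the measured depth along the optical axis in
    metres; 0 or less marks invalid pixels. `intrinsics` is the
    assumed pinhole model (fx, fy, cx, cy)."""
    fx, fy, cx, cy = intrinsics
    (lo_x, lo_y, lo_z), (hi_x, hi_y, hi_z) = region_min, region_max
    for v, row in enumerate(depth_m):
        for u, z in enumerate(row):
            if z <= 0.0:
                continue
            x = (u - cx) * z / fx  # back-project pixel (u, v) at depth z
            y = (v - cy) * z / fy
            if lo_x <= x <= hi_x and lo_y <= y <= hi_y and lo_z <= z <= hi_z:
                return True
    return False
```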
  • In particular, since the protection area 41 lies within the overlapping space Rx, the control unit 30 can determine at a high response speed whether an object has entered the protection area 41.
  • The control unit 30, as an example of the safety control means, executes safety control that restricts the operation of the work machine (danger source) based on the object detection result. Since objects in the protection area 41 can be detected at a high response speed, safety control such as stopping or decelerating the work machine can also be performed quickly.
  • The object detection system 100 includes the sensor control unit 50, the sensors 10A and 10B, and the optical system 20.
  • The sensor control unit 50 is a control device including the control unit 30, the sensor I/F 14, the display unit 34, the operation input unit 35, the storage unit 36, and the communication I/F 37.
  • The control unit 30 includes a CPU 31, a ROM 32, a RAM 33, a timer (not shown), and the like.
  • The ROM 32 stores a control program executed by the CPU 31.
  • The RAM 33 provides a work area for the CPU 31 to execute the control program.
  • The display unit 34 is configured with an LCD or the like and displays various kinds of information.
  • The display unit 34 may have two or more screens, or a function of displaying two or more screens by splitting one screen.
  • The operation input unit 35 receives various instructions from the user and sends the input information to the CPU 31. The operation input unit 35 may also have a function of notifying the user by sound, lamp, or the like based on instructions from the CPU 31.
  • The storage unit 36 is composed of, for example, non-volatile memory.
  • The storage unit 36 may be an external memory.
  • The communication I/F 37 performs wired or wireless communication between the control unit 30 and external devices.
  • The CPU 31 drives the sensors 10A and 10B via the sensor I/F 14 and controls the timing of their measurement operations. As described above, the CPU 31 controls the sensors 10A and 10B so that each of them in turn outputs measurement results at equal time intervals (period T2) (see FIG. 4). In the overlapping space Rx (at least in the protection area 41; see FIG. 1), measurement results are obtained at a frame rate of period T2, enabling high-speed object detection.
  • The optical system 20 is an optical member designed to guide light incident from the measurement target range (also called the monitoring area), which includes the protection area 41, to each of the sensors 10A and 10B. Because of the reversibility of light, light emitted from the sensors 10A and 10B also illuminates the measurement target range via the optical system 20. Part or all of the fields of view of the sensors 10A and 10B can thereby be made common.
  • FIGS. 5 and 6 schematically show configuration examples of the optical system 20.
  • FIG. 5 shows a reflection type; FIG. 6 shows a beam splitter type.
  • In the reflection type of FIG. 5, for example, the optical system 20 has a lens 21 and a reflecting member 22, and the light L condensed by the lens 21 is guided to each of the sensors 10A and 10B via the reflecting member 22.
  • In the beam splitter type of FIG. 6, for example, the optical system 20 has a lens 21 and a beam splitter 23; the light L condensed by the lens 21 is split in two by the beam splitter 23 and guided to each of the sensors 10A and 10B. Adding further beam splitters to increase the number of splits of the light L makes it possible to provide three or more sensors.
  • FIG. 7 is a flowchart showing the sensor installation process and the protection area setting process. The sensor installation process is carried out by the user.
  • The protection area setting process is realized by the CPU 31 loading a program stored in the ROM 32 into the RAM 33 and executing it.
  • The protection area setting process is started by a user instruction.
  • First, in step S101, the user places the markers M1 to M4 at the positions of the vertices of the bottom face of the desired protection area, on the floor surface serving as the reference plane.
  • The markers M1 to M4 are, for example, stickers, but may be three-dimensional objects. Their brightness should be high enough for reliable distance measurement. If the desired protection area is a rectangular parallelepiped, four markers M1 to M4 are used.
  • The reference plane on which the markers M1 to M4 are placed is not limited to the floor; it may be any flat surface such as a desk top.
  • In step S102, the user installs the sensors 10A and 10B or, if they are already installed, adjusts their positions as necessary.
  • The positions of the sensors 10A and 10B include not only their three-dimensional positions but also their orientations.
  • The user determines the positions and orientations of the sensors 10A and 10B so that the markers M1 to M4 fall within the field of view (angle of view) of each sensor.
  • In step S103, the user instructs the CPU 31 to perform distance measurement with the sensors 10A and 10B (a distance measurement instruction).
  • Distance measurement here includes the acquisition of luminance information as well as distance information.
  • In step S201, the CPU 31 waits for a distance measurement instruction from the user; when one arrives, the process proceeds to step S202.
  • In step S202, the CPU 31 causes the sensors 10A and 10B to perform distance measurement. Their measurement results are supplied to the CPU 31 alternately.
  • When the CPU 31 has obtained measurement results for a certain period, it executes the protection area setting process in step S203. In this process, the CPU 31 first displays a luminance image of the monitoring area on the screen, based on the luminance information included in each sensor's measurement results.
  • If the measurable ranges (fields of view) of the sensors 10A and 10B match, the CPU 31 may display the luminance image obtained from the sensor 10A and/or the sensor 10B on the screen as it is. If they do not completely match, it is preferable to display a luminance image processed so that the overlapping space Rx can be visually recognized: for example, a graphic indicating the boundary of the region corresponding to Rx may be superimposed on the luminance image, the region outside Rx (the background region) may be masked (as in the sketch below), or an image cropped to the region corresponding to Rx may be displayed.
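The masking variant could be realized, for example, by dimming every luminance pixel outside the region corresponding to Rx. The sketch below assumes a precomputed boolean pixel mask for Rx; how Rx maps onto sensor pixels depends on the installation geometry, which is not detailed here.

```python
import numpy as np

def highlight_overlap(luminance: np.ndarray, rx_mask: np.ndarray,
                      dim_factor: float = 0.25) -> np.ndarray:
    """Dim everything outside the overlapping space Rx so that the
    region usable for the protection area stands out on screen.
    `rx_mask` is a boolean image, True where a pixel views Rx."""
    out = luminance.astype(np.float32)
    out[~rx_mask] *= dim_factor  # suppress the background region
    return out.astype(luminance.dtype)

# For the cropping variant, display only the bounding box of rx_mask
# instead of dimming the background.
```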
  • FIG. 8 is a diagram showing a display example of a luminance image of the monitoring area.
  • In the luminance image, the portions corresponding to the markers M1 to M4 appear particularly bright. From the four bright positions on the screen, the user can recognize the positions of the vertices of the bottom face of the protection area.
  • In step S104, the user teaches the positions of the markers M1 to M4 to the control unit 30 by designating, on each screen, the positions where the markers are displayed.
  • Any method of designating the positions may be used; for example, the user may move the mouse cursor to the corresponding position and press a confirm button, or may enter the on-screen coordinates numerically.
  • The user also provides the control unit 30 with the desired height H of the protection area.
  • The CPU 31 calculates the three-dimensional coordinates of the markers M1 to M4 in the global coordinate system (XYZ) from the distance image based on the distance information obtained from the sensor 10A (or 10B), and determines a protection area 41 whose rectangular bottom face is defined by the markers M1 to M4 and whose height is H.
  • In this case, the protection area 41 is a hexahedron defined by eight vertices, which can be constructed as in the sketch below.
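Concretely, once the four base vertices are known in the global coordinate system, the eight vertices follow by extruding the base along the height axis. A minimal sketch, assuming the global Z axis is normal to the reference (floor) plane:

```python
def protection_area_vertices(base_xy, height_h: float):
    """Eight vertices of a hexahedral protection area, built from the
    marker positions M1..M4 on the reference plane (z = 0) and the
    desired height H. Assumes Z is normal to the floor."""
    bottom = [(x, y, 0.0) for x, y in base_xy]    # vertices on the floor
    top = [(x, y, height_h) for x, y in base_xy]  # same footprint at z = H
    return bottom + top

# Example: a 2 m x 1 m footprint extruded to H = 1.8 m gives the eight
# corners of a rectangular-parallelepiped protection area.
verts = protection_area_vertices([(0, 0), (2, 0), (2, 1), (0, 1)], 1.8)
```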
  • FIG. 9 is a diagram showing a display example of the protection area 41.
  • The CPU 31 displays the protection area 41 three-dimensionally on the screen 34A, together with the reference plane 40 (the floor surface), as seen from the angle of view of the sensor 10A. The user can thereby visually confirm the position of the protection area 41 for the current installation state of the sensors 10A and 10B.
  • In step S105, the user looks at the protection area 41 displayed on the screen 34A and, if it is appropriate, inputs an OK instruction to set the protection area 41; otherwise, the user inputs a redo instruction.
  • In step S205, the CPU 31 waits for an OK or redo instruction from the user. If a redo instruction is input, the process returns from step S105 to step S101 and from step S205 to step S201, so the user can redo the marker placement in step S101 and the sensor placement in step S102.
  • The object detection system 100 then waits for a distance measurement instruction and executes distance measurement again.
  • In this way, the protection area 41 can be set by the user placing the markers, performing distance measurement, and then designating the vertices on the luminance image.
  • Although the markers M1 to M4 and the height information are used to set the protection area 41 here, the present invention is not limited to this; for example, markers corresponding to all eight vertices may be used.
  • As described above, the CPU 31 shifts the output timings of the distance information from the sensors 10A and 10B, whose measurable areas RA and RB at least partially overlap. The response speed of object detection can thereby be increased.
  • Any type of three-dimensional distance sensor may be adopted as long as it optically measures distance information and outputs it periodically.
  • Either an active or a passive system may be used (in the passive case, the light emitting unit may be omitted), and any measurement principle may be used, such as a TOF camera, a stereo camera, structured illumination, or LiDAR. When the TOF method is adopted, either the direct or the indirect type may be used.
  • The electromagnetic waves used are not limited to visible and infrared light; radio waves or X-rays may also be used.
  • In the above embodiment, the sensors 10A and 10B were controlled to output their measurement results in turn at equal time intervals (period T2). Even if the output timings are not equally spaced, as long as they are offset, the response speed of object detection still improves compared with detection by a single sensor. Likewise, although the frame period T1 is common to the sensors 10A and 10B, it need not be.
  • That said, the response speed is increased most efficiently by controlling the sensors so that their outputs are equally spaced in time.
  • The shape of the protection area is not limited to a polyhedron such as a rectangular parallelepiped.
  • The number of markers used to place the vertices of the bottom face of the protection area need not be four; any number of three or more may be used. If a mark that can substitute for a marker already exists on the floor, there is no need to place a new marker: for example, if a figure or pattern serving as a mark is drawn on the floor, or a distinctive object is present, and the user can visually match it when teaching the coordinates, it can take the place of a marker.
  • In the above embodiment, markers are used to set the protection area, but other methods may be used.
  • For example, when the relative positional relationship between the sensors 10A and 10B is known (for example, when the sensors are fixed to each other and information on their relative positions is preset in the ROM at product shipment), a three-dimensional coordinate system based on the floor surface can be defined by geometrically calculating the positional relationship between the sensors and the floor.
  • In that case, the user may directly input the coordinates of the n vertices P1 (x1, y1, z1) to Pn (xn, yn, zn) of the protection area.
  • The sensor mounting information may be input by the user, measured by the user with an instrument such as a tape measure or protractor, or obtained from built-in components (a MEMS gyro, an angle sensor, etc.). If the sensors 10A and 10B incorporate components for measuring their own positions and orientations, the floor surface and the protection area can be corrected automatically when the positions and orientations of the sensors change.
  • <Appendix>
  • [1] An object detection system (100) comprising: a plurality of imaging means (10A, 10B) that periodically output distance information; an object detection means (30) that detects an object based on the distance information output by the plurality of imaging means (10A, 10B); an optical system (20) for optically overlapping at least part of the measurable regions (RA, RB) of the plurality of imaging means (10A, 10B) with each other; and a control means (30) that controls the output timings of the distance information by the plurality of imaging means (10A, 10B) so that they are shifted from each other.
  • [2] A control device (50) for controlling operation in a safety control system comprising a work machine, a plurality of imaging means (10A, 10B) each of which periodically captures images at a predetermined time interval, an optical system (20) for optically overlapping at least part of the measurable regions of the plurality of imaging means with each other, and an object detection means (30) that detects an object based on the distance information output from the plurality of imaging means (10A, 10B), the control device (50) comprising: an imaging control means (30) that controls the plurality of imaging means (10A, 10B) so that their imaging timings are shifted from each other by a time shorter than the time interval; and a safety control means that restricts the operation of the work machine based on the detection result of the object detection means (30).
  • Reference signs: 10A, 10B: three-dimensional distance sensor; 20: optical system; 30: control unit; 31: CPU; 41: protection area; 100: object detection system; RA, RB: measurable area; Rx: overlapping space; T1: frame period; T2: period

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)
  • Manipulator (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

This object detection system comprises: a plurality of imaging means that periodically output distance information; an object detection means that detects an object on the basis of the distance information output from the plurality of imaging means; an optical system for at least partially optically overlapping the measurable regions of the plurality of imaging means with each other; and a control means that controls the output timings of the distance information by the plurality of imaging means so that the timings are staggered.

Description

Object detection system and control device
The present invention relates to object detection systems and control devices.
At many production sites, robots work in collaboration with workers, and various measures are taken to ensure worker safety. For example, based on safety standards, a virtual protection area is set near a danger source such as a robot, and when an object such as a human body enters the protection area, the danger source is driven at low speed or stopped. An area sensor, for example, is used for object detection.
The protection area is set in consideration of the time required from object detection to completion of deceleration or stopping of the danger source. That is, the size of the protection area is determined by the safe distance from the danger source (the distance that guarantees that the danger source completes deceleration and stopping before an intruding object reaches it). The safe distance, in turn, is determined in consideration of the response speed of the sensor and the operating speed and braking performance of the danger source. From the viewpoint of productivity, the protection area should be as narrow as possible; from the viewpoint of safety, the shorter the time from an object actually entering the protection area to its detection, the better. An object entering the protection area must therefore be detected quickly.
JP 2016-197403 A
However, there is a performance limit to the speed of an area sensor (the output cycle of its distance information). In some cases, a high-performance sensor cannot be adopted for cost reasons. The output cycle of the area sensor then becomes the bottleneck, and the setting of the protection area and of the operating speed of the danger source may be constrained. There is therefore room for improvement in increasing the response speed of object detection.
The inventors are considering adopting a three-dimensional distance sensor as the area sensor in order to detect objects in three-dimensional space with high accuracy. With such a sensor, a certain amount of processing time is needed to acquire the three-dimensional distance information, which makes the above problem more pronounced.
Patent Document 1 discloses a technique for improving the temporal resolution of detection by using two sensors, such as magnetic sensors or torque sensors, in combination. However, the device of Patent Document 1 does not relate to area sensors.
An object of the present invention is to provide a technique for increasing the response speed of object detection.
To achieve this object, the present invention adopts the following configuration.
A first aspect of the present invention provides an object detection system comprising: a plurality of imaging means that periodically output distance information; an object detection means that detects an object based on the distance information output by the plurality of imaging means; an optical system for optically overlapping at least part of the measurable regions of the plurality of imaging means with each other; and a control means that controls the output timings of the distance information by the plurality of imaging means so that they are shifted from each other.
The "imaging means" is, for example, a TOF (Time of Flight) sensor. The "measurable area" is, for example, the three-dimensional space within the field of view (angle-of-view range) of each imaging means, although its depth may be limited to the range over which an accurate distance can be measured.
With this configuration, the output timings of the distance information from the plurality of imaging means are shifted from each other, so distance information is obtained at staggered timings and the apparent response speed increases. The response speed of object detection can therefore be raised within the region (overlapping space) where the measurable regions of the plurality of imaging means overlap.
The configuration of the optical system is arbitrary. For example, the optical system may have a lens that condenses incident light and a reflecting member that reflects the condensed light and guides it to each of the plurality of imaging means. Alternatively, the optical system may have a lens that condenses incident light and a beam splitter that splits the condensed light and guides it to each of the plurality of imaging means.
The output cycle of the distance information may be common to all the imaging means, and the control means may control the plurality of imaging means so that the distance information is output at equal time intervals. Specifically, when the number of imaging means is n (n being an integer of 2 or more) and the output cycle is T, the control means may control the n imaging means so that the distance information is output at time intervals of T/n. For example, when two imaging means are used, the output timings of the distance information from the two are controlled to alternate. The response speed can thereby be increased efficiently.
The system may further have setting means for setting a virtual protection area for object detection in the overlapping space where the measurable regions of the plurality of imaging means overlap. A high response speed can thereby be secured in the protection area.
The imaging means may be a TOF sensor.
A second aspect of the present invention provides a control device for controlling operation in a safety control system comprising a work machine, a plurality of imaging means each of which periodically captures images at a predetermined time interval, an optical system for optically overlapping at least part of the measurable regions of the plurality of imaging means with each other, and an object detection means that detects an object based on the distance information output from the plurality of imaging means, the control device comprising: an imaging control means that controls the plurality of imaging means so that their imaging timings are shifted from each other by a time shorter than the time interval; and a safety control means that restricts the operation of the work machine based on the detection result of the object detection means.
A "work machine" is, for example, a robot or manufacturing equipment. The "imaging means" is, for example, a TOF (Time of Flight) sensor.
With this configuration, the plurality of imaging means are controlled so that their imaging timings are shifted from each other, so images from the individual imaging means are obtained at time intervals shorter than the imaging cycle, raising the apparent imaging rate. The response speed of object detection and safety control can therefore be increased.
According to the present invention, the response speed of object detection can be increased.
FIG. 1 is a schematic diagram of an object detection system according to one embodiment of the present invention.
FIG. 2 is a schematic diagram showing the measurable area of one three-dimensional distance sensor.
FIG. 3 is a block diagram of the object detection system.
FIG. 4 is a timing chart showing the measurement operations of two three-dimensional distance sensors.
FIG. 5 is a schematic diagram showing a configuration example of the optical system.
FIG. 6 is a schematic diagram showing a configuration example of the optical system.
FIG. 7 is a flowchart showing the sensor installation process and the protection area setting process.
FIG. 8 is a diagram showing an example of a luminance image displayed on the screen.
FIG. 9 is a diagram showing a display example of the protection area.
<Application example>
An application example of the object detection system according to the present invention will be described with reference to FIGS. 1 to 4. FIG. 1 is a schematic diagram of an object detection system 100 according to one embodiment of the invention. FIG. 2 is a schematic diagram showing the measurable area of one three-dimensional distance sensor. FIG. 3 is a block diagram of the object detection system 100. FIG. 4 is a timing chart showing the measurement operations of two three-dimensional distance sensors 10A and 10B (hereinafter, the sensors 10A and 10B).
The sensors 10A and 10B have a common configuration. As shown in FIG. 2, the three-dimensional shape of the substantial measurable area RA of the sensor 10A is, as an example, roughly a truncated quadrangular pyramid, because within the field of view of the sensor 10A the range over which an accurate distance can be measured is limited. The three-dimensional shape of the substantial measurable area RB of the sensor 10B is the same as that of the measurable area RA. As shown in FIG. 1, the measurable area RA of the sensor 10A and the measurable area RB of the sensor 10B overlap at least partially. The three-dimensional region where the measurable areas RA and RB overlap is the overlapping space Rx. The protection area 41 is preferably set inside the overlapping space Rx.
Here, the protection area 41 is a virtual three-dimensional region for object detection, set in the vicinity of or around a work machine such as a robot or manufacturing equipment (hereinafter also referred to as a "danger source"). The protection area 41 is defined according to safety standards, taking the operating range of the danger source into account. For example, when an object such as a human body enters the protection area 41, safety control such as driving the danger source at low speed or stopping it is performed.
As shown in FIG. 3, the sensors 10A and 10B each include a light emitting unit 11, a light receiving unit 12, and a computing unit 13. The light emitting unit 11 emits light (for example, infrared light), and the light receiving unit 12 receives the reflected light. As an example, the sensors 10A and 10B are TOF sensors that acquire a range image from the time of flight (TOF) of light; for example, indirect TOF sensors that estimate the time difference from the phase difference between the projected light and the reflected light are employed. The sensors 10A and 10B periodically output three-dimensional distance information and luminance information as measurement results. The measurement results are supplied to the control unit 30 via the sensor I/F 14 in the sensor control unit 50, and the sensors 10A and 10B are controlled by the control unit 30 via the sensor I/F 14.
As shown in FIG. 4, the sensors 10A and 10B each periodically repeat an imaging process performed by the light emitting unit 11 and the light receiving unit 12 and an arithmetic process (including data transfer) performed by the computing unit 13. One imaging process and one arithmetic process are performed within the frame period T1, which is one cycle (the output cycle). In the imaging process, in addition to light emission by the light emitting unit 11, exposure, readout of the charges received by the light receiving unit 12, and the like are executed; these operations may be performed multiple times within one imaging process. In the arithmetic process, time is estimated indirectly from the measured amount of light, distance information is generated from the estimated time, and so on. After the arithmetic process, the measurement results (distance information and luminance information) are output.
Here, the control unit 30, which is an example of the control means, controls the sensors 10A and 10B so that their measurement-result output timings are shifted from each other. As an example, the control unit 30 controls the sensors 10A and 10B so that their measurement results are output at equal time intervals. With two sensors, the period T2, which is the interval between an output from the sensor 10A and an output from the sensor 10B, is set to T1/2; more generally, with n sensors (n being an integer of 2 or more), the sensors should be controlled so that measurement results are output at time intervals of T1/n. In the parts of the measurable areas RA and RB outside the overlapping space Rx, a measurement result is obtained from only one of the sensors 10A and 10B, so the acquisition cycle is the frame period T1. In the overlapping space Rx, however, measurement results are obtained from both sensors, i.e., at the period T2, so the response speed is apparently doubled. In other words, the sensor 10B captures images while the sensor 10A is performing its arithmetic process, which substantially raises the frame rate. By setting the protection area 41 so that it is contained in the overlapping space Rx, the response speed within the protection area 41 can be reliably increased.
The optical system 20 is a member for optically overlapping at least part of the measurable areas RA and RB of the sensors 10A and 10B with each other. The measurable areas RA and RB need not match perfectly, but a wide overlapping space Rx is preferable, so the optical system 20 should be designed so that the measurable areas (fields of view) of the sensors overlap as much as possible. In practice, for example, the measurable regions of the sensors should overlap by 80% or more, preferably 90% or more, and more preferably substantially 100%. The structure and optical components of the optical system 20 may be selected according to the characteristics of the measurement light used by the sensors 10A and 10B; for example, a visible-light optical system for sensors using visible light, or an infrared optical system for sensors using infrared light.
The control unit 30, as an example of the object detection means, detects an object based on the distance information output by the sensors 10A and 10B, which are an example of the imaging means. When either or both of the sensors 10A and 10B obtain distance information indicating a distance that falls within the measurable area, the control unit 30 can determine (detect) that there is an object in the measurable area RA or RB. Likewise, when either or both of the sensors obtain distance information indicating a distance that falls within the overlapping space Rx, the control unit 30 can determine that there is an object in the overlapping space Rx. In particular, since the protection area 41 lies within the overlapping space Rx, the control unit 30 can determine at a high response speed whether an object has entered the protection area 41. The control unit 30, as an example of the safety control means, executes safety control that restricts the operation of the work machine (danger source) based on the object detection result. Since objects in the protection area 41 can be detected at a high response speed, safety control such as stopping or decelerating the work machine can also be performed quickly.
The above application example is provided to aid understanding of the present invention and is not intended to limit its interpretation.
<Embodiment>
Next, the configuration of the object detection system 100 and the procedure for setting the protection area will be described in detail. First, the overall configuration of the object detection system 100 will be described with reference to FIG. 3.
The object detection system 100 includes the sensor control unit 50, the sensors 10A and 10B, and the optical system 20. The sensor control unit 50 is a control device including the control unit 30, the sensor I/F 14, the display unit 34, the operation input unit 35, the storage unit 36, and the communication I/F 37. The control unit 30 includes a CPU 31, a ROM 32, a RAM 33, a timer (not shown), and the like. The ROM 32 stores a control program executed by the CPU 31, and the RAM 33 provides a work area for executing it.
The display unit 34 is configured with an LCD or the like and displays various kinds of information; it may have two or more screens, or a function of displaying two or more screens by splitting one screen. The operation input unit 35 receives various instructions from the user and sends the input information to the CPU 31; it may also have a function of notifying the user by sound, lamp, or the like based on instructions from the CPU 31. The storage unit 36 is composed of, for example, non-volatile memory, and may be an external memory. The communication I/F 37 performs wired or wireless communication between the control unit 30 and external devices.
The CPU 31 drives the sensors 10A and 10B via the sensor I/F 14 and controls the timing of their measurement operations. As described above, the CPU 31 controls the sensors 10A and 10B so that each of them in turn outputs measurement results at equal time intervals (period T2) (see FIG. 4). In the overlapping space Rx (at least in the protection area 41; see FIG. 1), measurement results are obtained at a frame rate of period T2, enabling high-speed object detection.
<Optical system>
The optical system 20 is an optical member designed to guide light incident from the measurement target range (also called the monitoring area), which includes the protection area 41, to each of the sensors 10A and 10B. Because of the reversibility of light, light emitted from the sensors 10A and 10B also illuminates the measurement target range via the optical system 20. Part or all of the fields of view of the sensors 10A and 10B can thereby be made common.
FIGS. 5 and 6 schematically show configuration examples of the optical system 20. FIG. 5 shows a reflection type, and FIG. 6 shows a beam splitter type.
In the reflection type of FIG. 5, for example, the optical system 20 has a lens 21 and a reflecting member 22, and the light L condensed by the lens 21 is guided to each of the sensors 10A and 10B via the reflecting member 22. In the beam splitter type of FIG. 6, for example, the optical system 20 has a lens 21 and a beam splitter 23; the light L condensed by the lens 21 is split in two by the beam splitter 23 and guided to each of the sensors 10A and 10B. Adding further beam splitters 23 to increase the number of splits of the light L makes it possible to provide three or more sensors.
 <防護エリア設定処理>
 次に、防護エリアの設定処理の一例を説明する。ここでは、マーカと呼ばれる物理的な指標を用いて3次元空間内に防護エリアを設定する例を述べる。
<Protection area setting processing>
Next, an example of protection area setting processing will be described. Here, an example of setting a protection area in a three-dimensional space using physical indices called markers will be described.
 図7は、センサ設置処理および防護エリア設定処理を示すフローチャートである。このセンサ設置処理はユーザによって実行される。また、防護エリア設定処理は、ROM32に格納されたプログラムをCPU31がRAM33に展開して実行することにより実現される。防護エリア設定処理は、ユーザの指示により開始される。 FIG. 7 is a flowchart showing sensor installation processing and protection area setting processing. This sensor installation processing is executed by the user. The protection area setting process is realized by the CPU 31 developing a program stored in the ROM 32 in the RAM 33 and executing the program. The protected area setting process is started by a user's instruction.
 まず、ステップS101では、ユーザは、複数のマーカM1~M4を、基準面としての床面上における、所望する防護エリアの底面の頂点の位置に配置する。マーカM1~M4としては例えば、シールなどが採用されるが、立体物であってもよい。測距を確実に行うことができる程度にマーカM1~M4の明度は高い方がよい。所望する防護エリアが直方体である場合、4つのマーカM1~M4を用いる。なお、マーカM1~M4を設置する基準面は床面に限らず、机面等の平坦面であってもよい。 First, in step S101, the user places a plurality of markers M1 to M4 at the apex positions of the bottom surface of the desired protection area on the floor surface as the reference surface. As the markers M1 to M4, for example, stickers are adopted, but they may be three-dimensional objects. It is preferable that the brightness of the markers M1 to M4 is high enough to reliably perform distance measurement. If the desired protected area is a cuboid, four markers M1-M4 are used. The reference surface on which the markers M1 to M4 are placed is not limited to the floor surface, and may be a flat surface such as a desk surface.
 In step S102, the user installs the sensors 10A and 10B or, if they are already installed, adjusts their positions as necessary. The positions of the sensors 10A and 10B include not only their three-dimensional locations but also their orientations. The user sets the positions and orientations of the sensors 10A and 10B so that the markers M1 to M4 fall within the field of view (angle of view) of each sensor. In step S103, the user instructs the CPU 31 to perform distance measurement with the sensors 10A and 10B (distance measurement instruction). Distance measurement here includes acquiring luminance information as well as distance information.
 In step S201, the CPU 31 waits for a distance measurement instruction from the user; when one is received, the process proceeds to step S202. In step S202, the CPU 31 causes the sensors 10A and 10B to perform distance measurement, and the measurement results are supplied to the CPU 31 alternately from the two sensors. When the CPU 31 has obtained measurement results for a certain period, it executes the protection area setting processing in step S203. In this processing, the CPU 31 first displays a luminance image of the monitoring area on the screen, based on the luminance information output by each sensor as part of the measurement results.
 When the measurable ranges (fields of view) of the sensors 10A and 10B coincide, the CPU 31 may display the luminance image obtained from the sensor 10A and/or the sensor 10B on the screen as it is. When the measurable ranges do not coincide completely, it is preferable to display a luminance image processed so that the overlapping space Rx of the sensors 10A and 10B can be visually identified. For example, a graphic indicating the boundary of the region corresponding to the overlapping space Rx may be superimposed on the luminance image obtained from the sensor 10A or 10B, or the region outside the overlapping space Rx (the background region) may be masked. Alternatively, an image cropped to only the region corresponding to the overlapping space Rx may be displayed.
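 As a minimal sketch of the masking and cropping options just described, the following Python snippet assumes a hypothetical boolean array `overlap_mask` marking the pixels of the luminance image that correspond to the overlapping space Rx; deriving that mask from the calibration of the two fields of view is outside the scope of this sketch.

```python
import numpy as np

def mask_background(luminance: np.ndarray, overlap_mask: np.ndarray) -> np.ndarray:
    """Return a copy with every pixel outside the overlapping space Rx blacked out."""
    out = luminance.copy()
    out[~overlap_mask] = 0
    return out

def crop_to_overlap(luminance: np.ndarray, overlap_mask: np.ndarray) -> np.ndarray:
    """Return the image cropped to the bounding box of the overlapping space Rx."""
    rows, cols = np.nonzero(overlap_mask)
    return luminance[rows.min():rows.max() + 1, cols.min():cols.max() + 1]
```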
 FIG. 8 shows a display example of the luminance image of the monitoring area. In the luminance image, the portions corresponding to the markers M1 to M4 appear particularly bright, so the user can identify the positions of the vertices of the bottom face of the protection area on the screen from these four bright spots.
 Once the luminance image is displayed, in step S104 the user teaches the positions of the markers M1 to M4 to the control unit 30 by designating, on the screen, the positions where the markers are displayed. Any method of designating a position may be used; for example, the user may move the mouse cursor to the relevant position and press a confirm button, or enter the on-screen coordinates numerically. The user also provides the control unit 30 with the desired height H of the protection area.
 Given the positions of the markers M1 to M4 and the height information, the CPU 31 calculates the three-dimensional coordinates of the markers M1 to M4 in the global coordinate system (XYZ) from the distance image based on the distance information obtained from the sensor 10A (or 10B), and determines the protection area 41 as a volume of height H whose bottom face is the rectangle defined by the markers M1 to M4. In this example, the protection area 41 is a hexahedron defined by eight vertices.
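 The construction of the eight vertices from the four base markers and the height H is straightforward; the sketch below is a non-authoritative illustration that assumes the marker coordinates have already been converted to the global coordinate system with Z pointing up from the floor.

```python
import numpy as np

def protection_area_vertices(markers_xyz: np.ndarray, height: float) -> np.ndarray:
    """Given the 4 bottom-face vertices (shape (4, 3)) measured at the markers
    M1..M4, return the 8 vertices of the hexahedral protection area 41."""
    bottom = np.asarray(markers_xyz, dtype=float)        # vertices on the floor
    top = bottom + np.array([0.0, 0.0, height])          # same points lifted by H
    return np.vstack([bottom, top])                      # shape (8, 3)

# Example: a 2 m x 1 m base with a 1.5 m high protection area
m = np.array([[0, 0, 0], [2, 0, 0], [2, 1, 0], [0, 1, 0]], dtype=float)
print(protection_area_vertices(m, 1.5))
```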
 In step S204, the CPU 31 displays the protection area 41 thus set. FIG. 9 shows a display example of the protection area 41. As one example, the CPU 31 displays the protection area 41 three-dimensionally, together with the reference surface 40 (the floor), on a screen 34A rendered from the angle of view of the sensor 10A. This lets the user visually confirm the position of the protection area 41 under the current installation state of the sensors 10A and 10B.
 In step S105, the user examines the protection area 41 displayed on the screen 34A. If it is appropriate, the user inputs an OK instruction confirming the protection area 41; if it is not appropriate and the setup should be redone, the user inputs a redo instruction. Meanwhile, in step S205, the CPU 31 waits for either an OK instruction or a redo instruction from the user. If a redo instruction is input, the process returns from step S105 to step S101 and from step S205 to step S201. The user can then redo the marker placement in step S101 and, if necessary, the installation of the sensors 10A and 10B in step S102; the object detection system 100 waits for a new distance measurement instruction and performs distance measurement again.
 If an OK instruction is input, the processing shown in FIG. 7 ends. The setting of the protection area 41 is then finalized, and the CPU 31 can move on to subsequent processing such as object detection and safety control.
 With this method, after the user places the markers and has the distances measured, the protection area 41 is set simply by designating the vertices on the luminance image, which assists the user in the task of setting the protection area 41. Although the markers M1 to M4 and the height information are used here to set the protection area 41, the method is not limited to this; markers corresponding to all eight vertices may be used instead.
 According to the present embodiment, the CPU 31 controls the sensors 10A and 10B, whose measurable regions RA and RB at least partly overlap, so that their output timings for the distance information are shifted relative to each other. This increases the response speed of object detection.
 In particular, since each of the sensors 10A and 10B is controlled to output distance information in turn at equal time intervals (T2 = T1/2), the response speed is increased efficiently. Moreover, by arranging the sensors 10A and 10B so that the protection area 41 is contained in the overlapping space Rx where their measurable regions overlap, a high response speed can be ensured throughout the protection area 41.
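 As a minimal sketch of this staggered-output scheme (assuming a hypothetical trigger interface; the actual sensor I/F is not specified here), the following Python snippet schedules n sensors sharing a common frame period T1 so that their outputs arrive at equal intervals of T1/n, which for two sensors gives the period T2 = T1/2 described above.

```python
import threading
import time

FRAME_PERIOD_T1 = 0.10  # common frame period of each sensor, in seconds (assumed)

def run_sensor(sensor_id: int, n_sensors: int, stop: threading.Event) -> None:
    """Trigger one sensor every T1 seconds, phase-shifted by sensor_id * T1/n."""
    time.sleep(sensor_id * FRAME_PERIOD_T1 / n_sensors)  # stagger the start
    while not stop.is_set():
        # A real system would read a distance frame here instead of printing.
        print(f"sensor {sensor_id}: distance frame at t={time.monotonic():.3f}")
        time.sleep(FRAME_PERIOD_T1)  # sketch only; ignores scheduling drift

stop = threading.Event()
threads = [threading.Thread(target=run_sensor, args=(i, 2, stop)) for i in range(2)]
for t in threads:
    t.start()
time.sleep(0.5)  # observe ~0.05 s spacing between the two sensors' outputs
stop.set()
```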
 <Modification>
 The above embodiment is merely an illustrative example of a configuration of the present invention. The present invention is not limited to the specific forms described above, and various modifications are possible within the scope of its technical idea.
 As far as the effect of increasing the response speed of object detection by staggering the output timings of the distance information is concerned, any type of three-dimensional distance sensor may be used, provided that it optically measures distance information and outputs that information periodically. Either an active or a passive system may be used (a passive system needs no light emitter), and any measurement principle may be employed, such as a TOF camera, a stereo camera, structured illumination, or LiDAR. When the TOF method is adopted, it may be either the direct or the indirect type. Furthermore, the light used is not limited to visible or infrared light and may include radio waves or X-rays.
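 For reference, for the indirect TOF type just mentioned, distance is conventionally recovered from the phase shift of an amplitude-modulated signal; the sketch below illustrates that standard relationship only (it is not part of this disclosure, and the 20 MHz modulation frequency is an arbitrary assumption).

```python
import math

C = 299_792_458.0  # speed of light, m/s

def indirect_tof_distance(phase_shift_rad: float, f_mod_hz: float) -> float:
    """Distance from the phase shift of a modulated signal:
    d = c * dphi / (4 * pi * f_mod); unambiguous up to c / (2 * f_mod)."""
    return C * phase_shift_rad / (4.0 * math.pi * f_mod_hz)

# Example: a phase shift of pi/2 at a 20 MHz modulation frequency
print(indirect_tof_distance(math.pi / 2, 20e6))  # ~1.87 m
```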
 As described with reference to FIG. 4, the sensors 10A and 10B are controlled so that each outputs its measurement results in turn at equal time intervals (period T2). Even if the output timings are not equally spaced, as long as they are offset, the response speed of object detection is still improved compared with detection using a single sensor. Likewise, although the frame period T1 is common to the sensors 10A and 10B here, it need not be.
 The control that staggers the output timings is also applicable when three or more three-dimensional distance sensors are used; controlling the sensors so that their output timings are equally spaced increases the response speed efficiently.
 The shape of the protection area is not limited to a polyhedron such as a rectangular parallelepiped. Nor does the number of markers placed at the vertices of the bottom face of the protection area have to be four; any number of three or more may be used. If marks that can substitute for the markers already exist on the floor, there is no need to place new ones: for example, a figure or pattern drawn on the floor, or a distinctive object, can serve in place of a marker as long as the user can visually align it when teaching the coordinates.
 Although markers are used to set the protection area in the above embodiment, the protection area may be set in other ways. For example, when the relative positional relationship between the sensors 10A and 10B is known (e.g., when the sensors 10A and 10B are fixed to each other and information on their relative positions is preset in the ROM at the time of product shipment), the sensors can be installed and the sensor mounting information (the height from the floor to the sensor, the mounting angle of the sensor, and the like) supplied to the object detection system 100; the relative positional relationship between the sensors and the floor can then be computed geometrically, and a three-dimensional coordinate system referenced to the floor can be defined. In that floor-referenced coordinate system, the protection area may be specified by having the user input the coordinates of its n vertices P1(x1, y1, z1) to Pn(xn, yn, zn). The mounting information may be input by the user, measured by the user with instruments such as a tape measure or protractor, or measured by components built into the sensors 10A and 10B (a MEMS gyro, an angle sensor, and so on). If the sensors 10A and 10B incorporate components that measure their own position and attitude, the floor plane and the protection area can even be corrected automatically when the position or attitude of the sensors changes.
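 A minimal sketch of such a floor-referenced transform follows. It is hedged on simplifying assumptions not stated in the original: the sensor is tilted only about its X axis, its frame is otherwise aligned with the floor frame, and the mounting height and tilt angle are already known.

```python
import numpy as np

def sensor_to_floor(points_sensor: np.ndarray, mount_height: float, tilt_deg: float) -> np.ndarray:
    """Transform points (shape (N, 3)) from the sensor frame to a floor-referenced
    frame (Z = 0 on the floor, Z up), assuming a tilt about the sensor's X axis only."""
    t = np.radians(tilt_deg)
    rot_x = np.array([[1.0, 0.0, 0.0],
                      [0.0, np.cos(t), -np.sin(t)],
                      [0.0, np.sin(t),  np.cos(t)]])
    pts = np.asarray(points_sensor, dtype=float) @ rot_x.T  # undo the tilt
    pts[:, 2] += mount_height  # shift origin down to the floor plane
    return pts

# Example: one measured point, sensor mounted 2.5 m above the floor, tilted 30 degrees
print(sensor_to_floor(np.array([[0.0, 0.0, 3.0]]), 2.5, 30.0))
```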
 Note that, as far as the effect of increasing the response speed of object detection by staggering the output timings of the distance information is concerned, setting a protection area is not essential.
 <Appendix>
 [1] An object detection system (100) comprising:
 a plurality of imaging means (10A, 10B) that periodically output distance information;
 an object detection means (30) for detecting an object based on the distance information output by the plurality of imaging means (10A, 10B);
 an optical system (20) for optically overlapping at least part of the measurable regions (RA, RB) of the plurality of imaging means (10A, 10B); and
 a control means (30) for controlling the plurality of imaging means (10A, 10B) so that their output timings of the distance information are shifted relative to each other.
 [2] A control device (50) for controlling operation in a safety control system that includes: a working machine; a plurality of imaging means (10A, 10B), each of which periodically captures images at a predetermined time interval; an optical system for optically overlapping at least part of the measurable regions of the plurality of imaging means (10A, 10B); and an object detection means (30) for detecting an object based on distance information output from the plurality of imaging means (10A, 10B),
 the control device (50) comprising:
 an imaging control means (30) for controlling the plurality of imaging means (10A, 10B) so that their imaging timings are shifted relative to each other by a time shorter than the time interval; and
 a safety control means for restricting the operation of the working machine based on the detection result of the object detection means (30).
10A, 10B: three-dimensional distance sensor
20: optical system
30: control unit
31: CPU
41: protection area
100: object detection system
RA, RB: measurable region
Rx: overlapping space
T1: frame period
T2: period

Claims (8)

  1.  An object detection system comprising:
     a plurality of imaging means for periodically outputting distance information;
     an object detection means for detecting an object based on the distance information output by the plurality of imaging means;
     an optical system for optically overlapping at least part of the measurable regions of the plurality of imaging means; and
     a control means for controlling the plurality of imaging means so that their output timings of the distance information are shifted relative to each other.
  2.  The object detection system according to claim 1, wherein the optical system comprises a lens that condenses incident light and a reflecting member that reflects the light condensed by the lens and guides it to each of the plurality of imaging means.
  3.  The object detection system according to claim 1, wherein the optical system comprises a lens that condenses incident light and a beam splitter that splits the light condensed by the lens and guides it to each of the plurality of imaging means.
  4.  The object detection system according to claim 1, wherein the output period of the distance information is common to each of the imaging means, and the control means controls the plurality of imaging means so that the distance information is output from them at equal time intervals.
  5.  The object detection system according to claim 4, wherein, where n is the number of imaging means (n being an integer of 2 or more) and T is the output period, the control means controls the n imaging means so that the distance information is output from them at time intervals of T/n.
  6.  The object detection system according to any one of claims 1 to 5, further comprising a setting means for setting a virtual protection area for object detection in the overlapping space where the measurable regions of the plurality of imaging means overlap one another.
  7.  The object detection system according to any one of claims 1 to 6, wherein the imaging means is a TOF sensor.
  8.  A control device for controlling operation in a safety control system that includes: a working machine; a plurality of imaging means, each of which periodically captures images at a predetermined time interval; an optical system for optically overlapping at least part of the measurable regions of the plurality of imaging means; and an object detection means for detecting an object based on distance information output from the plurality of imaging means,
     the control device comprising:
     an imaging control means for controlling the plurality of imaging means so that their imaging timings are shifted relative to each other by a time shorter than the time interval; and
     a safety control means for restricting the operation of the working machine based on the detection result of the object detection means.
PCT/JP2021/047130 2021-03-08 2021-12-20 Object detection system and control device WO2022190548A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-036411 2021-03-08
JP2021036411A JP2022136687A (en) 2021-03-08 2021-03-08 Object detection system and control device

Publications (1)

Publication Number Publication Date
WO2022190548A1 true WO2022190548A1 (en) 2022-09-15

Family

ID=83226067

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/047130 WO2022190548A1 (en) 2021-03-08 2021-12-20 Object detection system and control device

Country Status (2)

Country Link
JP (1) JP2022136687A (en)
WO (1) WO2022190548A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08262341A (en) * 1995-03-27 1996-10-11 Olympus Optical Co Ltd Optical image transmitting device
JP2005271693A (en) * 2004-03-24 2005-10-06 Sumitomo Electric Ind Ltd Object detection device
JP2007189333A (en) * 2006-01-11 2007-07-26 Matsushita Electric Ind Co Ltd Photographing system
JP5493105B2 (en) * 2010-03-19 2014-05-14 オプテックス株式会社 Object dimension measuring method and object dimension measuring apparatus using range image camera

Also Published As

Publication number Publication date
JP2022136687A (en) 2022-09-21

Similar Documents

Publication Publication Date Title
US8816874B2 (en) Danger presentation device, danger presentation system, danger presentation method and program
US10776898B2 (en) Projection system, image processing device and projection method
US20150091446A1 (en) Lighting control console and lighting control system
CN111161358B (en) Camera calibration method and device for structured light depth measurement
JP2011070625A (en) Optical touch control system and method thereof
JP2021515881A (en) Positioning methods, positioning devices, and computer program products
JP2006018476A5 (en)
CN110389653B (en) Tracking system for tracking and rendering virtual objects and method of operation therefor
US20220088783A1 (en) Method and Apparatus for Manufacturing Line Simulation
US11937024B2 (en) Projection system, projection device and projection method
JP6538760B2 (en) Mixed reality simulation apparatus and mixed reality simulation program
WO2022190548A1 (en) Object detection system and control device
KR20140074588A (en) Robot having projection function and method for controlling projector in robot
TWI454996B (en) Display and method of determining a position of an object applied to a three-dimensional interactive display
EP3660452B1 (en) Positioning system and positioning method
KR20210060762A (en) 3-dimensional scanning system for inspection by pixels of display and the method thereof
WO2022190544A1 (en) Object detection system and control device
Vincze et al. What-You-See-Is-What-You-Get Indoor Localization for Physical Human-Robot Interaction Experiments
US11803109B2 (en) Projection method, projection device, and projection system
JP2018179654A (en) Imaging device for detecting abnormality of distance image
JP2021092442A (en) Detection system
US20220088784A1 (en) Method and Apparatus for Monitoring Robot System
WO2022190537A1 (en) Information processing device, information processing method, and program
US20240123611A1 (en) Robot simulation device
US20240123622A1 (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21930393

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21930393

Country of ref document: EP

Kind code of ref document: A1