CN111587448A - Remote accessory device identification apparatus - Google Patents


Info

Publication number
CN111587448A
Authority
CN
China
Prior art keywords
remote
camera
controller
posture
attachment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201880086353.4A
Other languages
Chinese (zh)
Inventor
羽马凉太
细幸广
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kobelco Construction Machinery Co Ltd
Original Assignee
Kobelco Construction Machinery Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kobelco Construction Machinery Co Ltd filed Critical Kobelco Construction Machinery Co Ltd
Publication of CN111587448A

Classifications

    • E: FIXED CONSTRUCTIONS
    • E02: HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02F: DREDGING; SOIL-SHIFTING
    • E02F3/00: Dredgers; Soil-shifting machines
    • E02F3/04: Dredgers; Soil-shifting machines mechanically-driven
    • E02F3/28: Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets
    • E02F3/36: Component parts
    • E02F3/96: Dredgers; Soil-shifting machines mechanically-driven with arrangements for alternate or simultaneous use of different digging elements
    • E02F9/00: Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/26: Indicating devices
    • E02F9/264: Sensors and their calibration for indicating the position of the work tool
    • E02F9/267: Diagnosing or detecting failure of vehicles
    • E02F9/28: Small metalwork for digging elements, e.g. teeth, scraper bits

Landscapes

  • Engineering & Computer Science (AREA)
  • Mining & Mineral Resources (AREA)
  • Civil Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Structural Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)
  • Shovels (AREA)
  • Component Parts Of Construction Machinery (AREA)
  • Manipulator (AREA)

Abstract

The working device 20 has a distal end portion to which a plurality of types of remote attachments 25 can be interchangeably attached. The camera 40 can capture images within the movable range of the remote attachment 25. The working device posture sensor 30 detects the posture of the working device 20. Based on the posture detected by the working device posture sensor 30, the controller 60 sets a detection frame (F), a frame enclosing a range that includes the remote attachment 25, in the image captured by the camera 40. The controller 60 then identifies the type of the remote attachment 25 based on the image of the remote attachment 25 inside the detection frame (F).

Description

Remote accessory device identification apparatus
Technical Field
The present invention relates to a remote attachment identification apparatus for identifying the type of a remote attachment of a construction machine.
Background
For example, Patent Document 1 discloses a technique for identifying a remote attachment (referred to simply as an "attachment" in that document; see claim 1 and elsewhere) in which a distance sensor measures a distance distribution covering the remote attachment, and the remote attachment is identified based on that distance distribution.
In the technique described in that document, the type of the remote attachment is identified based on the distance distribution, which requires a distance sensor to measure that distribution. A distance sensor, however, can be more costly than a monocular camera or the like. In addition, when the type of the remote attachment is identified, it is important to ensure the accuracy of the identification.
(Prior art documents)
(Patent documents)
Patent Document 1: Japanese Patent Laid-Open Publication No. 2017-157016
Disclosure of Invention
An object of the present invention is to provide a remote attachment identification device that can identify the type of a remote attachment with high accuracy even without using a distance distribution.
A remote attachment identification device according to an aspect of the present invention is provided for a construction machine including a lower traveling structure, an upper revolving structure provided above the lower traveling structure, and a working device that is attached to the upper revolving structure and has a distal end portion to which a plurality of types of remote attachments are interchangeably attached. The device includes: a camera attached to the upper revolving structure and capable of capturing images within the movable range of the remote attachment; a working device posture sensor that detects the posture of the working device; and a controller that, based on the posture detected by the working device posture sensor, sets a detection frame in a region including the remote attachment in the image captured by the camera, and identifies the type of the remote attachment based on the image of the remote attachment inside the detection frame.
According to this configuration, the type of the remote attachment can be identified with high accuracy without using a distance distribution.
Drawings
Fig. 1 is a schematic view of a construction machine 10 as viewed from the side.
Fig. 2 is a block diagram of the remote attachment identification device 1 provided on the construction machine 10 shown in fig. 1.
Fig. 3 is a flowchart of the remote attachment identification device 1 shown in fig. 2.
Fig. 4 is an image taken with the camera 40 shown in fig. 1.
Fig. 5 is an image taken with the camera 40 shown in fig. 1.
Fig. 6 shows a schematic view corresponding to fig. 1 in a state where the remote attachment 25 shown in fig. 1 enters the blind spot of the camera 40.
Detailed Description
The remote attachment identification device 1 shown in fig. 1 will be explained with reference to figs. 1 to 6.
The remote attachment identification device 1 is a device that automatically identifies the type of the remote attachment 25, and is provided on the construction machine 10. The construction machine 10 is a machine that performs work such as construction work; for example, a hydraulic excavator, a hybrid excavator, or a crane can be used. The construction machine 10 includes a lower traveling structure 11, an upper revolving structure 13, a working device 20, a working device posture sensor 30, and a camera 40, and further includes a monitor 50 and a controller 60 shown in fig. 2. In the present specification, the direction in which the upper revolving structure 13 faces is referred to as front, the opposite direction as rear, and the two collectively as the front-rear direction. Viewed from the rear toward the front, the left side is referred to as left and the right side as right, collectively the left-right direction. The direction perpendicular to both the front-rear and left-right directions is the up-down direction, with its upper side referred to as up and its lower side as down.
The lower traveling structure 11 is, for example, a crawler unit as shown in fig. 1, and causes the construction machine 10 to travel. The bottom surface 11b of the lower traveling structure 11 (the bottom surface of the construction machine 10) abuts the ground plane A. The upper revolving structure 13 is provided above the lower traveling structure 11 and can slew about a vertical axis relative to the lower traveling structure 11. The upper revolving structure 13 includes a cab 13c (operator's compartment).
The working device 20 is attached to the upper revolving structure 13 and performs work. The working device 20 includes a boom 21, an arm 23, and a remote attachment 25. The boom 21 is pivotally attached to the upper revolving structure 13. The arm 23 is pivotally attached to the boom 21.
The remote attachment 25 is provided at the distal end portion of the working device 20 and can be exchanged among a plurality of types. Examples of the remote attachment 25 include a bucket (the example shown in fig. 1), a dump box, a scissor-like device, a breaking hammer, and a magnet. The remote attachment 25 is pivotally attached to the arm 23. A position serving as a reference for the remote attachment 25 is set as a reference position 25b. The reference position 25b is a position determined independently of the type of the remote attachment 25; it is, for example, the base end portion of the remote attachment 25, such as the rotation axis (a bucket pin or the like) of the remote attachment 25 relative to the arm 23. In figs. 2 and 3, the remote attachment 25 is denoted "remote ATT". The boom 21, the arm 23, and the remote attachment 25 are driven by a boom cylinder (not shown), an arm cylinder (not shown), and an attachment cylinder (not shown), respectively.
The working device posture sensor 30 is a sensor that detects the posture of the working device 20 shown in fig. 1. The working device posture sensor 30 includes a boom angle sensor 31, an arm angle sensor 33, and a remote attachment angle sensor 35. The boom angle sensor 31 detects the angle of the boom 21 relative to the upper revolving structure 13 (the boom angle). The boom angle sensor 31 is configured, for example, by an angle sensor such as an encoder provided at the base end portion of the boom 21.
Alternatively, the boom angle sensor 31 may be configured by a sensor that detects the amount of extension of the cylinder that drives the boom 21. In this case, the boom angle sensor 31 may convert the cylinder extension into the angle of the boom 21 and output the angle to the controller 60, or it may output the detected extension to the controller 60 and let the controller 60 perform the conversion. This structure of detecting an angle via the cylinder extension is also applicable to the arm angle sensor 33 and the remote attachment angle sensor 35. The arm angle sensor 33 detects the angle of the arm 23 relative to the boom 21 (the arm angle). The remote attachment angle sensor 35 detects the angle of the remote attachment 25 relative to the arm 23 (the attachment angle).
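The cylinder-extension-to-angle conversion mentioned above can be sketched with elementary geometry: the cylinder and its two mounting pins form a triangle, so the law of cosines gives the joint angle. This is only an illustrative sketch; the parameter names and the fixed mounting offset are assumptions, not values from the patent.

```python
import math

def boom_angle_from_cylinder(stroke_m, base_len_m, pivot_a_m, pivot_b_m, angle_offset_rad):
    """Convert a boom-cylinder extension into a boom angle (radians).

    The cylinder mount on the upper revolving structure, the mount on the
    boom, and the boom foot pin form a triangle: pivot_a_m and pivot_b_m are
    the fixed distances from the boom foot pin to each cylinder mount, and
    base_len_m + stroke_m is the current cylinder length. The law of cosines
    gives the included angle at the boom foot pin; angle_offset_rad accounts
    for the fixed mounting geometry. All names are illustrative.
    """
    c = base_len_m + stroke_m  # current overall cylinder length
    cos_theta = (pivot_a_m**2 + pivot_b_m**2 - c**2) / (2 * pivot_a_m * pivot_b_m)
    cos_theta = max(-1.0, min(1.0, cos_theta))  # guard against rounding error
    return math.acos(cos_theta) - angle_offset_rad
```

The same triangle construction applies to the arm and attachment cylinders, with their own mounting dimensions.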
The camera 40 (imaging device) can capture images within the movable range of the remote attachment 25. The camera 40 captures the working device 20 and its surroundings, and is preferably able to capture the entire range assumed as the movable range of the remote attachment 25. The camera 40 is attached to the upper revolving structure 13, for example on the cab 13c (such as its upper left front part), but may instead be attached to a part of the upper revolving structure 13 other than the cab 13c. The camera 40 is fixed to the upper revolving structure 13, but may also be movable (for example, able to pan) relative to it. The camera 40 may be, for example, a monocular camera, which is preferable for keeping its cost down. The camera 40 preferably has a zoom function such as an optical zoom; specifically, its zoom position (focal length) is preferably continuously variable between the telephoto end and the wide-angle end. Fig. 1 shows an example of the angle of view 40a of the camera 40.
The monitor 50 displays various kinds of information. The monitor 50 may display an image captured by the camera 40, for example as shown in fig. 4, and may display the detection frame F (see fig. 4), whose details will be described later. The monitor 50 may also display the identification result of the type of the remote attachment 25.
The controller 60 (control unit) performs input and output of signals (information) and calculation (determination and computation) as shown in fig. 2. The controller 60 includes a first controller 61 (main control unit) and a second controller 62 (auxiliary control unit). The first controller 61 is configured by a computer including a processor such as a CPU and memory such as semiconductor memory, and controls, among other things, the operation of the construction machine 10 (see fig. 1); it acquires, processes, and stores information related to the construction machine 10. The first controller 61 is connected to the working device posture sensor 30, the camera 40, and the monitor 50. The second controller 62 is likewise configured by a computer including a processor such as a CPU and memory such as semiconductor memory, and identifies (specifies) the type of the remote attachment 25 based on image information including the remote attachment 25 (see fig. 4). The second controller 62 is a recognition device that performs image recognition by AI (artificial intelligence). The first controller 61 and the second controller 62 may be unified into one controller, or at least one of them may be subdivided further, for example into separate controllers for different kinds of functions.
(action)
The operation of the remote attachment identification device 1 (mainly the operation of the controller 60) will be described with reference to the flowchart shown in fig. 3. In the following, refer mainly to fig. 1 for the components of the remote attachment identification device 1 (the camera 40, the controller 60, and so on) and to fig. 3 for the steps of the flowchart.
In step S11, the camera 40 captures an image containing the remote attachment 25. The camera 40 may capture such images continuously in time. The controller 60 acquires the image captured by the camera 40 (hereinafter, the "camera image Im"), as shown in fig. 4. Figs. 4 and 5 show examples of the camera image Im; fig. 5 shows a state in which the remote attachment 25 is farther from the camera 40 than in the state shown in fig. 4. In figs. 4 and 5, portions other than the construction machine 10 are not shown.
In step S13, the working device posture sensor 30 detects the posture of the working device 20. Specifically, the boom angle sensor 31 detects the boom angle, the arm angle sensor 33 detects the arm angle, and the remote attachment angle sensor 35 detects the attachment angle. The first controller 61 of the controller 60 acquires the posture information detected by the working device posture sensor 30, and calculates the position of the reference position 25b relative to the upper revolving structure 13 based on the boom angle and the arm angle. The first controller 61 may further calculate an approximate position of the remote attachment 25 based on the reference position 25b and the attachment angle. Details of this calculation will be described later.
In step S20, the first controller 61 sets a detection frame F in the camera image Im, as shown in fig. 4. The detection frame F is a frame in the camera image Im captured by the camera 40 (see fig. 1), set in a region including the remote attachment 25. The image inside the detection frame F is used to identify the type of the remote attachment 25; the image outside it is not.
(setting of detection frame F)
The position, size, shape, and the like of the detection frame F in the camera image Im are set as follows. The detection frame F is set so that the entire outer shape of the remote attachment 25 is contained inside it.
The background portion of the camera image Im outside the outer shape of the remote attachment 25 is noise, i.e. redundant information, when identifying the remote attachment 25. For this reason, the detection frame F is preferably set so that the background portion inside it is as small as possible. In other words, the detection frame F is preferably as small as possible while still containing the entire outer shape of the remote attachment 25, and the remote attachment 25 preferably appears in the central portion of the detection frame F.
(Setting of the detection frame F based on the posture of the working device 20)
Where in the camera image Im the remote attachment 25 appears, and at what size, varies depending on the posture of the working device 20. For example, as shown in fig. 5, the farther the remote attachment 25 is from the camera 40, the smaller it appears in the camera image Im; the higher the remote attachment 25 is positioned, the higher it appears in the camera image Im; and the aspect ratio of the remote attachment 25 in the camera image Im changes according to its angle relative to the arm 23.
The detection frame F is therefore set based on the posture of the working device 20. For example, the detection frame F is set based on the position of the reference position 25b in the camera image Im, which is calculated from the boom angle and the arm angle: the position of the reference position 25b relative to the upper revolving structure 13 or the camera 40 shown in fig. 1 is obtained from the boom angle and the arm angle, and from that the position in the camera image Im is obtained. The detection frame F is also set based on the attachment angle.
Specifically, the reference position 25b is calculated, for example, as follows. The first controller 61 reads out from a memory (not shown) a reference position determination table in which the correspondence between the boom angle and arm angle, and the reference position 25b in the camera image Im, is determined in advance. The first controller 61 can then acquire the reference position 25b by looking up, in this table, the entry corresponding to the boom angle detected by the boom angle sensor 31 and the arm angle detected by the arm angle sensor 33.
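The table lookup described above can be sketched as follows. The patent does not specify the table's data layout or whether intermediate angles are interpolated, so this minimal sketch simply chooses the nearest stored grid point; the data structure and names are assumptions.

```python
def lookup_reference_position(table, boom_angle, arm_angle):
    """Look up the pixel coordinates of the reference position 25b.

    `table` maps (boom_angle, arm_angle) grid points, in radians, to (u, v)
    pixel coordinates of the reference position 25b in the camera image Im,
    as produced by the offline simulation described in the text. Here the
    closest grid point is chosen; a real implementation might interpolate
    between neighbouring entries instead.
    """
    key = min(table, key=lambda k: (k[0] - boom_angle) ** 2 + (k[1] - arm_angle) ** 2)
    return table[key]
```

A denser angle grid in the table trades memory for accuracy of the looked-up position.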
The reference position determination table may be created in advance by, for example, a simulation using a specific construction machine 10. In the simulation, the working device 20 is imaged by the camera 40 while the boom angle and the arm angle are varied. The position of the reference position 25b is then specified in each of the obtained camera images Im, producing a set of records that associate the reference position 25b with the boom angle and arm angle, and these records are stored in the reference position determination table. This work may be performed by a person or by image processing.
The detection frame F is then set in the camera image Im as follows. The first controller 61 reads out from a memory (not shown) a detection frame determination table in which the correspondence between the boom angle, arm angle, and attachment angle, and detection frame information indicating the size of the detection frame F, is determined in advance. The detection frame information includes, for example, the lengths of the vertical and horizontal sides of the detection frame F, as well as positioning information indicating where within the detection frame F the reference position 25b should be located. The first controller 61 looks up the detection frame information corresponding to the boom angle detected by the boom angle sensor 31, the arm angle detected by the arm angle sensor 33, and the attachment angle detected by the remote attachment angle sensor 35, and sets the detection frame F indicated by that information in the camera image Im, positioning it so that the reference position 25b falls at the location indicated by the positioning information.
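Placing the frame from one table entry can be sketched as below. The field names (`width`, `height`, `ref_frac`) and the choice of expressing the positioning information as fractions of the frame size are illustrative assumptions; the patent only says the information indicates where the reference position should sit inside the frame.

```python
def place_detection_frame(ref_pos, frame_info):
    """Place the detection frame F in the camera image Im.

    `ref_pos` is the (u, v) pixel position of the reference position 25b;
    `frame_info` is one entry of the detection frame determination table,
    holding the frame's width and height in pixels plus positioning
    information `ref_frac`: where, as fractions of the frame size, the
    reference position should sit inside the frame. Returns the frame as
    (left, top, right, bottom) pixel bounds.
    """
    u, v = ref_pos
    w, h = frame_info["width"], frame_info["height"]
    fx, fy = frame_info["ref_frac"]  # e.g. (0.5, 0.0): ref at top centre
    left = u - fx * w
    top = v - fy * h
    return (left, top, left + w, top + h)
```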
The detection frame determination table may be created in advance by, for example, a simulation using a specific construction machine 10 to which a specific remote attachment 25 such as a bucket is attached. In this simulation, the working device 20 is imaged by the camera 40 while the boom angle, arm angle, and attachment angle are varied. A region including the remote attachment 25 is then extracted from each obtained camera image Im and set as the detection frame F. As the detection frame F, for example, a quadrangular region circumscribing the outer edge of the remote attachment 25 in the camera image Im may be used, or a quadrangular region slightly larger than that circumscribing quadrangle. This work may be performed by a person or by image processing.
In this manner, the first controller 61 sets the detection frame F based on the posture of the working device 20. The first controller 61 therefore does not need to run an object detection algorithm over the entire area of the camera image Im to detect the remote attachment 25, which reduces its calculation load accordingly. Moreover, because no whole-image object detection is needed, the detection position of the remote attachment 25 whose type is to be identified is not mistakenly recognized elsewhere. Suppose, for example, that a remote attachment 25 other than the one mounted on the arm 23 lies within the angle of view of the camera 40 and thus appears in the camera image Im. Since that other remote attachment 25 is not mounted on the construction machine 10, it is not a target of type identification; and since it lies away from the reference position 25b, it appears outside the detection frame F in the camera image Im. The present embodiment therefore prevents the other remote attachment 25 from becoming a target of type identification.
(Setting of the detection frame F based on the structural information of the construction machine 10)
Where in the camera image Im the remote attachment 25 appears, and at what size, also varies depending on the structure of the construction machine 10. For example, the position and size of the remote attachment 25 in the camera image Im vary with the length of the boom 21 and the length of the arm 23. Furthermore, the types of remote attachment 25 expected to be mounted on the working device 20 vary with the size class of the construction machine 10 (for example, the "o ton class"), which in turn changes the position, size, and so on of the remote attachment 25 in the camera image Im.
The detection frame F is therefore preferably set based not only on the detection values of the working device posture sensor 30 but also on structural information indicating the structure of the construction machine 10. The structural information is included in, for example, the main specifications of the construction machine 10, and may be set (stored) in the first controller 61 in advance or acquired by some other means. The structural information includes, for example, information on the upper revolving structure 13, the boom 21, and the arm 23, such as their sizes (dimensions) and relative positions, as well as the position of the camera 40 relative to the upper revolving structure 13. By using the structural information in addition to the detection values of the working device posture sensor 30, the controller 60 can calculate the posture of the working device 20, for example the reference position 25b, more accurately. As a result, the background portion inside the detection frame F can be further reduced, improving the accuracy of the type identification of the remote attachment 25.
When the detection frame F is set using the structural information of the construction machine 10, the first controller 61 can proceed as in [Example A1] or [Example A2] below.
[Example A1] First, the detection frame F is roughly set based on the posture of the working device 20 without using the structural information of the construction machine 10. The detection frame F is then corrected based on the structural information of the construction machine 10.
Specifically, the first controller 61 first determines the size of the detection frame F by referring to the detection frame determination table. Next, the first controller 61 may calculate the ratio between the weight information of the specific construction machine 10 used when creating the detection frame determination table and the weight information included in its own structural information, and correct the size of the detection frame F by multiplying the size determined from the table by this ratio. The weight information is information indicating the size class of the construction machine 10, such as the "o ton class".
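The ratio correction above can be sketched as follows. The patent only says that a ratio of the weight information is multiplied into the frame size; assuming linear image dimensions scale with that ratio, and scaling about the frame centre, a minimal sketch is:

```python
def scale_frame_by_weight(frame, table_tonnage, own_tonnage):
    """Correct the size of detection frame F for machine size class.

    The determination table was built on a machine of `table_tonnage`; the
    running machine is `own_tonnage` (both from weight information). The
    frame's linear dimensions are scaled by own/table about the frame
    centre. `frame` is (left, top, right, bottom) in pixels. The linear
    scaling and centre-anchored correction are simplifying assumptions.
    """
    ratio = own_tonnage / table_tonnage
    l, t, r, b = frame
    cx, cy = (l + r) / 2, (t + b) / 2          # frame centre
    hw, hh = (r - l) / 2 * ratio, (b - t) / 2 * ratio  # scaled half-sizes
    return (cx - hw, cy - hh, cx + hw, cy + hh)
```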
[Example A2] Instead of performing the correction of [Example A1], the detection frame F may be set from the start based on both the structural information of the construction machine 10 and the posture of the working device 20. The shape of the detection frame F is rectangular in the example shown in fig. 4, but it may also be a polygon other than a rectangle, a circle, an ellipse, or a shape close to these.
Specifically, the first controller 61 calculates the reference position 25b of the construction machine 10 in a three-dimensional coordinate system, using the length of the boom 21 and the length of the arm 23 included in the structural information, the boom angle detected by the boom angle sensor 31, and the arm angle detected by the arm angle sensor 33. The first controller 61 then obtains the reference position 25b in the camera image Im by projecting the reference position 25b in the three-dimensional coordinate system onto the imaging plane of the camera 40, and may set the detection frame F in the camera image Im using the detection frame determination table described above. At this time, the controller 60 may also correct the size of the detection frame F as in [Example A1].
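The kinematics-plus-projection step can be sketched as below: planar forward kinematics from the boom foot pin gives the reference position 25b in machine coordinates, and an idealised pinhole model projects it onto the imaging plane. A real implementation would use the camera's full extrinsic matrix (its orientation relative to the upper revolving structure); the axis conventions and parameter names here are illustrative assumptions.

```python
import math

def project_reference_position(boom_len, arm_len, boom_angle, arm_angle,
                               cam_offset, focal_px, img_center):
    """Compute the reference position 25b in the camera image Im.

    Forward kinematics: from the boom foot pin, the arm-top pin (reference
    position 25b) follows from the link lengths and detected angles, with
    the working device moving in a vertical plane. cam_offset = (dx, dy, dz)
    is the boom foot pin's position relative to the camera (dx forward,
    dy right, dz up); the camera is assumed to look straight forward.
    Returns (u, v) pixel coordinates by pinhole projection.
    """
    bx = boom_len * math.cos(boom_angle)           # boom-top pin, forward
    bz = boom_len * math.sin(boom_angle)           # boom-top pin, up
    X = cam_offset[0] + bx + arm_len * math.cos(boom_angle + arm_angle)
    Y = cam_offset[1]            # working device stays in its vertical plane
    Z = cam_offset[2] + bz + arm_len * math.sin(boom_angle + arm_angle)
    u = img_center[0] + focal_px * Y / X           # image column
    v = img_center[1] - focal_px * Z / X           # image row (v grows down)
    return (u, v)
```

The projected point can then be handed to the frame-setting step in place of the table-derived reference position.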
Even when the structural information of a particular construction machine 10 is not available, the structure of construction machines generally falls within a certain range, so the possible structural information is bounded. The controller 60 can therefore set the detection frame F so as to include the remote attachment 25 even when the structural information of the construction machine 10 has not been acquired.
(Change of detection frame F)
The first controller 61 sequentially changes the setting of the detection frame F according to changes in the posture of the working device 20, specifically, for example, as follows. When the position of the reference position 25b in the camera image Im changes, the first controller 61 moves the detection frame F to follow it. When the reference position 25b moves away from the camera 40 so that the remote attachment 25 appears smaller in the camera image Im, the first controller 61 shrinks the detection frame F; likewise, when the reference position 25b approaches the camera 40 so that the remote attachment 25 appears larger, the first controller 61 enlarges the detection frame F. When the attachment angle changes and the aspect ratio of the remote attachment 25 in the camera image Im changes, the first controller 61 changes the aspect ratio of the detection frame F. In the detection frame determination table, a quadrangular region circumscribing the remote attachment 25 in the camera image Im, or a quadrangular region slightly larger than that, is registered as the detection frame F. Consequently, when the detection frame F is set from the table, its size becomes smaller the farther the reference position 25b is from the camera 40 and larger the closer it is.
In step S31, as shown in fig. 6, the first controller 61 determines whether the remote attachment 25 is located at a position that may fall in a blind spot of the camera 40. For example, when the construction machine 10 performs an excavation operation, the remote attachment 25 may enter a blind spot of the camera 40. To make this determination, the first controller 61 stores, in the memory, information in which a predetermined posture condition is set in advance. The predetermined posture condition is a condition on the posture of the working device 20 under which the remote attachment 25 may fall in a blind spot of the camera 40. Specifically, the condition is that at least a part of the remote attachment 25 may be disposed on the Z2 side, which is opposite to the Z1 side on which the camera 40 is disposed with respect to the ground plane A of the construction machine 10. The ground plane A is a virtual plane parallel to the bottom surface 11b and including the bottom surface 11b. When the ground plane A is horizontal, the Z2 side is below the ground plane A.
At the time of step S31, the type of the remote attachment 25 is unknown, and the structure (size, shape, etc.) of the remote attachment 25 is also unknown. Therefore, even when the posture of the working device 20 is known, it is unclear whether the remote attachment 25 is actually disposed on the Z2 side with respect to the ground plane A. Here, for example, the posture of the working device 20 in which the largest of the remote attachments 25 mountable on the working device 20 would be disposed on the Z2 side with respect to the ground plane A may be set as the predetermined posture condition. Alternatively, the predetermined posture condition may be set based on the distance from the ground plane A to the reference position 25b.
Specifically, assuming that the virtual largest remote attachment 25 is mounted, the first controller 61 determines the position of the distal end of the remote attachment 25 based on the angle of the boom 21, the angle of the arm 23, and the angle of the remote attachment 25 detected by the boom angle sensor 31, the arm angle sensor 33, and the remote attachment angle sensor 35, respectively. The first controller 61 may determine that the predetermined posture condition is satisfied when the vertical distance between the position of the distal end of the remote attachment 25 and the reference position 25b is greater than the vertical distance from the reference position 25b to the ground plane A.
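The check of step S31 can be illustrated with simple planar forward kinematics: assuming the largest mountable remote attachment 25, compute the height of the virtual distal end and test whether it falls below the ground plane A. This is only a sketch under assumed conventions (boom angle measured from horizontal, arm and attachment angles relative to the preceding link); the function name and all link lengths are hypothetical.

```python
import math

def tip_below_ground(boom_deg, arm_deg, att_deg,
                     boom_len, arm_len, att_len_max, pivot_height):
    """Return True when the distal end of the virtual largest remote
    attachment would lie below the ground plane A (height < 0).
    boom_deg is measured from horizontal; arm_deg and att_deg are
    measured relative to the preceding link."""
    z = pivot_height  # height of the boom foot pin above the ground plane
    angle = 0.0
    for a_deg, length in ((boom_deg, boom_len),
                          (arm_deg, arm_len),
                          (att_deg, att_len_max)):
        angle += math.radians(a_deg)  # accumulate relative joint angles
        z += length * math.sin(angle)
    return z < 0.0
```

With a long enough virtual attachment the tip dips below the ground plane and the posture condition is judged satisfied; with a shorter one the same joint angles keep the tip above it.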
As shown in fig. 1, when the posture of the work implement 20 detected by the work implement posture sensor 30 does not satisfy the predetermined posture condition (no in step S31), the process proceeds to step S33 to identify the type of the remote attachment 25. As shown in fig. 6, when the posture of the work implement 20 satisfies the predetermined posture condition (yes in step S31), the first controller 61 does not identify the type of the remote attachment 25. In this case, the current flow ends, and the process returns to, for example, "start". In this way, when the remote attachment 25 is at a position where it is likely to fall in a blind spot of the camera 40, the type of the remote attachment 25 is not identified, so erroneous identification can be eliminated and unnecessary processing avoided.
In the flowchart shown in fig. 3, after the image information of the camera 40 shown in fig. 1 is acquired (S11), the posture information of the work equipment 20 is acquired (S13), and then the determination of step S31 is performed. However, this is merely an example; the posture information of the work equipment 20 may be acquired (S13) and the determination of step S31 performed without acquiring the image information of the camera 40, that is, without performing the process of S11. The same applies to the determinations in steps S33 and S35, because the camera image Im is not required for the processing of steps S31, S33, and S35. When the posture of the work implement 20 satisfies the predetermined posture condition (yes in step S31), the present flow may be ended; the same applies when the result of step S33 is no. In these cases, when the type of the remote attachment 25 is not identified, the first controller 61 may omit the process of step S11 of acquiring the image information of the camera 40. Alternatively, the process of step S11 may be provided between step S35 and step S37.
In step S33, the first controller 61 determines the corresponding distance L, which corresponds to the distance from the camera 40 to the remote attachment 25. If the corresponding distance L is too long, the remote attachment 25 appears small in the camera image Im shown in fig. 5, and even if a partial image of the remote attachment 25 is enlarged, the accuracy of identifying the type of the remote attachment 25 may not be ensured. Therefore, it is determined whether the corresponding distance L shown in fig. 1 is small enough to ensure the identification accuracy. Specifically, the first controller 61 acquires the corresponding distance L based on the posture of the work implement 20 detected by the work implement posture sensor 30.
At the time of step S33, the type of the remote attachment 25 is unknown, and its structure is also unknown. For this reason, the actual distance from the camera 40 to the remote attachment 25 is unclear. Therefore, in the determination of step S33, the corresponding distance L corresponding to the actual distance from the camera 40 to the remote attachment 25 is used. For example, the corresponding distance L is the distance in the front-rear direction from the camera 40 to the reference position 25b. Alternatively, the corresponding distance L may be, for example, the distance in the front-rear direction between the camera 40 and the virtual largest remote attachment 25 mountable on the working device 20. The same applies in step S35.
When the corresponding distance L is equal to or less than a predetermined first predetermined distance (yes in step S33), the process proceeds to step S35 to identify the type of the remote attachment 25. The value of the first predetermined distance is preset in the first controller 61. The first predetermined distance is set according to whether the identification accuracy for the remote attachment 25 can be ensured. For example, the first predetermined distance is set according to the performance of the camera 40, the discrimination capability of the second controller 62, and the like; the same applies to the second predetermined distance used in step S35. For example, when the zoom function of the camera 40 is used, the identification accuracy for the remote attachment 25 may be ensured in a state where the zoom position is set to the most telephoto side. The first predetermined distance is 5 m in the example shown in fig. 3, but may be set to various distances.
When the corresponding distance L is greater than the first predetermined distance (no in step S33), the first controller 61 does not identify the type of the remote attachment 25. At this time, the present flow ends, and the process returns to, for example, "start". As described above, when the corresponding distance L corresponding to the distance from the camera 40 to the remote attachment 25 is large and the accuracy of identifying the type of the remote attachment 25 may not be ensured, the type of the remote attachment 25 is not identified. Therefore, erroneous identification can be eliminated, and unnecessary processing avoided.
In step S35, the first controller 61 determines, based on the corresponding distance L, whether to set the zoom position of the camera 40 to the telephoto side relative to the widest-angle side. When the corresponding distance L is equal to or greater than the second predetermined distance (yes in step S35), the process proceeds to step S37. The value of the second predetermined distance is preset in the controller 60. The second predetermined distance is less than the first predetermined distance. The second predetermined distance is 3 m in the example shown in fig. 3, but may be set to various distances. When the corresponding distance L is less than the second predetermined distance (no in step S35), the zoom position of the camera 40 is set to the widest-angle side, and the process proceeds to step S40. The distance used as the corresponding distance L can be set in various ways; for example, the corresponding distance L used in the determination of step S33 and that used in the determination of step S35 may be the same or different.
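The routing of steps S31, S33, and S35 can be summarized as a small decision function. This is only a sketch: the 5 m and 3 m thresholds follow the example in fig. 3, and the return labels are invented for illustration.

```python
def gating_flow(posture_is_blind, distance_l,
                first_limit=5.0, second_limit=3.0):
    """Decide what the controller does next: abort without identification,
    identify at the widest-angle zoom position, or zoom to the telephoto
    side before identifying."""
    if posture_is_blind:            # S31: attachment may be in a blind spot
        return "abort"
    if distance_l > first_limit:    # S33: too far to ensure accuracy
        return "abort"
    if distance_l >= second_limit:  # S35: far enough to benefit from zoom
        return "zoom_then_classify" # S37
    return "classify_wide"          # widest-angle side, then S40
```

Only distances in the [3 m, 5 m] band trigger the telephoto path of step S37; nearer attachments are classified at the widest-angle setting.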
In step S37, the first controller 61 sets the zoom position of the camera 40 to a position on the telephoto side relative to the widest-angle side. The zoom position of the camera 40 is set further to the telephoto side as the corresponding distance L becomes longer, and the image including the detection frame F is enlarged. Such control is performed when the corresponding distance L is equal to or less than the first predetermined distance (yes in S33; for example, 5 m or less) and equal to or greater than the second predetermined distance (yes in S35; for example, 3 m or more). By setting the zoom position of the camera 40 to the telephoto side, the image of the remote attachment 25 becomes clearer than when the image of the remote attachment 25 is simply stretched and enlarged, and the accuracy of identifying the type of the remote attachment 25 can be improved.
In step S37, when the zoom position of the camera 40 is set to the telephoto side, the first controller 61 may change the size of the detection frame F according to the telephoto magnification. In this case, the first controller 61 may read from the memory a table in which the correspondence between the telephoto magnification and the enlargement ratio of the detection frame F is defined in advance, specify the enlargement ratio of the detection frame F corresponding to the telephoto magnification with reference to the table, and enlarge the detection frame F set in step S20 by the specified ratio. The table stores, for example, an enlargement ratio that enlarges the detection frame F to a size that includes the entire image of the remote attachment 25 in the camera image Im captured at the telephoto setting.
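One way to realize "zoom further to the telephoto side as L grows" together with a matching enlargement of the detection frame F is a simple linear map between the second and first predetermined distances. The linear interpolation and the maximum zoom factor are assumptions; the embodiment instead reads the enlargement ratio from a pre-defined table.

```python
def zoom_and_frame_scale(distance_l, second_limit=3.0, first_limit=5.0,
                         max_zoom=3.0):
    """Return (zoom factor, detection-frame enlargement ratio).
    The zoom position moves linearly from 1x at the second predetermined
    distance to max_zoom at the first predetermined distance, and the
    detection frame F is enlarged by the same factor so that it still
    contains the whole attachment image."""
    t = (distance_l - second_limit) / (first_limit - second_limit)
    t = min(max(t, 0.0), 1.0)       # clamp outside the [3 m, 5 m] band
    zoom = 1.0 + t * (max_zoom - 1.0)
    return zoom, zoom
```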
In step S40, the second controller 62 of the controller 60 identifies the type of the remote attachment 25. This identification is made based on the image of the remote attachment 25 within the detection frame F, by comparing the feature amount of the remote attachment 25 acquired from that image with the feature amounts set in the second controller 62 in advance. The feature amount used for identification is, for example, the contour shape (outline) of the remote attachment 25.
Specifically, the first controller 61 shown in fig. 2 cuts out the image within the detection frame F (see fig. 4) from the camera image Im under arbitrary conditions and timing. That is, the first controller 61 removes the area outside the detection frame F from the camera image Im. The number of cut-out detection frame images may be one or plural. Specifically, the first controller 61 can obtain a plurality of detection frame images by cutting out the detection frame F from each of a plurality of camera images Im that are consecutive in time series.
The first controller 61 outputs the cut-out image to the second controller 62. The memory of the second controller 62 stores in advance the feature amounts of reference images, which serve as identification references for the types of the remote attachments 25, in association with the type names of the remote attachments 25. The reference images include images of various postures of various types of remote attachments 25. The second controller 62 acquires the image within the detection frame F input from the first controller 61 as an input image and calculates a feature amount from the input image. Here, for example, the contour shape of the image within the detection frame F may be used as the feature amount. The second controller 62 may extract the contour shape of the image within the detection frame F by applying a predetermined edge detection filter to the acquired input image, and calculate the contour shape as the feature amount.
The second controller 62 compares the feature amount of the input image with the feature amounts of the reference images to identify the type of the remote attachment 25. The more closely the feature amount of the input image matches the feature amount of a reference image, the higher the accuracy of the type identification of the remote attachment 25. Further, the more reference images there are, that is, the larger the learning amount, the higher the accuracy of the type identification of the remote attachment 25. The second controller 62 then outputs the identification result to the first controller 61.
Specifically, the second controller 62 may determine, among the feature amounts of the reference images stored in the memory, the feature amount having the highest similarity to the feature amount of the input image, and output the type name of the remote attachment 25 corresponding to that feature amount to the first controller 61 as the identification result.
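The nearest-reference matching described above can be sketched as follows. Feature extraction (edge filtering, contour computation) is omitted, and the squared-distance similarity is an assumption, since the embodiment does not fix a particular similarity measure; the feature vectors and type names are placeholders.

```python
def classify(input_feature, references):
    """references: list of (feature_vector, type_name) pairs, one per
    reference image. Return the type name whose reference feature is most
    similar (smallest squared distance) to the input feature."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(references, key=lambda ref: sq_dist(input_feature, ref[0]))
    return best[1]
```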
The feature amounts of the reference images are generated in advance by machine learning from images of the plurality of remote attachments 25 in different postures for each of the plurality of types. As the machine learning, for example, a neural network, clustering, a Bayesian network, a support vector machine, or the like can be employed. As the feature amount, for example, a Haar-like feature amount, a pixel difference feature amount, an EOH (Edge of Histogram) feature amount, an HOG (Histogram of Oriented Gradients) feature amount, and the like may be used in addition to the contour shape.
Alternatively, the second controller 62 may store in advance in the memory a neural network obtained by machine learning from images of a plurality of remote attachments 25, with the type names of the remote attachments 25 as teacher signals. The second controller 62 may input the input image acquired from the first controller 61 to the neural network, and output the type name of the remote attachment 25 output by the neural network to the first controller 61 as the identification result.
When images of a plurality of detection frames F are input from the first controller 61, the second controller 62 may compare the feature amount of the image of each of the plurality of detection frames F with the feature amounts of the reference images stored in the memory, and determine the type of the remote attachment 25 by majority vote. That is, the second controller 62 may determine the type identified most often among the identification results for the images of the plurality of detection frames F as the final type of the remote attachment 25.
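The majority vote over the per-frame identification results can be written compactly. Tie-breaking here follows Python's `Counter` (first-encountered order) and is not specified by the embodiment.

```python
from collections import Counter

def majority_vote(per_frame_results):
    """Return the type name identified most often across the images cut
    out from a plurality of detection frames F."""
    return Counter(per_frame_results).most_common(1)[0][0]
```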
Here, when the camera 40 shown in fig. 1 is fixed to the upper revolving structure 13, the imaging angle of the camera 40 with respect to the remote attachment 25 is limited compared with the case where the camera 40 is not fixed to the upper revolving structure 13. Therefore, when the camera 40 is fixed to the upper revolving structure 13, fewer reference images are required for identifying the type of the remote attachment 25. Accordingly, collection of the reference images becomes easier.
In step S50, the first controller 61 outputs the identification result input from the second controller 62 to the monitor 50. In this case, the first controller 61 may output the identification result to the monitor 50 by outputting a display instruction for displaying the identification result. Here, the monitor 50 may display, for example, a character string indicating the type name of the remote attachment 25, an icon graphically indicating the type of the remote attachment 25, or both.
In addition, the identification result may be used for interference prevention control of the construction machine 10. Specifically, the first controller 61 determines the position of the distal end of the remote attachment 25 using the identification result of the remote attachment 25, the angle of the boom 21, the angle of the arm 23, and the angle of the remote attachment 25. When determining that the distal end position is located in the interference prevention area set around the construction machine 10, the first controller 61 executes interference prevention control such as slowing the operation speed of the working equipment 20 or stopping the operation of the working equipment 20.
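The interference prevention decision can be sketched as below. The circular interference prevention area, the slow-down margin, and the planar tip coordinates are assumptions for illustration; the embodiment states only that the working equipment 20 is slowed or stopped when the distal end enters the area.

```python
def interference_action(tip_xy, zone_radius, machine_xy=(0.0, 0.0),
                        slow_margin=0.5):
    """Return 'stop' when the attachment tip is inside the interference
    prevention area, 'slow' when it is within slow_margin of the boundary,
    and 'normal' otherwise."""
    dx = tip_xy[0] - machine_xy[0]
    dy = tip_xy[1] - machine_xy[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= zone_radius:
        return "stop"
    if dist <= zone_radius + slow_margin:
        return "slow"
    return "normal"
```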
(comparison with the technique using a distance sensor)
The following discusses a case where the type of the remote attachment 25 shown in fig. 1 is identified based on a distance distribution (distance image, depth distribution) detected by a distance sensor. In this case, the distance sensor has the problem of higher cost than a monocular camera. Further, the distance sensor is more susceptible to dust than a monocular camera. In contrast, in the present embodiment, a monocular camera is used as the camera 40, so these problems can be eliminated.
Further, a distance sensor such as a ToF (Time of Flight) sensor has a narrow angle of view, and therefore has a limited detection range compared with a monocular camera. It is conceivable to measure the distance distribution around the remote attachment 25 with a distance sensor while the working device 20 is in a specific limited posture, such as a posture in which the remote attachment 25 is in contact with the ground. In that case, however, the working device 20 must be set to the specific posture whenever the type of the remote attachment 25 is identified, which is troublesome. In contrast, in the present embodiment, the posture of the working device 20 can be substantially arbitrary when the type of the remote attachment 25 is identified. Therefore, in the present embodiment, the degree of freedom of the posture of the working device 20 is high when the type of the remote attachment 25 is identified. Specifically, in the present embodiment, the type of the remote attachment 25 can be identified in any posture of the working device 20, except in the states where identification is not performed, as in the cases of yes in S31 and no in S33 of fig. 3. The conditions under which the type of the remote attachment 25 is not identified may be set in various ways.
(Effect)
The effects of the remote attachment identification device 1 shown in fig. 1 are as follows.
(first effect of the invention)
The remote attachment identification device 1 includes the working device 20, the camera 40, the work implement posture sensor 30, and the controller 60. The working device 20 is mounted on the upper slewing body 13 of the construction machine 10. The working device 20 has a distal end portion configured such that a plurality of types of remote attachments 25 can be mounted interchangeably. The camera 40 is attached to the upper revolving structure 13 and can capture images within the movable range of the remote attachment 25. The work implement posture sensor 30 detects the posture of the work implement 20.
The controller 60 sets a detection frame F (see fig. 4) in a region including the remote attachment 25 in the image captured by the camera 40, based on the posture of the working device 20 detected by the work implement posture sensor 30 [Configuration 1-1]. The detection frame F is as described above with reference to fig. 4.
The controller 60 identifies the type of the remote attachment 25 based on the image of the remote attachment 25 within the detection frame F [Configuration 1-2].
With the above [Configuration 1-2], the controller 60 identifies the type of the remote attachment 25 based on an image. Therefore, the controller 60 can identify the type of the remote attachment 25 without using a distance distribution. As a result, the cost of the camera 40 can be reduced compared with the case where the camera 40 must acquire a distance distribution.
On the other hand, when the type of the remote attachment 25 is identified based on an image, distance information is not used, so less information is available for identification than when the type is identified from a distance distribution. For this reason, it is important to ensure the accuracy of identifying the type of the remote attachment 25 even with little information. Here, the appearance (for example, the position, size, and shape) of the remote attachment 25 in the camera image Im changes depending on the posture of the work equipment 20.
Therefore, with the above [Configuration 1-1], the controller 60 sets the detection frame F including the remote attachment 25 based on the posture of the working device 20. The controller 60 can thus set a detection frame F suitable for identifying the type of the remote attachment 25. For example, the controller 60 may set the detection frame F so as to include the entire remote attachment 25 while reducing the background portion around the remote attachment 25 as much as possible. Therefore, the accuracy of identifying the type of the remote attachment 25 can be improved compared with the case where the detection frame F is not set based on the posture of the work implement 20. Consequently, the remote attachment identification device 1 can identify the type of the remote attachment 25 with high accuracy without using a distance distribution.
(second Effect of the invention)
The camera 40 is fixed to the upper revolving structure 13 [Configuration 2].
With the above [Configuration 2], the imaging angle of the camera 40 with respect to the remote attachment 25 is limited compared with the case where the camera 40 is not fixed to the upper revolving structure 13. Therefore, the amount of information required for identifying the type of the remote attachment 25 can be reduced.
(third effect of the invention)
The controller 60 sequentially changes the setting of the detection frame F according to changes in the posture of the work implement 20 detected by the work implement posture sensor 30 [Configuration 3].
With the above [Configuration 3], the controller 60 can identify the type of the remote attachment 25 even if the posture of the working device 20 changes after the detection frame F is set.
(fourth effect of the invention)
The controller 60 sets the detection frame F based on the structural information of the construction machine 10 [Configuration 4].
With the above [Configuration 1-1] and [Configuration 4], the controller 60 sets the detection frame F based on the posture of the work implement 20 detected by the work implement posture sensor 30 and the structural information of the construction machine 10. Therefore, the controller 60 can set a detection frame F more suitable for identifying the type of the remote attachment 25 than when the detection frame F is set based only on the posture of the work implement 20.
(fifth effect of the invention)
The camera 40 has a zoom function. The controller 60 calculates the distance from the remote attachment 25 to the camera 40 based on the posture of the work implement 20 detected by the work implement posture sensor 30, and sets the zoom position of the camera 40 further to the telephoto side as the distance becomes longer [Configuration 5].
With the above [Configuration 5], even if the distance from the remote attachment 25 to the camera 40 increases, the resolution of the image of the remote attachment 25 within the detection frame F can be increased by setting the zoom position of the camera 40 to the telephoto side. Therefore, the accuracy of identifying the type of the remote attachment 25 can be improved.
(sixth effect of the invention)
As shown in fig. 6, the controller 60 sets in advance a predetermined posture condition, which is a condition on the posture of the working device 20 under which the remote attachment 25 is disposed on the Z2 side opposite to the Z1 side on which the camera 40 is disposed with respect to the ground plane A of the construction machine 10.
When the posture of the work implement 20 detected by the work implement posture sensor 30 does not satisfy the predetermined posture condition (no in step S31 of fig. 3), the controller 60 identifies the type of the remote attachment 25 shown in fig. 6.
When the posture of the work implement 20 detected by the work implement posture sensor 30 satisfies the predetermined posture condition (yes in step S31 of fig. 3), the controller 60 does not identify the type of the remote attachment 25 shown in fig. 6 [Configuration 6-2].
When the remote attachment 25 is disposed on the Z2 side with respect to the ground plane A, at least a part of the remote attachment 25 may enter a blind spot of the camera 40. In that case, the type of the remote attachment 25 may not be identifiable, or the identification accuracy may not be ensured. Here, the remote attachment identification device 1 includes the above [Configuration 6-2]. Therefore, erroneous identification of the type of the remote attachment 25 by the controller 60 can be suppressed, and useless processing by the controller 60 can be eliminated. Further, with the above [Configuration 6-2], the type of the remote attachment 25 can be identified in a state where the identification accuracy is easily ensured. As a result, the accuracy of identifying the type of the remote attachment 25 can be improved.
(seventh effect of the invention)
The controller 60 acquires the corresponding distance L corresponding to the distance from the remote attachment 25 to the camera 40 based on the posture of the work implement 20 detected by the work implement posture sensor 30.
When the corresponding distance L is equal to or less than a predetermined first predetermined distance (predetermined distance) (yes in step S33 of fig. 3), the controller 60 identifies the type of the remote attachment 25 shown in fig. 1.
When the corresponding distance L is greater than the first predetermined distance (no in step S33 of fig. 3), the controller 60 does not identify the type of the remote attachment 25 shown in fig. 1 [Configuration 7-2].
As the corresponding distance L increases, the distance from the camera 40 to the remote attachment 25 increases, the remote attachment 25 appears smaller in the camera image Im (see fig. 4), and it may become difficult to ensure the accuracy of identifying the type of the remote attachment 25. Here, the remote attachment identification device 1 includes the above [Configuration 7-2]. Therefore, erroneous identification of the type of the remote attachment 25 by the controller 60 can be suppressed, and useless processing by the controller 60 can be eliminated. Further, with the above [Configuration 7-1], the type of the remote attachment 25 can be identified in a state where the identification accuracy is easily ensured. As a result, the accuracy of identifying the type of the remote attachment 25 can be improved.
(modification example)
The above embodiment may be variously modified. For example, the connections between the modules in the block diagram shown in fig. 2 may be altered. For example, the order of the steps in the flowchart shown in fig. 3 may be changed. For example, the number of components of the remote attachment identification device 1 shown in figs. 1 and 2 may be changed, or some components may not be provided.
Some components of the remote attachment identification device 1 may be provided outside the construction machine 10. For example, the second controller 62 shown in fig. 2 may be provided outside the construction machine 10. For example, the monitor 50 may not be provided.

Claims (7)

1. A remote attachment identifying device for a construction machine including a lower traveling structure, an upper slewing structure provided on an upper portion of the lower traveling structure, and a working device mounted on the upper slewing structure and having a distal end portion to which a plurality of types of remote attachments are interchangeably mounted, the device comprising:
a camera attached to the upper revolving structure and capable of capturing an image of the remote attachment within a movable range thereof;
a work implement posture sensor that detects a posture of the work implement; and
a controller, wherein the controller
sets a detection frame in a region including the remote attachment in an image captured by the camera, based on the posture of the work implement detected by the work implement posture sensor, and
identifies a type of the remote attachment based on an image of the remote attachment within the detection frame.
2. The remote attachment identifying device according to claim 1, wherein
the camera is fixed to the upper slewing body.
3. The remote attachment identifying device according to claim 1 or 2, wherein
the controller sequentially changes the setting of the detection frame in accordance with a change in the posture of the work implement detected by the work implement posture sensor.
4. The remote attachment identifying device according to any one of claims 1 to 3, wherein
the controller sets the detection frame based on information on the structure of the construction machine.
5. The remote attachment identifying device according to any one of claims 1 to 4, wherein
the camera is provided with a zoom function, and
the controller calculates a distance from the remote attachment to the camera based on the posture of the work implement detected by the work implement posture sensor, and sets a zoom position of the camera to a telephoto side as the distance increases.
6. The remote attachment identifying device according to any one of claims 1 to 5, wherein
the controller is configured to set in advance a predetermined posture condition, which is a condition on the posture of the working device under which the remote attachment is disposed on a side opposite to a side on which the camera is disposed with respect to a ground plane of the construction machine, and
the controller
performs a process of identifying a type of the remote attachment when the posture of the work implement detected by the work implement posture sensor does not satisfy the predetermined posture condition, and
does not perform the process of identifying the type of the remote attachment when the posture of the work implement detected by the work implement posture sensor satisfies the predetermined posture condition.
7. The remote attachment identification device according to any one of claims 1 to 6, wherein
the controller
acquires a corresponding distance that corresponds to the distance from the remote attachment to the camera, based on the posture of the work implement detected by the work implement posture sensor,
executes the process of identifying the type of the remote attachment when the corresponding distance is equal to or less than a predetermined distance, and
does not execute the process of identifying the type of the remote attachment when the corresponding distance is greater than the predetermined distance.
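Claims 6 and 7 both gate the identification process on posture-derived quantities. A combined sketch, in which the below-ground test and the distance threshold are assumed stand-ins for the patent's predetermined posture condition and predetermined distance:

```python
def should_run_discrimination(attachment_height_m, distance_m,
                              max_distance_m=10.0):
    """Decide whether to run attachment-type identification.

    attachment_height_m: height of the attachment above the ground surface
                         (negative means below grade), from the posture sensor.
    distance_m: corresponding camera-to-attachment distance, from the posture.
    """
    # Claim 6: the attachment is on the opposite side of the ground surface
    # from the camera (e.g. digging below grade), so the camera cannot see it.
    if attachment_height_m < 0.0:
        return False
    # Claim 7: beyond the predetermined distance, image resolution is too
    # low for reliable identification.
    if distance_m > max_distance_m:
        return False
    return True
```

Gating this way avoids spending recognition effort on frames where the attachment is occluded or too small to classify.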
CN201880086353.4A 2018-01-19 2018-12-04 Remote accessory device identification apparatus Pending CN111587448A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018-007324 2018-01-19
JP2018007324A JP7114907B2 (en) 2018-01-19 2018-01-19 Tip attachment discriminator
PCT/JP2018/044473 WO2019142523A1 (en) 2018-01-19 2018-12-04 Tip attachment discrimination device

Publications (1)

Publication Number Publication Date
CN111587448A true CN111587448A (en) 2020-08-25

Family

ID=67301008

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880086353.4A Pending CN111587448A (en) 2018-01-19 2018-12-04 Remote accessory device identification apparatus

Country Status (5)

Country Link
US (1) US20200347579A1 (en)
EP (1) EP3723039A4 (en)
JP (1) JP7114907B2 (en)
CN (1) CN111587448A (en)
WO (1) WO2019142523A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114763701A (en) * 2021-01-14 2022-07-19 现代斗山英维高株式会社 Control system and method for construction machine

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
ES2935968T3 (en) * 2016-07-15 2023-03-13 Cqms Pty Ltd Wear member monitoring system and method for monitoring a wear member

Citations (6)

Publication number Priority date Publication date Assignee Title
JP2004157932A (en) * 2002-11-08 2004-06-03 Matsushita Electric Ind Co Ltd Object recognition apparatus and object recognition program
JP2008060988A (en) * 2006-08-31 2008-03-13 Matsushita Electric Ind Co Ltd Apparatus for acquiring travel environment information
CN102642207A (en) * 2012-04-12 2012-08-22 华北电力大学 Multifunctional actuator for nuclear power plant operation and control method thereof
CN102914293A (en) * 2011-07-08 2013-02-06 佳能株式会社 Information processing apparatus and information processing method
CN106797449A (en) * 2015-09-30 2017-05-31 株式会社小松制作所 The periphery monitoring apparatus of track-type work machine
JP2017157016A (en) * 2016-03-02 2017-09-07 株式会社神戸製鋼所 Attachment recognition device

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
US9495615B2 (en) * 2012-09-20 2016-11-15 Volvo Construction Equipment Ab Method for automatically recognizing and setting attachment and device therefor
US20160312432A1 (en) * 2015-04-23 2016-10-27 Caterpillar Inc. Computer Vision Assisted Work Tool Recognition and Installation
CN105518221A (en) * 2015-09-30 2016-04-20 株式会社小松制作所 Calibration system for imaging device, working machine, and calibration method for imaging device
US10094093B2 (en) * 2015-11-16 2018-10-09 Caterpillar Inc. Machine onboard activity and behavior classification
JP2018005555A (en) * 2016-07-01 2018-01-11 ソニー株式会社 Image processing device, information processing device and method, as well as program
US10011976B1 (en) * 2017-01-03 2018-07-03 Caterpillar Inc. System and method for work tool recognition
US10519631B2 (en) * 2017-09-22 2019-12-31 Caterpillar Inc. Work tool vision system
JPWO2019139102A1 (en) * 2018-01-10 2021-01-14 住友建機株式会社 Excavator and excavator management system


Also Published As

Publication number Publication date
JP7114907B2 (en) 2022-08-09
WO2019142523A1 (en) 2019-07-25
JP2019125314A (en) 2019-07-25
US20200347579A1 (en) 2020-11-05
EP3723039A1 (en) 2020-10-14
EP3723039A4 (en) 2021-04-21

Similar Documents

Publication Publication Date Title
EP1981278B1 (en) Automatic tracking device and automatic tracking method
US8768583B2 (en) Collision detection and mitigation systems and methods for a shovel
KR101854065B1 (en) Operation state detection system of work machine and work machine
JP5792091B2 (en) Object detection apparatus and object detection method
CN110926330B (en) Image processing apparatus, image processing method, and program
WO2015198410A1 (en) Outside recognition device
JP2003098424A (en) Range finder based on image processing
WO2023025262A1 (en) Excavator operation mode switching control method and apparatus and excavator
EP3306529A1 (en) Machine control measurements device
JP4853444B2 (en) Moving object detection device
CN111587448A (en) Remote accessory device identification apparatus
JP2017120551A (en) Autonomous traveling device
JP6766516B2 (en) Obstacle detector
CN110942520B (en) Auxiliary positioning method, device and system for operation equipment and storage medium
CN111216122B (en) Gripping robot and control program for gripping robot
US10565690B2 (en) External interference removal device
JP6097063B2 (en) Height measuring device and height measuring method
JP2017211765A (en) Object recognition device
JP2021051524A (en) Work machine peripheral detection object position detection system, and work machine peripheral detection object position detection program
WO2019070368A1 (en) System and method for object detection
JP6954145B2 (en) Tip attachment dimensional measuring device
EP3796257A1 (en) Estimation device, estimation method, and computer program product
JP2024028633A (en) Surrounding detection device for work machine
CN114902281A (en) Image processing system
CN117911523A (en) Adjustment method and device of golf sensor, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
AD01 Patent right deemed abandoned

Effective date of abandoning: 20240628