WO2024023949A1 - Linear object detecting device, linear object detecting method, and linear object detecting program - Google Patents


Info

Publication number
WO2024023949A1
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional
linear object
point group
points
projected
Prior art date
Application number
PCT/JP2022/028847
Other languages
French (fr)
Japanese (ja)
Inventor
幸弘 五藤
雄介 櫻原
正樹 和氣
崇 海老根
Original Assignee
日本電信電話株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電信電話株式会社
Priority to PCT/JP2022/028847 priority Critical patent/WO2024023949A1/en
Publication of WO2024023949A1 publication Critical patent/WO2024023949A1/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 15/00: Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00

Definitions

  • The disclosed technology relates to a linear object detection device, a linear object detection method, and a linear object detection program.
  • MMS: Mobile Mapping System
  • The disclosed technology has been made in view of the above points, and provides a linear object detection device, a linear object detection method, and a linear object detection program that can detect linear objects with high accuracy even when the three-dimensional point group representing three-dimensional coordinates is highly dense.
  • A first aspect of the present disclosure is a linear object detection device including: a straight line detection unit that projects a three-dimensional point group representing the three-dimensional coordinates of points on a surface of a structure onto a horizontal plane to obtain a projected point group, detects a straight line based on the two-dimensional coordinates of the projected point group in the horizontal plane, and extracts, from the projected point group, projection points within a predetermined distance from the detected straight line; and an extraction unit that calculates a plurality of line segments connecting the three-dimensional points included in the three-dimensional point group corresponding to the extracted projection points, and extracts a set of three-dimensional points of line segments whose angle with the horizontal plane is a predetermined angle or less as a three-dimensional point group constituting a linear object.
  • A second aspect of the present disclosure is a linear object detection method in which a straight line detection unit projects a three-dimensional point group representing the three-dimensional coordinates of points on a surface of a structure onto a horizontal plane to obtain a projected point group, detects a straight line based on the two-dimensional coordinates of the projected point group in the horizontal plane, and extracts, from the projected point group, projection points within a predetermined distance from the detected straight line; and an extraction unit calculates a plurality of line segments connecting the three-dimensional points included in the three-dimensional point group corresponding to the extracted projection points, and extracts a set of three-dimensional points of line segments whose angle with the horizontal plane is a predetermined angle or less as a three-dimensional point group constituting a linear object.
  • A third aspect of the present disclosure is a linear object detection program, which is a program for causing a computer to function as each part of the linear object detection device of the first aspect.
  • FIG. 1 is a configuration diagram showing an example of the configuration of a linear object model generation system according to an embodiment.
  • FIG. 2 is a schematic diagram showing an example of a detection target by the linear object detection device of the embodiment.
  • FIG. 3 is a schematic diagram showing an example of a hardware configuration of the linear object detection device according to the embodiment.
  • FIG. 4 is a block diagram showing an example of the functional configuration of the linear object detection device of the embodiment.
  • FIG. 5 is a schematic diagram showing an example of a three-dimensional point group of a street tree according to the embodiment that is partially missing.
  • FIG. 6 is a diagram for explaining processing performed by the exclusion unit.
  • FIG. 7 is a diagram for explaining processing performed by the exclusion unit.
  • FIG. 8 is a diagram for explaining processing performed by the straight line detection unit.
  • FIG. 9 is a diagram for explaining processing performed by the straight line detection unit.
  • FIG. 10 is a diagram for explaining processing performed by the extraction unit.
  • FIG. 11A is a diagram for explaining processing performed by the extraction unit.
  • FIG. 11B is a diagram for explaining processing performed by the extraction unit.
  • FIG. 12 is a diagram for explaining processing performed by the extraction unit.
  • FIG. 13 is a diagram for explaining processing performed by the linear object model generation unit.
  • FIG. 14 is a diagram for explaining processing performed by the linear object model generation unit.
  • FIG. 15 is a diagram for explaining processing performed by the linear object model generation unit.
  • FIG. 16 is a flowchart showing an example of the linear object detection process in the linear object detection device of the embodiment.
  • FIG. 17 is a diagram showing an example of the linear object model generated by the linear object detection device of the embodiment.
  • The linear object model generation system 1 of this embodiment includes a point cloud measuring device 20 and a linear object detection device 30.
  • The point cloud measuring device 20 and the linear object detection device 30 are connected via a network 9 by wired or wireless communication.
  • The point cloud measuring device 20 includes a scanner 22, a storage medium 24, and a communication I/F (Interface) 26.
  • The scanner 22 is a three-dimensional laser scanner, and acquires the three-dimensional (X, Y, Z) coordinates of points on the surface of a structure as point cloud data by scanning the surface of the structure with a laser.
  • The point cloud data acquired by the scanner 22 is stored in the storage medium 24, which is a non-transitory storage medium.
  • Examples of the storage medium 24 include a USB (Universal Serial Bus) memory, an HDD (Hard Disk Drive), and an SSD (Solid State Drive).
  • The communication I/F 26 transmits various data, such as the point cloud data stored in the storage medium 24, to the linear object detection device 30 via the network 9 through wired or wireless communication.
  • The scanner 22 of the point cloud measuring device 20 scans the surfaces of the utility poles 10 1 to 10 3 and the cables 12 1 to 12 4 shown in FIG. 2, and obtains point cloud data representing the three-dimensional coordinates of points on those surfaces.
  • When the utility poles 10 1 to 10 3 are collectively referred to without being distinguished from each other, the subscripts 1 to 3 used to distinguish them are omitted, and they are simply referred to as the utility pole 10.
  • Similarly, when the cables 12 1 to 12 4 are collectively referred to without being distinguished from each other, the subscripts 1 to 4 used to distinguish them are omitted, and they are simply referred to as the cable 12.
  • The scanner 22 of this embodiment is capable of measuring, for example, one full rotation (360°) with the vertical direction, which is the Z-axis direction in FIG. 2, as a scan line.
  • The linear object detection device 30 is a device that extracts a three-dimensional point group constituting a linear object of a structure from the point cloud data acquired by the scanner 22 of the point cloud measuring device 20 and stored in the storage medium 24, and generates a linear object model representing the linear object.
  • The linear object detection device 30 includes a CPU (Central Processing Unit) 31, a ROM (Read Only Memory) 32, a RAM (Random Access Memory) 33, a storage 34, a display unit 37, and a communication I/F 38.
  • Each component is communicably connected to each other via a bus 39 such as a system bus or a control bus.
  • The CPU 31 is a central processing unit, and executes various programs, such as the linear object detection program 35 stored in the storage 34, and controls each section.
  • The ROM 32 stores various programs and data executed by the CPU 31. The RAM 33 temporarily stores programs or data as a work area when the CPU 31 executes various programs. That is, the CPU 31 reads a program from the storage 34 and executes it using the RAM 33 as a work area.
  • The linear object detection program 35 is stored in the storage 34 of this embodiment.
  • The linear object detection program 35 may be a single program, or may be a program group composed of a plurality of programs or modules.
  • The storage 34 is configured by an HDD or an SSD.
  • The storage 34 also stores various programs, including an operating system, and various data (none of which are shown).
  • The linear object model 36 generated by executing the linear object detection program 35 is stored in the storage 34.
  • The display unit 37 displays the linear object model and various information.
  • The display unit 37 is not particularly limited, and various types of displays may be used.
  • The communication I/F 38 is an interface for communicating with the point cloud measuring device 20 via the network 9, and uses standards such as Ethernet (registered trademark), FDDI, and Wi-Fi (registered trademark), for example.
  • The linear object detection device 30 includes a reading unit 40, a parameter setting unit 42, a linear object model calculation unit 44, a storage control unit 46, and a display control unit 48.
  • When the CPU 31 executes the linear object detection program 35 stored in the storage 34, the CPU 31 functions as the reading unit 40, the parameter setting unit 42, the linear object model calculation unit 44, the storage control unit 46, and the display control unit 48.
  • The reading unit 40 reads the point cloud data stored in the storage medium 24 of the point cloud measuring device 20 via the network 9.
  • The reading unit 40 outputs the read point cloud data to the linear object model calculation unit 44.
  • FIG. 5 shows an example of the group of three-dimensional points 14 read by the reading unit 40. Below, as an example, a case where the reading unit 40 reads the group of three-dimensional points 14 shown in FIG. 5 will be described.
  • The parameter setting unit 42 stores various parameters used when the linear object model calculation unit 44 calculates a linear object model, and outputs them to the linear object model calculation unit 44.
  • The linear object model calculation unit 44 includes an exclusion unit 50, a straight line detection unit 52, an extraction unit 54, and a linear object model generation unit 56.
  • The exclusion unit 50 excludes, from the three-dimensional point group represented by the point cloud data read by the reading unit 40 (hereinafter referred to as the three-dimensional point group read by the reading unit 40), three-dimensional points that can be regarded as points on the surfaces of structures other than the linear object to be detected.
  • The linear object detection device 30 of this embodiment detects the cable 12 as the linear object. Therefore, the exclusion unit 50 excludes three-dimensional points that can be regarded as points on the surfaces of the utility poles 10 1 to 10 3 from the three-dimensional point group read by the reading unit 40.
  • The exclusion unit 50 calculates the interval between the three-dimensional points 14 among the plurality of three-dimensional points 14 arranged in the vertical direction (the Z-axis direction in the figure) intersecting the horizontal plane (corresponding to the XY plane in the figure). If the vertical length of a partial point group consisting of three-dimensional points 14 whose interval is a predetermined interval or less is a predetermined length or more, the three-dimensional points 14 included in that partial point group are excluded from projection onto the horizontal plane.
  • Specifically, the exclusion unit 50 selects, from the acquired group of three-dimensional points 14, the three-dimensional points 14 lined up in the vertical direction (the Z-axis direction in FIG. 6), and groups the three-dimensional points 14 whose intervals are a predetermined interval or less to form a partial point group.
  • The scan line of the scanner 22 extends in the vertical direction. Therefore, this processing corresponds to grouping the three-dimensional points 14 according to the scan line.
  • In the example shown in FIG. 6, two partial point groups 60 are formed from the group of three-dimensional points 14 corresponding to the utility pole 10 1, and two partial point groups 60 are formed from the group of three-dimensional points 14 corresponding to the utility pole 10 2. Partial point groups 60 are likewise formed from the group of three-dimensional points 14 corresponding to the utility pole 10 3.
  • Further, seven partial point groups 60 are formed from each of the groups of three-dimensional points 14 corresponding to the cable 12 1, the cable 12 2, and the cable 12 3, and four partial point groups 60 are formed from the group of three-dimensional points 14 corresponding to the cable 12 4.
  • The exclusion unit 50 determines, for each partial point group 60, whether the vertical length 62 of the partial point group is a predetermined length or more.
  • The vertical length 62 of the utility pole 10 is longer than that of the cable 12. Therefore, a length serving as a threshold value for distinguishing between the cable 12 and the utility pole 10 is determined in advance as the predetermined length. If the vertical length 62 of a partial point group 60 is the predetermined length or more, the exclusion unit 50 determines that the three-dimensional points 14 included in that partial point group 60 are three-dimensional points 14 corresponding to the utility pole 10.
  • These three-dimensional points 14 are excluded from the group of three-dimensional points 14 that are candidates for the cable 12 (hereinafter referred to as the candidate point group) and are not used in subsequent processing.
  • If the vertical length 62 of a partial point group 60 is less than the predetermined length, the exclusion unit 50 determines that the three-dimensional points 14 included in that partial point group 60 are three-dimensional points 14 corresponding to the cable 12, and uses them as the candidate point group for subsequent processing. Note that the predetermined length used as the threshold here is set in the parameter setting unit 42.
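The scan-line grouping and pole exclusion described above can be sketched in Python. This is a minimal illustration, not the patented implementation; the function names and the 0.3 m gap and 2.0 m height thresholds are assumptions chosen for the example.

```python
import numpy as np

def group_by_scan_line(points, max_gap=0.3):
    """Group points lying along a vertical scan line: sort by Z and
    split wherever the gap between consecutive points exceeds max_gap."""
    pts = points[np.argsort(points[:, 2])]
    groups, current = [], [pts[0]]
    for p in pts[1:]:
        if np.linalg.norm(p - current[-1]) <= max_gap:
            current.append(p)
        else:
            groups.append(np.array(current))
            current = [p]
    groups.append(np.array(current))
    return groups

def drop_pole_like_groups(groups, max_height=2.0):
    """Keep only partial point groups whose vertical extent is below the
    threshold; taller groups are treated as utility-pole points."""
    return [g for g in groups if g[:, 2].max() - g[:, 2].min() < max_height]
```

A tall vertical run of points (a pole) forms one long group and is dropped, while a short run (a cable crossing the scan line) survives as a candidate partial point group.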
  • The above-mentioned predetermined length may be set as appropriate.
  • The exclusion unit 50 further excludes, from the candidate point group, three-dimensional points 14 that exist below an arbitrary height.
  • In this embodiment, the cable 12 is the detection target, and the cable 12 is laid at a relatively high position (about 5 m or more). Therefore, the exclusion unit 50 excludes three-dimensional points 14 located at low positions, such as near the ground, from the candidate point group. Specifically, the exclusion unit 50 excludes three-dimensional points 14 whose Z-coordinate value is less than a threshold value from the candidate point group. This reduces the processing load of subsequent processing and shortens the processing time. Note that even if the Z-coordinate value of a three-dimensional point 14 is the same, its height in real space differs depending on the position where the scanner 22 is provided.
  • For example, depending on whether the scanner 22 is located near the ground surface or installed on a tripod, the height in real space of a three-dimensional point 14 whose Z-coordinate value is 0 will differ. Therefore, the arbitrary height used here changes depending on the vertical position, that is, the height, at which the scanner 22 is provided.
  • The exclusion unit 50 groups, into the same group, candidate points belonging to partial point groups 60 whose mutual distance is an arbitrary distance or less.
  • In the example shown in FIG. 7, the exclusion unit 50 forms a group 64 1 including the partial point groups 60 corresponding to the cables 12 1 to 12 3, and a group 64 2 including the partial point group 60 corresponding to the cable 12 4.
  • The exclusion unit 50 outputs information about the candidate points grouped into the groups 64 to the straight line detection unit 52.
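The grouping of nearby partial point groups can be sketched as single-linkage clustering with a union-find structure. The patent does not prescribe a clustering algorithm; this approach, the function name, and the 1.0 m distance threshold are assumptions for illustration.

```python
import numpy as np

def cluster_partial_groups(partial_groups, max_dist=1.0):
    """Merge partial point groups whose closest points are within
    max_dist of each other (single-linkage via union-find)."""
    n = len(partial_groups)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            # minimum pairwise distance between the two groups
            d = np.min(np.linalg.norm(
                partial_groups[i][:, None, :] - partial_groups[j][None, :, :],
                axis=2))
            if d <= max_dist:
                parent[find(i)] = find(j)

    clusters = {}
    for i in range(n):
        clusters.setdefault(find(i), []).append(i)
    return list(clusters.values())
```

Each returned cluster is a list of indices of partial point groups that would form one group 64.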
  • The straight line detection unit 52 projects the candidate point group onto the horizontal plane (corresponding to the XY plane in the figure) to obtain a projected point group, detects a straight line based on the two-dimensional coordinates of the projected point group in the horizontal plane, and extracts projection points within a predetermined distance from the detected straight line from the projected point group to use as the candidate point group.
  • Specifically, the straight line detection unit 52 converts each three-dimensional point 14 into a projected point 15 by projecting the candidate point group onto the horizontal plane (the XY plane in FIG. 8). Each projected point 15 therefore lies on the XY plane, and its Z-axis coordinate value is 0. The straight line detection unit 52 performs straight line detection on the group of projected points 15 in units of the groups 64. Note that the method by which the straight line detection unit 52 detects straight lines from the group of projected points 15 is not particularly limited; for example, the known Hough transform may be used.
  • In the example shown in FIG. 8, a straight line 66 1 and a straight line 66 2 are detected from the group of projected points 15 corresponding to the group of three-dimensional points 14 included in the group 64 1 (see FIG. 7).
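Since the text names the Hough transform as one possible detector, a minimal accumulator-based sketch is shown below. The resolution and vote-threshold parameters are assumptions; a production system would typically use a library implementation instead.

```python
import numpy as np

def hough_lines(xy, n_theta=180, rho_res=0.1, min_votes=10):
    """Minimal Hough transform over 2-D projected points.
    Each point votes for (theta, rho) pairs with rho = x*cos(t) + y*sin(t);
    accumulator cells with enough votes are returned as detected lines."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rho_max = np.max(np.abs(xy)) * np.sqrt(2) + rho_res
    n_rho = int(2 * rho_max / rho_res) + 1
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for x, y in xy:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        idx = ((rhos + rho_max) / rho_res).astype(int)
        acc[np.arange(n_theta), idx] += 1
    peaks = np.argwhere(acc >= min_votes)
    return [(thetas[t], r * rho_res - rho_max) for t, r in peaks]
```

For points lying along the X axis, the strongest cell sits near theta = pi/2 and rho = 0, i.e. the line y = 0.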
  • Through this straight line detection, cables 12 branching in different directions are separated. Specifically, the cables are separated into the cable 12 1, the cable 12 2, and the cable 12 3. Note that cables 12 that are laid in the same direction but at different heights (positions in the Z-axis direction), such as the cables 12 1 and 12 2, are not separated by this process.
  • Among the detected straight lines 66, the straight line detection unit 52 excludes those that are shorter than an arbitrary length.
  • The candidate point group may include, for example, three-dimensional points 14 corresponding to a tree or the like.
  • In that case, a straight line 66 is detected from the projected points 15 corresponding to the tree or the like. Trees and the like are often relatively short compared to the cable 12. Therefore, a straight line 66 corresponding to the projected points 15 of a tree or the like is shorter than a straight line 66 corresponding to the projected points 15 of the cable 12 (the straight lines 66 1 and 66 2 in FIG. 8). Accordingly, straight lines 66 shorter than an arbitrary length are excluded, and the projected points 15 belonging to them are excluded from the candidate point group.
  • The straight line detection unit 52 extracts, from the group of projected points 15, the projected points 15 within a predetermined distance from each detected straight line 66.
  • In the example shown in FIG. 9, the straight line detection unit 52 extracts the projected points 15 existing within a region of width H centered on the straight line 66.
  • The projected points 15 are grouped according to each straight line 66. Through this process, projected points 15 existing as noise, and projected points 15 corresponding to other structures such as trees and utility poles 10 adjacent to the cable 12, can be excluded from the candidate point group.
  • The straight line detection unit 52 outputs the group of projected points 15 made up of the extracted projected points 15 to the extraction unit 54 as the candidate point group.
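Extracting the projected points within the width-H band around a detected line reduces to a perpendicular-distance test against the line in Hough normal form. A small sketch, with the half-width value chosen arbitrarily for the example:

```python
import numpy as np

def points_near_line(xy, theta, rho, half_width=0.5):
    """Select projected points whose perpendicular distance to the line
    x*cos(theta) + y*sin(theta) = rho is at most half_width (H / 2)."""
    d = np.abs(xy[:, 0] * np.cos(theta) + xy[:, 1] * np.sin(theta) - rho)
    return xy[d <= half_width]
```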
  • The extraction unit 54 calculates a plurality of line segments connecting the three-dimensional points 14 included in the three-dimensional point group corresponding to the extracted projected points 15, and extracts a set of three-dimensional points 14 of line segments whose angle with the horizontal plane (corresponding to the XY plane in the figure) is a predetermined angle or less as the group of three-dimensional points 14 constituting the cable 12.
  • The operation of the extraction unit 54 will be specifically explained using the group of projected points 15 included in the group corresponding to the straight line 66 1 as an example.
  • The group of projected points 15 included in the group corresponding to the straight line 66 1 includes a plurality of projected points 15 corresponding to the cable 12 1 and a plurality of projected points 15 corresponding to the cable 12 2.
  • The extraction unit 54 detects the partial point groups 60 to which the three-dimensional points 14 corresponding to the extracted projected points 15 belong, and transforms the coordinates of each three-dimensional point 14 included in the detected partial point groups 60 so that the straight line 66 is parallel to the X axis.
  • The extraction unit 54 then calculates a plurality of line segments connecting the three-dimensional points 14 whose coordinates have been transformed. Further, the extraction unit 54 calculates the angle between each calculated line segment and the X axis, in other words, the angle between the line segment and the horizontal plane (corresponding to the XY plane). Then, the extraction unit 54 connects the three-dimensional points 14 of line segments whose calculated angle is a predetermined angle or less.
  • In the examples shown in FIGS. 11A and 11B, the extraction unit 54 determines whether each line segment connecting three-dimensional points 14 is parallel to the straight line 66 1 (see FIG. 9).
  • The angle between the line segment 67 1 connecting the three-dimensional point 14 1 and the three-dimensional point 14 2 and the X axis is a predetermined angle or less. Likewise, the angle between the line segment 67 2 connecting the three-dimensional point 14 1 and the three-dimensional point 14 3 and the X axis is a predetermined angle or less. Therefore, the three-dimensional point 14 1, the three-dimensional point 14 2, and the three-dimensional point 14 3 are connected.
  • On the other hand, the angle formed between the line segment 67 3 connecting the three-dimensional point 14 1 and the three-dimensional point 14 11 and the X axis exceeds the predetermined angle. The angle formed between the line segment 67 4 connecting the three-dimensional point 14 1 and the three-dimensional point 14 12 and the X axis also exceeds the predetermined angle. Therefore, the three-dimensional point 14 1 is not connected to the three-dimensional point 14 11 or the three-dimensional point 14 12.
  • The angles between the X axis and the line segment 67 5 connecting the three-dimensional point 14 3 and the three-dimensional point 14 4, the line segment 67 6 connecting the three-dimensional point 14 3 and the three-dimensional point 14 5, and the line segment 67 7 connecting the three-dimensional point 14 3 and the three-dimensional point 14 6 are each a predetermined angle or less. Therefore, the three-dimensional point 14 3 is connected to the three-dimensional point 14 4, the three-dimensional point 14 5, and the three-dimensional point 14 6.
  • The angle between the line segment 67 8 connecting the three-dimensional point 14 3 and the three-dimensional point 14 11 and the X axis exceeds the predetermined angle, as does the angle formed by the line segment 67 9 connecting the three-dimensional point 14 3 and the three-dimensional point 14 12 with the X axis. Therefore, the three-dimensional point 14 3 is not connected to the three-dimensional point 14 11 or the three-dimensional point 14 12.
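The angle-based connection described above can be sketched as follows. The patent does not specify how connected points are bookkept; a union-find over pairs is one natural choice, and the 10-degree angle limit and 1.5 m maximum segment length are assumptions for the example.

```python
import numpy as np
from itertools import combinations

def link_by_segment_angle(points, max_angle_deg=10.0, max_len=1.5):
    """Connect 3-D points (already rotated so the detected line is parallel
    to the X axis) whose joining segment makes an angle of at most
    max_angle_deg with the X axis; connected components become candidate
    linear-object sets."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i, j in combinations(range(n), 2):
        v = points[j] - points[i]
        length = np.linalg.norm(v)
        if length == 0 or length > max_len:
            continue
        # v[1:] is the (Y, Z) part: its share of the segment length gives
        # the deviation of the segment from the X axis
        angle = np.degrees(np.arcsin(min(1.0, np.linalg.norm(v[1:]) / length)))
        if angle <= max_angle_deg:
            parent[find(i)] = find(j)

    sets = {}
    for i in range(n):
        sets.setdefault(find(i), []).append(i)
    return list(sets.values())
```

Two cables running in the same direction at different heights end up in separate components, mirroring the separation of the linear object sets 68A and 68B.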
  • The extraction unit 54 extracts the group of three-dimensional points 14 connected through the above processing as a set of three-dimensional points 14 (hereinafter referred to as a linear object set) corresponding to the cable 12 (linear object) to be detected.
  • In the example shown in FIG. 12, a linear object set 68A and a linear object set 68B are detected.
  • The group of three-dimensional points 14 included in the linear object set 68A is the point group corresponding to the cable 12 1, and the group of three-dimensional points 14 included in the linear object set 68B is the point group corresponding to the cable 12 2.
  • In this way, the extraction unit 54 separates groups of three-dimensional points 14 that correspond to cables 12 laid in the same direction but at different heights.
  • For each extracted linear object set 68, the extraction unit 54 extracts the three-dimensional points 14 included in the linear object set 68 as a group of three-dimensional points 14 constituting the cable 12, and outputs them to the linear object model generation unit 56.
  • The linear object model generation unit 56 performs principal component analysis on the group of three-dimensional points 14 extracted by the extraction unit 54 to derive a first principal component axis, and generates a linear object model representing the cable 12 using the intersections between planes orthogonal to the derived first principal component axis (corresponding to YZ planes) and the line segments connecting the three-dimensional points 14.
  • Specifically, the linear object model generation unit 56 performs principal component analysis for each linear object set 68 to calculate a first principal component axis 69. Then, the linear object model generation unit 56 performs coordinate transformation so that each calculated first principal component axis 69 becomes parallel to the X axis.
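The PCA step and the alignment of the first principal component axis with the X axis can be sketched with an eigendecomposition of the covariance matrix. Expressing the points in the PCA basis is one way to realize the coordinate transformation; the function name is an assumption.

```python
import numpy as np

def align_to_x_axis(points):
    """Rotate a point set so that its first principal component axis
    becomes parallel to the X axis, by expressing the centered points
    in the PCA eigenvector basis."""
    centered = points - points.mean(axis=0)
    cov = np.cov(centered.T)
    vals, vecs = np.linalg.eigh(cov)          # ascending eigenvalues
    basis = vecs[:, np.argsort(vals)[::-1]]   # columns: 1st, 2nd, 3rd axes
    if np.linalg.det(basis) < 0:              # keep a right-handed frame
        basis[:, 2] *= -1
    return centered @ basis                   # 1st principal axis -> X
```

After the transformation, the spread of a cable-like point set lies almost entirely along X, so cross-sections can be taken on YZ planes.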
  • The linear object model generation unit 56 sets a plurality of YZ planes 70 spaced apart by a certain distance in the X-axis direction. Furthermore, as shown in FIG. 15, the linear object model generation unit 56 derives the intersections 80 between the line segments connecting the three-dimensional points 14 and the YZ planes 70.
  • In the example shown in FIG. 15, for the YZ plane 70 1, the linear object model generation unit 56 derives the coordinates of the intersection 80 1 with the line segment connecting the three-dimensional point 14 1 and the three-dimensional point 14 2, the intersection 80 2 with the line segment connecting the three-dimensional point 14 1 and the three-dimensional point 14 3, the intersection 80 3 with the line segment connecting the three-dimensional point 14 5 and the three-dimensional point 14 2, and the intersection 80 4 with the line segment connecting the three-dimensional point 14 5 and the three-dimensional point 14 3.
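Deriving an intersection 80 between a segment and a YZ plane at x = x0 is linear interpolation along the segment. A minimal sketch (the function name is an assumption):

```python
import numpy as np

def plane_intersection(p, q, x0):
    """Intersection of segment p-q with the YZ plane at x = x0,
    or None if the segment does not cross the plane."""
    if (p[0] - x0) * (q[0] - x0) > 0:
        return None                      # both endpoints on the same side
    if p[0] == q[0]:
        return None                      # segment lies in the plane itself
    t = (x0 - p[0]) / (q[0] - p[0])      # interpolation parameter in [0, 1]
    return p + t * (q - p)
```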
  • When the linear object model generation unit 56 has derived a plurality of intersections 80 for one YZ plane 70, it approximates a circle using the coordinates of the intersections 80, as shown in FIG. 15, and derives the center and radius of the circle. Through this process, the three-dimensional coordinates of the cable 12 are obtained.
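The circle approximation is not specified further in the text; one common choice is the algebraic (Kasa) least-squares fit, sketched below as an assumption.

```python
import numpy as np

def fit_circle(yz):
    """Algebraic (Kasa) least-squares circle fit to 2-D points:
    solve y^2 + z^2 = 2*cy*y + 2*cz*z + c for (cy, cz, c), where
    c = r^2 - cy^2 - cz^2."""
    A = np.column_stack([2 * yz[:, 0], 2 * yz[:, 1], np.ones(len(yz))])
    b = (yz ** 2).sum(axis=1)
    (cy, cz, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    radius = np.sqrt(c + cy ** 2 + cz ** 2)
    return np.array([cy, cz]), radius
```

The fitted center gives the cross-sectional position of the cable on that YZ plane, and the radius its apparent thickness; at least three non-collinear intersections are needed.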
  • The linear object model generation unit 56 then generates the linear object model 36 representing the cable 12 by connecting the derived intersections 80 in ascending order of their X-coordinate values for each linear object set 68.
  • The storage control unit 46 stores the generated linear object model 36 in the storage 34. Further, the display control unit 48 causes the linear object model 36 to be displayed on the display unit 37.
  • FIG. 16 shows a flowchart of an example of the linear object detection process executed by the linear object detection device 30 of this embodiment.
  • The linear object detection device 30 performs the linear object detection process shown in FIG. 16 by executing the linear object detection program 35 stored in the storage 34. Note that the linear object detection process shown in FIG. 16 is executed at a predetermined timing, such as when an execution instruction from the user is received.
  • In step S100 of FIG. 16, the reading unit 40 reads the point cloud data stored in the storage medium 24 of the point cloud measuring device 20 via the network 9, as described above.
  • Next, the exclusion unit 50 groups, as described above, the three-dimensional points 14 whose interval is a predetermined interval or less, among the plurality of three-dimensional points 14 arranged in the vertical direction, into partial point groups (see FIG. 6).
  • The exclusion unit 50 then excludes, from the candidate point group, the three-dimensional points 14 included in partial point groups whose vertical length 62 is a predetermined length or more, as described above (see FIG. 7).
  • The exclusion unit 50 also excludes three-dimensional points 14 existing below an arbitrary height from the candidate point group, as described above.
  • The exclusion unit 50 then groups candidate points belonging to partial point groups 60 whose mutual distance is an arbitrary distance or less into the same group 64, as described above (see FIG. 7).
  • Next, the straight line detection unit 52 projects the candidate point group onto the horizontal plane, as described above, and detects straight lines 66 from the projected points 15 in units of the groups 64 (see FIG. 8).
  • Among the detected straight lines 66, the straight line detection unit 52 excludes those whose length is less than an arbitrary length, as described above.
  • The straight line detection unit 52 then extracts the projected points 15 within a predetermined distance from each detected straight line 66, as described above (see FIG. 9).
  • Next, the extraction unit 54 detects the partial point groups 60 to which the three-dimensional points 14 corresponding to the extracted projected points 15 belong, as described above.
  • The extraction unit 54 transforms the coordinates of the three-dimensional points 14 included in the detected partial point groups 60 so that the straight line 66 is parallel to the X axis, as described above (see FIG. 10).
  • The extraction unit 54 then calculates the angle formed between the X axis and each line segment 67 connecting the three-dimensional points 14 whose coordinates have been transformed, as described above (see FIGS. 11A and 11B).
  • The extraction unit 54 connects the three-dimensional points 14 of line segments 67 whose angle with respect to the X axis is a predetermined angle or less, as described above (see FIGS. 11A and 11B).
  • the extraction unit 54 detects the group of connected three-dimensional points 14 as the linear object set 68, as described above. As shown in FIG. 12, a linear object set 68 is detected for each linear object to be detected (cable 12 in this embodiment).
  • the linear object model generation unit 56 performs principal component analysis for each linear object set 68, as described above, and derives the first principal component axis 69 (see FIG. 13).
  • the linear object model generation unit 56 transforms the coordinates of the first principal component axis 69 so that it becomes parallel to the X axis, as described above (see FIG. 13).
  • the linear object model generation unit 56 sets YZ planes 70 at regular intervals, as described above, and calculates the intersection points 80 between each YZ plane 70 and the line segments connecting the projected points 15 (see FIGS. 14 and 15).
  • the linear object model generation unit 56 generates the linear object model 36 by connecting the intersection points 80 in ascending order of the X coordinate, as described above.
  • the storage control unit 46 stores the linear object model 36 in the storage 34, as described above.
  • in step S136, the display control unit 48 causes the linear object model 36 to be displayed on the display unit 37, as described above.
  • when step S136 ends, the linear object detection process shown in FIG. 16 ends.
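The model-generation steps above (deriving the first principal component axis 69 of a linear object set 68, rotating it parallel to the X axis, and sampling one vertex at each regularly spaced YZ plane 70) can be sketched as follows. This is a hedged illustration, not the patented implementation: the function name `generate_cable_model` is invented, the principal axis is taken from an SVD of the centered points, and linear interpolation at each plane's X position stands in for the explicit plane/line-segment intersection 80.

```python
import numpy as np

def generate_cable_model(points3d, interval=0.5):
    """Derive the first principal axis of one linear object set, rotate
    the points so the axis is parallel to X, sample one vertex at each
    regularly spaced YZ plane, and map the vertices back."""
    center = points3d.mean(axis=0)
    centered = points3d - center
    # first principal component = right singular vector of the largest
    # singular value (equivalent to PCA on the centered points)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    axis = vt[0]
    # rotation about Z that aligns the axis' XY direction with the X axis
    theta = np.arctan2(axis[1], axis[0])
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])
    local = centered @ rot.T
    local = local[np.argsort(local[:, 0])]        # ascending X
    # YZ planes at regular X intervals; interpolate Y and Z at each plane
    xs = np.arange(local[0, 0], local[-1, 0], interval)
    ys = np.interp(xs, local[:, 0], local[:, 1])
    zs = np.interp(xs, local[:, 0], local[:, 2])
    model_local = np.column_stack([xs, ys, zs])
    return model_local @ rot + center             # back to original frame
```

Connecting the returned vertices in ascending X order yields a polyline analogous to the linear object model 36.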
  • FIG. 17 shows an example of a linear object model 36 generated from a group of three-dimensional points 14 by the linear object detection device 30 of this embodiment. According to FIG. 17, it can be seen that the linear object detection device 30 of this embodiment can detect linear objects with high accuracy.
  • the linear object detection device 30 of this embodiment projects a group of three-dimensional points 14 representing the three-dimensional coordinates of points on the surface of a structure onto a horizontal plane to form a group of projected points 15, detects a straight line 66 based on the two-dimensional coordinates of the projected points 15 on the horizontal plane, and extracts, from the group of projected points 15, the projected points 15 within a predetermined distance from the detected straight line 66.
  • the linear object detection device 30 calculates a plurality of line segments 67 connecting the three-dimensional points 14 included in the group of three-dimensional points 14 corresponding to the extracted projected points 15, detects the set of three-dimensional points 14 of the line segments 67 whose angle with the horizontal plane is equal to or less than a predetermined angle as a linear object set 68, and extracts the three-dimensional points 14 included in the detected linear object set 68 as the group of three-dimensional points 14 constituting a linear object.
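The angle-based extraction summarized above can be sketched as follows, assuming the candidate points have already been rotated so the detected straight line 66 is parallel to the X axis. The function name and the threshold value are illustrative, not taken from the patent.

```python
import numpy as np

def filter_by_segment_angle(points3d, max_angle_deg=30.0):
    """Connect neighbouring points in ascending X order and keep only the
    points touching at least one segment whose angle with the horizontal
    plane is at most max_angle_deg, i.e. segments that look like a gently
    sagging cable rather than a vertical edge or an outlier."""
    pts = points3d[np.argsort(points3d[:, 0])]    # order along the line
    seg = np.diff(pts, axis=0)                    # vectors between points
    run = np.hypot(seg[:, 0], seg[:, 1])          # horizontal run (XY)
    angle = np.degrees(np.arctan2(np.abs(seg[:, 2]), run))
    keep = angle <= max_angle_deg                 # one flag per segment
    point_keep = np.zeros(len(pts), dtype=bool)
    point_keep[:-1] |= keep                       # segment's start point
    point_keep[1:] |= keep                        # segment's end point
    return pts[point_keep]
```

A point that only touches steep segments (for example a stray point far above the cable) is dropped, while its horizontally connected neighbours survive.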
  • since the linear object detection device 30 of this embodiment detects linear objects based on the projected points 15 obtained by projecting the three-dimensional points 14 having three-dimensional coordinates onto the horizontal plane (XY plane), linear objects can be detected with high accuracy even if the three-dimensional point group representing three-dimensional coordinates is highly dense.
  • the various processes that the CPU executes by reading software (programs) in the above embodiments may be executed by various processors other than the CPU.
  • examples of the processors in this case include a PLD (Programmable Logic Device), such as an FPGA (Field-Programmable Gate Array), whose circuit configuration can be changed after manufacturing, and a dedicated electric circuit, such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed exclusively for executing specific processing.
  • the position estimation process may be executed by one of these various processors, or by a combination of two or more processors of the same or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA).
  • the hardware structure of these various processors is, more specifically, an electric circuit that is a combination of circuit elements such as semiconductor elements.
  • the linear object detection program 35 is stored (installed) in the storage 34 in advance, but the present invention is not limited to this.
  • the linear object detection program 35 may be provided in a form stored on a non-transitory storage medium such as a CD-ROM (Compact Disk Read Only Memory), a DVD-ROM (Digital Versatile Disk Read Only Memory), or a USB (Universal Serial Bus) memory.
  • the linear object detection program 35 may be downloaded from an external device via a network.
  • A linear object detection device comprising a processor, the processor being configured to: project a three-dimensional point group representing the three-dimensional coordinates of points on the surface of a structure onto a horizontal plane to obtain a projected point group; detect a straight line based on the two-dimensional coordinates of the projected point group on the horizontal plane; extract, from the projected point group, projected points within a predetermined distance from the straight line; calculate a plurality of line segments connecting the three-dimensional points included in the three-dimensional point group corresponding to the extracted projected point group; and extract a set of the three-dimensional points of those line segments whose angle with the horizontal plane is equal to or less than a predetermined angle, as a three-dimensional point group constituting a linear object.
  • A non-transitory storage medium storing a program executable by a computer to execute a linear object detection process, the linear object detection process including: projecting a three-dimensional point group representing the three-dimensional coordinates of points on the surface of a structure onto a horizontal plane to obtain a projected point group; detecting a straight line based on the two-dimensional coordinates of the projected point group on the horizontal plane; extracting, from the projected point group, projected points within a predetermined distance from the straight line; calculating a plurality of line segments connecting the three-dimensional points included in the three-dimensional point group corresponding to the extracted projected point group; and extracting a set of the three-dimensional points of those line segments whose angle with the horizontal plane is equal to or less than a predetermined angle, as a three-dimensional point group constituting a linear object.
  • 20 Point cloud measuring device
  • 22 Scanner
  • 30 Linear object detection device
  • 31 CPU
  • 32 ROM
  • 33 RAM
  • 34 Storage
  • 37 Display unit
  • 38 Communication I/F
  • 39 Bus
  • 40 Reading section
  • 42 Parameter setting section
  • 44 Linear object model calculation section
  • 46 Storage control section
  • 48 Display control section
  • 50 Exclusion section
  • 52 Straight line detection section
  • 54 Extraction section
  • 56 Linear object model generation section

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

This linear object detecting device comprises: a straight line detecting unit which projects a three-dimensional point cloud representing three-dimensional coordinates of points on a surface of a structure onto a horizontal plane to form a projected point cloud, detects a straight line on the basis of two-dimensional coordinates of the projected point cloud on the horizontal plane, and extracts, from the projected point cloud, projected points within a predetermined distance from the detected straight line; and an extracting unit which calculates a plurality of line segments joining each three-dimensional point included in the three-dimensional point cloud and corresponding to the extracted projected point cloud, and extracts, as a three-dimensional point group constituting a linear object, a set of the three-dimensional points of the line segments having an angle relative to the horizontal plane that is at most equal to a predetermined angle.

Description

Linear object detection device, linear object detection method, and linear object detection program
 The disclosed technology relates to a linear object detection device, a linear object detection method, and a linear object detection program.
 Conventionally, a technology (Mobile Mapping System: MMS) has been developed that creates three-dimensional models of outdoor structures using a vehicle-mounted three-dimensional laser scanner. For example, Patent Document 1 describes a technique in which the group of points acquired during one rotation of the laser scanner is treated as a cluster called a scan line, and three-dimensional model data representing structures such as cables is created by detecting that adjacent clusters lie along a catenary.
Japanese Patent No. 6531051
 However, with the technique described in Patent Document 1, when errors arise in the point coordinates of a high-density point cloud, for example one in which the interval between the laser scanner's scan lines is short, those errors may prevent the clusters from being judged to lie along a catenary. As a result, linear objects such as cables cannot be detected with high accuracy.
 The disclosed technology has been made in view of the above points, and aims to provide a linear object detection device, a linear object detection method, and a linear object detection program capable of detecting linear objects with high accuracy even when the three-dimensional point group representing three-dimensional coordinates is highly dense.
 A first aspect of the present disclosure is a linear object detection device comprising: a straight line detection unit that projects a three-dimensional point group representing the three-dimensional coordinates of points on the surface of a structure onto a horizontal plane to obtain a projected point group, detects a straight line based on the two-dimensional coordinates of the projected point group on the horizontal plane, and extracts, from the projected point group, projected points within a predetermined distance from the detected straight line; and an extraction unit that calculates a plurality of line segments connecting the three-dimensional points included in the three-dimensional point group corresponding to the extracted projected point group, and extracts a set of the three-dimensional points of those line segments whose angle with the horizontal plane is equal to or less than a predetermined angle, as a three-dimensional point group constituting a linear object.
 A second aspect of the present disclosure is a linear object detection method in which: a straight line detection unit projects a three-dimensional point group representing the three-dimensional coordinates of points on the surface of a structure onto a horizontal plane to obtain a projected point group, detects a straight line based on the two-dimensional coordinates of the projected point group on the horizontal plane, and extracts, from the projected point group, projected points within a predetermined distance from the detected straight line; and an extraction unit calculates a plurality of line segments connecting the three-dimensional points included in the three-dimensional point group corresponding to the extracted projected point group, and extracts a set of the three-dimensional points of those line segments whose angle with the horizontal plane is equal to or less than a predetermined angle, as a three-dimensional point group constituting a linear object.
 A third aspect of the present disclosure is a linear object detection program, which is a program for causing a computer to function as each part of the linear object detection device of the first aspect.
 According to the disclosed technology, even if the three-dimensional point group representing three-dimensional coordinates has a high density, linear objects can be detected with high accuracy.
FIG. 1 is a configuration diagram showing an example of the configuration of a linear object model generation system according to an embodiment.
FIG. 2 is a schematic diagram showing an example of the detection targets of the linear object detection device of the embodiment.
FIG. 3 is a schematic diagram showing an example of the hardware configuration of the linear object detection device of the embodiment.
FIG. 4 is a block diagram showing an example of the functional configuration of the linear object detection device of the embodiment.
FIG. 5 is a schematic diagram showing an example of a partially missing three-dimensional point group of a street tree according to the embodiment.
FIGS. 6 and 7 are diagrams for explaining the processing performed by the exclusion unit.
FIGS. 8 and 9 are diagrams for explaining the processing performed by the straight line detection unit.
FIGS. 10, 11A, 11B, and 12 are diagrams for explaining the processing performed by the extraction unit.
FIGS. 13, 14, and 15 are diagrams for explaining the processing performed by the linear object model generation unit.
FIG. 16 is a flowchart showing an example of the linear object detection processing in the linear object detection device of the embodiment.
FIG. 17 is a diagram showing an example of a linear object model generated by the linear object detection device of the embodiment.
 Hereinafter, an example of an embodiment of the disclosed technology will be described with reference to the drawings. In addition, the same reference numerals are given to the same or equivalent components and parts in each drawing. Furthermore, the dimensional ratios in the drawings are exaggerated for convenience of explanation and may differ from the actual ratios.
 First, an example of the configuration of the linear object model generation system 1 according to the technology of this embodiment will be described. As shown in FIG. 1, the linear object model generation system 1 of this embodiment includes a point cloud measuring device 20 and a linear object detection device 30. The point cloud measuring device 20 and the linear object detection device 30 are connected via a network 9 by wired or wireless communication.
 The point cloud measuring device 20 includes a scanner 22, a storage medium 24, and a communication I/F (Interface) 26. The scanner 22 is a three-dimensional laser scanner, and acquires three-dimensional (X, Y, Z) coordinates of points on the surface of the structure as point group data by scanning the surface of the structure with a laser.
 The point cloud data acquired by the scanner 22 is stored in the storage medium 24, which is a non-transitory storage medium. Examples of the storage medium 24 include a USB (Universal Serial Bus) memory, an HDD (Hard Disk Drive), and an SSD (Solid State Drive).
 The communication I/F 26 communicates various data such as point cloud data stored in the storage medium 24 to the linear object detection device 30 via the network 9 through wired or wireless communication.
 For example, the scanner 22 of the point cloud measuring device 20 acquires point cloud data representing the three-dimensional coordinates of points on the surfaces of the utility poles 10 1 to 10 3 and the cables 12 1 to 12 4 shown in FIG. 2 by scanning those surfaces with a laser. In the following, when the utility poles 10 1 to 10 3 are referred to collectively without distinction, the subscripts 1 to 3 that distinguish them are omitted and they are simply called the utility pole 10. Similarly, when the cables 12 1 to 12 4 are referred to collectively without distinction, the subscripts 1 to 4 that distinguish them are omitted and they are simply called the cable 12. The scanner 22 of this embodiment can measure, for example, one full rotation (360°), with the vertical direction, which is the Z-axis direction in FIG. 2, as the scan line.
 The linear object detection device 30 is a device that extracts, from the point cloud data acquired by the scanner 22 of the point cloud measuring device 20 and stored in the storage medium 24, the three-dimensional point group constituting a linear object of the structure, and generates a linear object model representing the linear object.
 The hardware configuration of the linear object detection device 30 of this embodiment will now be described. As shown in FIG. 3, the linear object detection device 30 includes a CPU (Central Processing Unit) 31, a ROM (Read Only Memory) 32, a RAM (Random Access Memory) 33, a storage 34, a display unit 37, and a communication I/F 38. These components are communicably connected to one another via a bus 39, such as a system bus or a control bus.
 The CPU 31 is a central processing unit, and executes various programs such as the linear object detection program 35 stored in the storage 34 and controls each section.
 The ROM 32 stores various programs and data executed by the CPU 31. Further, the RAM 33 temporarily stores programs or data as a work area when the CPU 31 executes various programs. That is, the CPU 31 reads the program from the storage 34 and executes the program using the RAM 33 as a work area.
 A linear object detection program 35 is stored in the storage 34 of this embodiment. Note that the linear object detection program 35 may be one program, or may be a program group composed of a plurality of programs or modules. The storage 34 is configured by an HDD or an SSD. The storage 34 also stores various programs including an operating system, and various data (all not shown). Furthermore, a linear object model 36 generated by executing the linear object detection program 35 is stored in the storage 34.
 The display unit 37 displays the linear object model and various information. The display unit 37 is not particularly limited, and various types of displays may be used.
 The communication I/F 38 is an interface for communicating with the point cloud measuring device 20 via the network 9, and uses standards such as Ethernet (registered trademark), FDDI, and Wi-Fi (registered trademark), for example.
 Next, the functional configuration of the linear object detection device 30 will be described. As shown in FIG. 4, the linear object detection device 30 includes a reading section 40, a parameter setting section 42, a linear object model calculation section 44, a storage control section 46, and a display control section 48. When the CPU 31 executes the linear object detection program 35 stored in the storage 34, the CPU 31 functions as the reading section 40, the parameter setting section 42, the linear object model calculation section 44, the storage control section 46, and the display control section 48.
 The reading unit 40 reads the point cloud data stored in the storage medium 24 of the point cloud measuring device 20 via the network 9, and outputs the read point cloud data to the linear object model calculation unit 44. FIG. 5 shows an example of a group of three-dimensional points 14 read by the reading unit 40. Below, as an example, the case where the reading unit 40 has read the group of three-dimensional points 14 shown in FIG. 5 will be described.
 The parameter setting unit 42 stores various parameters used when calculating a linear object model in the linear object model calculation unit 44, and outputs them to the linear object model calculation unit 44.
 The linear object model calculation section 44 includes an exclusion section 50, a straight line detection section 52, an extraction section 54, and a linear object model generation section 56.
 The exclusion unit 50 excludes, from the three-dimensional point group represented by the point cloud data read by the reading unit 40 (hereinafter, the three-dimensional point group read by the reading unit 40), the three-dimensional points that can be regarded as points on the surfaces of structures other than the linear objects to be detected. As an example, the linear object detection device 30 of this embodiment detects the cables 12 as linear objects. Therefore, the exclusion unit 50 excludes, from the three-dimensional point group read by the reading unit 40, the three-dimensional points that can be regarded as points on the surfaces of the utility poles 10 1 to 10 3.
 Specifically, among the plurality of three-dimensional points 14 lined up in the vertical direction (the Z-axis direction in the figure) intersecting the horizontal plane (corresponding to the XY plane in the figure), when the vertical length of a partial point group consisting of three-dimensional points 14 whose spacing is equal to or less than a predetermined interval is equal to or greater than a predetermined length, the exclusion unit 50 excludes the three-dimensional points included in that partial point group from the points to be projected onto the horizontal plane.
 The processing performed by the exclusion unit 50 will be described in detail with reference to FIGS. 6 and 7. First, as shown in FIG. 6, from the acquired group of three-dimensional points 14, the exclusion unit 50 groups the three-dimensional points 14 lined up in the vertical direction (the Z-axis direction in FIG. 6) whose spacing is equal to or less than the predetermined interval, forming partial point groups. In this embodiment, the scan lines of the scanner 22 extend in the vertical direction, so this processing corresponds to grouping the three-dimensional points 14 according to the scan lines. In the example shown in FIG. 6, two partial point groups 60 are formed from the three-dimensional points 14 corresponding to the utility pole 10 1, and two partial point groups 60 are formed from the three-dimensional points 14 corresponding to the utility pole 10 2. In addition, seven partial point groups 60 are formed from the three-dimensional points 14 corresponding to each of the cables 12 1, 12 2, and 12 3, and four partial point groups 60 are formed from the three-dimensional points 14 corresponding to the cable 12 4.
 Next, the exclusion unit 50 detects, for each partial point group 60, whether the vertical length 62 of the partial point group is equal to or greater than the predetermined length. In general, the vertical length 62 of a utility pole 10 is longer than that of a cable 12, so a length serving as a threshold for distinguishing the cables 12 from the utility poles 10 is determined in advance as the predetermined length. When the vertical length 62 of a partial point group 60 is equal to or greater than the predetermined length, the three-dimensional points 14 included in that partial point group 60 can be regarded as three-dimensional points 14 corresponding to a utility pole 10; the exclusion unit 50 therefore excludes these three-dimensional points 14 from the group of three-dimensional points 14 that are candidates for the cables 12 (hereinafter, the candidate point group), and they are not used in subsequent processing. In other words, when the vertical length 62 of a partial point group 60 is less than the predetermined length, the three-dimensional points 14 included in that partial point group 60 are regarded as three-dimensional points 14 corresponding to a cable 12, and are used as the candidate point group in subsequent processing. The predetermined length used as the threshold here is set in the parameter setting unit 42.
 In this way, by excluding the three-dimensional points 14 included in partial point groups 60 whose vertical length 62 is equal to or greater than the predetermined length, three-dimensional points 14 on surfaces such as walls and the ground, and not only on the utility poles 10, can be excluded from the candidate point group. This reduces the processing load of the subsequent processing. When excluding three-dimensional points 14 on surfaces such as walls and the ground from the candidate point group, the above-mentioned predetermined length may be set as appropriate.
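A minimal sketch of this vertical-extent filter, assuming the points of one scan line are given as an (N, 3) array; the function name and the threshold values are illustrative only, not values from the patent.

```python
import numpy as np

def exclude_tall_groups(points, gap=0.05, max_height=2.0):
    """Split the vertically adjacent points of one scan line into partial
    point groups and drop every group whose vertical extent suggests a
    pole, wall, or the ground rather than a cable.

    points     : (N, 3) array of (X, Y, Z) coordinates.
    gap        : maximum vertical spacing (m) within one partial group.
    max_height : groups at least this tall (m) are excluded.
    Returns the surviving candidate points as an (M, 3) array.
    """
    pts = points[np.argsort(points[:, 2])]          # order by height Z
    # a new partial group starts wherever the vertical gap is exceeded
    breaks = np.where(np.diff(pts[:, 2]) > gap)[0] + 1
    groups = np.split(pts, breaks)
    kept = [g for g in groups if np.ptp(g[:, 2]) < max_height]
    return np.vstack(kept) if kept else np.empty((0, 3))
```

A long, densely sampled column of points (a pole) forms one tall group and is removed as a whole, while the short cross-sections left by cables survive as candidates.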
 The exclusion unit 50 further excludes from the candidate point group the three-dimensional points 14 that exist below an arbitrary height. As described above, the cables 12 are the detection target in this embodiment, and the cables 12 are laid at relatively high positions (about 5 m or higher). Therefore, the exclusion unit 50 excludes from the candidate point group the three-dimensional points 14 at low positions, such as near the ground. Specifically, the exclusion unit 50 excludes from the candidate point group the three-dimensional points 14 whose Z-coordinate value is less than a threshold. This reduces the processing load of the subsequent processing and shortens the processing time. Note that, depending on the position at which the scanner 22 is installed, the height in real space differs even for three-dimensional points 14 with the same Z-coordinate value. For example, depending on whether the scanner 22 is placed near the ground surface or installed on a tripod, a three-dimensional point 14 with a Z-coordinate value of 0 will have a different height in real space. Therefore, the arbitrary height used here changes depending on the vertical position, that is, the height, at which the scanner 22 is installed.
　さらに、除外部50は、図7に示すように、部分点群60間の距離が任意の距離以下である候補点群を同一のグループにまとめてグループ化する。図7に示した例では、除外部50は、ケーブル12₁~12₃に対応する部分点群60を含むグループ64₁と、ケーブル12₄に対応する部分点群60を含むグループ64₂とにグループ化する。 Furthermore, as shown in FIG. 7, the exclusion unit 50 groups candidate point groups in which the distance between partial point groups 60 is less than or equal to an arbitrary distance into the same group. In the example shown in FIG. 7, the exclusion unit 50 forms a group 64₁ including the partial point groups 60 corresponding to the cables 12₁ to 12₃, and a group 64₂ including the partial point group 60 corresponding to the cable 12₄.
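For illustration only (this sketch is not part of the original disclosure), the exclusion steps described above (grouping vertically aligned points into partial point groups, discarding groups with a large vertical extent, and discarding low points) can be outlined in Python roughly as follows. All function names and numeric thresholds (`xy_eps`, `z_gap`, `max_len`, `min_z`) are hypothetical.

```python
import numpy as np

def group_vertical(points, xy_eps=0.1, z_gap=0.5):
    """Group points sharing roughly the same XY position into vertical
    partial point groups, splitting where the vertical gap exceeds z_gap.
    Parameter names and values are illustrative only."""
    groups = []
    used = np.zeros(len(points), dtype=bool)
    for i in range(len(points)):
        if used[i]:
            continue
        # points whose XY distance to point i is within xy_eps form one column
        d_xy = np.hypot(points[:, 0] - points[i, 0], points[:, 1] - points[i, 1])
        col = np.where((d_xy <= xy_eps) & ~used)[0]
        used[col] = True
        # sort the column by height and split at large vertical gaps
        col = col[np.argsort(points[col, 2])]
        start = 0
        for k in range(1, len(col)):
            if points[col[k], 2] - points[col[k - 1], 2] > z_gap:
                groups.append(col[start:k])
                start = k
        groups.append(col[start:])
    return groups

def exclude_partial_groups(points, groups, max_len=2.0, min_z=5.0):
    """Drop groups whose vertical extent reaches max_len (poles, walls, ground)
    and points below min_z (cables hang at roughly 5 m or higher)."""
    keep = []
    for g in groups:
        z = points[g, 2]
        if z.max() - z.min() >= max_len:
            continue  # likely a pole/wall surface, not a cable
        g = g[points[g, 2] >= min_z]
        if len(g):
            keep.append(g)
    return keep
```

A vertical column of points spanning several meters (a pole) is discarded, while an isolated high point (a cable sample) survives into the candidate point group.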
　除外部50は、グループ64にグループ化された候補点群の情報を、直線検出部52に出力する。 The exclusion unit 50 outputs the information of the candidate point groups grouped into groups 64 to the straight line detection unit 52.
　直線検出部52は、候補点群を、水平面(図のXY平面に相当)上に投影して投影点群とし、投影点群の水平面における2次元座標に基づいて直線を検出し、検出した直線から所定の距離内の投影点を投影点群から抽出し候補点群とする。 The straight line detection unit 52 projects the candidate point group onto a horizontal plane (corresponding to the XY plane in the figure) to obtain a projected point group, detects a straight line based on the two-dimensional coordinates of the projected point group in the horizontal plane, and extracts, from the projected point group, the projected points within a predetermined distance from the detected straight line as a candidate point group.
　詳細には、図8に示すように直線検出部52は、候補点群を、水平面(図8のXY平面)上に投影することで、3次元点14を投影点15に変換する。従って、投影点15は、XY平面に存在する点であり、Z軸座標の値が0となっている。さらに、直線検出部52は、グループ64単位で、投影点15群について直線検出を行う。なお、直線検出部52が投影点15群から直線検出を行う方法は特に限定されず、例えば、公知のHough変換等を用いればよい。図8に示した例では、グループ64₁(図7参照)に含まれる3次元点14群に対応する投影点15群から、直線66₁と、直線66₂とが検出される。これにより、分岐ケーブル12が分離される。具体的には、ケーブル12₁及びケーブル12₂と、ケーブル12₃とに分離される。なお、ケーブル12₁及びケーブル12₂のように、同じ方向に敷設されているが、高さ(Z軸方向の位置)が異なるケーブル12は、本処理では分離されない。 Specifically, as shown in FIG. 8, the straight line detection unit 52 converts the three-dimensional points 14 into projected points 15 by projecting the candidate point group onto a horizontal plane (the XY plane in FIG. 8). The projected points 15 are therefore points on the XY plane, with a Z-axis coordinate value of 0. Furthermore, the straight line detection unit 52 performs straight line detection on the group of projected points 15 for each group 64. The method by which the straight line detection unit 52 detects straight lines from the group of projected points 15 is not particularly limited; for example, the known Hough transform may be used. In the example shown in FIG. 8, a straight line 66₁ and a straight line 66₂ are detected from the group of projected points 15 corresponding to the group of three-dimensional points 14 included in group 64₁ (see FIG. 7). As a result, the branching cables 12 are separated: specifically, the cables 12₁ and 12₂ are separated from the cable 12₃. Note that cables 12 that are laid in the same direction but differ in height (position in the Z-axis direction), such as the cables 12₁ and 12₂, are not separated by this process.
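The projection and Hough-style line detection described above can be sketched as follows. This is an illustrative sketch, not the disclosed implementation: it implements a minimal point-based Hough voting scheme (each projected point votes for all lines `x*cos(theta) + y*sin(theta) = rho` passing through it), and the parameters `n_theta`, `rho_res`, and `min_votes` are hypothetical.

```python
import numpy as np

def project_to_xy(points_3d):
    """Project 3-D candidate points onto the horizontal (XY) plane."""
    return points_3d[:, :2]

def hough_lines_from_points(points_xy, n_theta=180, rho_res=0.1, min_votes=10):
    """Minimal Hough voting over 2-D projected points.
    Returns (theta, rho, votes) tuples for sufficiently supported lines."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    # rho for every (point, theta) pair: shape (N, n_theta)
    rhos = points_xy @ np.stack([np.cos(thetas), np.sin(thetas)])
    rho_idx = np.round(rhos / rho_res).astype(int)
    lines = []
    for t in range(n_theta):
        vals, counts = np.unique(rho_idx[:, t], return_counts=True)
        for v, c in zip(vals, counts):
            if c >= min_votes:
                lines.append((thetas[t], v * rho_res, int(c)))
    return lines
```

For points sampled along a cable, the accumulator peaks at the (theta, rho) of the cable's route, independently of the points' original heights.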
　次に、直線検出部52は、検出した直線(図8では、直線66₁、66₂)のうち、任意の長さ以下のものは除外する。候補点群には、ケーブル12に対応する3次元点14以外として、例えば、樹木等に対応する3次元点14が含まれる場合がある。この場合、樹木等に対応する投影点15に応じて直線66が検出される。樹木等は、ケーブル12に比べて、比較的長さが短いことが多い。そのため、樹木等に対応する投影点15に応じた直線66の方が、ケーブル12に対応する投影点15に応じた直線66(図8では、直線66₁、66₂)よりも短い。そこで、任意の長さよりも短い直線66については除外し、その直線66に含まれる投影点15については候補点群から除外する。 Next, the straight line detection unit 52 excludes, from the detected straight lines (the straight lines 66₁ and 66₂ in FIG. 8), those whose length is less than or equal to an arbitrary length. Besides the three-dimensional points 14 corresponding to the cables 12, the candidate point group may also include, for example, three-dimensional points 14 corresponding to trees and the like. In this case, a straight line 66 is detected from the projected points 15 corresponding to the tree or the like. Trees and the like are often relatively short compared with the cables 12, so a straight line 66 corresponding to the projected points 15 of a tree or the like is shorter than a straight line 66 corresponding to the projected points 15 of a cable 12 (the straight lines 66₁ and 66₂ in FIG. 8). Therefore, straight lines 66 shorter than the arbitrary length are excluded, and the projected points 15 belonging to those straight lines 66 are excluded from the candidate point group.
　また、直線検出部52は、図9に示すように、検出した直線66から所定の距離内の投影点15を投影点15群から抽出する。換言すると、直線検出部52は、直線66を中心とした幅Hの領域内に存在する投影点15を抽出する。これにより、各直線66に応じた投影点15のグループ化がなされる。本処理により、ノイズとして存在する投影点15や、ケーブル12に隣接する樹木や電柱10等の他の構造物に対応する投影点15を候補投影点群から除外することができる。 Further, as shown in FIG. 9, the straight line detection unit 52 extracts, from the group of projected points 15, the projected points 15 within a predetermined distance from the detected straight line 66. In other words, the straight line detection unit 52 extracts the projected points 15 existing within a region of width H centered on the straight line 66. As a result, the projected points 15 are grouped according to each straight line 66. Through this processing, projected points 15 existing as noise, and projected points 15 corresponding to other structures adjacent to the cables 12, such as trees and utility poles 10, can be excluded from the group of candidate projected points.
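The band-of-width-H extraction above reduces to a point-to-line distance test. A minimal sketch (not part of the original disclosure; the function name and `half_width` value are hypothetical), using the same `x*cos(theta) + y*sin(theta) = rho` line parameterization:

```python
import numpy as np

def points_near_line(points_xy, theta, rho, half_width=0.5):
    """Return indices of projected points within half_width of the line
    x*cos(theta) + y*sin(theta) = rho, i.e. inside a band of width H = 2*half_width."""
    normal = np.array([np.cos(theta), np.sin(theta)])
    dist = np.abs(points_xy @ normal - rho)  # signed distance magnitude to the line
    return np.where(dist <= half_width)[0]
```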
　直線検出部52は、抽出した投影点15からなる投影点15群を、候補点群として抽出部54に出力する。 The straight line detection unit 52 outputs the group consisting of the extracted projected points 15 to the extraction unit 54 as a candidate point group.
　抽出部54は、抽出された投影点15群に対応する3次元点群に含まれる各3次元点14間を結んだ線分を複数、算出し、線分と水平面(図のXY平面に相当)とのなす角度が所定の角度以下である線分の3次元点14の集合を、ケーブル12を構成する3次元点14群として抽出する。 The extraction unit 54 calculates a plurality of line segments, each connecting two of the three-dimensional points 14 included in the three-dimensional point group corresponding to the extracted group of projected points 15, and extracts, as the group of three-dimensional points 14 constituting a cable 12, the set of three-dimensional points 14 of the line segments whose angle with the horizontal plane (corresponding to the XY plane in the figure) is less than or equal to a predetermined angle.
　直線66₁に応じたグループに含まれる投影点15群を例として挙げて、抽出部54の動作について具体的に説明する。直線66₁に応じたグループに含まれる投影点15群には、ケーブル12₁に対応する複数の投影点15と、ケーブル12₂に対応する複数の投影点15が含まれる。図10に示すように、抽出部54は、抽出された投影点15群に対応する3次元点14群が属する部分点群60を検出し、検出した部分点群60に含まれる各3次元点14の座標を、直線66₁がX軸と平行になるように座標変換する。 The operation of the extraction unit 54 will be specifically described using, as an example, the group of projected points 15 included in the group corresponding to the straight line 66₁. The group of projected points 15 included in the group corresponding to the straight line 66₁ includes a plurality of projected points 15 corresponding to the cable 12₁ and a plurality of projected points 15 corresponding to the cable 12₂. As shown in FIG. 10, the extraction unit 54 detects the partial point groups 60 to which the group of three-dimensional points 14 corresponding to the extracted group of projected points 15 belongs, and transforms the coordinates of each three-dimensional point 14 included in the detected partial point groups 60 so that the straight line 66₁ becomes parallel to the X axis.
　また抽出部54は、座標を変換した各3次元点14からなる3次元点群に含まれる各3次元点14間を結んだ線分を複数、算出する。さらに、抽出部54は、算出した線分とX軸とのなす角度、換言すると線分と水平面(XY平面に相当)とのなす角度を算出する。そして抽出部54は、算出した角度が、所定の角度以下である線分の3次元点14群を接続する。 The extraction unit 54 also calculates a plurality of line segments connecting the three-dimensional points 14 included in the three-dimensional point group consisting of the coordinate-transformed three-dimensional points 14. Further, the extraction unit 54 calculates the angle between each calculated line segment and the X axis, in other words, the angle between the line segment and the horizontal plane (corresponding to the XY plane). The extraction unit 54 then connects the three-dimensional points 14 of the line segments whose calculated angle is less than or equal to a predetermined angle.
　換言すると、抽出部54は、各3次元点14を結んで得られる線分が、直線66₁(図9参照)に平行であるか否かを判定している。 In other words, the extraction unit 54 determines whether a line segment obtained by connecting three-dimensional points 14 is parallel to the straight line 66₁ (see FIG. 9).
　例えば、図11Aに示した例では、3次元点14₁と3次元点14₂とを結んだ線分67₁と、X軸とのなす角度は所定の角度以下である。3次元点14₁と3次元点14₃とを結んだ線分67₂と、X軸とのなす角度は所定の角度以下である。そのため、3次元点14₁と3次元点14₂及び3次元点14₃とは接続される。一方、3次元点14₁と3次元点14₁₁とを結んだ線分67₃と、X軸とのなす角度は所定の角度を超える。また、3次元点14₁と3次元点14₁₂とを結んだ線分67₄と、X軸とのなす角度は所定の角度を超える。そのため、3次元点14₁と3次元点14₁₁及び3次元点14₁₂とは接続されない。 For example, in the example shown in FIG. 11A, the angle between the X axis and the line segment 67₁ connecting the three-dimensional point 14₁ and the three-dimensional point 14₂ is less than or equal to the predetermined angle. The angle between the X axis and the line segment 67₂ connecting the three-dimensional point 14₁ and the three-dimensional point 14₃ is also less than or equal to the predetermined angle. Therefore, the three-dimensional point 14₁ is connected to the three-dimensional points 14₂ and 14₃. On the other hand, the angle between the X axis and the line segment 67₃ connecting the three-dimensional point 14₁ and the three-dimensional point 14₁₁ exceeds the predetermined angle. The angle between the X axis and the line segment 67₄ connecting the three-dimensional point 14₁ and the three-dimensional point 14₁₂ also exceeds the predetermined angle. Therefore, the three-dimensional point 14₁ is not connected to the three-dimensional points 14₁₁ and 14₁₂.
　また例えば、図11Bに示した例では、3次元点14₃と3次元点14₄とを結んだ線分67₅と、X軸とのなす角度は所定の角度以下である。3次元点14₃と3次元点14₅とを結んだ線分67₆と、X軸とのなす角度は所定の角度以下である。3次元点14₃と3次元点14₆とを結んだ線分67₇と、X軸とのなす角度は所定の角度以下である。そのため、3次元点14₃と3次元点14₄、3次元点14₅、及び3次元点14₆とは接続される。一方、3次元点14₃と3次元点14₁₁とを結んだ線分67₈と、X軸とのなす角度は所定の角度を超える。また、3次元点14₃と3次元点14₁₂とを結んだ線分67₉と、X軸とのなす角度は所定の角度を超える。そのため、3次元点14₃と3次元点14₁₁及び3次元点14₁₂とは接続されない。 As another example, in the example shown in FIG. 11B, the angle between the X axis and the line segment 67₅ connecting the three-dimensional point 14₃ and the three-dimensional point 14₄ is less than or equal to the predetermined angle. The angle between the X axis and the line segment 67₆ connecting the three-dimensional point 14₃ and the three-dimensional point 14₅ is less than or equal to the predetermined angle. The angle between the X axis and the line segment 67₇ connecting the three-dimensional point 14₃ and the three-dimensional point 14₆ is also less than or equal to the predetermined angle. Therefore, the three-dimensional point 14₃ is connected to the three-dimensional points 14₄, 14₅, and 14₆. On the other hand, the angle between the X axis and the line segment 67₈ connecting the three-dimensional point 14₃ and the three-dimensional point 14₁₁ exceeds the predetermined angle. The angle between the X axis and the line segment 67₉ connecting the three-dimensional point 14₃ and the three-dimensional point 14₁₂ also exceeds the predetermined angle. Therefore, the three-dimensional point 14₃ is not connected to the three-dimensional points 14₁₁ and 14₁₂.
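The pairwise connection rule illustrated above amounts to building connected components over segments whose angle with the horizontal plane stays below a threshold. A sketch under stated assumptions (not the disclosed implementation; the union-find structure, the neighborhood radius `max_dist`, and the angle threshold `max_angle_deg` are all illustrative choices):

```python
import numpy as np

def connect_by_angle(points, max_angle_deg=20.0, max_dist=2.0):
    """Connect 3-D points (already rotated so the detected line is parallel
    to the X axis) whenever the segment joining two nearby points makes an
    angle with the horizontal plane no greater than max_angle_deg.
    Connected components are the candidate linear-object sets."""
    n = len(points)
    parent = list(range(n))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path compression
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for i in range(n):
        for j in range(i + 1, n):
            d = points[j] - points[i]
            horiz = np.hypot(d[0], d[1])
            if np.hypot(horiz, d[2]) > max_dist:
                continue  # only consider nearby point pairs
            angle = np.degrees(np.arctan2(abs(d[2]), horiz))
            if angle <= max_angle_deg:
                union(i, j)

    comps = {}
    for i in range(n):
        comps.setdefault(find(i), []).append(i)
    return list(comps.values())
```

Two cables running in the same direction at different heights end up in different components, because the near-vertical segments between them exceed the angle threshold.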
　さらに、抽出部54は、上記の処理により接続された3次元点14群を、検出対象のケーブル12(線状物)に対応する3次元点14の集合(以下、線状物集合という)として検出する。図12に示した例では、線状物集合68Aと、線状物集合68Bとを検出する。線状物集合68Aに含まれる3次元点14群が、ケーブル12₁に対応する点群であり、線状物集合68Bに含まれる3次元点14群がケーブル12₂に対応する点群である。 Furthermore, the extraction unit 54 detects the groups of three-dimensional points 14 connected by the above processing as sets of three-dimensional points 14 corresponding to the cables 12 (linear objects) to be detected (hereinafter referred to as linear object sets). In the example shown in FIG. 12, a linear object set 68A and a linear object set 68B are detected. The group of three-dimensional points 14 included in the linear object set 68A is the point group corresponding to the cable 12₁, and the group of three-dimensional points 14 included in the linear object set 68B is the point group corresponding to the cable 12₂.
　このように、抽出部54によれば、同じ方向に向いているが高さが異なる、ケーブル12に対応する3次元点14群が分離される。 In this way, the extraction unit 54 separates the groups of three-dimensional points 14 corresponding to cables 12 that run in the same direction but differ in height.
　抽出部54は、抽出した線状物集合68毎に、線状物集合68に含まれる3次元点14を、ケーブル12を構成する3次元点14群として抽出し、線状物モデル生成部56に出力する。 For each extracted linear object set 68, the extraction unit 54 extracts the three-dimensional points 14 included in the linear object set 68 as the group of three-dimensional points 14 constituting a cable 12, and outputs them to the linear object model generation unit 56.
　線状物モデル生成部56は、抽出部54が抽出した3次元点14群に対して主成分分析を行い、第1主成分軸を導出し、導出した第1主成分軸に対して直交する平面(YZ平面に相当)と、3次元点14同士を接続する線分との交点を用いて、ケーブル12を表す線状物モデルを生成する。 The linear object model generation unit 56 performs principal component analysis on the group of three-dimensional points 14 extracted by the extraction unit 54 to derive a first principal component axis, and generates a linear object model representing the cable 12 using the intersections between planes orthogonal to the derived first principal component axis (corresponding to YZ planes) and the line segments connecting the three-dimensional points 14.
 具体的には、線状物モデル生成部56は、図13に示すように、線状物集合68毎に、主成分分析を行い、第1主成分軸69を算出する。そして、線状物モデル生成部56は、算出した第1主成分軸69の各々が、X軸と平行になるように、座標変換を行う。 Specifically, as shown in FIG. 13, the linear object model generation unit 56 performs principal component analysis for each linear object set 68 to calculate a first principal component axis 69. Then, the linear object model generation unit 56 performs coordinate transformation so that each of the calculated first principal component axes 69 becomes parallel to the X axis.
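The first-principal-component derivation and the rotation that aligns it with the X axis can be sketched as follows. This is an illustrative sketch only: it takes the first right-singular vector of the centered point set as the first principal component axis and, assuming the cable runs roughly horizontally, rotates about the Z axis so that the axis's horizontal direction coincides with the X axis.

```python
import numpy as np

def rotate_to_x_axis(points):
    """Derive the first principal component axis of a 3-D point set and rotate
    the points about the Z axis so that this axis becomes parallel to X.
    Returns the rotated points and the derived axis direction."""
    centered = points - points.mean(axis=0)
    # first right-singular vector = direction of largest variance (PCA axis 1)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    axis = vt[0]
    yaw = np.arctan2(axis[1], axis[0])          # horizontal heading of the axis
    c, s = np.cos(-yaw), np.sin(-yaw)           # rotate by -yaw about Z
    rz = np.array([[c, -s, 0.0],
                   [s,  c, 0.0],
                   [0.0, 0.0, 1.0]])
    return points @ rz.T, axis
```

After the rotation, the points of one linear object set vary mainly along X, so the YZ cross-section planes of the next step can be placed at regular X intervals.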
　また、線状物モデル生成部56は、図14に示すように、X軸方向に一定の間隔離れた複数のYZ平面70を設定する。さらに、図15に示すように、線状物モデル生成部56は、3次元点14同士を結んだ線分と、YZ平面70との交点80を導出する。図15に示した例では、線状物モデル生成部56は、YZ平面70₁については、3次元点14₁と3次元点14₂とを結んだ線分との交点80₁、3次元点14₁と3次元点14₃とを結んだ線分との交点80₂、3次元点14₅と3次元点14₂とを結んだ線分との交点80₃、及び3次元点14₅と3次元点14₃とを結んだ線分との交点80₄の座標を導出する。 Furthermore, as shown in FIG. 14, the linear object model generation unit 56 sets a plurality of YZ planes 70 spaced at a constant interval in the X-axis direction. As shown in FIG. 15, the linear object model generation unit 56 then derives the intersection points 80 between the YZ planes 70 and the line segments connecting the three-dimensional points 14. In the example shown in FIG. 15, for the YZ plane 70₁, the linear object model generation unit 56 derives the coordinates of the intersection point 80₁ with the line segment connecting the three-dimensional points 14₁ and 14₂, the intersection point 80₂ with the line segment connecting the three-dimensional points 14₁ and 14₃, the intersection point 80₃ with the line segment connecting the three-dimensional points 14₅ and 14₂, and the intersection point 80₄ with the line segment connecting the three-dimensional points 14₅ and 14₃.
　なお、線状物モデル生成部56は、1つのYZ平面70に対し、複数の交点80を導出した場合、図15に示すように各交点80の座標を用いて円近似し、その円の中心と半径を導出する。本処理により、ケーブル12の3次元座標が得られる。 Note that when the linear object model generation unit 56 derives a plurality of intersection points 80 for one YZ plane 70, it approximates a circle using the coordinates of the intersection points 80 as shown in FIG. 15, and derives the center and radius of the circle. Through this processing, the three-dimensional coordinates of the cable 12 are obtained.
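The two geometric steps above, intersecting a segment with a plane x = x0 and fitting a circle to the resulting intersection points, can be sketched as follows. This is illustrative only; the circle approximation here uses an algebraic (Kasa) least-squares fit, which is one possible choice rather than the method the disclosure prescribes.

```python
import numpy as np

def segment_yz_intersection(p, q, x0):
    """Intersection of segment p-q (3-D points) with the plane x = x0,
    or None if the segment does not cross the plane."""
    if q[0] == p[0]:
        return None  # segment parallel to the plane
    if (p[0] - x0) * (q[0] - x0) > 0:
        return None  # both endpoints on the same side: no crossing
    t = (x0 - p[0]) / (q[0] - p[0])
    return p + t * (q - p)

def fit_circle(pts_yz):
    """Algebraic (Kasa) least-squares circle fit in the YZ plane.
    Solves 2*cy*y + 2*cz*z + c = y^2 + z^2; returns (cy, cz, radius)."""
    y, z = pts_yz[:, 0], pts_yz[:, 1]
    A = np.column_stack([2 * y, 2 * z, np.ones(len(y))])
    b = y ** 2 + z ** 2
    (cy, cz, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return cy, cz, np.sqrt(c + cy ** 2 + cz ** 2)
```

The fitted center gives one 3-D sample of the cable's centerline at x = x0, and the radius approximates the cable thickness at that cross-section.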
　さらに、線状物モデル生成部56は、線状物集合68毎に、導出した交点80をX座標の値の昇順につなげることで、ケーブル12を表す線状物モデル36を生成する。 Furthermore, for each linear object set 68, the linear object model generation unit 56 generates a linear object model 36 representing the cable 12 by connecting the derived intersection points 80 in ascending order of their X-coordinate values.
 保存制御部46は、生成した線状物モデル36をストレージ34に格納する。また、表示制御部48は、線状物モデル36を表示部37に表示させる。 The storage control unit 46 stores the generated linear object model 36 in the storage 34. Further, the display control unit 48 causes the linear object model 36 to be displayed on the display unit 37.
 次に線状物検出装置30の作用について説明する。 Next, the operation of the linear object detection device 30 will be explained.
図16には、本実施形態の線状物検出装置30により実行される線状物検出処理の一例のフローチャートが示されている。線状物検出装置30は、ストレージ34に記憶されている線状物検出プログラム35を実行することにより、図16に示した線状物検出処理を実行する。なお、図16に示した線状物検出処理は、ユーザからの実行指示を受け付けたタイミング等、所定のタイミングで実行される。 FIG. 16 shows a flowchart of an example of the linear object detection process executed by the linear object detection device 30 of this embodiment. The linear object detection device 30 executes the linear object detection process shown in FIG. 16 by executing the linear object detection program 35 stored in the storage 34. Note that the linear object detection process shown in FIG. 16 is executed at a predetermined timing, such as the timing at which an execution instruction from the user is received.
 図16のステップS100で読込部40は、上述したように、点群測定器20の記憶媒体24に記憶されている点群データを、ネットワーク9を介して読み込む。 In step S100 of FIG. 16, the reading unit 40 reads the point cloud data stored in the storage medium 24 of the point cloud measuring device 20 via the network 9, as described above.
　次のステップS102で除外部50は、上述したように、鉛直方向に並んだ複数の3次元点14を、3次元点14同士の間隔が所定の間隔以下の3次元点14からなる部分点群にグループ化する(図6参照)。 In the next step S102, the exclusion unit 50 groups, as described above, the plurality of three-dimensional points 14 arranged in the vertical direction into partial point groups each consisting of three-dimensional points 14 whose mutual spacing is less than or equal to a predetermined spacing (see FIG. 6).
　次のステップS104で除外部50は、上述したように、鉛直方向の長さ62が所定の長さ以上となる部分点群に含まれる3次元点14を候補点群から除外する(図6、図7参照)。 In the next step S104, the exclusion unit 50 excludes, as described above, the three-dimensional points 14 included in partial point groups whose vertical length 62 is greater than or equal to the predetermined length from the candidate point group (see FIGS. 6 and 7).
 次のステップS106で除外部50は、上述したように、任意の高さ以下に存在する3次元点14を候補点群から除外する。 In the next step S106, the exclusion unit 50 excludes the three-dimensional points 14 existing below an arbitrary height from the candidate point group, as described above.
 次のステップS108で除外部50は、上述したように、部分点群60間の距離が任意の距離以下である候補点群を同一のグループ64にグループ化する(図7参照)。 In the next step S108, the exclusion unit 50 groups candidate point groups in which the distance between the partial point groups 60 is an arbitrary distance or less into the same group 64, as described above (see FIG. 7).
 次のステップS110で直線検出部52は、上述したように、候補点群を水平面上に投影し、グループ64単位で投影点15について直線66を検出する(図8参照)。 In the next step S110, the straight line detection unit 52 projects the candidate point group onto the horizontal plane, as described above, and detects straight lines 66 for the projected points 15 in units of groups 64 (see FIG. 8).
 次のステップS112で直線検出部52は、上述したように、検出した直線66のうち、任意の長さ以下のものを除外する。 In the next step S112, the straight line detection unit 52 excludes, from among the detected straight lines 66, those whose length is less than an arbitrary length, as described above.
 次のステップS114で直線検出部52は、上述したように、検出した直線66から所定の距離内の投影点15を抽出する(図9参照)。 In the next step S114, the straight line detection unit 52 extracts the projection points 15 within a predetermined distance from the detected straight line 66, as described above (see FIG. 9).
 次のステップS116で抽出部54は、上述したように、抽出された投影点15群に対応する3次元点14群が属する部分点群60を検出する。 In the next step S116, the extraction unit 54 detects the partial point group 60 to which the three-dimensional point group 14 corresponding to the extracted projection point group 15 belongs, as described above.
　次のステップS118で抽出部54は、上述したように、検出した部分点群60に含まれる3次元点14群の座標を、直線66がX軸と平行になるように座標変換する(図10参照)。 In the next step S118, the extraction unit 54 transforms, as described above, the coordinates of the group of three-dimensional points 14 included in the detected partial point groups 60 so that the straight line 66 becomes parallel to the X axis (see FIG. 10).
　次のステップS120で抽出部54は、上述したように、座標変換した各3次元点14からなる3次元点14群に含まれる各3次元点14間を結んだ線分67とX軸とのなす角度を算出する(図11A、図11B参照)。 In the next step S120, the extraction unit 54 calculates, as described above, the angle between the X axis and each line segment 67 connecting three-dimensional points 14 included in the coordinate-transformed group of three-dimensional points 14 (see FIGS. 11A and 11B).
 次のステップS122で抽出部54は、上述したように、X軸に対する角度が所定の角度以下の線分67の3次元点14を接続する(図11A、図11B参照)。 In the next step S122, the extraction unit 54 connects the three-dimensional points 14 of the line segments 67 whose angles with respect to the X-axis are equal to or less than a predetermined angle, as described above (see FIGS. 11A and 11B).
 次のステップS124で抽出部54は、上述したように、接続された3次元点14群を、線状物集合68として検出する。図12に示すように、検出対象の線状物(本実施形態ではケーブル12)毎に、線状物集合68が検出される。 In the next step S124, the extraction unit 54 detects the group of connected three-dimensional points 14 as the linear object set 68, as described above. As shown in FIG. 12, a linear object set 68 is detected for each linear object to be detected (cable 12 in this embodiment).
 次のステップS126で線状物モデル生成部56は、上述したように、線状物集合68毎に、主成分分析し、第1主成分軸69を導出する(図13参照)。 In the next step S126, the linear object model generation unit 56 performs principal component analysis for each linear object set 68, as described above, and derives the first principal component axis 69 (see FIG. 13).
 次のステップS128で線状物モデル生成部56は、上述したように、第1主成分軸69をX軸と平行になるように座標変換する(図13参照)。 In the next step S128, the linear object model generation unit 56 transforms the coordinates of the first principal component axis 69 so that it becomes parallel to the X axis, as described above (see FIG. 13).
　次のステップS130で線状物モデル生成部56は、上述したように、一定間隔で設けられたYZ平面70を設定し、YZ平面70と3次元点14同士を結んだ線分との交点80を算出する(図14、図15参照)。 In the next step S130, the linear object model generation unit 56 sets, as described above, the YZ planes 70 provided at regular intervals, and calculates the intersection points 80 between the YZ planes 70 and the line segments connecting the three-dimensional points 14 (see FIGS. 14 and 15).
 次のステップS132で線状物モデル生成部56は、上述したように、X座標の昇順に、交点80を接続することで、線状物モデル36を生成する。 In the next step S132, the linear object model generation unit 56 generates the linear object model 36 by connecting the intersection points 80 in ascending order of the X coordinate, as described above.
 次のステップS134で保存制御部46は、上述したように、線状物モデル36をストレージ34に格納する。 In the next step S134, the storage control unit 46 stores the linear object model 36 in the storage 34, as described above.
　次のステップS136で表示制御部48は、上述したように、線状物モデル36を表示部37に表示させる。ステップS136の処理が終了すると、図16に示した線状物検出処理が終了する。図17には、本実施形態の線状物検出装置30によって3次元点14群から生成された線状物モデル36の一例を示す。図17から、本実施形態の線状物検出装置30によれば、線状物を精度良く検出できることがわかる。 In the next step S136, the display control unit 48 causes the linear object model 36 to be displayed on the display unit 37, as described above. When the processing of step S136 ends, the linear object detection processing shown in FIG. 16 ends. FIG. 17 shows an example of a linear object model 36 generated from a group of three-dimensional points 14 by the linear object detection device 30 of this embodiment. FIG. 17 shows that the linear object detection device 30 of this embodiment can detect linear objects with high accuracy.
　以上説明したように、本実施形態の線状物検出装置30は、構造物の表面上の点における3次元座標を表す3次元点14群を水平面上に投影して投影点15群とし、投影点15群の水平面における2次元座標に基づいて直線66を検出し、検出した直線66から所定の距離内の投影点15を投影点15群から抽出する。また、線状物検出装置30は、抽出された投影点15群に対応する3次元点14群に含まれる3次元点14間を結んだ線分67を複数、算出し、水平面とのなす角度が所定の角度以下である線分67の3次元点14の集合を、線状物集合68として検出し、検出した線状物集合68に含まれる3次元点14群を、線状物を構成する3次元点14群として抽出する。 As described above, the linear object detection device 30 of this embodiment projects a group of three-dimensional points 14 representing the three-dimensional coordinates of points on the surface of a structure onto a horizontal plane to obtain a group of projected points 15, detects a straight line 66 based on the two-dimensional coordinates of the group of projected points 15 in the horizontal plane, and extracts, from the group of projected points 15, the projected points 15 within a predetermined distance from the detected straight line 66. The linear object detection device 30 also calculates a plurality of line segments 67 connecting the three-dimensional points 14 included in the group of three-dimensional points 14 corresponding to the extracted group of projected points 15, detects the set of three-dimensional points 14 of the line segments 67 whose angle with the horizontal plane is less than or equal to a predetermined angle as a linear object set 68, and extracts the group of three-dimensional points 14 included in the detected linear object set 68 as the group of three-dimensional points 14 constituting a linear object.
　このように、本実施形態の線状物検出装置30では、3次元の座標を有する3次元点14を水平面(XY平面)に投影した投影点15に基づいて、線状物の検出を行うため、3次元座標を表す3次元点群が高密度であっても、線状物を精度良く検出することができる。 In this way, the linear object detection device 30 of this embodiment detects linear objects based on the projected points 15 obtained by projecting the three-dimensional points 14, which have three-dimensional coordinates, onto the horizontal plane (XY plane). Therefore, even when the three-dimensional point group representing three-dimensional coordinates is dense, linear objects can be detected with high accuracy.
　また、上記各実施形態でCPUがソフトウェア(プログラム)を読み込んで実行した各種処理を、CPU以外の各種のプロセッサが実行してもよい。この場合のプロセッサとしては、FPGA(Field-Programmable Gate Array)等の製造後に回路構成を変更可能なPLD(Programmable Logic Device)、及びASIC(Application Specific Integrated Circuit)等の特定の処理を実行させるために専用に設計された回路構成を有するプロセッサである専用電気回路等が例示される。また、線状物検出処理を、これらの各種のプロセッサのうちの1つで実行してもよいし、同種又は異種の2つ以上のプロセッサの組み合わせ(例えば、複数のFPGA、及びCPUとFPGAとの組み合わせ等)で実行してもよい。また、これらの各種のプロセッサのハードウェア的な構造は、より具体的には、半導体素子等の回路素子を組み合わせた電気回路である。 Further, the various kinds of processing that the CPU executes by reading software (a program) in each of the above embodiments may be executed by various processors other than the CPU. Examples of such processors include a PLD (Programmable Logic Device) whose circuit configuration can be changed after manufacture, such as an FPGA (Field-Programmable Gate Array), and a dedicated electric circuit, which is a processor having a circuit configuration designed exclusively for executing specific processing, such as an ASIC (Application Specific Integrated Circuit). The linear object detection processing may be executed by one of these various processors, or by a combination of two or more processors of the same or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). More specifically, the hardware structure of these various processors is an electric circuit in which circuit elements such as semiconductor elements are combined.
　また、上記各実施形態では、線状物検出プログラム35がストレージ34に予め記憶(インストール)されている態様を説明したが、これに限定されない。線状物検出プログラム35は、CD-ROM(Compact Disk Read Only Memory)、DVD-ROM(Digital Versatile Disk Read Only Memory)、及びUSB(Universal Serial Bus)メモリ等の非一時的(non-transitory)記憶媒体に記憶された形態で提供されてもよい。また、線状物検出プログラム35は、ネットワークを介して外部装置からダウンロードされる形態としてもよい。 Further, in each of the above embodiments, a mode in which the linear object detection program 35 is stored (installed) in the storage 34 in advance has been described, but the embodiments are not limited to this. The linear object detection program 35 may be provided in a form stored in a non-transitory storage medium such as a CD-ROM (Compact Disk Read Only Memory), a DVD-ROM (Digital Versatile Disk Read Only Memory), or a USB (Universal Serial Bus) memory. The linear object detection program 35 may also be downloaded from an external device via a network.
 以上の実施形態に関し、更に以下の付記を開示する。 Regarding the above embodiments, the following additional notes are further disclosed.
  (付記項1)
 メモリと、
 前記メモリに接続された少なくとも1つのプロセッサと、
 を含み、
 前記プロセッサは、
 構造物の表面上の点における3次元座標を表す3次元点群を水平面上に投影して投影点群とし、前記投影点群の前記水平面における2次元座標に基づいて直線を検出し、検出した前記直線から所定の距離内の投影点を前記投影点群から抽出し、
 抽出された投影点群に対応する3次元点群に含まれる各3次元点間を結んだ線分を複数、算出し、前記水平面とのなす角度が所定の角度以下である前記線分の3次元点の集合を、線状物を構成する3次元点群として抽出する、
 ように構成されている線状物検出装置。
(Additional note 1)
memory and
at least one processor connected to the memory;
including;
The processor includes:
A three-dimensional point group representing three-dimensional coordinates of points on a surface of a structure is projected onto a horizontal plane to obtain a projected point group, a straight line is detected based on two-dimensional coordinates of the projected point group in the horizontal plane, and projected points within a predetermined distance from the detected straight line are extracted from the projected point group, and
a plurality of line segments each connecting two three-dimensional points included in a three-dimensional point group corresponding to the extracted projected point group is calculated, and a set of the three-dimensional points of the line segments whose angle with the horizontal plane is less than or equal to a predetermined angle is extracted as a three-dimensional point group constituting a linear object,
A linear object detection device configured as follows.
 (付記項2)
 線状物検出処理を実行するようにコンピュータによって実行可能なプログラムを記憶した非一時的記憶媒体であって、
 前記線状物検出処理は、
 構造物の表面上の点における3次元座標を表す3次元点群を水平面上に投影して投影点群とし、前記投影点群の前記水平面における2次元座標に基づいて直線を検出し、検出した前記直線から所定の距離内の投影点を前記投影点群から抽出し、
 抽出された投影点群に対応する3次元点群に含まれる各3次元点間を結んだ線分を複数、算出し、前記水平面とのなす角度が所定の角度以下である前記線分の3次元点の集合を、線状物を構成する3次元点群として抽出する、
 非一時的記憶媒体。
(Additional note 2)
A non-temporary storage medium storing a program executable by a computer to execute a linear object detection process,
The linear object detection process includes:
a three-dimensional point group representing three-dimensional coordinates of points on a surface of a structure is projected onto a horizontal plane to obtain a projected point group, a straight line is detected based on two-dimensional coordinates of the projected point group in the horizontal plane, and projected points within a predetermined distance from the detected straight line are extracted from the projected point group, and
a plurality of line segments each connecting two three-dimensional points included in a three-dimensional point group corresponding to the extracted projected point group is calculated, and a set of the three-dimensional points of the line segments whose angle with the horizontal plane is less than or equal to a predetermined angle is extracted as a three-dimensional point group constituting a linear object,
Non-transitory storage medium.
20 点群測定器
22 スキャナ
30 線状物検出装置
31 CPU
32 ROM
33 RAM
34 ストレージ
37 表示部
38 通信I/F
39 バス
40 読込部
42 パラメータ設定部
44 線状物モデル算出部
46 保存制御部
48 表示制御部
50 除外部
52 直線検出部
54 抽出部
56 線状物モデル生成部
20 Point cloud measuring device 22 Scanner 30 Linear object detection device 31 CPU
32 ROM
33 RAM
34 Storage 37 Display unit 38 Communication I/F
39 Bus 40 Reading section 42 Parameter setting section 44 Linear object model calculation section 46 Storage control section 48 Display control section 50 Exclusion section 52 Straight line detection section 54 Extraction section 56 Linear object model generation section

Claims (6)

  1.  構造物の表面上の点における3次元座標を表す3次元点群を水平面上に投影して投影点群とし、前記投影点群の前記水平面における2次元座標に基づいて直線を検出し、検出した前記直線から所定の距離内の投影点を前記投影点群から抽出する直線検出部と、
     抽出された投影点群に対応する3次元点群に含まれる各3次元点間を結んだ線分を複数、算出し、前記水平面とのなす角度が所定の角度以下である前記線分の3次元点の集合を、線状物を構成する3次元点群として抽出する抽出部と、
     を備えた線状物検出装置。
    a straight line detection unit that projects a three-dimensional point group representing three-dimensional coordinates of points on a surface of a structure onto a horizontal plane to obtain a projected point group, detects a straight line based on two-dimensional coordinates of the projected point group in the horizontal plane, and extracts projected points within a predetermined distance from the detected straight line from the projected point group; and
    an extraction unit that calculates a plurality of line segments each connecting two three-dimensional points included in a three-dimensional point group corresponding to the extracted projected point group, and extracts a set of the three-dimensional points of the line segments whose angle with the horizontal plane is less than or equal to a predetermined angle as a three-dimensional point group constituting a linear object.
    A linear object detection device equipped with
  2.  前記抽出部は、複数の前記集合を抽出した場合、集合毎に、異なる線状物を構成する3次元点群を抽出する
     請求項1に記載の線状物検出装置。
    The linear object detection device according to claim 1, wherein, when a plurality of the sets are extracted, the extraction unit extracts, for each set, a three-dimensional point group constituting a different linear object.
  3.  The linear object detection device according to claim 1, further comprising an exclusion unit that excludes, from the points to be projected onto the horizontal plane, a partial point group of the three-dimensional point group in which intervals between a plurality of three-dimensional points aligned in a vertical direction intersecting the horizontal plane are less than or equal to a predetermined interval, when a vertical length of the partial point group is greater than or equal to a predetermined length.
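The exclusion step of claim 3 can be sketched as follows. Grouping points into (x, y) grid cells is an illustrative choice not stated in the claim, which only requires finding vertically aligned, densely spaced point runs of sufficient height (e.g. poles or walls) and removing them before projection; `cell`, `max_gap`, and `min_len` are assumed parameters.

```python
import numpy as np

def exclude_vertical_runs(points3d, cell=0.1, max_gap=0.2, min_len=1.0):
    """Sketch of claim 3: drop vertically continuous runs of points before
    projecting onto the horizontal plane.

    A run is a set of points in one (x, y) cell whose consecutive vertical
    gaps are all <= max_gap; runs taller than min_len are excluded.
    """
    cells = np.floor(points3d[:, :2] / cell).astype(int)
    keep = np.ones(len(points3d), dtype=bool)
    # Examine each vertical column of points independently.
    for key in {tuple(c) for c in cells}:
        idx = np.nonzero((cells == key).all(axis=1))[0]
        idx = idx[np.argsort(points3d[idx, 2])]      # bottom to top
        start = 0
        for a in range(1, len(idx) + 1):
            end_of_run = (a == len(idx)) or (
                points3d[idx[a], 2] - points3d[idx[a - 1], 2] > max_gap)
            if end_of_run:
                run = idx[start:a]
                height = points3d[run[-1], 2] - points3d[run[0], 2]
                if height >= min_len:                # tall vertical run: exclude
                    keep[run] = False
                start = a
    return points3d[keep]
```

Removing such runs keeps pole and wall points from dominating the 2-D line detection, so the detected lines correspond to suspended linear objects rather than vertical structures.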
  4.  The linear object detection device according to claim 1, further comprising a linear object model generation unit that performs principal component analysis on the extracted three-dimensional point group to derive a first principal component axis, and generates a linear object model representing the linear object using intersections between a plane orthogonal to the derived first principal component axis and the line segments connecting the three-dimensional points included in the extracted three-dimensional point group.
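The model generation of claim 4 can be sketched as below: PCA gives the dominant axis of the extracted point group, and the group is sliced by planes orthogonal to that axis, with each plane's intersection with a segment between consecutive points becoming a model vertex. The slice spacing `step` is an illustrative parameter not stated in the claim.

```python
import numpy as np

def linear_object_model(points3d, step=0.5):
    """Sketch of claim 4: derive the first principal component axis by PCA,
    then intersect planes orthogonal to that axis with the segments between
    consecutive points to obtain the vertices of the linear object model."""
    centroid = points3d.mean(axis=0)
    centered = points3d - centroid
    # First principal component = eigenvector of the covariance matrix
    # with the largest eigenvalue.
    cov = np.cov(centered.T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    axis = eigvecs[:, np.argmax(eigvals)]
    # Order the points along the principal axis and connect neighbours.
    t = centered @ axis
    order = np.argsort(t)
    pts, ts = points3d[order], t[order]
    vertices = []
    for plane_t in np.arange(ts[0], ts[-1], step):
        # Find a segment whose endpoints straddle this cutting plane.
        for a in range(len(pts) - 1):
            if ts[a] <= plane_t <= ts[a + 1] and ts[a + 1] > ts[a]:
                w = (plane_t - ts[a]) / (ts[a + 1] - ts[a])
                vertices.append(pts[a] + w * (pts[a + 1] - pts[a]))
                break
    return np.array(vertices)
```

Connecting the returned vertices in order yields a polyline model of the linear object, such as the sagging curve of a cable between two poles.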
  5.  A linear object detection method comprising:
     projecting, by a straight line detection unit, a three-dimensional point group representing three-dimensional coordinates of points on a surface of a structure onto a horizontal plane to obtain a projected point group, detecting a straight line based on two-dimensional coordinates of the projected point group on the horizontal plane, and extracting, from the projected point group, projected points within a predetermined distance of the detected straight line; and
     calculating, by an extraction unit, a plurality of line segments each connecting three-dimensional points included in a three-dimensional point group corresponding to the extracted projected point group, and extracting a set of the three-dimensional points of the line segments whose angle with the horizontal plane is less than or equal to a predetermined angle as a three-dimensional point group constituting a linear object.
  6.  A linear object detection program for causing a computer to function as each unit of the linear object detection device according to claim 1.
PCT/JP2022/028847 2022-07-26 2022-07-26 Linear object detecting device, linear object detecting method, and linear object detecting program WO2024023949A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/028847 WO2024023949A1 (en) 2022-07-26 2022-07-26 Linear object detecting device, linear object detecting method, and linear object detecting program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/028847 WO2024023949A1 (en) 2022-07-26 2022-07-26 Linear object detecting device, linear object detecting method, and linear object detecting program

Publications (1)

Publication Number Publication Date
WO2024023949A1 true WO2024023949A1 (en) 2024-02-01

Family

ID=89705823

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/028847 WO2024023949A1 (en) 2022-07-26 2022-07-26 Linear object detecting device, linear object detecting method, and linear object detecting program

Country Status (1)

Country Link
WO (1) WO2024023949A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110144942A1 (en) * 2009-12-02 2011-06-16 Eurocopter Method of using telemetry to detect at least one suspended threadlike object, the object lying in the detection field of a telemeter mounted on board a vehicle
JP2015078849A (en) * 2013-10-15 2015-04-23 日本電信電話株式会社 Facility state detection method and device therefor
JP2016206178A (en) * 2015-04-21 2016-12-08 国際航業株式会社 Laser measurement method, laser measurement marker and coordinate calculation program
WO2021033249A1 (en) * 2019-08-19 2021-02-25 日本電信電話株式会社 Linear structure detection device, detection method, and detection program

Similar Documents

Publication Publication Date Title
JP4791423B2 (en) Automatic three-dimensional scan data alignment system and method
Overby et al. Automatic 3D building reconstruction from airborne laser scanning and cadastral data using Hough transform
CN108416785B (en) Topology segmentation method and device for closed space
US20150134303A1 (en) Three-dimensional scanning system and method with hole-filling function for point cloud using contact probe
US8600713B2 (en) Method of online building-model reconstruction using photogrammetric mapping system
JP6381137B2 (en) Label detection apparatus, method, and program
US20110202318A1 (en) Interference determination device, interference determination method, and computer program product
KR101918168B1 (en) Method for performing 3D measurement and Apparatus thereof
JP6185385B2 (en) Spatial structure estimation apparatus, spatial structure estimation method, and spatial structure estimation program
Wiemann et al. Automatic construction of polygonal maps from point cloud data
TW201514446A (en) System and method for obtaining cloud points in 3D coordinates measurement
TW201616451A (en) System and method for selecting point clouds using a free selection tool
CN111932669A (en) Deformation monitoring method based on slope rock mass characteristic object
JP2013205175A (en) Device, method and program for recognizing three-dimensional target surface
CN109410183B (en) Plane extraction method, system and device based on point cloud data and storage medium
JP2014190962A (en) Data analysis device, data analysis method, and program
WO2024023949A1 (en) Linear object detecting device, linear object detecting method, and linear object detecting program
JP6237122B2 (en) Robot, image processing method and robot system
JP2009198382A (en) Environment map acquiring device
JP7093680B2 (en) Structure difference extraction device, structure difference extraction method and program
WO2024023950A1 (en) Linear object detection device, linear object detection method, and linear object detection program
WO2023005195A1 (en) Map data processing method and apparatus, and household appliance and readable storage medium
Jin et al. High precision indoor model contour extraction algorithm based on geometric information
Long Nguyen et al. Comparative study of automatic plane fitting registration for MLS sparse point clouds with different plane segmentation methods
JP2019215180A (en) Measurement system and measurement method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22953044

Country of ref document: EP

Kind code of ref document: A1