WO2023127037A1 - Information processing device, information processing method, and computer-readable medium - Google Patents

Information processing device, information processing method, and computer-readable medium

Info

Publication number
WO2023127037A1
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional data
time point
measured
information processing
object measured
Prior art date
Application number
PCT/JP2021/048615
Other languages
French (fr)
Japanese (ja)
Inventor
次朗 安倍
勝広 油谷
Original Assignee
日本電気株式会社 (NEC Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電気株式会社 (NEC Corporation)
Priority to PCT/JP2021/048615 priority Critical patent/WO2023127037A1/en
Publication of WO2023127037A1 publication Critical patent/WO2023127037A1/en


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B61 RAILWAYS
    • B61K AUXILIARY EQUIPMENT SPECIALLY ADAPTED FOR RAILWAYS, NOT OTHERWISE PROVIDED FOR
    • B61K9/00 Railway vehicle profile gauges; Detecting or indicating overheating of components; Apparatus on locomotives or cars to indicate bad track sections; General design of track recording vehicles
    • B61K9/08 Measuring installations for surveying permanent way
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B21/00 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00 Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C7/00 Tracing profiles

Definitions

  • The present disclosure relates to an information processing device, an information processing method, and a non-transitory computer-readable medium storing a program.
  • In Patent Document 1, a reference point is provided outside the range of a monitored part, and a range including the monitored part and the reference point is measured with a three-dimensional laser scanner at predetermined time intervals under the same measurement conditions, determined based on the reference point.
  • The disclosed technique then compares the measurement points measured under the same measurement conditions by the three-dimensional laser scanner to monitor variation in the monitored part (see, for example, Patent Document 1).
  • An object of the present disclosure is to provide an information processing device, an information processing method, and a non-transitory computer-readable medium storing a program that can appropriately detect displacement of an object to be monitored.
  • According to one aspect of the present disclosure, an information processing device is provided that includes: acquisition means for acquiring three-dimensional data of an object measured at a first time point and three-dimensional data of the object measured at a second time point; specifying means for specifying a displacement of the object from the first time point to the second time point based on results of fitting a predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point; and output means for outputting information according to the displacement.
  • According to another aspect, an information processing method is provided in which three-dimensional data of an object measured at a first time point and three-dimensional data of the object measured at a second time point are acquired, a displacement of the object from the first time point to the second time point is specified based on results of fitting a predetermined model to each of the acquired sets of three-dimensional data, and information according to the displacement is output.
  • According to yet another aspect, a non-transitory computer-readable medium is provided that stores a program for causing a computer to execute a process of specifying the displacement of the object between the two points in time and a process of outputting information according to the displacement.
  • According to the present disclosure, the displacement of a monitored object can be detected appropriately.
  • FIG. 4 is a flowchart showing an example of processing of the information processing device according to the embodiment.
  • FIG. 5 is a diagram showing an example of three-dimensional data of the object measured at the first point in time according to the embodiment.
  • FIG. 6 is a diagram showing an example of three-dimensional data of the object measured at the second point in time according to the embodiment.
  • FIG. 7 is a diagram showing an example of a model fitted to the three-dimensional data of the object measured at the first point in time according to the embodiment.
  • FIG. 8 is a diagram showing an example of a model fitted to the three-dimensional data of the object measured at the second point in time according to the embodiment.
  • FIG. 9 is a diagram showing an example of the incident angle at each planar region of the object according to the embodiment.
  • FIG. 10 is a diagram showing an example in which the models fitted to the three-dimensional data of the object at each point in time according to the embodiment are arranged in the same three-dimensional space.
  • FIG. 11 is a diagram showing a display example of the displacement amount of the object according to the embodiment.
  • FIG. 12 is a diagram showing an example of the information recorded in the threshold DB according to the embodiment.
  • FIG. 1 is a diagram showing an example of the configuration of an information processing device 10 according to an embodiment.
  • The information processing device 10 has an acquisition unit 11, a specification unit 12, and an output unit 13.
  • Each of these units may be implemented by cooperation of one or more programs installed in the information processing device 10 and hardware such as the processor 101 and the memory 102 of the information processing device 10 .
  • The acquisition unit 11 acquires various types of information from a storage unit inside the information processing device 10 or from an external device.
  • The acquisition unit 11 acquires, for example, three-dimensional data of an object measured at a first time and three-dimensional data of the object measured at a second time.
  • The specifying unit 12 executes various processes based on the information acquired by the acquisition unit 11.
  • The output unit 13 outputs information corresponding to the displacement specified by the specifying unit 12, for example.
  • FIG. 2 is a diagram showing a configuration example of the monitoring system 1 according to the embodiment.
  • The monitoring system 1 includes a sensor 20A and a sensor 20B (hereinafter simply referred to as "sensor 20" when there is no need to distinguish between them), a terminal 30A and a terminal 30B (hereinafter simply referred to as "terminal 30" when there is no need to distinguish between them), and the information processing device 10.
  • The numbers of sensors 20, terminals 30, and information processing apparatuses 10 are not limited to the example in FIG. 2.
  • The terminal 30 and the information processing device 10 are communicably connected via the network N.
  • Examples of the network N include the Internet, a mobile communication system, a wireless LAN (Local Area Network), short-range wireless communication such as BLE (Bluetooth Low Energy), a LAN, and a bus.
  • Examples of mobile communication systems include fifth-generation mobile communication systems (5G), fourth-generation mobile communication systems (4G), and third-generation mobile communication systems (3G).
  • The sensor 20A and the terminal 30A, and the sensor 20B and the terminal 30B, are each connected by a cable (external bus) or the like.
  • The sensor 20 measures three-dimensional data of the object 50 to be monitored.
  • The sensor 20 may be, for example, a LiDAR (Light Detection and Ranging) sensor that emits light, receives the reflected light returned when the emitted light strikes an object, measures the distance to the object by comparing the emitted light with the received reflected light, and thereby obtains three-dimensional data (point cloud data or a depth image).
  • The sensor 20 may be a ToF (Time of Flight) LiDAR, which measures distance based on the time difference between the emission of light and the reception of its reflection and scans the shape of the object in this way.
  • The sensor 20 may be an FMCW (Frequency Modulated Continuous Wave) LiDAR that measures distance based on the phase difference between the emitted light and the received reflected light.
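As an illustrative aside (not part of the patent text), the range computation of a ToF LiDAR reduces to halving the round-trip travel time of light; a minimal sketch in Python:

```python
C = 299_792_458.0  # speed of light in vacuum (m/s)

def tof_distance(round_trip_time_s):
    """Range implied by a ToF LiDAR time difference: the pulse travels
    to the target and back, so the one-way distance is c * t / 2."""
    return C * round_trip_time_s / 2.0

# a round trip of about 66.7 ns corresponds to roughly 10 m of range
d = tof_distance(66.7e-9)
```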
  • The sensor 20 may be, for example, a sensor that uses SfM/MVS (Structure from Motion / Multi-View Stereo), a technique for reconstructing three-dimensional data from a plurality of images taken from different viewpoints.
  • Sensor 20 may also be a sensor that uses depth estimation, which is a machine learning technique that converts camera images into depth images.
  • The terminal 30 may be, for example, a device such as a personal computer, tablet, IoT (Internet of Things) communication device, smartphone, or mobile phone.
  • The terminal 30 transmits three-dimensional data measured by the sensor 20 to the information processing device 10.
  • The information processing device 10 may be, for example, a device such as a server, cloud service, personal computer, tablet, or smartphone.
  • The information processing device 10 monitors the displacement (fluctuation) of the monitored object 50 based on the three-dimensional data measured by the sensor 20.
  • The information processing device 10 may, for example, monitor the displacement of railroad tracks (rails on which railroad vehicles run).
  • In under-track crossing work, in which a road tunnel or the like crossing under a track is constructed, there is a risk of track displacement due to taking in too much excavated earth and sand or due to insufficient control of mud pressure.
  • Distortion that occurs in a railroad track is also called track irregularity.
  • Track irregularities include, for example, irregularities in alignment, irregularities in elevation, irregularities in level, irregularities in gauge, irregularities in flatness, and the like. Elevation irregularity is distortion of the rail top surface in the longitudinal direction.
  • Alignment irregularity is distortion of the rail side surface in the longitudinal direction.
  • Level deviation means that there is a difference in height between the left and right rails.
  • Gauge deviation is a deviation from the basic dimension of the gauge (the spacing between the left and right rails). Flatness deviation represents the "twist" of the track plane and is the difference between the levels at two points spaced apart by a fixed interval.
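The level and flatness (twist) definitions above can be sketched numerically. The rail heights and the measurement interval below are hypothetical values for illustration only:

```python
# Hypothetical rail-top heights (m) at four successive sleepers;
# all numbers and the twist interval are illustrative only.
left_z  = [0.100, 0.102, 0.101, 0.104]
right_z = [0.100, 0.099, 0.103, 0.100]

# level irregularity: height difference between the left and right rails
level = [l - r for l, r in zip(left_z, right_z)]

# flatness (twist) irregularity: difference between the levels at two
# points spaced a fixed interval apart
interval = 2
twist = [level[i + interval] - level[i] for i in range(len(level) - interval)]
```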
  • The information processing device 10 may also monitor, for example, displacement of steel frames and the like provided at construction sites of facilities such as buildings and tunnels. Further, the information processing apparatus 10 may monitor the displacement of a steel pole or the like installed in a substation or the like during construction in the surrounding area.
  • FIG. 3 is a diagram illustrating a hardware configuration example of the information processing device 10 and the terminal 30 according to the embodiment.
  • Although the information processing apparatus 10 will be described below as an example, the hardware configuration of the terminal 30 may be the same as that of the information processing apparatus 10.
  • The information processing device 10 (computer 100) includes a processor 101, a memory 102, and a communication interface 103. These units may be connected by a bus or the like. The memory 102 stores at least a portion of a program 104. The communication interface 103 includes interfaces necessary for communication with other network elements.
  • Memory 102 may be of any type suitable for a local technology network. Memory 102 may be, as a non-limiting example, a non-transitory computer-readable storage medium. Also, memory 102 may be implemented using any suitable data storage technology, such as semiconductor-based memory devices, magnetic memory devices and systems, optical memory devices and systems, fixed and removable memory, and the like. Although only one memory 102 is shown in computer 100, there may be several physically different memory modules in computer 100.
  • Processor 101 may be of any type.
  • Processor 101 may include one or more of a general purpose computer, a special purpose computer, a microprocessor, a Digital Signal Processor (DSP), and a processor based on a multi-core processor architecture as non-limiting examples.
  • Computer 100 may have multiple processors, such as an application-specific integrated circuit chip that is temporally subordinate to the clock that synchronizes the main processor.
  • Embodiments of the present disclosure may be implemented in hardware or dedicated circuitry, software, logic, or any combination thereof. Some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software, which may be executed by a controller, microprocessor or other computing device.
  • The present disclosure also provides at least one computer program product tangibly stored on a non-transitory computer-readable storage medium.
  • The computer program product comprises computer-executable instructions, such as those contained in program modules, that are executed on a target real or virtual processor of a device to perform the processes or methods of the present disclosure.
  • Program modules include routines, programs, libraries, objects, classes, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • The functionality of program modules may be combined or split between program modules as desired in various embodiments.
  • Machine-executable instructions for program modules may be executed within local or distributed devices. In a distributed device, program modules can be located in both local and remote storage media.
  • Program code for executing the methods of the present disclosure may be written in any combination of one or more programming languages. Such program code is provided to a processor or controller of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, such that, when executed by the processor or controller, the functions/operations specified in the flowcharts and/or block diagrams are implemented. The program code may execute entirely on a machine, partly on a machine, as a stand-alone software package partly on a machine and partly on a remote machine, or entirely on a remote machine or server.
  • Non-transitory computer-readable media include various types of tangible storage media.
  • Examples of non-transitory computer-readable media include magnetic recording media, magneto-optical recording media, optical disc media, semiconductor memories, and the like.
  • Magnetic recording media include, for example, flexible disks, magnetic tapes, hard disk drives, and the like.
  • Magneto-optical recording media include, for example, magneto-optical disks.
  • Optical disc media include, for example, Blu-ray discs, CD (Compact Disc)-ROM (Read Only Memory), CD-R (Recordable), CD-RW (ReWritable), and the like.
  • Semiconductor memories include, for example, solid state drives, mask ROMs, PROMs (Programmable ROMs), EPROMs (Erasable PROMs), flash ROMs, RAMs (random access memories), and the like.
  • The program may also be delivered to the computer by various types of transitory computer-readable media. Examples of transitory computer-readable media include electrical signals, optical signals, and electromagnetic waves. A transitory computer-readable medium can deliver the program to the computer via a wired channel such as an electric wire or optical fiber, or via a wireless channel.
  • FIG. 4 is a flowchart showing an example of processing of the information processing apparatus 10 according to the embodiment.
  • FIG. 5 is a diagram showing an example of three-dimensional data of the object 50 measured at the first point in time according to the embodiment.
  • FIG. 6 is a diagram showing an example of three-dimensional data of the object 50 measured at the second point in time according to the embodiment.
  • FIG. 7 is a diagram showing an example of a model fitted to the three-dimensional data of the object 50 measured at the first point in time according to the embodiment.
  • FIG. 8 is a diagram showing an example of a model fitted to the three-dimensional data of the object 50 measured at the second point in time according to the embodiment.
  • FIG. 9 is a diagram showing an example of incident angles of each planar region of the object 50 according to the embodiment.
  • FIG. 10 is a diagram showing an example in which the models applied to the three-dimensional data of the object 50 at each point in time according to the embodiment are arranged in the same three-dimensional space.
  • FIG. 11 is a diagram showing a display example of the amount of displacement of the object 50 according to the embodiment.
  • FIG. 12 is a diagram showing an example of information recorded in the threshold DB 1201 according to the embodiment.
  • In step S101, the acquisition unit 11 of the information processing device 10 acquires three-dimensional data of the object 50 measured by the sensor 20 at a first point in time (hereinafter also referred to as "first three-dimensional data" as appropriate) and three-dimensional data of the object 50 measured by the sensor 20 at a second point in time (hereinafter also referred to as "second three-dimensional data" as appropriate).
  • FIG. 5 shows an example of point cloud data 501 included in the first three-dimensional data.
  • FIG. 6 shows an example of point cloud data 601 included in the second three-dimensional data.
  • The point cloud data 501 and the point cloud data 601 are each a collection of points having coordinate data in three-dimensional XYZ coordinates.
  • The three-dimensional data of the object 50 may include point cloud data, which is a collection of points having coordinate data that can be converted into three-dimensional XYZ coordinates.
  • The point cloud data may be, for example, polar coordinate data (sets of elevation angle, horizontal angle, and distance) with the position of the sensor 20 as the origin.
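Such polar-coordinate returns can be converted to XYZ point cloud data with the usual spherical-to-Cartesian formulas; a minimal sketch (sensor at the origin, angles in degrees, x-forward/z-up axis convention assumed for illustration):

```python
import math

def polar_to_xyz(elevation_deg, azimuth_deg, distance):
    """Convert one return (elevation angle, horizontal angle, range),
    with the sensor at the origin, into Cartesian XYZ coordinates."""
    el = math.radians(elevation_deg)
    az = math.radians(azimuth_deg)
    x = distance * math.cos(el) * math.cos(az)
    y = distance * math.cos(el) * math.sin(az)
    z = distance * math.sin(el)
    return x, y, z
```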
  • The three-dimensional data of the object 50 may include a depth image, which expresses three-dimensional information by storing distance information from the sensor as the pixel values of a two-dimensional image.
  • The three-dimensional data of the object 50 may also include mesh data representing three-dimensional information as a set of vertices, edges, and faces.
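A depth image can likewise be turned into 3D points if camera intrinsics are known. The pinhole model and the intrinsic values below are illustrative assumptions, not something specified in the text:

```python
def unproject(u, v, depth, fx, fy, cx, cy):
    """Back-project a depth-image pixel (u, v) holding depth value z
    into camera-frame XYZ using a pinhole model. The intrinsics
    fx, fy, cx, cy come from camera calibration and are assumed here."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return x, y, depth

# hypothetical intrinsics for a 640x480 depth image
fx = fy = 500.0
cx, cy = 320.0, 240.0
center = unproject(320.0, 240.0, 2.0, fx, fy, cx, cy)  # principal point
```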
  • The specifying unit 12 of the information processing device 10 applies a predetermined model to each of the first three-dimensional data and the second three-dimensional data (step S102).
  • As the fitting of a predetermined model, for example, the specifying unit 12 may determine the rotation amount, movement amount, and the like with which a predetermined figure or predetermined three-dimensional data, when rotated and moved in a three-dimensional space, best matches the three-dimensional data of the object 50 at each point in time.
  • As the fitting of a predetermined model to each set of three-dimensional data of the measured object, the identification unit 12 may detect an area that matches a predetermined figure (for example, one or more planes) from the first three-dimensional data and detect an area that matches the predetermined figure from the second three-dimensional data.
  • The specifying unit 12 may, for example, detect the first planar region from the first three-dimensional data and detect the second planar region from the second three-dimensional data.
  • The identifying unit 12 may detect, for example, a combination of two planar regions that are not parallel to each other from the three-dimensional data of the object 50.
  • FIG. 7 shows an example of combinations of plane regions 701 and 702 detected from the point cloud data 501 included in the first three-dimensional data.
  • FIG. 8 shows an example of combinations of planar regions 801 and 802 detected from the point cloud data 601 included in the second three-dimensional data.
  • In this case, the identifying unit 12 may calculate, for example, two sets of coefficients of the plane equation (for each detected plane, the coefficients a, b, c, and d of ax + by + cz + d = 0).
  • As the fitting of a predetermined model to each set of three-dimensional data of the measured object, the identifying unit 12 may, for example, detect a straight line from the three-dimensional data of the object 50.
  • The specifying unit 12 may, for example, detect two planar regions that are not parallel to each other from the three-dimensional data of the object 50 and detect the line of intersection of the two detected planar regions as the straight line.
  • The specifying unit 12 may calculate, for example, a set of coefficients (a1, b1, c1, x0, y0, z0) of the linear equation shown in the following equation (2): (x - x0)/a1 = (y - y0)/b1 = (z - z0)/c1 ... (2)
  • Here, the set of a1, b1, and c1 indicates the direction of the straight line, and the set of x0, y0, and z0 indicates the coordinates of a point through which the straight line passes.
  • The specifying unit 12 may detect a planar region from the three-dimensional data using, for example, RANSAC (Random Sample Consensus).
  • RANSAC is a technique that finds a planar region in three-dimensional data by repeating, many times, a series of steps consisting of randomly sampling data points, fitting a plane to the sampled points, and evaluating the approximation accuracy of the fitted plane.
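The series of steps just described can be sketched as follows; the iteration count, tolerance, and data are illustrative choices, not values from the text:

```python
import random

def plane_from_points(p1, p2, p3):
    """Plane a*x + b*y + c*z + d = 0 through three points."""
    ux, uy, uz = (p2[i] - p1[i] for i in range(3))
    vx, vy, vz = (p3[i] - p1[i] for i in range(3))
    a = uy * vz - uz * vy          # normal = u x v
    b = uz * vx - ux * vz
    c = ux * vy - uy * vx
    d = -(a * p1[0] + b * p1[1] + c * p1[2])
    return a, b, c, d

def ransac_plane(points, n_iters=300, tol=0.05, seed=0):
    """Repeat: sample 3 points, fit a plane, count points within tol;
    keep the plane with the most inliers (normalized coefficients)."""
    rng = random.Random(seed)
    best_plane, best_inliers = None, []
    for _ in range(n_iters):
        a, b, c, d = plane_from_points(*rng.sample(points, 3))
        norm = (a * a + b * b + c * c) ** 0.5
        if norm < 1e-12:           # degenerate (collinear) sample
            continue
        inliers = [p for p in points
                   if abs(a * p[0] + b * p[1] + c * p[2] + d) / norm <= tol]
        if len(inliers) > len(best_inliers):
            best_plane = (a / norm, b / norm, c / norm, d / norm)
            best_inliers = inliers
    return best_plane, best_inliers

# 30 points on the plane z = 0 plus a few off-plane outliers
points = [(float(x), float(y), 0.0) for x in range(6) for y in range(5)]
points += [(0.0, 0.0, 5.0), (1.0, 2.0, -3.0), (3.0, 1.0, 4.0), (4.0, 3.0, -2.0)]
plane, inliers = ransac_plane(points, seed=1)
```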
  • When a plurality of planar regions are detected, regions with large areas are detected preferentially. Therefore, planar regions other than the object 50 to be monitored (for example, a railroad track), such as the ground or the side of a building, may be detected first.
  • The identifying unit 12 may exclude planar regions of objects other than the object 50 to be monitored, based on information indicating the shape of the object 50 to be monitored, which is specified in advance.
  • For example, the identifying unit 12 may exclude planar regions that are not elongated (for example, regions in which the ratio of the horizontal length to the vertical length is equal to or less than a threshold value) and planar regions at a predetermined z coordinate (height).
  • For example, in order not to detect planar regions of objects other than the object 50 to be monitored, the identifying unit 12 may extract (cut out) data of a preset three-dimensional area from the three-dimensional data of the object 50.
  • When detecting the second planar region, the identifying unit 12 may extract (cut out) data within a predetermined range from the already detected first planar region so as not to detect other planar regions. As a result, even when the sensor 20 obtains only a small amount of data from the second planar region, the possibility of appropriately detecting the second side surface can be increased.
  • The identifying unit 12 may, for example, perform highly accurate plane approximation processing using principal component analysis or the like following the RANSAC processing. As a result, the accuracy of approximation when fitting a plane to the three-dimensional data of the object 50 can be improved.
  • Principal component analysis is, for example, a technique for calculating the plane that minimizes the error with respect to three-dimensional data.
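One way to realize such a PCA refinement: the fitted plane's normal is the eigenvector of the points' covariance matrix with the smallest eigenvalue. A sketch using NumPy (an implementation choice, not mandated by the text):

```python
import numpy as np

def pca_plane(points):
    """Least-squares plane via PCA: the normal is the eigenvector of
    the covariance matrix with the smallest eigenvalue."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    cov = np.cov((pts - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues ascending
    normal = eigvecs[:, 0]                   # smallest-variance direction
    d = -normal.dot(centroid)                # plane: n . p + d = 0
    return normal, d

# refine a fit on a 5x5 grid of points lying exactly on the plane z = 2
grid = [(float(x), float(y), 2.0) for x in range(5) for y in range(5)]
normal, d = pca_plane(grid)
```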
  • The specifying unit 12 may detect a plurality of planar regions from the three-dimensional data of the object 50 and, based on the orientation (angle) of each detected planar region with respect to the sensor 20, specify which of the planar regions is to be used to specify the displacement. In this case, the specifying unit 12 may give a planar region higher selection priority as the angle between the line of sight from the origin (the position of the sensor 20) and the normal to the planar region (the incident angle at the planar region) is smaller.
  • Since the accuracy of the three-dimensional data measured by the sensor 20 depends on the incident angle, a planar region measured with higher accuracy can thereby be selected as the planar region to be used.
  • In the example of FIG. 9, the top surface 51 of the object 50, which is a rail, has a relatively large incident angle θ1 from the sensor 20, and the side surface 52 of the object 50 has a relatively small incident angle θ2 from the sensor 20.
  • In this case, the side surface 52 is selected as the planar region to be used for the object 50.
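The angle-based selection can be sketched as below; the sensor-at-origin convention and the rail geometry are hypothetical illustration values:

```python
import math

def incidence_angle_deg(centroid, normal):
    """Angle (deg) between the sensor's line of sight to a planar
    region's centroid (sensor at the origin) and the region's normal.
    A smaller angle means the beam hits the surface more head-on."""
    dot = sum(c * n for c, n in zip(centroid, normal))
    ray = math.sqrt(sum(c * c for c in centroid))
    nrm = math.sqrt(sum(n * n for n in normal))
    cos_t = min(1.0, abs(dot) / (ray * nrm))
    return math.degrees(math.acos(cos_t))

# hypothetical rail geometry: sensor near rail height, rail 10 m away
regions = {
    "top_51":  ((0.0, 10.0, 1.0), (0.0, 0.0, 1.0)),  # upward-facing top
    "side_52": ((0.0, 10.0, 0.5), (0.0, 1.0, 0.0)),  # side facing sensor
}
chosen = min(regions, key=lambda k: incidence_angle_deg(*regions[k]))
```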
  • As the fitting of a predetermined model to each set of three-dimensional data of the measured object, the identifying unit 12 may calculate, for example, a first parameter for superimposing (approximating) predetermined three-dimensional data on the first three-dimensional data and a second parameter for superimposing the predetermined three-dimensional data on the second three-dimensional data.
  • The identifying unit 12 may calculate parameters for a rigid transformation between the three-dimensional data of the object 50 and predetermined three-dimensional data using, for example, ICP (Iterative Closest Point).
  • ICP is a technique in which one set of three-dimensional data is subjected to an optimal rigid body transformation (three-dimensional rotation, translation, and the like) and superimposed on the other set of three-dimensional data.
  • The calculated parameters may include, for example, information indicating the position of the center of gravity in the three-dimensional space and a rotation matrix.
  • The predetermined three-dimensional data may be, for example, three-dimensional data obtained in advance by measuring the object 50, or an analogue of the object 50, with the sensor 20 at a relatively short distance. Alternatively, the predetermined three-dimensional data may be three-dimensional data created based on design data of the object 50.
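The rigid-transform estimation that ICP iterates can be illustrated by its closed-form core, the Kabsch/SVD alignment step for known point correspondences; a NumPy sketch, with the check geometry made up for illustration:

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Closed-form least-squares rigid transform (Kabsch): the rotation
    R and translation t mapping src onto dst for known correspondences.
    This is the alignment step ICP repeats after re-matching points."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)     # centroids
    H = (src - c_src).T @ (dst - c_dst)                   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    sign = np.sign(np.linalg.det(Vt.T @ U.T))             # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, sign]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# check: recover a known rotation (30 deg about z) and translation
theta = np.deg2rad(30.0)
R0 = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
t0 = np.array([0.5, -1.0, 2.0])
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]], float)
R, t = best_rigid_transform(src, src @ R0.T + t0)
```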
  • In step S103, the specifying unit 12 of the information processing device 10 specifies the displacement of the object 50 based on the result of fitting the predetermined model to the first three-dimensional data and the result of fitting the predetermined model to the second three-dimensional data.
  • For example, the specifying unit 12 may arrange the result of fitting the predetermined model to the first three-dimensional data and the result of fitting the predetermined model to the second three-dimensional data in the same three-dimensional space and calculate the spatial difference between the two as the displacement.
  • For example, the identifying unit 12 may calculate, as the difference between the two, the shortest distance from each point on the predetermined model fitted to the first three-dimensional data to the predetermined model fitted to the second three-dimensional data.
  • The identifying unit 12 may also calculate, for example, a rigid transformation for superimposing the predetermined model fitted to the first three-dimensional data on the predetermined model fitted to the second three-dimensional data, and then calculate, as the difference, the amount by which each point on the model fitted to the first three-dimensional data moves under that rigid transformation.
  • FIG. 10 shows an example in which the planar regions 701 and 702 of FIG. 7 and the planar regions 801 and 802 of FIG. 8 are arranged in the same three-dimensional space.
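Under the plane-fitting variant, the displacement of step S103 reduces to point-to-plane distances between the two epochs' fitted planes; a minimal sketch with hypothetical unit-normal plane fits:

```python
def point_plane_distance(point, plane):
    """Distance from a point to a plane given as (a, b, c, d) with a
    unit normal: |a*x + b*y + c*z + d|."""
    a, b, c, d = plane
    return abs(a * point[0] + b * point[1] + c * point[2] + d)

# hypothetical unit-normal plane fits of the same rail side face
plane_t1 = (0.0, 1.0, 0.0, -10.000)   # first epoch: face at y = 10.000 m
plane_t2 = (0.0, 1.0, 0.0, -10.012)   # second epoch: face at y = 10.012 m

# evaluate the second epoch's plane at sample points on the first one;
# the largest point-to-plane distance is reported as the displacement
samples = [(float(x), 10.0, float(z)) for x in range(3) for z in range(2)]
displacement = max(point_plane_distance(p, plane_t2) for p in samples)
```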
  • In step S104, the output unit 13 of the information processing device 10 outputs information corresponding to the displacement specified by the specifying unit 12.
  • The output unit 13 may graphically display the displacement in the three-dimensional space specified by the specifying unit 12, as shown in FIG. 11, for example. In the example of FIG. 11, each point included in the point cloud data 501 of FIG. 5 is displayed in a color corresponding to its amount of displacement.
  • The output unit 13 may, for example, display the displacement amount specified by the specifying unit 12 as a numerical value. In this case, the output unit 13 may output, for example, the maximum value of the displacement amount, or may output a list of displacement amounts for each predetermined interval.
  • The output unit 13 may calculate and output, for example, the amounts of various track irregularities (for example, alignment irregularity, elevation irregularity, and the like) from the first point in time to the second point in time.
  • The output unit 13 may calculate, for example, the direction of the line of intersection of the two planes detected from the three-dimensional data of the object 50 as the track direction of the object 50 (for example, railroad tracks).
  • Alternatively, the output unit 13 may calculate the direction in which the three-dimensional data of the object 50 extends the most as the track direction of the object 50 (for example, railroad tracks).
  • The output unit 13 may output, for example, information according to the amounts of various track irregularities (for example, alignment irregularity, elevation irregularity, and the like) from the first point in time to the second point in time.
  • For example, the output unit 13 may refer to the threshold DB 1201 of FIG. 12 and, if the amount of a specific type of track irregularity is equal to or greater than the threshold corresponding to the combination of the sensor ID and that specific type, output an alarm indicating the specific type.
  • The threshold DB 1201 records (registers, sets) a threshold for determining whether a warning is necessary for each combination of sensor ID and type of track irregularity.
  • The sensor ID is identification information of the sensor 20.
  • The threshold DB 1201 may be stored (set, registered) in advance in a storage device inside or outside the information processing apparatus 10.
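A threshold DB lookup of this kind can be sketched as a simple keyed table; the sensor IDs, irregularity types, and metre values below are hypothetical illustrations:

```python
# threshold DB keyed by (sensor ID, track-irregularity type)
threshold_db = {
    ("S001", "level"): 0.007,
    ("S001", "alignment"): 0.010,
}

def check_alarm(sensor_id, irregularity_type, amount):
    """Return the irregularity type if its amount reaches the threshold
    registered for this (sensor, type) combination, otherwise None."""
    threshold = threshold_db.get((sensor_id, irregularity_type))
    if threshold is not None and amount >= threshold:
        return irregularity_type
    return None
```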
  • The information processing device 10 may be a device contained in a single housing, but the information processing device 10 of the present disclosure is not limited to this.
  • Each unit of the information processing apparatus 10 may be implemented by cloud computing configured by one or more computers, for example.
  • The information processing device 10 and the terminal 30 may be configured integrally as an information processing device. Also, at least part of the processing of the information processing device 10 may be executed by the terminal 30.
  • Such information processing devices 10 are also included in examples of the "information processing device" of the present disclosure.
  • An information processing device having the above. (Appendix 2) In the information processing device according to appendix 1, as the fitting of the predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point, the identifying means detects a first planar region from the three-dimensional data of the object measured at the first time point and detects a second planar region from the three-dimensional data of the object measured at the second time point.
  • (Appendix 3) The identifying means applies a predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point, detecting intersection lines of a first plurality of plane regions from the three-dimensional data of the object measured at the first time point, and detecting intersection lines of a second plurality of plane regions from the three-dimensional data of the object measured at the second time point. The information processing device according to appendix 1 or 2.
  • (Appendix 4) The specifying means detects a plurality of plane regions from each set of three-dimensional data of the object measured by the sensor at the first time point and the second time point, and, based on the orientation of each detected plane region with respect to the sensor, identifies, from among the detected plane regions, the plane regions used to identify the displacement. The information processing device according to appendix 2 or 3.
  • (Appendix 5) The identifying means applies a predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point, calculating a first parameter of a rigid transformation between the three-dimensional data of the object measured at the first time point and predetermined three-dimensional data, and calculating a second parameter of a rigid transformation between the three-dimensional data of the object measured at the second time point and the predetermined three-dimensional data. The information processing device according to any one of appendices 1 to 4.
  • (Appendix 6) The predetermined three-dimensional data is three-dimensional data specified based on at least one of three-dimensional data corresponding to the object and design data of the object. The information processing device according to appendix 5.
  • (Appendix 7) The output means outputs information indicating a specific type of track irregularity if the amount of track irregularity of the specific type is equal to or greater than a threshold corresponding to the specific type. The information processing device according to any one of appendices 1 to 6.
  • (Appendix 8) An information processing method comprising: acquiring three-dimensional data of an object measured at a first time point and three-dimensional data of the object measured at a second time point; identifying a displacement of the object between the first time point and the second time point based on a result of fitting a predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point; and outputting information according to the displacement.
  • (Appendix 9) A non-transitory computer-readable medium storing a program that causes a computer to execute: a process of acquiring three-dimensional data of an object measured at a first time point and three-dimensional data of the object measured at a second time point; a process of identifying a displacement of the object between the first time point and the second time point based on a result of fitting a predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point; and a process of outputting information according to the displacement.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Image Processing (AREA)

Abstract

Provided is an information processing device (10) including: an acquisition means (11) that acquires three-dimensional data for an object measured at a first point in time and three-dimensional data for the object measured at a second point in time; an identification means (12) that identifies the displacement of the object between the first point in time and the second point in time on the basis of the result of applying a predetermined model to each of the three-dimensional data for the object measured at the first point in time and the three-dimensional data for the object measured at the second point in time; and an output means (13) that outputs information corresponding to the displacement.

Description

Information processing device, information processing method, and computer-readable medium
The present disclosure relates to an information processing device, an information processing method, and a non-transitory computer-readable medium storing a program.
Patent Document 1 discloses a technique in which a reference point is provided outside the range of a monitored part, a range including the monitored part and the reference point is measured with a three-dimensional laser scanner at predetermined time intervals under the same measurement conditions determined based on the reference point, and the measurement points measured by the three-dimensional laser scanner under the same measurement conditions are compared to monitor variations in the monitored part (see, for example, Patent Document 1).
JP 2008-076058 A
However, with the technique described in Patent Document 1, there are cases where, for example, the displacement (variation) of a monitored object cannot be detected appropriately.
In view of the above problem, an object of the present disclosure is to provide an information processing device, an information processing method, and a non-transitory computer-readable medium storing a program that can appropriately detect the displacement of a monitored object.
In a first aspect of the present disclosure, there is provided an information processing device having: acquisition means for acquiring three-dimensional data of an object measured at a first time point and three-dimensional data of the object measured at a second time point; specifying means for specifying a displacement of the object between the first time point and the second time point based on a result of fitting a predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point; and output means for outputting information according to the displacement.
In a second aspect of the present disclosure, there is provided an information processing method including: acquiring three-dimensional data of an object measured at a first time point and three-dimensional data of the object measured at a second time point; specifying a displacement of the object between the first time point and the second time point based on a result of fitting a predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point; and outputting information according to the displacement.
In a third aspect of the present disclosure, there is provided a non-transitory computer-readable medium storing a program that causes a computer to execute: a process of acquiring three-dimensional data of an object measured at a first time point and three-dimensional data of the object measured at a second time point; a process of identifying a displacement of the object between the first time point and the second time point based on a result of fitting a predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point; and a process of outputting information according to the displacement.
According to one aspect, the displacement of a monitored object can be detected appropriately.
FIG. 1 is a diagram showing an example of the configuration of an information processing device according to an embodiment.
FIG. 2 is a diagram showing a configuration example of a monitoring system according to the embodiment.
FIG. 3 is a diagram showing a hardware configuration example of the information processing device according to the embodiment.
FIG. 4 is a flowchart showing an example of the processing of the information processing device according to the embodiment.
FIG. 5 is a diagram showing an example of three-dimensional data of an object measured at a first time point according to the embodiment.
FIG. 6 is a diagram showing an example of three-dimensional data of the object measured at a second time point according to the embodiment.
FIG. 7 is a diagram showing an example of a model fitted to the three-dimensional data of the object measured at the first time point according to the embodiment.
FIG. 8 is a diagram showing an example of a model fitted to the three-dimensional data of the object measured at the second time point according to the embodiment.
FIG. 9 is a diagram showing an example of the incident angle on each plane region of the object according to the embodiment.
FIG. 10 is a diagram showing an example in which the models fitted to the three-dimensional data of the object at each time point according to the embodiment are arranged in the same three-dimensional space.
FIG. 11 is a diagram showing a display example of the displacement amount of the object according to the embodiment.
FIG. 12 is a diagram showing an example of information recorded in a threshold DB according to the embodiment.
The principles of the present disclosure will be explained with reference to several exemplary embodiments. It should be understood that these embodiments are described for illustrative purposes only, to help those skilled in the art understand and practice the present disclosure, and do not imply any limitation on its scope. The disclosure described herein can be implemented in various ways other than those described below.

In the following description and claims, unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.

Embodiments of the present disclosure will be described below with reference to the drawings.
(Embodiment 1)
<Configuration>
The configuration of an information processing device 10 according to an embodiment will be described with reference to FIG. 1. FIG. 1 is a diagram showing an example of the configuration of the information processing device 10 according to the embodiment. The information processing device 10 has an acquisition unit 11, a specifying unit 12, and an output unit 13. Each of these units may be implemented by the cooperation of one or more programs installed in the information processing device 10 and hardware such as the processor 101 and the memory 102 of the information processing device 10.
The acquisition unit 11 acquires various types of information from a storage unit inside the information processing device 10 or from an external device. The acquisition unit 11 acquires, for example, three-dimensional data of an object measured at a first time point and three-dimensional data of the object measured at a second time point.
The specifying unit 12 executes various processes based on the information acquired by the acquisition unit 11. For example, the specifying unit 12 specifies a displacement of the object between the first time point and the second time point based on a result of fitting a predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point. The output unit 13 outputs, for example, information according to the displacement specified by the specifying unit 12.
(Embodiment 2)
Next, the configuration of a monitoring system 1 according to an embodiment will be described with reference to FIG. 2.
<System configuration>
FIG. 2 is a diagram showing a configuration example of the monitoring system 1 according to the embodiment. In the example of FIG. 2, the monitoring system 1 has a sensor 20A and a sensor 20B (hereinafter simply referred to as "sensor 20" when there is no need to distinguish between them), a terminal 30A and a terminal 30B (hereinafter simply referred to as "terminal 30" when there is no need to distinguish between them), and the information processing device 10. Note that the numbers of sensors 20, terminals 30, and information processing devices 10 are not limited to the example in FIG. 2.
In the example of FIG. 2, the terminal 30 and the information processing device 10 are communicably connected via a network N. Examples of the network N include the Internet, a mobile communication system, a wireless LAN (Local Area Network), short-range wireless communication such as BLE, a wired LAN, and a bus. Examples of mobile communication systems include the fifth generation mobile communication system (5G), the fourth generation mobile communication system (4G), and the third generation mobile communication system (3G). The sensor 20A and the terminal 30A, and the sensor 20B and the terminal 30B, are each connected by a cable (external bus) or the like.
The sensor 20 measures three-dimensional data of an object 50 to be monitored. The sensor 20 may be, for example, a LiDAR (Light Detection And Ranging) sensor that emits light, receives the light reflected when the emitted light strikes an object, and measures the distance to the object by comparing the emitted light with the received reflected light, thereby generating three-dimensional data (point cloud data or a depth image). In this case, the sensor 20 may be a ToF (Time of Flight) LiDAR that measures (scans) the shape of the object, determining distance from the time difference between emission and reception. The sensor 20 may also be an FMCW (Frequency Modulated Continuous Wave) LiDAR that measures distance based on the phase difference between the emitted light and the received reflected light.
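As a rough numerical illustration of the ToF principle mentioned above, the distance follows from halving the round-trip travel time of the light. This is a minimal sketch; the function name and the microsecond example are illustrative and not taken from the disclosure.

```python
def tof_distance(round_trip_time_s: float, c: float = 299_792_458.0) -> float:
    """Distance to a target from a ToF round-trip time (emission to reception).

    The light travels to the target and back, so the path length is halved.
    """
    return c * round_trip_time_s / 2.0

# A 1-microsecond round trip corresponds to roughly 150 m.
print(tof_distance(1e-6))  # ≈ 149.896 m
```

An FMCW sensor reaches the same quantity indirectly, by inferring the delay from the beat frequency between the emitted and received waveforms rather than timing a pulse.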
The sensor 20 may also be a sensor that uses SfM/MVS (Structure from Motion / Multi-view Stereo), a technique for constructing three-dimensional data from a plurality of images taken from different viewpoints. The sensor 20 may also be a sensor that uses depth estimation, a machine learning technique that converts camera images into depth images.
The terminal 30 may be, for example, a device such as a personal computer, a tablet, an IoT (Internet of Things) communication device, a smartphone, or a mobile phone. The terminal 30 transmits the three-dimensional data measured by the sensor 20 to the information processing device 10.
The information processing device 10 may be, for example, a server, a cloud device, a personal computer, a tablet, or a smartphone. The information processing device 10 monitors the displacement (variation) of the monitored object 50 based on the three-dimensional data measured by the sensor 20.
The information processing device 10 may, for example, monitor the displacement of railroad tracks (rails on which railway vehicles run). For example, in under-track crossing work for constructing a road tunnel or the like that crosses under a track, there is a risk of track displacement caused by taking in too much excavated earth and sand or by insufficient control of mud pressure. There are also risks such as subsidence of the roadbed, upheaval of the roadbed, and blowouts. Distortion occurring in a track is also called track displacement or track irregularity. Track irregularities include, for example, alignment irregularity, elevation irregularity, level irregularity, gauge irregularity, and flatness irregularity. Alignment irregularity is a distortion in the length direction of the rail top surface. Elevation irregularity is a distortion in the length direction of the rail side surface. Level irregularity is a difference in height between the left and right rails. Gauge irregularity is a deviation of the gauge (the spacing between the left and right rails) from its basic dimension. Flatness irregularity represents a "twist" with respect to the plane of the track, that is, a difference between the levels at two points separated by a fixed interval.
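Two of the irregularity types above lend themselves to a simple numerical sketch: level irregularity as the height difference between the left and right rails at a point, and gauge irregularity as the deviation of the measured spacing from the basic dimension. The function names, the sample heights, and the 1067 mm nominal gauge are illustrative assumptions, not values from the disclosure.

```python
def level_irregularity(left_height_mm: float, right_height_mm: float) -> float:
    """Height difference between the left and right rails (level irregularity)."""
    return left_height_mm - right_height_mm

def gauge_irregularity(measured_gauge_mm: float, nominal_gauge_mm: float) -> float:
    """Deviation of the rail spacing from its nominal (basic) dimension."""
    return measured_gauge_mm - nominal_gauge_mm

print(level_irregularity(12.5, 10.0))      # 2.5 mm of level irregularity
print(gauge_irregularity(1069.0, 1067.0))  # 2.0 mm of gauge irregularity
```

Flatness irregularity would then be the difference between two such level values taken at points a fixed interval apart along the track.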
The information processing device 10 may also monitor, for example, the displacement of steel frames or the like installed at construction sites of facilities such as buildings and tunnels. The information processing device 10 may also monitor the displacement of, for example, steel poles installed at substations or the like during construction work in the surrounding area.
<Hardware configuration>
FIG. 3 is a diagram showing a hardware configuration example of the information processing device 10 and the terminal 30 according to the embodiment. Although the information processing device 10 is described below as an example, the hardware configuration of the terminal 30 may be the same as that of the information processing device 10.
In the example of FIG. 3, the information processing device 10 (computer 100) includes a processor 101, a memory 102, and a communication interface 103. These units may be connected by a bus or the like. The memory 102 stores at least part of a program 104. The communication interface 103 includes the interfaces necessary for communication with other network elements.
When the program 104 is executed through the cooperation of the processor 101, the memory 102, and the like, the computer 100 performs at least part of the processing of the embodiments of the present disclosure. The memory 102 may be of any type suitable for a local technology network. The memory 102 may be, as a non-limiting example, a non-transitory computer-readable storage medium. The memory 102 may be implemented using any suitable data storage technology, such as semiconductor-based memory devices, magnetic memory devices and systems, optical memory devices and systems, or fixed and removable memory. Although only one memory 102 is shown in the computer 100, there may be several physically different memory modules in the computer 100. The processor 101 may be of any type. The processor 101 may include one or more of a general-purpose computer, a special-purpose computer, a microprocessor, a digital signal processor (DSP), and, as a non-limiting example, a processor based on a multi-core processor architecture. The computer 100 may have multiple processors, such as application-specific integrated circuit chips that are slaved in time to a clock synchronizing the main processor.
Embodiments of the present disclosure may be implemented in hardware or dedicated circuitry, software, logic, or any combination thereof. Some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software that can be executed by a controller, microprocessor, or other computing device.
The present disclosure also provides at least one computer program product tangibly stored on a non-transitory computer-readable storage medium. The computer program product includes computer-executable instructions, such as instructions included in program modules, that are executed on a device on a target real or virtual processor to carry out the processes or methods of the present disclosure. Program modules include routines, programs, libraries, objects, classes, components, data structures, and the like that perform particular tasks or implement particular abstract data types. The functionality of program modules may be combined or split between program modules as desired in various embodiments. Machine-executable instructions for program modules may be executed within a local or distributed device. In a distributed device, program modules can be located in both local and remote storage media.
Program code for carrying out the methods of the present disclosure may be written in any combination of one or more programming languages. The program code is provided to a processor or controller of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus. When the program code is executed by the processor or controller, the functions and operations in the flowcharts and/or block diagrams are carried out. The program code may execute entirely on a machine, partly on a machine, as a stand-alone software package partly on a machine and partly on a remote machine, or entirely on a remote machine or server.
Programs can be stored and supplied to a computer using various types of non-transitory computer-readable media. Non-transitory computer-readable media include various types of tangible storage media. Examples of non-transitory computer-readable media include magnetic recording media, magneto-optical recording media, optical disc media, and semiconductor memories. Magnetic recording media include, for example, flexible disks, magnetic tapes, and hard disk drives. Magneto-optical recording media include, for example, magneto-optical disks. Optical disc media include, for example, Blu-ray discs, CD (Compact Disc)-ROMs (Read Only Memory), CD-Rs (Recordable), and CD-RWs (ReWritable). Semiconductor memories include, for example, solid-state drives, mask ROMs, PROMs (Programmable ROMs), EPROMs (Erasable PROMs), flash ROMs, and RAMs (Random Access Memories). Programs may also be supplied to a computer by various types of transitory computer-readable media. Examples of transitory computer-readable media include electrical signals, optical signals, and electromagnetic waves. Transitory computer-readable media can supply a program to a computer via a wired channel such as an electric wire or optical fiber, or via a wireless channel.
<Processing>
Next, an example of the processing of the information processing device 10 according to the embodiment will be described with reference to FIGS. 4 to 12. FIG. 4 is a flowchart showing an example of the processing of the information processing device 10 according to the embodiment. FIG. 5 is a diagram showing an example of three-dimensional data of the object 50 measured at a first time point according to the embodiment. FIG. 6 is a diagram showing an example of three-dimensional data of the object 50 measured at a second time point according to the embodiment. FIG. 7 is a diagram showing an example of a model fitted to the three-dimensional data of the object 50 measured at the first time point according to the embodiment. FIG. 8 is a diagram showing an example of a model fitted to the three-dimensional data of the object 50 measured at the second time point according to the embodiment. FIG. 9 is a diagram showing an example of the incident angle on each plane region of the object 50 according to the embodiment. FIG. 10 is a diagram showing an example in which the models fitted to the three-dimensional data of the object 50 at each time point according to the embodiment are arranged in the same three-dimensional space. FIG. 11 is a diagram showing a display example of the displacement amount of the object 50 according to the embodiment. FIG. 12 is a diagram showing an example of information recorded in a threshold DB 1201 according to the embodiment.
In step S101, the acquisition unit 11 of the information processing device 10 acquires the three-dimensional data of the object 50 measured by the sensor 20 at the first time point (hereinafter also referred to as the "first three-dimensional data" as appropriate) and the three-dimensional data of the object 50 measured by the sensor 20 at the second time point (hereinafter also referred to as the "second three-dimensional data" as appropriate). FIG. 5 shows an example of point cloud data 501 included in the first three-dimensional data. FIG. 6 shows an example of point cloud data 601 included in the second three-dimensional data. In the examples of FIGS. 5 and 6, the point cloud data 501 and the point cloud data 601 are each a collection of points having coordinate data in a three-dimensional XYZ coordinate system.
The three-dimensional data of the object 50 may also include point cloud data that is a collection of points having coordinate data convertible into three-dimensional XYZ coordinates. In this case, the point cloud data may be, for example, polar coordinate data (sets of elevation angle, horizontal angle, and distance) with the position of the sensor 20 as the origin.
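The conversion from such polar coordinate data to XYZ coordinates can be sketched, for example, as follows. This is an illustrative sketch: the exact angle conventions depend on the particular sensor, and the function name is assumed.

```python
import math

def polar_to_xyz(elevation_rad: float, azimuth_rad: float, distance: float):
    """Convert an (elevation angle, horizontal angle, distance) measurement,
    with the sensor at the origin, into XYZ coordinates."""
    # Project the range onto the horizontal plane, then split it by azimuth.
    horizontal = distance * math.cos(elevation_rad)
    x = horizontal * math.cos(azimuth_rad)
    y = horizontal * math.sin(azimuth_rad)
    z = distance * math.sin(elevation_rad)
    return (x, y, z)

# A point 10 m away, straight ahead and level with the sensor:
print(polar_to_xyz(0.0, 0.0, 10.0))  # (10.0, 0.0, 0.0)
```

Applying this conversion to every measured point yields the XYZ point cloud form described above.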
The three-dimensional data of the object 50 may also include a depth image, which expresses three-dimensional information by storing distance information from the sensor as the pixel values of a two-dimensional image. The three-dimensional data of the object 50 may also include mesh data, which expresses three-dimensional information as a collection of vertices, edges, and faces.
 Subsequently, the specifying unit 12 of the information processing device 10 fits a predetermined model to each of the first three-dimensional data and the second three-dimensional data (step S102). Here, the specifying unit 12 may determine, for example, the amount of rotation, the amount of translation, and the like that, when a predetermined figure or predetermined three-dimensional data is rotated and translated in three-dimensional space, best match it to the three-dimensional data of the object 50 at each time point.
 (Example of fitting to a predetermined figure)
 The specifying unit 12 may, for example, as the fitting of the predetermined model to each set of measured three-dimensional data of the object, detect a region matching a predetermined figure (for example, one or more planes) from the first three-dimensional data and detect a region matching the predetermined figure from the second three-dimensional data. In this case, the specifying unit 12 may, for example, detect a first planar region from the first three-dimensional data and a second planar region from the second three-dimensional data.
 In this case, the specifying unit 12 may detect, for example, a combination of two mutually non-parallel planar regions from the three-dimensional data of the object 50. FIG. 7 shows an example of a combination of planar regions 701 and 702 detected from the point cloud data 501 included in the first three-dimensional data, and FIG. 8 shows an example of a combination of planar regions 801 and 802 detected from the point cloud data 601 included in the second three-dimensional data. In this case, the specifying unit 12 may calculate, for example, the coefficients (a, b, c, d) of the plane equation shown in the following equation (1) for each of the two planar regions:
 ax + by + cz + d = 0 ... (1)
 In this case, the specifying unit 12 may calculate, for example, two sets of coefficients of the plane equation, one per planar region.
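The calculation of the coefficients (a, b, c, d) of equation (1) for a set of points can be sketched as a least-squares fit. The snippet below is an illustrative sketch only — the embodiment does not prescribe a particular implementation, and the function name `fit_plane` is hypothetical. It uses the fact that the plane normal is the direction of least variance of the centered points, i.e. the smallest right singular vector:

```python
import numpy as np

def fit_plane(points):
    """Fit a plane a*x + b*y + c*z + d = 0 to an (N, 3) point array.

    Least-squares fit via the centroid and the smallest singular vector
    of the centered points (equivalent to a PCA plane fit).
    """
    centroid = points.mean(axis=0)
    # Rows of vt are right singular vectors; the last one is the
    # direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                      # (a, b, c), unit length
    d = -normal.dot(centroid)
    return normal[0], normal[1], normal[2], d

# Points sampled on the plane z = 2, i.e. 0*x + 0*y + 1*z - 2 = 0.
pts = np.array([[0.0, 0.0, 2.0], [1.0, 0.0, 2.0],
                [0.0, 1.0, 2.0], [1.0, 1.0, 2.0]])
a, b, c, d = fit_plane(pts)
```

Note that the sign of the returned normal is arbitrary: (a, b, c, d) and (-a, -b, -c, -d) describe the same plane.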
 The specifying unit 12 may also, for example, detect a straight line from the three-dimensional data of the object 50 as the fitting of the predetermined model to each set of measured three-dimensional data of the object. In this case, the specifying unit 12 may, for example, detect two mutually non-parallel planar regions from the three-dimensional data of the object 50 and detect the line of intersection of the two detected planar regions as the straight line.
 In this case, the specifying unit 12 may calculate, for example, the set of coefficients (a1, b1, c1, x0, y0, z0) of the line equation shown in the following equation (2), where (a1, b1, c1) indicates the direction of the line and (x0, y0, z0) indicates the coordinates of a point through which the line passes:
 (x - x0)/a1 = (y - y0)/b1 = (z - z0)/c1 ... (2)
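The direction (a1, b1, c1) and point (x0, y0, z0) of equation (2) can be derived from two fitted planes as sketched below — an illustration assuming the planes are given in the (a, b, c, d) form of equation (1); the function name is hypothetical. The line direction is the cross product of the two plane normals:

```python
import numpy as np

def plane_intersection_line(p1, p2):
    """Intersection line of two planes given as (a, b, c, d) tuples.

    Returns (direction, point): the direction vector (a1, b1, c1) of the
    line and one point (x0, y0, z0) on it, matching equation (2).
    """
    n1, d1 = np.asarray(p1[:3], float), p1[3]
    n2, d2 = np.asarray(p2[:3], float), p2[3]
    direction = np.cross(n1, n2)
    if np.linalg.norm(direction) < 1e-12:
        raise ValueError("planes are parallel; no unique intersection line")
    # One point on the line: solve n1.p = -d1, n2.p = -d2, direction.p = 0.
    A = np.vstack([n1, n2, direction])
    point = np.linalg.solve(A, np.array([-d1, -d2, 0.0]))
    return direction, point

# Example: the plane x = 1 meets the plane y = 2 in a vertical line.
direction, point = plane_intersection_line((1, 0, 0, -1), (0, 1, 0, -2))
```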
 ((Detection of planar regions))
 The specifying unit 12 may detect planar regions from the three-dimensional data using, for example, RANSAC (Random Sample Consensus). RANSAC is a technique for finding a planar region in three-dimensional data by repeating, multiple times, a sequence consisting of extracting random data points, fitting a plane to the extracted points, and evaluating the approximation accuracy of the fitted plane. With RANSAC, multiple planar regions are detected preferentially in order of decreasing area. Consequently, surfaces other than those of the monitored object 50 (for example, a railroad track), such as the ground or the side of a building, may be detected first. The specifying unit 12 may therefore exclude planar regions belonging to objects other than the monitored object 50, based on, for example, information indicating the shape of the monitored object 50 specified in advance. In this case, the specifying unit 12 may exclude, for example, planar regions that are not elongated (for example, regions in which the ratio of the horizontal length to the vertical length is at most a threshold value) and planar regions that do not include a predetermined range of z coordinates (heights).
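The RANSAC loop described above (random sampling, plane fitting, accuracy evaluation, repeated) can be sketched in a few lines of Python. This is a minimal illustration, not the implementation used by the specifying unit 12; parameter values such as the inlier threshold and iteration count are assumptions:

```python
import numpy as np

def ransac_plane(points, n_iters=200, threshold=0.05, rng=None):
    """Minimal RANSAC plane detector for an (N, 3) point array.

    Repeatedly samples 3 random points, forms the plane through them,
    and counts inliers within `threshold`; keeps the plane with the
    most inliers. Returns (normal, d, inlier_mask) for
    a*x + b*y + c*z + d = 0.
    """
    rng = np.random.default_rng(rng)
    best = (None, None, np.zeros(len(points), bool))
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:            # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal.dot(p0)
        dist = np.abs(points @ normal + d)
        inliers = dist < threshold
        if inliers.sum() > best[2].sum():
            best = (normal, d, inliers)
    return best

# Synthetic data: 100 points near the plane z = 0 plus 10 far outliers.
data_rng = np.random.default_rng(0)
plane_pts = np.column_stack([data_rng.uniform(-1, 1, (100, 2)),
                             data_rng.normal(0, 0.01, 100)])
outliers = data_rng.uniform(-1, 1, (10, 3)) + np.array([0, 0, 3])
normal, d, mask = ransac_plane(np.vstack([plane_pts, outliers]), rng=0)
```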
 The specifying unit 12 may also, for example, extract (cut out) data in a preset three-dimensional region from the three-dimensional data of the object 50, so that planar regions belonging to objects other than the monitored object 50 are not detected.
 Further, when detecting the second planar region, the specifying unit 12 may, for example, extract (cut out) data within a predetermined range of the already detected first planar region, so that other planar regions are not detected. This can increase the likelihood that the second side surface is detected appropriately even when, for example, the sensor 20 obtains only a small number of data points from the second planar region.
 Following the RANSAC processing, the specifying unit 12 may also execute, for example, a high-accuracy plane approximation process using principal component analysis or the like. This can improve the accuracy of the approximation when fitting a plane to the three-dimensional data of the object 50. Principal component analysis is, for example, a technique for calculating the plane that minimizes the error with respect to the three-dimensional data.
 (((Example of selecting the plane to be detected based on its angle to the sensor 20)))
 The specifying unit 12 may detect a plurality of planar regions from the three-dimensional data of the object 50 and, based on the orientation (angle) of each detected planar region with respect to the sensor 20, specify which of those planar regions to fit (to use for specifying the displacement). In this case, the specifying unit 12 may assign a higher selection priority to a planar region the smaller the angle between the line of sight from the origin (the position of the sensor 20) and the normal of that planar region (the angle of incidence on the planar region). In this way, when the accuracy of the three-dimensional data measured by the sensor 20 (how faithfully the three-dimensional data reproduces the real-world track) depends on the angle of incidence, a planar region with higher accuracy can be selected as the planar region to be fitted.
 In the example of FIG. 7, the top surface 51 of the object 50, which is a rail, has a relatively large angle of incidence θ1 from the sensor 20, and the side surface 52 of the object 50 has a relatively small angle of incidence θ2 from the sensor 20. Therefore, when the top surface 51 and the side surface 52 are both detected, the side surface 52 is selected as the planar region to be fitted to the object 50.
 (Example of fitting to predetermined three-dimensional data)
 The specifying unit 12 may, for example, as the fitting of the predetermined model to each set of measured three-dimensional data of the object, calculate a first parameter for superimposing (approximating) predetermined three-dimensional data onto the first three-dimensional data and a second parameter for superimposing the predetermined three-dimensional data onto the second three-dimensional data. In this case, the specifying unit 12 may calculate the parameters of a rigid transformation between the three-dimensional data of the object 50 and the predetermined three-dimensional data using, for example, ICP (Iterative Closest Point). ICP is a technique that applies the optimal rigid transformation (three-dimensional rotation, translation, and the like) to one set of three-dimensional data so as to superimpose it on the other. In this case, the calculated parameters may include, for example, information indicating the position of the center of gravity in three-dimensional space and a rotation matrix.
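The rigid-transformation parameters (a rotation matrix and a translation, from which the superimposed center-of-gravity position follows) can be estimated as sketched below: a toy ICP that alternates brute-force nearest-neighbour matching with an SVD-based (Kabsch) rigid fit. Practical ICP implementations use k-d trees and outlier rejection; all names here are illustrative, not the embodiment's implementation:

```python
import numpy as np

def kabsch(src, dst):
    """Best-fit rotation R and translation t mapping the (N, 3) array
    `src` onto the corresponding (N, 3) array `dst`."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction keeps R a proper rotation (det = +1).
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = cd - R @ cs
    return R, t

def icp(src, dst, n_iters=20):
    """Toy ICP: brute-force nearest neighbours + Kabsch, repeated."""
    cur = src.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(n_iters):
        # Nearest neighbour in dst for each point of cur (O(N*M)).
        idx = np.argmin(((cur[:, None] - dst[None]) ** 2).sum(-1), axis=1)
        R, t = kabsch(cur, dst[idx])
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Recover a known shift applied to a small point cloud.
rng = np.random.default_rng(1)
model = rng.uniform(-1, 1, (30, 3))
shifted = model + np.array([0.2, -0.1, 0.05])
R, t = icp(model, shifted)
```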
 The predetermined three-dimensional data may be, for example, three-dimensional data acquired in advance by measuring the object 50, or an object similar to the object 50, with the sensor 20 at a relatively short distance. The predetermined three-dimensional data may also be, for example, three-dimensional data created based on design data of the object 50.
 Subsequently, the specifying unit 12 of the information processing device 10 specifies the displacement of the object 50 between the first time point and the second time point, based on the result of fitting the predetermined model to the first three-dimensional data and the result of fitting the predetermined model to the second three-dimensional data (step S103).
 Here, the specifying unit 12 may calculate, as the displacement, for example, the spatial difference obtained when the result of fitting the predetermined model to the first three-dimensional data and the result of fitting the predetermined model to the second three-dimensional data are placed in the same three-dimensional space. In this case, the specifying unit 12 may calculate, for example, for each point on the predetermined model fitted to the first three-dimensional data, the shortest distance to the predetermined model fitted to the second three-dimensional data as the difference for that point. Alternatively, the specifying unit 12 may calculate, for example, a rigid transformation for superimposing the predetermined model fitted to the first three-dimensional data onto the predetermined model fitted to the second three-dimensional data, and may then calculate, as the difference for each point, the amount by which that point on the predetermined model fitted to the first three-dimensional data moves when the rigid transformation is applied to it.
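For a plane model, the per-point shortest distance described above reduces to the standard point-to-plane distance. A sketch, assuming the plane is given in the (a, b, c, d) form of equation (1); the function name and the example figures (a 3 cm uniform rise) are illustrative:

```python
import numpy as np

def point_to_plane_displacement(points, plane):
    """Signed shortest distance from each row of an (N, 3) point array
    to the plane a*x + b*y + c*z + d = 0; its absolute value is the
    per-point spatial difference used as the displacement measure."""
    a, b, c, d = plane
    n = np.array([a, b, c], float)
    return (points @ n + d) / np.linalg.norm(n)

# Displacement of the model fitted at time 1 (points on z = 0) relative
# to the plane fitted at time 2 (z = 0.03, i.e. a uniform 3 cm rise).
pts_t1 = np.array([[0.0, 0.0, 0.0], [1.0, 2.0, 0.0], [5.0, -1.0, 0.0]])
disp = np.abs(point_to_plane_displacement(pts_t1, (0, 0, 1, -0.03)))
```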
 FIG. 10 shows an example in which the planar regions 701 and 702 of FIG. 7 and the planar regions 801 and 802 of FIG. 8 are placed in the same three-dimensional space.
 Subsequently, the output unit 13 of the information processing device 10 outputs information corresponding to the displacement specified by the specifying unit 12 (step S104). Here, the output unit 13 may graphically display the displacement in three-dimensional space specified by the specifying unit 12, for example as shown in FIG. 11. In the example of FIG. 11, each point included in the point cloud data 501 of FIG. 5 is displayed in a color corresponding to the value of the difference between the planar regions 701 and 702 of FIG. 7 and the planar regions 801 and 802 of FIG. 8.
 The output unit 13 may also, for example, display the amount of displacement specified by the specifying unit 12 as a numerical value. In this case, the output unit 13 may output, for example, the maximum value of the displacement amount, or may output a list of displacement values at predetermined intervals.
 The output unit 13 may also, for example, calculate and output, based on the displacement amount specified by the specifying unit 12, the amounts of various types of track irregularity (for example, alignment irregularity, longitudinal level irregularity, and the like) between the first time point and the second time point. In this case, the output unit 13 may calculate, for example, the direction of the line of intersection of the two planes detected from the three-dimensional data of the object 50 as the track direction of the object 50 (for example, a railroad track). Alternatively, the output unit 13 may calculate, for example, the direction in which the three-dimensional data of the object 50 extends furthest as the track direction of the object 50 (for example, a railroad track).
 The output unit 13 may also, for example, output information corresponding to the amounts of various types of track irregularity (for example, alignment irregularity, longitudinal level irregularity, and the like) between the first time point and the second time point. In this case, the output unit 13 may refer to, for example, the threshold DB 1201 of FIG. 12 and, when the amount of a specific type of track irregularity between the first time point and the second time point is at least the threshold value corresponding to the combination of the sensor 20 and that specific type of track irregularity, output an alarm indicating the specific type. In the example of FIG. 12, the threshold DB 1201 records (registers, sets) a threshold value for determining whether an alarm is necessary for each combination of sensor ID and track irregularity type. The sensor ID is identification information of the sensor 20. The threshold DB 1201 may be stored (set, registered) in advance in a storage device inside or outside the information processing device 10.
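The lookup against the threshold DB 1201 can be modelled as a simple keyed table. The following sketch uses a Python dict keyed by (sensor ID, irregularity type); the field names, units, and message format are assumptions for illustration, not part of the embodiment:

```python
def track_alerts(displacements, threshold_db, sensor_id):
    """Return alarm messages for every irregularity type whose measured
    amount meets or exceeds the threshold registered for the combination
    of sensor and type.

    `threshold_db` models the threshold DB of FIG. 12 as a dict keyed by
    (sensor_id, irregularity_type); names and units are illustrative.
    """
    alerts = []
    for kind, amount in displacements.items():
        limit = threshold_db.get((sensor_id, kind))
        if limit is not None and amount >= limit:
            alerts.append(f"sensor {sensor_id}: {kind} irregularity "
                          f"{amount:.1f} mm >= threshold {limit:.1f} mm")
    return alerts

# Hypothetical thresholds for sensor "S1" and two irregularity types.
thresholds = {("S1", "alignment"): 7.0, ("S1", "longitudinal_level"): 9.0}
measured = {"alignment": 8.2, "longitudinal_level": 4.0}
alerts = track_alerts(measured, thresholds, "S1")
```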
 <Modifications>
 The information processing device 10 may be a device contained in a single housing, but the information processing device 10 of the present disclosure is not limited to this. Each unit of the information processing device 10 may be implemented by, for example, cloud computing configured with one or more computers. The information processing device 10 and the terminal 30 may also be integrated into a single information processing device, and at least part of the processing of the information processing device 10 may be executed by the terminal 30. Information processing devices such as these are also included in examples of the "information processing device" of the present disclosure.
 The present disclosure is not limited to the above embodiments and can be modified as appropriate without departing from its spirit.
 Some or all of the above embodiments may also be described as in the following supplementary notes, but are not limited to the following.
 (Supplementary note 1)
 An information processing device comprising:
 acquisition means for acquiring three-dimensional data of an object measured at a first time point and three-dimensional data of the object measured at a second time point;
 specifying means for specifying a displacement of the object between the first time point and the second time point, based on a result of fitting a predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point; and
 output means for outputting information corresponding to the displacement.
 (Supplementary note 2)
 The information processing device according to supplementary note 1, wherein the specifying means, as the fitting of the predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point, detects a first planar region from the three-dimensional data of the object measured at the first time point and detects a second planar region from the three-dimensional data of the object measured at the second time point.
 (Supplementary note 3)
 The information processing device according to supplementary note 1 or 2, wherein the specifying means, as the fitting of the predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point, detects a line of intersection of a first plurality of planar regions from the three-dimensional data of the object measured at the first time point and detects a line of intersection of a second plurality of planar regions from the three-dimensional data of the object measured at the second time point.
 (Supplementary note 4)
 The information processing device according to supplementary note 2 or 3, wherein the specifying means detects a plurality of planar regions from each set of three-dimensional data of the object measured by a sensor at the first time point and the second time point, and specifies, based on the orientation of each detected planar region with respect to the sensor, which of the planar regions to use for specifying the displacement.
 (Supplementary note 5)
 The information processing device according to any one of supplementary notes 1 to 4, wherein the specifying means, as the fitting of the predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point, calculates a first parameter of a rigid transformation between the three-dimensional data of the object measured at the first time point and predetermined three-dimensional data, and calculates a second parameter of a rigid transformation between the three-dimensional data of the object measured at the second time point and the predetermined three-dimensional data.
 (Supplementary note 6)
 The information processing device according to supplementary note 5, wherein the predetermined three-dimensional data is three-dimensional data specified based on at least one of three-dimensional data corresponding to the object and design data of the object.
 (Supplementary note 7)
 The information processing device according to any one of supplementary notes 1 to 6, wherein the object is a railroad track, and the output means outputs information indicating a specific type of track irregularity when the amount of track irregularity of the specific type is at least a threshold value corresponding to the specific type.
 (Supplementary note 8)
 An information processing method comprising:
 acquiring three-dimensional data of an object measured at a first time point and three-dimensional data of the object measured at a second time point;
 specifying a displacement of the object between the first time point and the second time point, based on a result of fitting a predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point; and
 outputting information corresponding to the displacement.
 (Supplementary note 9)
 A non-transitory computer-readable medium storing a program that causes a computer to execute:
 a process of acquiring three-dimensional data of an object measured at a first time point and three-dimensional data of the object measured at a second time point;
 a process of specifying a displacement of the object between the first time point and the second time point, based on a result of fitting a predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point; and
 a process of outputting information corresponding to the displacement.
1 monitoring system
10 information processing device
11 acquisition unit
12 specifying unit
13 output unit
20 sensor
30 terminal
50 object

Claims (9)

  1.  An information processing device comprising:
      acquisition means for acquiring three-dimensional data of an object measured at a first time point and three-dimensional data of the object measured at a second time point;
      specifying means for specifying a displacement of the object between the first time point and the second time point, based on a result of fitting a predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point; and
      output means for outputting information corresponding to the displacement.
  2.  The information processing device according to claim 1, wherein the specifying means, as the fitting of the predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point, detects a first planar region from the three-dimensional data of the object measured at the first time point and detects a second planar region from the three-dimensional data of the object measured at the second time point.
  3.  The information processing device according to claim 1 or 2, wherein the specifying means, as the fitting of the predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point, detects a line of intersection of a first plurality of planar regions from the three-dimensional data of the object measured at the first time point and detects a line of intersection of a second plurality of planar regions from the three-dimensional data of the object measured at the second time point.
  4.  The information processing device according to claim 2 or 3, wherein the specifying means detects a plurality of planar regions from each set of three-dimensional data of the object measured by a sensor at the first time point and the second time point, and specifies, based on the orientation of each detected planar region with respect to the sensor, which of the planar regions to use for specifying the displacement.
  5.  The information processing device according to any one of claims 1 to 4, wherein the specifying means, as the fitting of the predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point, calculates a first parameter of a rigid transformation between the three-dimensional data of the object measured at the first time point and predetermined three-dimensional data, and calculates a second parameter of a rigid transformation between the three-dimensional data of the object measured at the second time point and the predetermined three-dimensional data.
  6.  The information processing device according to claim 5, wherein the predetermined three-dimensional data is three-dimensional data specified based on at least one of three-dimensional data corresponding to the object and design data of the object.
  7.  The information processing device according to any one of claims 1 to 6, wherein the object is a railroad track, and the output means outputs information indicating a specific type of track irregularity when the amount of track irregularity of the specific type is at least a threshold value corresponding to the specific type.
  8.  An information processing method comprising:
      acquiring three-dimensional data of an object measured at a first time point and three-dimensional data of the object measured at a second time point;
      specifying a displacement of the object between the first time point and the second time point, based on a result of fitting a predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point; and
      outputting information corresponding to the displacement.
  9.  A non-transitory computer-readable medium storing a program that causes a computer to execute:
      a process of acquiring three-dimensional data of an object measured at a first time point and three-dimensional data of the object measured at a second time point;
      a process of specifying a displacement of the object between the first time point and the second time point, based on a result of fitting a predetermined model to each of the three-dimensional data of the object measured at the first time point and the three-dimensional data of the object measured at the second time point; and
      a process of outputting information corresponding to the displacement.
PCT/JP2021/048615 2021-12-27 2021-12-27 Information processing device, information processing method, and computer-readable medium WO2023127037A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/048615 WO2023127037A1 (en) 2021-12-27 2021-12-27 Information processing device, information processing method, and computer-readable medium


Publications (1)

Publication Number Publication Date
WO2023127037A1 true WO2023127037A1 (en) 2023-07-06

Family

ID=86998334



Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004325209A (en) * 2003-04-24 2004-11-18 Kokusai Kogyo Co Ltd Measuring device and method for structure displacement
JP2006071356A (en) * 2004-08-31 2006-03-16 Sgs:Kk Three-dimensional movement measuring method using nonprism distance measuring means
JP2007170821A (en) * 2005-12-19 2007-07-05 Enzan Kobo:Kk Three-dimensional displacement measurement method
JP2017033374A (en) * 2015-08-04 2017-02-09 セイコーエプソン株式会社 Data collation device, design data correction device, shape measurement device, data collation method and program


Similar Documents

Publication Publication Date Title
CN106997049B (en) Method and device for detecting barrier based on laser point cloud data
US10534091B2 (en) Method and apparatus for generating road surface, method and apparatus for processing point cloud data, computer program, and computer readable recording medium
EP3401671B1 (en) Detection device and detection method
JP2018531402A (en) Slope stability rider
JP2018531402A6 (en) Slope stability rider
CA3038066C (en) System and method for measuring geometric change in a subterranean structure
CN114444158B (en) Underground roadway deformation early warning method and system based on three-dimensional reconstruction
CN110348138B (en) Method and device for generating real underground roadway model in real time and storage medium
KR101934318B1 (en) Method for processing scanning data using 3-dimensional laser scanner
CN112455502A (en) Train positioning method and device based on laser radar
KR20210069385A (en) Mapping device between image and space, and computer trogram that performs each step of the device
WO2023127037A1 (en) Information processing device, information processing method, and computer-readable medium
JP7146271B2 (en) Buried object measuring device, method, and program
CN113156456A (en) Pavement and tunnel integrated detection method and detection equipment and vehicle
US20230296798A1 (en) Rock fall analyser
KR102357109B1 (en) Tunnel surface mapping system under construction
JP2023072823A (en) Structure shape confirmation system
KR101907057B1 (en) Device and Method for Depth Information Compensation by Sphere Surface Modeling
CN117647789A (en) Method for monitoring deformation of tunnel portal slope based on laser radar
CN117630863A (en) Method for monitoring tunnel lining deformation based on laser radar
US20240257376A1 (en) Method and system for detection a line above ground from a helicopter
Chen et al. Automatic detection of tunnel lining using image processing supported by terrestrial laser scanning technology
JP2023108245A (en) Hollow measuring device, hollow measuring method, and hollow measuring program
JP6664243B2 (en) Building limit display device, building limit display control device, building limit display method, and building limit display control method
EP4348583A1 (en) Method and system for detecting a line above ground from a helicopter

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 21969921

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023570526

Country of ref document: JP