WO2024084601A1 - Change detection method, change detection system, and change detection apparatus


Info

Publication number
WO2024084601A1
Authority
WO
WIPO (PCT)
Prior art keywords
point
coordinate
feature amount
point cloud
coordinates
Prior art date
Application number
PCT/JP2022/038830
Other languages
French (fr)
Japanese (ja)
Inventor
雅也 藤若
Original Assignee
NEC Corporation
Application filed by NEC Corporation
Priority to PCT/JP2022/038830
Publication of WO2024084601A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88: Lidar systems specially adapted for specific applications
    • G01S 17/89: Lidar systems specially adapted for specific applications for mapping or imaging
    • G01V: GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V 8/00: Prospecting or detecting by optical means
    • G01V 8/10: Detecting, e.g. by using light barriers
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis

Definitions

  • the present invention relates to a change detection method, a change detection system, and a change detection device.
  • Patent Document 1 describes a technology that compares previously captured images with currently captured images to detect whether any changes have occurred in the target area. At this time, the target data for the changed area caused by the work is excluded from the output.
  • Patent Document 2 describes a technology that acquires three-dimensional distance data and uses a three-dimensional polar coordinate grid map to detect the presence or absence of an object based on the acquired three-dimensional distance data.
  • Patent Documents 1 and 2 disclose techniques for comparing images and detecting the presence or absence of objects, but they do not address the problem of detecting changes between point clouds that do not represent accurate information, and therefore cannot solve it.
  • the objective of this disclosure is to provide a change detection method, a change detection system, and a change detection device that can improve the accuracy of detecting changes between point clouds, even when the point clouds being compared do not represent accurate information.
  • a change detection method is executed by a computer, and calculates a first feature of a first coordinate in a first point cloud using information about the presence of a point in each of the first small regions in a first region that is composed of a plurality of first small regions and includes the first coordinate, calculates a second feature of a second coordinate in a second point cloud that corresponds to the first coordinate using information that indicates, for each of the second small regions in a second region that is composed of a plurality of second small regions and includes the second coordinate, that a point is present, that a point is not present, or that the presence of a point is unknown, and determines whether a change in the presence or absence of a point has occurred between the first coordinate and the second coordinate using the first feature and the second feature.
  • the change detection system includes a first feature calculation means for calculating a first feature of a first coordinate in a first point cloud using information on the presence of a point in each of the first small regions in a first region that is composed of a plurality of first small regions and includes the first coordinates; a second feature calculation means for calculating a second feature of a second coordinate in a second point cloud that corresponds to the first coordinate using information indicating that a point is present, that a point is not present, or that the presence of a point is unknown in each of the second small regions in a second region that is composed of a plurality of second small regions and includes the second coordinates; and a determination means for determining whether a change in the presence or absence of a point has occurred between the first coordinates and the second coordinates using the first feature and the second feature.
  • a change detection device includes a first feature calculation means for calculating a first feature of a first coordinate in a first point cloud using information on the presence of a point in each of the first small regions in a first region that is composed of a plurality of first small regions and includes the first coordinates; a second feature calculation means for calculating a second feature of a second coordinate in a second point cloud that corresponds to the first coordinate using information indicating that a point is present, that a point is not present, or that the presence of a point is unknown in a second region that is composed of a plurality of second small regions and includes the second coordinates; and a determination means for determining whether a change in the presence or absence of a point has occurred between the first coordinates and the second coordinates using the first feature and the second feature.
  • the present disclosure provides a change detection method, a change detection system, and a change detection device that can improve the accuracy of detecting changes between point clouds, even when the point clouds being compared do not show accurate information.
  • FIG. 1 is a block diagram showing an example of a change detection device according to the first embodiment.
  • FIG. 2 is a diagram for explaining calculations performed by a feature amount calculation unit according to the first embodiment.
  • FIG. 3 is a flowchart showing an example of a representative process of the change detection device according to the first embodiment.
  • FIG. 4 is a block diagram showing an example of a change detection system according to the first embodiment.
  • FIG. 5 is a block diagram showing an example of a monitoring system according to the second embodiment.
  • FIG. 6 is a block diagram showing an example of a center server according to the second embodiment.
  • FIG. 7A shows an example of a reference point group according to the second embodiment, and FIG. 7B shows an example of an input point group according to the second embodiment.
  • FIG. 9A is a diagram showing a polar coordinate system centered on the coordinate that is the target of calculation in the reference point group in the second embodiment.
  • FIG. 9B is an example of a spatial feature amount calculated for a coordinate of the reference point group according to the second embodiment.
  • FIG. 9C is an example of a spatial feature amount calculated for a coordinate of the input point group according to the second embodiment.
  • FIG. 9D is a diagram for explaining formula (7) in the examples of FIGS. 9B and 9C.
  • Flowcharts show an outline of an example of processing by the center server according to the second embodiment, an example of detailed processing of the center server, and another example of detailed processing of the center server.
  • One image shows an example in which changes are detected using the direct comparison method, and another shows an example in which a change is detected using the technique of the present disclosure.
  • A block diagram shows another example of the center server according to the second embodiment.
  • Flowcharts show an outline of an example of processing by the center server according to the second embodiment and an example of detailed processing of the center server.
  • A flowchart shows another example of detailed processing of the center server.
  • Block diagrams show further examples of the center server according to the second embodiment.
  • Flowcharts show outlines of examples of processing by the center server according to the second embodiment.
  • A block diagram shows an example of a hardware configuration of an apparatus according to each embodiment.
  • FIG. 1 is a block diagram showing an example of a change detection device.
  • the change detection device 10 includes a first feature amount calculation unit 11, a second feature amount calculation unit 12, and a determination unit 13. Each unit (means) of the change detection device 10 is controlled by a control unit (controller) (not shown). Each unit will be described below.
  • the first feature amount calculation unit 11 calculates the first feature amount of the first coordinate in the first point cloud.
  • the first point cloud is first data indicating the presence or absence of a point at each coordinate in a predetermined area.
  • the first point cloud represents, for example, the shape of an object in a three-dimensional space, and an example thereof is data obtained by using a sensor and visualizing an object at a predetermined location.
  • the sensor used may be, for example, a range sensor or an imaging element such as a camera. When a range sensor is used, the first point cloud becomes mapping data obtained by measuring and visualizing an object at a predetermined location.
  • a specific example of the range sensor is LiDAR (Light Detection And Ranging), which detects and ranges objects with light.
  • Both 3D LiDAR and 2D LiDAR can be used as the LiDAR.
  • An example of the case where 3DLiDAR is used will be described later in the second embodiment.
  • when an imaging element such as a camera is used, the first point cloud becomes mapping data generated based on a two-dimensional image of a predetermined location.
  • the above-mentioned measurements or photographs may be actual measurements or photographs, or may be virtual measurements or photographs performed by a computer.
  • the objects indicated by the first point cloud are not limited to these.
  • the change detection device 10 may obtain the first point cloud data from outside the change detection device 10, or may generate the first point cloud data within the change detection device 10.
  • the first feature amount calculation unit 11 calculates the first feature amount as follows.
  • the first feature amount calculation unit 11 defines a first area that is composed of a plurality of first small areas and includes first coordinates for the first point group.
  • the first feature amount calculation unit 11 then calculates the first feature amount by using information about the presence of a point in each first small area.
  • the information about the presence of a point may be, for example, information indicating whether a point is present or not present in each first small area.
  • the information about the presence of a point may be information indicating whether a point is present, not present, or the presence of a point is unknown in each first small area. The definition of the presence of a point being unknown will be described later.
  • the information about the presence of a point may be generated by the first feature amount calculation unit 11 by analyzing the acquired first point group, or may be included in information acquired by the first feature amount calculation unit 11 from outside.
  • FIG. 2 is a diagram for explaining the calculations of the first feature amount calculation unit 11.
  • in FIG. 2, the first point group is G1, the first coordinate is FC, and the first region including FC is R1. Region R1 is divided into small regions SR1. The presence of a point in point group G1 is indicated by a black circle, and the absence of a point is indicated by a white circle.
  • the first feature amount calculation unit 11 defines a region R1 that is composed of multiple small regions SR1 and that includes the coordinates FC, for which the feature amount is to be calculated.
  • the first feature amount calculation unit 11 then calculates the feature amount for the coordinates FC by using information about the presence of points in each small region SR1 of the region R1.
  • the calculated first feature amount may be expressed as a scalar amount or a vector amount.
  • when the first feature amount is expressed as a scalar amount, the second feature amount (described below) corresponding to the first feature amount is also expressed as a scalar amount. In the judgment process of the judgment unit 13 described below, the first feature amount is then compared with the corresponding second feature amount.
  • when the first feature amount is expressed as a vector amount, the corresponding second feature amount is also expressed as a vector amount. In the judgment process of the judgment unit 13 described below, each element of the first feature amount is then compared with the corresponding element of the second feature amount.
  • a specific example in which a feature amount is expressed as a vector amount will be described in detail in the second embodiment.
  • FIG. 2 shows an example in which information on the presence or absence of one point is associated with one small region SR1.
  • information on the presence or absence of multiple points may be associated with one small region SR1.
  • the number of small regions SR1 contained in region R1 and the shapes of region R1 and small regions SR1 are arbitrary.
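  • as a concrete illustration of the above, the following is a minimal sketch of computing a feature amount for a coordinate FC from the presence of points in the small regions of a region R1; the cubic region shape, function name, and parameters are illustrative assumptions, not something the text prescribes:

```python
# A minimal sketch (not from the patent text): the region R1 around FC is split
# into small regions (here, cubic cells), and the feature is a vector with one
# element per small region indicating whether at least one point falls inside it.
import numpy as np

def occupancy_feature(points: np.ndarray, fc: np.ndarray,
                      half_extent: float = 1.0, cells_per_axis: int = 4) -> np.ndarray:
    """points: (N, 3) point cloud; fc: (3,) coordinate the feature is computed for.
    Returns a flattened 0/1 vector, one element per small region of R1."""
    grid = np.zeros((cells_per_axis,) * 3, dtype=np.int8)
    cell = (2.0 * half_extent) / cells_per_axis
    # Index of the small region each point falls into, relative to region R1.
    idx = np.floor((points - (fc - half_extent)) / cell).astype(int)
    inside = np.all((idx >= 0) & (idx < cells_per_axis), axis=1)
    for i, j, k in idx[inside]:
        grid[i, j, k] = 1          # a point is present in this small region
    return grid.ravel()            # vector amount; grid.sum() would give a scalar amount
```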
  • the second feature calculation unit 12 calculates a second feature of a second coordinate in the second point cloud, which is different from the first point cloud.
  • the second point cloud is second data indicating the presence or absence of a point at each coordinate in a specified region, and an example thereof is similar to that of the first point cloud.
  • the change detection device 10 may obtain the second point cloud data from outside the change detection device 10, or may generate the second point cloud data within the change detection device 10.
  • the first point cloud and the second point cloud are point clouds to be compared, and are, for example, mapping data obtained by measuring the same location. The first coordinates and the second coordinates are also compared; for example, the first coordinates and the second coordinates may indicate the same position, but the relationship between them is not limited to this.
  • the change detection device 10 calculates feature amounts for a first coordinate in a first point cloud and a second coordinate in a second point cloud that corresponds to the first coordinate by using a first feature amount calculation unit 11 and a second feature amount calculation unit 12. By using these feature amounts, it is possible to determine whether or not a change in the presence or absence of a point has occurred between the first coordinate and the second coordinate.
  • the second feature amount calculation unit 12 calculates the second feature amount as follows.
  • the second feature amount calculation unit 12 defines a second area, which is composed of a plurality of second small areas and includes second coordinates, for the second point group.
  • the definition of this second area and the second small areas is the same as the method executed by the first feature amount calculation unit 11, and is as described in the example of FIG. 2.
  • the second feature calculation unit 12 calculates the second feature using information indicating that, in each second small region in this second region, a point is present, a point is not present, or the presence of a point is unknown. "The presence of a point is unknown" indicates that, although the presence or absence of a point is defined for the second small region in the second point cloud acquired by the second feature calculation unit 12, it is regarded as unknown whether a point actually exists there.
  • for a second small region in which the presence of a point is unknown, the second feature calculation unit 12 calculates the feature for that small region so as to be different from both the case where a point exists and the case where a point does not exist.
  • the existence of a point, the absence of a point, or the unknown existence of a point can be defined, for example, as follows:
  • the second point cloud is mapping data acquired by measurement using a range sensor. If there is a point in the second small region, the second small region is defined as having a point present. On the other hand, if there is no point in the second small region, it is determined whether the second small region is located between the position where the point exists in the second point cloud at the time of measurement and the position of the range sensor. If the second small region is located between the position where the point exists in the second point cloud at the time of measurement and the position of the range sensor, it is defined as not having a point in the second small region. On the other hand, if the second small region is not located between the position where the point exists in the second point cloud and the position of the range sensor, the presence of the point in the second small region is defined as unknown.
  • This definition is based on the assumption that when a range sensor acquires mapping data, light should be incident on the range sensor from an object shown in the mapping data, and no object should exist between the object and the range sensor. Ray tracing technology can be applied to this definition.
  • This is a particularly effective definition for improving detection accuracy, for example, when the density of points in the second point cloud is sparser than the density of points in the first point cloud.
  • a location where there is no point in the second point cloud may not only be a location where there is no point in reality, but also a location where an object exists in the measurement but is not recorded as data. In this case, when it is not possible to determine that there is no point in reality, it is preferable to define the location where there is no point as a location where the presence of a point is unknown.
  • let N be the number of points in the second small region, and let M be the number of times that light passes through the second small region. The presence of a point, the absence of a point, or the unknown presence of a point can then be defined according to the values of N-M and N+M.
  • when N+M is equal to or smaller than a threshold Th1 (Th1 is an integer equal to or greater than 0), the presence of a point in the second small region is defined as being unknown. This is because the number of points in the second small region in the second point group is small, and the number of cases in which light from other points in the second point group passes through the second small region is also small, making it difficult to determine whether or not a point exists in the second small region.
  • when N+M is greater than Th1, the presence or absence of a point in the second small region is defined depending on whether or not the value of N-M is equal to or greater than a threshold Th2 (Th2 is an integer, and is 0 as an example, but is not limited to this).
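  • a minimal sketch of this N/M rule follows; the enum encoding (1: present, 2: absent, 0: unknown) matches the example vector representation used later in the second embodiment:

```python
# Classify a small region from N (points measured inside it) and M (rays that
# passed through it), following the thresholds Th1 and Th2 described above.
from enum import Enum

class PointState(Enum):
    UNKNOWN = 0
    PRESENT = 1
    ABSENT = 2

def classify_small_region(n: int, m: int, th1: int = 0, th2: int = 0) -> PointState:
    if n + m <= th1:
        # Too few observations to judge: the presence of a point is unknown.
        return PointState.UNKNOWN
    # Otherwise decide by whether N - M reaches the threshold Th2.
    return PointState.PRESENT if (n - m) >= th2 else PointState.ABSENT
```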
  • the information shown above about whether a point exists, does not exist, or is unknown may be generated by the second feature calculation unit 12 by analyzing the acquired second point cloud, or may be included in information acquired by the second feature calculation unit 12 from outside.
  • the determination unit 13 uses the first feature calculated by the first feature calculation unit 11 and the second feature calculated by the second feature calculation unit 12 to determine whether or not a change in the presence or absence of a point has occurred between the first coordinate and the corresponding second coordinate.
  • the determination method of the determination unit 13 may be performed by any calculation process, such as arithmetic operations, using the first feature amount and the second feature amount, or may be performed by an algorithm based on a predefined rule base. For example, when the first feature amount and the second feature amount are scalar amounts, the determination unit 13 may calculate the difference between the first feature amount and the second feature amount and determine whether the difference is equal to or greater than a threshold value.
  • when the difference is equal to or greater than the threshold value, the determination unit 13 determines that a change in the presence or absence of a point has occurred between the first coordinates and the second coordinates; when the difference is less than the threshold value, the determination unit 13 determines that no such change has occurred.
  • when the first feature amount and the second feature amount are vector amounts, the determination unit 13 may compare corresponding elements of each vector amount and determine whether a change in the presence or absence of a point has occurred between the first coordinate and the second coordinate based on the comparison results for all elements. For example, when comparing elements, the determination unit 13 may determine whether the elements have the same value, or whether the difference between the elements is larger or smaller than a threshold, and calculate a similarity as the comparison result over all elements based on these determinations. If this similarity is equal to or greater than a predetermined threshold, it is determined that there is no change in the presence or absence of a point between the first coordinate and the second coordinate.
  • the judgment method of the judgment unit 13 may be performed using an AI (Artificial Intelligence) model that has been trained in advance, such as a neural network.
  • this learning is performed by inputting teacher data that includes, as samples, information on the first and second feature amounts together with corresponding information (correct answer labels) indicating whether a change in the presence or absence of a point has occurred between the first coordinate and the second coordinate.
  • the judgment unit 13 inputs the first feature amount calculated by the first feature amount calculation unit 11 and the second feature amount calculated by the second feature amount calculation unit 12 to the AI model. Based on this input information, the AI model outputs information indicating whether a change in the presence or absence of a point has occurred between the first coordinate and the second coordinate. Even in this way, the judgment unit 13 can execute the judgment process.
  • any technique, such as logistic regression or a neural network, can be used to train the learning model.
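  • as one hedged illustration (the text does not prescribe a library or model), the logistic-regression variant could be set up with scikit-learn as follows:

```python
# A sketch of the learned-model variant: a logistic-regression classifier is
# trained on concatenated (first feature, second feature) pairs with labels
# indicating whether a change in the presence or absence of a point occurred.
# The use of scikit-learn here is an assumption of this sketch.
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_change_model(f1: np.ndarray, f2: np.ndarray, changed: np.ndarray):
    """f1, f2: (num_samples, feature_dim) feature amounts; changed: (num_samples,) 0/1 labels."""
    X = np.hstack([f1, f2])
    return LogisticRegression(max_iter=1000).fit(X, changed)

def judge(model, f1: np.ndarray, f2: np.ndarray) -> bool:
    """True if the model judges that the presence or absence of a point changed."""
    return bool(model.predict(np.hstack([f1, f2]).reshape(1, -1))[0])
```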
  • the first feature amount calculation unit 11 calculates a first feature amount of a first coordinate in the first point cloud (step S11; first feature amount calculation step).
  • the second feature amount calculation unit 12 calculates a second feature amount of a second coordinate in the second point cloud (step S12; second feature amount calculation step).
  • the determination unit 13 uses the first feature amount and the second feature amount to determine whether or not a change in the presence or absence of a point has occurred between the first coordinate and the second coordinate (step S13; determination step). Note that either the process of step S11 or the process of S12 may be executed first, or both processes may be executed in parallel.
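  • the flow of steps S11 to S13 can be sketched as one function; the unit implementations are passed in as hypothetical callables, and the sequential order shown is only one of the permitted orderings:

```python
# Steps S11-S13 in sequence; either feature calculation may run first, or both
# may run in parallel, per the note above.
def detect_change_at(first_cloud, second_cloud, c1, c2,
                     calc_first_feature, calc_second_feature, judge_change) -> bool:
    feat1 = calc_first_feature(first_cloud, c1)    # step S11
    feat2 = calc_second_feature(second_cloud, c2)  # step S12
    return judge_change(feat1, feat2)              # step S13
```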
  • when the presence of a point at a location in the second point cloud is unknown, the second feature calculation unit 12 calculates the second feature to reflect that state. The determination unit 13 then makes a determination that reflects both the first feature and the second feature. Therefore, for a location where erroneous point information is indicated, such as a location where no point (or a point) is recorded in the second point cloud even though a point actually exists (or does not exist) there, the determination unit 13 can treat the presence of a point at that location as unknown. This determination result is expected to be more accurate than when the erroneous point information is used as is. Therefore, even if the second point cloud to be compared with the first point cloud does not indicate accurate information, the change detection device 10 can improve the accuracy of detecting changes between the point clouds.
  • the determination unit 13 may perform the above determination at multiple corresponding coordinates in the first point cloud and the second point cloud to detect a change in the presence or absence of points in a predetermined area of the point cloud. For example, the determination unit 13 may perform the above determination for all coordinates of the first point cloud or all coordinates of the second point cloud. This makes it possible to detect a change in the presence or absence of points in the entire first point cloud or the entire second point cloud. Therefore, for example, when the first point cloud and the second point cloud are point cloud data measured at the same location, the change detection device 10 can identify a location that has changed in the two point clouds. The changed location is, for example, a location where an object that existed in one of the two point clouds no longer exists in the other point cloud.
  • the change detection device 10 may further include a detection unit that detects a change in the presence or absence of an object between the first point cloud and the second point cloud based on the above-mentioned determination result of the determination unit 13. A specific detection method will be described later in the second embodiment.
  • the first feature calculation unit 11 may also calculate the first feature using information indicating that in each first small region in the first region, a point is present, a point is not present, or the presence of a point is unknown.
  • the definition of the presence of a point being unknown is as described above. This allows a state in which the presence of a point in the first point cloud is unknown to be reflected in change detection, thereby further improving the change detection accuracy in the change detection device 10.
  • the change detection device 10 may further include an output unit that outputs the determination result of the determination unit 13 to the inside or outside of the change detection device 10.
  • the output unit may visually emphasize the location where the determination unit 13 has determined that a change in the presence or absence of a point has occurred, and output the data of the determination result to the outside of the change detection device 10 (e.g., a monitor) in a visible format such as an image or a point cloud. This process may be performed for all locations where the determination unit 13 has determined that a change in the presence or absence of a point has occurred, thereby making it possible to present to the user the locations where the presence or absence of an object has changed in the two point clouds.
  • Examples of "visual emphasis” include surrounding the location where the presence or absence of a point has changed (or the location where the presence or absence of an object has changed) with a frame, displaying the outline of the location in a color (e.g., red) different from the color of the outline of other objects, blinking the location, filling the location and displaying it as a shadow, and the like, but are not limited to these.
  • the output unit may also output the determination result of the determination unit 13 by sound or the like via a speaker. In this way, the output unit can output an alert by image or sound.
  • the output unit may also output the determination result of the determination unit 13 to another device. Furthermore, the output unit may output the detection result of detecting a change in the presence or absence of an object between the first point cloud and the second point cloud as described above.
  • FIG. 4 is a block diagram showing an example of a change detection system.
  • the change detection system 20 includes a feature amount calculation device 21 and a judgment device 22.
  • the feature amount calculation device 21 includes a first feature amount calculation unit 11 and a second feature amount calculation unit 12, and the judgment device 22 includes a judgment unit 13 and an output unit 14.
  • the first feature amount calculation unit 11 to the judgment unit 13 execute the same processing as that shown in (1A).
  • the judgment unit 13 of the judgment device 22 executes the processing shown in (1A) using the information on the feature amounts.
  • the output unit 14 of the judgment device 22 is the output unit described in (1A), and outputs the judgment result of the judgment unit 13 to the inside or outside of the judgment device 22.
  • the change detection process according to the present disclosure may be realized by a single device as shown in (1A), or may be realized as a system in which the processes to be executed are distributed among multiple devices as shown in (1B).
  • the device configuration shown in (1B) is merely an example.
  • the first device may have a first feature amount calculation unit 11, and the second device may have a second feature amount calculation unit 12 and a determination unit 13.
  • the first device may have an acquisition unit that acquires the first point cloud.
  • three different devices may be provided, which have the first feature amount calculation unit 11, the second feature amount calculation unit 12, and the determination unit 13, respectively.
  • each device may further have an acquisition unit that acquires the first point cloud, an acquisition unit that acquires the second point cloud, and an output unit 14.
  • the change detection system 20 may be provided in part or in its entirety in a cloud server built on the cloud, or in other types of virtualized servers generated using virtualization technology, etc. Functions other than those provided in such servers are placed at the edge.
  • the edge is a device placed at or near the site, and is also a device that is close to the terminal in terms of the network hierarchy.
  • Embodiment 2: In the following second embodiment, a specific example of the change detection method described in the first embodiment is disclosed. However, the specific example of the change detection method described in the first embodiment is not limited to the one shown below. Furthermore, the configurations and processes described below are merely examples, and the present disclosure is not limited to these.
  • Fig. 5 is a block diagram showing an example of a monitoring system.
  • the monitoring system 100 includes a plurality of robots 101A, 101B, and 101C (hereinafter collectively referred to as robots 101), a base station 110, a center server 120, and a reference point group DB 130.
  • robots 101 are provided on the edge side (on-site side) of the monitoring system 100, and the center server 120 is disposed at a position away from the site (on the cloud side).
  • the center server 120 is disposed at a position away from the site (on the cloud side).
  • the robot 101 functions as a terminal that measures a specific location while moving through a site to be monitored in order to inspect infrastructure equipment for failures or abnormalities.
  • the robot 101 is an edge device connected to a network, has a LiDAR 102, and can measure any location.
  • the robot 101 transmits the measured point cloud data to the center server 120 via the base station 110.
  • the robot 101 transmits the point cloud data via a wireless line.
  • the point cloud data may also be transmitted via a wired line.
  • the robot 101 may transmit the point cloud data acquired by measurement using the LiDAR 102 directly to the center server 120, or may perform appropriate preprocessing on the acquired point cloud data before transmitting it to the center server 120.
  • the robot 101 may transmit information indicating the measurement position to the center server 120 together with the point cloud data acquired at that position.
  • the robot 101 has functions such as AMCL (Adaptive Monte Carlo Localization) or SLAM (Simultaneous Localization and Mapping), and estimates its own position using these functions and transmits the information to the center server 120.
  • the robot 101 may obtain information indicating the measurement position using a satellite positioning system function such as GPS (Global Positioning System), and transmit the information to the center server 120.
  • GPS Global Positioning System
  • the movement route and measurement points of the robot 101 may be determined in advance, and the robot 101 and the center server 120 may share the information in advance, so that the center server 120 is aware of the position of the robot 101.
  • the robot 101 may be, for example, an AGV (Automatic Guided Vehicle) that runs under the control of the center server 120, an AMR (Autonomous Mobile Robot) that is capable of autonomous movement, a drone, etc., but is not limited to these.
  • AGV Automatic Guided Vehicle
  • AMR Automatic Mobile Robot
  • the base station 110 transfers the point clouds transmitted from each robot 101 to the center server 120 via the network.
  • the base station 110 is a local 5G (5th Generation) base station, a 5G gNB (next Generation Node B), an LTE eNB (evolved Node B), a wireless LAN access point, etc., but may also be other relay devices.
  • the network is, for example, a core network such as 5GC (5th Generation Core network) or EPC (Evolved Packet Core), the Internet, etc.
  • a server other than the center server 120 may be connected to the base station 110.
  • a MEC (Multi-access Edge Computing) server may be connected to the base station 110.
  • the MEC server can, for example, control the bit rate of the data transmitted by each robot 101 by assigning a bit rate for the data transmitted by each robot 101 to the base station 110 and transmitting this information to each robot 101.
  • the MEC server can transmit information on the bit rate of each robot 101 to the center server 120, allowing the center server 120 to grasp the bit rate information.
  • FIG. 6 is a block diagram showing an example of the center server 120.
  • the center server 120 includes a reference point cloud acquisition unit 121, an input point cloud acquisition unit 122, a reference feature amount calculation unit 123, an input feature amount calculation unit 124, a change detection unit 125, and a detection result generation unit 126.
  • the center server 120 can detect changes and generate comparison results by comparing each piece of point cloud data measured and acquired by the robot 101 with the reference data. Each part of the center server 120 will be described below.
  • the reference point cloud acquisition unit 121 acquires a reference point cloud for comparison with the point cloud acquired by the robot 101 during measurement.
  • the reference point cloud is a point cloud acquired by previously measuring the location measured by the robot 101, and is stored in the reference point cloud DB 130.
  • the reference point cloud is data that represents the shape of an object in three-dimensional space.
  • the center server 120 when the input point cloud acquisition unit 122, which will be described later, acquires the point cloud data transmitted by the robot 101 (hereinafter also referred to as the input point cloud), the center server 120 also acquires information on the measurement positions of the input point cloud. Furthermore, the reference point cloud DB 130 stores the reference point cloud in association with the position information at which the reference point cloud was measured. The reference point cloud acquisition unit 121 compares the measurement position information of the input point cloud with the measurement position information stored in the reference point cloud DB 130. If the comparison results in a match in the measurement position information stored in the reference point cloud DB 130, the reference point cloud acquisition unit 121 acquires the reference point cloud associated with the matching measurement position information. In this way, the reference point cloud acquisition unit 121 acquires the reference point cloud data to be compared with the point cloud measured and acquired by the robot 101 by searching the reference point cloud DB 130.
  • data previously measured and acquired may be stored in the reference point cloud DB 130 as an image rather than as a point cloud.
  • the reference point cloud acquisition unit 121 identifies matching measurement position information by searching the reference point cloud DB 130 described above, it can acquire the reference point cloud by acquiring an image associated with the matching measurement position information and converting the data format of the image into a point cloud.
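  • a minimal sketch of such a DB lookup follows; the class name and the tolerance-based position match are illustrative assumptions, not from the text:

```python
# Reference point clouds are stored keyed by the position at which they were
# measured; the entry whose stored position matches the input point cloud's
# measurement position is returned.
import numpy as np

class ReferencePointCloudDB:
    def __init__(self):
        self._entries = []  # list of (measured_position, reference_point_cloud)

    def register(self, position: np.ndarray, point_cloud: np.ndarray) -> None:
        self._entries.append((np.asarray(position), point_cloud))

    def find(self, query_position: np.ndarray, tol: float = 0.5):
        """Return the reference point cloud measured at (approximately) query_position."""
        for position, cloud in self._entries:
            if np.linalg.norm(position - query_position) <= tol:
                return cloud
        return None  # no reference point cloud for this measurement position
```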
  • the input point cloud acquisition unit 122 acquires the input point cloud acquired by the robot 101 through measurement via the communication unit of the center server 120. This allows the input point cloud acquisition unit 122 to acquire point cloud data measured in real time. However, if the robot 101 takes an image and transmits the image data to the center server 120, the input point cloud acquisition unit 122 can acquire the input point cloud by converting the data format of the image into a point cloud. Like the reference point cloud, the input point cloud is data that represents the shape of an object in three-dimensional space. However, there are differences between the input point cloud and the reference point cloud, as described below.
  • the first issue is that the point densities in the reference point cloud and the input point cloud may differ, and the second issue is that the measurement position of the robot 101 may deviate from the measurement position of the reference point cloud.
  • Figures 7A and 7B show examples of a reference point cloud and an input point cloud.
  • the reference point cloud in Figure 7A is 3D data measured using a sensor before the robot 101 performs measurements related to the input point cloud. "Before performing measurements related to the input point cloud” may be when the robot is deployed at the site, when the robot is inspected at the site before operating at the site, or before work begins, such as on the morning of an operating day.
  • Figure 7B is 3D data acquired by measurement by the robot 101.
  • Figures 7A and 7B are data measured inside a warehouse, and racks L inside the warehouse are captured as a point cloud.
  • Figure 7B also captures an object OB not in Figure 7A as a point cloud.
  • the reference point cloud T1 has a higher density of points, whereas the input point cloud T2 has a lower density of points. This is because the input point cloud T2 is data measured in real time, and the amount of data is smaller. In other words, compared to the reference point cloud T1, the input point cloud T2 may not record point information at coordinates where an object exists and a point should be located. In addition, the point density may change between the reference point cloud T1 and the input point cloud T2 depending on the measurement environment of the reference point cloud T1 and the input point cloud T2. Thus, in actual use, it is expected that the point density of the reference point cloud and the input point cloud will differ greatly. If these two point clouds are directly compared, it may be difficult to accurately detect changes in the two point clouds. Therefore, it is preferable to be able to improve the accuracy of detecting changes between the point clouds, even if the input point cloud does not show accurate information due to changes in the point cloud density compared to the reference point cloud.
  • FIG. 8A shows an example of a situation in which a reference point cloud is obtained by measurement
  • FIG. 8B shows an example of a situation in which an input point cloud is obtained by measurement
  • a location I1 where factory equipment is located is measured by a dedicated sensor S.
  • This dedicated sensor S obtains a point cloud by using LiDAR.
  • the measurement range of the dedicated sensor S is indicated by G11.
  • Gas tanks T1 and T2 are included within the range of G11.
  • the same location I1 is measured by a LiDAR 102 of a robot 101.
  • the measurement range of the LiDAR 102 of the robot 101 is indicated by G12.
  • FIG. 8B also captures an object L1 that was not included in FIG. 8A.
  • the robot 101 is controlled during measurement so that the estimated position of the robot 101 is the same as the measured position of the dedicated sensor S. Ideally, therefore, G11 and G12 are in the same range.
  • the error in the estimation of the robot 101's own position can become large depending on the environment in which the robot 101 is placed. Also, the error can become large when the position of the robot 101 cannot be corrected by the control of the center server 120. In such a case, as shown in FIG. 8B, a deviation occurs between G11 and G12.
  • this deviation includes a deviation in the measured position itself (hereinafter also referred to as a deviation in the translation direction) and a deviation in the measured direction (hereinafter also referred to as a deviation in the rotation direction).
  • Figure 8C shows the ideal comparison result between the reference point cloud and the input point cloud
  • Figure 8D shows the actual comparison result between the reference point cloud and the input point cloud.
  • the change detection method proposed in this disclosure makes it possible to solve these problems through the process described below.
  • the reference feature calculation unit 123 corresponds to the first feature calculation unit 11 in the first embodiment, and calculates the feature of each coordinate in the reference point group. This feature is expressed as vectorized information on the presence or absence of points at the coordinates to be calculated and at the surrounding coordinates, and will hereafter also be referred to as a spatial feature. Details of the method of calculation of the spatial feature by the reference feature calculation unit 123 will be described below.
  • FIG. 9A shows a polar coordinate system centered on the coordinates to be calculated for the reference point group.
  • the reference feature calculation unit 123 acquires data on the group of points in a spherical region S1 of radius ρ centered on the coordinate F1 in the reference point group. It then calculates the parameters (r, θ, φ) in the polar coordinate system shown in FIG. 9A for each coordinate in the spherical region S1.
  • the reference feature calculation unit 123 divides the spherical region S1 into a plurality of small regions SR1 (first small regions in the first embodiment) so that each coordinate is included within the small region SR1.
  • the spherical region S1 is divided so that the distance r and the angles (θ, φ) in each small region SR1 take discretized values: the angle θ is divided in units of Δθ (rad), the angle φ is divided in units of Δφ (rad), and the distance r is divided in units of d. Here, Δθ and Δφ are values equal to or less than π, and d is a value equal to or less than ρ/2.
  • the spherical region S1 is thus divided into a number of small regions SR1 that can be expressed using floor and ceiling functions of these division units.
  • the reference feature calculation unit 123 calculates a vectorized spatial feature by defining the presence or absence of a point in each divided small region SR1 as an element of the spatial feature. For example, a vector representation in which 1 is indicated when a point exists at each coordinate and 2 is indicated when a point does not exist at each coordinate is assumed, but the example of the vector representation is not limited to this.
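  • a minimal sketch of this calculation follows, assuming the symbols ρ, Δθ, Δφ, and d introduced above and the example encoding 1 (point present) / 2 (no point); the function name and grid layout are illustrative:

```python
# Points inside the sphere of radius rho around F1 are converted to polar
# parameters (r, theta, phi), each point is assigned to a small region SR1 by
# discretizing with (dtheta, dphi, d), and the feature holds one element per
# small region: 1 if a point is present, 2 otherwise.
import numpy as np

def reference_spatial_feature(points: np.ndarray, f1: np.ndarray,
                              rho: float, d: float, dtheta: float, dphi: float) -> np.ndarray:
    rel = points - f1
    r = np.linalg.norm(rel, axis=1)
    inside = (r > 0) & (r <= rho)
    rel, r = rel[inside], r[inside]
    theta = np.arctan2(rel[:, 1], rel[:, 0]) + np.pi      # azimuth in (0, 2*pi]
    phi = np.arccos(np.clip(rel[:, 2] / r, -1.0, 1.0))    # polar angle in [0, pi]
    n_t = int(np.ceil(2 * np.pi / dtheta))
    n_p = int(np.ceil(np.pi / dphi))
    n_r = int(np.ceil(rho / d))
    feature = np.full((n_t, n_p, n_r), 2, dtype=np.int8)  # 2: no point in this small region
    it = np.minimum((theta / dtheta).astype(int), n_t - 1)
    ip = np.minimum((phi / dphi).astype(int), n_p - 1)
    ir = np.minimum((r / d).astype(int), n_r - 1)
    feature[it, ip, ir] = 1                               # 1: a point is present
    return feature                                        # ravel() for a flat vector
```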
  • FIG. 9B is an example of spatial features calculated for the coordinates of the reference point group.
  • FIG. 9B in a spherical region S1 with coordinate F1 at center O, the presence of a point in small region SR1 is indicated by a black circle, and the absence of a point is indicated by a white circle.
  • FIG. 9B also shows coordinate H1 on spherical region S1, which will be used as an example when explaining the calculation of spatial features below.
  • the reference feature calculation unit 123 sets a neighborhood region Ω1 of the coordinate F1 in the reference point group.
  • the neighborhood region Ω1 is a region including at least one coordinate other than the coordinate F1, and is set, for example, as a region of radius δ1 in a polar coordinate system centered on the coordinate F1, but the setting of the neighborhood region Ω1 is not limited to this.
  • the reference feature calculation unit 123 then calculates spatial features for each coordinate other than the coordinate F1 included in the neighborhood region Ω1, in the same manner as the calculation of the spatial feature at the coordinate F1 described above.
  • specifically, for each coordinate other than the coordinate F1 included in the neighborhood region Ω1, the reference feature calculation unit 123 defines a spherical region (third region) that includes that coordinate and is composed of multiple small regions (third small regions), in the same manner as the spherical region S1 and the small regions SR1. The spatial feature is then calculated using information on the presence or absence of a point in each small region.
  • the input feature calculation unit 124 corresponds to the second feature calculation unit 12 in the first embodiment, and calculates the spatial feature of each coordinate in the input point cloud.
  • This spatial feature is expressed as vectorized information indicating whether a point exists, a point does not exist, or the existence of a point is unknown at the coordinate to be calculated and at its surrounding coordinates.
  • the input feature calculation unit 124 uses the distance r from the center O and the angles (θ, φ) as parameters to identify the coordinates included in a predetermined region (the second region in the first embodiment) that includes the coordinate F2.
  • the coordinate F2 is a coordinate that is to be compared with the coordinate F1 in terms of spatial features, and here indicates the same coordinate in the reference point cloud and the input point cloud.
  • the input feature calculation unit 124 acquires data on the point cloud in a spherical region S2 of radius ρ centered on the coordinate F2 in the input point cloud.
  • the size of the spherical region S2 is the same as that of the spherical region S1. The input feature calculation unit 124 then calculates the parameters (r, θ, φ) in the polar coordinate system shown in FIG. 9A for each coordinate in the spherical region S2.
  • the input feature calculation unit 124 divides the spherical region S2 into a plurality of small regions SR2 (second small regions in the first embodiment) so that each coordinate is included within the small region SR2.
  • the spherical region S2 is divided in the same way as the spherical region S1. That is, the angle θ is divided in units of Δθ (rad), the angle φ is divided in units of Δφ (rad), and the distance r is divided in units of d, where Δθ and Δφ are values equal to or less than π and d is a value equal to or less than ρ/2. The spherical region S2 is therefore divided into the same number of small regions as the spherical region S1.
  • the input feature amount calculation unit 124 calculates a vectorized spatial feature amount by defining, as an element of the spatial feature amount, information indicating whether a point exists, whether a point does not exist, or whether the existence of a point is unknown in each divided small region SR2. For example, a vector notation is assumed in which 1 is indicated when a point exists in each coordinate, 2 is indicated when a point does not exist, and 0 is indicated when the existence of a point is unknown, but examples of the vector notation are not limited to this.
  • the input feature calculation unit 124 determines information indicating whether a point exists in each small region SR2, whether a point does not exist, or whether the existence of a point is unknown, as follows. If a point exists in a small region SR2, the input feature calculation unit 124 defines the small region SR2 as having a point. On the other hand, if there is no point in the small region SR2, the input feature calculation unit 124 determines whether the small region SR2 is located between the position where the point exists in the input point cloud at the time of measurement and the position of LiDAR 102.
  • if the small region SR2 is located between a position where a point exists in the input point cloud at the time of measurement and the position of the LiDAR 102, the input feature calculation unit 124 defines the small region SR2 as having no point. On the other hand, if the small region SR2 is not located between a position where a point exists in the input point cloud at the time of measurement and the position of the LiDAR 102, the input feature calculation unit 124 defines the existence of a point in the small region SR2 as being unknown. Ray tracing technology can be applied to this definition.
  • the density of points in the input point cloud is sparser than the density of points in the reference point cloud. Therefore, it is possible that an object that is captured in the reference point cloud will not be captured accurately in the input point cloud, and some points of the object will not be recorded in the input point cloud. Therefore, when it is not possible to be certain that a point does not actually exist, it is preferable to define the presence of a point as unknown for a small region SR2 that does not contain a point.
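  • a sketch of the input spatial feature with the "unknown" state follows; ray traversal is approximated by sampling along each ray, which is an assumption of this sketch (a real implementation might use a voxel-walking algorithm instead):

```python
# Small regions containing a measured point are 1 (present); small regions that
# a sensor ray passed through on its way to some measured point are 2 (absent,
# free space); everything else stays 0 (presence unknown).
import numpy as np

def input_spatial_feature(points, f2, sensor_pos, rho, d, dtheta, dphi, step=0.05):
    n_t = int(np.ceil(2 * np.pi / dtheta))
    n_p = int(np.ceil(np.pi / dphi))
    n_r = int(np.ceil(rho / d))
    feature = np.zeros((n_t, n_p, n_r), dtype=np.int8)   # 0: presence unknown

    def region_index(p):
        rel = p - f2
        r = float(np.linalg.norm(rel))
        if r == 0.0 or r > rho:
            return None                                  # outside the spherical region S2
        theta = np.arctan2(rel[1], rel[0]) + np.pi
        phi = np.arccos(np.clip(rel[2] / r, -1.0, 1.0))
        return (min(int(theta / dtheta), n_t - 1),
                min(int(phi / dphi), n_p - 1),
                min(int(r / d), n_r - 1))

    for p in points:
        # Mark the small regions between the sensor and the measured point as free.
        direction = p - sensor_pos
        length = float(np.linalg.norm(direction))
        for t in np.arange(0.0, length, step):
            idx = region_index(sensor_pos + direction * (t / length))
            if idx is not None:
                feature[idx] = 2                         # 2: no point (ray passed through)
    for p in points:
        idx = region_index(p)
        if idx is not None:
            feature[idx] = 1                             # 1: a point is present
    return feature
```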
  • Figure 9C is an example of spatial features calculated for the coordinates of the input point cloud.
  • Figure 9C in a spherical region S2 with coordinate F2 at center O, the presence of a point in small region SR2 is indicated by a black circle, the absence of a point is indicated by a white circle, and the presence of a point is unknown is indicated by a triangle.
  • Figure 9C also shows coordinate H2 on spherical region S2, which will be used as an example when explaining the calculation of spatial features below. Note that coordinates H1 and H2 indicate the same coordinates in the reference point cloud and the input point cloud.
  • the input feature amount calculation unit 124 sets a neighborhood region Ω2 of the coordinate F2 in the input point cloud.
  • the neighborhood region Ω2 is a region including at least one coordinate other than the coordinate F2, and is set, for example, as a region of radius δ2 in a polar coordinate system centered on the coordinate F2, but the setting of the neighborhood region Ω2 is not limited to this.
  • the input feature amount calculation unit 124 calculates the spatial feature amount for each coordinate other than the coordinate F2 included in the neighborhood region Ω2, in the same manner as the calculation of the spatial feature amount for the coordinate F2 shown above.
  • specifically, for each coordinate other than the coordinate F2 included in the neighborhood region Ω2, the input feature amount calculation unit 124 defines a spherical region (fourth region) that includes that coordinate and is composed of multiple small regions (fourth small regions), in the same manner as the spherical region S2 and the small regions SR2. The spatial feature amount is then calculated using information indicating that a point exists, that a point does not exist, or that the existence of a point is unknown in each small region.
  • the size of the neighborhood region Ω2 in the input point cloud may be the same as or different from the size of the neighborhood region Ω1 in the reference point cloud. That is, in this example, the radius δ2 may be the same length as the radius δ1, or may be a different length.
  • the reference feature calculation unit 123 and the input feature calculation unit 124 calculate the above-mentioned spatial features at each coordinate of the reference point group and the input point group.
  • the change detection unit 125 corresponds to the determination unit 13 in the first embodiment.
  • the change detection unit 125 calculates the similarity by comparing the spatial feature amounts between the coordinate F1 in the reference point group and the coordinate F2 in the input point group. Then, using the similarity, it determines whether or not the presence or absence of a point at the coordinate F1 has changed at the coordinate F2 when changing from the reference point group to the input point group.
  • the change detection unit 125 can execute: (I) Determine whether a point that did not exist in the coordinate F1 of the reference point group now exists in the coordinate F2 of the input point group. (II) Determine whether a point that existed in the coordinate F1 of the reference point group no longer exists in the coordinate F2 of the input point group. Details of processes (I) and (II) are explained below.
  • the change detection unit 125 calculates the similarity between the feature of coordinate F1 in the reference point group calculated by the reference feature calculation unit 123 and the feature of coordinate F2 in the input point group calculated by the input feature calculation unit 124.
  • let the feature of the coordinate F1 be P1, the feature of the coordinate F2 be P2, and the similarity between the two be S(P1, P2). The change detection unit 125 calculates S(P1, P2) as follows.
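  • equations (5) to (7) themselves are not reproduced in this text; the following LaTeX reconstruction is consistent with the description below, but the exact neighborhood ranges in (7) are assumed rather than quoted:

```latex
% A reconstruction of equations (5)-(7) from the surrounding description; the
% neighborhood ranges in (7) are an assumption of this sketch.
\begin{align}
S(P1, P2) &= \frac{1}{\mathit{ValidNum}} \sum_{\theta}\sum_{\varphi}\sum_{r}
  \mathrm{Score}\!\left(P1_{\theta\varphi r},\, P2_{\theta\varphi r}\right) \tag{5} \\
\mathrm{Score}\!\left(P1_{\theta\varphi r},\, P2_{\theta\varphi r}\right) &=
  \begin{cases}
    1 & \text{if } \mathrm{Same_{around}}\!\left(P1_{\theta\varphi r},\, P2_{\theta\varphi r}\right) \text{ exists} \\
    0 & \text{otherwise}
  \end{cases} \tag{6} \\
\mathrm{Same_{around}}\!\left(P1_{\theta\varphi r},\, P2_{\theta\varphi r}\right) &:\quad
  \exists\, (\theta', \varphi', r') \ \text{with}\
  |\theta'-\theta| \le 1,\ |\varphi'-\varphi| \le 1,\ |r'-r| \le 1
  \ \text{such that}\ P1_{\theta'\varphi' r'} = P2_{\theta\varphi r} \tag{7}
\end{align}
```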
  • θ, φ, and r in equations (5) to (7) are discretized in units of Δθ, Δφ, and d, respectively.
  • P1_θφr on the right side of equation (5) indicates an element of the feature amount P1, namely the element of the spatial feature amount defined for the small region SR1 in the spherical region S1.
  • P2_θφr indicates an element of the feature amount P2, namely the element of the spatial feature amount defined for the small region SR2 in the spherical region S2.
  • P1_θφr and P2_θφr are the elements of the feature amounts in the small regions SR1 and SR2 that are at the same position when the reference point group and the input point group are compared.
  • ValidNum in equation (5) is the number of small regions SR2 in the spherical region S2 other than the small regions SR2 defined as "the presence of a point is unknown". Equation (5) therefore indicates that Score(P1_θφr, P2_θφr) is summed over all small regions SR2 in the spherical region S2 (or all small regions SR1 in the spherical region S1), and the sum is normalized by ValidNum. At this time, the small regions SR2 defined as "the presence of a point is unknown" are not evaluated in the similarity calculation (i.e., they are ignored in the calculation).
  • Score(P1_θφr, P2_θφr) is 1 when Same_around(P1_θφr, P2_θφr) exists for P1_θφr and P2_θφr, and is 0 otherwise.
  • FIG. 9D is a diagram for explaining the above formula (7) in the examples of FIG. 9B and FIG. 9C.
  • a part of the spherical region S1 near the coordinate H1 in FIG. 9B and a part of the spherical region S2 near the coordinate H2 in FIG. 9C are shown.
  • the element in the spatial feature of the small region SR1 including the coordinate H1 is P1_θφr, and the element in the spatial feature of the small region SR2 including the coordinate H2 is P2_θφr.
• The elements of the spatial feature amount for the small regions SR1 adjacent in the θ direction to the small region SR1 including the coordinate H1 are P1_(θ+1)φr and P1_(θ−1)φr.
• A small region SR1 adjacent in the r direction is also defined for each of the small region SR1 including the coordinate H1, the small region SR1 at P1_(θ+1)φr, and the small region SR1 at P1_(θ−1)φr; the elements of the spatial feature amount for these small regions SR1 are P1_θφ(r−1), P1_(θ+1)φ(r−1), and P1_(θ−1)φ(r−1), respectively.
• P1_θφr indicates that a point does not exist, while P2_θφr indicates that a point exists, so they are not the same element.
• However, the range of θ covered by formula (7) includes P1_(θ+1)φr and P1_(θ−1)φr.
• P1_(θ−1)φr indicates that a point exists, so P1_(θ−1)φr and P2_θφr are the same element. Therefore, in the example of FIG. 9D, Same_around(P1_θφr, P2_θφr) shown in (7) exists, and Score(P1_θφr, P2_θφr) shown in (6) is 1.
• the change detection unit 125 executes a matching process to calculate the similarity S(P1, P2) shown in equation (5) by calculating Score(P1_θφr, P2_θφr) for all θ, φ, and r.
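• As a concrete illustration of this matching process, the following is a minimal sketch in Python. It assumes the spatial feature of a coordinate is stored as a (θ, φ, r) grid with a three-state encoding; the constants, bin counts, and window size are illustrative assumptions, not values from the publication.

```python
import numpy as np

OCCUPIED, EMPTY, UNKNOWN = 1, 0, -1  # assumed three-state encoding per small region

def same_around(p1, idx, value, window=1):
    """Equation (7): does any element of p1 within `window` bins of idx equal
    `value`? Angular wrap-around is omitted for brevity."""
    t, f, r = idx
    nt, nf, nr = p1.shape
    for dt in range(-window, window + 1):
        for df in range(-window, window + 1):
            for dr in range(-window, window + 1):
                ti, fi, ri = t + dt, f + df, r + dr
                if 0 <= ti < nt and 0 <= fi < nf and 0 <= ri < nr:
                    if p1[ti, fi, ri] == value:
                        return True
    return False

def similarity(p1, p2, window=1):
    """Equations (5) and (6): the fraction of non-UNKNOWN bins of p2 whose value
    is matched somewhere in the corresponding peripheral range of p1, normalized
    by ValidNum (the number of non-UNKNOWN bins)."""
    valid = np.argwhere(p2 != UNKNOWN)
    if len(valid) == 0:
        return 0.0
    score = sum(same_around(p1, tuple(i), p2[tuple(i)], window) for i in valid)
    return score / len(valid)
```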
• the change detection unit 125 also calculates the similarity S(Pn, P2) for each coordinate other than the coordinate F1 included in the neighborhood region Ω1 set by the reference feature calculation unit 123, using the same calculation method as for S(P1, P2).
  • Pn is the feature at each coordinate other than the coordinate F1, and is calculated by the reference feature calculation unit 123 in the above process.
• S(PN, P2) is defined as the set of similarities S within the neighborhood region Ω1, comprising S(P1, P2) and each S(Pn, P2).
• After calculating all the similarities S(PN, P2) in the neighborhood region Ω1 in this way, the change detection unit 125 identifies Smax(PN, P2), the maximum value among S(PN, P2).
• The coordinate of the reference point group corresponding to this Smax(PN, P2) is the coordinate that, among the coordinates included in the neighborhood region Ω1 of the reference point group, is most similar to the coordinate F2 of the input point group in terms of the state indicating whether or not a point exists.
• the change detection unit 125 compares Smax(PN, P2) with a predetermined threshold value ThS1 to determine which is larger.
• If Smax(PN, P2) is equal to or smaller than ThS1, the change detection unit 125 determines that all of the coordinates included in the neighborhood region Ω1 have a low similarity to the coordinate F2, and therefore determines that a point that did not exist at the coordinate F1 of the reference point group now exists at the coordinate F2 of the input point group.
• If Smax(PN, P2) is greater than ThS1, the change detection unit 125 does not make the above determination.
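• Continuing the sketch above, the decision rule of process (I) could look as follows; ThS1 is the predetermined threshold (its value here is an illustrative assumption).

```python
def point_appeared(p2, omega1_features, th_s1=0.5):
    """Process (I): judge that a point newly appeared at F2 when even the
    best-matching coordinate in the reference neighborhood Ω1 is dissimilar."""
    s_max = max(similarity(pn, p2) for pn in omega1_features)  # Smax(PN, P2)
    return s_max <= th_s1
```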
• In process (II), the change detection unit 125 likewise executes a matching process to calculate S(P1, P2), the similarity between the feature amount of the coordinate F1 and the feature amount of the coordinate F2. The calculation method is the same as in (I), so its explanation is omitted.
• the change detection unit 125 also calculates the similarity S(P1, Pm) for each coordinate other than the coordinate F2 included in the neighborhood region Ω2 set by the input feature calculation unit 124, using the same calculation method as for S(P1, P2). Note that Pm is the feature amount at each coordinate other than the coordinate F2, calculated by the input feature calculation unit 124 in the above process.
• S(P1, PM) is defined as the set of similarities S within the neighborhood region Ω2, comprising S(P1, P2) and each S(P1, Pm).
• After calculating all the similarities S(P1, PM) in the neighborhood region Ω2 in this manner, the change detection unit 125 identifies Smax(P1, PM), the maximum value among S(P1, PM).
• The coordinate of the input point group corresponding to this Smax(P1, PM) is the coordinate that, among the coordinates included in the neighborhood region Ω2 of the input point group, is most similar to the coordinate F1 of the reference point group in terms of the state indicating whether or not a point exists.
• the change detection unit 125 compares Smax(P1, PM) with a predetermined threshold value ThS2. If Smax(P1, PM) is equal to or smaller than ThS2, the change detection unit 125 determines that all of the coordinates included in the neighborhood region Ω2 have a low similarity to the coordinate F1, and therefore determines that a point that existed at the coordinate F1 of the reference point group no longer exists at the coordinate F2 of the input point group. On the other hand, if Smax(P1, PM) is greater than ThS2, the change detection unit 125 does not make the above determination.
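• Process (II) is symmetric; a corresponding sketch, again with an illustrative threshold ThS2:

```python
def point_disappeared(p1, omega2_features, th_s2=0.5):
    """Process (II): judge that the point at F1 disappeared when even the
    best-matching coordinate in the input neighborhood Ω2 is dissimilar."""
    s_max = max(similarity(p1, pm) for pm in omega2_features)  # Smax(P1, PM)
    return s_max <= th_s2
```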
• When neither the determination in (I) nor that in (II) is made, the change detection unit 125 determines that the presence or absence of a point at the coordinate F1 has not changed at the coordinate F2, i.e., that there is no change between the coordinate F1 and the coordinate F2.
• the reference feature calculation unit 123 and the input feature calculation unit 124 can change the sizes of the neighborhood regions Ω1 and Ω2 depending on the situation.
• For example, the reference feature calculation unit 123 may increase the size of the neighborhood region Ω1 (e.g., the size of its radius δ1) as the distance between the position indicated by the coordinate F1 when the reference point group was measured and the position of the dedicated sensor S at the time of measurement increases. This is because the farther the coordinate F1 of the reference point group is from the dedicated sensor S at the time of measurement, the wider the range over which the above-mentioned deviation takes effect, so it is preferable to set the coordinates of the reference point group to be compared over a wider range.
• the input feature calculation unit 124 may increase the size of the neighborhood region Ω2 (e.g., the size of the radius δ2) as the distance between the position indicated by the coordinate F2 when the input point group is measured and the position of the LiDAR 102 at the time of measurement increases.
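• A minimal sketch of this distance-dependent sizing, continuing the Python sketch above (the base radius and growth rate are illustrative assumptions):

```python
def neighborhood_radius(coord, sensor_pos, base=0.5, gain=0.05):
    """Grow the neighborhood radius (δ1 or δ2) with the distance between the
    coordinate and the sensor, since the positional deviation affects a wider
    range far from the sensor."""
    dist = float(np.linalg.norm(np.asarray(coord) - np.asarray(sensor_pos)))
    return base + gain * dist
```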
  • the change detection unit 125 executes the above processes (I) and (II) for each coordinate in the reference point cloud and each coordinate in the input point cloud that corresponds to each of those coordinates. This enables the change detection unit 125 to detect changes in the presence or absence of points in all coordinates of the reference point cloud and the input point cloud.
• In process (I), the coordinates of the input point cloud are selected as the starting point for detecting changes, and the presence or absence of a change is determined for each coordinate of the input point cloud, thereby detecting whether a point that is not in the reference point cloud has been newly added to the input point cloud.
• In process (II), the coordinates of the reference point cloud are selected as the starting point for detecting changes, and the presence or absence of a change is determined for each coordinate of the reference point cloud, thereby detecting whether a point in the reference point cloud is no longer in the input point cloud.
• the detection result generating unit 126 generates an image showing the changes detected by the change detection unit 125 in the process shown in (I). For example, when the reference point group and the input point group are acquired in the situation shown in FIGS. 8A and 8B, the detection result generating unit 126 can generate the image shown in FIG. 8C. It can likewise generate an image showing the changes detected in the process shown in (II).
  • the center server 120 may have an interface such as a display that shows the image generated by the detection result generating unit 126 to the user.
  • the detection result generating unit 126 may generate a point group of 3D data showing the change detected by the change detection unit 125 instead of an image.
  • the detection result generating unit 126 may perform processing to visually emphasize locations where a change in the presence or absence of a point has occurred (or locations where a change in the presence or absence of an object has occurred) on the generated screen or point cloud, similar to the output unit described in embodiment 1.
• FIGS. 10A to 10C are flowcharts showing an example of representative processing of the center server 120; an overview of this processing is explained using these flowcharts. The details of each process are as described above and are therefore omitted as appropriate.
  • Fig. 10A is a flowchart showing an overview of an example of the process of the center server 120, and the process flow will be explained first using Fig. 10A.
  • the reference point cloud acquisition unit 121 acquires reference point cloud data from the reference point cloud DB 130 (step S21; acquisition step).
  • the input point cloud acquisition unit 122 acquires input point cloud data by acquiring data transmitted by the robot 101 (step S22; acquisition step). Note that either step S21 or S22 may be performed first, or both steps may be performed in parallel.
• Next, the reference feature calculation unit 123, the input feature calculation unit 124, and the change detection unit 125 execute process (A) (step S23; process (A) step). Process (A) corresponds to the above process (I) and the processes related to it, and its details will be described using FIG. 10B.
  • the reference feature calculation unit 123, the input feature calculation unit 124, and the change detection unit 125 also execute process (B) (step S24; process (B) step).
  • Process (B) indicates the above process (II) and processes related thereto, and the details thereof will be explained using FIG. 10C. Note that either of the processes in steps S23 and S24 may be performed first, or both processes may be performed in parallel.
  • the detection result generating unit 126 generates an image showing the change detected in the process (A) based on the information indicating the presence or absence of a change at each coordinate generated by the process (A). Similarly, the detection result generating unit 126 generates an image showing the change detected in the process (B) based on the information indicating the presence or absence of a change at each coordinate generated by the process (B) (step S25; detection result image generating step).
• In process (A) shown in FIG. 10B, the input feature calculation unit 124 first calculates the spatial feature for a coordinate in the input point cloud for which the feature has not yet been calculated (step S31; feature calculation step).
  • the input feature calculation unit 124 outputs information on the calculated coordinates to the reference feature calculation unit 123.
• Based on the output coordinate information, the reference feature calculation unit 123 identifies the coordinate in the reference point group that corresponds to that coordinate.
• The coordinate information output by the input feature calculation unit 124 corresponds to the coordinate F2 in the above example, and the coordinate identified by the reference feature calculation unit 123 corresponds to the coordinate F1.
• the reference feature calculation unit 123 sets a neighborhood region Ω1 that includes the coordinate F1, and calculates the spatial feature of each coordinate included in the neighborhood region Ω1 (step S32; feature calculation step). Note that although step S31 is described here as preceding step S32, step S32 may be performed first, or both processes may be performed in parallel.
  • the change detection unit 125 calculates the similarity between coordinates F1 and F2 using the spatial features calculated in steps S31 and S32 (step S33; similarity calculation step).
  • the change detection unit 125 compares the maximum value of the calculated similarity with a predetermined threshold value to determine whether or not a change has occurred in the presence or absence of a point at coordinate F2 (step S34; change detection step).
  • the change detection unit 125 determines whether or not the similarity calculation and change detection determination have been completed for all coordinates in the input point cloud (step S35; completion determination step). If the similarity calculation and change detection determination have not been completed for all coordinates in the input point cloud (No in step S35), the process returns to step S31 and is repeated for coordinates in the input point cloud for which similarity has not been calculated. If the similarity calculation and change detection determination have been completed for all coordinates in the input point cloud (Yes in step S35), process (A) ends. Then, as described above, the detection result generation unit 126 generates a change detection image based on the information generated by process (A) indicating the presence or absence of a change at each coordinate.
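• Steps S31 to S35 could be driven by a loop of the following shape, continuing the sketch above; `omega1_features_for` is a hypothetical helper that returns the spatial features of the coordinates in the neighborhood region Ω1 around the reference coordinate corresponding to F2.

```python
def process_a(input_coords, input_features, omega1_features_for, th_s1=0.5):
    """Sketch of process (A) (steps S31 to S35): one appeared/not-appeared
    decision per input-cloud coordinate F2."""
    result = {}
    for f2, p2 in zip(input_coords, input_features):
        omega1 = omega1_features_for(f2)  # features of Ω1 in the reference cloud
        result[tuple(f2)] = point_appeared(p2, omega1, th_s1)
    return result
```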
• In process (B) shown in FIG. 10C, the reference feature calculation unit 123 first calculates the spatial feature for a coordinate in the reference point group for which the feature has not yet been calculated (step S41; feature calculation step).
  • the reference feature calculation unit 123 outputs information on the calculated coordinates to the input feature calculation unit 124.
• Based on the output coordinate information, the input feature calculation unit 124 identifies the coordinate in the input point cloud that corresponds to that coordinate.
• The coordinate information output by the reference feature calculation unit 123 corresponds to the coordinate F1 in the above example, and the coordinate identified by the input feature calculation unit 124 corresponds to the coordinate F2.
• the input feature calculation unit 124 sets a neighborhood region Ω2 that includes the coordinate F2, and calculates the spatial feature of each coordinate included in the neighborhood region Ω2 (step S42; feature calculation step). Note that although step S41 is described here as preceding step S42, step S42 may be performed first, or both processes may be performed in parallel.
  • the change detection unit 125 calculates the similarity between coordinates F1 and F2 using the spatial features calculated in steps S41 and S42 (step S43; similarity calculation step). The change detection unit 125 compares the maximum value of the calculated similarity with a predetermined threshold value to determine whether or not a change has occurred in the presence or absence of a point at coordinates F1 (step S44; change detection step).
  • the change detection unit 125 determines whether or not the similarity calculation and change detection determination have been completed for all coordinates in the reference point group (step S45; completion determination step). If the similarity calculation and change detection determination have not been completed for all coordinates in the reference point group (No in step S45), the process returns to step S41 and is repeated for coordinates in the reference point group for which similarity has not been calculated. If the similarity calculation and change detection determination have been completed for all coordinates in the reference point group (Yes in step S45), process (B) ends. Then, as described above, the detection result generation unit 126 generates an image in which changes have been detected, based on the information generated by process (B) indicating the presence or absence of changes at each coordinate.
  • the similarity for all coordinates in the input point cloud may be calculated first, and then a change detection determination may be performed for each coordinate in the input point cloud.
  • the similarity for all coordinates in the reference point cloud may be calculated first, and then a change detection determination may be performed for each coordinate in the reference point cloud.
• In the above, both processes (A) and (B) are executed, but only one of processes (A) and (B) may be executed, in which case the detection result generation unit 126 generates an image only for the executed process.
  • the center server 120 can generate a detection image for each input point cloud by executing the comparison process described above between each input point cloud and the reference point cloud.
• As described above, the input feature calculation unit 124 calculates the spatial feature of the input point cloud by vectorizing information indicating that a point exists, that a point does not exist, or that the presence of a point is unknown.
  • the change detection unit 125 can determine whether a change in the presence or absence of a point has occurred between the reference point cloud and the input point cloud by using the spatial feature of the calculated reference point cloud and the spatial feature of the input point cloud.
• In some cases, the density of the points of the input point cloud is sparser than that of the reference point cloud, and no point is recorded at a coordinate where an object exists and a point should be present. Even in such cases, performing the above process in the center server 120 is estimated to yield more accurate determinations than using the data of the input point cloud as is. This is particularly effective when performing point cloud comparison processing in real time.
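• A sketch of how such a three-state spatial feature might be vectorized on a spherical grid, continuing the earlier sketch. The bin counts, the radius d, and the axis conventions are assumptions; marking UNKNOWN small regions (e.g., by ray tracing) is left to the separate step sketched later.

```python
def spatial_feature(points, center, n_theta=16, n_phi=8, n_r=8, d=4.0):
    """Quantize the (N, 3) array `points` inside the spherical region around
    `center` into a (θ, φ, r) grid of OCCUPIED/EMPTY small regions."""
    grid = np.full((n_theta, n_phi, n_r), EMPTY, dtype=np.int8)
    pts = np.asarray(points, dtype=float).reshape(-1, 3)
    rel = pts - np.asarray(center, dtype=float)
    rad = np.linalg.norm(rel, axis=1)
    keep = (rad > 0) & (rad < d)
    rel, rad = rel[keep], rad[keep]
    theta = np.arctan2(rel[:, 1], rel[:, 0]) % (2 * np.pi)  # azimuth, range 2π
    phi = np.arccos(np.clip(rel[:, 2] / rad, -1.0, 1.0))    # polar angle, range π
    ti = (theta / (2 * np.pi) * n_theta).astype(int) % n_theta
    fi = np.minimum((phi / np.pi * n_phi).astype(int), n_phi - 1)
    ri = np.minimum((rad / d * n_r).astype(int), n_r - 1)
    grid[ti, fi, ri] = OCCUPIED
    return grid
```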
• the reference feature calculation unit 123 may also calculate the feature amount of each coordinate in the neighborhood region Ω1 of the calculation target coordinate.
• In that case, the change detection unit 125 can determine whether or not a point that does not exist at the calculation target coordinate exists at the corresponding coordinate of the input point group, using the spatial feature of the calculation target coordinate, the spatial feature of each coordinate in the neighborhood region Ω1 of the calculation target coordinate, and the spatial feature of the corresponding coordinate of the input point group.
• Similarly, the input feature calculation unit 124 may also calculate the feature amount of each coordinate in the neighborhood region Ω2 of the calculation target coordinate.
• In that case, the change detection unit 125 can determine whether or not a point that exists at the calculation target coordinate does not exist at the corresponding coordinate of the reference point cloud, using the spatial feature of the calculation target coordinate, the spatial feature of each coordinate in the neighborhood region Ω2 of the calculation target coordinate, and the spatial feature of the corresponding coordinate of the reference point cloud. This makes it possible to suppress deterioration of change detection accuracy even if a translational deviation occurs between the measurement of the reference point cloud and the measurement of the input point cloud.
• FIGS. 11A and 11B show, respectively, an image in which changes between the reference point cloud and the input point cloud shown in FIGS. 7A and 7B were detected by directly comparing the presence or absence of points, and an image in which changes between the two were detected using the method of this disclosure.
• FIGS. 7A and 7B show measurements of a rack L in a warehouse using a sensor. Here, it is assumed that the measurement positions of the reference point cloud and the input point cloud are shifted by a predetermined amount in the translation direction. In general, the measurement conditions of the reference point cloud and the input point cloud are rarely exactly the same, and this situation reflects that. Furthermore, when the input point cloud was measured, an object OB was present that was not present when the reference point cloud was measured.
• In FIGS. 11A and 11B, the points represented by dots are the points detected as changes between the reference point cloud and the input point cloud.
• In image C1 of FIG. 11A, not only is object OB, which should be detected as a change, detected, but so is rack L, whose appearance from the sensor changes between the reference point cloud and the input point cloud.
• In image C2 of FIG. 11B, detection of rack L as a change is suppressed, and object OB is detected as a clear change. In this way, the method disclosed herein can suppress deterioration of change detection accuracy even when a translational deviation occurs between the measurement of the reference point cloud and the measurement of the input point cloud.
• As described above, the reference point cloud is data acquired using a sensor, and the reference feature calculation unit 123 may increase the size of the neighborhood region Ω1 as the distance between the position indicated by the coordinate subject to change detection at the time of measurement and the position of the sensor increases. Even for an area measured farther from the sensor, where the effect of the deviation is greater, the neighborhood region can thus be set to cover that effect, preventing deterioration in change detection accuracy.
• Similarly, the input point cloud is data acquired using a sensor, and the input feature calculation unit 124 may increase the size of the neighborhood region Ω2 as the distance between the position indicated by the coordinate subject to change detection at the time of measurement and the position of the sensor increases. Even for an area measured farther from the sensor, where the effect of the deviation is greater, the neighborhood region can thus be set to cover that effect, suppressing deterioration in change detection accuracy.
• the change detection unit 125 may calculate the similarity between the spatial feature of the calculation target coordinate of the reference point group and the spatial feature of the corresponding coordinate of the input point group, and the similarity between the spatial feature of each coordinate in the neighborhood region Ω1 and the spatial feature of the corresponding coordinate of the input point group.
  • the similarity between the spatial features of the calculation target coordinates and the spatial features of the corresponding coordinates is calculated using the elements of the spatial features of the calculation target coordinates in a specific small region SR1 in the spherical region S1 and a small region SR1 included in the peripheral region of that small region SR1, and the elements of a small region SR2 in the spherical region S2 that corresponds to the specific small region SR1.
• the similarity between the spatial feature of each coordinate in the neighborhood region Ω1 and the spatial feature of the corresponding coordinate is calculated using the elements of the spatial feature of each coordinate in a specific small region SR1 in the spherical region S1 and in the small regions SR1 included in the peripheral region of that small region SR1, and the element of the small region SR2 in the spherical region S2 that corresponds to the specific small region SR1.
• the change detection unit 125 may calculate the similarity between the spatial feature of the calculation target coordinate of the input point group and the spatial feature of the corresponding coordinate of the reference point group, and the similarity between the spatial feature of each coordinate in the neighborhood region Ω2 and the spatial feature of the corresponding coordinate of the reference point group.
  • the similarity between the spatial features of the calculation target coordinates and the spatial features of the corresponding coordinates is calculated using the elements of the spatial features of the calculation target coordinates in each of a specific small region SR2 in the spherical region S2 and a small region SR2 included in the peripheral region of that small region SR2, and the elements of the small region SR1 in the spherical region S1 that corresponds to the specific small region SR2.
• the similarity between the spatial feature of each coordinate in the neighborhood region Ω2 and the spatial feature of the corresponding coordinate is calculated using the elements of the spatial feature of each coordinate in a specific small region SR2 in the spherical region S2 and in the small regions SR2 included in the peripheral region of that small region SR2, and the element of the small region SR1 in the spherical region S1 that corresponds to the specific small region SR2.
• the change detection unit 125 may appropriately change the size of at least one of the peripheral region of the small region SR1 and the peripheral region of the small region SR2. This makes it possible to adjust the tolerance for rotational misalignment.
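• In terms of the Python sketch above, this tolerance corresponds to the `window` parameter of `same_around`; a runnable usage example with stand-in data (the random points and coordinates are hypothetical):

```python
rng = np.random.default_rng(0)
ref_points = rng.uniform(-3, 3, size=(500, 3))   # stand-in measured points
in_points = rng.uniform(-3, 3, size=(500, 3))
f1 = f2 = np.zeros(3)                            # corresponding coordinates
p1_grid = spatial_feature(ref_points, f1)        # feature of coordinate F1
p2_grid = spatial_feature(in_points, f2)         # feature of coordinate F2
s_strict = similarity(p1_grid, p2_grid, window=0)    # exact bin-to-bin comparison
s_tolerant = similarity(p1_grid, p2_grid, window=2)  # matches within ±2 bins allowed
```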
• the reference feature calculation unit 123 may also calculate the spatial features of the reference point cloud by vectorizing information indicating that a point exists, that a point does not exist, or that the existence of a point is unknown, in a manner similar to that of the input feature calculation unit 124.
  • the ray tracing technology described in the explanation of the input feature calculation unit 124 can be applied to the definition of information indicating whether the existence of a point is unknown.
  • the change detection unit 125 calculates the similarity using the spatial feature calculated in this way and the spatial feature calculated by the input feature calculation unit 124 in the above-mentioned method.
• In this case, ValidNum in formula (5) is the number of small regions SR1 in the spherical region S1 other than those defined as "where the presence of a point is unknown". Therefore, the small regions SR1 defined as "where the presence of a point is unknown" are not evaluated in the similarity calculation. This makes it possible to reflect in the change detection a state in which the presence of a point in the reference point group is unknown, thereby further improving the accuracy of change detection between the reference point group and the input point group.
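• A simplified sketch of this three-state labeling. Purely for brevity it assumes the sensor sits at the center of the spherical region, so that each (θ, φ) column of the grid corresponds to one ray; the actual definition uses the sensor position at measurement time.

```python
def fill_unknown(grid):
    """Mark EMPTY bins that no ray is known to have traversed as UNKNOWN: bins
    behind the first OCCUPIED bin of a column, or in columns with no return at
    all, cannot be confirmed empty."""
    n_theta, n_phi, _ = grid.shape
    for t in range(n_theta):
        for f in range(n_phi):
            column = grid[t, f]  # view into grid, so edits below stick
            hits = np.nonzero(column == OCCUPIED)[0]
            tail = column[hits[0] + 1:] if hits.size else column
            tail[tail == EMPTY] = UNKNOWN
    return grid
```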
  • the reference point cloud and the input point cloud may be mapping data generated based on two-dimensional images of a specific location taken by a camera or a positioning sensor.
  • the following cases are also considered as cases in which a misalignment occurs between the images related to the reference point cloud and the images related to the input point cloud.
  • a misalignment may occur when images of the same location are taken at different times using a movable camera carried by a person and the reference point cloud and the input point cloud are generated based on the images.
  • the change detection unit 125 may perform process (I) for a number of coordinates of less than all of the input point cloud and a number of coordinates of the reference point cloud corresponding to those coordinates. Similarly, the change detection unit 125 may perform process (II) for a number of coordinates of less than all of the reference point cloud and a number of coordinates of the input point cloud corresponding to those coordinates.
  • the detection result generation unit 126 generates an image showing the changes detected by the change detection unit 125 based on this determination result. In this way, if there is an area in the point cloud that does not require determination, the center server 120 can detect changes in the coordinates excluding that area.
  • FIG. 12 is a block diagram showing another example of the center server according to the second embodiment.
• In this example, the center server 120 further includes an extraction unit 127 in addition to the components shown in FIG. 6. Below, descriptions of the points that overlap with (2A) are omitted, and only the points specific to this example are described.
• the extraction unit 127 directly compares the presence or absence of a point at each corresponding coordinate between the reference point group acquired by the reference point group acquisition unit 121 and the input point group acquired by the input point group acquisition unit 122. It thereby extracts information on the coordinates where the presence or absence of a point at corresponding coordinates differs between the reference point group and the input point group (information on changed coordinates), and does not extract the other coordinates.
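• Assuming both clouds have been voxelized onto a common occupancy grid (an assumption of this sketch), the extraction step reduces to a direct element-wise comparison:

```python
def extract_changed_coords(ref_occ, in_occ):
    """Return the grid coordinates at which the presence or absence of a point
    differs between the reference and input clouds (boolean arrays, same shape)."""
    return np.argwhere(ref_occ != in_occ)
```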
  • the reference feature calculation unit 123 and the input feature calculation unit 124 execute the process shown in (2A) for the coordinates of the reference point group and the coordinates of the input point group extracted in this manner, and calculate the spatial feature.
  • the change detection unit 125 and the detection result generation unit 126 execute the process shown in (2A) using the calculated spatial feature.
• FIGS. 13A to 13C are flowcharts showing an example of representative processing of the center server 120 in (2B), corresponding to FIGS. 10A to 10C; an overview of this processing is explained using these flowcharts. Note that descriptions of the points that are the same as in (2A) are omitted as appropriate.
  • the reference point cloud acquisition unit 121 acquires reference point cloud data from the reference point cloud DB 130 (step S21; acquisition step). Also, the input point cloud acquisition unit 122 acquires input point cloud data by acquiring data transmitted by the robot 101 (step S22; acquisition step).
• Next, the extraction unit 127 directly compares the presence or absence of a point at corresponding coordinates between the reference point group acquired by the reference point group acquisition unit 121 and the input point group acquired by the input point group acquisition unit 122, thereby extracting information on changed coordinates (step S26; change detection step).
  • the reference feature amount calculation unit 123, the input feature amount calculation unit 124, and the change detection unit 125 execute process (A') (step S23'; process (A') step). Details of process (A') will be explained using FIG. 13B.
• Similarly, the reference feature calculation unit 123, the input feature calculation unit 124, and the change detection unit 125 execute process (B') (step S24'; process (B') step). Process (B') corresponds to the above process (II) and the processes related to it, and its details will be explained using FIG. 13C. Note that either of the processes in steps S23' and S24' may be performed first, or both may be performed in parallel.
  • the detection result generating unit 126 generates an image showing the change detected with respect to process (A') based on the information indicating the presence or absence of a change at each coordinate generated by process (A'). Similarly, the detection result generating unit 126 generates an image showing the change detected with respect to process (B') based on the information indicating the presence or absence of a change at each coordinate generated by process (B') (step S25; detection result image generating step).
  • the input feature calculation unit 124 calculates spatial features for coordinates in the input point cloud extracted by the extraction unit 127 for which features have not yet been calculated (step S31; feature calculation step).
  • the input feature calculation unit 124 outputs information on the calculated coordinates to the reference feature calculation unit 123.
• Based on the output coordinate information, the reference feature calculation unit 123 identifies the coordinate F1 in the reference point group that corresponds to that coordinate. The reference feature calculation unit 123 then sets a neighborhood region Ω1 that includes the coordinate F1, and calculates the spatial feature of each coordinate included in the neighborhood region Ω1 (step S32; feature calculation step). As described above, steps S31 and S32 may be performed in either order, or both in parallel.
  • the change detection unit 125 calculates the similarity between coordinates F1 and F2 using the spatial features calculated in steps S31 and S32 (step S33; similarity calculation step).
  • the change detection unit 125 compares the maximum value of the calculated similarity with a predetermined threshold value to determine whether or not a change has occurred in the presence or absence of a point at coordinate F2 (step S34; change detection step).
  • the change detection unit 125 determines whether or not the similarity calculation and change detection determination have been completed for all coordinates extracted in the input point cloud (step S36; completion determination step). If the similarity calculation and change detection determination have not been completed for all extracted coordinates (No in step S36), the process returns to step S31 and repeats the process for the coordinates for which the similarity has not been calculated. If the similarity calculation and change detection determination have been completed for all extracted coordinates (Yes in step S36), process (A') ends. Then, as described above, the detection result generation unit 126 generates a change detection image based on the information generated by process (A') that indicates the presence or absence of a change at each coordinate.
  • the reference feature calculation unit 123 calculates spatial features for coordinates in the reference point group extracted by the extraction unit 127 for which features have not yet been calculated (step S41; feature calculation step).
  • the reference feature calculation unit 123 outputs information on the calculated coordinates to the input feature calculation unit 124.
• Based on the output coordinate information, the input feature calculation unit 124 identifies the coordinate F2 in the input point cloud that corresponds to that coordinate.
• the input feature calculation unit 124 sets a neighborhood region Ω2 that includes the coordinate F2, and calculates the spatial feature of each coordinate included in the neighborhood region Ω2 (step S42; feature calculation step).
• Note that steps S41 and S42 may be performed in either order, or both processes may be performed in parallel.
  • the change detection unit 125 calculates the similarity between coordinates F1 and F2 using the spatial features calculated in steps S41 and S42 (step S43; similarity calculation step). The change detection unit 125 compares the maximum value of the calculated similarity with a predetermined threshold value to determine whether or not a change has occurred in the presence or absence of a point at coordinates F1 (step S44; change detection step).
  • the change detection unit 125 determines whether or not the similarity calculation and change detection determination have been completed for all coordinates extracted in the reference point group (step S46; completion determination step). If the similarity calculation and change detection determination have not been completed for all extracted coordinates (No in step S46), the process returns to step S41 and repeats the process for the coordinates for which the similarity has not been calculated. If the similarity calculation and change detection determination have been completed for all extracted coordinates (Yes in step S46), process (B') ends. Then, as described above, the detection result generation unit 126 generates an image in which changes have been detected, based on the information indicating the presence or absence of changes at each coordinate, generated by process (B').
• In this manner, the center server 120 can extract information on changed coordinates using the extraction unit 127 and execute the change detection process shown in (2A) only for the extracted coordinates. This can reduce the overall calculation cost of the process compared to executing the change detection process shown in (2A) for all coordinates of the input point cloud or the reference point cloud.
• FIG. 14A is a block diagram showing another example of the center server according to the second embodiment.
  • the center server 120 further includes an object identification unit 128 in addition to the components shown in Fig. 12.
• the object identification unit 128 determines whether the change in the presence or absence of points shown in the image generated by the detection result generation unit 126 as a result of executing the process shown in (I), or in the image generated as a result of executing the process shown in (II), corresponds to some object (e.g., a container, a vehicle, construction materials, etc.).
• For example, the object identification unit 128 determines whether the changed point cloud portion is equal to or larger than a predetermined size; if so, it determines that the changed point cloud portion corresponds to some kind of object, and if the portion is smaller than the predetermined size, it determines that the portion is noise.
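• A minimal sketch of this size-based screening; the point-count threshold is an illustrative stand-in for "a predetermined size".

```python
def classify_changed_cluster(changed_points, min_points=30):
    """Treat small changed clusters as noise and larger ones as some object."""
    return "object" if len(changed_points) >= min_points else "noise"
```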
  • the object identification unit 128 may compare the location of the changed point cloud with pre-stored point cloud data of various objects for determination, such as containers, vehicles, construction materials, etc. If the location of the changed point cloud matches the point cloud data of any object, or matches except for an error of, for example, a few percent, the object identification unit 128 identifies the matching object as an object whose presence or absence has changed between the input point cloud and the reference point cloud.
  • the object identification unit 128 may perform object determination using a pre-trained AI model. This training is performed by inputting training data, including image information showing the changed point cloud as a sample and information (correct label) showing various objects corresponding to that information, into the AI model. After the AI model has been trained using the training data, the object identification unit 128 inputs the image generated by the detection result generation unit 126 into the AI model. Based on this input image, the AI model outputs information showing the object shown in the input image. In this way, the object identification unit 128 can also perform object identification processing. Note that any technology, such as logistic regression or neural network, can be used to train the learning model.
  • the object identifying unit 128 determines whether the change in the presence or absence of points shown in the point cloud, which is the detection result, corresponds to some object.
• In this case as well, the object identification unit 128 can perform object determination using a pre-trained AI model. This training is performed by inputting training data, including a sample changed point cloud and information (correct labels) indicating the various objects corresponding to it, into the AI model. After the AI model has been trained using the training data, the object identification unit 128 inputs the point cloud generated by the detection result generation unit 126 into the AI model. Based on this point cloud, the AI model can output information indicating the object represented by the point cloud.
• The other determination methods that the object identification unit 128 can execute are as described above.
  • the center server 120 can detect whether there has been a change in the presence or absence of an object between the reference point group and the input point group. More preferably, the center server 120 can also identify an object whose presence or absence has changed between the reference point group and the input point group.
  • the object identification unit 128 may generate and output a screen on which a process for visually highlighting the identified object has been performed. The process for visually highlighting the object is as described in the first embodiment.
  • the object identification unit 128 may output an alert by voice via a speaker.
• the object identification unit 128 may also output its determination result to another device.
• FIG. 14B is a block diagram showing another example of the center server according to the second embodiment.
  • the center server 120 further includes a mobility control unit 129.
• Descriptions of the points explained in (2A) to (2C) are omitted, and only the points unique to this example are explained.
  • the movement control unit 129 controls the movement of the robot 101. For example, when the object identification unit 128 determines that the location of the changed point cloud corresponds to some kind of object, the movement control unit 129 can instruct the robot 101 to approach the location of the changed point cloud and then measure the location further. This instruction is given in order to obtain more detailed point cloud data of the location and analyze it. Alternatively, the movement control unit 129 may control the robot 101 as described above when the object identification unit 128 identifies a specific type of object.
  • This instruction is output in at least one of the following cases: as a result of the process shown in (I), it is determined that an object does not exist in the reference point cloud and an object has become present in the input point cloud, or as a result of the process shown in (II), it is determined that an object exists in the reference point cloud and that the object no longer exists in the input point cloud.
  • the instruction is output at least when an object does not exist in the reference point cloud and an object has become present in the input point cloud.
• FIGS. 15A and 15B are flowcharts showing an example of representative processing of the center server 120 in (2D), corresponding to FIG. 13A; an overview of this processing is explained using these flowcharts. Note that descriptions of the points that are the same as in (2A) and (2B) are omitted as appropriate.
  • Steps S21 to S25 are the same as those in FIG. 13A, and therefore will not be described here.
  • the object identification unit 128 executes the process shown in (2C) and determines whether or not the changed point cloud portion corresponds to any object in the image generated by the detection result generation unit 126 through the process of (I) (step S27; object detection determination step). If the changed point cloud portion does not correspond to an object (No in step S27), the movement control unit 129 does not execute any special control. In this case, the robot 101 moves along, for example, a previously set movement route.
• If the changed point cloud portion corresponds to an object (Yes in step S27), the movement control unit 129 controls the robot 101 to approach the changed point cloud portion and then measure that portion further (step S28; robot movement step). In this case, the robot 101 deviates from the previously set movement route.
  • steps S27 and S28 can also be performed on the images generated by the detection result generating unit 126 through the process of (II).
• the center server 120 can also execute the processes shown in (2A) to (2D) on the data acquired when the robot approaches and measures the location as a result of the process of step S28.
  • the center server 120 controls the robot 101 to acquire detailed point cloud data. This enables more detailed inspection of failures or abnormalities in infrastructure facilities. Note that even when the detection result generating unit 126 generates a point cloud, the movement control unit 129 can execute similar control based on the object determination result of the object identifying unit 128.
  • the movement control unit 129 may set the movement route to a route that avoids the location where the object is present.
  • the movement control unit 129 controls the movement of the robot 101 so that the robot 101 moves along the newly set route.
  • steps S31 to S34 are repeatedly executed for each point in the input point cloud.
  • a process of calculating spatial features for all coordinates to be processed in the input point cloud and a process of calculating spatial features for an area including the coordinates of the reference point cloud that correspond to each coordinate of the input point cloud may be executed first.
• Thereafter, the processes of steps S33 and S34 may be executed for all coordinates, making it possible to detect changes in the presence or absence of points in the area to be processed.
  • the order of the processes may be similarly changed.
  • polar coordinates are used to define the spherical region S1 surrounding the coordinate F1 and the spherical region S2 surrounding the coordinate F2.
  • the use of polar coordinates has the effect of making the calculations of the reference feature calculation unit 123 and the input feature calculation unit 124 easier.
• In the above, the change detection unit 125 calculates S(P1, P2) by simply adding Score(P1_θφr, P2_θφr) shown in (6).
• However, S(P1, P2) may be calculated by other methods, provided that S(P1, P2) increases monotonically with the number of Same_around(P1_θφr, P2_θφr) present over the entire range of θ, φ, and r.
  • the change detection unit 125 calculates S(P1, P2) as being proportional to the inverse of ValidNum.
  • S(P1, P2) may be calculated by other methods so that S(P1, P2) decreases monotonically with respect to ValidNum.
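• Any variant that preserves these two monotonicity properties would serve; one illustrative weighted form (not from the publication) is:

```latex
S(P1, P2) = \frac{1}{\mathrm{ValidNum}^{\alpha}}
            \sum_{\theta,\varphi,r} w_{\theta\varphi r}\,
            \mathrm{Score}\!\left(P1_{\theta\varphi r}, P2_{\theta\varphi r}\right),
\qquad \alpha > 0,\ w_{\theta\varphi r} > 0
```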
  • any part of the above-mentioned processing executed by each unit of the center server 120 may be executed by at least one of the robot 101 and a different server.
  • the processing of the center server 120 may be realized by a distributed system.
  • the center server 120 may not be provided, and the robot 101 may execute the above-mentioned processing of the center server 120 in a stand-alone manner.
  • the reference point cloud and the position information at which the reference point cloud was measured are stored in association with each other in the storage unit of the robot 101.
  • the robot 101 performs measurement with its own LiDAR 102 and acquires the input point cloud.
  • the robot 101 searches the storage unit using the position information at which the input point cloud was measured, and acquires the data of the reference point cloud to be compared with the input point cloud.
  • the details are the same as those of the processing of the reference point cloud acquisition unit 121 described above.
  • the robot 101 can execute the processing related to the reference feature amount calculation unit 123 to the detection result generation unit 126 described above.
  • the robot 101 may also execute processing related to the object identification unit 128.
• When the robot 101 detects an object with the object identification unit 128, it can control its own movement unit to approach the location of the changed point cloud and then measure that location.
• In the above embodiments, this disclosure has been described as a hardware configuration, but this disclosure is not limited to this.
  • This disclosure can also be realized by having a processor in a computer execute a computer program to execute the processes (steps) of the change detection device, each device in the change detection system, or the center server described in the above embodiment.
  • FIG. 16 is a block diagram showing an example of the hardware configuration of an information processing device in which the processes of the above-described embodiments are executed.
  • this information processing device 90 includes a signal processing circuit 91, a processor 92, and a memory 93.
  • the signal processing circuit 91 is a circuit for processing signals according to the control of the processor 92.
  • the signal processing circuit 91 may also include a communication circuit for receiving signals from a transmitting device.
  • the processor 92 is connected (coupled) to the memory 93, and performs the processing of the device described in the above embodiment by reading and executing software (computer programs) from the memory 93.
• Examples of the processor 92 include a CPU (Central Processing Unit), an MPU (Micro Processing Unit), an FPGA (Field-Programmable Gate Array), a DSP (Digital Signal Processor), and an ASIC (Application Specific Integrated Circuit).
  • a single processor may be used as the processor 92, or multiple processors may be used in cooperation with each other.
  • Memory 93 may be composed of volatile memory, non-volatile memory, or a combination of both.
  • the volatile memory may be, for example, a RAM (Random Access Memory) such as DRAM (Dynamic Random Access Memory) or SRAM (Static Random Access Memory).
• the non-volatile memory may be, for example, a ROM (Read Only Memory) such as a PROM (Programmable Read Only Memory) or an EPROM (Erasable Programmable Read Only Memory), flash memory, or an SSD (Solid State Drive).
  • a single memory may be used as memory 93, or multiple memories may be used in cooperation with each other.
  • the memory 93 is used to store one or more instructions.
  • the one or more instructions are stored in the memory 93 as a group of software modules.
  • the processor 92 can perform the processing described in the above embodiment by reading and executing these groups of software modules from the memory 93.
  • the memory 93 may include memory built into the processor 92 in addition to memory provided outside the processor 92.
  • the memory 93 may also include storage located away from the processors that make up the processor 92.
  • the processor 92 can access the memory 93 via an I/O (Input/Output) interface.
  • processors in each device in the above-mentioned embodiments execute one or more programs including a set of instructions for causing a computer to execute the algorithm described in the drawings. This process realizes the information processing described in each embodiment.
  • the program includes instructions (or software code) that, when loaded into a computer, cause the computer to perform one or more functions described in the embodiments.
  • the program may be stored on a non-transitory computer-readable medium or tangible storage medium.
  • computer-readable medium or tangible storage medium may include random-access memory (RAM), read-only memory (ROM), flash memory, solid-state drive (SSD) or other memory technology, CD-ROM, digital versatile disk (DVD), Blu-ray® disk or other optical disk storage, magnetic cassette, magnetic tape, magnetic disk storage or other magnetic storage device.
  • the program may be transmitted on a transitory computer-readable medium or communication medium.
  • transitory computer-readable medium or communication medium may include electrical, optical, acoustic, or other forms of propagated signals.
• (Appendix 2) The change detection method according to claim 1, wherein the first feature amount is calculated using information indicating that a point is present, that a point is not present, or that the presence of a point is unknown in each of the first small regions.
• (Appendix 3) The change detection method, further comprising: calculating a feature amount of each coordinate in a first neighborhood region of the first coordinate in the first point cloud, using, for a third region that includes each such coordinate and is composed of a plurality of third small regions, information regarding the presence of a point in each of the third small regions; and determining whether or not a point that does not exist at the first coordinate exists at the second coordinate, using the feature amount of each coordinate in the first neighborhood region, the first feature amount, and the second feature amount.
• (Appendix 4) The change detection method, wherein the first point cloud data is data acquired using a first sensor, and the size of the first neighborhood region is increased as the distance between the position indicated by the first coordinate and the position of the first sensor when the first point cloud data is acquired increases.
• (Appendix 7) The change detection method according to any one of claims 1 to 6, wherein the second point cloud data is data acquired by a second sensor and, for a second small region in which no point of the second point cloud exists: if the second small region is located between the position of a point existing in the second point cloud at the time the data is acquired and the position of the second sensor, it is defined that no point exists in that second small region; and if the second small region is not located between the position of a point existing in the second point cloud and the position of the second sensor, the presence of a point in that second small region is defined as unknown.
• The change detection method, wherein the first point cloud data is data acquired by a first sensor and, for a first small region in which no point of the first point cloud exists: if the first small region is located between the position of a point existing in the first point cloud at the time the data is acquired and the position of the first sensor, it is defined that no point exists in that first small region; and if the first small region is not located between the position of a point existing in the first point cloud and the position of the first sensor, the presence of a point in that first small region is defined as unknown.
• The change detection method, wherein the second point cloud data is data acquired using a second sensor, and the size of the second neighborhood region is increased as the distance between the position indicated by the second coordinate and the position of the second sensor when the second point cloud data is acquired increases.
• (Appendix 12) The change detection method according to claim 5 or 11, comprising determining whether a point that exists at the first coordinate does not exist at the second coordinate by calculating a similarity between the first feature amount and the second feature amount and, for each coordinate in the second neighborhood region, a similarity between the feature amount of that coordinate and the first feature amount, wherein: the similarity between the first feature amount and the second feature amount is calculated using the elements of the second feature amount in each of a second small region in the second region and the second small regions included in the peripheral region of that second small region, and the element of the first small region in the first region corresponding to that second small region; and the similarity between the feature amount of each coordinate in the second neighborhood region and the first feature amount is calculated using the elements of the fourth feature amount in each of a fourth small region in the fourth region and the small regions included in the peripheral region of that fourth small region, and the element of the first small region in the first region corresponding to that fourth small region.
  • Appendix 13 a first feature amount calculation means for calculating a first feature amount of a first coordinate in a first point cloud using information on the presence of a point in each of the first small regions, the first region being composed of a plurality of first small regions and including the first coordinate; a second feature amount calculation means for calculating a second feature amount of a second coordinate in a second point group corresponding to the first coordinate, using information indicating that a point exists, that a point does not exist, or that the existence of a point is unknown in a second area including the second coordinate, the second area being composed of a plurality of second small areas; a determination means for determining whether or not a change in the presence or absence of a point has occurred between the first coordinates and the second coordinates by using the first feature amount and the second feature amount;
  • a change detection system comprising: (Appendix 14) the first feature amount calculation means calculates the first feature amount using information indicating that a point is present in each of the first small
  • the first feature amount calculation means further calculates feature amounts of each coordinate in a first neighborhood area of the first coordinate in the first point cloud, in a third area including the coordinate, the third area being composed of a plurality of third small areas, using information regarding the presence of points in each of the third small areas; the determining means determines whether or not a point that does not exist in the first coordinates is present in the second coordinates by using a feature amount of each coordinate in the first neighboring region, the first feature amount, and the second feature amount. 15.
  • The first point cloud data is data acquired using a first sensor, and the first feature amount calculation means increases the size of the first neighboring region as the distance between the position indicated by the first coordinates and the position of the first sensor at the time the first point cloud data is acquired increases.
  • The second feature amount calculation means further calculates a feature amount of each coordinate in a second neighboring region of the second coordinate in the second point cloud, in a fourth region that is composed of a plurality of fourth small regions and includes that coordinate, using information indicating that a point exists, that a point does not exist, or that the existence of a point is unknown in each of the fourth small regions; and the determination means determines whether or not a point existing at the first coordinates does not exist at the second coordinates by using the feature amount of each coordinate in the second neighboring region, the first feature amount, and the second feature amount. The change detection system according to any one of Appendices 13 to 16.
  • (Appendix 18) A detection unit that detects a change in the presence or absence of an object between the first point cloud and the second point cloud by performing the determination at a plurality of corresponding coordinates in the first point cloud and the second point cloud; and an output unit that outputs the result of the detection. The change detection system according to any one of Appendices 13 to 17.
  • The second point cloud data is data acquired by a second sensor. For a second small region in which no point exists in the second point cloud: if the second small region is located between a position where a point exists in the second point cloud at the time the data is acquired and the position of the second sensor, it is defined that no point exists in the second small region; if the second small region is not located between a position where a point exists in the second point cloud and the position of the second sensor, the presence of a point in the second small region is defined as unknown. The change detection system according to any one of Appendices 13 to 18.
  • (Appendix 20) An extraction unit that compares the first point cloud with the second point cloud and extracts coordinates where a point is present or absent in the first point cloud and the second point cloud, wherein at least one of the first coordinates and the second coordinates is a coordinate extracted by the extraction unit. The change detection system according to any one of Appendices 13 to 19.
  • The first point cloud data is data acquired by a first sensor. For a first small region in which no point exists in the first point cloud: if the first small region is located between a position where a point exists in the first point cloud at the time the data is acquired and the position of the first sensor, it is defined that no point exists in the first small region; if the first small region is not located between a position where a point exists in the first point cloud and the position of the first sensor, the presence of a point in the first small region is defined as unknown.
  • The determination means determines whether or not a point that does not exist at the first coordinates exists at the second coordinates by calculating a similarity between the first feature amount and the second feature amount and, for each coordinate in the first neighboring region, a similarity between the feature amount of that coordinate and the second feature amount; the similarity between the first feature amount and the second feature amount is calculated using elements of the first feature amount in each first small region in the first region and in the first small regions included in a peripheral region of that first small region, and elements of the second feature amount in the second small regions of the second region corresponding to those first small regions; the similarity between the feature amount of each coordinate in the first neighboring region and the second feature amount is calculated using elements of the first feature amount in each third small region in the third region and in the small regions included in a peripheral region of that third small region, and elements of the second feature amount in the second small regions of the second region corresponding to those third small regions.
  • The second point cloud data is data acquired using a second sensor, and the second feature amount calculation means increases the size of the second neighborhood region as the distance between the position indicated by the second coordinates and the position of the second sensor at the time the second point cloud data is acquired increases.
  • The determination means determines whether or not a point existing at the first coordinates does not exist at the second coordinates by calculating a similarity between the first feature amount and the second feature amount and, for each coordinate in the second neighboring region, a similarity between the feature amount of that coordinate and the first feature amount; the similarity between the first feature amount and the second feature amount is calculated using elements of the second feature amount in each second small region in the second region and in the second small regions included in a peripheral region of that second small region, and elements of the first feature amount in the first small regions of the first region corresponding to those second small regions; the similarity between the feature amount of each coordinate in the second neighboring region and the first feature amount is calculated using elements of the fourth feature amount in each fourth small region in the fourth region and in the small regions included in a peripheral region of that fourth small region, and elements of the first feature amount in the first small regions of the first region corresponding to those fourth small regions. The change detection system according to Appendix 17 or 23.
  • (Appendix 25) A change detection device comprising: a first feature amount calculation means for calculating a first feature amount of a first coordinate in a first point cloud, using information on the presence of a point in each of a plurality of first small regions in a first region that is composed of the first small regions and includes the first coordinate; a second feature amount calculation means for calculating a second feature amount of a second coordinate in a second point cloud corresponding to the first coordinate, using information indicating that a point exists, that a point does not exist, or that the existence of a point is unknown in each of a plurality of second small regions in a second region that is composed of the second small regions and includes the second coordinate; and a determination means for determining, by using the first feature amount and the second feature amount, whether or not a change in the presence or absence of a point has occurred between the first coordinates and the second coordinates.
  • (Appendix 26) The first feature amount calculation means calculates the first feature amount using information indicating that a point is present in each of the first small regions. The change detection apparatus according to Appendix 25.
  • (Appendix 27) The first feature amount calculation means further calculates a feature amount of each coordinate in a first neighborhood region of the first coordinate in the first point cloud, in a third region that is composed of a plurality of third small regions and includes that coordinate, using information regarding the presence of points in each of the third small regions; and the determination means determines whether or not a point that does not exist at the first coordinates exists at the second coordinates by using the feature amount of each coordinate in the first neighboring region, the first feature amount, and the second feature amount. The change detection device according to Appendix 25 or 26.
  • The first point cloud data is data acquired by a first sensor, and the first feature amount calculation means increases the size of the first neighboring region as the distance between the position indicated by the first coordinates and the position of the first sensor at the time the data is acquired increases.
  • The second feature amount calculation means further calculates a feature amount of each coordinate in a second neighboring region of the second coordinate in the second point cloud, in a fourth region that is composed of a plurality of fourth small regions and includes that coordinate, using information indicating that a point exists, that a point does not exist, or that the existence of a point is unknown in each of the fourth small regions; and the determination means determines whether or not a point existing at the first coordinates does not exist at the second coordinates by using the feature amount of each coordinate in the second neighboring region, the first feature amount, and the second feature amount. The change detection device according to any one of Appendices 25 to 28.
  • (Appendix 30) An extraction unit that compares the first point cloud with the second point cloud and extracts coordinates where a point is present or absent in the first point cloud and the second point cloud, wherein at least one of the first coordinates and the second coordinates is a coordinate extracted by the extraction unit. The change detection device according to any one of Appendices 25 to 29.
  • The second point cloud data is data acquired by a second sensor. For a second small region in which no point exists in the second point cloud: if the second small region is located between a position where a point exists in the second point cloud at the time the data is acquired and the position of the second sensor, it is defined that no point exists in the second small region; if the second small region is not located between a position where a point exists in the second point cloud and the position of the second sensor, the presence of a point in the second small region is defined as unknown. The change detection device according to any one of Appendices 25 to 30.
  • The first point cloud data is data acquired by a first sensor. For a first small region in which no point exists in the first point cloud: if the first small region is located between a position where a point exists in the first point cloud at the time the data is acquired and the position of the first sensor, it is defined that no point exists in the first small region; if the first small region is not located between a position where a point exists in the first point cloud and the position of the first sensor, the presence of a point in the first small region is defined as unknown. The change detection apparatus according to Appendix 26.
  • The determination means determines whether or not a point that does not exist at the first coordinates exists at the second coordinates by calculating a similarity between the first feature amount and the second feature amount and, for each coordinate in the first neighboring region, a similarity between the feature amount of that coordinate and the second feature amount; the similarity between the first feature amount and the second feature amount is calculated using elements of the first feature amount in each first small region in the first region and in the first small regions included in a peripheral region of that first small region, and elements of the second feature amount in the second small regions of the second region corresponding to those first small regions; the similarity between the feature amount of each coordinate in the first neighboring region and the second feature amount is calculated using elements of the first feature amount in each third small region in the third region and in the small regions included in a peripheral region of that third small region, and elements of the second feature amount in the second small regions of the second region corresponding to those third small regions. The change detection apparatus according to Appendix 27 or 28.
  • (Appendix 34) The second point cloud data is data acquired using a second sensor, and the second feature amount calculation means increases the size of the second neighborhood region as the distance between the position indicated by the second coordinates and the position of the second sensor at the time the second point cloud data is acquired increases. The change detection apparatus according to Appendix 29.
  • The determination means determines whether or not a point existing at the first coordinates does not exist at the second coordinates by calculating a similarity between the first feature amount and the second feature amount and, for each coordinate in the second neighboring region, a similarity between the feature amount of that coordinate and the first feature amount; the similarity between the first feature amount and the second feature amount is calculated using elements of the second feature amount in each second small region in the second region and in the second small regions included in a peripheral region of that second small region, and elements of the first feature amount in the first small regions of the first region corresponding to those second small regions; the similarity between the feature amount of each coordinate in the second neighboring region and the first feature amount is calculated using elements of the fourth feature amount in each fourth small region in the fourth region and in the small regions included in a peripheral region of that fourth small region, and elements of the first feature amount in the first small regions of the first region corresponding to those fourth small regions. The change detection apparatus according to Appendix 29 or 34.
  • (Appendix 36) A non-transitory computer-readable medium storing a program that causes a computer to execute processing comprising: calculating a first feature amount of a first coordinate in a first point cloud, using information about the presence of a point in each of a plurality of first small regions in a first region that is composed of the first small regions and includes the first coordinate; calculating a second feature amount of a second coordinate in a second point cloud corresponding to the first coordinate, using information indicating that a point exists, that a point does not exist, or that the existence of a point is unknown in each of a plurality of second small regions in a second region that is composed of the second small regions and includes the second coordinate; and determining, by using the first feature amount and the second feature amount, whether or not a change in the presence or absence of a point has occurred between the first coordinates and the second coordinates.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Geophysics (AREA)
  • Image Analysis (AREA)

Abstract

In one aspect, a change detection apparatus (10) according to the present embodiment includes: a first feature calculation unit (11) that calculates a first feature of first coordinates in a first point cloud by using information about the presence of a point in each of a plurality of first subregions in a first region which is formed from the first subregions and which contains the first coordinates; a second feature calculation unit (12) that calculates a second feature of second coordinates in a second point cloud, the second coordinates corresponding to the first coordinates, using information indicating whether a point is present, a point is absent, or the presence of a point is indeterminate in each of a plurality of second subregions in a second region which is formed from the second subregions and which contains the second coordinates; and a determination unit (13) that uses the first and second features to determine whether or not a change in the presence or absence of a point has occurred between the first coordinates and the second coordinates.

Description

Change detection method, change detection system, and change detection device
The present invention relates to a change detection method, a change detection system, and a change detection device.
Research is being conducted into technologies in which robots such as drones perform surveillance work by patrolling locations to be monitored and photographing predetermined places, for purposes such as inspecting various infrastructure facilities and crime prevention.
For example, Patent Document 1 describes a technology that compares previously captured video with currently captured video to detect whether a change has occurred in a target portion. At this time, target data for changed portions caused by work is excluded from the output. Patent Document 2 describes a technology that acquires three-dimensional distance data and detects the presence or absence of an object based on the acquired three-dimensional distance data by using a three-dimensional polar-coordinate grid map.
JP 2022-063600 A; JP 2021-081235 A
When two sets of point cloud data are compared, if the environments in which they were captured or the devices used to capture them differ, the densities of the points in the point cloud data can be expected to differ; that is, the density of one point cloud will be lower than that of the other. In this case, it becomes difficult to compare the point clouds accurately, and the accuracy of detecting changes may decrease. The technologies of Patent Documents 1 and 2 disclose comparing videos and detecting the presence or absence of objects, but they do not mention this problem and cannot solve it.
The objective of the present disclosure is to provide a change detection method, a change detection system, and a change detection device capable of improving the accuracy of detecting changes between point clouds even when a point cloud to be compared does not represent accurate information.
A change detection method according to one aspect of the present embodiment is executed by a computer. The method calculates a first feature of a first coordinate in a first point cloud, using information about the presence of a point in each of a plurality of first small regions in a first region that is composed of the first small regions and includes the first coordinate; calculates a second feature of a second coordinate in a second point cloud corresponding to the first coordinate, using information indicating that a point is present, that a point is absent, or that the presence of a point is unknown in each of a plurality of second small regions in a second region that is composed of the second small regions and includes the second coordinate; and determines, using the first feature and the second feature, whether a change in the presence or absence of a point has occurred between the first coordinate and the second coordinate.
A change detection system according to one aspect of the present embodiment includes: a first feature calculation means that calculates a first feature of a first coordinate in a first point cloud, using information about the presence of a point in each of a plurality of first small regions in a first region that is composed of the first small regions and includes the first coordinate; a second feature calculation means that calculates a second feature of a second coordinate in a second point cloud corresponding to the first coordinate, using information indicating that a point is present, that a point is absent, or that the presence of a point is unknown in each of a plurality of second small regions in a second region that is composed of the second small regions and includes the second coordinate; and a determination means that determines, using the first feature and the second feature, whether a change in the presence or absence of a point has occurred between the first coordinate and the second coordinate.
A change detection device according to one aspect of the present embodiment includes: a first feature calculation means that calculates a first feature of a first coordinate in a first point cloud, using information about the presence of a point in each of a plurality of first small regions in a first region that is composed of the first small regions and includes the first coordinate; a second feature calculation means that calculates a second feature of a second coordinate in a second point cloud corresponding to the first coordinate, using information indicating that a point is present, that a point is absent, or that the presence of a point is unknown in each of a plurality of second small regions in a second region that is composed of the second small regions and includes the second coordinate; and a determination means that determines, using the first feature and the second feature, whether a change in the presence or absence of a point has occurred between the first coordinate and the second coordinate.
According to the present disclosure, it is possible to provide a change detection method, a change detection system, and a change detection device capable of improving the accuracy of detecting changes between point clouds even when a point cloud to be compared does not represent accurate information.
A block diagram showing an example of the change detection device according to the first embodiment.
A diagram for explaining calculations performed by the feature amount calculation unit according to the first embodiment.
A flowchart showing an example of representative processing of the change detection device according to the first embodiment.
A block diagram showing an example of the change detection system according to the first embodiment.
A block diagram showing an example of the monitoring system according to the second embodiment.
A block diagram showing an example of the center server according to the second embodiment.
An example of a reference point cloud according to the second embodiment.
An example of an input point cloud according to the second embodiment.
An example of a situation in which data of the reference point cloud according to the second embodiment is acquired.
An example of a situation in which data of the input point cloud according to the second embodiment is acquired.
An ideal comparison result between the reference point cloud and the input point cloud according to the second embodiment.
An actual comparison result between the reference point cloud and the input point cloud according to the second embodiment.
A diagram showing a polar coordinate system centered on a coordinate that is the target of calculation in the reference point cloud according to the second embodiment.
An example of a spatial feature amount calculated for a coordinate of the reference point cloud according to the second embodiment.
An example of a spatial feature amount calculated for a coordinate of the input point cloud according to the second embodiment.
A diagram for explaining equation (5) in the examples of FIGS. 9B and 9C.
A flowchart outlining an example of processing by the center server according to the second embodiment.
A flowchart showing an example of detailed processing of the center server.
A flowchart showing another example of detailed processing of the center server.
An example of an image in which changes are detected by the direct comparison method.
An example of an image in which changes are detected by the method of the present disclosure.
A block diagram showing another example of the center server according to the second embodiment.
A flowchart outlining an example of processing by the center server according to the second embodiment.
A flowchart showing an example of detailed processing of the center server.
A flowchart showing another example of detailed processing of the center server.
A block diagram showing another example of the center server according to the second embodiment.
A block diagram showing another example of the center server according to the second embodiment.
A flowchart outlining an example of processing by the center server according to the second embodiment.
A flowchart outlining an example of processing by the center server according to the second embodiment.
A block diagram showing an example of the hardware configuration of the apparatus according to each embodiment.
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings. Note that the following description and drawings of the embodiments are omitted and simplified as appropriate for clarity of explanation. In addition, in this disclosure, unless otherwise specified, when "at least any of" is defined for a plurality of items, the definition may mean any one of the items or any plurality of the items (including all of the items).
First embodiment
(1A)
A first embodiment of the present disclosure will be described below with reference to the drawings. In this embodiment (1A), a change detection device will be described.
FIG. 1 is a block diagram showing an example of the change detection device. The change detection device 10 includes a first feature amount calculation unit 11, a second feature amount calculation unit 12, and a determination unit 13. Each unit (each means) of the change detection device 10 is controlled by a control unit (controller), not shown. Each unit is described below.
[Configuration Description]
The first feature amount calculation unit 11 calculates a first feature amount of a first coordinate in the first point cloud. The first point cloud is first data indicating the presence or absence of a point at each coordinate in a predetermined region. The first point cloud represents, for example, the shape of an object in three-dimensional space; one example is data, acquired using a sensor, in which an object at a predetermined location is visualized. The sensor used may be, for example, a range sensor or an imaging element such as a camera. When a range sensor is used, the first point cloud is mapping data in which an object at a predetermined location has been measured and visualized. A specific example of the range sensor is LiDAR (Light Detection and Ranging); either 3D LiDAR or 2D LiDAR can be used, and an example using 3D LiDAR will be described later in the second embodiment. When a camera is used, the first point cloud is mapping data generated based on a two-dimensional image of a predetermined location. The above measurement or imaging may be performed in reality or virtually on a computer. However, the target represented by the first point cloud is not limited to these. The change detection device 10 may acquire the first point cloud data from outside the change detection device 10 or may generate it within the change detection device 10.
The first feature amount calculation unit 11 calculates the first feature amount as follows. The first feature amount calculation unit 11 defines, for the first point cloud, a first region that is composed of a plurality of first small regions and includes the first coordinate. It then calculates the first feature amount by using information about the presence of a point in each first small region. The information about the presence of a point may be, for example, information indicating whether a point is present or absent in each first small region. As another example, it may be information indicating that a point is present, that a point is absent, or that the presence of a point is unknown in each first small region. The definition of the presence of a point being unknown will be described later. The information about the presence of a point may be generated by the first feature amount calculation unit 11 by analyzing the acquired first point cloud, or may be included in information that the first feature amount calculation unit 11 acquires from outside.
FIG. 2 is a diagram for explaining the calculation performed by the first feature amount calculation unit 11. In FIG. 2, the first point cloud is denoted G1, the first coordinate is denoted FC, and the first region including FC is denoted R1. The region R1 can be divided into small regions SR1. In FIG. 2, a black circle indicates that a point is present in the point cloud G1, and a white circle indicates that no point is present.
In the state of FIG. 2, the first feature amount calculation unit 11 defines, for the coordinate FC for which the feature amount is to be calculated, a region R1 that is composed of a plurality of small regions SR1 and includes the coordinate FC. The first feature amount calculation unit 11 then calculates the feature amount of the coordinate FC by using information about the presence of a point in each small region SR1 of the region R1.
The calculated first feature amount may be expressed as a scalar quantity or as a vector quantity. When the first feature amount is expressed as a scalar, the corresponding second feature amount (described later) is also expressed as a scalar. In the determination processing of the determination unit 13 described later, the determination is performed by comparing the first feature amount with the corresponding second feature amount.
When the first feature amount is expressed as a vector, the corresponding second feature amount is also expressed as a vector. In the determination processing of the determination unit 13 described later, each element of the first feature amount is compared with the corresponding element of the second feature amount. A specific example in which the feature amount is expressed as a vector will be described in detail in the second embodiment.
Note that FIG. 2 shows an example in which information on the presence or absence of one point is associated with one small region SR1. However, by increasing the size of the small region SR1, information on the presence or absence of a plurality of points may be associated with one small region SR1. The number of small regions SR1 included in the region R1 and the shapes of the region R1 and the small regions SR1 are arbitrary.
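As one illustration of the feature calculation described above, the following sketch encodes the region R1 around a coordinate as a flattened occupancy vector, one element per small region SR1. This is not the embodiment's own implementation; the three-state encoding (1 = point present, 0 = no point, -1 = presence unknown), the function name compute_feature, and the region size are assumptions made for the example.

```python
import numpy as np

def compute_feature(occupancy: np.ndarray, center: tuple, half: int = 2) -> np.ndarray:
    """Return the feature of `center` as the flattened occupancy values of
    the region around it, i.e. one vector element per small region."""
    # Pad with "unknown" (-1) so coordinates near the grid border still
    # yield a fixed-length feature vector.
    padded = np.pad(occupancy, half, constant_values=-1)
    x, y, z = (c + half for c in center)
    region = padded[x - half:x + half + 1,
                    y - half:y + half + 1,
                    z - half:z + half + 1]
    return region.flatten()

# Example: a 10x10x10 occupancy grid with one occupied and one unknown cell.
grid = np.zeros((10, 10, 10), dtype=np.int8)
grid[5, 5, 5] = 1    # a point exists in this small region
grid[5, 6, 5] = -1   # the presence of a point is unknown here
feature = compute_feature(grid, (5, 5, 5))
print(feature.shape)  # (125,) for a 5x5x5 region of small regions
```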
The second feature amount calculation unit 12 calculates a second feature amount of a second coordinate in a second point cloud, which is different from the first point cloud. The second point cloud is second data indicating the presence or absence of a point at each coordinate in a predetermined region; examples are the same as for the first point cloud. The change detection device 10 may acquire the second point cloud data from outside the change detection device 10 or may generate it within the change detection device 10.
Here, the first point cloud and the second point cloud are the point clouds to be compared, for example mapping data obtained by measuring the same place. The first coordinate and the second coordinate are the targets of comparison; for example, the first coordinate and the second coordinate may indicate the same position, but the relationship between them is not limited to this.
The change detection device 10 calculates, with the first feature amount calculation unit 11 and the second feature amount calculation unit 12, the respective feature amounts of the first coordinate in the first point cloud and of the second coordinate in the second point cloud corresponding to the first coordinate. By using these feature amounts, it can be determined whether a change in the presence or absence of a point has occurred between the first coordinate and the second coordinate.
The second feature amount calculation unit 12 calculates the second feature amount as follows. The second feature amount calculation unit 12 defines, for the second point cloud, a second region that is composed of a plurality of second small regions and includes the second coordinate. The definition of the second region and the second small regions is the same as the method executed by the first feature amount calculation unit 11, as described in the example of FIG. 2.
The second feature amount calculation unit 12 calculates the second feature amount using information indicating, for each second small region in the second region, that a point is present, that a point is absent, or that the presence of a point is unknown. "The presence of a point is unknown" means that, although the presence or absence of a point is defined for that second small region in the acquired second point cloud, it is considered unknown whether a point actually exists there. When the presence of a point in a second small region is unknown, the second feature amount calculation unit 12 calculates the feature amount for that small region so that it differs from both the case where a point is present and the case where a point is absent.
Whether a point is present, a point is absent, or the presence of a point is unknown is defined, for example, as follows.
In one example, suppose the second point cloud is mapping data acquired by measurement with a range sensor. If a point is present in a second small region, it is defined that a point exists in that second small region. If no point is present in a second small region, it is determined whether the second small region is located between a position where a point exists in the second point cloud at the time of measurement and the position of the range sensor. If the second small region is located between such a position and the position of the range sensor, it is defined that no point exists in the second small region. If it is not, the presence of a point in the second small region is defined as unknown.
This definition is based on the assumption that, when a range sensor acquires mapping data, light should enter the range sensor from the objects shown in the mapping data, and no object should exist between those objects and the range sensor. Ray tracing techniques can be applied to this definition. The definition is particularly effective for improving detection accuracy when, for example, the density of points in the second point cloud is sparser than that of the first point cloud. When the second point cloud is sparse, a location with no point may arise not only where no point actually exists, but also where an object was present at measurement time yet was not recorded as data. In such a case, when it cannot be confirmed that no point actually exists, it is preferable to define the location with no point as a location where the presence of a point is unknown.
Another example is as follows. First, let N be the number of points existing in a second small region. Also, assuming a situation in which, by the ray tracing technique described above, light enters the range sensor from every point existing in the second point cloud, let M be the number of times light passes through that second small region. The presence, absence, or unknown status of a point can then be defined according to the values of N - M and N + M.
Specifically, when the value of N + M is less than a threshold Th1 (Th1 is an integer of 0 or more), the presence of a point in the second small region is defined as unknown. This is because the number of points in the second small region is itself small and light from other points in the second point cloud rarely passes through it, making it difficult to determine whether a point exists there. On the other hand, when N + M is equal to or greater than the threshold Th1, the presence or absence of a point in the second small region is defined according to whether N - M is equal to or greater than a threshold Th2 (Th2 is an integer, for example 0, but not limited to this). When N - M is equal to or greater than Th2, a point is considered highly likely to exist in the second small region, so it is defined that a point exists. When N - M is less than the threshold (for example, a negative value), a point is considered highly likely not to exist, so it is defined that no point exists.
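A minimal sketch of this N/M-based labelling follows, reusing the three-state encoding assumed above. N is the number of points falling inside a small region, and M is the number of measurement rays that pass through it when each observed point is traced back to the sensor position; the default values chosen for th1 and th2 are illustrative, not taken from the text.

```python
OCCUPIED, FREE, UNKNOWN = 1, 0, -1

def label_cell(n_points: int, n_ray_passes: int, th1: int = 1, th2: int = 0) -> int:
    """Classify one small region from N (points inside) and M (ray pass-throughs)."""
    if n_points + n_ray_passes < th1:
        return UNKNOWN   # too little evidence either way
    if n_points - n_ray_passes >= th2:
        return OCCUPIED  # observed points dominate over pass-throughs
    return FREE          # rays passed through more often than points were seen

print(label_cell(0, 0))  # -1: no evidence, presence unknown
print(label_cell(3, 1))  #  1: likely occupied
print(label_cell(0, 4))  #  0: likely free
```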
The information described above, indicating that a point is present, that a point is absent, or that the presence of a point is unknown, may be generated by the second feature amount calculation unit 12 by analyzing the acquired second point cloud, or may be included in information that the second feature amount calculation unit 12 acquires from outside.
The determination unit 13 uses the first feature amount calculated by the first feature amount calculation unit 11 and the second feature amount calculated by the second feature amount calculation unit 12 to determine whether a change in the presence or absence of a point has occurred between the first coordinate and the corresponding second coordinate.
The determination method of the determination unit 13 may be executed by arbitrary arithmetic processing, such as the four basic arithmetic operations, using the first feature amount and the second feature amount, or by an algorithm based on a predefined rule base. For example, when the first feature amount and the second feature amount are scalars, the determination unit 13 may calculate the difference between the first feature amount and the second feature amount and determine whether the difference is equal to or greater than a threshold. When the difference is equal to or greater than the threshold, the determination unit 13 determines that a change in the presence or absence of a point has occurred between the first coordinate and the second coordinate; when the difference is less than the threshold, it determines that no such change has occurred.
As another example, when the first feature amount and the second feature amount are vectors, the determination unit 13 may compare corresponding elements of the vectors and determine, based on the comparison results over all elements, whether a change in the presence or absence of a point has occurred between the first coordinate and the second coordinate. For example, when comparing elements, the determination unit 13 may determine whether the elements take the same value, or compare the difference between the elements with a threshold, and based on this calculate a similarity as the comparison result over all elements. When this similarity is equal to or greater than a predetermined threshold, it is determined that no change in the presence or absence of a point has occurred between the first coordinate and the second coordinate; this corresponds to the case where a point exists at both coordinates or at neither. When the similarity is less than the predetermined threshold, it is determined that a change in the presence or absence of a point has occurred; this corresponds to the case where a point exists at one of the coordinates but not at the other. An example of determination based on such an algorithm is described later in the second embodiment.
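The two judgment styles above can be sketched as follows, again assuming the 1/0/-1 occupancy encoding used earlier. Skipping elements whose state is unknown on either side is one plausible way to keep unreliable cells from being counted as disagreement; the helper names and the similarity threshold are illustrative assumptions, not the embodiment's own definitions.

```python
import numpy as np

def changed_scalar(f1: float, f2: float, threshold: float) -> bool:
    # Scalar case: a change is judged when the difference reaches the threshold.
    return abs(f1 - f2) >= threshold

def similarity(f1: np.ndarray, f2: np.ndarray) -> float:
    # Vector case: compare corresponding elements, ignoring cells whose
    # state is unknown (-1) on either side.
    valid = (f1 != -1) & (f2 != -1)
    if not valid.any():
        return 1.0  # nothing comparable, so no change can be asserted
    return float(np.mean(f1[valid] == f2[valid]))

def changed_vector(f1: np.ndarray, f2: np.ndarray, threshold: float = 0.8) -> bool:
    # A similarity below the threshold means the presence of a point changed.
    return similarity(f1, f2) < threshold

f1 = np.array([1, 1, 0, -1, 0])
f2 = np.array([1, 0, 0, 1, -1])
print(similarity(f1, f2))      # 0.666...: three comparable elements, two agree
print(changed_vector(f1, f2))  # True with the illustrative 0.8 threshold
```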
Furthermore, the determination method of the determination unit 13 may be executed using an AI (Artificial Intelligence) model trained in advance, such as a neural network. This training is performed by inputting to the AI model teacher data that includes sample information on the first and second feature amounts and, corresponding to that information, information (a correct-answer label) indicating whether a change in the presence or absence of a point has occurred between the first coordinate and the second coordinate. After the AI model has been trained using the teacher data, the determination unit 13 inputs the first feature amount calculated by the first feature amount calculation unit 11 and the second feature amount calculated by the second feature amount calculation unit 12 to the AI model. Based on this input, the AI model outputs information indicating whether a change in the presence or absence of a point has occurred between the first coordinate and the second coordinate. In this way as well, the determination unit 13 can execute the determination processing. Note that any technique, such as logistic regression or a neural network, can be used to train the learning model.
[Explanation of processing flow]
FIG. 3 is a flowchart showing an example of representative processing of the change detection device 10, and an overview of the processing of the change detection device 10 is explained with this flowchart. The details of each process are as described above and are therefore omitted.
First, the first feature amount calculation unit 11 calculates the first feature amount of the first coordinate in the first point cloud (step S11; first feature amount calculation step). The second feature amount calculation unit 12 calculates the second feature amount of the second coordinate in the second point cloud (step S12; second feature amount calculation step). The determination unit 13 uses the first feature amount and the second feature amount to determine whether a change in the presence or absence of a point has occurred between the first coordinate and the second coordinate (step S13; determination step). Note that either of steps S11 and S12 may be executed first, or both may be performed in parallel.
[Effects]
As described above, when the presence of a point is unknown at a certain location of the second point cloud, the second feature calculation unit 12 calculates the second feature to reflect that state. Then, the determination unit 13 makes a determination that reflects the first feature and the second feature. Therefore, for a location where erroneous point information is indicated, such as a point not existing (or existing) in the second point cloud even though a point actually exists (or does not exist) in the second point cloud, the determination unit 13 can determine that the presence of that location is unknown. It is estimated that this determination result has a higher accuracy than when the erroneous point information is used as is. Therefore, even if the second point cloud to be compared with the first point cloud does not indicate accurate information, the change detection device 10 can improve the accuracy of detecting changes between the point clouds.
Note that the determination unit 13 may detect a change in the presence or absence of points in a predetermined region of the point clouds by executing the above determination at a plurality of corresponding coordinates in the first point cloud and the second point cloud. For example, the determination unit 13 may execute the above determination for all coordinates of the first point cloud or all coordinates of the second point cloud. This makes it possible to detect changes in the presence or absence of points across the entire first or second point cloud; a sketch of such a loop follows this paragraph. Therefore, for example, when the first and second point clouds are data measured at the same place, the change detection device 10 can identify locations that have changed between the two point clouds. A changed location is, for example, a location where an object that existed in one of the two point clouds no longer exists in the other.
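Building on the sketches above, whole-cloud change detection can be a loop that repeats the per-coordinate judgment over every coordinate of two aligned occupancy grids. Here compute_feature and changed_vector are the hypothetical helpers from the earlier sketches, and the assumption that both grids share one coordinate frame and shape is made for simplicity.

```python
import numpy as np
from itertools import product

def detect_changes(grid1: np.ndarray, grid2: np.ndarray, threshold: float = 0.8):
    # Run the judgment at every pair of corresponding coordinates and
    # collect those where the presence of a point is judged to have changed.
    assert grid1.shape == grid2.shape
    changed = []
    for coord in product(*(range(s) for s in grid1.shape)):
        f1 = compute_feature(grid1, coord)   # feature in the first point cloud
        f2 = compute_feature(grid2, coord)   # feature in the second point cloud
        if changed_vector(f1, f2, threshold):
            changed.append(coord)
    return changed
```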
The change detection device 10 may further include a detection unit that detects, based on the above determination results of the determination unit 13, that a change in the presence or absence of an object has occurred between the first point cloud and the second point cloud. A specific detection method is described later in the second embodiment.
The first feature amount calculation unit 11 may also calculate the first feature amount using information indicating, for each first small region in the first region, that a point is present, that a point is absent, or that the presence of a point is unknown. The definition of the presence of a point being unknown is as described above. This allows states in which the presence of a point in the first point cloud is unknown to be reflected in change detection, further improving the change detection accuracy of the change detection device 10.
The change detection device 10 may further include an output unit that outputs the determination result of the determination unit 13 to the inside or outside of the change detection device 10. For example, the output unit can output the determination result data to the outside of the change detection device 10 (for example, a monitor) in a visible format such as an image or a point cloud, visually emphasizing locations where the determination unit 13 has determined that a change in the presence or absence of a point has occurred. This processing may be executed for all locations where the determination unit 13 has determined that such a change has occurred, thereby presenting to the user the locations where the presence or absence of an object has changed between the two point clouds. Examples of "visual emphasis" include, but are not limited to: surrounding the location where the presence or absence of a point (or of an object) has changed with a frame; displaying the outline of that location in a color different from the outlines of other objects (for example, red); blinking the location; and filling in the location so that it is displayed like a shadow. The output unit may also output the determination result of the determination unit 13 by sound or the like via a speaker. In this way, the output unit can output an alert by image or sound. The output unit may also output the determination result of the determination unit 13 to another device. Furthermore, the output unit may output, as described above, a detection result obtained by detecting a change in the presence or absence of an object between the first point cloud and the second point cloud.
(1B)
Next, in (1B), a change detection system will be described. FIG. 4 is a block diagram showing an example of a change detection system. The change detection system 20 includes a feature amount calculation device 21 and a determination device 22. The feature amount calculation device 21 has the first feature amount calculation unit 11 and the second feature amount calculation unit 12, and the determination device 22 has the determination unit 13 and the output unit 14. The first feature amount calculation unit 11 through the determination unit 13 execute the same processing as shown in (1A). When the feature amount calculation device 21 generates the first and second feature amounts, information on the generated feature amounts is output to the determination device 22. The determination unit 13 of the determination device 22 executes the processing shown in (1A) using that feature amount information. The output unit 14 of the determination device 22 is the output unit described in (1A), and outputs the determination result of the determination unit 13 to the inside or outside of the determination device 22.
As described above, the change detection processing according to the present disclosure may be realized by a single device as shown in (1A), or as a system in which the processing to be executed is distributed among a plurality of devices as shown in (1B). Note that the device configuration shown in (1B) is merely an example. As another example, a first device may have the first feature amount calculation unit 11, and a second device may have the second feature amount calculation unit 12 and the determination unit 13. The first device may also have an acquisition unit that acquires the first point cloud. Alternatively, three separate devices may be provided, having the first feature amount calculation unit 11, the second feature amount calculation unit 12, and the determination unit 13, respectively. Each of these devices may further have an acquisition unit that acquires the first point cloud, an acquisition unit that acquires the second point cloud, or the output unit 14.
As yet another example, part or all of the change detection system 20 may be provided in a cloud server built on the cloud, or in another type of virtualized server generated using virtualization technology or the like. Functions other than those provided in such a server are placed at the edge. For example, in a system that monitors video captured at a site via a network, the edge is a device placed at or near the site and, in terms of the network hierarchy, a device close to the terminals.
Embodiment 2
In the following Embodiment 2, specific examples of the change detection method described in Embodiment 1 are disclosed. However, specific examples of the change detection method shown in Embodiment 1 are not limited to those shown below. Furthermore, the configurations and processes described below are merely examples, and the present disclosure is not limited to them.
(2A)
[Configuration Description]
FIG. 5 is a block diagram showing an example of a monitoring system. The monitoring system 100 includes a plurality of robots 101A, 101B, and 101C (hereinafter collectively referred to as the robots 101), a base station 110, a center server 120, and a reference point cloud DB 130. In the example of FIG. 5, the robots 101 are provided on the edge side (site side) of the monitoring system 100, and the center server 120 is located away from the site (on the cloud side). Each device is described below.
The robot 101 functions as a terminal that measures predetermined locations while moving around the site to be monitored in order to inspect infrastructure equipment for failures or abnormalities. In other words, the robot 101 is an edge device connected to a network; it has a LiDAR 102 and can measure an arbitrary location. The robot 101 transmits the measured point cloud data to the center server 120 via the base station 110. In this example, the robot 101 transmits the point cloud data over a wireless line, although the data may also be transmitted over a wired line. The robot 101 may transmit the point cloud data acquired by the measurement of the LiDAR 102 to the center server 120 as-is, or may apply appropriate preprocessing to the acquired point cloud data before transmitting it to the center server 120.
Furthermore, the robot 101 may transmit information indicating the measurement position to the center server 120 together with the point cloud data acquired at that position. The robot 101 has functions such as AMCL (Adaptive Monte Carlo Localization) or SLAM (Simultaneous Localization and Mapping), estimates its own position using these functions, and transmits that information to the center server 120. As another example, the robot 101 may acquire information indicating the measurement position using a satellite positioning system function such as GPS (Global Positioning System) and transmit that information to the center server 120. As yet another example, the movement route and measurement points of the robot 101 may be determined in advance and shared beforehand between the robot 101 and the center server 120, so that the center server 120 knows the position of the robot 101.
The robot 101 may be, for example, an AGV (Automatic Guided Vehicle) that travels under the control of the center server 120, an AMR (Autonomous Mobile Robot) capable of autonomous movement, a drone, or the like, but is not limited to these.
The base station 110 transfers the point clouds transmitted from each robot 101 to the center server 120 via a network. For example, the base station 110 is a local 5G (5th Generation) base station, a 5G gNB (next Generation Node B), an LTE eNB (evolved Node B), a wireless LAN access point, or the like, but may also be another relay device. The network is, for example, a core network such as a 5GC (5th Generation Core network) or an EPC (Evolved Packet Core), the Internet, or the like.
Note that a server other than the center server 120 may be connected to the base station 110. For example, an MEC (Multi-access Edge Computing) server may be connected to the base station 110. The MEC server can, for example, control the bit rate of the data each robot 101 transmits by assigning a bit rate to the data each robot 101 sends to the base station 110 and transmitting that information to each robot 101. Furthermore, by having the MEC server transmit the bit rate information of each robot 101 to the center server 120, the center server 120 can grasp the bit rate information.
FIG. 6 is a block diagram showing an example of the center server 120. The center server 120 includes a reference point cloud acquisition unit 121, an input point cloud acquisition unit 122, a reference feature calculation unit 123, an input feature calculation unit 124, a change detection unit 125, and a detection result generation unit 126. The center server 120 can detect changes and generate comparison results by comparing each piece of point cloud data measured and acquired by the robots 101 with reference data. Each part of the center server 120 is described below.
The reference point cloud acquisition unit 121 acquires a reference point cloud to be compared with the point cloud acquired by the measurement of the robot 101. The reference point cloud is a point cloud previously acquired by measuring the location that the robot 101 has measured, and is stored in the reference point cloud DB 130. The reference point cloud is data representing the shapes of objects in three-dimensional space.
For example, when the input point cloud acquisition unit 122 described later acquires the point cloud data transmitted by the robot 101 (hereinafter also referred to as the input point cloud), the center server 120 also acquires information on the measurement position of that input point cloud. The reference point cloud DB 130 stores each reference point cloud in association with the position information at which it was measured. The reference point cloud acquisition unit 121 compares the measurement position information of the input point cloud with the measurement position information stored in the reference point cloud DB 130. When a match is found among the measurement position information stored in the reference point cloud DB 130, the reference point cloud acquisition unit 121 acquires the reference point cloud associated with the matching measurement position information. In this way, the reference point cloud acquisition unit 121 acquires, by searching the reference point cloud DB 130, the reference point cloud data to be compared with the point cloud measured and acquired by the robot 101.
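As one way to realize this lookup, the DB search can be sketched as a nearest-position match. The in-memory structure, field names, and distance tolerance below are hypothetical illustrations, not the actual interface of the reference point cloud DB 130:

```python
import numpy as np

# Hypothetical in-memory stand-in for the reference point cloud DB 130:
# each record pairs a measurement position with the point cloud measured there.
reference_db = [
    {"position": np.array([10.0, 5.0, 0.0]), "points": np.zeros((0, 3))},
    # ... more records ...
]

def find_reference_cloud(measured_position, tolerance=0.5):
    """Return the reference point cloud whose stored measurement position
    matches the input cloud's measurement position, or None if none matches."""
    for record in reference_db:
        if np.linalg.norm(record["position"] - np.asarray(measured_position)) <= tolerance:
            return record["points"]
    return None
```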
Note that the previously measured and acquired data may be stored in the reference point cloud DB 130 as images rather than as point clouds. In this case, when the reference point cloud acquisition unit 121 identifies matching measurement position information through the above search of the reference point cloud DB 130, it can acquire the reference point cloud by acquiring the image associated with the matching measurement position information and converting the data format of the image into a point cloud.
The input point cloud acquisition unit 122 acquires, via the communication unit of the center server 120, the input point cloud acquired by the measurement of the robot 101. This allows the input point cloud acquisition unit 122 to acquire point cloud data measured in real time. When the robot 101 captures an image and transmits the image data to the center server 120, the input point cloud acquisition unit 122 can acquire the input point cloud by converting the data format of the image into a point cloud. Like the reference point cloud, the input point cloud is data representing the shapes of objects in three-dimensional space. However, the input point cloud and the reference point cloud differ in the respects described below.
When executing real-time point cloud comparison processing using the reference point cloud and the input point cloud acquired as described above, the following two issues are expected. The first is that the point densities of the reference point cloud and the input point cloud may differ; the second is that the measurement position of the robot 101 may deviate from the measurement position of the reference point cloud. Each issue is explained below.
(1) FIGS. 7A and 7B show examples of a reference point cloud and an input point cloud. The reference point cloud in FIG. 7A is 3D data measured using a sensor before the robot 101 performs the measurement for the input point cloud. "Before the measurement for the input point cloud" may be when the robot is deployed at the site, during an on-site check before the robot starts operating there, or before work begins, such as on the morning of an operating day. FIG. 7B shows 3D data acquired by the robot 101 through measurement. FIGS. 7A and 7B are data measured inside a warehouse; racks L in the warehouse appear as point clouds. In FIG. 7B, an object OB that is not in FIG. 7A also appears as a point cloud.
Comparing FIGS. 7A and 7B, the points of the reference point cloud T1 are dense, whereas those of the input point cloud T2 are sparse. This is because the input point cloud T2 is measured in real time and its data volume is therefore smaller. In other words, compared with the reference point cloud T1, the input point cloud T2 may fail to record point information at coordinates where an object exists and a point should be present. The point densities of the reference point cloud T1 and the input point cloud T2 may also vary with their measurement environments. Thus, in actual use, the point densities of the reference and input point clouds are expected to differ greatly, and directly comparing the two point clouds may make it difficult to detect changes between them accurately. It is therefore desirable to improve the accuracy of detecting changes between the point clouds even when the input point cloud, owing to the change in point density, does not represent accurate information compared with the reference point cloud.
(2) FIG. 8A shows an example of a situation in which the reference point cloud is acquired by measurement, and FIG. 8B shows an example of a situation in which the input point cloud is acquired by measurement. In FIG. 8A, a location I1 with factory equipment is measured by a dedicated sensor S. The dedicated sensor S acquires a point cloud using LiDAR, and its measurement range is indicated by G11. Gas tanks T1 and T2 are included within G11. In FIG. 8B, on the other hand, the same location I1 is measured by the LiDAR 102 of the robot 101, and the measurement range of the LiDAR 102 is indicated by G12. Within G12, in addition to the gas tanks T1 and T2, there is an object L1 that was not present in FIG. 8A. In such a situation, it is preferable that the comparison of the reference point cloud and the input point cloud detects this object L1 as a change and does not detect the other objects as changes. In other words, it is preferable to improve the accuracy of detecting changes between the point clouds even when the input point cloud, owing to the change in the measurement range, does not represent accurate information compared with the reference point cloud.
In FIG. 8B, the robot 101 is controlled during measurement so that its estimated position coincides with the measurement position of the dedicated sensor S. Ideally, therefore, G11 and G12 cover the same range. In practice, however, the error of the robot 101's self-position estimation can become large depending on the environment in which the robot 101 is placed. The error can also become large when the position of the robot 101 cannot be corrected under the control of the center server 120. In such cases, as shown in FIG. 8B, a deviation arises between G11 and G12. This deviation is of two kinds: a deviation of the measurement position itself (hereinafter also referred to as a translational deviation) and a deviation of the measurement direction itself (hereinafter also referred to as a rotational deviation).
FIG. 8C shows the ideal comparison result between the reference point cloud and the input point cloud, and FIG. 8D shows the actual comparison result. When G11 and G12 cover the same range, comparing the reference point cloud and the input point cloud detects only the object L1 as a change, as shown in FIG. 8C, and no other object is detected. In reality, however, a deviation arises between G11 and G12 as described above. Consequently, as shown in FIG. 8D, not only the object L1 but also the gas tanks T1 and T2 may be detected as changes between the reference point cloud and the input point cloud. A deviation of the measurement range can thus degrade the change detection accuracy.
The change detection method proposed in the present disclosure makes it possible to solve these problems through the processing described below.
Returning to FIG. 6, the description continues. The reference feature calculation unit 123 corresponds to the first feature amount calculation unit 11 in Embodiment 1 and calculates a feature for each coordinate in the reference point cloud. This feature is a vectorized representation of information on the presence or absence of points at the coordinate to be calculated and at its surrounding coordinates, and is hereinafter also referred to as a spatial feature. The method by which the reference feature calculation unit 123 calculates the spatial feature is described in detail below.
FIG. 9A is a diagram showing a polar coordinate system centered on a coordinate of the reference point cloud for which the feature is to be calculated. When the coordinate F1 to be calculated is placed at the center O, the coordinates included in a predetermined region including the coordinate F1 (the first region in Embodiment 1) can be identified using the distance r from the center O and the angles (θ, φ) as parameters.
In this example, the reference feature calculation unit 123 acquires, in the reference point cloud, the point cloud data within a spherical region S1 of radius ε centered on the coordinate F1. It then calculates the parameters (r, θ, φ) in the polar coordinate system shown in FIG. 9A for each coordinate in the spherical region S1.
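For instance, each point within the spherical region can be converted to these parameters relative to the center F1. A minimal sketch, assuming the usual spherical-coordinate conventions (θ as azimuth, φ as polar angle):

```python
import numpy as np

def to_polar(point, center):
    """Express a 3D point as (r, theta, phi) relative to the given center:
    r is the distance, theta the azimuth, phi the polar angle."""
    v = np.asarray(point, dtype=float) - np.asarray(center, dtype=float)
    r = float(np.linalg.norm(v))
    theta = float(np.arctan2(v[1], v[0]))               # azimuth in [-pi, pi)
    phi = float(np.arccos(v[2] / r)) if r > 0 else 0.0  # polar angle in [0, pi]
    return r, theta, phi
```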
The reference feature calculation unit 123 divides the spherical region S1 into a plurality of small regions SR1 (the first small regions in Embodiment 1) so that each coordinate falls within one of the small regions SR1. Here, the spherical region S1 is divided so that the distance r and the angles (θ, φ) of each small region SR1 take discretized values. In this example, the angle θ is divided in units of 2α (rad), the angle φ in units of α (rad), and the distance r in units of d, where 2α is a value of at most π and d is a value of at most ε/2. The spherical region S1 is then divided into

Equation 1
$$\left\lfloor \frac{2\pi}{2\alpha} \right\rfloor \times \left\lfloor \frac{\pi}{\alpha} \right\rfloor \times \left\lfloor \frac{\varepsilon}{d} \right\rfloor \tag{1}$$

or

Equation 2
$$\left\lceil \frac{2\pi}{2\alpha} \right\rceil \times \left\lceil \frac{\pi}{\alpha} \right\rceil \times \left\lceil \frac{\varepsilon}{d} \right\rceil \tag{2}$$

small regions SR1, where (1) and (2) express the number of small regions SR1 using the floor function and the ceiling function, respectively. The reference feature calculation unit 123 calculates a vectorized spatial feature by defining the presence or absence of a point in each partitioned small region SR1 as an element of the spatial feature. For example, a vector notation in which an element is 1 when a point exists in the region and 2 when it does not is assumed, but the vector notation is not limited to this example.
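Under this discretization, the reference-side spatial feature can be built as an array with one element per small region SR1. The sketch below assumes the to_polar helper above and the encoding 1 = point present, 2 = no point; the exact bin-indexing scheme is an assumption:

```python
import numpy as np

def reference_feature(points, center, eps, alpha, d):
    """Spatial feature of `center` in the reference point cloud: each
    (theta, phi, r) bin of the sphere of radius eps is marked 1 if some
    point falls in it and 2 otherwise."""
    n_theta = int(np.floor(2 * np.pi / (2 * alpha)))  # theta bins of width 2*alpha
    n_phi = int(np.floor(np.pi / alpha))              # phi bins of width alpha
    n_r = int(np.floor(eps / d))                      # r bins of width d
    feature = np.full((n_theta, n_phi, n_r), 2, dtype=np.int8)  # default: no point
    for p in points:
        r, theta, phi = to_polar(p, center)
        if 0 < r <= eps:
            i = min(int((theta + np.pi) / (2 * alpha)), n_theta - 1)
            j = min(int(phi / alpha), n_phi - 1)
            k = min(int(r / d), n_r - 1)
            feature[i, j, k] = 1  # a point exists in this small region SR1
    return feature
```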
FIG. 9B is an example of the spatial feature calculated for a coordinate of the reference point cloud. In FIG. 9B, for the spherical region S1 whose center O is the coordinate F1, a black circle indicates that a small region SR1 contains a point and a white circle indicates that it does not. FIG. 9B also shows a coordinate H1 on the spherical region S1, which is used as an example in the later explanation of the spatial feature calculation.
Furthermore, the reference feature calculation unit 123 sets a neighborhood region δ1 of the coordinate F1 in the reference point cloud. The neighborhood region δ1 is a region containing at least one coordinate other than the coordinate F1; for example, it is set as a region of radius λ1 in the polar coordinate system centered on the coordinate F1, although the setting of the neighborhood region δ1 is not limited to this. The reference feature calculation unit 123 then calculates a spatial feature for each coordinate other than the coordinate F1 included in the neighborhood region δ1, in the same manner as the calculation of the spatial feature at the coordinate F1 described above. That is, for each coordinate other than the coordinate F1 included in the neighborhood region δ1, the reference feature calculation unit 123 defines, by the same method as for the spherical region S1 and the small regions SR1, a spherical region (a third region) that contains the coordinate and consists of a plurality of small regions (third small regions), and calculates the spatial feature using information on the presence or absence of a point in each small region.
Next, the calculation method of the input feature calculation unit 124 is described. The input feature calculation unit 124 corresponds to the second feature amount calculation unit 12 in Embodiment 1 and calculates a spatial feature for each coordinate in the input point cloud. This spatial feature is a vectorized representation of information indicating, for the coordinate to be calculated and its surrounding coordinates, that a point exists, that no point exists, or that the presence of a point is unknown.
When the coordinate F2 to be calculated in the input point cloud is placed at the center O, the input feature calculation unit 124 identifies the coordinates included in a predetermined region including the coordinate F2 (the second region in Embodiment 1) using the distance r from the center O and the angles (θ, φ) as parameters. The coordinate F2 is the coordinate whose spatial feature is to be compared with that of the coordinate F1; here, F1 and F2 indicate the same coordinates in the reference point cloud and the input point cloud.
The input feature calculation unit 124 acquires, in the input point cloud, the point cloud data within a spherical region S2 of radius ε centered on the coordinate F2. The size of the spherical region S2 is the same as that of the spherical region S1. The parameters (r, θ, φ) in the polar coordinate system shown in FIG. 9A are then calculated for each coordinate in the spherical region.
The input feature calculation unit 124 divides the spherical region S2 into a plurality of small regions SR2 (the second small regions in Embodiment 1) so that each coordinate falls within one of the small regions SR2. The spherical region S2 is divided in the same manner as the spherical region S1: the angle θ is divided in units of 2α (rad), the angle φ in units of α (rad), and the distance r in units of d, where 2α is a value of at most π and d is a value of at most ε/2. The spherical region S2 is therefore divided into

Equation 3
$$\left\lfloor \frac{2\pi}{2\alpha} \right\rfloor \times \left\lfloor \frac{\pi}{\alpha} \right\rfloor \times \left\lfloor \frac{\varepsilon}{d} \right\rfloor \tag{3}$$

or

Equation 4
$$\left\lceil \frac{2\pi}{2\alpha} \right\rceil \times \left\lceil \frac{\pi}{\alpha} \right\rceil \times \left\lceil \frac{\varepsilon}{d} \right\rceil \tag{4}$$

small regions SR2, where (3) and (4) are the same expressions as (1) and (2), respectively. The input feature calculation unit 124 calculates a vectorized spatial feature by defining, as an element of the spatial feature, information indicating that a point exists, that no point exists, or that the presence of a point is unknown in each partitioned small region SR2. For example, a vector notation in which an element is 1 when a point exists in the region, 2 when it does not, and 0 when the presence of a point is unknown is assumed, but the vector notation is not limited to this example.
Here, the input feature calculation unit 124 determines the information indicating that a point exists, that no point exists, or that the presence of a point is unknown in each small region SR2 as follows. When a small region SR2 contains a point, the input feature calculation unit 124 defines that small region SR2 as containing a point. When a small region SR2 contains no point, the input feature calculation unit 124 determines whether that small region SR2 lies between the position of a point in the input point cloud at measurement time and the position of the LiDAR 102. If the small region SR2 lies between the position of a point in the input point cloud at measurement time and the position of the LiDAR 102, the input feature calculation unit 124 defines that no point exists in the small region SR2, since the laser must have passed through it. If the small region SR2 does not lie between the position of a point in the input point cloud at measurement time and the position of the LiDAR 102, the presence of a point in the small region SR2 is defined as unknown. Ray tracing techniques can be applied to this definition.
As described above, the point density of the input point cloud is sparser than that of the reference point cloud. Consequently, an object captured in the reference point cloud may not be captured accurately in the input point cloud, and some points of the object may not be recorded in the input point cloud. Therefore, when it cannot be determined that no point actually exists, it is preferable to define the presence of a point as unknown for a small region SR2 that contains no point.
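One simplified way to realize this three-way labeling is sketched below: a bin containing a measured point is "present"; an empty bin lying on the segment between the LiDAR and some measured point was traversed by the laser and is "absent"; everything else is "unknown". The segment test and its tolerance are assumptions, not the exact ray-tracing procedure:

```python
import numpy as np

PRESENT, ABSENT, UNKNOWN = 1, 2, 0  # encoding for the input-side feature

def classify_bin(bin_center, has_point, measured_points, sensor_pos, tol=0.05):
    """Label one small region SR2 as present, absent, or unknown."""
    if has_point:
        return PRESENT
    sensor_pos = np.asarray(sensor_pos, dtype=float)
    for p in measured_points:
        seg = np.asarray(p, dtype=float) - sensor_pos
        seg_len2 = float(np.dot(seg, seg))
        if seg_len2 == 0.0:
            continue
        # Projection of the bin center onto the sensor-to-point segment.
        t = float(np.dot(np.asarray(bin_center) - sensor_pos, seg)) / seg_len2
        if 0.0 < t < 1.0:
            closest = sensor_pos + t * seg
            if np.linalg.norm(np.asarray(bin_center) - closest) <= tol:
                return ABSENT  # the laser passed through this region
    return UNKNOWN  # possibly occluded; no evidence either way
```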
FIG. 9C is an example of the spatial feature calculated for a coordinate of the input point cloud. In FIG. 9C, for the spherical region S2 whose center O is the coordinate F2, a black circle indicates that a small region SR2 contains a point, a white circle that it does not, and a triangle that the presence of a point is unknown. FIG. 9C also shows a coordinate H2 on the spherical region S2, which is used as an example in the later explanation of the spatial feature calculation. The coordinates H1 and H2 indicate the same coordinates in the reference point cloud and the input point cloud.
Furthermore, the input feature calculation unit 124 sets a neighborhood region δ2 of the coordinate F2 in the input point cloud. The neighborhood region δ2 is a region containing at least one coordinate other than the coordinate F2; for example, it is set as a region of radius λ2 in the polar coordinate system centered on the coordinate F2, although the setting of the neighborhood region δ2 is not limited to this. The input feature calculation unit 124 then calculates a spatial feature for each coordinate other than the coordinate F2 included in the neighborhood region δ2, in the same manner as the calculation of the spatial feature at the coordinate F2 described above. That is, for each coordinate other than the coordinate F2 included in the neighborhood region δ2, the input feature calculation unit 124 defines, by the same method as for the spherical region S2 and the small regions SR2, a spherical region (a fourth region) that contains the coordinate and consists of a plurality of small regions (fourth small regions), and calculates the spatial feature using information indicating that a point exists, that no point exists, or that the presence of a point is unknown in each small region.
The size of the neighborhood region δ2 in the input point cloud may be the same as or different from the size of the neighborhood region δ1 in the reference point cloud. That is, in this example, the radius λ2 may have the same length as the radius λ1 or a different length.
In this way, the reference feature calculation unit 123 and the input feature calculation unit 124 calculate the above spatial features at each coordinate of the reference point cloud and the input point cloud.
Returning to FIG. 6, the description continues. The change detection unit 125 corresponds to the determination unit 13 in Embodiment 1. The change detection unit 125 calculates a similarity by comparing the spatial features of the coordinate F1 in the reference point cloud and the coordinate F2 in the input point cloud. Using that similarity, it determines whether the presence or absence of a point at the coordinate F1 has changed at the coordinate F2 between the reference point cloud and the input point cloud.
The change detection unit 125 can execute the following two kinds of determination processing:
(I) Determine whether a point that did not exist at the coordinate F1 of the reference point cloud now exists at the coordinate F2 of the input point cloud.
(II) Determine whether a point that existed at the coordinate F1 of the reference point cloud no longer exists at the coordinate F2 of the input point cloud.
Details of processes (I) and (II) are explained below.
In process (I), the change detection unit 125 calculates the similarity between the feature of the coordinate F1 in the reference point cloud calculated by the reference feature calculation unit 123 and the feature of the coordinate F2 in the input point cloud calculated by the input feature calculation unit 124. Here, let P1 denote the feature of the coordinate F1, P2 the feature of the coordinate F2, and S(P1, P2) the similarity between the two.
The change detection unit 125 calculates S(P1, P2) as follows.
Equation 5
$$S(P1, P2) = \frac{1}{\mathrm{ValidNum}} \sum_{\varphi}\sum_{\theta}\sum_{r} \mathrm{Score}\left(P1_{\varphi\theta r},\ P2_{\varphi\theta r}\right) \tag{5}$$

Equation 6
$$\mathrm{Score}\left(P1_{\varphi\theta r},\ P2_{\varphi\theta r}\right) = \begin{cases} 1 & \text{if } \mathrm{Same_{around}}\left(P1_{\varphi\theta r},\ P2_{\varphi\theta r}\right) \text{ exists} \\ 0 & \text{otherwise} \end{cases} \tag{6}$$

Equation 7
$$\mathrm{Same_{around}}\left(P1_{\varphi\theta r},\ P2_{\varphi\theta r}\right):\ \exists\, d_1, d_2\ \left(-\rho_1 \le d_1 \le \rho_1,\ -\rho_2 \le d_2 \le \rho_2\right)\ \text{such that}\ P1_{(\varphi+d_1)(\theta+d_2)r} = P2_{\varphi\theta r} \tag{7}$$

In (5) to (7), θ, φ, and r are discretized, as described above, in units of 2α, α, and d, respectively. P1φθr on the right-hand side of (5) denotes one element of the feature P1, namely the element of the spatial feature defined for a small region SR1 in the spherical region S1. P2φθr denotes one element of the feature P2, namely the element of the spatial feature defined for a small region SR2 in the spherical region S2. P1φθr and P2φθr are the feature elements of the small regions SR1 and SR2 located at the same position when the reference point cloud and the input point cloud are compared.
ValidNum in (5) is the number of small regions SR2 in the spherical region S2 other than those defined as having unknown point presence. Accordingly, (5) indicates that Score(P1φθr, P2φθr) is summed over all small regions SR2 in the spherical region S2 (equivalently, all small regions SR1 in the spherical region S1) and that the sum is normalized by ValidNum. Small regions SR2 defined as having unknown point presence are not evaluated in the similarity calculation (i.e., they are ignored in the computation).
(6) gives the definition of Score(P1φθr, P2φθr): it is 1 when Same_around(P1φθr, P2φθr) exists for P1φθr and P2φθr, and 0 otherwise.
(7) indicates that Same_around(P1φθr, P2φθr) exists when P2φθr takes the same value as some element P1(φ+d1)(θ+d2)r whose φ lies in the range φ1−ρ1 to φ1+ρ1 and whose θ lies in the range θ2−ρ2 to θ2+ρ2. Here, φ1 and θ2 are the values of φ and θ of the small region SR1 of P1φθr, and ρ1 and ρ2 are the allowable deviations for φ and θ, respectively, defined so as to include at least the small regions SR1 adjacent to the small region SR1 of P1φθr. In this way, (7) compares not only P1φθr (the spatial feature element) of the small region SR1 but also the spatial feature elements of the first small regions included in the area surrounding that small region SR1 with the spatial feature element of the corresponding small region SR2 in the input point cloud.
When the presence of a point is unknown in the small region SR1 or SR2, Same_around(P1φθr, P2φθr) is not defined, and Score(P1φθr, P2φθr) in (6) is 0.
FIG. 9D is a diagram for explaining the above equation (7) with the examples of FIGS. 9B and 9C. It shows a part of the spherical region S1 near the coordinate H1 in FIG. 9B and a part of the spherical region S2 near the coordinate H2 in FIG. 9C. The element of the spatial feature for the small region SR1 containing the coordinate H1 is P1φθr, and the element for the small region SR2 containing the coordinate H2 is P2φθr. In FIG. 9D, the elements of the spatial feature for the small regions SR1 adjacent in the φ direction to the small region SR1 containing the coordinate H1 are P1(φ+1)θr and P1(φ-1)θr. Furthermore, in FIG. 9D, small regions SR1 adjacent in the r direction are defined for each of the small region SR1 containing the coordinate H1, the small region SR1 of P1(φ+1)θr, and the small region SR1 of P1(φ-1)θr; the elements of the spatial feature for these small regions SR1 are P1φθ(r-1), P1(φ+1)θ(r-1), and P1(φ-1)θ(r-1), respectively.
In the example of FIG. 9D, P1φθr indicates that no point exists, while P2φθr indicates that a point exists, so the two are not the same element. However, the range of φ given by equation (7) includes P1(φ+1)θr and P1(φ-1)θr. In this example, P1(φ-1)θr indicates that a point exists, so P1(φ-1)θr and P2φθr are the same element. Therefore, in the example of FIG. 9D, Same_around(P1φθr, P2φθr) in (7) exists, and Score(P1φθr, P2φθr) in (6) is 1.
In this way, the change detection unit 125 executes the matching process of calculating the similarity S(P1, P2) in equation (5) by computing Score(P1φθr, P2φθr) for all φ, θ, and r.
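A sketch of this matching over two feature arrays shaped like those built above, following (5) to (7) and reusing the 1/2/0 encoding; the neighbor tolerances rho1 and rho2 are given in bin units, and the boundary handling is an assumption:

```python
def similarity(P1, P2, rho1=1, rho2=1):
    """S(P1, P2) per equations (5)-(7): an input-side bin scores 1 when it
    matches some reference bin within the allowed angular deviation; bins
    whose point presence is unknown (0) are excluded via ValidNum."""
    n_theta, n_phi, n_r = P2.shape
    score_sum = 0
    valid_num = 0
    for i in range(n_theta):
        for j in range(n_phi):
            for k in range(n_r):
                if P2[i, j, k] == UNKNOWN:  # not evaluated (ignored)
                    continue
                valid_num += 1
                found = False
                # Same_around: search the surrounding reference bins.
                for di in range(-rho1, rho1 + 1):
                    for dj in range(-rho2, rho2 + 1):
                        ii, jj = i + di, j + dj
                        if (0 <= ii < n_theta and 0 <= jj < n_phi
                                and P1[ii, jj, k] != UNKNOWN
                                and P1[ii, jj, k] == P2[i, j, k]):
                            found = True
                            break
                    if found:
                        break
                score_sum += 1 if found else 0
    return score_sum / valid_num if valid_num else 0.0
```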
The change detection unit 125 also calculates, for each coordinate other than the coordinate F1 included in the neighborhood region δ1 set by the reference feature calculation unit 123, a similarity S(Pn, P2) using the same calculation method as for S(P1, P2). Here, Pn is the feature at each coordinate other than the coordinate F1, calculated by the reference feature calculation unit 123 in the above processing. Hereinafter, S(PN, P2) denotes the set of similarities S within the neighborhood region δ1, including S(P1, P2) and S(Pn, P2).
After calculating all the similarities S(PN, P2) within the neighborhood region δ1 in this way, the change detection unit 125 identifies the maximum value Smax(PN, P2) among S(PN, P2). The coordinate of the reference point cloud associated with this Smax(PN, P2) is, among the coordinates included in the neighborhood region δ1 of the reference point cloud, the coordinate whose state of point presence or absence is most similar to that of the coordinate F2 in the input point cloud. The change detection unit 125 compares the magnitude of Smax(PN, P2) with a predetermined threshold ThS1.
If Smax(PN, P2) is less than or equal to ThS1, the change detection unit 125 determines that the coordinates included in the neighborhood region δ1 have low similarity to the coordinate F2, and therefore determines that a point that did not exist at the coordinate F1 of the reference point cloud now exists at the coordinate F2 of the input point cloud. If Smax(PN, P2) is greater than ThS1, the change detection unit 125 does not make this determination.
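The decision for process (I) can then be written compactly, building on the similarity sketch above; `neighborhood_features` stands for the features PN of the coordinates in δ1, and `th_s1` for the threshold ThS1 (both names are hypothetical):

```python
def new_point_appeared(neighborhood_features, P2, th_s1):
    """Process (I): True when a point absent at F1 in the reference cloud
    has appeared at F2 in the input cloud."""
    s_max = max(similarity(Pn, P2) for Pn in neighborhood_features)
    return s_max <= th_s1  # low best similarity -> a new point appeared
```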
Next, process (II) is described. The change detection unit 125 executes the matching process of calculating S(P1, P2), the similarity between the feature of the coordinate F1 and the feature of the coordinate F2. This calculation is the same as in (I), so its explanation is omitted.
The change detection unit 125 also calculates, for each coordinate other than the coordinate F2 included in the neighborhood region δ2 set by the input feature calculation unit 124, a similarity S(P1, Pm) using the same calculation method as for S(P1, P2). Here, Pm is the feature at each coordinate other than the coordinate F2, calculated by the input feature calculation unit 124 in the above processing. Hereinafter, S(P1, PM) denotes the set of similarities S within the neighborhood region δ2, including S(P1, P2) and S(P1, Pm).
After calculating all the similarities S(P1, PM) within the neighborhood region δ2 in this way, the change detection unit 125 identifies the maximum value Smax(P1, PM) among S(P1, PM). The coordinate of the input point cloud associated with this Smax(P1, PM) is, among the coordinates included in the neighborhood region δ2 of the input point cloud, the coordinate whose state of point presence or absence is most similar to that of the coordinate F1 in the reference point cloud. The change detection unit 125 compares the magnitude of Smax(P1, PM) with a predetermined threshold ThS2.
If Smax(P1, PM) is less than or equal to ThS2, the change detection unit 125 determines that the coordinates included in the neighborhood region δ2 have low similarity to the coordinate F1, and therefore determines that a point that existed at the coordinate F1 of the reference point cloud no longer exists at the coordinate F2 of the input point cloud. If Smax(P1, PM) is greater than ThS2, the change detection unit 125 does not make this determination.
If Smax(PN, P2) is determined to be greater than ThS1 in (I) and Smax(P1, PM) is determined to be greater than ThS2 in (II), the change detection unit 125 determines that the presence or absence of a point at the coordinate F1 has not changed at the coordinate F2; that is, it determines that there is no change between the coordinate F1 and the coordinate F2.
Note that the reference feature calculation unit 123 and the input feature calculation unit 124 can change the sizes of the neighborhood regions δ1 and δ2, respectively, according to the situation. For example, the reference feature calculation unit 123 may enlarge the neighborhood region δ1 (e.g., the radius λ1) as the distance between the position indicated by the coordinate F1 and the position of the dedicated sensor S at measurement time increases. This is because the farther the coordinate F1 of the reference point cloud is from the dedicated sensor S at measurement time, the wider the range over which the above deviation appears, so it is preferable to set the reference-side coordinates to be compared over a wider range. For the same reason, the input feature calculation unit 124 may enlarge the neighborhood region δ2 (e.g., the radius λ2) as the distance between the position indicated by the coordinate F2 and the position of the LiDAR 102 at measurement time increases.
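This distance-dependent scaling might be sketched as a simple linear rule; the base radius and gain below are assumptions:

```python
import numpy as np

def neighborhood_radius(coord, sensor_pos, base_radius=0.2, gain=0.05):
    """Grow the neighborhood radius (lambda1 or lambda2) with the distance
    between the coordinate and the sensor position at measurement time."""
    distance = np.linalg.norm(np.asarray(coord, dtype=float)
                              - np.asarray(sensor_pos, dtype=float))
    return base_radius + gain * distance
```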
The change detection unit 125 executes the above processes (I) and (II) for each coordinate of the reference point cloud and the corresponding coordinate of the input point cloud. This enables the change detection unit 125 to detect changes in the presence or absence of points at all coordinates of the reference and input point clouds. In (I), the coordinates of the input point cloud are taken as the starting points for detecting changes and the presence or absence of a change is determined for each coordinate of the input point cloud, so it is detected whether points not in the reference point cloud have been newly added in the input point cloud. In (II), the coordinates of the reference point cloud are taken as the starting points and the presence or absence of a change is determined for each coordinate of the reference point cloud, so it is detected whether points present in the reference point cloud are no longer present in the input point cloud.
Returning to FIG. 6, the description continues. The detection result generation unit 126 generates an image showing the changes detected by the change detection unit 125 as a result of executing process (I). For example, when the reference point cloud and the input point cloud are acquired in the situations shown in FIGS. 8A and 8B, the detection result generation unit 126 can generate the image shown in FIG. 8C as the result of process (I). The detection result generation unit 126 can likewise generate an image showing the changes detected by the change detection unit 125 as a result of executing process (II). In the images generated in this way, locations determined not to have changed in the comparison of the reference and input point clouds are not displayed, so only the differences between the two can be displayed. The center server 120 may have an interface, such as a display, that shows the user the images generated by the detection result generation unit 126. Instead of an image, the detection result generation unit 126 may generate a 3D point cloud showing the changes detected by the change detection unit 125. Furthermore, like the output unit described in Embodiment 1, the detection result generation unit 126 may execute processing that visually emphasizes, on the generated image or point cloud, the locations where the presence or absence of a point (or of an object) has changed.
[Processing Description]
FIGS. 10A to 10C are flowcharts showing an example of representative processing of the center server 120; these flowcharts outline the processing of the center server 120. The details of each process are as described above and are omitted here as appropriate. FIG. 10A is a flowchart outlining an example of the processing of the center server 120; first, the processing flow is explained with reference to FIG. 10A.
 まず、参照点群取得部121は、参照点群DB130から参照点群のデータを取得する(ステップS21;取得ステップ)。また、入力点群取得部122は、ロボット101が送信したデータを取得することで、入力点群のデータを取得する(ステップS22;取得ステップ)。なお、ステップS21とS22の処理は、どちらが先になされてもよいし、両方の処理が並列になされてもよい。 First, the reference point cloud acquisition unit 121 acquires reference point cloud data from the reference point cloud DB 130 (step S21; acquisition step). The input point cloud acquisition unit 122 acquires input point cloud data by acquiring data transmitted by the robot 101 (step S22; acquisition step). Note that either step S21 or S22 may be performed first, or both steps may be performed in parallel.
 次に、参照特徴量計算部123、入力特徴量計算部124及び変化検出部125は、処理(A)を実行する(ステップS23;処理(A)ステップ)。処理(A)は、上記の処理(I)及びそれに関連する処理を示し、その詳細は図10Bを用いて説明される。 Next, the reference feature calculation unit 123, the input feature calculation unit 124, and the change detection unit 125 execute process (A) (step S23; process (A) step). Process (A) indicates the above process (I) and processes related to it, and the details thereof will be described using FIG. 10B.
 The reference feature calculation unit 123, the input feature calculation unit 124, and the change detection unit 125 also execute process (B) (step S24; process (B) step). Process (B) denotes the above process (II) and its related processing; its details are explained with reference to FIG. 10C. Either of steps S23 and S24 may be performed first, or both may be performed in parallel.
 The detection result generating unit 126 generates, for process (A), an image showing the detected changes based on the information generated by process (A) that indicates the presence or absence of a change at each coordinate. Similarly, the detection result generating unit 126 generates, for process (B), an image showing the detected changes based on the information generated by process (B) that indicates the presence or absence of a change at each coordinate (step S25; detection result image generation step).
 Next, an example of the detailed process (A) of the center server is shown with reference to FIG. 10B. First, the input feature calculation unit 124 calculates the spatial feature of a coordinate in the input point cloud for which a feature has not yet been calculated (step S31; feature calculation step). The input feature calculation unit 124 outputs the information of the calculated coordinate to the reference feature calculation unit 123.
 Based on the output coordinate information, the reference feature calculation unit 123 identifies the corresponding coordinate in the reference point cloud. Here, the coordinate information output by the input feature calculation unit 124 is that of coordinate F2 in the above example, and the coordinate identified by the reference feature calculation unit 123 is coordinate F1 in the above example. The reference feature calculation unit 123 sets a neighborhood region δ1 that includes coordinate F1 and calculates the spatial feature of each coordinate included in the neighborhood region δ1 (step S32; feature calculation step). Although step S31 precedes step S32 in the above description, step S32 may be performed before step S31, or both may be performed in parallel.
 The change detection unit 125 calculates the similarity between coordinate F1 and coordinate F2 using the spatial features calculated in steps S31 and S32 (step S33; similarity calculation step). The change detection unit 125 determines whether a change in the presence or absence of a point has occurred at coordinate F2 by comparing the maximum value of the calculated similarities with a predetermined threshold (step S34; change detection step).
 The change detection unit 125 determines whether the similarity calculation and the change detection determination have been completed for all coordinates in the input point cloud (step S35; completion determination step). If they have not been completed for all coordinates in the input point cloud (No in step S35), the process returns to step S31 and is repeated for a coordinate in the input point cloud whose similarity has not yet been calculated. If they have been completed for all coordinates in the input point cloud (Yes in step S35), process (A) ends. Then, as described above, the detection result generating unit 126 generates a change detection image based on the information generated by process (A) that indicates the presence or absence of a change at each coordinate.
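 As a rough illustration of the loop in process (A), the following Python sketch compares each input coordinate's feature against the features of the corresponding reference coordinate and its neighborhood δ1, and flags a change when the maximum similarity falls below the threshold. All function and variable names are hypothetical; this disclosure does not prescribe an implementation.

```python
# Minimal sketch of process (A); every name here is illustrative only.

def detect_changes_process_a(input_coords, input_feature, reference_feature,
                             neighborhood_delta1, similarity, threshold):
    """Flag input coordinates whose best similarity to the reference is low."""
    changed = []
    for f2 in input_coords:
        feat2 = input_feature(f2)                 # step S31: input-side feature
        f1 = f2                                   # corresponding reference coordinate
        scores = [similarity(reference_feature(c), feat2)   # steps S32-S33
                  for c in neighborhood_delta1(f1)]
        if max(scores) < threshold:               # step S34: low similarity => change
            changed.append(f2)
    return changed
```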
 Next, an example of the detailed process (B) of the center server is shown with reference to FIG. 10C. First, the reference feature calculation unit 123 calculates the spatial feature of a coordinate in the reference point cloud for which a feature has not yet been calculated (step S41; feature calculation step). The reference feature calculation unit 123 outputs the information of the calculated coordinate to the input feature calculation unit 124.
 Based on the output coordinate information, the input feature calculation unit 124 identifies the corresponding coordinate in the input point cloud. Here, the coordinate information output by the reference feature calculation unit 123 is that of coordinate F1 in the above example, and the coordinate identified by the input feature calculation unit 124 is coordinate F2 in the above example. The input feature calculation unit 124 sets a neighborhood region δ2 that includes coordinate F2 and calculates the spatial feature of each coordinate included in the neighborhood region δ2 (step S42; feature calculation step). Although step S41 precedes step S42 in the above description, step S42 may be performed before step S41, or both may be performed in parallel.
 The change detection unit 125 calculates the similarity between coordinate F1 and coordinate F2 using the spatial features calculated in steps S41 and S42 (step S43; similarity calculation step). The change detection unit 125 determines whether a change in the presence or absence of a point has occurred at coordinate F1 by comparing the maximum value of the calculated similarities with a predetermined threshold (step S44; change detection step).
 The change detection unit 125 determines whether the similarity calculation and the change detection determination have been completed for all coordinates in the reference point cloud (step S45; completion determination step). If they have not been completed for all coordinates in the reference point cloud (No in step S45), the process returns to step S41 and is repeated for a coordinate in the reference point cloud whose similarity has not yet been calculated. If they have been completed for all coordinates in the reference point cloud (Yes in step S45), process (B) ends. Then, as described above, the detection result generating unit 126 generates an image showing the detected changes based on the information generated by process (B) that indicates the presence or absence of a change at each coordinate.
 In FIG. 10B, the similarities for all coordinates in the input point cloud may be calculated first, and the change detection determination may then be executed for each coordinate in the input point cloud. Similarly, in FIG. 10C, the similarities for all coordinates in the reference point cloud may be calculated first, and the change detection determination may then be executed for each coordinate in the reference point cloud.
 In the above example, both processes (A) and (B) are executed; however, only one of processes (A) and (B) may be executed, and an image may be generated only for the executed process. In addition, when the processes are executed in sequence, information such as features and similarities calculated in an earlier process can of course be reused in a later process without being recalculated.
 When the robot 101 acquires multiple point clouds by measurement, the input point cloud acquisition unit 122 generates multiple input point clouds. The center server 120 can generate a detection image for each input point cloud by executing the comparison processing described above between each input point cloud and the reference point cloud.
 [Effects]
 As described above, the input feature calculation unit 124 calculates the spatial feature in the input point cloud by vectorizing information indicating that a point exists, that a point does not exist, or that the existence of a point is unknown. Using the calculated spatial features of the reference point cloud and of the input point cloud, the change detection unit 125 can determine whether a change in the presence or absence of a point has occurred between the reference point cloud and the input point cloud. As described above, the point density of the input point cloud is sparser than that of the reference point cloud, and a point may not be recorded at a coordinate where an object exists and a point should be present. Even in such a case, executing the above processing in the center server 120 is estimated to yield a more accurate determination than using the input point cloud data as-is. This is particularly effective when executing point cloud comparison processing in real time.
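 As a minimal sketch of the tri-state vectorization described above, the occupancy states of the small regions can be packed into a fixed-order vector; the numeric encoding below is an assumption chosen for illustration, not a value given in this disclosure.

```python
import numpy as np

# Illustrative encoding of the three states; the disclosure fixes no numbers.
OCCUPIED, EMPTY, UNKNOWN = 1, 0, -1

def spatial_feature(states_per_small_region):
    """Vectorize the occupied / empty / unknown state of each small region
    around a coordinate. The element order must be fixed so that corresponding
    elements of two features refer to corresponding small regions."""
    return np.asarray(states_per_small_region, dtype=np.int8)

# A toy region divided into eight small regions:
f = spatial_feature([OCCUPIED, EMPTY, EMPTY, UNKNOWN,
                     EMPTY, OCCUPIED, UNKNOWN, EMPTY])
```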
 When calculating the spatial feature of the calculation target coordinate of the reference point cloud (coordinate F1 in the above example), the reference feature calculation unit 123 may also calculate the features of the coordinates in the neighborhood region δ1 of the calculation target coordinate. Using the spatial feature of the calculation target coordinate, the spatial features of the coordinates in the neighborhood region δ1 of the calculation target coordinate, and the spatial feature of the corresponding coordinate in the input point cloud, the change detection unit 125 can determine whether a point that does not exist at the calculation target coordinate exists at the corresponding coordinate in the input point cloud. As a result, even if a translational deviation occurs between the measurement of the reference point cloud and the measurement of the input point cloud, the similarity calculation can take that deviation into account, thereby suppressing deterioration of the change detection accuracy.
 Similarly, when calculating the spatial feature of the calculation target coordinate of the input point cloud (coordinate F2 in the above example), the input feature calculation unit 124 may also calculate the features of the coordinates in the neighborhood region δ2 of the calculation target coordinate. Using the spatial feature of the calculation target coordinate, the spatial features of the coordinates in the neighborhood region δ2 of the calculation target coordinate, and the spatial feature of the corresponding coordinate in the reference point cloud, the change detection unit 125 can determine whether a point that exists at the calculation target coordinate does not exist at the corresponding coordinate in the reference point cloud. This makes it possible to suppress deterioration of the change detection accuracy even if a translational deviation occurs between the measurement of the reference point cloud and the measurement of the input point cloud.
 FIGS. 11A and 11B show, for the reference point cloud and the input point cloud shown in FIGS. 7A and 7B, an image in which changes between the two are detected by directly comparing the presence or absence of points as-is, and an image in which changes between the two are detected by the method of the present disclosure. As described above, FIGS. 7A and 7B are measurements of a rack L in a warehouse taken by a sensor. Here, it is assumed that the measurement position of the reference point cloud and that of the input point cloud are shifted by a predetermined amount in the translational direction. In general, the measurement conditions of the reference point cloud and the input point cloud are not exactly the same and often differ; this situation reflects that. In addition, at the time of the input point cloud measurement, an object OB exists that was not present at the time of the reference point cloud measurement.
 In image C1 of FIG. 11A and image C2 of FIG. 11B, the locations represented by dots are those detected as changes between the reference point cloud and the input point cloud. The darker a dot in the images of FIGS. 11A and 11B, the more strongly that location is detected as a change. In image C1 of FIG. 11A, not only the object OB, which should be detected as a change, but also the rack L, whose appearance from the sensor differs between the reference point cloud and the input point cloud, is detected as a change. In image C2 of FIG. 11B, however, detection of the rack L as a change is suppressed, and the object OB is detected as a clear change. In this way, the method of the present disclosure can suppress deterioration of the change detection accuracy even when a translational deviation occurs between the measurement of the reference point cloud and the measurement of the input point cloud.
 As a further effect, since deterioration of the change detection accuracy between point clouds is suppressed, the amount of data that the robot 101 must measure and acquire for accurate change detection can be reduced (that is, the number of measurements can be reduced), and the robot 101 can measure the measurement target from a greater distance. The travel distance of the robot 101 can therefore be reduced, enabling efficient monitoring or inspection of infrastructure facilities.
 The reference point cloud is data acquired using a sensor, and the reference point cloud acquisition unit 121 may enlarge the neighborhood region δ1 as the distance between the position indicated at measurement time by the coordinate subject to change detection and the position of the sensor increases. In this way, even when a region far from the sensor is measured and the effect of a deviation becomes larger, the neighborhood region can be set so as to cover that deviation, thereby suppressing deterioration of the change detection accuracy.
 Similarly, the input point cloud is data acquired using a sensor, and the input point cloud acquisition unit 122 may enlarge the neighborhood region δ2 as the distance between the position indicated at measurement time by the coordinate subject to change detection and the position of the sensor increases. In this way, even when a region far from the sensor is measured and the effect of a deviation becomes larger, the neighborhood region can be set so as to cover that deviation, thereby suppressing deterioration of the change detection accuracy.
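 One simple realization of this distance-dependent sizing is a linear rule, sketched below; the base size, gain, and units are made-up parameters, not values from this disclosure.

```python
def neighborhood_size(distance_to_sensor, base_size=0.2, gain=0.01):
    """Enlarge the neighborhood region (delta1 or delta2) with the distance
    between the target coordinate and the sensor, so that the larger deviation
    expected at long range is still covered (linear form assumed)."""
    return base_size + gain * distance_to_sensor
```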
 When no point exists in a small region of the input point cloud, whether to determine that no point exists or that the existence of a point is unknown may be decided by the ray tracing technique described above. This makes it possible to determine that the existence of a point is unknown at a location where an object existed at the time of the input point cloud measurement but was not recorded as a point in the input point cloud. Deterioration of the change detection accuracy can therefore be suppressed compared with treating such locations as having no points.
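 The empty-versus-unknown decision can be pictured with the following ray-marching sketch over a uniform voxel grid: voxels a beam passes through are observed free, the voxel the beam ends in is occupied, and everything never traversed stays unknown. The grid layout, step size, and all names are assumptions; an exact traversal algorithm such as Amanatides-Woo would replace the coarse sampling used here.

```python
import numpy as np

OCCUPIED, EMPTY, UNKNOWN = 1, 0, -1

def classify_voxels(points, sensor_origin, voxel_size, grid_shape):
    """Ray-trace from the sensor to each measured point to decide which
    unoccupied voxels were actually observed (EMPTY) and which were never
    seen (UNKNOWN). Coordinates are assumed non-negative in grid units."""
    grid = np.full(grid_shape, UNKNOWN, dtype=np.int8)

    def voxel_index(position):
        return tuple((position // voxel_size).astype(int))

    def in_grid(idx):
        return all(0 <= i < s for i, s in zip(idx, grid_shape))

    for p in points:
        direction = p - sensor_origin
        n_steps = int(np.linalg.norm(direction) / (0.5 * voxel_size)) + 1
        for t in np.linspace(0.0, 1.0, n_steps, endpoint=False):
            idx = voxel_index(sensor_origin + t * direction)
            if in_grid(idx):
                grid[idx] = EMPTY          # the beam passed through: observed free
        idx = voxel_index(p)
        if in_grid(idx):
            grid[idx] = OCCUPIED           # the beam ended here: a point exists
    return grid
```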
 When executing process (I), the change detection unit 125 may calculate the similarity between the spatial feature of the calculation target coordinate of the reference point cloud and the spatial feature of the corresponding coordinate of the input point cloud, as well as the similarity between the spatial feature of each coordinate in the neighborhood region δ1 and the spatial feature of the corresponding coordinate of the input point cloud.
 Here, the similarity between the spatial feature of the calculation target coordinate and the spatial feature of the corresponding coordinate is calculated using the elements of the spatial feature of the calculation target coordinate in a given small region SR1 of the spherical region S1 and in the small regions SR1 included in its surrounding area, together with the element of the small region SR2 of the spherical region S2 that corresponds to the given small region SR1.
 The similarity between the spatial feature of each coordinate in the neighborhood region δ1 and the spatial feature of the corresponding coordinate is calculated using, for each such coordinate, the elements of the spatial feature in a given small region SR1 of the spherical region S1 and in the small regions SR1 included in its surrounding area, together with the element of the small region SR2 of the spherical region S2 that corresponds to the given small region SR1.
 Similarly, when executing process (II), the change detection unit 125 may calculate the similarity between the spatial feature of the calculation target coordinate of the input point cloud and the spatial feature of the corresponding coordinate of the reference point cloud, as well as the similarity between the spatial feature of each coordinate in the neighborhood region δ2 and the spatial feature of the corresponding coordinate of the reference point cloud.
 Here, the similarity between the spatial feature of the calculation target coordinate and the spatial feature of the corresponding coordinate is calculated using the elements of the spatial feature of the calculation target coordinate in a given small region SR2 of the spherical region S2 and in the small regions SR2 included in its surrounding area, together with the element of the small region SR1 of the spherical region S1 that corresponds to the given small region SR2.
 The similarity between the spatial feature of each coordinate in the neighborhood region δ2 and the spatial feature of the corresponding coordinate is calculated using, for each such coordinate, the elements of the spatial feature in a given small region SR2 of the spherical region S2 and in the small regions SR2 included in its surrounding area, together with the element of the small region SR1 of the spherical region S1 that corresponds to the given small region SR2.
 With such processing, even if a rotational deviation occurs between the measurement of the reference point cloud and the measurement of the input point cloud, the similarity calculation can take that deviation into account, thereby suppressing deterioration of the change detection accuracy. In the above processing, the change detection unit 125 may change the size of at least one of the surrounding area of the small region SR1 and the surrounding area of the small region SR2 as appropriate. This makes it possible to change the tolerance for rotational deviation.
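 A hedged sketch of this rotation-tolerant comparison follows: a bin of one feature is considered to match if the other feature agrees at the corresponding bin or at any bin within a small angular window around it, and widening the window loosens the tolerance. Names such as same_around and the dict-of-bins representation are assumptions made for illustration.

```python
def same_around(ref_feature, in_feature, idx, window=1):
    """True if in_feature's state at bin idx agrees with ref_feature at that
    bin or at any bin within `window` angular steps of it. ref_feature and
    in_feature are dicts keyed by (phi, theta, r) bin indices."""
    phi, theta, r = idx
    for dphi in range(-window, window + 1):
        for dtheta in range(-window, window + 1):
            if ref_feature.get((phi + dphi, theta + dtheta, r)) == in_feature[idx]:
                return True
    return False
```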
 The reference feature calculation unit 123 may also calculate the spatial feature in the reference point cloud by vectorizing information indicating that a point exists, that a point does not exist, or that the existence of a point is unknown, in the same manner as the input feature calculation unit 124. The ray tracing technique described in the explanation of the input feature calculation unit 124 can be applied to define the information indicating that the existence of a point is unknown.
 The change detection unit 125 calculates the similarity by the method described above using the spatial features calculated in this way and the spatial features calculated by the input feature calculation unit 124. In this case, in process (II), ValidNum in formula (5) is the number of small regions SR1 in the spherical region S1 other than those defined as "the existence of a point is unknown". Small regions SR1 defined as "the existence of a point is unknown" are therefore not evaluated in the similarity calculation. This allows states in which the existence of a point in the reference point cloud is unknown to be reflected in the change detection, making it possible to further improve the accuracy of detecting changes between the reference point cloud and the input point cloud.
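 Based only on the behavior described here (summed per-bin scores, normalization by ValidNum, and exclusion of "unknown" reference bins), a hedged reconstruction of the similarity might look like the following; a plain agreement test stands in for the Score term of formula (6), whose exact form is not reproduced here.

```python
UNKNOWN = -1   # illustrative encoding, as above

def similarity(ref_feature, in_feature):
    """S(P1, P2)-style score: accumulate per-bin scores only over bins whose
    reference-side state is known, then normalize by ValidNum."""
    total, valid_num = 0.0, 0
    for idx, ref_state in ref_feature.items():
        if ref_state == UNKNOWN:      # "existence of a point is unknown":
            continue                  # not evaluated in the similarity
        valid_num += 1
        if in_feature.get(idx) == ref_state:
            total += 1.0              # stand-in for Score(P1φθr, P2φθr)
    return total / valid_num if valid_num else 0.0
```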
 The reference point cloud and the input point cloud may be mapping data generated based on two-dimensional images of a predetermined place captured by a camera or a positioning sensor. In addition to the example given above, the following cases can also be assumed as cases in which a deviation occurs between the capture of the images for the reference point cloud and the capture of the images for the input point cloud. For example, a deviation may occur when a movable camera carried by a person captures images of the same place at different times and the reference point cloud and the input point cloud are generated from those images. Even when a fixed-point camera attached to a facility captures images of the same place at different times, a deviation may occur because the position or shooting direction of the fixed-point camera shifts due to an earthquake or deterioration of the facility. Even in such cases, executing the above processing can suppress deterioration of the change detection accuracy.
 The change detection unit 125 may execute process (I) for a plurality of coordinates that are not all of the input point cloud and the corresponding plurality of coordinates of the reference point cloud. Similarly, the change detection unit 125 may execute process (II) for a plurality of coordinates that are not all of the reference point cloud and the corresponding plurality of coordinates of the input point cloud. Based on these determination results, the detection result generating unit 126 generates an image showing the changes detected by the change detection unit 125. In this way, when part of the point cloud does not require determination, the center server 120 can detect changes for the coordinates excluding that region.
 (2B)
 [Configuration Description]
 FIG. 12 is a block diagram showing another example of the center server according to the second embodiment. The center server 120 further includes an extraction unit 127 in addition to the components shown in FIG. 6. In the following description of the processing of the center server 120, the points explained in (2A) are omitted, and the points specific to this example are explained.
 The extraction unit 127 directly compares, as-is, the presence or absence of points at each pair of corresponding coordinates between the reference point cloud acquired by the reference point cloud acquisition unit 121 and the input point cloud acquired by the input point cloud acquisition unit 122. In this way, information on coordinates at which the presence or absence of a point differs between corresponding coordinates of the reference point cloud and the input point cloud (information on changed coordinates) is extracted, while other coordinates are not extracted.
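 The extraction can be pictured as a symmetric set difference over occupied voxel coordinates, as in the sketch below; it assumes both clouds have been voxelized to integer coordinates so that "corresponding coordinates" can be compared directly.

```python
def extract_changed_coords(reference_occupied, input_occupied):
    """Return the coordinates whose occupancy differs between the two clouds.
    Both arguments are iterables of voxelized (x, y, z) integer coordinates."""
    return set(reference_occupied) ^ set(input_occupied)   # symmetric difference
```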
 The reference feature calculation unit 123 and the input feature calculation unit 124 execute the processing shown in (2A) for the coordinates of the reference point cloud and the coordinates of the input point cloud extracted in this way, and calculate the spatial features. The change detection unit 125 and the detection result generating unit 126 execute the processing shown in (2A) using the calculated spatial features.
 [Processing Description]
 FIGS. 13A to 13C are flowcharts, corresponding to FIGS. 10A to 10C, showing an example of representative processing of the center server 120 in (2B); these flowcharts outline the processing of the center server 120. Descriptions of points that are the same as in (2A) are omitted as appropriate.
 First, in FIG. 13A, the reference point cloud acquisition unit 121 acquires reference point cloud data from the reference point cloud DB 130 (step S21; acquisition step). The input point cloud acquisition unit 122 acquires input point cloud data by receiving the data transmitted by the robot 101 (step S22; acquisition step).
 Next, the extraction unit 127 directly compares the presence or absence of points at each pair of corresponding coordinates between the reference point cloud acquired by the reference point cloud acquisition unit 121 and the input point cloud acquired by the input point cloud acquisition unit 122, thereby detecting information on the changed coordinates (step S26; change detection step). The reference feature calculation unit 123, the input feature calculation unit 124, and the change detection unit 125 then execute process (A') (step S23'; process (A') step). The details of process (A') are explained with reference to FIG. 13B.
 The reference feature calculation unit 123, the input feature calculation unit 124, and the change detection unit 125 also execute process (B') (step S24'; process (B') step). Process (B') denotes the above process (II) and its related processing; its details are explained with reference to FIG. 13C. Either of steps S23' and S24' may be performed first, or both may be performed in parallel.
 The detection result generating unit 126 generates, for process (A'), an image showing the detected changes based on the information generated by process (A') that indicates the presence or absence of a change at each coordinate. Similarly, the detection result generating unit 126 generates, for process (B'), an image showing the detected changes based on the information generated by process (B') that indicates the presence or absence of a change at each coordinate (step S25; detection result image generation step).
 Next, an example of the detailed process (A') of the center server is shown with reference to FIG. 13B. First, among the coordinates of the input point cloud extracted by the extraction unit 127, the input feature calculation unit 124 calculates the spatial feature of a coordinate for which a feature has not yet been calculated (step S31; feature calculation step). The input feature calculation unit 124 outputs the information of the calculated coordinate to the reference feature calculation unit 123.
 Based on the output coordinate information, the reference feature calculation unit 123 identifies the corresponding coordinate F1 in the reference point cloud. The reference feature calculation unit 123 sets a neighborhood region δ1 that includes coordinate F1 and calculates the spatial feature of each coordinate included in the neighborhood region δ1 (step S32; feature calculation step). As described above, either of steps S31 and S32 may be performed first, or both may be performed in parallel.
 The change detection unit 125 calculates the similarity between coordinate F1 and coordinate F2 using the spatial features calculated in steps S31 and S32 (step S33; similarity calculation step). The change detection unit 125 determines whether a change in the presence or absence of a point has occurred at coordinate F2 by comparing the maximum value of the calculated similarities with a predetermined threshold (step S34; change detection step).
 The change detection unit 125 determines whether the similarity calculation and the change detection determination have been completed for all coordinates extracted from the input point cloud (step S36; completion determination step). If they have not been completed for all extracted coordinates (No in step S36), the process returns to step S31 and is repeated for a coordinate whose similarity has not yet been calculated. If they have been completed for all extracted coordinates (Yes in step S36), process (A') ends. Then, as described above, the detection result generating unit 126 generates a change detection image based on the information generated by process (A') that indicates the presence or absence of a change at each coordinate.
 Next, an example of the detailed process (B') of the center server is shown with reference to FIG. 13C. First, among the coordinates of the reference point cloud extracted by the extraction unit 127, the reference feature calculation unit 123 calculates the spatial feature of a coordinate for which a feature has not yet been calculated (step S41; feature calculation step). The reference feature calculation unit 123 outputs the information of the calculated coordinate to the input feature calculation unit 124.
 Based on the output coordinate information, the input feature calculation unit 124 identifies the corresponding coordinate F2 in the input point cloud. The input feature calculation unit 124 sets a neighborhood region δ2 that includes coordinate F2 and calculates the spatial feature of each coordinate included in the neighborhood region δ2 (step S42; feature calculation step). As described above, either of steps S41 and S42 may be performed first, or both may be performed in parallel.
 The change detection unit 125 calculates the similarity between coordinate F1 and coordinate F2 using the spatial features calculated in steps S41 and S42 (step S43; similarity calculation step). The change detection unit 125 determines whether a change in the presence or absence of a point has occurred at coordinate F1 by comparing the maximum value of the calculated similarities with a predetermined threshold (step S44; change detection step).
 The change detection unit 125 determines whether the similarity calculation and the change detection determination have been completed for all coordinates extracted from the reference point cloud (step S46; completion determination step). If they have not been completed for all extracted coordinates (No in step S46), the process returns to step S41 and is repeated for a coordinate whose similarity has not yet been calculated. If they have been completed for all extracted coordinates (Yes in step S46), process (B') ends. Then, as described above, the detection result generating unit 126 generates an image showing the detected changes based on the information generated by process (B') that indicates the presence or absence of a change at each coordinate.
 As described above, the center server 120 can extract information on changed coordinates with the extraction unit 127 and execute the change detection processing shown in (2A) for the extracted coordinates. This reduces the computational cost required for the entire processing pipeline compared with executing the change detection processing shown in (2A) for all coordinates of the input point cloud or the reference point cloud.
 (2C)
 FIG. 14A is a block diagram showing another example of the center server according to the second embodiment. The center server 120 further includes an object identification unit 128 in addition to the components shown in FIG. 12. In the following description of the processing of the center server 120, the points explained in (2A) and (2B) are omitted, and the points specific to this example are explained.
 For the image generated by the detection result generating unit 126 as a result of executing the process shown in (I), or the image generated as a result of executing the process shown in (II), the object identification unit 128 determines whether a location in the image where the presence or absence of points has changed corresponds to some object (for example, a container, a vehicle, or construction materials).
 For example, the object identification unit 128 determines whether the changed portion of the point cloud is at least a predetermined size; if so, it determines that the changed portion corresponds to some object. If the changed portion of the point cloud is smaller than the predetermined size, the object identification unit 128 determines that the changed portion is noise.
 As another example, the object identification unit 128 may compare the changed portion of the point cloud with point cloud data of various objects for determination, such as containers, vehicles, and construction materials, stored in advance. When the changed portion of the point cloud matches the point cloud data of one of the objects, or matches it except for an error of, for example, a few percent, the object identification unit 128 identifies the matched object as an object whose presence has changed between the input point cloud and the reference point cloud.
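 The two rule-based checks above (a size threshold against noise, then comparison with stored templates) might be combined as in the following sketch; min_points, tolerance, and the caller-supplied mismatch function are all hypothetical stand-ins.

```python
def identify_object(changed_points, templates, mismatch,
                    min_points=50, tolerance=0.05):
    """Classify a changed portion of the point cloud: noise if too small,
    otherwise the first stored template whose mismatch (a fraction computed
    by the caller-supplied `mismatch` function) is within `tolerance`,
    otherwise an unidentified object. All thresholds are hypothetical."""
    if len(changed_points) < min_points:        # below the predetermined size
        return "noise"
    for name, template in templates.items():    # e.g. "container", "vehicle"
        if mismatch(changed_points, template) <= tolerance:   # a few percent
            return name
    return "unidentified object"
```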
 Alternatively, the object identification unit 128 may execute the object determination using an AI model trained in advance. This training is performed by inputting teacher data into the AI model, the data including image information showing sample changed point clouds and information indicating the various objects corresponding to that information (correct labels). After the AI model has been trained using the teacher data, the object identification unit 128 inputs the image generated by the detection result generating unit 126 into the AI model. Based on this input image, the AI model outputs information indicating the object shown in the input image. In this way as well, the object identification unit 128 can execute the object identification processing. Any technique, such as logistic regression or a neural network, can be used to train the learning model.
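 As one concrete possibility among the techniques named above, a logistic-regression classifier over flattened detection-result images could be trained as sketched below; scikit-learn is used purely for illustration and is not mentioned in this disclosure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_object_classifier(sample_images, sample_labels):
    """Fit a classifier on teacher data: sample change-detection images
    (2D numpy arrays of equal shape) and their correct object labels."""
    X = np.array([img.ravel() for img in sample_images])
    model = LogisticRegression(max_iter=1000)
    model.fit(X, np.asarray(sample_labels))
    return model

def classify_change_image(model, image):
    """Predict which object a detection-result image shows."""
    return model.predict(image.ravel()[np.newaxis, :])[0]
```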
 Even when the detection result generating unit 126 generates a 3D point cloud as the detection result, the object identification unit 128 determines whether a location in the resulting point cloud where the presence or absence of points has changed corresponds to some object. For example, the object determination can be executed using an AI model trained in advance. This training is performed by inputting teacher data into the AI model, the data including sample changed point clouds and information indicating the various objects corresponding to that information (correct labels). After the AI model has been trained using the teacher data, the object identification unit 128 inputs the point cloud generated by the detection result generating unit 126 into the AI model. Based on this point cloud, the AI model can output information indicating the object shown in the point cloud. The other determination methods that can be executed are as described above.
 [Effects]
 As described above, the center server 120 can detect whether there has been a change in the presence or absence of an object between the reference point cloud and the input point cloud. More preferably, the center server 120 can also identify an object whose presence has changed between the reference point cloud and the input point cloud. The object identification unit 128 may generate and output a screen on which processing to visually emphasize the identified object has been executed. The visual emphasis processing is as described in embodiment 1. When the object identification unit 128 identifies some object, it may output an audible alert via a speaker. The object identification unit 128 may also output the determination result of the determination unit 13 to another device.
 (2D)
 FIG. 14B is a block diagram showing another example of the center server according to the second embodiment. The center server 120 further includes a movement control unit 129 in addition to the components shown in FIG. 14A. In the following description of the processing of the center server 120, the points explained in (2A) to (2C) are omitted, and the points specific to this example are explained.
 The movement control unit 129 controls the movement of the robot 101. For example, when the object identification unit 128 determines that the changed portion of the point cloud corresponds to some object, the movement control unit 129 can instruct the robot 101 to approach the changed portion of the point cloud and then measure that location further. This instruction is given in order to acquire and analyze more detailed point cloud data of the location. Alternatively, the movement control unit 129 may control the robot 101 in this way when the object identification unit 128 identifies a specific type of object.
 This instruction is output in at least one of the following cases: when process (I) determines that an object that did not exist in the reference point cloud has come to exist in the input point cloud, or when process (II) determines that an object that existed in the reference point cloud no longer exists in the input point cloud. However, to analyze a changed location in detail, the instruction is preferably output at least when an object that did not exist in the reference point cloud has come to exist in the input point cloud.
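 The control rule reduces to a simple branch, as sketched below; `robot` and its methods are hypothetical stand-ins for whatever command interface the movement control unit 129 actually uses.

```python
def handle_detection_result(robot, detected_object, location):
    """If a changed portion corresponds to an object, leave the preset route,
    approach the location, and re-measure it in detail; otherwise continue
    along the previously set movement route. All robot methods are hypothetical."""
    if detected_object is None:
        robot.follow_preset_route()
    else:
        robot.approach(location)     # leave the preset route temporarily
        robot.measure(location)      # acquire more detailed point cloud data
```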
 [Processing Description]
 FIGS. 15A and 15B are flowcharts, corresponding to FIG. 13A, showing an example of representative processing of the center server 120 in (2D); these flowcharts outline the processing of the center server 120. Descriptions of points that are the same as in (2A) and (2B) are omitted as appropriate.
 Since steps S21 to S25 are the same as in FIG. 13A, their explanation is omitted. After step S25, the object identification unit 128 executes the processing shown in (2C) and determines, for the image generated by the detection result generating unit 126 through process (I), whether the changed portion of the point cloud corresponds to some object (step S27; object detection determination step). If the changed portion does not correspond to an object (No in step S27), the movement control unit 129 executes no particular control. In this case, the robot 101 moves, for example, along a previously set movement route. On the other hand, if the changed portion corresponds to some object (Yes in step S27), the movement control unit 129 controls the robot 101 so as to instruct it to approach the changed portion of the point cloud and then measure that location further (step S28; robot movement step). In this case, the robot 101 temporarily leaves the previously set movement route.
 The processing shown in steps S27 and S28 can also be executed for the image generated by the detection result generating unit 126 through process (II). Furthermore, the center server 120 can execute the processing shown in (2A) to (2D) on the images acquired when, as a result of the processing of step S28, the robot approaches the location and measures it.
 [Effects]
 As described above, when some object is detected in the image generated by the detection result generating unit 126, the center server 120 controls the robot 101 so as to acquire detailed point cloud data. This enables more detailed inspection of failures or abnormalities in infrastructure equipment. Even when the detection result generating unit 126 generates a point cloud, the movement control unit 129 can execute similar control based on the object determination result of the object identification unit 128.
 As another example, when the movement control unit 129 determines that an object detected by the object identification unit 128 is present on the previously set movement route, it may set the movement route to a route that avoids the location where the object is present. The movement control unit 129 controls the movement of the robot 101 so that it moves along the newly set route.
 The present disclosure is not limited to the above embodiments and can be modified as appropriate without departing from its spirit. For example, the configurations and processes described in the above embodiments can be combined arbitrarily. The order of the processes described in each flowchart need not be as shown in the drawings; the order may be changed as appropriate, and multiple processes may be executed in parallel.
 For example, in FIG. 10B, the processing of steps S31 to S34 is executed repeatedly for each point of the input point cloud. Alternatively, however, the processing that calculates the spatial features for all coordinates to be processed in the input point cloud and the processing that calculates the spatial features for the regions including the coordinates of the reference point cloud corresponding to each coordinate of the input point cloud may be performed first. The processing of steps S33 and S34 is then executed for all coordinates, so that changes in the presence or absence of points in the regions to be processed can be detected. Similar reordering of processing is possible in the other flowcharts.
 In the second embodiment, polar coordinates were used to define the spherical region S1, which is the region surrounding coordinate F1, and the spherical region S2, which is the region surrounding coordinate F2. However, the regions surrounding coordinates F1 and F2 can also be defined using other types of coordinate systems, such as cylindrical coordinates, instead of polar coordinates. Using polar coordinates, however, has the effect of simplifying the calculations of the reference feature calculation unit 123 and the input feature calculation unit 124.
 The similarity calculation methods shown in formulas (5) to (7) in the second embodiment can also be modified as appropriate. For example, in formula (5), the change detection unit 125 calculated S(P1, P2) by simply summing the Score(P1φθr, P2φθr) terms shown in formula (6). However, S(P1, P2) may be calculated by other methods, as long as S(P1, P2) increases monotonically with the number of Same_around(P1φθr, P2φθr) matches over the entire range of φ, θ, and r. Also, in formula (5), S(P1, P2) is proportional to the reciprocal of ValidNum. However, S(P1, P2) may be calculated by other methods, as long as S(P1, P2) decreases monotonically with respect to ValidNum.
 In the second embodiment, any part of the above-described processing executed by each unit of the center server 120 may be executed by at least one of the robot 101 and a different server. In other words, the processing of the center server 120 may be realized as a distributed system.
As another example, the center server 120 may be omitted, and the robot 101 may execute the above-described processing of the center server 120 in a stand-alone manner. In this case, a reference point cloud and the position information at which that reference point cloud was measured are stored in association with each other in a storage unit inside the robot 101. The robot 101 performs measurement with its own LiDAR 102 to acquire an input point cloud, searches the storage unit using the position information at which the input point cloud was measured, and acquires the data of the reference point cloud to be compared with the input point cloud. The details are the same as the processing of the reference point cloud acquisition unit 121 described above. The robot 101 can then execute the processing of the reference feature calculation unit 123 through the detection result generation unit 126, and may also execute the processing of the object identification unit 128. Furthermore, when the object identification unit 128 detects an object, the robot 101 can control its own movement unit so as to approach the location of the changed point cloud and then measure that location again.
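A compact sketch of the stand-alone lookup described here follows. The ReferenceStore class and its method names are hypothetical, and a practical implementation would likely use a spatial index rather than a linear scan.

```python
import numpy as np

class ReferenceStore:
    """Associates reference point clouds with the positions at which they
    were measured, mirroring the storage unit described above."""

    def __init__(self):
        self._entries = []                        # (position, point_cloud) pairs

    def add(self, position, cloud):
        self._entries.append((np.asarray(position, dtype=float), cloud))

    def nearest(self, position):
        # Return the reference cloud measured closest to `position`.
        position = np.asarray(position, dtype=float)
        return min(self._entries,
                   key=lambda entry: np.linalg.norm(entry[0] - position))[1]
```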
In the embodiments described above, this disclosure has been explained as a hardware configuration, but the disclosure is not limited to this. The processes (steps) of the change detection device, of each device in the change detection system, or of the center server described in the above embodiments can also be realized by having a processor in a computer execute a computer program.
FIG. 16 is a block diagram showing an example of the hardware configuration of an information processing device that executes the processes of the embodiments described above. Referring to FIG. 16, the information processing device 90 includes a signal processing circuit 91, a processor 92, and a memory 93.
The signal processing circuit 91 is a circuit for processing signals under the control of the processor 92. The signal processing circuit 91 may include a communication circuit for receiving signals from a transmitting device.
The processor 92 is connected (coupled) to the memory 93, and performs the processing of the devices described in the above embodiments by reading software (computer programs) from the memory 93 and executing it. Examples of the processor 92 include a CPU (Central Processing Unit), an MPU (Micro Processing Unit), an FPGA (Field-Programmable Gate Array), a DSP (Digital Signal Processor), and an ASIC (Application Specific Integrated Circuit). A single processor may be used as the processor 92, or multiple processors may be used in cooperation.
The memory 93 is composed of a volatile memory, a non-volatile memory, or a combination of the two. The volatile memory may be, for example, a RAM (Random Access Memory) such as a DRAM (Dynamic Random Access Memory) or an SRAM (Static Random Access Memory). The non-volatile memory may be, for example, a ROM (Read Only Memory) such as a PROM (Programmable Read Only Memory) or an EPROM (Erasable Programmable Read Only Memory), a flash memory, or an SSD (Solid State Drive). A single memory may be used as the memory 93, or multiple memories may be used in cooperation.
The memory 93 is used to store one or more instructions. The one or more instructions are stored in the memory 93 as a group of software modules. The processor 92 can perform the processing described in the above embodiments by reading these software modules from the memory 93 and executing them.
The memory 93 may include memory built into the processor 92 in addition to memory provided outside the processor 92. The memory 93 may also include storage located away from the processors that constitute the processor 92; in this case, the processor 92 can access the memory 93 via an I/O (Input/Output) interface.
As explained above, the one or more processors included in each device in the above embodiments execute one or more programs including a group of instructions for causing a computer to perform the algorithms described with reference to the drawings. This processing realizes the information processing described in each embodiment.
The program includes a group of instructions (or software code) that, when loaded into a computer, causes the computer to perform one or more of the functions described in the embodiments. The program may be stored on a non-transitory computer-readable medium or a tangible storage medium. By way of example, and not limitation, a computer-readable medium or tangible storage medium includes random-access memory (RAM), read-only memory (ROM), flash memory, a solid-state drive (SSD) or other memory technology, a CD-ROM, a digital versatile disc (DVD), a Blu-ray (registered trademark) disc or other optical disc storage, and a magnetic cassette, magnetic tape, magnetic disk storage or other magnetic storage device. The program may be transmitted on a transitory computer-readable medium or a communication medium. By way of example, and not limitation, a transitory computer-readable medium or communication medium includes an electrical, optical, acoustic, or other form of propagated signal.
Some or all of the above embodiments may also be described as, but are not limited to, the following supplementary notes.
(Appendix 1)
Calculating a first feature amount of a first coordinate in a first point cloud using information about the presence of a point in each of the first small regions, the first region being composed of a plurality of first small regions and including the first coordinate;
calculating a second feature amount of a second coordinate in a second point cloud corresponding to the first coordinate, using information indicating that a point exists, that a point does not exist, or that the existence of a point is unknown in a second region that is composed of a plurality of second small regions and includes the second coordinate;
determining whether or not a change in the presence or absence of a point has occurred between the first coordinates and the second coordinates by using the first feature amount and the second feature amount;
A computer implemented method for change detection.
(Appendix 2)
The first feature amount is calculated using information indicating, for each of the first small regions, that a point is present, that a point is not present, or that the presence of a point is unknown.
The change detection method according to Appendix 1.
(Appendix 3)
Further calculating a feature amount of each coordinate in a first neighboring region of the first coordinate in the first point cloud, in a third region that is composed of a plurality of third small regions and includes the coordinate, using information regarding the presence of a point in each of the third small regions;
determining whether or not a point that does not exist at the first coordinates exists at the second coordinates by using the feature amount of each coordinate in the first neighboring region, the first feature amount, and the second feature amount;
The change detection method according to Appendix 1 or 2.
(Appendix 4)
the first point cloud data is data acquired using a first sensor,
the size of the first neighboring region is increased as the distance between the position indicated by the first coordinates at the time of acquiring the first point cloud data and the position of the first sensor increases;
The change detection method according to Appendix 3.
(Appendix 5)
further calculating a feature amount of each coordinate in a second neighboring region of the second coordinate in the second point cloud, in a fourth region that is composed of a plurality of fourth small regions and includes the coordinate, using information indicating, for each of the fourth small regions, that a point is present, that a point is not present, or that the presence of a point is unknown;
determining whether or not a point that exists at the first coordinates does not exist at the second coordinates by using the feature amount of each coordinate in the second neighboring region, the first feature amount, and the second feature amount;
The change detection method according to any one of Appendices 1 to 4.
(Appendix 6)
detecting a change in the presence or absence of an object between the first point cloud and the second point cloud by performing the determination at a plurality of corresponding coordinates in the first point cloud and the second point cloud;
outputting a result of the detection;
The change detection method according to any one of Appendices 1 to 5.
(Appendix 7)
the second point cloud data is data acquired by a second sensor,
for a second small region having no point in the second point cloud, a point is defined as not existing in the second small region when the second small region is located between a position where a point exists in the second point cloud at the time of acquiring the data and the position of the second sensor, and the existence of a point in the second small region is defined as unknown when the second small region is not located between a position where a point exists in the second point cloud and the position of the second sensor;
The change detection method according to any one of Appendices 1 to 6.
(Appendix 8)
comparing the first point cloud with the second point cloud, and extracting coordinates at which the presence or absence of a point differs between the first point cloud and the second point cloud,
at least one of the first coordinates and the second coordinates being a coordinate extracted as differing in the presence or absence of a point;
The change detection method according to any one of Appendices 1 to 7.
(Appendix 9)
the first point cloud data is data acquired by a first sensor,
for a first small region having no point in the first point cloud, a point is defined as not existing in the first small region when the first small region is located between a position where a point exists in the first point cloud at the time of acquiring the data and the position of the first sensor, and the existence of a point in the first small region is defined as unknown when the first small region is not located between a position where a point exists in the first point cloud and the position of the first sensor;
The change detection method according to Appendix 2.
(Appendix 10)
determining whether a point that does not exist in the first coordinates is present in the second coordinates by calculating a similarity between the first feature amount and the second feature amount and, for each coordinate in the first neighboring region, a similarity between the feature amount of the coordinates and the second feature amount;
a similarity between the first feature amount and the second feature amount is calculated using elements of the first feature amount in each of the first small region in the first region and a first small region included in a peripheral region of the first small region, and elements of the second small region in the second region corresponding to the first small region;
a similarity between the feature amount of each coordinate in the first neighboring region and the second feature amount is calculated using elements of the first feature amount in each of the third small region in the third region and a small region included in a peripheral region of the third small region, and elements of the second small region in the second region corresponding to the third small region;
The change detection method according to Appendix 3 or 4.
(Appendix 11)
the second point cloud data is data acquired using a second sensor,
the size of the second neighboring region is increased as the distance between the position indicated by the second coordinates at the time of acquiring the second point cloud data and the position of the second sensor increases;
The change detection method according to Appendix 5.
(Appendix 12)
determining whether a point that exists at the first coordinates does not exist at the second coordinates by calculating a similarity between the first feature amount and the second feature amount and, for each coordinate in the second neighboring region, a similarity between the feature amount of the coordinate and the first feature amount;
a similarity between the first feature amount and the second feature amount is calculated using elements of the second feature amount in each of the second small region in the second region and a second small region included in a peripheral region of the second small region, and elements of the first small region in the first region corresponding to the second small region;
a similarity between the feature amount of each coordinate in the second neighboring region and the first feature amount is calculated using elements of the fourth feature amount in each of the fourth small region in the fourth region and a small region included in a peripheral region of the fourth small region, and elements of the first small region in the first region corresponding to the fourth small region;
The change detection method according to Appendix 5 or 11.
(Appendix 13)
a first feature amount calculation means for calculating a first feature amount of a first coordinate in a first point cloud using information on the presence of a point in each of the first small regions, the first region being composed of a plurality of first small regions and including the first coordinate;
a second feature amount calculation means for calculating a second feature amount of a second coordinate in a second point group corresponding to the first coordinate, using information indicating that a point exists, that a point does not exist, or that the existence of a point is unknown in a second area including the second coordinate, the second area being composed of a plurality of second small areas;
a determination means for determining whether or not a change in the presence or absence of a point has occurred between the first coordinates and the second coordinates by using the first feature amount and the second feature amount;
A change detection system comprising:
(Appendix 14)
the first feature amount calculation means calculates the first feature amount using information indicating, for each of the first small regions, that a point is present, that a point is not present, or that the presence of a point is unknown;
The change detection system according to Appendix 13.
(Appendix 15)
the first feature amount calculation means further calculates a feature amount of each coordinate in a first neighboring region of the first coordinate in the first point cloud, in a third region that is composed of a plurality of third small regions and includes the coordinate, using information regarding the presence of a point in each of the third small regions, and
the determination means determines whether or not a point that does not exist at the first coordinates exists at the second coordinates by using the feature amount of each coordinate in the first neighboring region, the first feature amount, and the second feature amount;
The change detection system according to Appendix 13 or 14.
(Appendix 16)
the first point cloud data is data acquired using a first sensor, and
the first feature amount calculation means increases the size of the first neighboring region as the distance between the position indicated by the first coordinates at the time of acquiring the first point cloud data and the position of the first sensor increases;
The change detection system according to Appendix 15.
(Appendix 17)
the second feature amount calculation means further calculates a feature amount of each coordinate in a second neighboring region of the second coordinate in the second point cloud, in a fourth region that is composed of a plurality of fourth small regions and includes the coordinate, using information indicating, for each of the fourth small regions, that a point is present, that a point is not present, or that the presence of a point is unknown, and
the determination means determines whether or not a point that exists at the first coordinates does not exist at the second coordinates by using the feature amount of each coordinate in the second neighboring region, the first feature amount, and the second feature amount;
The change detection system according to any one of Appendices 13 to 16.
(Appendix 18)
a detection unit that detects a change in the presence or absence of an object between the first point cloud and the second point cloud by performing the determination at a plurality of corresponding coordinates in the first point cloud and the second point cloud; and
an output unit that outputs a result of the detection;
The change detection system according to any one of Appendices 13 to 17.
(Appendix 19)
the second point cloud data is data acquired by a second sensor,
for a second small region having no point in the second point cloud, a point is defined as not existing in the second small region when the second small region is located between a position where a point exists in the second point cloud at the time of acquiring the data and the position of the second sensor, and the existence of a point in the second small region is defined as unknown when the second small region is not located between a position where a point exists in the second point cloud and the position of the second sensor;
The change detection system according to any one of Appendices 13 to 18.
(Appendix 20)
an extraction unit that compares the first point cloud with the second point cloud and extracts coordinates at which the presence or absence of a point differs between the first point cloud and the second point cloud,
at least one of the first coordinates and the second coordinates being a coordinate extracted by the extraction unit;
The change detection system according to any one of Appendices 13 to 19.
(Appendix 21)
the first point cloud data is data acquired by a first sensor,
for a first small region having no point in the first point cloud, a point is defined as not existing in the first small region when the first small region is located between a position where a point exists in the first point cloud at the time of acquiring the data and the position of the first sensor, and the existence of a point in the first small region is defined as unknown when the first small region is not located between a position where a point exists in the first point cloud and the position of the first sensor;
The change detection system according to Appendix 14.
(Appendix 22)
the determining means determines whether or not a point that does not exist in the first coordinates is present in the second coordinates by calculating a similarity between the first feature amount and the second feature amount and, for each coordinate in the first neighboring region, a similarity between the feature amount of the coordinate and the second feature amount;
a similarity between the first feature amount and the second feature amount is calculated using elements of the first feature amount in each of the first small region in the first region and a first small region included in a peripheral region of the first small region, and elements of the second small region in the second region corresponding to the first small region;
a similarity between the feature amount of each coordinate in the first neighboring region and the second feature amount is calculated using elements of the first feature amount in each of the third small region in the third region and a small region included in a peripheral region of the third small region, and elements of the second small region in the second region corresponding to the third small region;
The change detection system according to Appendix 15 or 16.
(Appendix 23)
the second point cloud data is data acquired using a second sensor, and
the second feature amount calculation means increases the size of the second neighboring region as the distance between the position indicated by the second coordinates at the time of acquiring the second point cloud data and the position of the second sensor increases;
The change detection system according to Appendix 17.
(Appendix 24)
the determining means determines whether or not a point existing at the first coordinates does not exist at the second coordinates by calculating a similarity between the first feature amount and the second feature amount and, for each coordinate in the second neighboring region, a similarity between the feature amount of the coordinate and the first feature amount;
a similarity between the first feature amount and the second feature amount is calculated using elements of the second feature amount in each of the second small region in the second region and a second small region included in a peripheral region of the second small region, and elements of the first small region in the first region corresponding to the second small region;
a similarity between the feature amount of each coordinate in the second neighboring region and the first feature amount is calculated using elements of the fourth feature amount in each of the fourth small region in the fourth region and a small region included in a peripheral region of the fourth small region, and elements of the first small region in the first region corresponding to the fourth small region;
The change detection system according to Appendix 17 or 23.
(Appendix 25)
a first feature amount calculation means for calculating a first feature amount of a first coordinate in a first point cloud using information on the presence of a point in each of the first small regions, the first region being composed of a plurality of first small regions and including the first coordinate;
a second feature amount calculation means for calculating a second feature amount of a second coordinate in a second point group corresponding to the first coordinate, using information indicating that a point exists, that a point does not exist, or that the existence of a point is unknown in a second area including the second coordinate, the second area being composed of a plurality of second small areas;
a determination means for determining whether or not a change in the presence or absence of a point has occurred between the first coordinates and the second coordinates by using the first feature amount and the second feature amount;
A change detection device comprising:
(Appendix 26)
the first feature amount calculation means calculates the first feature amount using information indicating, for each of the first small regions, that a point is present, that a point is not present, or that the presence of a point is unknown;
The change detection device according to Appendix 25.
(Appendix 27)
the first feature amount calculation means further calculates a feature amount of each coordinate in a first neighboring region of the first coordinate in the first point cloud, in a third region that is composed of a plurality of third small regions and includes the coordinate, using information regarding the presence of a point in each of the third small regions, and
the determination means determines whether or not a point that does not exist at the first coordinates exists at the second coordinates by using the feature amount of each coordinate in the first neighboring region, the first feature amount, and the second feature amount;
The change detection device according to Appendix 25 or 26.
(Appendix 28)
the first point cloud data is data captured using a first sensor, and
the first feature amount calculation means increases the size of the first neighboring region as the distance between the position indicated by the first coordinates at the time of acquiring the data and the position of the first sensor increases;
The change detection device according to Appendix 27.
(Appendix 29)
the second feature amount calculation means further calculates a feature amount of each coordinate in a second neighboring region of the second coordinate in the second point cloud, in a fourth region that is composed of a plurality of fourth small regions and includes the coordinate, using information indicating, for each of the fourth small regions, that a point is present, that a point is not present, or that the presence of a point is unknown, and
the determination means determines whether or not a point that exists at the first coordinates does not exist at the second coordinates by using the feature amount of each coordinate in the second neighboring region, the first feature amount, and the second feature amount;
The change detection device according to any one of Appendices 25 to 28.
(Appendix 30)
an extraction unit that compares the first point cloud with the second point cloud and extracts coordinates at which the presence or absence of a point differs between the first point cloud and the second point cloud,
at least one of the first coordinates and the second coordinates being a coordinate extracted by the extraction unit;
The change detection device according to any one of Appendices 25 to 29.
(Appendix 31)
the second point cloud data is data acquired by a second sensor,
for a second small region having no point in the second point cloud, a point is defined as not existing in the second small region when the second small region is located between a position where a point exists in the second point cloud at the time of acquiring the data and the position of the second sensor, and the existence of a point in the second small region is defined as unknown when the second small region is not located between a position where a point exists in the second point cloud and the position of the second sensor;
The change detection device according to any one of Appendices 25 to 30.
(Appendix 32)
the first point cloud data is data acquired by a first sensor,
for a first small region having no point in the first point cloud, a point is defined as not existing in the first small region when the first small region is located between a position where a point exists in the first point cloud at the time of acquiring the data and the position of the first sensor, and the existence of a point in the first small region is defined as unknown when the first small region is not located between a position where a point exists in the first point cloud and the position of the first sensor;
The change detection device according to Appendix 26.
(Appendix 33)
the determining means determines whether or not a point that does not exist in the first coordinates is present in the second coordinates by calculating a similarity between the first feature amount and the second feature amount and, for each coordinate in the first neighboring region, a similarity between the feature amount of the coordinate and the second feature amount;
a similarity between the first feature amount and the second feature amount is calculated using elements of the first feature amount in each of the first small region in the first region and a first small region included in a peripheral region of the first small region, and elements of the second small region in the second region corresponding to the first small region;
a similarity between the feature amount of each coordinate in the first neighboring region and the second feature amount is calculated using elements of the first feature amount in each of the third small region in the third region and a small region included in a peripheral region of the third small region, and elements of the second small region in the second region corresponding to the third small region;
The change detection device according to Appendix 27 or 28.
(Appendix 34)
the second point cloud data is data acquired using a second sensor, and
the second feature amount calculation means increases the size of the second neighboring region as the distance between the position indicated by the second coordinates at the time of acquiring the second point cloud data and the position of the second sensor increases;
The change detection device according to Appendix 29.
(Appendix 35)
the determining means determines whether or not a point existing at the first coordinates does not exist at the second coordinates by calculating a similarity between the first feature amount and the second feature amount and, for each coordinate in the second neighboring region, a similarity between the feature amount of the coordinate and the first feature amount;
a similarity between the first feature amount and the second feature amount is calculated using elements of the second feature amount in each of the second small region in the second region and a second small region included in a peripheral region of the second small region, and elements of the first small region in the first region corresponding to the second small region;
a similarity between the feature amount of each coordinate in the second neighboring region and the first feature amount is calculated using elements of the fourth feature amount in each of the fourth small region in the fourth region and a small region included in a peripheral region of the fourth small region, and elements of the first small region in the first region corresponding to the fourth small region;
The change detection device according to Appendix 29 or 34.
(Appendix 36)
Calculating a first feature amount of a first coordinate in a first point cloud using information about the presence of a point in each of the first small regions, the first region being composed of a plurality of first small regions and including the first coordinate;
calculating a second feature amount of a second coordinate in a second point cloud corresponding to the first coordinate, using information indicating that a point exists, that a point does not exist, or that the existence of a point is unknown in a second region that is composed of a plurality of second small regions and includes the second coordinate;
determining whether or not a change in the presence or absence of a point has occurred between the first coordinates and the second coordinates by using the first feature amount and the second feature amount;
A non-transitory computer-readable medium storing a program that causes a computer to execute the above processing.
Although the present disclosure has been described above with reference to the embodiments, the present disclosure is not limited to the above. Various modifications that a person skilled in the art can understand may be made to the configuration and details of the present disclosure within the scope of the disclosure.
REFERENCE SIGNS LIST
10  Change detection device
11  First feature amount calculation unit
12  Second feature amount calculation unit
13  Determination unit
14  Output unit
20  Change detection system
21  Feature amount calculation device
22  Determination device
100 Monitoring system
101 Robot
102 LiDAR
110 Base station
120 Center server
121 Reference point cloud acquisition unit
122 Input point cloud acquisition unit
123 Reference feature amount calculation unit
124 Input feature amount calculation unit
125 Change detection unit
126 Detection result generation unit
127 Extraction unit
128 Object identification unit
129 Movement control unit
130 Reference point cloud DB

Claims (20)

  1.  A computer-implemented change detection method comprising:
      calculating a first feature amount of a first coordinate in a first point cloud using information about the presence of a point in each of the first small regions in a first region that is composed of a plurality of first small regions and includes the first coordinate;
      calculating a second feature amount of a second coordinate in a second point cloud corresponding to the first coordinate, using information indicating that a point is present, that a point is not present, or that the presence of a point is unknown in each of the second small regions in a second region that is composed of a plurality of second small regions and includes the second coordinate; and
      determining whether or not a change in the presence or absence of a point has occurred between the first coordinates and the second coordinates by using the first feature amount and the second feature amount.
  2.  The change detection method according to claim 1, wherein the first feature amount is calculated using information indicating, for each of the first small regions, that a point is present, that a point is not present, or that the presence of a point is unknown.
  3.  The change detection method according to claim 1 or 2, further comprising:
      calculating a feature amount of each coordinate in a first neighboring region of the first coordinate in the first point cloud, in a third region that is composed of a plurality of third small regions and includes the coordinate, using information regarding the presence of a point in each of the third small regions; and
      determining whether or not a point that does not exist at the first coordinates exists at the second coordinates by using the feature amount of each coordinate in the first neighboring region, the first feature amount, and the second feature amount.
  4.  The change detection method according to claim 3, wherein
      the data of the first point cloud is data acquired using a first sensor, and
      the size of the first neighboring region is increased as the distance between the position indicated by the first coordinates at the time of acquiring the data of the first point cloud and the position of the first sensor increases.
  5.  The change detection method according to any one of claims 1 to 4, further comprising:
      calculating a feature amount of each coordinate in a second neighboring region of the second coordinate in the second point cloud, in a fourth region that is composed of a plurality of fourth small regions and includes the coordinate, using information indicating, for each of the fourth small regions, that a point is present, that a point is not present, or that the presence of a point is unknown; and
      determining whether or not a point that exists at the first coordinates does not exist at the second coordinates by using the feature amount of each coordinate in the second neighboring region, the first feature amount, and the second feature amount.
  6.  The change detection method according to any one of claims 1 to 5, further comprising:
      detecting a change in the presence or absence of an object between the first point cloud and the second point cloud by performing the determination at a plurality of corresponding coordinates in the first point cloud and the second point cloud; and
      outputting a result of the detection.
  7.  The change detection method according to any one of claims 1 to 6, wherein
      the data of the second point cloud is data acquired by a second sensor, and
      for a second small region having no point in the second point cloud, a point is defined as not existing in the second small region when the second small region is located between a position where a point exists in the second point cloud at the time of acquiring the data and the position of the second sensor, and the existence of a point in the second small region is defined as unknown when the second small region is not located between a position where a point exists in the second point cloud and the position of the second sensor.
  8.  A change detection system comprising:
      a first feature amount calculation means for calculating a first feature amount of a first coordinate in a first point cloud using information about the presence of a point in each of the first small regions in a first region that is composed of a plurality of first small regions and includes the first coordinate;
      a second feature amount calculation means for calculating a second feature amount of a second coordinate in a second point cloud corresponding to the first coordinate, using information indicating that a point is present, that a point is not present, or that the presence of a point is unknown in each of the second small regions in a second region that is composed of a plurality of second small regions and includes the second coordinate; and
      a determination means for determining whether or not a change in the presence or absence of a point has occurred between the first coordinates and the second coordinates by using the first feature amount and the second feature amount.
  9.  The change detection system according to claim 8, wherein the first feature amount calculation means calculates the first feature amount using information indicating, for each of the first small regions, that a point is present, that a point is not present, or that the presence of a point is unknown.
  10.  The change detection system according to claim 8 or 9, wherein
      the first feature amount calculation means further calculates a feature amount of each coordinate in a first neighboring region of the first coordinate in the first point cloud, in a third region that is composed of a plurality of third small regions and includes the coordinate, using information regarding the presence of a point in each of the third small regions, and
      the determination means determines whether or not a point that does not exist at the first coordinates exists at the second coordinates by using the feature amount of each coordinate in the first neighboring region, the first feature amount, and the second feature amount.
  11.  The change detection system according to claim 10, wherein
      the data of the first point cloud is data acquired using a first sensor, and
      the first feature amount calculation means increases the size of the first neighboring region as the distance between the position indicated by the first coordinates at the time of acquiring the data of the first point cloud and the position of the first sensor increases.
  12.  The change detection system according to any one of claims 8 to 11, wherein
      the second feature amount calculation means further calculates a feature amount of each coordinate in a second neighboring region of the second coordinate in the second point cloud, in a fourth region that is composed of a plurality of fourth small regions and includes the coordinate, using information indicating, for each of the fourth small regions, that a point is present, that a point is not present, or that the presence of a point is unknown, and
      the determination means determines whether or not a point that exists at the first coordinates does not exist at the second coordinates by using the feature amount of each coordinate in the second neighboring region, the first feature amount, and the second feature amount.
  13.  The change detection system according to any one of claims 8 to 12, further comprising:
      a detection unit that detects a change in the presence or absence of an object between the first point cloud and the second point cloud by performing the determination at a plurality of corresponding coordinates in the first point cloud and the second point cloud; and
      an output unit that outputs a result of the detection.
  14.  The change detection system according to any one of claims 8 to 13, wherein
      the data of the second point cloud is data acquired by a second sensor, and
      for a second small region having no point in the second point cloud, a point is defined as not existing in the second small region when the second small region is located between a position where a point exists in the second point cloud at the time of acquiring the data and the position of the second sensor, and the existence of a point in the second small region is defined as unknown when the second small region is not located between a position where a point exists in the second point cloud and the position of the second sensor.
  15.  A change detection device comprising:
      a first feature amount calculation means for calculating a first feature amount of a first coordinate in a first point cloud using information about the presence of a point in each of the first small regions in a first region that is composed of a plurality of first small regions and includes the first coordinate;
      a second feature amount calculation means for calculating a second feature amount of a second coordinate in a second point cloud corresponding to the first coordinate, using information indicating that a point is present, that a point is not present, or that the presence of a point is unknown in each of the second small regions in a second region that is composed of a plurality of second small regions and includes the second coordinate; and
      a determination means for determining whether or not a change in the presence or absence of a point has occurred between the first coordinates and the second coordinates by using the first feature amount and the second feature amount.
  16.  The change detection device according to claim 15, wherein the first feature amount calculation means calculates the first feature amount using information indicating, for each of the first small regions, that a point is present, that a point is not present, or that the presence of a point is unknown.
  17.  The change detection device according to claim 15 or 16, wherein
     the first feature amount calculation means further calculates a feature amount of each coordinate in a first neighborhood region of the first coordinate in the first point cloud, using information on the presence of a point in each of a plurality of third small regions that constitute a third region containing that coordinate, and
     the determination means determines, using the feature amount of each coordinate in the first neighborhood region, the first feature amount, and the second feature amount, whether a point that does not exist at the first coordinate exists at the second coordinate.
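One way to picture claim 17: before declaring that a point newly appeared at the second coordinate, the second feature amount is also matched against the features of coordinates around the first coordinate; a match there suggests the structure already existed in the first point cloud and was merely displaced, for example by registration error. A sketch under that reading, assuming the feature_amount and change_occurred helpers from the previous block:

    from itertools import product

    def point_appeared(labels1, labels2, c1, c2, nbr=1, radius=1):
        """Judge, in the style of claim 17, whether a point absent at c1 is
        present at c2. If any coordinate in the first neighborhood of c1
        already matches the second feature, the point is not treated as new."""
        f2 = feature_amount(labels2, c2, radius)
        for d in product(range(-nbr, nbr + 1), repeat=3):
            c = tuple(a + b for a, b in zip(c1, d))
            if not change_occurred(feature_amount(labels1, c, radius), f2):
                return False  # matching structure exists near c1: merely displaced
        return True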
  18.  The change detection device according to claim 17, wherein
     the data of the first point cloud is data acquired using a first sensor, and
     the first feature amount calculation means enlarges the first neighborhood region as the distance between the position indicated by the first coordinate and the position of the first sensor at the time the data of the first point cloud was acquired increases.
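Claim 18 grows the first neighborhood region with sensor range, which matches how both the angular spacing of measured points and the alignment error grow with distance. A plausible rule with made-up constants:

    import numpy as np

    def neighborhood_radius(coord_pos, sensor_pos, base=1, per_meter=0.05, cap=5):
        """Neighborhood size in voxels, grown linearly with the distance from
        the first sensor to the position the coordinate denotes. The constants
        here are illustrative, not taken from the publication."""
        dist = np.linalg.norm(np.asarray(coord_pos, dtype=float) - sensor_pos)
        return min(cap, base + int(dist * per_meter))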
  19.  The change detection device according to any one of claims 15 to 18, wherein
     the second feature amount calculation means further calculates a feature amount of each coordinate in a second neighborhood region of the second coordinate in the second point cloud, using information indicating, for each of a plurality of fourth small regions that constitute a fourth region containing that coordinate, that a point exists, that no point exists, or that the presence of a point is unknown, and
     the determination means determines, using the feature amount of each coordinate in the second neighborhood region, the first feature amount, and the second feature amount, whether a point that exists at the first coordinate does not exist at the second coordinate.
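Claim 19 mirrors claim 17 in the opposite direction: a point present at the first coordinate is judged to have disappeared only if nothing around the second coordinate matches the first feature amount. Assuming the same helpers as above:

    from itertools import product

    def point_disappeared(labels1, labels2, c1, c2, nbr=1, radius=1):
        """Judge, in the style of claim 19, whether a point present at c1 is
        absent at c2: only if no coordinate in the second neighborhood of c2
        matches the first feature is the point treated as having vanished."""
        f1 = feature_amount(labels1, c1, radius)
        for d in product(range(-nbr, nbr + 1), repeat=3):
            c = tuple(a + b for a, b in zip(c2, d))
            if not change_occurred(f1, feature_amount(labels2, c, radius)):
                return False  # the structure survives near c2
        return True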
  20.  The change detection device according to any one of claims 15 to 19, further comprising:
     an extraction unit that compares the first point cloud with the second point cloud and extracts coordinates at which the presence or absence of a point differs between the first point cloud and the second point cloud,
     wherein at least one of the first coordinate and the second coordinate is a coordinate extracted by the extraction unit.
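The extraction unit of claim 20 can be read as a cheap pre-filter: only coordinates whose raw occupancy already disagrees between the two grids are handed to the feature-based determination. A minimal sketch over the labeled grids from the first block:

    import numpy as np

    UNKNOWN = -1  # same encoding as in the first sketch

    def extract_candidates(labels1, labels2):
        """Coordinates at which the presence or absence of a point differs
        between the two grids; cells UNKNOWN on either side are skipped.
        Only these candidates need the feature-based determination."""
        differs = (labels1 != labels2) & (labels1 != UNKNOWN) & (labels2 != UNKNOWN)
        return [tuple(idx) for idx in np.argwhere(differs)]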
PCT/JP2022/038830 2022-10-18 2022-10-18 Change detection method, change detection system, and change detection apparatus WO2024084601A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/038830 WO2024084601A1 (en) 2022-10-18 2022-10-18 Change detection method, change detection system, and change detection apparatus


Publications (1)

Publication Number Publication Date
WO2024084601A1 true WO2024084601A1 (en) 2024-04-25

Family

ID=90737075

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/038830 WO2024084601A1 (en) 2022-10-18 2022-10-18 Change detection method, change detection system, and change detection apparatus

Country Status (1)

Country Link
WO (1) WO2024084601A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009064287A (en) * 2007-09-07 2009-03-26 Meidensha Corp Intruder detector
JP2016217941A (en) * 2015-05-22 2016-12-22 株式会社東芝 Three-dimensional evaluation device, three-dimensional data measurement system and three-dimensional measurement method
JP2018181056A (en) * 2017-04-17 2018-11-15 富士通株式会社 Difference detection program, difference detection device, and difference detection method
JP2019046295A (en) * 2017-09-05 2019-03-22 三菱電機株式会社 Monitoring device
JP2020166516A (en) * 2019-03-29 2020-10-08 田中 成典 Point group data management system
JP2022098432A (en) * 2020-12-21 2022-07-01 コモンウェルス サイエンティフィック アンド インダストリアル リサーチ オーガナイゼーション Vehicle navigation



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22962713

Country of ref document: EP

Kind code of ref document: A1