US20200211222A1 - Work support system and work method - Google Patents

Work support system and work method

Info

Publication number
US20200211222A1
US20200211222A1 US16/814,445 US202016814445A
Authority
US
United States
Prior art keywords
dimensional data
mismatch amount
transfer
transfer object
structural
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/814,445
Inventor
Ryo MORINAGA
Sei Musha
Otoharu KUWAMURA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Corp
Original Assignee
Toshiba Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Corp filed Critical Toshiba Corp
Priority to US16/814,445
Publication of US20200211222A1
Legal status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G06F30/10 Geometric CAD
    • G06F30/13 Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T1/0007 Image acquisition
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/254 Analysis of motion involving subtraction of images
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/20 Finite element generation, e.g. wire-frame surface description, tesselation
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/04 Indexing scheme for image data processing or generation, in general involving 3D image data
    • G06T2200/08 Indexing scheme for image data processing or generation, in general involving all processing steps from image acquisition to 3D model generation
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10008 Still image; Photographic image from scanner, fax or copier
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30164 Workpiece; Machine component
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06Q10/047 Optimisation of routes or paths, e.g. travelling salesman problem
    • G06Q10/08 Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/083 Shipping

Definitions

  • Embodiments described herein relate generally to a work support system and a work method.
  • In JP-A 2014-178794 (Kokai), a system is discussed in which a laser scanner or the like is used to acquire three-dimensional data of a structural object, and a transfer path is generated for transferring building materials and/or equipment into the interior of the structural object. According to the technology discussed in JP-A 2014-178794 (Kokai), the efficiency of the work for the transferring can be increased.
  • However, in JP-A 2014-178794 (Kokai), the fixation work of the transferred building materials and/or equipment is not described; and there is still room for improvement in the work efficiency of such fixation work.
  • FIG. 1 is a block diagram illustrating a configuration of a work support system according to an embodiment
  • FIG. 2 is a flowchart illustrating an example of work and operations of the work support system according to the embodiment
  • FIGS. 3A to 3D are schematic views illustrating an example of data stored in a storage part
  • FIG. 4 is a flowchart illustrating an example of specific processing of steps S6 and S7 illustrated in FIG. 2;
  • FIG. 5 is a schematic view illustrating an example when first three-dimensional data and second three-dimensional data are overlaid.
  • FIGS. 6A to 6C are schematic views illustrating the appearance when the three-dimensional data of FIG. 5 is projected onto each plane.
  • a work support system includes an imager, a storage part, and a processor.
  • the imager is configured to image a first transfer object after the first transfer object is transferred into a structural object and arranged inside the structural object.
  • the storage part stores first three-dimensional data.
  • the first three-dimensional data includes three-dimensional data of the structural object and three-dimensional data of the first transfer object.
  • the three-dimensional data of the first transfer object is overlaid at a first position inside the structural object of the three-dimensional data.
  • the processor detects a displacement of the first transfer object of second three-dimensional data with respect to the first transfer object of the first three-dimensional data.
  • the second three-dimensional data is obtained by imaging the first transfer object arranged inside the structural object.
  • FIG. 1 is a block diagram illustrating the configuration of the work support system 100 according to the embodiment.
  • the work support system 100 includes an input part 10 , an imager 20 , a storage part 30 , a processor 40 , a displayer 50 , and a terminal 60 .
  • the work support system 100 according to the embodiment is used to support work.
  • the work includes, for example, the transfer of multiple transfer objects B into a structural object A, the connection of the multiple transfer objects B, and the fixation of the multiple transfer objects B.
  • the multiple transfer objects B include first to nth transfer objects B 1 to B n .
  • first work and second work described below are examples of such work.
  • multiple equipment for configuring a paper sheet sorting apparatus are transferred into an existing structural object.
  • the multiple equipment are installed and assembled.
  • multiple building materials such as pipes, etc., are transferred into the structural object at the construction site of the plant.
  • the multiple building materials are linked to each other while being fixed by welding, etc.
  • the input part 10 performs the input of information to the processor 40 .
  • the input part 10 is used by a user of the work support system 100 .
  • the input part 10 is, for example, a keyboard, a touch panel, a microphone (voice input), etc.
  • the imager 20 acquires three-dimensional data of a subject that is imaged.
  • the imager 20 is, for example, a three-dimensional laser scanner.
  • the imager 20 scans a laser over the surface of the subject and determines the three-dimensional coordinates of the irradiation points from the reflected light.
  • the exterior form of the subject is acquired as a collection of many points (point cloud data).
  • the imager 20 is mounted at the work site inside the structural object A.
  • the imager 20 automatically acquires the point cloud data of the internal structure of the structural object A and the point cloud data of the first to nth transfer objects B 1 to B n .
  • Multiple imagers 20 may be mounted inside the structural object A.
  • the multiple imagers 20 reduce dead angles by being mounted at mutually-different positions.
  • the point cloud data that represents the exterior forms of the objects is acquired more accurately.
  • the processor 40 compares the multiple point cloud data and overlays the parts that match each other. Thereby, one set of point cloud data is generated.
  • the storage part 30 stores the various data used in the work support system 100 .
  • the storage part 30 is, for example, a hard disk drive (HDD) built into a PC, a file server, etc.
  • HDD hard disk drive
  • the storage part 30 includes a structural object database 31 , a transfer object database 32 , a design database 33 , an imaging result database 34 , and a schedule database 35 .
  • the structural object database 31 stores the three-dimensional data of the internal structure of the structural object A where the work is performed.
  • the three-dimensional data includes the area inside the structural object A where the first to nth transfer objects B 1 to B n are fixed, and the transfer path to the area of the first to nth transfer objects B 1 to B n , inside the structural object A.
  • the structural object database 31 may further store three-dimensional data of a location where the transfer of the first to nth transfer objects B 1 to B n is performed outside the structural object A.
  • such three-dimensional data is obtained by using the imager 20 to image the structural object A prior to the work and by acquiring the point cloud data.
  • such three-dimensional CAD data may be stored in the structural object database 31 .
  • the transfer object database 32 stores the three-dimensional data (e.g., the three-dimensional CAD data) of each of the first to nth transfer objects B 1 to B n .
  • the three-dimensional data may be acquired by using the imager 20 to image each of the transfer objects prior to the work.
  • the design database 33 stores first three-dimensional data.
  • the three-dimensional data of the structural object A that is stored in the structural object database 31 and the three-dimensional data of each of the transfer objects that is stored in the transfer object database 32 are overlaid in three-dimensional coordinates.
  • the imaging result database 34 stores second three-dimensional data in the state in which the first to nth transfer objects B 1 to B n are actually arranged inside the structural object A.
  • the second three-dimensional data is obtained by using the imager 20 to image the first to nth transfer objects B 1 to B n after the first to nth transfer objects B 1 to B n are arranged and fixed inside the structural object A.
  • the imaging result database 34 may store other multiple three-dimensional data.
  • the other multiple three-dimensional data respectively illustrates states in which the first to nth transfer objects B 1 to B n are fixed.
  • the imaging result database 34 may store three-dimensional data illustrating a state after the first transfer object B 1 is fixed before the second transfer object B 2 is transferred, three-dimensional data illustrating a state after the first transfer object B 1 and the second transfer object B 2 are fixed before the third transfer object B 3 is transferred, etc.
  • the schedule database 35 stores the schedule of the work.
  • the work schedule is generated by a generally-used scheduler.
  • the work schedule includes information such as the delivery time of the work, the transfer sequence and the fixation sequence of the transfer objects, the transfer date and time of each of the transfer objects, the resources (the personnel, the machines, etc.) used in the transfer, the operation time period of each resource, etc.
  • the processor 40 is, for example, a CPU (a central processing unit) and memory included in a PC.
  • a program for causing the processor 40 to execute the various processing is stored in the memory.
  • the processor 40 executes the processing while referring to the data stored in the storage part 30 . The specific processing that is executed by the processor 40 is described below.
  • the displayer 50 is, for example, a monitor, a touch panel, etc.
  • the displayer 50 displays each database stored in the storage part 30 so that the user of the work support system 100 can confirm and edit each database.
  • the displayer 50 displays the results derived by the processor 40 so that the user can confirm the results.
  • the terminal 60 is, for example, a mobile device such as a smartphone, a tablet, etc.
  • the worker that works inside the structural object A carries the terminal 60 .
  • the processor 40 can transmit the derived results to the terminal 60 .
  • the terminal 60 functions as a receiver that receives the information transmitted from the processor 40 .
  • the worker can refer to each database of the storage part 30 by using the terminal 60 .
  • the components described above are connected to each other by a wireless network, various cables such as USB cables, LAN cables, etc., so that the necessary information can be mutually sent and received.
  • FIG. 2 is a flowchart illustrating an example of the work and the operations of the work support system 100 according to the embodiment.
  • FIGS. 3A to 3D are schematic views illustrating an example of the data stored in the storage part 30 .
  • FIG. 3A illustrates the three-dimensional data of the structural object A stored in the structural object database 31 .
  • FIG. 3B illustrates the three-dimensional data of the first to nth transfer objects B 1 to B n connected to each other. The three-dimensional data illustrated in FIG. 3B is stored in the transfer object database 32 .
  • FIG. 3C illustrates the first three-dimensional data stored in the design database 33 .
  • FIG. 3D illustrates the second three-dimensional data stored in the imaging result database 34 .
  • the three-dimensional data that is stored in the storage part 30 is illustrated schematically in two dimensions in FIG. 3A to FIG. 3D .
  • the flow illustrated in FIG. 2 mainly includes step S 4 in which the transfer and fixation of the transfer objects are performed, steps S 1 to S 3 that are performed prior to the work of step S 4 , and steps S 5 to S 7 that are performed by the work support system 100 after the work of steps S 1 to S 4 .
  • Point cloud data is generated by imaging the interior of the structural object A by using the imager 20 . Thereby, the three-dimensional data of the internal structure of the structural object A is acquired. In the case where three-dimensional CAD data or the like of the structural object A already exists, the three-dimensional CAD data may be utilized; and step S 1 may be omitted.
  • By step S1, the three-dimensional data of the internal structure of the structural object A is prepared as illustrated in FIG. 3A.
  • the three-dimensional data of the first to nth transfer objects B1 to Bn is acquired.
  • In the case where three-dimensional CAD data or the like of the first to nth transfer objects B1 to Bn exists, such data may be utilized; and step S2 may be omitted.
  • point cloud data that reflects the exterior forms of these transfer objects is generated by using a laser scanner to image each of the first to nth transfer objects B 1 to B n . Thereby, the three-dimensional data of the first to nth transfer objects B 1 to B n is acquired.
  • By step S2, the three-dimensional data of the first to nth transfer objects B1 to Bn is prepared as illustrated in FIG. 3B.
  • the three-dimensional data of the first to nth transfer objects B 1 to B n prepared in step S 2 is overlaid in three-dimensional coordinates on the three-dimensional data of the structural object A prepared in step S 1 .
  • the three-dimensional data of the first to nth transfer objects B 1 to B n is overlaid at a first position of the structural object A of the three-dimensional data.
  • the first position corresponds to a second position where each of the transfer objects are fixed inside the structural object A in the actual work.
  • the first three-dimensional data is generated by step S 3 .
  • the three-dimensional data of the first to nth transfer objects B 1 to B n is arranged at the first position inside the structural object A of the three-dimensional data as illustrated in FIG. 3C .
  • the user may use the first three-dimensional data to investigate and generate the transfer paths when transferring each of the transfer objects to the fixation locations inside the structural object A. For example, the user performs the investigation and the generation of the transfer paths while moving each of the first to nth transfer objects B 1 to B n inside the structural object A in the first three-dimensional data.
  • the transfer paths can be investigated while checking for the existence of interference between the structural object A and each of the transfer objects when transferring.
  • the work support system 100 may be configured so that the processor 40 executes the following operations.
  • the processor 40 automatically derives the transfer path of each of the transfer objects based on information such as the fixation locations and the coordinates of the first to nth transfer objects B1 to Bn, the coordinates of the transfer entrance of the structural object A, etc.
  • the processor 40 may refer to the schedule database 35 .
  • the processor 40 may derive the transfer path of each of the transfer objects while considering the transfer sequence of the first to nth transfer objects B 1 to B n .
  • the processor 40 may cause the displayer 50 to display a video image of each of the first to nth transfer objects B 1 to B n , moving along the transfer paths set in the three-dimensional data of the structural object A. Using the first three-dimensional data, the processor 40 may cause the displayer 50 to display a state in which the first to nth transfer objects B 1 to B n are arranged at the first position inside the structural object A.
  • the states when transferring and after transferring can be ascertained in specific detail prior to actually transferring the first to nth transfer objects B 1 to B n . It is possible to perform the work more smoothly. Such information may be presented at the delivery locations of the transfer objects prior to the work.
  • the appearance of the work and the state after the delivery can be shared with the customer in more specific detail.
  • the point cloud data of the internal structure of the structural object A may include information relating to the detailed state of the floor surface inside the structural object A.
  • the worker can confirm the detailed state of the floor surface inside the structural object A from the three-dimensional data stored in the structural object database 31 .
  • the worker may perform the following work in the first three-dimensional data.
  • the worker may modify the fixation position of the first to nth transfer objects B 1 to B n to avoid parts of the floor surface where the unevenness is large.
  • the worker may adjust the height of members interposed between the floor surface and each of the transfer objects to reduce the tilt with respect to the horizontal direction of each of the transfer objects.
  • the first to nth transfer objects B 1 to B n are actually transferred into the structural object A and fixed. At this time, the first to nth transfer objects B 1 to B n are fixed at the predetermined first position inside the structural object A. In other words, the first to nth transfer objects B 1 to B n are fixed so that the position after the fixation of the first to nth transfer objects B 1 to B n matches the position where the first to nth transfer objects B 1 to B n are arranged in the first three-dimensional data.
  • the first to nth transfer objects B 1 to B n that are fixed are imaged by the imager 20 .
  • By step S5, the three-dimensional data (the second three-dimensional data) of the first to nth transfer objects B1 to Bn actually fixed inside the structural object A is acquired as illustrated in FIG. 3D.
  • the second three-dimensional data is stored in the storage part 30 .
  • In step S5, the three-dimensional data of the first to nth transfer objects B1 to Bn and the entire interior of the structural object A may be acquired as in FIG. 3C.
  • the processor 40 compares the first three-dimensional data obtained in step S 3 and the second three-dimensional data obtained in step S 5 . Thereby, the displacement of the first to nth transfer objects B 1 to B n of the second three-dimensional data with respect to the first to nth transfer objects B 1 to B n of the first three-dimensional data is detected.
  • the processor 40 overlays the first three-dimensional data and the second three-dimensional data in the same three-dimensional coordinate system. Thereby, how much the position of the first to nth transfer objects B 1 to B n is displaced between the first three-dimensional data and the second three-dimensional data is detected. In other words, the processor 40 detects, for the first to nth transfer objects B 1 to B n , the degree of the difference of the position when actually fixed with respect to the position designed prior to the transfer.
  • the processor 40 causes the displayer 50 to display the detection results.
  • the processor 40 transmits the detection results toward the terminal 60 .
  • the terminal 60 displays the received results on a screen.
  • the worker confirms the detection results displayed by the terminal 60 . Thereby, the worker can easily confirm whether or not the position of the fixed first to nth transfer objects B 1 to B n matches the position designed beforehand.
  • By steps S5 to S7 described above being performed by the work support system 100 according to the embodiment, it can be detected automatically whether or not the position of the first to nth transfer objects B1 to Bn after the fixation matches the position designed beforehand.
  • Therefore, according to the work support system 100, it is unnecessary for the worker to measure and confirm the position of the first to nth transfer objects B1 to Bn after the fixation; and the work efficiency can be increased.
  • By using the work support system 100 according to the embodiment, detection results that do not depend on the surveying skill of the worker are obtained. Therefore, the existence and degree of the displacement can be verified with higher precision. For example, by detecting the displacement with high precision, the worker can refix each of the transfer objects based on the detection results. Therefore, the displacement of the final fixation position of each of the transfer objects with respect to the design position can be small; and the quality of the work can be improved.
  • FIG. 4 is a flowchart illustrating an example of the specific processing of steps S 6 and S 7 illustrated in FIG. 2 .
  • FIG. 5 is a schematic view illustrating an example when the first three-dimensional data and the second three-dimensional data are overlaid.
  • FIGS. 6A to 6C are schematic views illustrating the appearance when the three-dimensional data of FIG. 5 is projected onto each plane.
  • In the example illustrated in FIG. 4, steps S61 to S64 are performed in step S6; and step S71 or S72 is performed in step S7.
  • Similarly to step S6 described above, the first three-dimensional data and the second three-dimensional data are overlaid in the same three-dimensional coordinate system in step S61.
  • FIG. 5 illustrates the schematic appearance when overlaid.
  • the transfer object B that is included in the first three-dimensional data is illustrated by the broken lines; and the transfer object B that is included in the second three-dimensional data is illustrated by the solid lines.
  • At least the second three-dimensional data is point cloud data; but FIG. 5 schematically illustrates the outer edges of the point cloud data using straight lines.
  • the transfer object B of the first three-dimensional data and the transfer object B of the second three-dimensional data after the overlaying are projected onto planes (the X-Y plane, the Y-Z plane, and the X-Z plane) perpendicular to the directions in the three-dimensional coordinates.
  • The appearance when the three-dimensional data illustrated in FIG. 5 is projected onto each plane is illustrated in FIGS. 6A to 6C.
  • FIG. 6A illustrates the projection of the X-Y plane.
  • FIG. 6B illustrates the projection of the Y-Z plane.
  • FIG. 6C illustrates the projection of the X-Z plane.
  • the transfer object B of the first three-dimensional data is illustrated by the broken lines; and the transfer object B of the second three-dimensional data is illustrated by the solid lines.
  • the part where the density of the dots is high illustrates the part where the transfer objects B overlap (match) between the first three-dimensional data and the second three-dimensional data.
  • the parts where the density of the dots is low illustrate the parts where the transfer objects B do not overlap (do not match) between the first three-dimensional data and the second three-dimensional data.
  • the processor 40 calculates the mismatch amount of the three-dimensional data of the transfer objects B between the first three-dimensional data and the second three-dimensional data for each of the projection planes as illustrated in FIG. 6A to FIG. 6C .
  • the first three-dimensional data includes, for example, the three-dimensional CAD data of the transfer object B.
  • the proportion of the three-dimensional CAD data and the point cloud data overlapping in the projection plane is calculated.
  • in the three-dimensional CAD data, a region that is separated from any point included in the point cloud data by a prescribed distance or more is determined to be mismatched.
  • points that do not overlap the three-dimensional CAD data are determined to be mismatched.
  • the irradiation angle pitch of the laser beam irradiated from the imager 20 corresponds to a point spacing of 3 millimeters at a distance of 10 meters when the point cloud data is generated by the imager 20.
  • the prescribed distance is set to 10 millimeters.
  • the processor 40 detects a first mismatch amount in the X-direction (the first direction), a second mismatch amount in the Y-direction (the second direction), and a third mismatch amount in the Z-direction (the third direction) for the transfer object B. For example, the processor 40 calculates the number of points determined to be mismatched as the first mismatch amount, the second mismatch amount, and the third mismatch amount respectively for the multiple projection planes. Or, the processor 40 may calculate the surface area of the three-dimensional CAD data determined to be mismatched as the first mismatch amount, the second mismatch amount, and the third mismatch amount respectively for the multiple projection planes.
  • the processor 40 may calculate the proportion of the number of points determined to be mismatched with respect to the number of points included in the point cloud data of the transfer object B as the first mismatch amount, the second mismatch amount, and the third mismatch amount respectively for the multiple projection planes.
  • the processor 40 may calculate the proportion of the surface area determined to be mismatched with respect to the surface area of the three-dimensional CAD data as the first mismatch amount, the second mismatch amount, and the third mismatch amount respectively for the multiple projection planes.
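  • As a concrete illustration of the per-plane mismatch calculation described above, the sketch below projects the overlaid data onto the X-Y, Y-Z, and X-Z planes and counts the measured points lying farther than the prescribed distance (10 millimeters in the example above) from any design point. The function and parameter names are hypothetical, and giving each projection plane a single count is a simplification of the first/second/third mismatch amounts named in the text.

```python
import numpy as np
from scipy.spatial import cKDTree

# Axis pairs kept by each projection plane: X-Y, Y-Z, and X-Z.
PROJECTIONS = {"X-Y": (0, 1), "Y-Z": (1, 2), "X-Z": (0, 2)}

def mismatch_amounts(design_pts, measured_pts, prescribed_distance=0.010):
    """Per projection plane, count measured points farther than the prescribed
    distance from any design point.

    design_pts, measured_pts : (N, 3) arrays in the same site coordinates
    prescribed_distance      : same units as the coordinates (here assumed
                               meters, so 0.010 corresponds to 10 mm)
    """
    amounts = {}
    for label, axes in PROJECTIONS.items():
        design_2d = design_pts[:, axes]
        measured_2d = measured_pts[:, axes]
        # Distance from each projected measured point to its nearest design point.
        distances, _ = cKDTree(design_2d).query(measured_2d)
        amounts[label] = int(np.count_nonzero(distances > prescribed_distance))
    return amounts
```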
  • the first three-dimensional data and the second three-dimensional data may include information of the colors of each of the transfer objects B. In such a case, the match or the mismatch may be determined further based on the information of the color of each point.
  • the processor 40 may extract characteristic parts from each of the transfer objects B of the first three-dimensional data and the second three-dimensional data for each projection plane. In such a case, the processor 40 may calculate the distances between the characteristic parts as the first mismatch amount, the second mismatch amount, and the third mismatch amount respectively for the multiple projection planes.
  • the processor 40 calculates the mismatch amount for each mutually-orthogonal direction (each projection plane). Thereby, it can be detected how much and in which direction the fixed transfer object B is displaced with respect to the pre-designed position.
  • the mismatch amount also can be calculated for any projection plane by projecting the overlaid first three-dimensional data and second three-dimensional data in any direction.
  • the proportion or the number of the points or the surface area determined to be mismatched is calculated as the mismatch amount.
  • the proportion or the number of the points or the surface area determined to match may be calculated as the mismatch amount. Even in such a case, it can be considered that substantially the mismatch amount is calculated.
  • the processor 40 may execute the following method.
  • the processor 40 moves and rotates the transfer object B of one of the first three-dimensional data or the second three-dimensional data for each projection plane. At this time, the processor 40 moves the transfer object B so that the mismatch amount of the transfer object B between the first three-dimensional data and the second three-dimensional data is a minimum.
  • the processor 40 stores, in the storage part 30 , the result of the calculated movement distance and rotation angle in each direction.
  • the movement direction, the movement distance, and the rotation angle of the transfer object B necessary to minimize the mismatch amount for each projection plane are calculated.
  • This method is effective for work in which the multiple transfer objects are sequentially transferred and fixed. In such work, if the fixation position of one transfer object is rotated with respect to the predetermined position, the amount of the displacement (the shift of the position) for the other transfer objects fixed to the one transfer object increases according to the number of transfer objects to be fixed. Accordingly, in the case where the work support system 100 is used in such work, it is desirable for the processor 40 to calculate the rotation angle for minimizing the mismatch amount.
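  • A minimal sketch of the movement-and-rotation search described above is given below, for one in-plane (2D) projection: a general-purpose optimizer looks for the translation and rotation angle that minimize a nearest-neighbor mismatch measure. The patent does not prescribe a particular optimization method, so this is an assumption, and all names are illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial import cKDTree

def best_shift_and_rotation(design_2d: np.ndarray, measured_2d: np.ndarray):
    """Find the in-plane translation (dx, dy) and rotation angle that minimize
    the mismatch between the measured and design projections of a transfer
    object B. The result tells how far, and about which angle, the fixed object
    would have to be moved back to match the design."""
    tree = cKDTree(design_2d)

    def mismatch(params):
        dx, dy, theta = params
        c, s = np.cos(theta), np.sin(theta)
        # Rotate then translate the measured projection, score by mean distance
        # to the nearest design point.
        moved = measured_2d @ np.array([[c, -s], [s, c]]).T + np.array([dx, dy])
        distances, _ = tree.query(moved)
        return distances.mean()

    result = minimize(mismatch, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
    dx, dy, theta = result.x
    return {"dx": dx, "dy": dy, "rotation_rad": theta, "residual": result.fun}
```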
  • a preset threshold for the mismatch amount detected in step S 63 is stored in the storage part 30 .
  • the processor 40 compares the mismatch amount to the threshold for each projection plane. For example, the mismatch amount that can be tolerated in each direction is set as the threshold. In the case where the mismatch amount is less than the threshold for each projection plane, the flow proceeds to step S71.
  • Otherwise, the flow proceeds to step S72.
  • the processor 40 causes the displayer 50 to display or notifies the terminal 60 that the mismatch amount in each direction is less than the threshold and within the tolerance range for the fixed first to nth transfer objects B 1 to B n . At this time, the mismatch amount in each direction may be displayed by the displayer 50 and notified to the terminal 60 .
  • the processor 40 causes the displayer 50 to display or notifies the terminal 60 that there is a direction in which the mismatch amount is not less than the threshold for the fixed first to nth transfer objects B 1 to B n . Also, the processor 40 causes the displayer 50 to display or notifies the terminal 60 of the direction in which the mismatch amount is not less than the threshold and the mismatch amount of the direction. Further, the mismatch amount may be displayed by the displayer 50 or notified to the terminal 60 for the direction in which the mismatch amount is less than the threshold.
  • the processor 40 may convert and display the mismatch amount detected in each direction into numerical values using units such as millimeters, centimeters, etc. Thereby, the user of the work support system 100 can easily ascertain the mismatch amount intuitively.
  • In steps S71 and S72, there may be no display or notification for the directions in which the mismatch amount is less than the threshold.
  • the processor 40 may not display or notify the results in step S 71 and may display only the direction in which the mismatch amount is not less than the threshold and the mismatch amount of the direction in step S 72 .
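  • The threshold comparison and notification of steps S71 and S72 could look roughly like the following sketch, which assumes the mismatch amounts have already been converted to millimeters per direction as suggested above; the function name and message wording are hypothetical.

```python
def mismatch_report(mismatch_mm: dict, threshold_mm: dict) -> str:
    """Compare per-direction mismatch amounts (in millimeters) against
    per-direction thresholds and build a message for the displayer/terminal."""
    over = {d: v for d, v in mismatch_mm.items() if v >= threshold_mm[d]}
    if not over:
        lines = ["All mismatch amounts are within tolerance (step S71)."]
    else:
        lines = ["Mismatch exceeds tolerance (step S72) in: "
                 + ", ".join(f"{d} ({v:.1f} mm >= {threshold_mm[d]:.1f} mm)"
                             for d, v in over.items())]
    # Also list every direction so the worker sees the full picture.
    lines += [f"{d}: {v:.1f} mm (threshold {threshold_mm[d]:.1f} mm)"
              for d, v in mismatch_mm.items()]
    return "\n".join(lines)

# Example: X and Z within tolerance, Y exceeds it.
print(mismatch_report({"X": 4.0, "Y": 12.5, "Z": 1.0},
                      {"X": 10.0, "Y": 10.0, "Z": 10.0}))
```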
  • the work support system 100 is applicable also when the transfer and fixation of one transfer object is performed.
  • the method illustrated in the flowchart of FIG. 2 and FIG. 4 is applicable even in the case where only the first transfer object B 1 is transferred and fixed inside the structural object A.
  • In the example described above, the case is described where the first to nth transfer objects B1 to Bn are imaged and the displacement of the first to nth transfer objects B1 to Bn is detected after the first to nth transfer objects B1 to Bn are fixed.
  • the displacement of these transfer objects may be detected after the first to nth transfer objects B 1 to B n are arranged inside the structural object A prior to the fixation. The displacement can be corrected more easily by detecting the displacement prior to fixing each of the transfer objects.
  • the work support system 100 may execute steps S 5 to S 7 of FIG. 2 each time one transfer object is transferred and arranged inside the structural object A.
  • the work support system 100 according to the embodiment executes steps S5 to S7 after the first transfer object B1 is transferred and arranged, prior to performing the fixation of the second transfer object B2 and the connection with the first transfer object B1.
  • the displacement of the first transfer object B 1 can be detected and corrected prior to the fixation and the connection of the second transfer object B 2 . Therefore, the time necessary for the work can be shorter than in the case of correcting the displacement after all of the transfer objects are fixed and connected.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Geometry (AREA)
  • Computer Hardware Design (AREA)
  • Human Resources & Organizations (AREA)
  • Economics (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Strategic Management (AREA)
  • Multimedia (AREA)
  • Marketing (AREA)
  • Development Economics (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Civil Engineering (AREA)
  • Architecture (AREA)
  • Game Theory and Decision Science (AREA)
  • Structural Engineering (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Processing Or Creating Images (AREA)
  • General Factory Administration (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

According to one embodiment, a work support system includes an imager, a storage part, and a processor. The imager is configured to image a first transfer object after the first transfer object is transferred into a structural object and arranged inside the structural object. The storage part stores first three-dimensional data. The first three-dimensional data includes three-dimensional data of the structural object and three-dimensional data of the first transfer object. The three-dimensional data of the first transfer object is overlaid at a first position inside the structural object of the three-dimensional data. The processor detects a displacement of the first transfer object of second three-dimensional data with respect to the first transfer object of the first three-dimensional data. The second three-dimensional data is obtained by imaging the first transfer object arranged inside the structural object.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2016-218249, filed on Nov. 8, 2016; the entire contents of which are incorporated herein by reference.
  • FIELD
  • Embodiments described herein relate generally to a work support system and a work method.
  • BACKGROUND
  • In JP-A 2014-178794 (Kokai), a system is discussed in which a laser scanner or the like is used to acquire three-dimensional data of a structural object, and a transfer path is generated for transferring building materials and/or equipment into the interior of the structural object. According to the technology discussed in JP-A 2014-178794 (Kokai), the efficiency of the work for the transferring can be increased.
  • However, in JP-A 2014-178794 (Kokai), the fixation work of the transferred building materials and/or equipment is not described; and there is still room for improvement for the work efficiency of such fixation work.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram illustrating a configuration of a work support system according to an embodiment;
  • FIG. 2 is a flowchart illustrating an example of work and operations of the work support system according to the embodiment;
  • FIGS. 3A to 3D are schematic views illustrating an example of data stored in a storage part;
  • FIG. 4 is a flowchart illustrating an example of specific processing of steps S6 and S7 illustrated in FIG. 2;
  • FIG. 5 is a schematic view illustrating an example when first three-dimensional data and second three-dimensional data are overlaid; and
  • FIGS. 6A to 6C are schematic views illustrating the appearance when the three-dimensional data of FIG. 5 is projected onto each plane.
  • DETAILED DESCRIPTION
  • According to one embodiment, a work support system includes an imager, a storage part, and a processor. The imager is configured to image a first transfer object after the first transfer object is transferred into a structural object and arranged inside the structural object. The storage part stores first three-dimensional data. The first three-dimensional data includes three-dimensional data of the structural object and three-dimensional data of the first transfer object. The three-dimensional data of the first transfer object is overlaid at a first position inside the structural object of the three-dimensional data. The processor detects a displacement of the first transfer object of second three-dimensional data with respect to the first transfer object of the first three-dimensional data. The second three-dimensional data is obtained by imaging the first transfer object arranged inside the structural object.
  • Embodiments of the invention will now be described with reference to the drawings.
  • In the drawings and the specification of the application, components similar to those described thereinabove are marked with like reference numerals, and a detailed description is omitted as appropriate.
  • FIG. 1 is a block diagram illustrating the configuration of the work support system 100 according to the embodiment.
  • As illustrated in FIG. 1, the work support system 100 includes an input part 10, an imager 20, a storage part 30, a processor 40, a displayer 50, and a terminal 60. For example, the work support system 100 according to the embodiment is used to support work. The work includes, for example, the transfer of multiple transfer objects B into a structural object A, the connection of the multiple transfer objects B, and the fixation of the multiple transfer objects B. The multiple transfer objects B include first to nth transfer objects B1 to Bn.
  • For example, first work and second work described below are examples of such work.
  • In the first work, multiple equipment for configuring a paper sheet sorting apparatus are transferred into an existing structural object. The multiple equipment are installed and assembled.
  • In the second work, multiple building materials such as pipes, etc., are transferred into the structural object at the construction site of the plant. The multiple building materials are linked to each other while being fixed by welding, etc.
  • The input part 10 performs the input of information to the processor 40. For example, the input part 10 is used by a user of the work support system 100. The input part 10 is, for example, a keyboard, a touch panel, a microphone (voice input), etc.
  • The imager 20 acquires three-dimensional data of a subject that is imaged. The imager 20 is, for example, a three-dimensional laser scanner. The imager 20 scans a laser over the surface of the subject and determines the three-dimensional coordinates of the irradiation points from the reflected light. The exterior form of the subject is acquired as a collection of many points (point cloud data). For example, the imager 20 is mounted at the work site inside the structural object A. The imager 20 automatically acquires the point cloud data of the internal structure of the structural object A and the point cloud data of the first to nth transfer objects B1 to Bn.
  • Multiple imagers 20 may be mounted inside the structural object A. The multiple imagers 20 reduce dead angles by being mounted at mutually-different positions. The point cloud data that represents the exterior forms of the objects is acquired more accurately. In the case where the multiple imagers 20 are mounted inside the structural object A, the processor 40 compares the multiple point cloud data and overlays the parts that match each other. Thereby, one set of point cloud data is generated. The storage part 30 stores the various data used in the work support system 100. The storage part 30 is, for example, a hard disk drive (HDD) built into a PC, a file server, etc.
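  • A minimal sketch of combining the point clouds from multiple imagers 20 into one data set is shown below. It assumes the relative mounting pose of each scanner is already known, whereas the text describes overlaying the parts of the clouds that match each other, so this is a simplified stand-in rather than the patent's own procedure; the names and the voxel-based thinning are illustrative.

```python
import numpy as np

def transform_points(points: np.ndarray, pose: np.ndarray) -> np.ndarray:
    """Apply a 4x4 rigid transform (scanner pose) to an (N, 3) point array."""
    homogeneous = np.hstack([points, np.ones((len(points), 1))])
    return (homogeneous @ pose.T)[:, :3]

def merge_scans(scans, poses, voxel_size=0.005):
    """Merge point clouds from several scanners into one cloud.

    scans : list of (N_i, 3) arrays in each scanner's local frame
    poses : list of 4x4 transforms from each scanner frame to the site frame
    Near-duplicate points in overlapping regions are thinned with a voxel grid.
    """
    merged = np.vstack([transform_points(s, p) for s, p in zip(scans, poses)])
    # Keep one point per occupied voxel to remove duplicates in the overlaps.
    keys = np.floor(merged / voxel_size).astype(np.int64)
    _, unique_idx = np.unique(keys, axis=0, return_index=True)
    return merged[np.sort(unique_idx)]
```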
  • The storage part 30 includes a structural object database 31, a transfer object database 32, a design database 33, an imaging result database 34, and a schedule database 35.
  • The structural object database 31 stores the three-dimensional data of the internal structure of the structural object A where the work is performed. The three-dimensional data includes the area inside the structural object A where the first to nth transfer objects B1 to Bn are fixed, and the transfer path to the area of the first to nth transfer objects B1 to Bn, inside the structural object A. The structural object database 31 may further store three-dimensional data of a location where the transfer of the first to nth transfer objects B1 to Bn is performed outside the structural object A. For example, such three-dimensional data is obtained by using the imager 20 to image the structural object A prior to the work and by acquiring the point cloud data. Or, in the case where three-dimensional CAD data of the structural object A and the location where the transfer is performed exists, such three-dimensional CAD data may be stored in the structural object database 31.
  • The transfer object database 32 stores the three-dimensional data (e.g., the three-dimensional CAD data) of each of the first to nth transfer objects B1 to Bn. In the case where there is no three-dimensional CAD data of these transfer objects, the three-dimensional data may be acquired by using the imager 20 to image each of the transfer objects prior to the work.
  • The design database 33 stores first three-dimensional data. In the first three-dimensional data, the three-dimensional data of the structural object A that is stored in the structural object database 31 and the three-dimensional data of each of the transfer objects that is stored in the transfer object database 32 are overlaid in three-dimensional coordinates.
  • The imaging result database 34 stores second three-dimensional data in the state in which the first to nth transfer objects B1 to Bn are actually arranged inside the structural object A. The second three-dimensional data is obtained by using the imager 20 to image the first to nth transfer objects B1 to Bn after the first to nth transfer objects B1 to Bn are arranged and fixed inside the structural object A.
  • The imaging result database 34 may store other multiple three-dimensional data. The other multiple three-dimensional data respectively illustrates states in which the first to nth transfer objects B1 to Bn are fixed. For example, the imaging result database 34 may store three-dimensional data illustrating a state after the first transfer object B1 is fixed before the second transfer object B2 is transferred, three-dimensional data illustrating a state after the first transfer object B1 and the second transfer object B2 are fixed before the third transfer object B3 is transferred, etc.
  • The schedule database 35 stores the schedule of the work. For example, the work schedule is generated by a generally-used scheduler. The work schedule includes information such as the delivery time of the work, the transfer sequence and the fixation sequence of the transfer objects, the transfer date and time of each of the transfer objects, the resources (the personnel, the machines, etc.) used in the transfer, the operation time period of each resource, etc. The processor 40 is, for example, a CPU (a central processing unit) and memory included in a PC. A program for causing the processor 40 to execute the various processing is stored in the memory. The processor 40 executes the processing while referring to the data stored in the storage part 30. The specific processing that is executed by the processor 40 is described below.
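  • Purely as an illustration of the data organization described above, the storage part 30 and its five databases could be modeled along the following lines; the class and field names are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional
import numpy as np

@dataclass
class StoragePart:
    """Illustrative stand-in for the storage part 30 and its five databases."""
    structural_object_db: Optional[np.ndarray] = None   # 3D data of structural object A (e.g. point cloud)
    transfer_object_db: Dict[str, np.ndarray] = field(default_factory=dict)  # 3D data of B1..Bn by name
    design_db: Optional[np.ndarray] = None               # first three-dimensional data
    imaging_result_db: List[np.ndarray] = field(default_factory=list)        # second three-dimensional data per scan
    schedule_db: dict = field(default_factory=dict)       # delivery time, sequences, resources
```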
  • The displayer 50 is, for example, a monitor, a touch panel, etc. The displayer 50 displays each database stored in the storage part 30 so that the user of the work support system 100 can confirm and edit each database. The displayer 50 displays the results derived by the processor 40 so that the user can confirm the results.
  • The terminal 60 is, for example, a mobile device such as a smartphone, a tablet, etc. The worker that works inside the structural object A carries the terminal 60. The processor 40 can transmit the derived results to the terminal 60. In other words, the terminal 60 functions as a receiver that receives the information transmitted from the processor 40. The worker can refer to each database of the storage part 30 by using the terminal 60.
  • The components described above are connected to each other by a wireless network, various cables such as USB cables, LAN cables, etc., so that the necessary information can be mutually sent and received.
  • An example of the work and operations of the work support system 100 according to the embodiment will now be described with reference to FIG. 2 and FIGS. 3A to 3D.
  • FIG. 2 is a flowchart illustrating an example of the work and the operations of the work support system 100 according to the embodiment.
  • FIGS. 3A to 3D are schematic views illustrating an example of the data stored in the storage part 30.
  • FIG. 3A illustrates the three-dimensional data of the structural object A stored in the structural object database 31. FIG. 3B illustrates the three-dimensional data of the first to nth transfer objects B1 to Bn connected to each other. The three-dimensional data illustrated in FIG. 3B is stored in the transfer object database 32. FIG. 3C illustrates the first three-dimensional data stored in the design database 33. FIG. 3D illustrates the second three-dimensional data stored in the imaging result database 34.
  • The three-dimensional data that is stored in the storage part 30 is illustrated schematically in two dimensions in FIG. 3A to FIG. 3D.
  • The flow illustrated in FIG. 2 mainly includes step S4 in which the transfer and fixation of the transfer objects are performed, steps S1 to S3 that are performed prior to the work of step S4, and steps S5 to S7 that are performed by the work support system 100 after the work of steps S1 to S4.
  • Step S1
  • Point cloud data is generated by imaging the interior of the structural object A by using the imager 20. Thereby, the three-dimensional data of the internal structure of the structural object A is acquired. In the case where three-dimensional CAD data or the like of the structural object A already exists, the three-dimensional CAD data may be utilized; and step S1 may be omitted.
  • By step S1, the three-dimensional data of the internal structure of the structural object A is prepared as illustrated in FIG. 3A.
  • Step S2
  • The three-dimensional data of the first to nth transfer objects B1 to Bn is acquired. In the case where three-dimensional CAD data or the like of the first to nth transfer objects B1 to Bn exists, such data may be utilized; and step S2 may be omitted. In the case where there is no three-dimensional CAD data or the like, for example, point cloud data that reflects the exterior forms of these transfer objects is generated by using a laser scanner to image each of the first to nth transfer objects B1 to Bn. Thereby, the three-dimensional data of the first to nth transfer objects B1 to Bn is acquired.
  • By step S2, the three-dimensional data of the first to nth transfer objects B1 to Bn is prepared as illustrated in FIG. 3B.
  • Step S3
  • The three-dimensional data of the first to nth transfer objects B1 to Bn prepared in step S2 is overlaid in three-dimensional coordinates on the three-dimensional data of the structural object A prepared in step S1. The three-dimensional data of the first to nth transfer objects B1 to Bn is overlaid at a first position of the structural object A of the three-dimensional data. The first position corresponds to a second position where each of the transfer objects are fixed inside the structural object A in the actual work.
  • The first three-dimensional data is generated by step S3. In the first three-dimensional data, the three-dimensional data of the first to nth transfer objects B1 to Bn is arranged at the first position inside the structural object A of the three-dimensional data as illustrated in FIG. 3C.
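  • The generation of the first three-dimensional data in step S3 amounts to placing each transfer object's model at its designed pose in the coordinates of the structural object A. A minimal sketch is given below; it assumes a rotation about the vertical axis plus a translation is enough to describe each first position, and all names are illustrative.

```python
import numpy as np

def rotation_z(angle_rad: float) -> np.ndarray:
    """Rotation about the vertical (Z) axis, enough for floor-mounted equipment."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def build_first_3d_data(structure_pts, object_models, design_poses):
    """Overlay each transfer object model at its designed first position.

    structure_pts : (N, 3) points of the structural object A
    object_models : dict name -> (M, 3) points (or sampled CAD surfaces) of B1..Bn
    design_poses  : dict name -> (rotation_z_angle, translation_xyz) of the first position
    """
    placed = [structure_pts]
    for name, model in object_models.items():
        angle, translation = design_poses[name]
        placed.append(model @ rotation_z(angle).T + np.asarray(translation))
    return np.vstack(placed)
```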
  • After generating the first three-dimensional data, the user may use the first three-dimensional data to investigate and generate the transfer paths when transferring each of the transfer objects to the fixation locations inside the structural object A. For example, the user performs the investigation and the generation of the transfer paths while moving each of the first to nth transfer objects B1 to Bn inside the structural object A in the first three-dimensional data. By using the first three-dimensional data, the transfer paths can be investigated while checking for the existence of interference between the structural object A and each of the transfer objects when transferring.
  • Or, the work support system 100 may be configured so that the processor 40 executes the following operations. For example, the processor 40 automatically derives the transfer path of each of the transfer objects based on information such as the fixation locations and the coordinates of the first to nth transfer objects B1 to Bn, the coordinates of the transfer entrance of the structural object A, etc. At this time, the processor 40 may refer to the schedule database 35. The processor 40 may derive the transfer path of each of the transfer objects while considering the transfer sequence of the first to nth transfer objects B1 to Bn.
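  • The patent does not specify how the processor 40 derives a transfer path, so the sketch below shows one simple possibility: a breadth-first search over a 2D occupancy grid of the floor, from the transfer entrance to the fixation location. Object footprint, clearances, and the transfer sequence are ignored for brevity, and the names are hypothetical.

```python
from collections import deque
import numpy as np

def plan_transfer_path(occupied: np.ndarray, start, goal):
    """Breadth-first search on a 2D occupancy grid of the floor of structural object A.

    occupied : boolean array, True where walls, columns, or already fixed
               transfer objects block the transfer
    start    : (row, col) cell of the transfer entrance
    goal     : (row, col) cell of the fixation location
    Returns a list of grid cells from start to goal, or None if no path exists.
    """
    rows, cols = occupied.shape
    parents = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and not occupied[nr, nc] \
                    and (nr, nc) not in parents:
                parents[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```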
  • The processor 40 may cause the displayer 50 to display a video image of each of the first to nth transfer objects B1 to Bn, moving along the transfer paths set in the three-dimensional data of the structural object A. Using the first three-dimensional data, the processor 40 may cause the displayer 50 to display a state in which the first to nth transfer objects B1 to Bn are arranged at the first position inside the structural object A.
  • By causing the displayer 50 to display such information, the states when transferring and after transferring can be ascertained in specific detail prior to actually transferring the first to nth transfer objects B1 to Bn. It is possible to perform the work more smoothly. Such information may be presented at the delivery locations of the transfer objects prior to the work.
  • The appearance of the work and the state after the delivery can be shared with the customer in more specific detail.
  • In the case where the point cloud data of the internal structure of the structural object A is acquired in step S1, the point cloud data may include information relating to the detailed state of the floor surface inside the structural object A. In such a case, the worker can confirm the detailed state of the floor surface inside the structural object A from the three-dimensional data stored in the structural object database 31. At this time, the worker may perform the following work in the first three-dimensional data. The worker may modify the fixation position of the first to nth transfer objects B1 to Bn to avoid parts of the floor surface where the unevenness is large. The worker may adjust the height of members interposed between the floor surface and each of the transfer objects to reduce the tilt with respect to the horizontal direction of each of the transfer objects.
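  • One way the detailed state of the floor surface could be quantified from the point cloud is sketched below: fit a plane to the floor points by least squares and flag points whose deviation exceeds a tolerance. This is an assumption for illustration; the patent only says the point cloud may include such information.

```python
import numpy as np

def floor_unevenness(floor_pts: np.ndarray, threshold: float = 0.01):
    """Flag floor points deviating from a least-squares plane by more than `threshold`.

    floor_pts : (N, 3) points sampled on the floor surface inside structural object A
    threshold : tolerance in the same units as the coordinates (here assumed meters)
    Returns (deviation per point, boolean mask of points exceeding the threshold).
    """
    # Fit z = a*x + b*y + c by least squares.
    A = np.column_stack([floor_pts[:, 0], floor_pts[:, 1], np.ones(len(floor_pts))])
    coeffs, *_ = np.linalg.lstsq(A, floor_pts[:, 2], rcond=None)
    deviation = floor_pts[:, 2] - A @ coeffs
    return deviation, np.abs(deviation) > threshold
```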
  • Step S4
  • The first to nth transfer objects B1 to Bn are actually transferred into the structural object A and fixed. At this time, the first to nth transfer objects B1 to Bn are fixed at the predetermined first position inside the structural object A. In other words, the first to nth transfer objects B1 to Bn are fixed so that the position after the fixation of the first to nth transfer objects B1 to Bn matches the position where the first to nth transfer objects B1 to Bn are arranged in the first three-dimensional data.
  • Step S5
  • The first to nth transfer objects B1 to Bn that are fixed are imaged by the imager 20.
  • By step S5, the three-dimensional data (the second three-dimensional data) of the first to nth transfer objects B1 to Bn actually fixed inside the structural object A is acquired as illustrated in FIG. 3D. The second three-dimensional data is stored in the storage part 30.
  • In the example illustrated in FIG. 3D, only the three-dimensional data of the first to nth transfer objects B1 to Bn and the periphery of the first to nth transfer objects B1 to Bn is acquired. In step S5, the three-dimensional data of the first to nth transfer objects B1 to Bn and the entire interior of the structural object A may be acquired as in FIG. 3C.
  • Step S6
  • The processor 40 compares the first three-dimensional data obtained in step S3 and the second three-dimensional data obtained in step S5. Thereby, the displacement of the first to nth transfer objects B1 to Bn of the second three-dimensional data with respect to the first to nth transfer objects B1 to Bn of the first three-dimensional data is detected.
  • More specifically, the processor 40 overlays the first three-dimensional data and the second three-dimensional data in the same three-dimensional coordinate system. Thereby, how much the position of the first to nth transfer objects B1 to Bn is displaced between the first three-dimensional data and the second three-dimensional data is detected. In other words, the processor 40 detects, for the first to nth transfer objects B1 to Bn, the degree of the difference of the position when actually fixed with respect to the position designed prior to the transfer.
  • Step S7
  • The processor 40 causes the displayer 50 to display the detection results. The processor 40 also transmits the detection results to the terminal 60; and the terminal 60 displays the received results on a screen. By confirming the detection results displayed by the terminal 60, the worker can easily confirm whether or not the position of the fixed first to nth transfer objects B1 to Bn matches the position designed beforehand.
  • When steps S5 to S7 described above are performed by the work support system 100 according to the embodiment, whether or not the position of the first to nth transfer objects B1 to Bn after the fixation matches the position designed beforehand can be detected automatically.
  • Therefore, with the work support system 100 according to the embodiment, it is unnecessary for the worker to measure and confirm the position of the first to nth transfer objects B1 to Bn after the fixation; and the work efficiency can be increased.
  • By using the work support system 100 according to the embodiment, detection results that do not depend on the surveying skill of the worker are obtained. Therefore, the existence and degree of the displacement can be verified with higher precision. For example, by detecting the displacement with high precision, the worker can refix each of the transfer objects based on the detection results. Therefore, the displacement of the final fixation position of each of the transfer objects with respect to the design position can be small; and the quality of the work can be improved.
  • A more specific example of the detection of the displacement and the display of the detection results in steps S6 and S7 will now be described with reference to FIG. 4 to FIGS. 6A to 6C.
  • FIG. 4 is a flowchart illustrating an example of the specific processing of steps S6 and S7 illustrated in FIG. 2.
  • FIG. 5 is a schematic view illustrating an example when the first three-dimensional data and the second three-dimensional data are overlaid.
  • FIGS. 6A to 6C are schematic views illustrating the appearance when the three-dimensional data of FIG. 5 is projected onto each plane.
  • In the example illustrated in FIG. 4, steps S61 to S64 are performed in step S6; and step S71 or S72 is performed in step S7.
  • Step S61
  • Similarly to step S6 described above, the first three-dimensional data and the second three-dimensional data are overlaid in the same three-dimensional coordinate system in step S61.
  • FIG. 5 illustrates the schematic appearance when overlaid. In FIG. 5, the transfer object B that is included in the first three-dimensional data is illustrated by the broken lines; and the transfer object B that is included in the second three-dimensional data is illustrated by the solid lines. At least the second three-dimensional data is point cloud data; but FIG. 5 schematically illustrates the outer edges of the point cloud data using straight lines.
  • Step S62
  • The transfer object B of the first three-dimensional data and the transfer object B of the second three-dimensional data after the overlaying are projected in a prescribed direction. Here, the case is described where the projection is performed onto the planes perpendicular to the respective directions of the three-dimensional coordinates, that is, the X-Y plane (perpendicular to the Z-direction), the Y-Z plane (perpendicular to the X-direction), and the X-Z plane (perpendicular to the Y-direction).
  • The appearance when the three-dimensional data illustrated in FIG. 5 is projected onto each plane is illustrated in FIGS. 6A to 6C.
  • FIG. 6A illustrates the projection onto the X-Y plane. FIG. 6B illustrates the projection onto the Y-Z plane. FIG. 6C illustrates the projection onto the X-Z plane.
  • In FIG. 6A to FIG. 6C, the transfer object B of the first three-dimensional data is illustrated by the broken lines; and the transfer object B of the second three-dimensional data is illustrated by the solid lines. The parts where the density of the dots is high illustrate the parts where the transfer objects B overlap (match) between the first three-dimensional data and the second three-dimensional data. The parts where the density of the dots is low illustrate the parts where the transfer objects B do not overlap (do not match) between the first three-dimensional data and the second three-dimensional data.
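  • As a reference, the projection onto the three coordinate planes amounts to discarding one coordinate of each point of the overlaid data. A minimal NumPy sketch is shown below; the array and function names are hypothetical.

```python
import numpy as np

def project_onto_planes(points):
    """Project an (N, 3) array of XYZ points onto the three coordinate planes.

    Returns a dict of (N, 2) arrays: the X-Y, Y-Z, and X-Z projections,
    obtained by dropping the Z, X, and Y coordinate respectively.
    """
    points = np.asarray(points, dtype=float)
    return {
        "XY": points[:, [0, 1]],   # view along the Z-direction
        "YZ": points[:, [1, 2]],   # view along the X-direction
        "XZ": points[:, [0, 2]],   # view along the Y-direction
    }

# Applied separately to the transfer object B of the first three-dimensional data
# (e.g., points sampled on the CAD model at the designed position) and of the
# second three-dimensional data (the measured point cloud), after both are
# expressed in the same three-dimensional coordinate system.
```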
  • Step S63
  • The processor 40 calculates the mismatch amount of the three-dimensional data of the transfer objects B between the first three-dimensional data and the second three-dimensional data for each of the projection planes as illustrated in FIG. 6A to FIG. 6C.
  • The first three-dimensional data includes, for example, the three-dimensional CAD data of the transfer object B. In such a case, the proportion of the three-dimensional CAD data and the point cloud data overlapping in the projection plane is calculated. At this time, a region of the three-dimensional CAD data that is separated from every point included in the point cloud data by a prescribed distance or more is determined to be mismatched. Conversely, points of the point cloud data that do not overlap the three-dimensional CAD data are determined to be mismatched.
  • As an example, when the point cloud data is generated by the imager 20, the irradiation angle pitch of the laser beam irradiated from the imager 20 corresponds to a point spacing of 3 millimeters at a distance of 10 meters. In such a case, the prescribed distance is set to 10 millimeters.
  • The processor 40 detects a first mismatch amount in the X-direction (the first direction), a second mismatch amount in the Y-direction (the second direction), and a third mismatch amount in the Z-direction (the third direction) for the transfer object B. For example, the processor 40 calculates the number of points determined to be mismatched as the first mismatch amount, the second mismatch amount, and the third mismatch amount respectively for the multiple projection planes. Or, the processor 40 may calculate the surface area of the three-dimensional CAD data determined to be mismatched as the first mismatch amount, the second mismatch amount, and the third mismatch amount respectively for the multiple projection planes.
  • Or, the processor 40 may calculate the proportion of the number of points determined to be mismatched with respect to the number of points included in the point cloud data of the transfer object B as the first mismatch amount, the second mismatch amount, and the third mismatch amount respectively for the multiple projection planes. Or, the processor 40 may calculate the proportion of the surface area determined to be mismatched with respect to the surface area of the three-dimensional CAD data as the first mismatch amount, the second mismatch amount, and the third mismatch amount respectively for the multiple projection planes. The first three-dimensional data and the second three-dimensional data may include information of the colors of each of the transfer objects B. In such a case, the match or the mismatch may be determined further based on the information of the color of each point.
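  • One possible realization of this determination is sketched below; it implements the variant in which measured points lying farther than the prescribed distance from the nearest point sampled on the three-dimensional CAD data are counted as mismatched, per projection plane. The CAD sampling, the units (meters), and all names are assumptions, not a prescribed implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def mismatch_amounts(cad_points, cloud_points, prescribed_distance=0.010):
    """Mismatch amount per projection plane (assumed units: meters, 0.010 = 10 mm).

    cad_points   : (N, 3) points sampled on the CAD model of the transfer object
                   placed at the designed position (first three-dimensional data).
    cloud_points : (M, 3) measured point cloud of the transfer object
                   (second three-dimensional data), in the same coordinates.
    Returns, for each plane, the number of mismatched points and their proportion.
    """
    planes = {"XY": [0, 1], "YZ": [1, 2], "XZ": [0, 2]}
    cad = np.asarray(cad_points, dtype=float)
    cloud = np.asarray(cloud_points, dtype=float)
    results = {}
    for name, axes in planes.items():
        tree = cKDTree(cad[:, axes])                      # nearest-neighbor index in the plane
        dist, _ = tree.query(cloud[:, axes], k=1)         # distance to nearest CAD sample
        mismatched = int(np.count_nonzero(dist > prescribed_distance))
        results[name] = {
            "count": mismatched,                          # e.g., first/second/third mismatch amount
            "proportion": mismatched / len(cloud),        # proportion with respect to the point cloud
        }
    return results
```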
  • Or, the processor 40 may extract characteristic parts from each of the transfer objects B of the first three-dimensional data and the second three-dimensional data for each projection plane. In such a case, the processor 40 may calculate the distances between the characteristic parts as the first mismatch amount, the second mismatch amount, and the third mismatch amount respectively for the multiple projection planes.
  • Thus, the processor 40 calculates the mismatch amount for each mutually-orthogonal direction (each projection plane). Thereby, it can be detected how much and in which direction the fixed transfer object B is displaced with respect to the pre-designed position.
  • Or, the mismatch amount can also be calculated for any projection plane by projecting the overlaid first three-dimensional data and second three-dimensional data in any direction.
  • In the example described above, the proportion or the number of the points or the surface area determined to be mismatched is calculated as the mismatch amount. Alternatively, the proportion or the number of the points or the surface area determined to match may be calculated; even in such a case, it can be considered that the mismatch amount is substantially calculated.
  • After calculating the mismatch amount described above in step S63, the processor 40 may execute the following method.
  • The processor 40 moves and rotates the transfer object B of one of the first three-dimensional data or the second three-dimensional data for each projection plane. At this time, the processor 40 moves the transfer object B so that the mismatch amount of the transfer object B between the first three-dimensional data and the second three-dimensional data is minimized. The processor 40 stores, in the storage part 30, the calculated movement distance and rotation angle in each direction.
  • According to this method, the movement direction, the movement distance, and the rotation angle of the transfer object B necessary to minimize the mismatch amount for each projection plane are calculated. This method is effective for work in which the multiple transfer objects are sequentially transferred and fixed. In such work, if the fixation position of one transfer object is rotated with respect to the predetermined position, the amount of the displacement (the shift of the position) for the other transfer objects fixed to the one transfer object increases according to the number of transfer objects to be fixed. Accordingly, in the case where the work support system 100 is used in such work, it is desirable for the processor 40 to calculate the rotation angle for minimizing the mismatch amount.
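  • As a sketch of one way to perform such a minimization (the embodiment does not fix the optimization method; a coarse grid search over an in-plane rigid transform is assumed here, and the search ranges are purely illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

def best_inplane_correction(cad_2d, cloud_2d, prescribed_distance=0.010):
    """Translation (tx, ty) and rotation angle (about the cloud centroid) that
    minimize the mismatch amount of one projection plane, found by grid search."""
    tree = cKDTree(np.asarray(cad_2d, dtype=float))
    cloud_2d = np.asarray(cloud_2d, dtype=float)
    centroid = cloud_2d.mean(axis=0)

    def mismatch(tx, ty, theta):
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s], [s, c]])
        moved = (cloud_2d - centroid) @ R.T + centroid + np.array([tx, ty])
        dist, _ = tree.query(moved, k=1)
        return int(np.count_nonzero(dist > prescribed_distance))

    best = (float("inf"), 0.0, 0.0, 0.0)
    for tx in np.linspace(-0.05, 0.05, 11):                    # +/- 50 mm, 10 mm steps
        for ty in np.linspace(-0.05, 0.05, 11):
            for theta in np.radians(np.linspace(-5, 5, 21)):   # +/- 5 degrees, 0.5 degree steps
                m = mismatch(tx, ty, theta)
                if m < best[0]:
                    best = (m, tx, ty, theta)
    return {"mismatch": best[0], "move": (best[1], best[2]), "rotation_rad": best[3]}
```

  • The returned movement and rotation correspond to the correction that minimizes the mismatch amount in that projection plane; as described above, the calculated movement distance and rotation angle can then be stored in the storage part 30.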
  • Step S64
  • A threshold for the mismatch amount detected in step S63 is stored in the storage part 30 in advance. The processor 40 compares the mismatch amount to the threshold for each projection plane. For example, the mismatch amount that can be tolerated in each direction is set as the threshold. In the case where the mismatch amount is less than the threshold for each projection plane, the flow proceeds to step S71.
  • In the case where a mismatch amount that is not less than the threshold exists for any of the projection planes, the flow proceeds to step S72.
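  • The branching of step S64 can be expressed compactly. The sketch below assumes that the per-projection-plane mismatch amounts and thresholds are held in dictionaries; the names are hypothetical.

```python
def judge_against_threshold(mismatch_by_plane, thresholds):
    """Step S64 branch: compare the mismatch amount of each projection plane to
    its threshold and decide whether the flow proceeds to step S71 or S72.

    mismatch_by_plane : e.g. {"XY": 12, "YZ": 3, "XZ": 40}
    thresholds        : e.g. {"XY": 20, "YZ": 20, "XZ": 20}
    """
    exceeded = {plane: amount
                for plane, amount in mismatch_by_plane.items()
                if amount >= thresholds[plane]}          # "not less than the threshold"
    if not exceeded:
        return {"next_step": "S71", "exceeded": {}}      # all directions within tolerance
    return {"next_step": "S72", "exceeded": exceeded}    # report the offending directions
```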
  • Step S71
  • The processor 40 causes the displayer 50 to display or notifies the terminal 60 that the mismatch amount in each direction is less than the threshold and within the tolerance range for the fixed first to nth transfer objects B1 to Bn. At this time, the mismatch amount in each direction may be displayed by the displayer 50 and notified to the terminal 60.
  • Step S72
  • The processor 40 causes the displayer 50 to display or notifies the terminal 60 that there is a direction in which the mismatch amount is not less than the threshold for the fixed first to nth transfer objects B1 to Bn. Also, the processor 40 causes the displayer 50 to display or notifies the terminal 60 of the direction in which the mismatch amount is not less than the threshold and the mismatch amount of the direction. Further, the mismatch amount may be displayed by the displayer 50 or notified to the terminal 60 for the direction in which the mismatch amount is less than the threshold.
  • In steps S71 and S72, the processor 40 may convert the mismatch amount detected in each direction into numerical values in units such as millimeters, centimeters, etc., and display the numerical values. Thereby, the user of the work support system 100 can intuitively ascertain the mismatch amount with ease.
  • In steps S71 and S72, the display or the notification may be omitted for the directions in which the mismatch amount is less than the threshold. In other words, the processor 40 may omit the display and the notification of the results in step S71, and may display only the direction in which the mismatch amount is not less than the threshold and the mismatch amount of that direction in step S72.
  • Here, the case is described where the multiple transfer objects up to the first to nth transfer objects B1 to Bn are transferred. The work support system 100 according to the embodiment is applicable also when the transfer and fixation of one transfer object is performed. In other words, the method illustrated in the flowchart of FIG. 2 and FIG. 4 is applicable even in the case where only the first transfer object B1 is transferred and fixed inside the structural object A.
  • In the example described above, the case is described where the first to nth transfer objects B1 to Bn are imaged and the displacement of the first to nth transfer objects B1 to Bn is detected after the first to nth transfer objects B1 to Bn are fixed. Alternatively, the displacement of these transfer objects may be detected after the first to nth transfer objects B1 to Bn are arranged inside the structural object A but prior to the fixation. The displacement can be corrected more easily by detecting the displacement prior to fixing each of the transfer objects.
  • The work support system 100 according to the embodiment may execute steps S5 to S7 of FIG. 2 each time one transfer object is transferred and arranged inside the structural object A. For example, the work support system 100 according to the embodiment executes steps S5 to S7 after the first transfer object B1 is transferred and arranged prior to performing the fixation of the second transfer object B2 and the connection with the first transfer object B1. Thereby, the displacement of the first transfer object B1 can be detected and corrected prior to the fixation and the connection of the second transfer object B2. Therefore, the time necessary for the work can be shorter than in the case of correcting the displacement after all of the transfer objects are fixed and connected.
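  • As an illustrative orchestration of this per-object procedure, the sketch below runs steps S5 to S7 for each transfer object right after it is arranged and corrects any displacement before the next object is fixed and connected. All callback names are hypothetical stand-ins for the operations of the imager 20 and the processor 40.

```python
def transfer_check_loop(transfer_objects, image_object, detect_displacement,
                        within_tolerance, correct_position, fix_and_connect):
    """Run steps S5 to S7 each time one transfer object is transferred and
    arranged, prior to fixing and connecting the next object (hypothetical API)."""
    for obj in transfer_objects:                                # B1, B2, ..., Bn in sequence
        second_data = image_object(obj)                         # step S5: image the arranged object
        displacement = detect_displacement(obj, second_data)    # step S6: compare with first 3-D data
        if not within_tolerance(displacement):                  # step S7: judge / report the result
            correct_position(obj, displacement)                 # refix before connecting the next object
        fix_and_connect(obj)
```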
  • While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the invention.
  • Moreover, the above-mentioned embodiments can be combined with each other and carried out.

Claims (12)

What is claimed is:
1. A work support system, comprising:
an imager configured to image a first transfer object after the first transfer object is transferred into a structural object and arranged inside the structural object;
a storage part storing first three-dimensional data, the first three-dimensional data including three-dimensional data of the structural object and three-dimensional data of the first transfer object, the three-dimensional data of the first transfer object being overlaid at a first position inside the structural object of the three-dimensional data; and
a processor detecting a displacement of the first transfer object of second three-dimensional data with respect to the first transfer object of the first three-dimensional data, the second three-dimensional data being obtained by imaging the first transfer object arranged inside the structural object.
2. The system according to claim 1, wherein the processor detects the displacement by calculating a mismatch amount of the first transfer object of the second three-dimensional data with respect to the first transfer object of the first three-dimensional data.
3. The system according to claim 2, further comprising a receiver,
the mismatch amount including a first mismatch amount in a first direction, a second mismatch amount in a second direction, and a third mismatch amount in a third direction, the second direction being orthogonal to the first direction, the third direction being orthogonal to the first direction and the second direction,
the processor detecting the displacement for each of the first direction, the second direction, and the third direction by calculating the first mismatch amount, the second mismatch amount, and the third mismatch amount,
the processor transmitting the displacements to the receiver.
4. The system according to claim 3, wherein
the processor compares each of the first mismatch amount, the second mismatch amount, and the third mismatch amount to a threshold stored in the storage part, and
in the case where one or more of the first mismatch amount, the second mismatch amount, or the third mismatch amount is greater than the threshold, the processor transmits, to the receiver, the one or more of the first mismatch amount, the second mismatch amount, or the third mismatch amount and one or more of the first direction, the second direction, or the third direction corresponding to the one or more of the first mismatch amount, the second mismatch amount, or the third mismatch amount.
5. A work support system used in work, the work including a transfer of a first transfer object into a structural object, a transfer of a second transfer object into the structural object, and a connection between the first transfer object and the second transfer object, the system comprising:
an imager configured to image the first transfer object after the first transfer object is transferred into the structural object and arranged inside the structural object prior to the second transfer object being connected to the first transfer object;
a storage part storing first three-dimensional data, the first three-dimensional data including three-dimensional data of the structural object and three-dimensional data of the first transfer object, the three-dimensional data of the first transfer object being overlaid at a first position inside the structural object of the three-dimensional data; and
a processor detecting a displacement of the first transfer object of second three-dimensional data with respect to the first transfer object of the first three-dimensional data, the second three-dimensional data being obtained by imaging the first transfer object arranged inside the structural object.
6. The system according to claim 5, wherein the processor detects the displacement by calculating a mismatch amount of the first transfer object of the second three-dimensional data with respect to the first transfer object of the first three-dimensional data.
7. The system according to claim 6, further comprising a receiver,
the mismatch amount including a first mismatch amount in a first direction, a second mismatch amount in a second direction, and a third mismatch amount in a third direction, the second direction being orthogonal to the first direction, the third direction being orthogonal to the first direction and the second direction,
the processor detecting the displacement for each of the first direction, the second direction, and the third direction by calculating the first mismatch amount, the second mismatch amount, and the third mismatch amount,
the processor transmitting the displacements to the receiver.
8. The system according to claim 7, wherein
the processor compares each of the first mismatch amount, the second mismatch amount, and the third mismatch amount to a threshold stored in the storage part, and
in the case where one or more of the first mismatch amount, the second mismatch amount, or the third mismatch amount is greater than the threshold, the processor transmits, to the receiver, the one or more of the first mismatch amount, the second mismatch amount, or the third mismatch amount and one or more of the first direction, the second direction, or the third direction corresponding to the one or more of the first mismatch amount, the second mismatch amount, or the third mismatch amount.
9. A work method, comprising:
imaging a first transfer object after the first transfer object is transferred into a structural object and arranged inside the structural object;
acquiring first three-dimensional data, the first three-dimensional data including three-dimensional data of the structural object and three-dimensional data of the first transfer object, the three-dimensional data of the first transfer object being overlaid at a prescribed position inside the structural object of the three-dimensional data; and
using the first three-dimensional data and second three-dimensional data to detect a displacement of the first transfer object of the second three-dimensional data with respect to the first transfer object of the first three-dimensional data, the second three-dimensional data being obtained in the imaging.
10. The method according to claim 9, wherein the displacement is detected by calculating a mismatch amount of the first transfer object of the second three-dimensional data with respect to the first transfer object of the first three-dimensional data.
11. The method according to claim 9, wherein
the first transfer object is connected to the second transfer object transferred into the structural object, and
the imaging of the first transfer object and the detecting of the displacement are performed after the first transfer object is arranged inside the structural object before the first transfer object is connected to the second transfer object.
12. The method according to claim 9, wherein a transfer path of the second transfer object into the structural object is generated using the second three-dimensional data.
US16/814,445 2016-11-08 2020-03-10 Work support system and work method Abandoned US20200211222A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/814,445 US20200211222A1 (en) 2016-11-08 2020-03-10 Work support system and work method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2016218249A JP6833460B2 (en) 2016-11-08 2016-11-08 Work support system, work method, and processing equipment
JP2016-218249 2016-11-08
US15/802,638 US10650551B2 (en) 2016-11-08 2017-11-03 Work support system and work method
US16/814,445 US20200211222A1 (en) 2016-11-08 2020-03-10 Work support system and work method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/802,638 Division US10650551B2 (en) 2016-11-08 2017-11-03 Work support system and work method

Publications (1)

Publication Number Publication Date
US20200211222A1 true US20200211222A1 (en) 2020-07-02

Family

ID=62063951

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/802,638 Active 2038-04-28 US10650551B2 (en) 2016-11-08 2017-11-03 Work support system and work method
US16/814,445 Abandoned US20200211222A1 (en) 2016-11-08 2020-03-10 Work support system and work method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/802,638 Active 2038-04-28 US10650551B2 (en) 2016-11-08 2017-11-03 Work support system and work method

Country Status (3)

Country Link
US (2) US10650551B2 (en)
JP (1) JP6833460B2 (en)
CN (1) CN108062431B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210115489A1 (en) 2018-04-13 2021-04-22 Kikkoman Corporation Novel mediator

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2910512A1 (en) * 2014-02-21 2015-08-26 Siemens Aktiengesellschaft Method for calibrating laser scanners to a container transportation crane
US20160078682A1 (en) * 2013-04-24 2016-03-17 Kawasaki Jukogyo Kabushiki Kaisha Component mounting work support system and component mounting method

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5951075B2 (en) 1980-03-31 1984-12-12 富士通株式会社 semiconductor storage device
EP1076237A3 (en) * 1999-08-10 2002-01-02 FUJI MACHINE Mfg. Co., Ltd. Method and apparatus for obtaining three-dimensional data
JP2002007485A (en) * 2000-06-20 2002-01-11 Hitachi Ltd Design aid system for nuclear power plant structure
JP3910382B2 (en) * 2001-07-11 2007-04-25 株式会社日立製作所 Image matching device
US7367809B2 (en) * 2004-03-26 2008-05-06 Atsushi Takahashi Three-dimensional digital entity mesoscope system equipped with three-dimensional visual instruction functions
JP2005331383A (en) * 2004-05-20 2005-12-02 Toshiba Corp Method and device for evaluating three-dimensional coordinate position
US7742634B2 (en) * 2005-03-15 2010-06-22 Omron Corporation Image processing method, three-dimensional position measuring method and image processing apparatus
JP4185074B2 (en) 2005-06-29 2008-11-19 関西工事測量株式会社 Disaster prevention system and method for identifying the location of victims
JP4093273B2 (en) * 2006-03-13 2008-06-04 オムロン株式会社 Feature point detection apparatus, feature point detection method, and feature point detection program
CN101249548B (en) * 2008-02-26 2010-06-02 廊坊智通机器人***有限公司 Robot sand core lock core, core setting method and system
EP2521058A1 (en) * 2011-05-06 2012-11-07 Dassault Systèmes Determining a geometrical CAD operation
JP2014178794A (en) * 2013-03-14 2014-09-25 Hitachi Ltd Carrying-in route planning system
CN104802174B (en) * 2013-10-10 2016-09-07 精工爱普生株式会社 Robot control system, robot, program and robot control method
JP6463093B2 (en) * 2014-11-25 2019-01-30 日立Geニュークリア・エナジー株式会社 Construction planning support device
JP5951075B1 (en) 2015-06-02 2016-07-13 三菱電機株式会社 Installation adjustment height calculation device, installation adjustment height calculation program, and installation method

Also Published As

Publication number Publication date
JP2018077628A (en) 2018-05-17
US20180130231A1 (en) 2018-05-10
JP6833460B2 (en) 2021-02-24
CN108062431A (en) 2018-05-22
US10650551B2 (en) 2020-05-12
CN108062431B (en) 2022-02-18

Similar Documents

Publication Publication Date Title
US10891512B2 (en) Apparatus and method for spatially referencing images
Ellenberg et al. Use of unmanned aerial vehicle for quantitative infrastructure evaluation
JP5465128B2 (en) Point cloud position data processing device, point cloud position data processing system, point cloud position data processing method, and point cloud position data processing program
JP6184237B2 (en) Three-dimensional data processing apparatus, processing method thereof, and processing program thereof
US9470511B2 (en) Point-to-point measurements using a handheld device
US9989483B2 (en) Systems and methods for performing backscatter three dimensional imaging from one side of a structure
JP5388921B2 (en) Three-dimensional distance measuring apparatus and method
US20180225839A1 (en) Information acquisition apparatus
US11288877B2 (en) Method for matching a virtual scene of a remote scene with a real scene for augmented reality and mixed reality
CN104660944A (en) Image projection apparatus and image projection method
JP2019153274A (en) Position calculation device, position calculation program, position calculation method, and content addition system
JP2010287074A (en) Camera calibration device, camera calibration method, camera calibration program and recording medium recording the program
US20150301690A1 (en) Input-operation detection device, image display apparatus, projector apparatus and projector system
US20200211222A1 (en) Work support system and work method
EP3312641B1 (en) Method, apparatus and system for scanning and imaging
JP2019008473A (en) Composite reality simulation device and composite reality simulation program
JP2007010419A (en) Three-dimensional shape of object verifying system
US11415695B2 (en) Distance measuring system with layout generation functionality
JP6960319B2 (en) How to adjust the position of the constituent members of the structure
CN102421367A (en) Medical image display device and medical image display method
JP6341108B2 (en) Imaging parameter determination device, portable terminal device, imaging parameter determination system, imaging parameter determination method, and imaging parameter determination program
US20240112406A1 (en) Bar arrangement inspection result display system
JP2011117833A (en) Survey method and survey system
US11915356B2 (en) Semi-automatic 3D scene optimization with user-provided constraints
JP7392826B2 (en) Data processing device, data processing system, and data processing method

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION