WO2021117656A1 - Marking support system, marking support device, information processing device, marking support method, and program - Google Patents

Marking support system, marking support device, information processing device, marking support method, and program

Info

Publication number
WO2021117656A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
projection
support device
marking
information
Prior art date
Application number
PCT/JP2020/045394
Other languages
French (fr)
Japanese (ja)
Inventor
Yuki Kurisu (祐樹 栗栖)
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to JP2021563936A (patent JP7179203B2)
Publication of WO2021117656A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00: Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
    • G01C15/02: Means for marking measuring points

Definitions

  • This disclosure relates to a marking support system, a marking support device, an information processing device, a marking support method, and a program.
  • Marking out is a method in which the distance from a reference point is measured and marked, and a line is drawn by stretching a thread coated with ink or powdered chalk between the reference point and the mark and snapping it against the surface. This method is widely used.
  • Patent Document 1 discloses a laser marking device that outputs and displays a laser line on a side wall surface, a ceiling surface, a floor surface, or the like of a structure.
  • Patent Document 2 discloses a marking support device that uses a projector to project an image generated from a design drawing, a three-dimensional model, or the like onto the target surface to be marked.
  • The present disclosure has been made in view of the above problems, and an object of the present disclosure is to improve convenience when a drawing is projected onto the target surface to be marked to support the marking work.
  • The marking support system of the present disclosure includes an information processing device that converts drawing data including marking information into a projected image, and a marking support device that projects the projected image onto the target surface.
  • The marking support device includes a projection unit, a sensor unit, a movement drive unit, a processor unit, a detection information transmission unit, a movement route information reception unit, and a projected image information reception unit.
  • The projection unit projects the projected image in its pointing direction.
  • The sensor unit detects the position and size of surrounding objects and the acceleration and angular velocity of the marking support device.
  • The movement drive unit moves the marking support device.
  • The processor unit controls the marking support device.
  • The detection information transmission unit transmits, to the information processing device, detection information indicating the position and size of the surrounding objects detected by the sensor unit and the acceleration and angular velocity of the marking support device.
  • The movement route information reception unit receives, from the information processing device, movement route information indicating the movement route from the current position to the projection position.
  • The projected image information reception unit receives, from the information processing device, projected image information indicating the projected image.
  • The information processing device includes a projection condition information acquisition unit, a detection information reception unit, a current position calculation unit, a movement route calculation unit, a movement route information transmission unit, a projection image generation unit, and a projected image information transmission unit.
  • The projection condition information acquisition unit acquires projection condition information indicating projection conditions including the projection position of the marking support device.
  • The detection information reception unit receives the detection information from the marking support device.
  • The current position calculation unit calculates the current position and orientation of the marking support device based on the detection information.
  • The movement route calculation unit calculates the movement route from the current position of the marking support device to the projection position, based on the projection condition information and the information indicating the current position calculated by the current position calculation unit.
  • The movement route information transmission unit transmits, to the marking support device, the movement route information indicating the movement route calculated by the movement route calculation unit.
  • The projection image generation unit performs a drawing conversion process that converts the drawing data into the projected image, based on the information indicating the current position and orientation of the marking support device calculated by the current position calculation unit.
  • The projected image information transmission unit transmits, to the marking support device, the projected image information indicating the projected image generated by the projection image generation unit.
  • The processor unit controls the movement drive unit based on the movement route information to move the marking support device to the projection position, and controls the projection unit to project the projected image indicated by the projected image information onto the target surface.
  • According to the present disclosure, the movable marking support device moves to the projection position, and a projected image obtained by converting drawing data including marking information based on the orientation of the marking support device is projected onto the target surface. This improves convenience when a drawing is projected onto the target surface to be marked to support the marking work.
  • FIG. 1: a block diagram showing a functional configuration example of the marking support device according to the embodiment.
  • A block diagram showing a functional configuration example of the information processing apparatus according to the embodiment.
  • A figure showing an example of the input screen displayed on the user terminal according to the embodiment.
  • A flowchart showing an example of the operation of the projection image generation process according to the embodiment.
  • A flowchart showing an example of the operation of the marking support process according to the embodiment.
  • A figure showing an example of the hardware configuration of the information processing apparatus according to the embodiment.
  • A block diagram showing a functional configuration example of the marking support device according to the embodiment.
  • The marking support system according to the present embodiment will be described in detail below with reference to the drawings.
  • The marking information includes, for example, the installation positions of equipment, the drilling positions of anchor bolts, the sticking positions of sheets, and the like.
  • The area of the floor surface corresponding to the original drawing data of the projected image is referred to as the target surface.
  • The marking support system 100 includes a marking support device 1 that projects a projected image onto the target surface, an information processing device 2 that converts drawing data including marking information into the projected image, and a user terminal 3 used by the user.
  • The drawing data is, for example, a CAD drawing showing the positions of the walls, pillars, and the like of the building where the marking work is performed.
  • The marking support device 1 is, for example, an autonomously flying unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle), typified by a drone.
  • The information processing device 2 is, for example, a computer.
  • The user terminal 3 is, for example, a tablet terminal.
  • The marking support device 1 and the user terminal 3 are each wirelessly connected to the information processing device 2.
  • The user terminal 3 includes a display unit 31 that displays an input screen, an input unit 32 that receives input from the user, a control unit 33 that controls the display unit 31 and the input unit 32, and a communication unit 34 that communicates with the information processing device 2.
  • The display unit 31 is, for example, a liquid crystal screen.
  • The input unit 32 is, for example, a touch panel.
  • The control unit 33 is a housing in which, for example, a CPU (Central Processing Unit), a memory, and the like are incorporated. When the user inputs an input screen display instruction to the input unit 32, the control unit 33 generates the input screen and displays it on the display unit 31.
  • When the user inputs the projection conditions of the marking support device 1 on the input screen, the control unit 33 generates projection condition information indicating the input projection conditions and transmits it to the information processing device 2 via the communication unit 34.
  • The projection conditions include the projection position, the projection direction, and the like of the marking support device 1.
  • The marking support device 1 includes a projection unit 11 that projects a projected image in its pointing direction, a sensor unit 12 that detects the position and size of surrounding objects and the acceleration and angular velocity of the marking support device 1, a photographing unit 13 that captures the projected image projected by the projection unit 11, and a drive unit 14 that moves the marking support device 1 and changes the pointing direction of the projection unit 11 and the photographing direction of the photographing unit 13. The marking support device 1 further includes a marking unit 15 that ejects ink, a power source 16 that supplies electric power to the marking support device 1, a processor unit 17 that controls the marking support device 1, and a communication unit 18 that communicates with the information processing device 2 and the user terminal 3.
  • The projection unit 11 is, for example, a laser projector.
  • The projection unit 11 projects the projected image onto the target surface according to instructions from the processor unit 17.
  • The sensor unit 12 includes, for example, an LRF (Laser Range Finder) that obtains distance information by emitting laser light and measuring its reflection, an inertial sensor, and a gyro sensor.
  • The sensor unit 12 detects the position and size of surrounding objects and the acceleration and angular velocity of the marking support device 1.
  • The photographing unit 13 is, for example, a digital camera. The photographing unit 13 photographs the projected image projected by the projection unit 11 according to instructions from the processor unit 17.
  • The drive unit 14 includes, for example, a rotor having propellers, and a biaxial rotor that rotates the projection unit 11 and the photographing unit 13 in the horizontal and vertical directions.
  • The drive unit 14 moves the marking support device 1 according to instructions from the processor unit 17, and changes the pointing direction of the projection unit 11 and the photographing direction of the photographing unit 13.
  • The marking unit 15 is, for example, an inkjet nozzle.
  • The marking unit 15 ejects ink according to instructions from the processor unit 17.
  • The power source 16 is, for example, a lithium-ion battery.
  • The power source 16 supplies power to the projection unit 11, the sensor unit 12, the photographing unit 13, the drive unit 14, the marking unit 15, the processor unit 17, and the communication unit 18.
  • The communication unit 18 is, for example, a wireless communication device connected to a wireless LAN (Local Area Network).
  • The communication unit 18 establishes communication with the information processing device 2 and receives, from the information processing device 2, movement route information indicating the movement route to the projection position, rotation amount information indicating the rotation amounts of the projection unit 11 and the photographing unit 13, and projected image information indicating the projected image.
  • The communication unit 18 is an example of a movement route information reception unit, a rotation amount information reception unit, and a projected image information reception unit.
  • The processor unit 17 is, for example, a microcomputer.
  • The processor unit 17 transmits, to the information processing device 2 via the communication unit 18, detection information indicating the position and size of the surrounding objects detected by the sensor unit 12 and the acceleration and angular velocity of the marking support device 1.
  • The processor unit 17 may transmit the detection information to the information processing device 2 periodically, or may transmit it in response to a request from the information processing device 2.
  • The processor unit 17 controls the drive unit 14 based on the movement route information received by the communication unit 18, and moves the marking support device 1 to the projection position.
  • The processor unit 17 controls the drive unit 14 based on the rotation amount information received by the communication unit 18, and changes the pointing direction of the projection unit 11 and the photographing direction of the photographing unit 13.
  • The processor unit 17 causes the projection unit 11 to project the projected image indicated by the projected image information received by the communication unit 18.
  • The processor unit 17 controls the photographing unit 13 to photograph the projected image projected by the projection unit 11.
  • The processor unit 17 transmits photographed image information indicating the image photographed by the photographing unit 13 to the information processing device 2 via the communication unit 18.
  • The communication unit 18 is an example of a detection information transmission unit and a photographed image information transmission unit.
  • The information processing device 2 includes an arithmetic storage unit 21 that processes information received from the marking support device 1 and the user terminal 3, and a communication unit 22 that communicates with the marking support device 1 and the user terminal 3.
  • The communication unit 22 is, for example, a wireless communication device connected to a wireless LAN.
  • The communication unit 22 receives, from the user terminal 3, the projection condition information indicating the projection conditions input by the user.
  • The communication unit 22 receives the detection information from the marking support device 1.
  • The communication unit 22 is an example of a projection condition information acquisition unit and a detection information reception unit.
  • The arithmetic storage unit 21 is a housing in which, for example, a CPU and a memory are incorporated.
  • The arithmetic storage unit 21 stores drawing data acquired in advance.
  • The information processing device 2 may receive the drawing data from, for example, an external device or system, or the user may input the drawing data into the information processing device 2.
  • Based on the projection condition information and the detection information received by the communication unit 22, the arithmetic storage unit 21 calculates the movement route from the current position of the marking support device 1 to the projection position, and the rotation amounts of the projection unit 11 and the photographing unit 13 that make the pointing direction of the projection unit 11 and the photographing direction of the photographing unit 13 coincide with the projection direction.
  • The arithmetic storage unit 21 transmits, to the marking support device 1 via the communication unit 22, the movement route information indicating the movement route from the current position of the marking support device 1 to the projection position and the rotation amount information indicating the rotation amounts of the projection unit 11 and the photographing unit 13.
  • The arithmetic storage unit 21 converts the drawing data into a projected image based on the detection information received by the communication unit 22.
  • The arithmetic storage unit 21 transmits the projected image information indicating the projected image obtained by converting the drawing data to the marking support device 1 via the communication unit 22.
  • The communication unit 22 is an example of a movement route information transmission unit, a rotation amount information transmission unit, and a projected image information transmission unit.
  • The sensor unit 12 of the marking support device 1 includes an external sensor unit 121 that detects the position and size of objects such as walls, floors, ceilings, and pillars existing within its sensing range, and an internal sensor unit 122 that detects the acceleration and angular velocity of the marking support device 1.
  • The external sensor unit 121 is, for example, an LRF.
  • The sensing range is the range from the external sensor unit 121 up to a certain distance, which is determined by the performance of the external sensor unit 121.
  • The internal sensor unit 122 includes, for example, an inertial sensor and a gyro sensor.
  • The drive unit 14 includes a movement drive unit 141 that moves the marking support device 1 by flight, and a directional rotation unit 142 that changes the pointing direction of the projection unit 11 and the photographing direction of the photographing unit 13.
  • The movement drive unit 141 is, for example, a rotor having propellers.
  • The directional rotation unit 142 is, for example, a biaxial rotor that rotates in the horizontal and vertical directions to rotate the projection unit 11 and the photographing unit 13.
  • The processor unit 17 includes a projection control unit 171 that controls the projection unit 11, a detection information collection unit 172 that acquires the detection information of the external sensor unit 121 and the internal sensor unit 122, a drive control unit 173 that controls the movement drive unit 141 and the directional rotation unit 142, a photographing control unit 174 that controls the photographing unit 13, and a marking control unit 175 that controls the marking unit 15.
  • The projection control unit 171 controls the projection unit 11 to project the projected image indicated by the projected image information received by the communication unit 18 onto the target surface.
  • The marking operator performs marking on the target surface based on the marking information included in the projected image, such as the installation positions of equipment, the drilling positions of anchor bolts, and the sticking positions of sheets.
  • When the projection time has elapsed, the projection control unit 171 causes the projection unit 11 to end the projection of the projected image.
  • The projection time may be set by the user.
  • Alternatively, the user may input a projection end instruction to the user terminal 3.
  • The user terminal 3 transmits end instruction information indicating the input end instruction to the information processing device 2 via the communication unit 34.
  • The information processing device 2 transmits the end instruction information to the marking support device 1, and when the marking support device 1 receives the end instruction information, the projection control unit 171 of the processor unit 17 causes the projection unit 11 to end the projection of the projected image.
  • The detection information collection unit 172 transmits the detection information acquired from the external sensor unit 121 and the internal sensor unit 122 to the information processing device 2 via the communication unit 18.
  • The drive control unit 173 controls the movement drive unit 141 to move the marking support device 1 to the projection position along the movement route indicated by the movement route information received by the communication unit 18.
  • The drive control unit 173 controls the directional rotation unit 142 to rotate the projection unit 11 and the photographing unit 13 by the rotation amounts indicated by the rotation amount information received by the communication unit 18. Since the marking support device 1 has a function of changing the pointing direction of the projection unit 11 and the photographing direction of the photographing unit 13, the projected image can be projected and photographed even when, for example, it is difficult to do so from directly above.
  • The photographing control unit 174 controls the photographing unit 13 to photograph the projected image projected on the target surface.
  • The photographing control unit 174 transmits photographed image information indicating the image photographed by the photographing unit 13 to the information processing device 2 via the communication unit 18.
  • The marking control unit 175 controls the marking unit 15 to eject ink. By causing the marking unit 15 to eject ink while the drive control unit 173 controls the movement drive unit 141 to move the marking support device 1, lines, shapes, and the like can be drawn on the target surface.
  • The functions of the other components are as described above.
  • The arithmetic storage unit 21 of the information processing device 2 includes a current position calculation unit 211 that calculates the current position and orientation of the marking support device 1, a projection path calculation unit 212 that calculates the movement route from the current position of the marking support device 1 to the projection position, a relative position calculation unit 213 that calculates the relative position of the marking support device 1 with respect to the target surface, a projection image generation unit 214 that converts the drawing data into the projected image, a captured image processing unit 215 that performs image conversion processing on the captured image, calculates the difference from the original drawing data of the projected image, and determines whether the projected image is normal, and a storage unit 216 that stores various information.
  • The communication unit 22 stores, in the storage unit 216, the projection condition information received from the user terminal 3 indicating the projection conditions input by the user.
  • The communication unit 22 stores, in the storage unit 216, the detection information of the external sensor unit 121 and the internal sensor unit 122 and the photographed image information indicating the image photographed by the photographing unit 13, both received from the marking support device 1.
  • The current position calculation unit 211 calculates the current position and orientation of the marking support device 1 based on the detection information stored in the storage unit 216.
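The disclosure does not specify how the current position and orientation are derived from the acceleration and angular velocity. As an illustrative sketch only, one conventional approach is dead reckoning, integrating the internal sensor samples over time; the names, the 2-D model, and the fixed time step below are assumptions for this example, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class State:
    x: float = 0.0        # position on the drawing [m]
    y: float = 0.0
    vx: float = 0.0       # velocity [m/s]
    vy: float = 0.0
    heading: float = 0.0  # orientation [rad]

def dead_reckon(state: State, ax: float, ay: float,
                omega: float, dt: float) -> State:
    """Integrate one accelerometer/gyro sample over time step dt."""
    # Integrate angular velocity to update the orientation.
    heading = state.heading + omega * dt
    # Integrate acceleration into velocity, then velocity into position.
    vx = state.vx + ax * dt
    vy = state.vy + ay * dt
    x = state.x + vx * dt
    y = state.y + vy * dt
    return State(x, y, vx, vy, heading)
```

In practice such integration drifts, which is one reason the external sensor's distance measurements would also feed the position estimate.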
  • The current position of the marking support device 1 is expressed as the coordinates of the current position on the drawing and the height from the floor surface indicated by the drawing data.
  • The relative position calculation unit 213 calculates, based on the detection information stored in the storage unit 216, the distance from the marking support device 1 to the target surface along the pointing direction of the projection unit 11 and the angle of the pointing direction with respect to the target surface.
  • The storage unit 216 stores drawing data acquired in advance.
  • The projection image generation unit 214 executes a drawing conversion process that converts the drawing data stored in the storage unit 216 into the projected image, based on the orientation of the marking support device 1 calculated by the current position calculation unit 211, and the distance from the marking support device 1 to the target surface along the pointing direction of the projection unit 11 and the angle of the pointing direction with respect to the target surface, both calculated by the relative position calculation unit 213.
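The disclosure leaves the details of the drawing conversion process unspecified. One common way to realize such a conversion is a pinhole-projection model that maps each floor-plane drawing point into the projector's image plane from the device's height and tilt. The sketch below is an assumption-laden illustration: the function name, the frame convention (tilt `phi` from vertical toward +x), and the focal parameter `f` are all invented for this example:

```python
import math

def floor_to_projector(px, py, h, phi, f, qx, qy):
    """Map a floor point (qx, qy, 0) into the image plane of a projector
    at (px, py, h), tilted phi radians from vertical toward +x."""
    # Projector frame axes (optical axis points down toward the floor).
    axis = (math.sin(phi), 0.0, -math.cos(phi))   # optical axis
    right = (math.cos(phi), 0.0, math.sin(phi))   # image x-axis
    up = (0.0, 1.0, 0.0)                          # image y-axis
    v = (qx - px, qy - py, -h)                    # projector -> floor point
    depth = sum(a * b for a, b in zip(v, axis))
    if depth <= 0:
        raise ValueError("point is behind the projector")
    # Perspective division gives the pre-distorted image coordinates.
    u = f * sum(a * b for a, b in zip(v, right)) / depth
    w = f * sum(a * b for a, b in zip(v, up)) / depth
    return u, w
```

Applying this mapping to every vertex of the drawing data would pre-distort the image so that, once projected at the given distance and angle, it lands undistorted on the target surface.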
  • The projection image generation unit 214 stores, in the storage unit 216, the parameters set when the drawing conversion process is executed.
  • The captured image processing unit 215 extracts the projected image from the captured image indicated by the photographed image information.
  • The captured image processing unit 215 executes an image conversion process, which is the reverse of the drawing conversion process, on the projected image extracted from the captured image, using the parameters stored in the storage unit 216.
  • The captured image processing unit 215 calculates the difference between the image-converted captured image and the original drawing data of the projected image.
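The disclosure does not define the difference metric. A minimal sketch, assuming equal-sized grayscale images represented as nested lists and a simple mean-absolute-difference test (both the representation and the threshold are assumptions, not the patent's method):

```python
def image_difference(a, b):
    """Mean absolute pixel difference between two same-sized
    grayscale images given as nested lists of intensities."""
    total = 0
    count = 0
    for row_a, row_b in zip(a, b):
        for pa, pb in zip(row_a, row_b):
            total += abs(pa - pb)
            count += 1
    return total / count

def is_projection_normal(captured, original, threshold=10.0):
    # The projection is judged abnormal when the mean difference between
    # the back-converted capture and the drawing data is large.
    return image_difference(captured, original) <= threshold
```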
  • When the difference indicates that the projected image is not projected correctly, the captured image processing unit 215 outputs error information indicating that the projected image is abnormal.
  • The error information may be output as a screen display or a voice output, or may be transmitted to the user terminal 3 via the communication unit 22.
  • The control unit 33 of the user terminal 3 causes the display unit 31 to display the error information received from the information processing device 2 via the communication unit 34.
  • FIG. 4A is a diagram showing an example of an input screen displayed on the user terminal 3.
  • FIG. 4B is an example of a projected image projected by the marking support device 1.
  • The user specifies, on the drawing data 41 displayed on the input screen, the two-dimensional coordinates 42 in the horizontal plane (the XY plane in the drawing) of the projection position of the marking support device 1. The user also specifies the flight height 43, which is the vertical coordinate (the Z direction in the drawing) of the projection position. That is, the user specifies the three-dimensional coordinates (X, Y, Z) of the projection position of the marking support device 1 on the drawing data displayed on the input screen. The user may also specify a sequence of projection positions and hover times, for example, hovering at coordinates (X1, Y1, Z1) for N minutes, then moving to coordinates (X2, Y2, Z2) and hovering for M minutes, and so on.
  • The user specifies the area 44 as the projection range on the drawing data 41 displayed on the input screen.
  • For example, the user designates as the projection range an area 44 whose diagonal is a line dragged on the drawing data 41 displayed on the input screen.
  • Alternatively, the user may input the value of the maximum distance L1 from the projection unit 11 shown in FIG. 4B to the projected image 47, and specify as the projection range the area 44 on the drawing data 41 defined by the maximum distance L1.
  • The projection range is specified on the drawing data 41 because, depending on the performance of the projection unit 11 and the brightness of the site, projecting over a wide range makes the projected image harder to see as the distance from the projection unit 11 increases, which makes the work difficult.
  • When the size of the drawing data 41 is sufficiently small relative to the performance of the projection unit 11 and the brightness of the site, or when the drawing data 41 has been divided in advance, the user may omit specifying the projection range and use the entire range of the drawing data 41 as the projection range. Further, when the user specifies a plurality of projection positions and hover times, the user specifies a projection range for each projection position.
  • The control unit 33 calculates the three-dimensional coordinates (X_C, Y_C, 0) of the center point 45 of the designated projection range. For example, when the projection range is a quadrangle as shown in FIG. 4A, the intersection of the diagonals is set as the center point.
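The diagonal intersection of a quadrangular projection range can be computed by intersecting the two diagonal segments. A minimal sketch (the function name and the assumption that corners are given in order around the quadrangle are invented for this example):

```python
def diagonal_intersection(p1, p2, p3, p4):
    """Intersection of diagonals p1-p3 and p2-p4 of a quadrangle whose
    corners p1..p4 are given in order; returns (x, y)."""
    (x1, y1), (x3, y3) = p1, p3
    (x2, y2), (x4, y4) = p2, p4
    d1x, d1y = x3 - x1, y3 - y1   # direction of diagonal p1 -> p3
    d2x, d2y = x4 - x2, y4 - y2   # direction of diagonal p2 -> p4
    denom = d1x * d2y - d1y * d2x
    if denom == 0:
        raise ValueError("diagonals are parallel")
    # Solve p1 + t*d1 = p2 + s*d2 for t via the 2-D cross product.
    t = ((x2 - x1) * d2y - (y2 - y1) * d2x) / denom
    return x1 + t * d1x, y1 + t * d1y
```

For a rectangle this reduces to the midpoint of either diagonal.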
  • The control unit 33 generates projection condition information indicating the three-dimensional coordinates (X, Y, Z) of the projection position of the marking support device 1 specified by the user, the projection range, and the three-dimensional coordinates (X_C, Y_C, 0) of the center point 45 of the projection range, and transmits it to the information processing device 2 via the communication unit 34.
  • The input screen may include a marking instruction button for instructing the marking support device 1 to perform marking. When the user taps the marking instruction button, the control unit 33 generates marking instruction information indicating a marking instruction for the marking support device 1 and transmits it to the information processing device 2 via the communication unit 34.
  • When the communication unit 22 receives the projection condition information from the user terminal 3, the projection path calculation unit 212 of the arithmetic storage unit 21 of the information processing device 2 determines whether the coordinates on the drawing of the current position of the marking support device 1 calculated by the current position calculation unit 211 and the height from the floor surface match the three-dimensional coordinates (X, Y, Z) of the projection position, that is, whether the marking support device 1 has arrived at the projection position. When the projection path calculation unit 212 determines that they do not match, it calculates the movement route along which the marking support device 1 moves from the current position to the three-dimensional coordinates (X, Y, Z) of the projection position indicated by the projection condition information, based on the coordinates on the drawing of the current position of the marking support device 1 and the height from the floor surface calculated by the current position calculation unit 211, and the drawing data.
  • The movement route from the coordinates on the drawing of the current position of the marking support device 1 to the two-dimensional coordinates (X, Y) of the projection position is, for example, the shortest path on the drawing.
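As an illustration of such a route, the sketch below generates evenly spaced waypoints along an obstacle-free straight segment; in the disclosed system the shortest path would presumably also account for walls and pillars detected by the sensor unit, and the spacing parameter `step` is invented for this example:

```python
import math

def straight_line_path(current, target, step=0.5):
    """Waypoints along the straight segment from current to target
    (2-D drawing coordinates), spaced at most `step` metres apart."""
    (x0, y0), (x1, y1) = current, target
    dist = math.hypot(x1 - x0, y1 - y0)
    n = max(1, math.ceil(dist / step))  # number of segments
    return [(x0 + (x1 - x0) * k / n, y0 + (y1 - y0) * k / n)
            for k in range(n + 1)]
```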
  • The projection path calculation unit 212 is an example of a movement route calculation unit.
  • The projection path calculation unit 212 transmits the movement route information indicating the calculated movement route to the marking support device 1 via the communication unit 22.
  • The movement route information indicates, for example, the movement route from the coordinates on the drawing of the current position of the marking support device 1 to the two-dimensional coordinates (X, Y) of the projection position, and the flight height Z at the projection position.
  • When the drive control unit 173 of the processor unit 17 of the marking support device 1 receives the movement route information from the information processing device 2 via the communication unit 18, it controls the movement drive unit 141 to move the marking support device 1 along the movement route indicated by the movement route information to the three-dimensional coordinates (X, Y, Z) of the projection position.
  • The drive control unit 173 moves the marking support device 1 along the movement route to the two-dimensional coordinates (X, Y) of the projection position, and raises or lowers the marking support device 1 to the flight height Z.
  • When the projection path calculation unit 212 of the information processing device 2 determines that the coordinates on the drawing and the height from the floor surface of the current position of the marking support device 1 match the three-dimensional coordinates (X, Y, Z) of the projection position, it calculates the three-dimensional coordinates (X_D, Y_D, 0) of the intersection point 48 between the extension line of the pointing direction of the projection unit 11 and the floor surface, based on the coordinates on the drawing and the height from the floor surface calculated by the current position calculation unit 211, that is, the three-dimensional coordinates (X, Y, Z) of the projection position, the orientation θ of the marking support device 1, the distance L2 in the pointing direction from the projection unit 11 to the target surface calculated by the relative position calculation unit 213, and the angle φ of the pointing direction with respect to the target surface.
  • Hereinafter, the three-dimensional coordinates of the intersection point 48 between the extension line of the pointing direction of the projection unit 11 and the floor surface are referred to as the pointing-direction coordinates.
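The pointing-direction coordinates can be derived by intersecting the projection unit's pointing direction with the floor plane z = 0. A sketch under the assumption that θ is the horizontal orientation and φ (0 < φ ≤ 90°) is the angle of the pointing direction measured from the floor surface:

```python
import math

def directional_coordinates(pos, theta_deg, phi_deg):
    """Intersection of the projection unit's pointing-direction extension
    line with the floor plane z = 0.
    pos: device position (X, Y, Z); theta_deg: horizontal orientation θ;
    phi_deg: pointing-direction angle φ from the floor surface (must be > 0)."""
    x, y, z = pos
    # Horizontal run of the ray until its height drops from Z to 0.
    horizontal = z / math.tan(math.radians(phi_deg))
    xd = x + horizontal * math.cos(math.radians(theta_deg))
    yd = y + horizontal * math.sin(math.radians(theta_deg))
    return (xd, yd, 0.0)
```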
  • The projection path calculation unit 212 determines whether or not the calculated pointing-direction coordinates (X_D, Y_D, 0) match the three-dimensional coordinates (X_C, Y_C, 0) of the center point 45 of the projection range indicated by the projection condition information, that is, whether or not the pointing direction and the projection direction match.
  • When they do not match, the projection path calculation unit 212 calculates the horizontal rotation amount and the vertical rotation amount of the projection unit 11 and the photographing unit 13 required to make the pointing-direction coordinates (X_D, Y_D, 0) match the three-dimensional coordinates (X_C, Y_C, 0) of the center point 45, based on the current position (X, Y, Z) and orientation θ of the marking support device 1 calculated by the current position calculation unit 211, and on the distance L2 in the pointing direction from the projection unit 11 to the target surface and the angle φ of the pointing direction with respect to the target surface calculated by the relative position calculation unit 213.
  • the projection path calculation unit 212 transmits the calculated rotation amount information indicating the horizontal rotation amount and the vertical rotation amount of the projection unit 11 and the photographing unit 13 to the marking support device 1 via the communication unit 22.
  • the projection path calculation unit 212 is an example of a rotation amount calculation unit.
  • When the drive control unit 173 of the processor unit 17 of the marking support device 1 receives the rotation amount information from the information processing device 2 via the communication unit 18, it controls the direction rotation unit 142 to rotate the projection unit 11 and the photographing unit 13 by the horizontal rotation amount and the vertical rotation amount indicated by the rotation amount information.
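The rotation amounts reduce to the pan and tilt needed to re-aim the pointing direction at the center point 45 of the projection range. A simplified sketch — the angle conventions (azimuth θ from the X axis, tilt φ from the floor) are assumptions, not the patent's definitions:

```python
import math

def rotation_amounts(pos, theta_deg, phi_deg, center_xy):
    """Horizontal and vertical rotation amounts that bring the pointing
    direction onto the center point (X_C, Y_C) of the projection range."""
    x, y, z = pos
    cx, cy = center_xy
    # Azimuth and depression angle of the line from the device to the center.
    target_theta = math.degrees(math.atan2(cy - y, cx - x))
    target_phi = math.degrees(math.atan2(z, math.hypot(cx - x, cy - y)))
    return (target_theta - theta_deg, target_phi - phi_deg)
```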
  • When it is determined that the pointing-direction coordinates (X_D, Y_D, 0) match the three-dimensional coordinates (X_C, Y_C, 0) of the center point 45 of the projection range, the projection image generation unit 214 of the information processing device 2 executes a drawing conversion process for converting the drawing data stored in the storage unit 216 into a projection image, based on the orientation θ of the marking support device 1 calculated by the current position calculation unit 211, the distance L2 in the pointing direction from the projection unit 11 to the target surface calculated by the relative position calculation unit 213, and the angle φ of the pointing direction with respect to the target surface.
  • the drawing conversion process executed by the projection image generation unit 214 will be described with reference to FIG.
  • the drawing conversion process includes a cropping process, a scaling process, a projective conversion process, and a rotation conversion process shown in FIG.
  • the projection image generation unit 214 first executes a cropping process for cropping the image data in the projection range indicated by the projection condition information.
  • Next, the projection image generation unit 214 executes an enlargement/reduction process of enlarging or reducing the cut-out image data so that its dimensions match when it is projected according to the distance L2 from the current position (X, Y, Z) of the marking support device 1 to the pointing-direction coordinates (X_D, Y_D, 0), that is, to the three-dimensional coordinates (X_C, Y_C, 0) of the center point 45 of the projection range.
  • Next, the projection image generation unit 214 executes a projective conversion process that applies a projective transformation to the enlarged or reduced image data according to the angle φ of the pointing direction with respect to the target surface.
  • Finally, the projection image generation unit 214 executes a rotation conversion process of rotating the projectively transformed image data according to the orientation θ of the marking support device 1.
  • the drawing conversion process does not have to include the cropping process.
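The enlargement/reduction, projective, and rotation conversion stages above can be viewed as one 3×3 homography applied to drawing coordinates. The sketch below composes them; the keystone factor derived from φ is an illustrative stand-in for the patent's projective conversion, not its actual formula:

```python
import math

def matmul(a, b):
    """3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply_h(h, p):
    """Apply a homography to a 2D point in homogeneous coordinates."""
    x, y = p
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    return ((h[0][0] * x + h[0][1] * y + h[0][2]) / w,
            (h[1][0] * x + h[1][1] * y + h[1][2]) / w)

def drawing_conversion(scale, phi_deg, theta_deg):
    """Compose enlargement/reduction, a projective (keystone) stage, and a
    rotation stage into a single homography (cropping only selects the input
    region and is omitted). The keystone strength k is hypothetical."""
    s = [[scale, 0, 0], [0, scale, 0], [0, 0, 1]]
    k = 1.0 / math.tan(math.radians(phi_deg))  # hypothetical keystone factor
    proj = [[1, 0, 0], [0, 1, 0], [0, k, 1]]
    c, si = math.cos(math.radians(theta_deg)), math.sin(math.radians(theta_deg))
    rot = [[c, -si, 0], [si, c, 0], [0, 0, 1]]
    return matmul(rot, matmul(proj, s))
```

With φ = 90° (projector aimed straight at the surface) the projective stage degenerates to the identity and only scaling and rotation remain.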
  • the projection image generation unit 214 stores the parameters set when each process included in the drawing conversion process is performed in the storage unit 216.
  • the projection image generation unit 214 transmits the projection image information indicating the generated projection image to the marking support device 1 via the communication unit 22.
  • the projection control unit 171 of the processor unit 17 of the marking support device 1 controls the projection unit 11 to project the projection image indicated by the projection image information received by the communication unit 18 onto the target surface.
  • the shooting control unit 174 controls the shooting unit 13 to shoot a projected image projected on the target surface.
  • the photographing control unit 174 transmits the photographed image information indicating the photographed image photographed by the photographing unit 13 to the information processing device 2 via the communication unit 18.
  • When the captured image processing unit 215 of the information processing device 2 receives the captured image information from the marking support device 1 via the communication unit 22, it extracts the projected image from the captured image indicated by the captured image information.
  • the communication unit 22 is an example of a captured image information receiving unit.
  • the captured image processing unit 215 executes an image conversion process that performs the reverse processing of the drawing conversion process on the projected image extracted from the captured image by using the parameters stored in the storage unit 216.
  • the image conversion process includes a reverse rotation conversion process, a reverse projective conversion process, and a reverse enlargement / reduction process.
  • the captured image processing unit 215 first executes a reverse rotation conversion process of rotating the projected image extracted from the captured image in the reverse direction using the parameters set in the rotation conversion process.
  • the captured image processing unit 215 executes a back-projection conversion process that performs a reverse projection conversion on the projected image that has been subjected to the reverse rotation conversion process, using the parameters set in the projective conversion process.
  • the captured image processing unit 215 executes a reverse enlargement / reduction process of conversely reducing or enlarging the projected image on which the back-projection conversion process has been performed, using the parameters set in the enlargement / reduction process.
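The reverse processing works because each stage is invertible with its stored parameters, applied in the opposite order. A two-stage sketch (the projective stage is omitted for brevity; the names are illustrative):

```python
import math

def forward(params, point):
    """Drawing conversion on a 2D point: enlargement/reduction, then
    rotation by the stored angle."""
    s, deg = params["scale"], params["rotation_deg"]
    x, y = point[0] * s, point[1] * s
    c, si = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return (c * x - si * y, si * x + c * y)

def reverse(params, point):
    """Image conversion: undo each stage in the opposite order using the
    stored parameters — reverse rotation first, then reverse scaling."""
    s, deg = params["scale"], params["rotation_deg"]
    c, si = math.cos(math.radians(-deg)), math.sin(math.radians(-deg))
    x, y = c * point[0] - si * point[1], si * point[0] + c * point[1]
    return (x / s, y / s)
```

Applying `reverse` to the output of `forward` with the same stored parameters recovers the original point, which is exactly what lets the captured image be compared against the original drawing data.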
  • the captured image processing unit 215 cuts out the drawing data stored in the storage unit 216 using the parameters set in the cropping process, and generates the original drawing data of the projected image.
  • The captured image processing unit 215 calculates the difference between the captured image on which the image conversion process has been executed and the original drawing data of the projected image. Such differences are, for example, dimensional differences, image distortion, and projection fading. This makes it possible to detect, for example, fading of the projected image caused by reduced illuminance when the distance from the projection unit 11 to the target surface is long, and dimensional differences or distortion caused by the projected image being deformed, or projected larger than intended, when the target surface is significantly tilted or uneven.
  • The captured image processing unit 215 determines whether or not the difference between the captured image on which the image conversion process has been executed and the original drawing data of the projected image exceeds a threshold value. When the difference exceeds the threshold value, the captured image processing unit 215 outputs error information indicating that the projected image is abnormal. As a result, the user can notice the abnormality of the projected image and prevent a marking failure.
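The threshold check might look like the following sketch, treating both images as equally sized grayscale grids; the representation and the threshold semantics are assumptions:

```python
def check_projected_image(converted, original, threshold):
    """Mean absolute difference between the reverse-converted captured image
    and the original drawing data (both as equally sized grayscale grids).
    Error information is flagged when the threshold is exceeded."""
    total = sum(abs(a - b)
                for row_c, row_o in zip(converted, original)
                for a, b in zip(row_c, row_o))
    count = sum(len(row) for row in original)
    diff = total / count
    return {"difference": diff, "error": diff > threshold}
```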
  • The projection path calculation unit 212 of the calculation storage unit 21 of the information processing device 2 calculates, based on the information indicating the current position and orientation of the marking support device 1 calculated by the current position calculation unit 211 and on the drawing data, the movement path along which the marking support device 1 writes the marking information on the target surface and the ink ejection timing.
  • the projection path calculation unit 212 transmits the marking path information indicating the calculated movement path and the ink ejection timing to the marking support device 1 via the communication unit 22.
  • The communication unit 22 is an example of a marking instruction information acquisition unit and a marking path information transmitting unit.
  • the projection path calculation unit 212 is an example of the marking path calculation unit.
  • When the drive control unit 173 receives the marking path information, it controls the movement drive unit 141 to move the marking support device 1 along the movement path indicated by the marking path information.
  • The communication unit 18 is an example of a marking path information receiving unit.
  • The flight height of the marking support device 1 is set so that the distance from the marking unit 15 to the floor surface is within the reach of the ink.
  • the drive control unit 173 may control not only the flight height of the marking support device 1 but also the attitude of the marking support device 1.
  • the marking control unit 175 controls the marking unit 15 to eject ink at the ink ejection timing indicated by the marking path information.
  • The marking support device 1 writes the marking information, such as the installation position of equipment, the drilling position of an anchor bolt, and the sticking position of a sheet, on the target surface. Thereby, the marking can be performed on the target surface even if a marking operator cannot reach it, and the lines, shapes, and the like of the projected image can be transcribed onto the target surface.
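The ink ejection timing can be scheduled from the marking polyline and the device speed. A sketch under the assumption that ink is ejected at the start point and then at a fixed spacing along the path (constant speed, no acceleration model):

```python
import math

def ink_ejection_schedule(polyline, speed, spacing):
    """Timestamps (seconds from the start) at which ink is ejected while
    tracing the marking polyline at constant speed, one ejection at the
    start and then every `spacing` meters of traveled distance."""
    total = sum(math.dist(p, q) for p, q in zip(polyline, polyline[1:]))
    times = [0.0]  # eject at the start point
    d = spacing
    while d <= total:
        times.append(d / speed)
        d += spacing
    return times
```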
  • the projection condition setting process shown in FIG. 6 starts when the power is turned on to the user terminal 3.
  • the control unit 33 of the user terminal 3 determines whether or not the display instruction of the projection condition input screen has been input to the input unit 32 (step S11).
  • When the display instruction has not been input (step S11; NO), the control unit 33 repeats step S11 and waits for the input of the display instruction of the projection condition input screen.
  • When the display instruction of the projection condition input screen is input (step S11; YES), the control unit 33 displays, on the display unit 31, an input screen that displays the drawing data acquired from the information processing device 2 and accepts the input of the projection conditions (step S12).
  • the control unit 33 determines whether or not the projection position of the marking support device 1 has been input to the input screen (step S13).
  • When the control unit 33 determines that the projection position of the marking support device 1 has not been input to the input screen (step S13; NO), it repeats step S13 and waits for the input of the projection position of the marking support device 1.
  • the user specifies the two-dimensional coordinates 42 in the horizontal plane (XY plane in the drawing) of the projection position of the marking support device 1 on the drawing data 41 displayed on the input screen. Further, the user specifies the flight height 43, which is the coordinates in the vertical direction (Z direction in the drawing) of the projection position of the marking support device 1, on the drawing data displayed on the input screen.
  • When these are specified, the control unit 33 of the user terminal 3 determines that the projection position of the marking support device 1 has been input to the input screen.
  • When the control unit 33 determines that the projection position of the marking support device 1 has been input to the input screen (step S13; YES), it determines whether or not the projection range of the marking support device 1 has been input to the input screen (step S14).
  • When the control unit 33 determines that the projection range of the marking support device 1 has not been input to the input screen (step S14; NO), it repeats step S14 and waits for the input of the projection range of the marking support device 1.
  • When the projection range is specified, the control unit 33 of the user terminal 3 determines that the projection range of the marking support device 1 has been input to the input screen.
  • When the control unit 33 determines that the projection range of the marking support device 1 has been input to the input screen (step S14; YES), it calculates the center point of the input projection range (step S15).
  • That is, the control unit 33 calculates the three-dimensional coordinates (X_C, Y_C, 0) of the center point 45 of the projection range.
  • Next, the control unit 33 generates projection condition information indicating the input projection position of the marking support device 1, the projection range, and the center point of the projection range, and transmits it to the information processing device 2 via the communication unit 34 (step S16). At this time, the control unit 33 may close the input screen.
  • That is, the control unit 33 generates projection condition information indicating the three-dimensional coordinates (X, Y, Z) of the projection position of the marking support device 1 specified by the user, the projection range, and the three-dimensional coordinates (X_C, Y_C, 0) of the center point 45 of the projection range, and transmits it to the information processing device 2 via the communication unit 34.
  • When the power of the user terminal 3 is not turned off (step S17; NO), the process returns to step S11 and steps S11 to S17 are repeated.
  • When the power is turned off (step S17; YES), the process ends. If the entire range of the drawing data is the projection range, step S14 may be omitted. If the input screen is not closed after the projection condition information is transmitted in step S16 shown in FIG. 6, the process may return to step S13 and repeat steps S13 to S16. In this case, when the user inputs an instruction to close the input screen, the control unit 33 closes the input screen.
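Steps S15 and S16 amount to computing the center of the input projection range and bundling it into the projection condition information. A sketch with an axis-aligned rectangular range on the floor — the dictionary layout is illustrative, not the patent's data format:

```python
def projection_condition(projection_pos, projection_range):
    """Sketch of steps S15-S16: compute the three-dimensional coordinates of
    the center point of a rectangular projection range on the floor (z = 0)
    and bundle them with the projection position."""
    (x1, y1), (x2, y2) = projection_range  # opposite corners on the drawing
    center = ((x1 + x2) / 2.0, (y1 + y2) / 2.0, 0.0)
    return {
        "projection_position": projection_pos,  # (X, Y, Z) incl. flight height
        "projection_range": projection_range,
        "center_point": center,                 # (X_C, Y_C, 0)
    }
```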
  • the movement rotation setting process shown in FIG. 7 starts when the communication unit 22 of the information processing device 2 receives the projection condition information from the user terminal 3.
  • The projection path calculation unit 212 of the calculation storage unit 21 determines whether or not the current position of the marking support device 1 calculated by the current position calculation unit 211 and the projection position indicated by the projection condition information match (step S21).
  • the projection path calculation unit 212 determines whether or not the coordinates of the current position of the marking support device 1 on the drawing, the height from the floor surface, and the three-dimensional coordinates of the projection position match.
  • When the projection path calculation unit 212 determines that the current position of the marking support device 1 and the projection position do not match (step S21; NO), it calculates, based on the information indicating the current position of the marking support device 1 calculated by the current position calculation unit 211 and on the drawing data, the movement path from the current position to the projection position of the marking support device 1 (step S22).
  • the projection path calculation unit 212 transmits the movement route information indicating the calculated movement route to the marking support device 1 via the communication unit 22 (step S23), and the process returns to step S21.
  • Specifically, the projection path calculation unit 212 calculates, based on the information indicating the coordinates on the drawing, the height from the floor surface, and the orientation of the current position of the marking support device 1 calculated by the current position calculation unit 211 and on the drawing data, the movement path along which the marking support device 1 moves from the current position to the three-dimensional coordinates (X, Y, Z) of the projection position indicated by the projection condition information.
  • the movement path information indicates, for example, a movement path from the current position of the marking support device 1 to the two-dimensional coordinates (X, Y) of the projection position, and a flight height Z at the projection position.
  • When the projection path calculation unit 212 determines that the current position of the marking support device 1 and the projection position match (step S21; YES), it calculates the pointing-direction coordinates based on the current position of the marking support device 1 calculated by the current position calculation unit 211, that is, the projection position, the orientation of the marking support device 1, the distance from the projection unit 11 to the target surface calculated by the relative position calculation unit 213, and the angle of the pointing direction with respect to the target surface (step S24).
  • Specifically, the projection path calculation unit 212 calculates the pointing-direction coordinates (X_D, Y_D, 0) based on the coordinates on the drawing of the current position of the marking support device 1 calculated by the current position calculation unit 211, that is, the three-dimensional coordinates (X, Y, Z) of the projection position, the orientation θ of the marking support device 1, the distance L2 in the pointing direction from the projection unit 11 to the target surface calculated by the relative position calculation unit 213, and the angle φ of the pointing direction with respect to the target surface.
  • The projection path calculation unit 212 determines whether or not the pointing-direction coordinates and the center point of the projection range indicated by the projection condition information match (step S25).
  • When the projection path calculation unit 212 determines that the pointing-direction coordinates and the center point of the projection range do not match (step S25; NO), it calculates, based on the current position and orientation of the marking support device 1 calculated by the current position calculation unit 211 and on the distance in the pointing direction from the projection unit 11 to the target surface and the angle of the pointing direction with respect to the target surface calculated by the relative position calculation unit 213, the horizontal rotation amount and the vertical rotation amount of the projection unit 11 and the photographing unit 13 that make the pointing-direction coordinates coincide with the center point of the projection range indicated by the projection condition information (step S26).
  • The projection path calculation unit 212 transmits the rotation amount information indicating the calculated horizontal rotation amount and vertical rotation amount of the projection unit 11 and the photographing unit 13 to the marking support device 1 via the communication unit 22 (step S27), and the process returns to step S25.
  • That is, when the pointing-direction coordinates (X_D, Y_D, 0) do not match the three-dimensional coordinates (X_C, Y_C, 0) of the center point 45 of the projection range indicated by the projection condition information, the projection path calculation unit 212 calculates the horizontal rotation amount and the vertical rotation amount of the projection unit 11 and the photographing unit 13 that make the pointing-direction coordinates (X_D, Y_D, 0) match the center point, based on the coordinates (X, Y, Z) and orientation θ of the current position of the marking support device 1 calculated by the current position calculation unit 211, the distance L2 in the pointing direction from the projection unit 11 to the target surface calculated by the relative position calculation unit 213, and the angle φ of the pointing direction with respect to the target surface, and transmits the rotation amount information indicating the calculated rotation amounts to the marking support device 1 via the communication unit 22.
  • When the projection path calculation unit 212 determines that the pointing-direction coordinates and the center point of the projection range match (step S25; YES), the process ends.
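The decision structure of FIG. 7 (steps S21 and S25) can be summarized as choosing the next instruction to send to the device. A sketch with illustrative return values and a tolerance in place of exact coordinate matching:

```python
def next_instruction(current, projection_pos, pointing, center, tol=1e-6):
    """Sketch of the FIG. 7 decisions: send a movement route while the device
    is away from the projection position (step S21; NO), send a rotation
    amount while the pointing-direction coordinates miss the center point of
    the projection range (step S25; NO), otherwise finish."""
    if any(abs(c - p) > tol for c, p in zip(current, projection_pos)):
        return "send_movement_route"   # steps S22-S23
    if any(abs(d - c) > tol for d, c in zip(pointing, center)):
        return "send_rotation_amount"  # steps S26-S27
    return "done"
```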
  • the projection image generation process shown in FIG. 8 starts when the movement rotation setting process is completed.
  • The projection image generation unit 214 of the calculation storage unit 21 of the information processing device 2 executes the drawing conversion process including the cropping process, the enlargement/reduction process, the projective conversion process, and the rotation conversion process shown in FIG.
  • the projection image generation unit 214 first executes a cropping process for cropping the image data in the projection range indicated by the projection condition information (step S31).
  • Next, the projection image generation unit 214 executes an enlargement/reduction process of enlarging or reducing the cut-out image data so that its dimensions match when it is projected according to the distance to the center point of the projection range, that is, the distance to the pointing-direction coordinates calculated by the relative position calculation unit 213 (step S32).
  • the projected image generation unit 214 executes a projective conversion process of performing a projective conversion on the enlarged or reduced image data according to the angle of the pointing direction with respect to the target surface calculated by the relative position calculation unit 213 (step S33).
  • the projected image generation unit 214 executes a rotation conversion process of rotating the projected image data according to the orientation of the marking support device 1 calculated by the current position calculation unit 211 (step S34).
  • the projection image generation unit 214 stores the parameters set when each process is performed in steps S31 to S34 in the storage unit 216.
  • the projection image generation unit 214 transmits the projection image information indicating the converted projection image to the marking support device 1 via the communication unit 22 (step S35), and ends the process.
  • When a plurality of projection positions, projection ranges, and dwell times are input in steps S13 and S14 of the projection condition setting process shown in FIG. 6, the information processing device 2 executes the movement rotation setting process shown in FIG. 7 and the projection image generation process shown in FIG. 8 each time the dwell time at each projection position elapses.
  • the marking support process shown in FIG. 9 starts when the power is turned on to the marking support device 1.
  • the drive control unit 173 of the processor unit 17 of the marking support device 1 determines whether or not the communication unit 18 has received the movement route information from the information processing device 2 (step S41).
  • When the communication unit 18 has not received the movement route information (step S41; NO), step S41 is repeated to wait for the reception of the movement route information.
  • When the movement route information is received from the information processing device 2 (step S41; YES), the drive control unit 173 controls the movement drive unit 141 to move the marking support device 1 to the projection position along the movement route indicated by the movement route information (step S42).
  • Specifically, the drive control unit 173 moves the marking support device 1 along the movement route up to the two-dimensional coordinates (X, Y) of the projection position indicated by the movement route information, and raises or lowers it to the flight height Z.
  • the drive control unit 173 determines whether or not the communication unit 18 has received the rotation amount information from the information processing device 2 (step S43).
  • When the communication unit 18 has not received the rotation amount information (step S43; NO), step S43 is repeated to wait for the reception of the rotation amount information.
  • When the rotation amount information is received (step S43; YES), the drive control unit 173 controls the direction rotation unit 142 to rotate the projection unit 11 and the photographing unit 13 by the horizontal rotation amount and the vertical rotation amount indicated by the rotation amount information (step S44).
  • The projection control unit 171 of the processor unit 17 determines whether or not the communication unit 18 has received the projection image information from the information processing device 2 (step S45). When the communication unit 18 has not received the projection image information (step S45; NO), step S45 is repeated to wait for the reception of the projection image information. When the projection image information is received from the information processing device 2 via the communication unit 18 (step S45; YES), the projection control unit 171 controls the projection unit 11 to project the projection image indicated by the received projection image information onto the target surface (step S46).
  • the shooting control unit 174 of the processor unit 17 controls the shooting unit 13 to shoot the projected image projected on the target surface (step S47).
  • the photographing control unit 174 transmits the photographed image information indicating the photographed image photographed by the photographing unit 13 to the information processing device 2 via the communication unit 18 (step S48).
  • Until a predetermined projection time elapses (step S49; NO), the projection control unit 171 of the processor unit 17 controls the projection unit 11 to continue projecting the projected image onto the target surface. When the predetermined projection time elapses (step S49; YES), the process ends. In step S49, it may instead be determined whether or not end instruction information has been received.
  • In that case, while the projection control unit 171 of the processor unit 17 has not received the end instruction information (step S49; NO), it controls the projection unit 11 to continue projecting the projected image onto the target surface, and when the end instruction information is received (step S49; YES), the process ends.
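The device-side sequence of FIG. 9 — wait for movement route information, then rotation amount information, then projected image information, then photograph and report — can be sketched as a message-driven loop (the message names and action strings are illustrative, not the patent's API):

```python
def marking_support_steps(messages):
    """Sketch of the FIG. 9 flow: the device waits for movement route,
    rotation amount, and projected image information in that order
    (steps S41, S43, S45), acts on each, then photographs the projected
    image and sends the captured image information (steps S47, S48)."""
    actions = []
    queue = list(messages)
    for kind in ("movement_route", "rotation_amount", "projection_image"):
        while queue and queue[0] != kind:
            queue.pop(0)  # keep waiting: ignore anything else
        if not queue:
            return actions  # still waiting when the input ran out
        queue.pop(0)
        actions.append(f"handle {kind}")
    actions.append("photograph projected image")
    actions.append("send captured image information")
    return actions
```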
  • The projected image determination process shown in FIG. 10 starts when the communication unit 22 of the information processing device 2 receives the captured image information from the marking support device 1.
  • the captured image processing unit 215 extracts a projected image from the captured image indicated by the captured image information (step S51).
  • the captured image processing unit 215 executes an image conversion process that performs the reverse processing of the drawing conversion process on the projected image extracted from the captured image by using the parameters stored in the storage unit 216.
  • Specifically, the captured image processing unit 215 first executes a reverse rotation conversion process of rotating the projected image extracted from the captured image in the reverse direction using the parameters set in the rotation conversion process (step S52). Next, it executes a back-projection conversion process that performs a reverse-direction projective conversion on the projected image subjected to the reverse rotation conversion process, using the parameters set in the projective conversion process (step S53). Next, it executes a reverse enlargement/reduction process of reducing or enlarging, in the opposite direction, the projected image subjected to the back-projection conversion process, using the parameters set in the enlargement/reduction process (step S54).
  • the captured image processing unit 215 cuts out the drawing data stored in the storage unit 216 using the parameters set in the cropping process, and generates the original drawing data of the projected image (step S55).
  • the captured image processing unit 215 calculates the difference between the captured image subjected to the image conversion process and the original drawing data of the projected image (step S56). Differences between the captured image subjected to the image conversion process and the original drawing data of the projected image are, for example, dimensional differences, image distortion, and projection fading.
  • The captured image processing unit 215 determines whether or not the difference between the captured image on which the image conversion process has been executed and the original drawing data of the projected image exceeds the threshold value (step S57). When the difference does not exceed the threshold value (step S57; NO), the process ends. When the difference exceeds the threshold value (step S57; YES), the captured image processing unit 215 outputs error information indicating that the projected image is abnormal (step S58), and the process ends. If the entire range of the drawing data is the projection range, step S51 may be omitted.
  • As described above, the movable marking support device 1 moves to the projection position and projects, onto the target surface, the projected image obtained by converting the drawing data including the marking information based on the angle of the pointing direction, thereby improving convenience in supporting the marking work.
  • the information processing device 2 includes a temporary storage unit 101, a storage unit 102, a calculation unit 103, an input unit 104, a transmission / reception unit 105, and a display unit 106.
  • the temporary storage unit 101, the storage unit 102, the input unit 104, the transmission / reception unit 105, and the display unit 106 are all connected to the calculation unit 103 via a bus.
  • the calculation unit 103 is, for example, a CPU. According to the control program stored in the storage unit 102, the calculation unit 103 executes the processes of the current position calculation unit 211, the projection path calculation unit 212, the relative position calculation unit 213, the projection image generation unit 214, and the captured image processing unit 215 of the calculation storage unit 21.
  • the temporary storage unit 101 is, for example, a RAM (Random-Access Memory).
  • the control program stored in the storage unit 102 is loaded into the temporary storage unit 101, which is used as a work area of the calculation unit 103.
  • the storage unit 102 is a non-volatile memory such as a flash memory, a hard disk, a DVD-RAM (Digital Versatile Disc-Random Access Memory), or a DVD-RW (Digital Versatile Disc-ReWritable).
  • the storage unit 102 stores in advance a program for causing the calculation unit 103 to perform the processing of the information processing device 2. Further, the storage unit 102 supplies the stored data to the calculation unit 103 according to instructions from the calculation unit 103, and stores data supplied from the calculation unit 103.
  • the storage unit 216 of the arithmetic storage unit 21 is configured in the storage unit 102.
  • the input unit 104 is an interface device that connects input devices such as a keyboard and a pointing device to the bus. For example, when the user directly inputs drawing data to the information processing device 2, the input drawing data is supplied to the calculation unit 103 via the input unit 104.
  • the transmission / reception unit 105 is a network termination device or wireless communication device connected to the network, and a serial interface or LAN (Local Area Network) interface connected to them.
  • the transmission / reception unit 105 functions as a communication unit 22.
  • the display unit 106 is a display device such as a CRT (Cathode Ray Tube) or an LCD (Liquid Crystal Display). For example, when the user directly inputs the drawing data to the information processing device 2, the display unit 106 displays the drawing data input screen.
  • the processes of the current position calculation unit 211, the projection path calculation unit 212, the relative position calculation unit 213, the projection image generation unit 214, the captured image processing unit 215, and the storage unit 216 of the calculation storage unit 21 of the information processing device 2 shown in FIG. are executed by the control program using the temporary storage unit 101, the calculation unit 103, the storage unit 102, the input unit 104, the transmission / reception unit 105, the display unit 106, and the like as resources.
  • the central part that performs the processing of the information processing device 2 can be realized using a normal computer system, without a dedicated system.
  • a computer program for executing the above operations can be stored and distributed on a computer-readable recording medium such as a flexible disk, a CD-ROM (Compact Disc-Read Only Memory), or a DVD-ROM (Digital Versatile Disc-Read Only Memory).
  • the information processing device 2 that executes the above-described processing may be configured by installing the computer program, distributed on such a medium, in a computer. Further, the information processing device 2 may be configured by storing the computer program in a storage device of a server on a communication network such as the Internet and having a normal computer system download it.
  • when the function of the information processing device 2 is realized by division of roles between an OS (Operating System) and an application program, or by cooperation between the OS and the application program, only the application program part may be stored in the recording medium or the storage device.
  • the computer program may be posted on a bulletin board system (BBS, Bulletin Board System) on a communication network and provided via the network. The above processing may then be executed by starting this computer program and running it in the same manner as other application programs under the control of the OS.
  • although the marking support device 1 described above includes the marking unit 15, the marking support device 1 does not have to include the marking unit 15.
  • the processor unit 17 does not have to include the marking control unit 175.
  • although the drive unit 14 of the marking support device 1 includes the directional rotating unit 142, when projection of the projected image in the directivity direction of a fixed projection unit 11 and photographing in the imaging direction of the imaging unit 13 are always possible, the drive unit 14 does not have to include the directional rotating unit 142. In this case, the projection condition information does not have to include the projection direction. Further, when the directivity direction of the fixed projection unit 11 and the photographing direction of the photographing unit 13 are vertical, the calculation storage unit 21 of the information processing device 2 does not have to include the relative position calculation unit 213.
  • the projection image generation unit 214 executes a drawing conversion process for converting the drawing data stored in the storage unit 216 into a projection image based on the current position and orientation of the marking support device 1 calculated by the current position calculation unit 211. At this time, the drawing conversion process does not have to include the projection conversion process.
  • although the information processing device 2 described above includes the captured image processing unit 215, the information processing device 2 does not have to include the captured image processing unit 215 when it is not necessary to determine whether the projected image is normal, for example, when the distance from the projection unit 11 to the target surface is not so long that the illuminance of the projected image decreases, and the target surface is neither significantly tilted nor uneven.
  • the drive unit 14 includes a movement drive unit 141 that moves the marking support device 1 by flight, but the movement of the marking support device 1 is not limited to movement by flight.
  • the marking support device 1 may be a self-propelled device capable of moving in a plane, and may include an elevating mechanism capable of changing the heights of the projection unit 11 and the photographing unit 13.
  • the marking support system 100 includes the user terminal 3, but is not limited to this.
  • the user may input the projection condition information into the information processing device 2, or the information processing device 2 may acquire the projection condition information from another device or system.
  • although the marking support device 1 described above projects the projected image onto the floor surface of a building, the present disclosure is not limited to this. The surface onto which the marking support device 1 projects the projected image may be any surface on which the user performs marking work. For example, the marking support device 1 may project the projected image onto another surface such as a ceiling surface or a wall surface of a building.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Projection Apparatus (AREA)

Abstract

A communications unit (18) in a marking support device (1) provided in a marking support system (100) sends detection information to an information processing device (2), said detection information indicating the position, size, acceleration, and angular velocity of an object detected in the vicinity by a sensor unit (12). A communications unit (22) in the information processing device (2) obtains projection conditions information, indicating projection conditions including projection position, and receives detection information. A calculation storage unit (21) calculates a travel path from the present position of the marking support device (1) to the projection position, on the basis of the detection information, and converts drawing data including marking information into a projection image. The communications unit (22) sends, to the marking support device (1), projection image information that indicates the projection image and travel path information indicating the travel path. A processor unit (17) controls a drive unit (14) on the basis of the travel path information, moves the marking support device (1) to the projection position, controls the projection unit (11), and projects the projection image indicated by the projection image information onto a target surface.

Description

Marking support system, marking support device, information processing device, marking support method, and program
 This disclosure relates to a marking support system, a marking support device, an information processing device, a marking support method, and a program.
 In building construction, factory start-up, and the like, when the floor, wall, and ceiling surfaces of a building are processed or equipment is installed on these surfaces, lines, shapes, and other guide marks are drawn on the surfaces. This work is called "marking out". Generally, to draw a line, a worker measures the distance from a reference point and makes a mark, then stretches a thread coated with ink, powdered chalk, or the like between the reference point and the mark and snaps it.
 As an alternative, Patent Document 1 discloses a laser marking device that outputs and displays a laser line on a side wall surface, ceiling surface, floor surface, or the like of a structure. Patent Document 2 discloses a marking support device that uses a projector to project an image generated from a design drawing, a three-dimensional model, or the like onto the target surface to be marked.
Japanese Unexamined Patent Publication No. 2006-215019; Japanese Unexamined Patent Publication No. 2017-181342
 However, when the method described in Patent Document 1 or 2 is used to project a drawing onto the target surface with a laser or a projector to support marking work, the greater the horizontal distance from the projection unit, the more the projected drawing deviates from the original drawing due to spreading of the line width, and the lower the illuminance of the projection becomes. As a result, the accuracy of the projected drawing may decrease, or the projected drawing may become difficult for the worker to see. It is therefore necessary to create a projected image for each fixed area of the drawing and to move and operate the projection device at the projection base of each area, which is inconvenient.
 The present disclosure has been made in view of the above problems, and an object of the present disclosure is to improve convenience when a drawing is projected onto a target surface to support marking work.
 In order to achieve the above object, the marking support system of the present disclosure includes an information processing device that converts drawing data including marking information into a projected image and a marking support device that projects the projected image onto a target surface. The marking support device includes a projection unit, a sensor unit, a movement drive unit, a processor unit, a detection information transmission unit, a movement route information reception unit, and a projection image information reception unit. The projection unit projects the projected image in its directivity direction. The sensor unit detects the position and size of surrounding objects and the acceleration and angular velocity of the marking support device. The movement drive unit moves the marking support device. The processor unit controls the marking support device. The detection information transmission unit transmits, to the information processing device, detection information indicating the position and size of surrounding objects detected by the sensor unit and the acceleration and angular velocity of the marking support device. The movement route information reception unit receives, from the information processing device, movement route information indicating the movement route from the current position to the projection position. The projection image information reception unit receives, from the information processing device, projected image information indicating the projected image. The information processing device includes a projection condition information acquisition unit, a detection information reception unit, a current position calculation unit, a movement route calculation unit, a movement route information transmission unit, a projection image generation unit, and a projection image information transmission unit. The projection condition information acquisition unit acquires projection condition information indicating projection conditions, including the projection position, of the marking support device. The detection information reception unit receives the detection information from the marking support device. The current position calculation unit calculates the current position and orientation of the marking support device based on the detection information. The movement route calculation unit calculates the movement route from the current position of the marking support device to the projection position based on the projection condition information and the information indicating the current position calculated by the current position calculation unit. The movement route information transmission unit transmits the movement route information indicating the calculated movement route to the marking support device. The projection image generation unit performs a drawing conversion process that converts the drawing data into a projected image based on the information indicating the current position and orientation of the marking support device calculated by the current position calculation unit. The projection image information transmission unit transmits the projected image information indicating the generated projected image to the marking support device. The processor unit controls the movement drive unit based on the movement route information to move the marking support device to the projection position, and controls the projection unit to project the projected image indicated by the projected image information onto the target surface.
 According to the present disclosure, a movable marking support device moves to a projection position and projects onto a target surface a projected image obtained by converting drawing data including marking information based on the orientation of the marking support device. This improves convenience when a drawing is projected onto a target surface to support marking work.
A diagram showing a configuration example of the marking support system according to the embodiment
A block diagram showing a functional configuration example of the marking support device according to the embodiment
A block diagram showing a functional configuration example of the information processing device according to the embodiment
A diagram showing an example of an input screen displayed on the user terminal according to the embodiment
A diagram showing an example of a projected image projected by the marking support device
A diagram explaining the drawing conversion process that converts drawing data according to the embodiment into a projected image
A flowchart showing an example of the operation of the projection condition setting process according to the embodiment
A flowchart showing an example of the operation of the movement/rotation setting process according to the embodiment
A flowchart showing an example of the operation of the projection image generation process according to the embodiment
A flowchart showing an example of the operation of the marking support process according to the embodiment
A flowchart showing an example of the operation of the projection image determination process according to the embodiment
A diagram showing an example of the hardware configuration of the information processing device according to the embodiment
 Hereinafter, the marking support system according to the present embodiment will be described in detail with reference to the drawings. In the drawings, the same or corresponding parts are designated by the same reference numerals. In the present embodiment, an example of supporting marking work by projecting a projected image obtained by converting drawing data including marking information onto the floor surface of a building will be described. The marking information includes, for example, the installation positions of equipment, the drilling positions of anchor bolts, and the sticking positions of sheets. Hereinafter, the area of the floor surface corresponding to the original drawing data of the projected image is referred to as the target surface.
 As shown in FIG. 1, the marking support system 100 includes a marking support device 1 that projects a projected image onto a target surface, an information processing device 2 that converts drawing data including marking information into the projected image, and a user terminal 3 used by the user. The drawing data is, for example, a CAD drawing showing the positions of the walls, pillars, and the like of the building where the marking work is performed. The marking support device 1 is, for example, an autonomously flying unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle), typified by a drone. The information processing device 2 is, for example, a computer. The user terminal 3 is, for example, a tablet terminal. The marking support device 1 and the user terminal 3 are each wirelessly connected to the information processing device 2.
 The user terminal 3 includes a display unit 31 that displays an input screen, an input unit 32 that receives input from the user, a control unit 33 that controls the display unit 31 and the input unit 32, and a communication unit 34 that communicates with the information processing device 2. The display unit 31 is, for example, a liquid crystal screen. The input unit 32 is, for example, a touch panel. The control unit 33 is, for example, a housing incorporating a CPU (Central Processing Unit), memory, and the like. When the user inputs a display instruction for the input screen to the input unit 32, the control unit 33 generates the input screen and displays it on the display unit 31. When the user enters the projection conditions for the marking support device 1 on the input screen, the control unit 33 generates projection condition information indicating the entered projection conditions and transmits it to the information processing device 2 via the communication unit 34. The projection conditions include the projection position, projection direction, and the like of the marking support device 1.
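For illustration, the projection condition information that the user terminal sends could be serialized as shown below. The field names and JSON encoding are hypothetical; the patent only states that the projection conditions include the projection position and projection direction.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class ProjectionCondition:
    """Hypothetical projection condition message built by control unit 33."""
    x_m: float            # projection position in site coordinates (metres)
    y_m: float
    z_m: float
    direction_deg: float  # projection direction as a horizontal angle (degrees)

# What communication unit 34 might transmit to the information processing device.
cond = ProjectionCondition(x_m=3.0, y_m=5.0, z_m=2.0, direction_deg=90.0)
payload = json.dumps(asdict(cond))
```

A structured message like this keeps the position and direction fields unambiguous on the receiving side, regardless of the actual wire format used.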
 The marking support device 1 includes a projection unit 11 that projects the projected image in its directivity direction, a sensor unit 12 that detects the position and size of surrounding objects and the acceleration and angular velocity of the marking support device 1, a photographing unit 13 that photographs the projected image projected by the projection unit 11, and a drive unit 14 that moves the marking support device 1 and changes the directivity direction of the projection unit 11 and the photographing direction of the photographing unit 13. The marking support device 1 further includes a marking unit 15 that ejects ink, a power source 16 that supplies power to the marking support device 1, a processor unit 17 that controls the marking support device 1, and a communication unit 18 that communicates with the information processing device 2 and the user terminal 3.
 The projection unit 11 is, for example, a laser projector. The projection unit 11 projects the projected image onto the target surface according to instructions from the processor unit 17. The sensor unit 12 includes, for example, an LRF (Laser Range Finder) that obtains distance information by emitting laser light and receiving its reflection, an inertial sensor, and a gyro sensor. The sensor unit 12 detects the position and size of surrounding objects and the acceleration and angular velocity of the marking support device 1. The photographing unit 13 is, for example, a digital camera. The photographing unit 13 photographs the projected image projected by the projection unit 11 according to instructions from the processor unit 17.
 The drive unit 14 includes, for example, rotors with propellers, and a two-axis rotor that rotates the projection unit 11 and the photographing unit 13 in the horizontal and vertical directions. The drive unit 14 moves the marking support device 1 and changes the directivity direction of the projection unit 11 and the photographing direction of the photographing unit 13 according to instructions from the processor unit 17. The marking unit 15 is, for example, an inkjet nozzle. The marking unit 15 ejects ink according to instructions from the processor unit 17.
 The power source 16 is, for example, a lithium-ion battery. The power source 16 supplies power to the projection unit 11, the sensor unit 12, the photographing unit 13, the drive unit 14, the marking unit 15, the processor unit 17, and the communication unit 18. The communication unit 18 is, for example, a wireless communication device that connects to a wireless LAN (Local Area Network). The communication unit 18 establishes communication with the information processing device 2 and receives, from the information processing device 2, movement route information indicating the movement route to the projection position, rotation amount information indicating the rotation amounts of the projection unit 11 and the photographing unit 13, and projected image information indicating the projected image. The communication unit 18 is an example of a movement route information reception unit, a rotation amount information reception unit, and a projected image information reception unit.
 The processor unit 17 is, for example, a microcomputer. The processor unit 17 transmits detection information indicating the position and size of surrounding objects detected by the sensor unit 12 and the acceleration and angular velocity of the marking support device 1 to the information processing device 2 via the communication unit 18. The processor unit 17 may transmit the detection information to the information processing device 2 periodically, or may transmit it in response to a request from the information processing device 2. The processor unit 17 controls the drive unit 14 based on the movement route information received by the communication unit 18 and moves the marking support device 1 to the projection position. The processor unit 17 controls the drive unit 14 based on the rotation amount information received by the communication unit 18 and changes the directivity direction of the projection unit 11 and the photographing direction of the photographing unit 13. The processor unit 17 causes the projection unit 11 to project the projected image indicated by the projected image information received by the communication unit 18. The processor unit 17 controls the photographing unit 13 to photograph the projected image projected by the projection unit 11. The processor unit 17 transmits captured image information indicating the image captured by the photographing unit 13 to the information processing device 2 via the communication unit 18. The communication unit 18 is an example of a detection information transmission unit and a captured image information transmission unit.
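The control sequence described above can be sketched, very loosely, as follows. All class and method names are invented for illustration; the patent describes the behavior, not an API.

```python
class ProcessorUnit:
    """Hypothetical sketch of processor unit 17's marking-support sequence."""

    def __init__(self, drive, projector, camera, comm):
        self.drive = drive          # drive unit 14 (movement and rotation)
        self.projector = projector  # projection unit 11
        self.camera = camera        # photographing unit 13
        self.comm = comm            # communication unit 18

    def run_support_sequence(self):
        # Move along the movement route received from the information processor.
        for waypoint in self.comm.receive_route():
            self.drive.move_to(waypoint)
        # Rotate the projection/photographing direction by the received amount.
        self.drive.rotate(self.comm.receive_rotation())
        # Project the received projected image onto the target surface.
        self.projector.project(self.comm.receive_projection_image())
        # Photograph the projection and send the captured image back.
        self.comm.send_captured_image(self.camera.capture())
```

The sketch mirrors the order stated in the text: move, rotate, project, photograph, report back.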
 The information processing device 2 includes an arithmetic storage unit 21 that processes information received from the marking support device 1 and the user terminal 3, and a communication unit 22 that communicates with the marking support device 1 and the user terminal 3. The communication unit 22 is, for example, a wireless communication device that connects to a wireless LAN. The communication unit 22 receives, from the user terminal 3, the projection condition information indicating the projection conditions entered by the user. The communication unit 22 receives the detection information from the marking support device 1. The communication unit 22 is an example of a projection condition information acquisition unit and a detection information reception unit.
 The arithmetic storage unit 21 is, for example, a housing incorporating a CPU, memory, and the like. The arithmetic storage unit 21 stores drawing data acquired in advance. The information processing device 2 may, for example, receive the drawing data from an external device or system, or the user may input the drawing data to the information processing device 2. Based on the projection condition information and the detection information received by the communication unit 22, the arithmetic storage unit 21 calculates the movement route from the current position of the marking support device 1 to the projection position, and the rotation amounts of the projection unit 11 and the photographing unit 13 that align the directivity direction of the projection unit 11 and the photographing direction of the photographing unit 13 with the projection direction. The arithmetic storage unit 21 transmits, to the marking support device 1 via the communication unit 22, movement route information indicating the movement route from the current position of the marking support device 1 to the projection position and rotation amount information indicating the rotation amounts of the projection unit 11 and the photographing unit 13. The arithmetic storage unit 21 converts the drawing data into the projected image based on the detection information received by the communication unit 22. The arithmetic storage unit 21 transmits projected image information indicating the projected image obtained by converting the drawing data to the marking support device 1 via the communication unit 22. The communication unit 22 is an example of a movement route information transmission unit, a rotation amount information transmission unit, and a projected image information transmission unit.
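One simple way a movement route could be computed from the detected obstacle positions is grid-based breadth-first search. This is purely illustrative; the patent does not specify the planning algorithm.

```python
from collections import deque

def plan_route(start, goal, obstacles, size=10):
    """Shortest obstacle-avoiding route on a size x size grid via BFS.

    `obstacles` is a set of blocked cells, e.g. derived from the positions
    and sizes of objects reported in the detection information.
    Returns a list of cells from start to goal, or None if unreachable.
    """
    frontier = deque([(start, [start])])
    seen = {start}
    while frontier:
        (x, y), path = frontier.popleft()
        if (x, y) == goal:
            return path
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (x + dx, y + dy)
            if (0 <= nxt[0] < size and 0 <= nxt[1] < size
                    and nxt not in obstacles and nxt not in seen):
                seen.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None  # no obstacle-free route exists
```

BFS guarantees a shortest route in grid steps; a real planner for a flying device would work in three dimensions and account for the device's size and safety margins.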
 Here, the functional configuration of the marking support device 1 is described with reference to FIG. 2. The sensor unit 12 of the marking support device 1 comprises an external sensor unit 121, which detects the position and size of objects such as walls, floors, ceilings, and pillars within its sensing range, and an internal sensor unit 122, which detects the acceleration and angular velocity of the marking support device 1. The external sensor unit 121 is, for example, an LRF (laser range finder). The sensing range is the region within a certain distance of the external sensor unit 121, determined by the performance of the external sensor unit 121. The internal sensor unit 122 is, for example, an inertial sensor and a gyro sensor.
 The drive unit 14 comprises a movement drive unit 141, which moves the marking support device 1 by flight, and a direction rotation unit 142, which changes the pointing direction of the projection unit 11 and the imaging direction of the imaging unit 13. The movement drive unit 141 is, for example, a rotor having a propeller. The direction rotation unit 142 is, for example, a two-axis rotator that turns the projection unit 11 and the imaging unit 13 in the horizontal and vertical directions.
 The processor unit 17 comprises a projection control unit 171 that controls the projection unit 11, a detection information collection unit 172 that acquires the detection information of the external sensor unit 121 and the internal sensor unit 122, a drive control unit 173 that controls the movement drive unit 141 and the direction rotation unit 142, an imaging control unit 174 that controls the imaging unit 13, and a marking control unit 175 that controls the marking unit 15.
 The projection control unit 171 controls the projection unit 11 to project, onto the target surface, the projection image indicated by the projection image information received by the communication unit 18. The marking worker marks the target surface based on the marking information contained in the projected image, such as equipment installation positions, anchor-bolt drilling positions, and sheet attachment positions. When a predetermined projection time has elapsed, the projection control unit 171 causes the projection unit 11 to end the projection of the projection image. The projection time may be set by the user. Alternatively, when the marking worker finishes marking, the user may input a projection end instruction into the user terminal 3. In this case, the user terminal 3 transmits end instruction information indicating the input end instruction to the information processing device 2 via the communication unit 34, and the information processing device 2 transmits the end instruction information to the marking support device 1; when the marking support device 1 receives the end instruction information, the projection control unit 171 of the processor unit 17 causes the projection unit 11 to end the projection of the projection image.
 The detection information collection unit 172 transmits the detection information acquired from the external sensor unit 121 and the internal sensor unit 122 to the information processing device 2 via the communication unit 18. The drive control unit 173 controls the movement drive unit 141 to move the marking support device 1 to the projection position along the movement path indicated by the movement path information received by the communication unit 18. The drive control unit 173 also controls the direction rotation unit 142 to rotate the projection unit 11 and the imaging unit 13 by the rotation amounts indicated by the rotation amount information received by the communication unit 18. Because the marking support device 1 can change the pointing direction of the projection unit 11 and the imaging direction of the imaging unit 13, it can project and capture the projection image even when, for example, projection and capture from directly above are difficult.
 The imaging control unit 174 controls the imaging unit 13 to capture the projection image projected on the target surface, and transmits captured image information indicating the image captured by the imaging unit 13 to the information processing device 2 via the communication unit 18. The marking control unit 175 controls the marking unit 15 to eject ink. By having the marking control unit 175 control the marking unit 15 to eject ink while the drive control unit 173 controls the movement drive unit 141 to move the marking support device 1, lines, shapes, and the like can be drawn on the target surface. The functions of the other components are as described above.
 Next, the functional configuration of the information processing device 2 is described with reference to FIG. 3. The arithmetic storage unit 21 of the information processing device 2 comprises a current position calculation unit 211 that calculates the current position and orientation of the marking support device 1; a projection path calculation unit 212 that calculates the movement path from the current position of the marking support device 1 to the projection position, as well as the rotation amounts that make the pointing direction of the projection unit 11 and the imaging direction of the imaging unit 13 coincide with the projection direction; and a relative position calculation unit 213 that calculates the distance from the marking support device 1 to the target surface along the pointing direction of the projection unit 11 and the angle of the pointing direction with respect to the target surface. The arithmetic storage unit 21 further comprises a projection image generation unit 214 that converts the drawing data into a projection image; a captured image processing unit 215 that applies image conversion processing to the captured image, calculates its difference from the original drawing data of the projection image, and determines whether the projection image is normal; and a storage unit 216 that stores various information.
 The communication unit 22 stores, in the storage unit 216, the projection condition information received from the user terminal 3 indicating the projection conditions input by the user. The communication unit 22 also stores in the storage unit 216 the detection information of the external sensor unit 121 and the internal sensor unit 122 and the captured image information indicating the image captured by the imaging unit 13, both received from the marking support device 1. The current position calculation unit 211 calculates the current position and orientation of the marking support device 1 based on the detection information stored in the storage unit 216. The current position of the marking support device 1 consists of its coordinates on the drawing indicated by the drawing data and its height above the floor surface. The relative position calculation unit 213 calculates, based on the detection information stored in the storage unit 216, the distance from the marking support device 1 to the target surface along the pointing direction of the projection unit 11 and the angle of the pointing direction with respect to the target surface.
 The storage unit 216 stores drawing data acquired in advance. The projection image generation unit 214 executes drawing conversion processing, which converts the drawing data stored in the storage unit 216 into a projection image, based on the orientation of the marking support device 1 calculated by the current position calculation unit 211 and on the distance from the marking support device 1 to the target surface along the pointing direction of the projection unit 11 and the angle of the pointing direction with respect to the target surface, both calculated by the relative position calculation unit 213. The projection image generation unit 214 stores in the storage unit 216 the parameters set when the drawing conversion processing was executed.
 The captured image processing unit 215 extracts the projection image from the captured image indicated by the captured image information. Using the parameters stored in the storage unit 216, the captured image processing unit 215 executes image conversion processing, which applies to the projection image extracted from the captured image the inverse of the drawing conversion processing. The captured image processing unit 215 then calculates the difference between the image-converted captured image and the original drawing data of the projection image. When this difference exceeds a threshold value, the captured image processing unit 215 outputs error information indicating that the projection image is abnormal. The error information may be output as a screen display or as audio, or may be transmitted to the user terminal 3 via the communication unit 22. When the captured image processing unit 215 transmits the error information to the user terminal 3, the control unit 33 of the user terminal 3 causes the display unit 31 to display the error information received from the information processing device 2 via the communication unit 34.
 Next, the processing executed by the marking support device 1, the information processing device 2, and the user terminal 3 is described with reference to FIGS. 4A and 4B. FIG. 4A shows an example of an input screen displayed on the user terminal 3. FIG. 4B shows an example of a projection image projected by the marking support device 1. When the user inputs an instruction to display the projection condition input screen into the input unit 32 of the user terminal 3, the control unit 33 acquires the drawing data from the information processing device 2 and causes the display unit 31 to display an input screen that shows the drawing data and accepts input of the projection conditions.
 As shown in FIG. 4A, the user specifies, on the drawing data 41 displayed on the input screen, the two-dimensional coordinates 42 of the projection position of the marking support device 1 in the horizontal plane (the XY plane in the figure). The user also specifies, on the displayed drawing data, the flight height 43, which is the vertical coordinate (the Z direction in the figure) of the projection position of the marking support device 1. That is, the user specifies the three-dimensional coordinates (X, Y, Z) of the projection position of the marking support device 1 on the drawing data displayed on the input screen. The user may also specify multiple projection positions together with dwell times, for example hovering at coordinates (X1, Y1, Z1) for N minutes, then moving to coordinates (X2, Y2, Z2) and hovering for M minutes.
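 The multiple projection positions with dwell times described above can be pictured, in a minimal sketch, as an ordered list of waypoints. All names here are hypothetical illustrations, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    x: float          # drawing-plane X coordinate of the projection position
    y: float          # drawing-plane Y coordinate of the projection position
    z: float          # flight height above the floor
    dwell_min: float  # minutes to hover and project at this position

# Hover at (X1, Y1, Z1) for N = 5 minutes, then at (X2, Y2, Z2) for M = 3 minutes.
route = [Waypoint(2.0, 3.0, 1.5, 5.0), Waypoint(6.0, 3.0, 1.5, 3.0)]
total_projection_minutes = sum(w.dwell_min for w in route)
```

A projection range (described next) would then be attached per waypoint when multiple positions are specified.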
 Next, the user specifies an area 44 on the drawing data 41 displayed on the input screen as the projection range. For example, the user specifies as the projection range an area 44 whose diagonal is a line dragged on the displayed drawing data 41. Alternatively, the user may input the value of the maximum distance L1 from the projection unit 11 to the projection image 47 shown in FIG. 4B and specify as the projection range the area 44 on the drawing data 41 defined by the maximum distance L1. A projection range is specified on the drawing data 41 because, depending on the performance of the projection unit 11 and the on-site brightness, projecting over too wide a range makes the projection image difficult to see at greater distances from the projection unit 11, making the work difficult. When the size of the drawing data 41 is sufficiently small relative to the performance of the projection unit 11 and the on-site brightness, or when the drawing data 41 has been divided in advance, the user may omit the projection range and use the entire drawing data 41 as the projection range. When specifying multiple projection positions and dwell times, the user specifies a projection range for each projection position.
 The control unit 33 calculates the three-dimensional coordinates (X_C, Y_C, 0) of the center point 45 of the specified projection range. For example, when the projection range is a quadrangle as shown in FIG. 4A, the intersection of its diagonals is taken as the center point. The control unit 33 generates projection condition information indicating the three-dimensional coordinates (X, Y, Z) of the projection position of the marking support device 1 specified by the user, the projection range, and the three-dimensional coordinates (X_C, Y_C, 0) of the center point 45 of the projection range, and transmits it to the information processing device 2 via the communication unit 34. The input screen may also include a marking instruction button for instructing the marking support device 1 to perform marking. When the user taps the marking instruction button, the control unit 33 generates marking instruction information indicating a marking instruction for the marking support device 1 and transmits it to the information processing device 2 via the communication unit 34.
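 Since a rectangular projection range is specified by dragging one diagonal, the diagonal intersection reduces to the midpoint of the dragged line. A minimal sketch of the center-point calculation (the function name is hypothetical):

```python
def center_of_projection_range(corner_a, corner_b):
    """Return the 3D coordinates (X_C, Y_C, 0) of the center point of a
    rectangular projection range given the two endpoints of its dragged
    diagonal: for a rectangle, the diagonals cross at the diagonal's midpoint."""
    (xa, ya), (xb, yb) = corner_a, corner_b
    return ((xa + xb) / 2.0, (ya + yb) / 2.0, 0.0)

# A diagonal dragged from (1, 1) to (5, 3) yields the center (3.0, 2.0, 0.0).
```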
 When the communication unit 22 receives the projection condition information from the user terminal 3, the projection path calculation unit 212 of the arithmetic storage unit 21 of the information processing device 2 determines whether the current drawing coordinates and height above the floor surface of the marking support device 1, calculated by the current position calculation unit 211, match the three-dimensional coordinates (X, Y, Z) of the projection position, that is, whether the marking support device 1 has arrived at the projection position. If it determines that they do not match, the projection path calculation unit 212 calculates the movement path along which the marking support device 1 moves from its current position to the three-dimensional coordinates (X, Y, Z) of the projection position indicated by the projection condition information, based on the drawing data and on the information indicating the current drawing coordinates and floor height calculated by the current position calculation unit 211. The movement path from the current drawing coordinates of the marking support device 1 to the two-dimensional coordinates (X, Y) of the projection position is, for example, the shortest path on the drawing. The projection path calculation unit 212 is an example of a movement path calculation unit.
 The projection path calculation unit 212 transmits movement path information indicating the calculated movement path to the marking support device 1 via the communication unit 22. The movement path information indicates, for example, the movement path from the current drawing coordinates of the marking support device 1 to the two-dimensional coordinates (X, Y) of the projection position, and the flight height Z at the projection position.
 When the drive control unit 173 of the processor unit 17 of the marking support device 1 receives the movement path information from the information processing device 2 via the communication unit 18, it controls the movement drive unit 141 to move the device to the three-dimensional coordinates (X, Y, Z) of the projection position along the movement path indicated by the movement path information. For example, the drive control unit 173 moves the marking support device 1 along the movement path to the two-dimensional coordinates (X, Y) of the projection position and then raises or lowers it to the flight height Z.
 If the projection path calculation unit 212 of the information processing device 2 determines that the current drawing coordinates and floor height of the marking support device 1 match the three-dimensional coordinates (X, Y, Z) of the projection position, it calculates the three-dimensional coordinates (X_D, Y_D, 0) of the intersection point 48 between the extension line of the pointing direction of the projection unit 11 and the floor surface, based on the current drawing coordinates and floor height calculated by the current position calculation unit 211 (that is, the three-dimensional coordinates (X, Y, Z) of the projection position), the orientation θ of the marking support device 1, and the distance L2 to the target surface along the pointing direction of the projection unit 11 and the angle φ of the pointing direction with respect to the target surface, both calculated by the relative position calculation unit 213. Hereinafter, the three-dimensional coordinates of the intersection point 48 between the extension line of the pointing direction of the projection unit 11 and the floor surface are called the pointing direction coordinates. The projection path calculation unit 212 determines whether the calculated pointing direction coordinates (X_D, Y_D, 0) match the three-dimensional coordinates (X_C, Y_C, 0) of the center point 45 of the projection range indicated by the projection condition information, that is, whether the pointing direction matches the projection direction.
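 When the target surface is the floor, the pointing direction coordinates follow from the device pose by simple trigonometry. A minimal sketch, assuming θ is the horizontal bearing of the pointing direction and φ is its angle above the floor plane; the exact parameterization used by the relative position calculation unit 213 may differ:

```python
import math

def pointing_direction_coordinates(x, y, z, theta, phi):
    """Intersection (X_D, Y_D, 0) of the projector's pointing line with the
    floor, for a device hovering at (x, y, z) whose pointing direction has
    horizontal bearing theta and makes angle phi (0 < phi <= pi/2) with the
    floor plane."""
    horizontal_reach = z / math.tan(phi)  # 0 when pointing straight down
    return (x + horizontal_reach * math.cos(theta),
            y + horizontal_reach * math.sin(theta),
            0.0)
```

For example, a device at height 2 m pointing 45 degrees below horizontal along the +X axis meets the floor 2 m ahead of its ground position.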
 If the projection path calculation unit 212 determines that the pointing direction coordinates (X_D, Y_D, 0) do not match the three-dimensional coordinates (X_C, Y_C, 0) of the center point 45 of the projection range indicated by the projection condition information, it calculates the horizontal and vertical rotation amounts of the projection unit 11 and the imaging unit 13 that bring the pointing direction coordinates (X_D, Y_D, 0) into agreement with the three-dimensional coordinates (X_C, Y_C, 0) of the center point 45, based on the current position (X, Y, Z) and orientation θ of the marking support device 1 calculated by the current position calculation unit 211 and on the distance L2 to the target surface along the pointing direction of the projection unit 11 and the angle φ of the pointing direction with respect to the target surface, both calculated by the relative position calculation unit 213. The projection path calculation unit 212 transmits rotation amount information indicating the calculated horizontal and vertical rotation amounts of the projection unit 11 and the imaging unit 13 to the marking support device 1 via the communication unit 22. The projection path calculation unit 212 is an example of a rotation amount calculation unit.
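 One way to obtain the horizontal and vertical rotation amounts is to compare the current pan/tilt of the projection unit with the pan/tilt that would aim it at the center point (X_C, Y_C, 0). A hedged sketch under the same hypothetical parameterization as above (bearing θ, angle φ toward the floor):

```python
import math

def rotation_amounts(x, y, z, theta, phi, xc, yc):
    """Horizontal (pan) and vertical (tilt) rotation amounts that move the
    pointing line of a device hovering at (x, y, z) so that it meets the
    floor at the projection-range center (xc, yc, 0)."""
    dx, dy = xc - x, yc - y
    target_theta = math.atan2(dy, dx)                # desired bearing
    target_phi = math.atan2(z, math.hypot(dx, dy))   # desired angle to floor
    return target_theta - theta, target_phi - phi
```

If the projection unit is already aimed at the center point, both rotation amounts come out as zero.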
 When the drive control unit 173 of the processor unit 17 of the marking support device 1 receives the rotation amount information from the information processing device 2 via the communication unit 18, it controls the direction rotation unit 142 to rotate the projection unit 11 and the imaging unit 13 by the horizontal and vertical rotation amounts indicated by the rotation amount information.
 If the projection image generation unit 214 of the information processing device 2 determines that the pointing direction coordinates (X_D, Y_D, 0) match the three-dimensional coordinates (X_C, Y_C, 0) of the center point 45 of the projection range, it executes drawing conversion processing, which converts the drawing data stored in the storage unit 216 into a projection image, based on the orientation θ of the marking support device 1 calculated by the current position calculation unit 211 and on the distance L2 to the target surface along the pointing direction of the projection unit 11 and the angle φ of the pointing direction with respect to the target surface, both calculated by the relative position calculation unit 213.
 The drawing conversion processing executed by the projection image generation unit 214 is described here with reference to FIG. 5. The drawing conversion processing includes the cropping, scaling, projective transformation, and rotation transformation shown in FIG. 5. First, the projection image generation unit 214 executes cropping, cutting out the image data of the projection range indicated by the projection condition information. Next, it executes scaling, enlarging or reducing the cropped image data so that its dimensions are correct when projected, according to the distance from the current position (X, Y, Z) of the marking support device 1 to the three-dimensional coordinates (X_C, Y_C, 0) of the center point 45 of the projection range, that is, the distance L2 from the current position (X, Y, Z) to the pointing direction coordinates (X_D, Y_D, 0). Next, it executes projective transformation on the scaled image data according to the angle φ of the pointing direction with respect to the target surface. Finally, it executes rotation transformation, rotating the projectively transformed image data according to the orientation θ of the marking support device 1. When the entire drawing data is used as the projection range, the drawing conversion processing need not include cropping.
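 The scaling, projective, and rotation steps can each be written as a 3x3 homography and composed into a single matrix; cropping is simply an array slice and is omitted here. The one-coefficient keystone model below is a deliberately simplified, hypothetical stand-in for the patent's projective transformation, not its actual formula:

```python
import math

def matmul3(A, B):
    """Multiply two 3x3 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def conversion_homography(scale, phi, theta):
    """Compose the scaling, projective (keystone), and rotation steps of the
    drawing conversion. At phi = pi/2 (head-on projection) the keystone
    coefficient vanishes and only scaling and rotation remain."""
    S = [[scale, 0.0, 0.0], [0.0, scale, 0.0], [0.0, 0.0, 1.0]]  # scaling
    k = math.cos(phi)                                            # keystone strength
    P = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, k, 1.0]]        # projective term
    c, s = math.cos(theta), math.sin(theta)
    R = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]             # rotation by theta
    return matmul3(R, matmul3(P, S))

def apply_homography(H, x, y):
    """Map a drawing point through H with the usual homogeneous divide."""
    v = [H[i][0] * x + H[i][1] * y + H[i][2] for i in range(3)]
    return (v[0] / v[2], v[1] / v[2])
```

Projecting head-on (phi = pi/2) with no rotation reduces the composite to a pure enlargement, which matches the scaling step alone.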
 The projection image generation unit 214 stores in the storage unit 216 the parameters set in each step of the drawing conversion processing, and transmits projection image information indicating the generated projection image to the marking support device 1 via the communication unit 22. The projection control unit 171 of the processor unit 17 of the marking support device 1 controls the projection unit 11 to project the projection image indicated by the received projection image information onto the target surface. The imaging control unit 174 controls the imaging unit 13 to capture the projection image projected on the target surface, and transmits captured image information indicating the captured image to the information processing device 2 via the communication unit 18.
 When the captured image processing unit 215 of the information processing device 2 receives the captured image information from the marking support device 1 via the communication unit 22, it extracts the projection image from the captured image indicated by the captured image information. The communication unit 22 is an example of a captured image information receiving unit. Using the parameters stored in the storage unit 216, the captured image processing unit 215 executes image conversion processing, which applies to the projection image extracted from the captured image the inverse of the drawing conversion processing. The image conversion processing includes inverse rotation transformation, inverse projective transformation, and inverse scaling. First, the captured image processing unit 215 executes inverse rotation transformation, rotating the projection image extracted from the captured image in the opposite direction using the parameters set in the rotation transformation. Next, it executes inverse projective transformation on the inversely rotated projection image using the parameters set in the projective transformation. Finally, it executes inverse scaling, reducing or enlarging the inversely projectively transformed projection image in the opposite sense using the parameters set in the scaling step.
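 Because the stored forward steps compose into one matrix, undoing them in reverse order is equivalent to applying the matrix inverse. A sketch for the linear part of the transform (rotation and scaling; an inverse projective step would be handled the same way with a 3x3 inverse); the concrete numbers are illustrative:

```python
import math

def inverse_2x2(M):
    """Invert a 2x2 matrix given as nested lists."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# Forward conversion: scale by 2, then rotate 90 degrees counterclockwise.
theta, scale = math.pi / 2, 2.0
F = [[scale * math.cos(theta), -scale * math.sin(theta)],
     [scale * math.sin(theta),  scale * math.cos(theta)]]
F_inv = inverse_2x2(F)  # one matrix undoes the rotation, then the scaling

def undo(x, y):
    """Map a captured-image point back toward drawing coordinates."""
    return (F_inv[0][0] * x + F_inv[0][1] * y,
            F_inv[1][0] * x + F_inv[1][1] * y)
```

The forward transform sends (1, 0) to (0, 2); applying `undo` to (0, 2) recovers the original point, which is exactly what the image conversion processing relies on before comparing against the drawing data.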
 Using the parameters set in the cropping process, the captured image processing unit 215 crops the drawing data stored in the storage unit 216 to generate the original drawing data of the projected image. The captured image processing unit 215 then calculates the difference between the captured image on which the image conversion process was executed and the original drawing data of the projected image. Such differences include, for example, dimensional differences, image distortion, and fading of the projection. This makes it possible to detect dimensional differences, distortion, and the like that arise, for example, when the illuminance of the projected image drops because the projection unit 11 is far from the target surface, or when the projected image is deformed or projected larger than intended because the target surface is significantly tilted or uneven.
 The captured image processing unit 215 determines whether the difference between the converted captured image and the original drawing data of the projected image exceeds a threshold value. If it does, the captured image processing unit 215 outputs error information indicating that the projected image is abnormal. This lets the user know of the abnormality in the projected image and prevents marking failures.
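As one possible realization of this check, the difference can be reduced to a scalar score and compared against the threshold. A minimal sketch in Python, assuming the two images are already aligned grayscale arrays of equal size; the mean-absolute-difference metric, the threshold value, and the function names are illustrative assumptions not specified by the disclosure:

```python
import numpy as np

def projection_error(converted_capture, original_drawing):
    """Mean absolute pixel difference (illustrative difference metric)."""
    return float(np.mean(np.abs(converted_capture.astype(int)
                                - original_drawing.astype(int))))

def check_projection(converted_capture, original_drawing, threshold=10.0):
    # Corresponds to steps S57/S58: emit error information when the
    # difference between converted capture and drawing exceeds the threshold.
    if projection_error(converted_capture, original_drawing) > threshold:
        return "ERROR: projected image is abnormal"
    return None

drawing = np.zeros((4, 4), dtype=np.uint8)
faded = drawing + 50   # e.g. projection fading shifts all intensities
assert check_projection(drawing, drawing) is None
assert check_projection(faded, drawing) is not None
```

In practice the metric would likely separate dimensional error, distortion, and fading so the error information can name the cause, but the disclosure leaves the metric open.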
 Here, the processing performed when the information processing device 2 receives the marking instruction information will be described. When the communication unit 22 receives the marking instruction information together with the design condition information, the projection path calculation unit 212 of the calculation storage unit 21 of the information processing device 2 calculates, based on the information indicating the current position and orientation of the marking support device 1 calculated by the current position calculation unit 211 and on the drawing data, the movement route along which the marking support device 1 writes the marking information on the target surface and the ink ejection timing. The projection path calculation unit 212 transmits marking route information indicating the calculated movement route and ink ejection timing to the marking support device 1 via the communication unit 22. The communication unit 22 is an example of a marking instruction information acquisition unit and a marking route information transmitting unit. The projection path calculation unit 212 is an example of a marking route calculation unit.
 When the processor unit 17 of the marking support device 1 receives the marking route information from the information processing device 2 via the communication unit 18, the drive control unit 173 controls the movement drive unit 141 to move the marking support device 1 along the movement route indicated by the marking route information. The communication unit 18 is an example of a marking route information receiving unit. At this time, the flight height of the marking support device 1 is set so that the distance from the marking unit 15 to the floor surface is within the range of the ink. The drive control unit 173 may control not only the flight height but also the attitude of the marking support device 1. The marking control unit 175 controls the marking unit 15 to eject ink at the ink ejection timing indicated by the marking route information. Through this operation, the marking support device 1 writes the marking information, such as the installation position of equipment, the drilling positions of anchor bolts, and the attachment position of a sheet, on the target surface. This allows marking to be performed on the target surface even when, for example, a marking operator cannot mark it directly. Alternatively, the lines, shapes, and the like of the projected image can be transcribed onto the target surface.
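The route-following and ejection behavior above can be sketched as a loop over waypoints that each carry an ejection flag. The waypoint data layout and function names below are hypothetical, since the disclosure does not fix the encoding of the marking route information:

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    x: float
    y: float
    eject: bool  # eject ink on arrival at this waypoint

def follow_marking_route(route, move, eject_ink):
    """Move along the route and fire the marking unit at the indicated points."""
    marks = []
    for wp in route:
        move(wp.x, wp.y)      # drive control (movement drive unit 141)
        if wp.eject:
            eject_ink()       # marking control (marking unit 15)
            marks.append((wp.x, wp.y))
    return marks

# Usage: record where ink would be ejected along a short route.
route = [Waypoint(0, 0, False), Waypoint(1, 0, True), Waypoint(1, 1, True)]
marks = follow_marking_route(route, move=lambda x, y: None,
                             eject_ink=lambda: None)
assert marks == [(1, 0), (1, 1)]
```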
 Next, the flow of the projection condition setting process executed by the user terminal 3 will be described with reference to FIGS. 4A and 6. The projection condition setting process shown in FIG. 6 starts when the user terminal 3 is powered on. The control unit 33 of the user terminal 3 determines whether an instruction to display the projection condition input screen has been input to the input unit 32 (step S11). If no display instruction has been input (step S11; NO), the control unit 33 repeats step S11 and waits for the input of the display instruction.
 When the display instruction for the projection condition input screen is input (step S11; YES), the control unit 33 causes the display unit 31 to display an input screen that shows the drawing data acquired from the information processing device 2 and accepts the input of projection conditions (step S12). The control unit 33 determines whether the projection position of the marking support device 1 has been input on the input screen (step S13). If the control unit 33 determines that the projection position has not been input (step S13; NO), it repeats step S13 and waits for the input of the projection position of the marking support device 1.
 As shown in FIG. 4A, the user specifies, on the drawing data 41 displayed on the input screen, the two-dimensional coordinates 42 of the projection position of the marking support device 1 in the horizontal plane (the XY plane in the figure). The user also specifies the flight height 43, which is the coordinate of the projection position in the vertical direction (the Z direction in the figure). When the user has specified the three-dimensional coordinates (X, Y, Z) of the projection position of the marking support device 1 on the displayed drawing data, the control unit 33 of the user terminal 3 determines that the projection position has been input on the input screen.
 Returning to FIG. 6, when the control unit 33 determines that the projection position of the marking support device 1 has been input (step S13; YES), it determines whether the projection range of the marking support device 1 has been input on the input screen (step S14). If the control unit 33 determines that the projection range has not been input (step S14; NO), it repeats step S14 and waits for the input of the projection range.
 As shown in FIG. 4A, when the user specifies the area 44 as the projection range on the drawing data 41 displayed on the input screen, the control unit 33 of the user terminal 3 determines that the projection range of the marking support device 1 has been input.
 Returning to FIG. 6, when the control unit 33 determines that the projection range of the marking support device 1 has been input (step S14; YES), it calculates the center point of the input projection range (step S15).
 When the projection range is a quadrangle as shown in FIG. 4A, the intersection of its diagonals is taken as the center point. The control unit 33 calculates the three-dimensional coordinates (XC, YC, 0) of the center point 45 of the projection range.
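The diagonal-intersection center point can be computed with elementary 2D geometry. A small sketch, assuming the four corners are given in order around the quadrangle; the parametric-line method and function name are illustrative choices, not specified in the disclosure:

```python
def diagonal_intersection(corners):
    """Intersection of diagonals AC and BD of quadrilateral ABCD."""
    (ax, ay), (bx, by), (cx, cy), (dx, dy) = corners
    # Solve A + t*(C - A) = B + s*(D - B) for t using the 2D cross product.
    r = (cx - ax, cy - ay)
    s = (dx - bx, dy - by)
    denom = r[0] * s[1] - r[1] * s[0]
    if denom == 0:
        raise ValueError("degenerate quadrilateral: diagonals are parallel")
    t = ((bx - ax) * s[1] - (by - ay) * s[0]) / denom
    return (ax + t * r[0], ay + t * r[1])

# Usage: for a unit square the center point (XC, YC) is (0.5, 0.5).
assert diagonal_intersection([(0, 0), (1, 0), (1, 1), (0, 1)]) == (0.5, 0.5)
```

Appending Z = 0 to the result yields the (XC, YC, 0) coordinates used in the text, since the center point lies on the floor plane.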
 Returning to FIG. 6, the control unit 33 generates projection condition information indicating the input projection position of the marking support device 1, the projection range, and the center point of the projection range, and transmits it to the information processing device 2 via the communication unit 34 (step S16). At this point, the control unit 33 may close the input screen.
 In the example of FIG. 4A, the control unit 33 generates projection condition information indicating the three-dimensional coordinates (X, Y, Z) of the projection position of the marking support device 1 specified by the user, the projection range, and the three-dimensional coordinates (XC, YC, 0) of the center point 45 of the projection range, and transmits it to the information processing device 2 via the communication unit 34.
 Returning to FIG. 6, if the user terminal 3 has not been powered off (step S17; NO), the process returns to step S11 and steps S11 to S17 are repeated. When the user terminal 3 is powered off (step S17; YES), the process ends. If the entire range of the drawing data is used as the projection range, step S14 may be omitted. If the input screen is not closed after the projection condition information is transmitted in step S16 of FIG. 6, the process may return to step S13 and repeat steps S13 to S16. In that case, the control unit 33 closes the input screen when the user inputs an instruction to close it.
 Next, the flow of the movement/rotation setting process executed by the information processing device 2 will be described with reference to FIGS. 4B and 7. The movement/rotation setting process shown in FIG. 7 starts when the communication unit 22 of the information processing device 2 receives the projection condition information from the user terminal 3. The projection path calculation unit 212 of the calculation storage unit 21 determines whether the current position of the marking support device 1 calculated by the current position calculation unit 211 matches the projection position indicated by the projection condition information (step S21). In step S21, the projection path calculation unit 212 determines whether the coordinates of the current position of the marking support device 1 on the drawing and its height from the floor surface match the three-dimensional coordinates of the projection position.
 When the projection path calculation unit 212 determines that the current position of the marking support device 1 does not match the projection position (step S21; NO), it calculates, based on the information indicating the current position of the marking support device 1 calculated by the current position calculation unit 211 and on the drawing data, the movement route for the marking support device 1 from its current position to the projection position (step S22). The projection path calculation unit 212 transmits movement route information indicating the calculated movement route to the marking support device 1 via the communication unit 22 (step S23), and the process returns to step S21.
 In the example of FIG. 4B, the projection path calculation unit 212 calculates, based on the information indicating the coordinates on the drawing, the height from the floor surface, and the orientation of the current position of the marking support device 1 calculated by the current position calculation unit 211, and on the drawing data, the movement route for the marking support device 1 from its current position to the three-dimensional coordinates (X, Y, Z) of the projection position indicated by the projection condition information. The movement route information indicates, for example, the movement route from the current position of the marking support device 1 to the two-dimensional coordinates (X, Y) of the projection position, and the flight height Z at the projection position.
 Returning to FIG. 7, when the projection path calculation unit 212 determines that the current position of the marking support device 1 matches the projection position (step S21; YES), it calculates the pointing direction coordinates based on the current position (that is, the projection position) and orientation of the marking support device 1 calculated by the current position calculation unit 211, and on the distance in the pointing direction from the projection unit 11 to the target surface and the angle of the pointing direction with respect to the target surface, both calculated by the relative position calculation unit 213 (step S24).
 In the example of FIG. 4B, the projection path calculation unit 212 calculates the pointing direction coordinates (XD, YD, 0) based on the coordinates on the drawing of the current position of the marking support device 1 calculated by the current position calculation unit 211, that is, the three-dimensional coordinates (X, Y, Z) of the projection position, the orientation θ of the marking support device 1, the distance L2 in the pointing direction from the projection unit 11 to the target surface calculated by the relative position calculation unit 213, and the angle φ of the pointing direction with respect to the target surface.
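One way to realize this calculation is simple trigonometry: the pointing direction coordinates are the point where the projection axis meets the floor plane. The sketch below assumes θ is the horizontal heading and φ is the depression angle of the pointing direction below the horizontal, so the horizontal run of the axis is L2·cos(φ); these geometric conventions are assumptions, as the disclosure does not fix them:

```python
import math

def pointing_direction_coordinates(x, y, z, theta, phi, l2):
    """Point (XD, YD, 0) where the projection axis of slant length l2
    meets the floor plane (z = 0), under the assumed angle conventions."""
    horizontal = l2 * math.cos(phi)       # horizontal component of the axis
    xd = x + horizontal * math.cos(theta)
    yd = y + horizontal * math.sin(theta)
    return (xd, yd, 0.0)

# Usage: pointing straight down (phi = 90 degrees) hits directly below the device.
xd, yd, zd = pointing_direction_coordinates(2.0, 3.0, 1.5,
                                            theta=0.0, phi=math.pi / 2, l2=1.5)
assert abs(xd - 2.0) < 1e-9 and abs(yd - 3.0) < 1e-9 and zd == 0.0
```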
 Returning to FIG. 7, the projection path calculation unit 212 determines whether the pointing direction coordinates match the center point of the projection range indicated by the projection condition information (step S25). If the projection path calculation unit 212 determines that they do not match (step S25; NO), it calculates, based on the current position and orientation of the marking support device 1 calculated by the current position calculation unit 211 and on the distance in the pointing direction from the projection unit 11 to the target surface and the angle of the pointing direction with respect to the target surface calculated by the relative position calculation unit 213, the horizontal and vertical rotation amounts of the projection unit 11 and the photographing unit 13 that bring the pointing direction coordinates into agreement with the center point of the projection range (step S26). The projection path calculation unit 212 transmits rotation amount information indicating the calculated horizontal and vertical rotation amounts of the projection unit 11 and the photographing unit 13 to the marking support device 1 via the communication unit 22 (step S27), and the process returns to step S25.
 In the example of FIG. 4B, when the projection path calculation unit 212 determines that the pointing direction coordinates (XD, YD, 0) do not match the three-dimensional coordinates (XC, YC, 0) of the center point 45 of the projection range indicated by the projection condition information, it calculates, based on the coordinates (X, Y, Z) and orientation θ of the current position of the marking support device 1 calculated by the current position calculation unit 211, the distance L2 in the pointing direction from the projection unit 11 to the target surface calculated by the relative position calculation unit 213, and the angle φ of the pointing direction with respect to the target surface, the horizontal and vertical rotation amounts of the projection unit 11 and the photographing unit 13 that bring the pointing direction coordinates (XD, YD, 0) into agreement with the three-dimensional coordinates (XC, YC, 0) of the center point 45. The projection path calculation unit 212 transmits rotation amount information indicating these horizontal and vertical rotation amounts to the marking support device 1 via the communication unit 22.
 Returning to FIG. 7, when the projection path calculation unit 212 determines that the pointing direction coordinates match the center point of the projection range (step S25; YES), the process ends.
 Next, the flow of the projection image generation process executed by the information processing device 2 will be described with reference to FIGS. 5 and 8. The projection image generation process shown in FIG. 8 starts when the movement/rotation setting process is completed. The projection image generation unit 214 of the calculation storage unit 21 of the information processing device 2 performs the drawing conversion process shown in FIG. 5, which includes a cropping process, an enlargement/reduction process, a projective conversion process, and a rotation conversion process.
 The projection image generation unit 214 first executes the cropping process, cutting out the image data of the projection range indicated by the projection condition information (step S31). Next, it executes the enlargement/reduction process, enlarging or reducing the cropped image data to the size at which its dimensions will be correct when projected, according to the distance to the center point of the projection range, that is, the distance to the pointing direction coordinates calculated by the relative position calculation unit 213 (step S32). Next, it executes the projective conversion process, applying a projective transformation to the enlarged or reduced image data according to the angle of the pointing direction with respect to the target surface calculated by the relative position calculation unit 213 (step S33). Next, it executes the rotation conversion process, rotating the projectively transformed image data according to the orientation of the marking support device 1 calculated by the current position calculation unit 211 (step S34).
 The projection image generation unit 214 stores in the storage unit 216 the parameters set when each process of steps S31 to S34 was performed. The projection image generation unit 214 transmits projection image information indicating the converted projection image to the marking support device 1 via the communication unit 22 (step S35), and the process ends.
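The four steps above form a pipeline whose parameters are saved precisely so that the captured-image side can later invert them. A minimal sketch in which each step is represented only by its recorded parameters (the data structure, the scale formula, and the function name are illustrative assumptions; actual image warping is omitted):

```python
def drawing_conversion(projection_range, distance, angle, orientation):
    """Run steps S31-S34 conceptually and record the parameters of each step."""
    params = {}
    params["crop"] = projection_range            # step S31: region cut from the drawing
    params["scale"] = 1.0 / max(distance, 1e-9)  # step S32: illustrative size factor
    params["projective_angle"] = angle           # step S33: tilt of the target surface
    params["rotation"] = orientation             # step S34: device orientation
    return params

# The stored parameters are exactly what the image conversion process reverses.
p = drawing_conversion(projection_range=(0, 0, 100, 80), distance=2.0,
                       angle=0.1, orientation=0.5)
assert p["rotation"] == 0.5 and p["scale"] == 0.5
```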
 When a plurality of projection positions and projection ranges, together with a dwell time, are input in steps S13 and S14 of the projection condition setting process shown in FIG. 6, the information processing device 2 executes the movement/rotation setting process shown in FIG. 7 and the projection image generation process shown in FIG. 8 after the dwell time at each projection position elapses.
 Next, the flow of the marking support process executed by the marking support device 1 will be described with reference to FIG. 9. The marking support process shown in FIG. 9 starts when the marking support device 1 is powered on. The drive control unit 173 of the processor unit 17 of the marking support device 1 determines whether the communication unit 18 has received the movement route information from the information processing device 2 (step S41). If the communication unit 18 has not received the movement route information (step S41; NO), step S41 is repeated to wait for its reception. When the communication unit 18 receives the movement route information from the information processing device 2 (step S41; YES), the drive control unit 173 controls the movement drive unit 141 to move the marking support device 1 to the projection position along the movement route indicated by the movement route information (step S42).
 In the example of FIG. 4B, the drive control unit 173, for example, moves the marking support device 1 along the movement route to the two-dimensional coordinates (X, Y) of the projection position and raises or lowers it to the flight height Z.
 Returning to FIG. 9, the drive control unit 173 determines whether the communication unit 18 has received the rotation amount information from the information processing device 2 (step S43). If the communication unit 18 has not received the rotation amount information (step S43; NO), step S43 is repeated to wait for its reception. When the rotation amount information is received from the information processing device 2 via the communication unit 18 (step S43; YES), the drive control unit 173 controls the direction rotation unit 142 to rotate the projection unit 11 and the photographing unit 13 by the horizontal and vertical rotation amounts indicated by the rotation amount information (step S44).
 The projection control unit 171 of the processor unit 17 determines whether the communication unit 18 has received the projection image information from the information processing device 2 (step S45). If the communication unit 18 has not received the projection image information (step S45; NO), step S45 is repeated to wait for its reception. When the projection image information is received from the information processing device 2 via the communication unit 18 (step S45; YES), the projection control unit 171 controls the projection unit 11 to project the projection image indicated by the received projection image information onto the target surface (step S46).
 The photographing control unit 174 of the processor unit 17 controls the photographing unit 13 to photograph the projection image projected on the target surface (step S47). The photographing control unit 174 transmits captured image information indicating the image photographed by the photographing unit 13 to the information processing device 2 via the communication unit 18 (step S48). If a predetermined projection time has not elapsed (step S49; NO), the projection control unit 171 of the processor unit 17 controls the projection unit 11 to continue projecting the projection image onto the target surface. When the predetermined projection time elapses (step S49; YES), the process ends. Alternatively, step S49 may determine whether end instruction information has been received. In that case, if the end instruction information has not been received (step S49; NO), the projection control unit 171 of the processor unit 17 controls the projection unit 11 to continue projecting the projection image onto the target surface, and when the end instruction information is received (step S49; YES), the process ends.
 Next, the flow of the projection image determination process executed by the information processing device 2 will be described with reference to FIG. 10. The projection image determination process shown in FIG. 10 starts when the communication unit 22 of the information processing device 2 receives the captured image information from the marking support device 1. The captured image processing unit 215 extracts the projected image from the captured image indicated by the captured image information (step S51). Using the parameters stored in the storage unit 216, the captured image processing unit 215 executes the image conversion process, the inverse of the drawing conversion process, on the projected image extracted from the captured image.
 The captured image processing unit 215 first executes the inverse rotation conversion process, rotating the projected image extracted from the captured image in the reverse direction using the parameters set in the rotation conversion process (step S52). Next, it executes the inverse projective conversion process, applying the reverse projective transformation to the inversely rotated projected image using the parameters set in the projective conversion process (step S53). Next, it executes the inverse enlargement/reduction process, conversely reducing or enlarging the inversely projected image using the parameters set in the enlargement/reduction process (step S54).
 The captured image processing unit 215 crops the drawing data stored in the storage unit 216 using the parameters set in the cropping process, generating the original drawing data corresponding to the projected image (step S55). It then calculates the difference between the image-converted captured image and the original drawing data of the projected image (step S56). Such differences include, for example, dimensional discrepancies, image distortion, and fading of the projection.
 The captured image processing unit 215 determines whether the difference between the image-converted captured image and the original drawing data of the projected image exceeds a threshold (step S57). If the difference does not exceed the threshold (step S57; NO), the process ends. If the difference exceeds the threshold (step S57; YES), the captured image processing unit 215 outputs error information indicating that the projected image is abnormal (step S58), and the process ends. Note that when the entire range of the drawing data is used as the projection range, step S51 may be omitted.
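The threshold decision of steps S56 to S58 can be sketched as follows. This is a minimal illustration, assuming a mean absolute pixel difference as the metric; the embodiment names dimensional differences, distortion, and fading as possible difference measures, and the function and field names here are hypothetical.

```python
import numpy as np

def projection_error(converted_capture, original_drawing):
    """Mean absolute pixel difference between the image-converted capture
    and the original drawing data (one possible difference measure)."""
    return float(np.mean(np.abs(converted_capture.astype(float)
                                - original_drawing.astype(float))))

def judge_projection(converted_capture, original_drawing, threshold):
    """Steps S56-S58: return error information when the difference exceeds
    the threshold, otherwise None (projection judged normal)."""
    diff = projection_error(converted_capture, original_drawing)
    if diff > threshold:
        return {"error": "projected image is abnormal", "difference": diff}
    return None

drawing = np.zeros((4, 4), dtype=np.uint8)   # toy original drawing data
good = drawing.copy()                         # capture matching the drawing
faded = drawing + 60                          # e.g. fading shifts every pixel
print(judge_projection(good, drawing, threshold=10.0))   # None
print(judge_projection(faded, drawing, threshold=10.0))  # error info dict
```

In the system, the error information produced on the S57; YES branch would be what the captured image processing unit 215 outputs in step S58.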
 As described above, according to the marking support system 100 of the embodiment, the movable marking support device 1 moves to the projection position and projects onto the target surface a projection image obtained by converting drawing data containing marking information, based on the orientation of the marking support device 1, the distance along the pointing direction of the projection unit 11 from the marking support device 1 to the target surface, and the angle of the pointing direction with respect to the target surface. This improves convenience in supporting marking work.
 The hardware configuration of the information processing device 2 will be described with reference to FIG. 11. As shown in FIG. 11, the information processing device 2 includes a temporary storage unit 101, a storage unit 102, a calculation unit 103, an input unit 104, a transmission/reception unit 105, and a display unit 106. The temporary storage unit 101, storage unit 102, input unit 104, transmission/reception unit 105, and display unit 106 are all connected to the calculation unit 103 via a bus.
 The calculation unit 103 is, for example, a CPU. In accordance with the control program stored in the storage unit 102, the calculation unit 103 executes the processing of the current position calculation unit 211, projection path calculation unit 212, relative position calculation unit 213, projection image generation unit 214, and captured image processing unit 215 of the arithmetic storage unit 21.
 The temporary storage unit 101 is, for example, a RAM (Random-Access Memory). The temporary storage unit 101 loads the control program stored in the storage unit 102 and serves as the working area of the calculation unit 103.
 The storage unit 102 is a non-volatile memory such as a flash memory, hard disk, DVD-RAM (Digital Versatile Disc - Random Access Memory), or DVD-RW (Digital Versatile Disc - ReWritable). The storage unit 102 stores in advance the program that causes the calculation unit 103 to perform the processing of the information processing device 2. In accordance with instructions from the calculation unit 103, the storage unit 102 supplies the data stored by this program to the calculation unit 103 and stores data supplied from the calculation unit 103. The storage unit 216 of the arithmetic storage unit 21 is implemented in the storage unit 102.
 The input unit 104 comprises input devices such as a keyboard and pointing device, together with an interface device that connects those input devices to the bus. For example, when the user inputs drawing data directly into the information processing device 2, the entered drawing data is supplied to the calculation unit 103 via the input unit 104.
 The transmission/reception unit 105 is a network terminating device or wireless communication device connected to a network, together with a serial interface or LAN (Local Area Network) interface connected to it. The transmission/reception unit 105 functions as the communication unit 22.
 The display unit 106 is a display device such as a CRT (Cathode Ray Tube) or LCD (Liquid Crystal Display). For example, when the user inputs drawing data directly into the information processing device 2, the display unit 106 displays a drawing data input screen.
 The processing of the current position calculation unit 211, projection path calculation unit 212, relative position calculation unit 213, projection image generation unit 214, captured image processing unit 215, and storage unit 216 of the arithmetic storage unit 21 of the information processing device 2 shown in FIG. 3 is executed by the control program using the temporary storage unit 101, calculation unit 103, storage unit 102, input unit 104, transmission/reception unit 105, display unit 106, and so on as resources.
 The above hardware configuration and flowcharts are examples, and may be changed or modified arbitrarily.
 The central portions that carry out the processing of the information processing device 2, such as the calculation unit 103, temporary storage unit 101, storage unit 102, input unit 104, transmission/reception unit 105, and display unit 106, can be realized using an ordinary computer system rather than a dedicated one. For example, the information processing device 2 that executes the above processing may be configured by storing and distributing a computer program for performing the above operations on a computer-readable recording medium such as a flexible disk, CD-ROM (Compact Disc - Read Only Memory), or DVD-ROM (Digital Versatile Disc - Read Only Memory), and installing that computer program on a computer. Alternatively, the information processing device 2 may be configured by storing the computer program in a storage device of a server on a communication network such as the Internet and having an ordinary computer system download it.
 When the functions of the information processing device 2 are realized by dividing them between an OS (Operating System) and an application program, or through cooperation between the OS and the application program, only the application program portion may be stored on the recording medium or storage device.
 It is also possible to superimpose the computer program on a carrier wave and provide it via a communication network. For example, the computer program may be posted on a bulletin board system (BBS) on a communication network and provided via that network. The above processing may then be executed by launching this computer program and running it under the control of the OS in the same manner as other application programs.
 In the embodiment described above, the marking support device 1 includes the marking unit 15; however, when there is no need for the marking support device 1 to perform marking itself, the marking support device 1 need not include the marking unit 15. In that case, the processor unit 17 need not include the marking control unit 175.
 In the embodiment described above, the drive unit 14 of the marking support device 1 includes the direction rotation unit 142; however, when projection and shooting are always possible with the pointing direction of the projection unit 11 and the shooting direction of the shooting unit 13 fixed, the drive unit 14 need not include the direction rotation unit 142. In that case, the projection condition information need not include a projection direction. Further, when the fixed pointing direction of the projection unit 11 and shooting direction of the shooting unit 13 are vertical, the arithmetic storage unit 21 of the information processing device 2 need not include the relative position calculation unit 213. The projection image generation unit 214 then executes the drawing conversion process, which converts the drawing data stored in the storage unit 216 into a projection image, based on the current position and orientation of the marking support device 1 calculated by the current position calculation unit 211. In this case, the drawing conversion process need not include a projective transformation.
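When the pointing direction is fixed vertically, the target surface is fronto-parallel to the projector, so the drawing conversion reduces to a similarity transform: translation by the device position, rotation by the device heading, and scaling with projection distance, with no projective term. The sketch below illustrates this reduced conversion; all function and parameter names are assumptions for illustration, not taken from the embodiment.

```python
import numpy as np

def drawing_to_projection(points, device_xy, heading_rad, scale):
    """Convert 2D drawing coordinates to projector coordinates for a
    downward-pointing projection unit: only a translation by the device
    position, a rotation by the device heading, and a distance-dependent
    scale are needed (no projective transformation)."""
    c, s = np.cos(heading_rad), np.sin(heading_rad)
    R = np.array([[c, -s], [s, c]])
    # points @ R.T applies R to each row vector
    return (np.asarray(points) - np.asarray(device_xy)) @ R.T * scale

pts = np.array([[1.0, 0.0], [0.0, 1.0]])
out = drawing_to_projection(pts, device_xy=(0.0, 0.0),
                            heading_rad=np.pi / 2, scale=2.0)
print(out)  # each point rotated 90 degrees and doubled
```

This is why the relative position calculation unit 213 and the projective transformation step become unnecessary in the vertical fixed-direction variant: the missing degrees of freedom (surface tilt) are guaranteed to be zero.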
 In the embodiment described above, the information processing device 2 includes the captured image processing unit 215; however, when it is unnecessary to determine whether the projected image is normal (for example, when the distance from the projection unit 11 to the target surface never becomes large enough to reduce the illuminance of the projected image, and the target surface is neither significantly inclined nor uneven), the information processing device 2 need not include the captured image processing unit 215.
 In the embodiment described above, the drive unit 14 includes the movement drive unit 141, which moves the marking support device 1 by flight; however, movement of the marking support device 1 is not limited to movement by flight. For example, the marking support device 1 may be a self-propelled device capable of moving across a plane, equipped with a lifting mechanism that can change the heights of the projection unit 11 and the shooting unit 13.
 In the embodiment described above, the marking support system 100 includes the user terminal 3, but this is not limiting. For example, the user may input the projection condition information into the information processing device 2, or the information processing device 2 may acquire the projection condition information from another device or system.
 In the embodiment described above, an example was described in which the marking support device 1 projects the projection image onto the floor surface of a building; however, this is not limiting, and the surface onto which the marking support device 1 projects the projection image may be any surface on which the user performs marking work. For example, the marking support device 1 may project the projection image onto another surface such as a ceiling or wall of the building.
 The present disclosure permits various embodiments and modifications without departing from its broad spirit and scope. The embodiment described above is for explaining this disclosure and does not limit its scope; that is, the scope of the present disclosure is indicated by the claims, not by the embodiment. Various modifications made within the scope of the claims and within the meaning of disclosure equivalent thereto are regarded as falling within the scope of this disclosure.
 This application is based on Japanese Patent Application No. 2019-221975, filed on December 9, 2019. The specification, claims, and entire drawings of Japanese Patent Application No. 2019-221975 are incorporated herein by reference.
 1 marking support device, 2 information processing device, 3 user terminal, 11 projection unit, 12 sensor unit, 13 shooting unit, 14 drive unit, 15 marking unit, 16 power source, 17 processor unit, 18 communication unit, 21 arithmetic storage unit, 22 communication unit, 31 display unit, 32 input unit, 33 control unit, 34 communication unit, 41 drawing data, 42 two-dimensional coordinates, 43 flight height, 44 area, 45 center point, 47 projected image, 48 intersection, 100 marking support system, 101 temporary storage unit, 102 storage unit, 103 calculation unit, 104 input unit, 105 transmission/reception unit, 106 display unit, 121 external sensor, 123 internal sensor, 141 movement drive unit, 142 direction rotation unit, 171 projection control unit, 172 detection information collection unit, 173 drive control unit, 174 shooting control unit, 175 marking control unit, 211 current position calculation unit, 212 projection path calculation unit, 214 projection image generation unit, 215 captured image processing unit, 216 storage unit.

Claims (9)

  1.  A marking support system comprising an information processing device that converts drawing data including marking information into a projection image, and a marking support device that projects the projection image onto a target surface, wherein
     the marking support device comprises:
     a projection unit that projects the projection image in a pointing direction;
     a sensor unit that detects the positions and sizes of surrounding objects and the acceleration and angular velocity of the marking support device;
     a movement drive unit that moves the marking support device;
     a processor unit that controls the marking support device;
     a detection information transmission unit that transmits, to the information processing device, detection information indicating the positions and sizes of the surrounding objects detected by the sensor unit and the acceleration and angular velocity of the marking support device;
     a movement path information reception unit that receives, from the information processing device, movement path information indicating a movement path from a current position to a projection position; and
     a projection image information reception unit that receives, from the information processing device, projection image information indicating the projection image,
     the information processing device comprises:
     a projection condition information acquisition unit that acquires projection condition information indicating projection conditions including the projection position of the marking support device;
     a detection information reception unit that receives the detection information from the marking support device;
     a current position calculation unit that calculates the current position and orientation of the marking support device based on the detection information;
     a movement path calculation unit that calculates the movement path from the current position of the marking support device to the projection position, based on information indicating the current position of the marking support device calculated by the current position calculation unit and the projection condition information;
     a movement path information transmission unit that transmits, to the marking support device, the movement path information indicating the movement path calculated by the movement path calculation unit;
     a projection image generation unit that performs a drawing conversion process of converting the drawing data into the projection image, based on the information indicating the current position and orientation of the marking support device calculated by the current position calculation unit; and
     a projection image information transmission unit that transmits, to the marking support device, the projection image information indicating the projection image generated by the projection image generation unit, and
     the processor unit controls the movement drive unit based on the movement path information to move the marking support device to the projection position, and controls the projection unit to project the projection image indicated by the projection image information onto the target surface.
  2.  The marking support system according to claim 1, wherein
     the projection conditions further include a projection direction,
     the information processing device further comprises:
     a relative position calculation unit that calculates the distance along the pointing direction of the projection unit from the marking support device to the target surface and the angle of the pointing direction with respect to the target surface;
     a rotation amount calculation unit that calculates a rotation amount for aligning the pointing direction of the projection unit with the projection direction, based on the information indicating the current position and orientation of the marking support device calculated by the current position calculation unit and information indicating the distance along the pointing direction of the projection unit from the marking support device to the target surface and the angle of the pointing direction with respect to the target surface calculated by the relative position calculation unit, and generates rotation amount information indicating the calculated rotation amount; and
     a rotation amount information transmission unit that transmits the rotation amount information to the marking support device,
     the projection image generation unit performs the drawing conversion process of converting the drawing data into the projection image, based on the information indicating the current position and orientation of the marking support device calculated by the current position calculation unit and the information indicating the distance along the pointing direction of the projection unit from the marking support device to the target surface and the angle of the pointing direction with respect to the target surface calculated by the relative position calculation unit,
     the marking support device further comprises:
     a rotation amount information reception unit that receives the rotation amount information from the information processing device; and
     a direction rotation unit that changes the pointing direction of the projection unit, and
     the processor unit controls the movement drive unit to move the marking support device to the projection position along the movement path indicated by the movement path information, controls the direction rotation unit based on the rotation amount information to align the pointing direction of the projection unit with the projection direction, and controls the projection unit to project the projection image indicated by the projection image information onto the target surface.
  3.  The marking support system according to claim 1 or 2, wherein
     the marking support device further comprises:
     a shooting unit that shoots the projection image projected by the projection unit; and
     a captured image information transmission unit that transmits, to the information processing device, captured image information indicating the captured image of the shooting unit, and
     the information processing device further comprises:
     a captured image information reception unit that receives the captured image information from the marking support device; and
     a captured image processing unit that executes an image conversion process on the captured image indicated by the captured image information, calculates the difference between the image-converted captured image and the original drawing data of the projection image, and, when the difference exceeds a threshold, outputs error information indicating that the projection image is abnormal.
  4.  The marking support system according to any one of claims 1 to 3, wherein
     the information processing device further comprises:
     a marking instruction information acquisition unit that acquires marking instruction information indicating a marking instruction to the marking support device;
     a marking path calculation unit that, when the marking instruction information acquisition unit acquires the marking instruction information, calculates a movement path along which the marking support device writes the marking information on the target surface and ink ejection timings, based on the information indicating the current position of the marking support device calculated by the current position calculation unit and the drawing data; and
     a marking path information transmission unit that transmits, to the marking support device, marking path information indicating the movement path calculated by the marking path calculation unit and the ink ejection timings,
     the marking support device further comprises:
     a marking unit that ejects ink; and
     a marking path information reception unit that receives the marking path information from the information processing device, and
     the processor unit controls the movement drive unit to move the marking support device along the movement path indicated by the marking path information, and controls the marking unit to eject ink at the ink ejection timings indicated by the marking path information.
  5.  The marking support system according to any one of claims 1 to 4, wherein the marking support device is an unmanned aerial vehicle that moves by autonomous flight.
  6.  A marking support device that communicates with an information processing device that converts drawing data including marking information into a projection image, and projects the projection image onto a target surface, the marking support device comprising:
     a projection unit that projects the projection image in a pointing direction;
     a sensor unit that detects the positions and sizes of surrounding objects and the acceleration and angular velocity of the marking support device;
     a movement drive unit that moves the marking support device;
     a processor unit that controls the marking support device;
     a detection information transmission unit that transmits, to the information processing device, detection information indicating the positions and sizes of the surrounding objects detected by the sensor unit and the acceleration and angular velocity of the marking support device;
     a movement path information reception unit that receives, from the information processing device, movement path information indicating a movement path from a current position to a projection position; and
     a projection image information reception unit that receives, from the information processing device, projection image information indicating the projection image,
     wherein the processor unit controls the movement drive unit based on the movement path information to move the marking support device to the projection position, and controls the projection unit to project the projection image indicated by the projection image information onto the target surface.
  7.  An information processing device that communicates with a movable marking support device that projects a projection image onto a target surface, and that converts drawing data including marking information into the projection image, the information processing device comprising:
    a projection condition information acquisition unit that acquires projection condition information indicating projection conditions including a projection position of the marking support device;
    a detection information receiving unit that receives detection information indicating the position and size of objects around the marking support device detected by the marking support device and the acceleration and angular velocity of the marking support device;
    a current position calculation unit that calculates the current position and orientation of the marking support device based on the detection information;
    a movement route calculation unit that calculates a movement route from the current position of the marking support device to the projection position based on information indicating the current position of the marking support device calculated by the current position calculation unit and the projection condition information;
    a movement route information transmitting unit that transmits movement route information indicating the movement route calculated by the movement route calculation unit to the marking support device;
    a projection image generation unit that performs drawing conversion processing to convert the drawing data into the projection image based on information indicating the current position and orientation of the marking support device calculated by the current position calculation unit; and
    a projection image information transmitting unit that transmits projection image information indicating the projection image generated by the projection image generation unit to the marking support device.
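The current position calculation unit in claim 7 works from the device's accelerations and angular velocities. A minimal planar dead-reckoning sketch of that idea is shown below; it is illustrative only, with hypothetical names, and omits the fusion against detected peripheral objects that a practical implementation would need to bound drift.

```python
import math

def integrate_pose(pose, accel_body, omega, dt):
    """One dead-reckoning step on the plane.

    pose       = (x, y, heading, vx, vy) in the world frame
    accel_body = (ax, ay) acceleration in the device frame
    omega      = angular velocity (rad/s), dt = timestep (s)
    """
    x, y, th, vx, vy = pose
    # Rotate the body-frame acceleration into the world frame.
    ax = accel_body[0] * math.cos(th) - accel_body[1] * math.sin(th)
    ay = accel_body[0] * math.sin(th) + accel_body[1] * math.cos(th)
    vx += ax * dt
    vy += ay * dt
    x += vx * dt
    y += vy * dt
    th += omega * dt
    return (x, y, th, vx, vy)

# Constant 1 m/s^2 forward acceleration, no rotation, for 1 s in 100 steps.
pose = (0.0, 0.0, 0.0, 0.0, 0.0)
for _ in range(100):
    pose = integrate_pose(pose, (1.0, 0.0), 0.0, 0.01)
```

After one second the sketch reports roughly 1 m/s forward velocity and about 0.5 m travelled, as expected from uniform acceleration.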
  8.  A marking support method executed by an information processing device that converts drawing data including marking information into a projection image and a marking support device that projects the projection image onto a target surface, the method comprising:
    a detection step, executed by the marking support device, of detecting the position and size of peripheral objects and the acceleration and angular velocity of the marking support device;
    a current position calculation step, executed by the information processing device, of calculating the current position and orientation of the marking support device based on detection information indicating the position and size of the peripheral objects detected in the detection step and the acceleration and angular velocity of the marking support device;
    a movement route calculation step, executed by the information processing device, of calculating a movement route from the current position of the marking support device to the projection position based on information indicating the current position of the marking support device calculated in the current position calculation step and projection condition information indicating projection conditions including the projection position of the marking support device;
    a projection image generation step, executed by the information processing device, of performing drawing conversion processing to convert the drawing data into the projection image based on information indicating the current position and orientation of the marking support device calculated in the current position calculation step;
    a movement step, executed by the marking support device, of moving the marking support device to the projection position along the movement route calculated in the movement route calculation step; and
    a projection step, executed by the marking support device, of projecting the projection image generated in the projection image generation step onto the target surface.
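The movement route calculation step above could be realized with any standard grid planner; the patent does not name an algorithm, so the breadth-first-search sketch below is one hypothetical choice. It finds a shortest 4-connected route that avoids cells occupied by detected objects.

```python
from collections import deque

def plan_route(start, goal, obstacles, width, height):
    """Shortest 4-connected grid route from start to goal, or None."""
    blocked = set(obstacles)
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            route = []
            while cell is not None:   # Walk predecessors back to the start.
                route.append(cell)
                cell = prev[cell]
            return route[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < width and 0 <= nxt[1] < height
                    and nxt not in blocked and nxt not in prev):
                prev[nxt] = cell
                queue.append(nxt)
    return None

# A wall at x == 2 with a gap at y == 3 forces a detour through the gap.
wall = [(2, y) for y in range(5) if y != 3]
route = plan_route((0, 0), (4, 0), wall, 5, 5)
```

In practice the information processing device would plan over a map built from the received detection information rather than a hand-coded grid, and might prefer A* with a distance heuristic for larger floors.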
  9.  A program that causes a computer communicating with a movable marking support device that projects a projection image onto a target surface to function as:
    a current position calculation unit that calculates the current position and orientation of the marking support device based on detection information indicating the position and size of objects around the marking support device detected by the marking support device and the acceleration and angular velocity of the marking support device;
    a movement route calculation unit that calculates a movement route from the current position of the marking support device to the projection position based on information indicating the current position of the marking support device calculated by the current position calculation unit and projection condition information indicating projection conditions including the projection position of the marking support device; and
    a projection image generation unit that performs drawing conversion processing to convert drawing data including marking information into the projection image based on information indicating the current position and orientation of the marking support device calculated by the current position calculation unit.
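The drawing conversion processing named in claim 9 amounts to mapping drawing (floor-plan) coordinates into the projector's image frame given the device's current position and orientation. The simplified 2D version below is illustrative only: the function and parameter names are invented, and a real implementation would additionally correct for lens distortion and keystone effects.

```python
import math

def drawing_to_projection(points, device_pos, device_heading, pixels_per_meter):
    """Map drawing-frame points (meters) into projector pixel coordinates.

    Expresses each point relative to the device pose, then scales to
    pixels; keystone/lens correction is deliberately omitted here.
    """
    cx, cy = device_pos
    cos_h, sin_h = math.cos(device_heading), math.sin(device_heading)
    out = []
    for x, y in points:
        dx, dy = x - cx, y - cy
        # Rotate the world-frame offset into the device frame.
        u = cos_h * dx + sin_h * dy
        v = -sin_h * dx + cos_h * dy
        out.append((u * pixels_per_meter, v * pixels_per_meter))
    return out

# Device at (1, 1) rotated 90 degrees; a marking line from (1, 1) to (1, 2).
pixels = drawing_to_projection([(1.0, 1.0), (1.0, 2.0)], (1.0, 1.0),
                               math.pi / 2, 100.0)
```

Because the conversion depends on the pose reported by the current position calculation unit, any pose error shows up directly as a shifted or rotated marking on the target surface, which is why the claims tie the two units together.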
PCT/JP2020/045394 2019-12-09 2020-12-07 Marking support system, marking support device, information processing device, marking support method, and program WO2021117656A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021563936A JP7179203B2 (en) 2019-12-09 2020-12-07 Marking support system, information processing device, marking support method and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-221975 2019-12-09
JP2019221975 2019-12-09

Publications (1)

Publication Number Publication Date
WO2021117656A1 true WO2021117656A1 (en) 2021-06-17

Family

ID=76330350

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/045394 WO2021117656A1 (en) 2019-12-09 2020-12-07 Marking support system, marking support device, information processing device, marking support method, and program

Country Status (2)

Country Link
JP (1) JP7179203B2 (en)
WO (1) WO2021117656A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016220137A (en) * 2015-05-25 2016-12-22 みこらった株式会社 Movable projection system and movable projection method
JP2017181342A (en) * 2016-03-31 2017-10-05 前田建設工業株式会社 Setting-out support device and setting-out support method
CN207082019U (en) * 2017-07-14 2018-03-09 中国一冶集团有限公司 A kind of unmanned plane projects actinobacillus device
JP2018119902A (en) * 2017-01-27 2018-08-02 株式会社竹中土木 Method of automatic marking to ceiling surface and unmanned flying object for automatic marking


Also Published As

Publication number Publication date
JP7179203B2 (en) 2022-11-28
JPWO2021117656A1 (en) 2021-06-17

Similar Documents

Publication Publication Date Title
US11760484B2 (en) Detecting optical discrepancies in captured images
WO2021016897A1 (en) Aerial survey method, photographing control method, aerial vehicle, terminal, system, and storage medium
JP4672190B2 (en) Video navigation device
JP6672857B2 (en) Unmanned flying device control system, unmanned flying device control method, and unmanned flying device
CN105676572A (en) Projection correction method and device for projector equipped on mobile robot
US11906305B2 (en) Movable marking system, controlling method for movable marking apparatus, and computer readable recording medium
US10186027B1 (en) Layout projection
US20200097026A1 (en) Method, device, and system for adjusting attitude of a device and computer-readable storage medium
CN110815205A (en) Calibration method, system and device of mobile robot
JP2014063411A (en) Remote control system, control method, and program
WO2020062281A1 (en) Cradle head control method, cradle head, movable platform and readable storage medium
US20200217665A1 (en) Mobile platform, image capture path generation method, program, and recording medium
JP6707933B2 (en) Unmanned flight device control system, unmanned flight device control method, and unmanned flight device
JP2017174159A (en) Pilotless flight device control system, pilotless flight device controlling method, and image projector
US10997747B2 (en) Target positioning with bundle adjustment
WO2021117656A1 (en) Marking support system, marking support device, information processing device, marking support method, and program
TWI726536B (en) Image capturing method and image capturing apparatus
JP7149569B2 (en) Building measurement method
WO2021079516A1 (en) Flight route creation method for flying body and management server
WO2021056411A1 (en) Air route adjustment method, ground end device, unmanned aerial vehicle, system, and storage medium
WO2020143004A1 (en) Information processing method and related device thereof
JP6770826B2 (en) Automatic collimation method and automatic collimation device for measuring the placement position of structures
KR102318841B1 (en) Movable Marking System, Controlling Method For Movable Marking Apparatus and Computer Readable Recording Medium
JP2021169958A (en) Illumination communication system and method for controlling illumination communication system
KR102434523B1 (en) Movable Marking System, Controlling Method For Movable Marking Apparatus and Computer Readable Recording Medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20898149

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2021563936

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20898149

Country of ref document: EP

Kind code of ref document: A1