WO2021014699A1 - Information processing device - Google Patents

Information processing device

Info

Publication number
WO2021014699A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
image
image data
moving body
time
Application number
PCT/JP2020/016308
Other languages
French (fr)
Japanese (ja)
Inventor
香緒莉 新畑
鷹見 忠雄
石井 孝治
寛 河上
Original Assignee
NTT Docomo, Inc. (株式会社NTTドコモ)
Application filed by NTT Docomo, Inc.
Priority to JP2021534541A (granted as JP7208402B2)
Publication of WO2021014699A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 20/00 Constructional aspects of UAVs
    • B64U 20/80 Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U 20/87 Mounting of imaging devices, e.g. mounting of gimbals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2101/00 UAVs specially adapted for particular uses or applications
    • B64U 2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography

Definitions

  • The present invention relates to a technique for imaging a subject with a moving body.
  • A mechanism has been considered in which an unmanned aerial vehicle called a drone images a subject such as a building so that the subject can be inspected.
  • In this mechanism, in order to grasp the state of the subject from the captured image, it is necessary to capture a high-quality image in which the subject is accurately in focus.
  • For example, Patent Document 1 discloses that, when a power transmission facility is inspected with a camera mounted on a drone, whether the subject lies on the near side or the far side of the focus position is determined from the blur shape in a range image.
  • An object of the present invention is to provide a mechanism by which, when an image of a subject captured by an imaging device mounted on a flying object or other moving body does not meet a quality standard, the subject can be imaged again at the same imaging position and in the same imaging direction as at the time of the original imaging.
  • To solve the above problem, the present invention provides an information processing device comprising: an acquisition unit that acquires, from a first moving body including a timekeeping device and an imaging device, image data captured by the imaging device and time data, measured by the timekeeping device, indicating when each image was captured; a detection unit that detects, based on the time data, the imaging time corresponding to image data that does not satisfy an image quality standard among the acquired image data; a specifying unit that specifies the imaging position and imaging direction of the imaging device of the first moving body at the detected imaging time; and an output unit that outputs information about the specified imaging position and imaging direction.
  • The output unit may output the information on the imaging position and imaging direction to a second moving body including an imaging device and instruct it to capture an image at that imaging position and in that imaging direction, and the acquisition unit may acquire the captured image data from the second moving body in response to the instruction.
  • The specifying unit may specify the imaging position and imaging direction of the imaging device of the first moving body based on the scheduled imaging time, planned imaging position, and planned imaging direction of the imaging device included in the first moving body, together with the time data corresponding to the image data that does not satisfy the quality standard of the captured image.
  • The specifying unit may specify the imaging position and imaging direction of the imaging device of the first moving body based on the position and orientation of the first moving body measured by the first moving body at the time indicated by the time data corresponding to the image data that does not satisfy the image quality standard.
  • The specifying unit may specify the imaging position and imaging direction of the imaging device of the first moving body based on the position of a marker attached to the subject, obtained by analyzing the image data, and the position measured by the first moving body at the time indicated by the time data corresponding to the image data that does not satisfy the image quality standard.
  • The output unit may output information on the specified imaging position and imaging direction for image data whose priority is equal to or higher than a threshold among the image data that does not satisfy the image quality standard.
  • According to the present invention, when the image of a subject captured by an imaging device mounted on a moving body does not satisfy the quality standard, the subject can be re-imaged at the same imaging position and in the same imaging direction as at the time of the original imaging.
  • FIG. 1 is a diagram showing an example of the configuration of the flight system 1.
  • The flight system 1 is a system for remotely inspecting, monitoring, or observing a subject.
  • The subject here may be, for example, an artificial object such as a building, a natural object such as a plant or terrain, or a disaster situation or disaster victims at the time of a disaster.
  • In this embodiment, a building is assumed as the subject.
  • The flight system 1 includes, for example, a plurality of flying objects 10a and 10b called drones, a server device 20, and a communication network 2 that connects these in a communicable manner.
  • The server device 20 functions as an information processing device that controls the flying objects 10a and 10b.
  • The communication network 2 is, for example, a wireless communication network such as LTE (Long Term Evolution).
  • The flight system 1 includes at least two flying objects: a flying object 10a (an example of a first moving body in the present invention) and a flying object 10b (an example of a second moving body in the present invention). When these flying objects need not be distinguished, they are collectively referred to as the flying object 10.
  • The flying object 10 may be one that flies in response to operation of a control device (not shown) by an operator (so-called manual flight), one that flies autonomously under the control of the server device 20 (so-called automatic flight), or one that combines manual and automatic flight. This embodiment describes an example of a flying object that performs automatic flight.
  • The server device 20 controls the flying object 10 so that it flies to within close range of the building to be inspected, and the flying object 10 images the building.
  • The flying object 10 transmits image data of the building to the server device 20 via the communication network 2, and the server device 20 checks whether the building has any defects or failures by image analysis using the image data, visual inspection by an observer, or the like.
  • When the building is imaged, the flying object 10a and the flying object 10b fly in formation at a certain separation distance, and the flying object 10a first performs imaging while flying around the building according to its flight schedule.
  • If any of the resulting image data does not meet a predetermined quality standard, the flying object 10b immediately images the part of the building corresponding to that image data.
  • Hereinafter, the imaging by the flying object 10b is referred to as "re-imaging".
  • FIG. 2 is a diagram showing the hardware configuration of the flying object 10.
  • The flying object 10 is physically configured as a computer device including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a flight device 1007, a sensor 1008, a positioning device 1009, and a bus connecting them. Each of these devices operates on power supplied by a battery (not shown). In the following description, the word "device" can be read as a circuit, a device, a unit, or the like.
  • The hardware configuration of the flying object 10 may include one or more of each of the devices shown in the figure, or may omit some of the devices.
  • Each function of the flying object 10 is realized by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, whereby the processor 1001 performs calculations, controls communication by the communication device 1004, and controls at least one of reading and writing of data in the memory 1002 and the storage 1003.
  • The processor 1001 controls the computer as a whole by, for example, running an operating system.
  • The processor 1001 may be configured as a central processing unit (CPU) including an interface with peripheral devices, a control device, an arithmetic unit, registers, and the like. For example, a baseband signal processing unit, a call processing unit, and the like may also be realized by the processor 1001.
  • The processor 1001 reads a program (program code), software modules, data, and the like from at least one of the storage 1003 and the communication device 1004 into the memory 1002, and executes various processes according to them.
  • As the program, a program that causes a computer to execute at least part of the operations described later is used.
  • The functional blocks of the flying object 10 may be realized by a control program stored in the memory 1002 and running on the processor 1001.
  • The various processes may be executed by one processor 1001, or may be executed simultaneously or sequentially by two or more processors 1001.
  • The processor 1001 may be implemented by one or more chips.
  • The program may be transmitted to the flying object 10 from the communication network 2 via a telecommunication line.
  • The memory 1002 is a computer-readable recording medium and may be composed of at least one of, for example, a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), an EEPROM (Electrically Erasable Programmable ROM), and a RAM (Random Access Memory).
  • The memory 1002 may be called a register, a cache, a main memory (main storage device), or the like.
  • The memory 1002 can store an executable program (program code), software modules, and the like for implementing the method according to this embodiment.
  • The storage 1003 is a computer-readable recording medium and may be composed of at least one of, for example, an optical disk such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, a stick, or a key drive), a floppy (registered trademark) disk, a magnetic strip, and the like.
  • The storage 1003 may be called an auxiliary storage device.
  • The storage 1003 stores, for example, identification information of the flying object 10 (referred to as flying object identification information). This identification information is used by the server device 20 to identify and control the flying object 10.
  • The communication device 1004 is hardware (a transmitting/receiving device) for communication between computers via the communication network 2, and is also called, for example, a network device, a network controller, a network card, or a communication module.
  • The communication device 1004 may include, for example, a high-frequency switch, a duplexer, a filter, a frequency synthesizer, and the like in order to realize at least one of frequency division duplex (FDD: Frequency Division Duplex) and time division duplex (TDD: Time Division Duplex).
  • For example, a transmitting/receiving antenna, an amplifier unit, a transmitting/receiving unit, a transmission line interface, and the like may be realized by the communication device 1004. The transmitting/receiving unit may be implemented with the transmission control unit and the receiving unit physically or logically separated from each other.
  • The input device 1005 is an input device that accepts input from the outside (for example, keys, a microphone, switches, and buttons), and in particular includes the imaging device.
  • The output device 1006 is an output device that performs output to the outside (for example, a display, a speaker, and an LED lamp).
  • The flight device 1007 is a mechanism for flying the flying object 10 in the air and includes, for example, propellers and the motors and drive mechanisms that drive them.
  • The sensor 1008 includes a group of sensors that measure the position or attitude of the flying object 10, such as a gyro sensor, an acceleration sensor, an air pressure (altitude) sensor, and a magnetic (azimuth) sensor, as well as a timekeeping device that measures time.
  • This timekeeping device is used, in particular, to measure when images are captured by the imaging device.
  • Specifically, the timekeeping device supplies the processor 1001 with a time signal that serves as the reference for when imaging is performed by the imaging device.
  • When image data captured by the imaging device is input to the processor 1001, the processor 1001 adds to the image data time data (a time stamp) indicating when the image was captured, based on the time signal supplied from the timekeeping device.
  • In this embodiment, the timekeeping device for measuring when images are captured is included in the sensor 1008, but it need not be; it suffices if some device in the flying object 10, such as the processor 1001, the communication device 1004, or the positioning device 1009, includes it.
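  • For illustration, the time-stamping step might look like the following sketch, reusing the CapturedImage type from the sketch above; using the system clock as a stand-in for the timekeeping device's time signal is an assumption:

```python
import time

def tag_frame(frame: bytes) -> CapturedImage:
    # Attach time data (a time stamp) to the image data, as the processor 1001
    # does using the time signal from the timekeeping device; time.time() here
    # merely stands in for that reference signal.
    return CapturedImage(frame=frame, timestamp=time.time())
```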
  • The positioning device 1009 measures the three-dimensional position of the flying object 10.
  • The positioning device 1009 is a GPS receiver and measures the position of the flying object 10 based on GPS signals received from a plurality of satellites.
  • The devices such as the processor 1001 and the memory 1002 are connected by a bus for communicating information.
  • The bus may be configured as a single bus, or as different buses between different devices.
  • The flying object 10 may include hardware such as a microprocessor, a digital signal processor (DSP), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array), and part or all of each functional block may be realized by that hardware.
  • For example, the processor 1001 may be implemented using at least one of these hardware components.
  • FIG. 3 is a diagram showing the hardware configuration of the server device 20.
  • The server device 20 is physically configured as a computer device including a processor 2001, a memory 2002, a storage 2003, a communication device 2004, an input device 2005, an output device 2006, and a bus connecting them.
  • Each function of the server device 20 is realized by loading predetermined software (a program) onto hardware such as the processor 2001 and the memory 2002, whereby the processor 2001 performs calculations, controls communication by the communication device 2004, and controls at least one of reading and writing of data in the memory 2002 and the storage 2003.
  • The processor 2001, the memory 2002, the storage 2003, the communication device 2004, and the bus connecting them are, as hardware, the same as the processor 1001, the memory 1002, the storage 1003, the communication device 1004, and the connecting bus described for the flying object 10, so their description is omitted.
  • The storage 2003 stores the flight schedules of the flying objects 10a and 10b.
  • The flight schedule of the flying object 10a includes the planned flight positions and planned flight times of the flying object 10a on a planned flight route determined in advance based on the position of the building, and information indicating the planned imaging positions, planned imaging directions, and scheduled imaging times of the flying object 10a.
  • A planned imaging direction is the direction in which the imaging device of the flying object 10a performs imaging at the corresponding planned imaging position.
  • The flight schedule of the flying object 10b includes the planned flight positions and planned flight times of the flying object 10b on a planned flight route determined in advance based on the position of the building.
  • The position, direction, and time of re-imaging by the flying object 10b are determined only when imaging by the flying object 10a is judged not to meet the quality standard, and are therefore not included in the flight schedule stored in advance in the storage 2003.
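  • As a sketch, one flight-schedule entry for the flying object 10a might be represented as follows; the field names and units are illustrative assumptions, not taken from the patent:

```python
# One illustrative flight-schedule entry for the flying object 10a,
# as stored in the storage 2003.
schedule_entry = {
    "scheduled_time": 1690000000.0,               # scheduled imaging time (epoch seconds)
    "flight_position": (35.681, 139.767, 30.0),   # planned position on the route (lat, lon, alt)
    "imaging_position": (35.681, 139.767, 30.0),  # planned imaging position
    "imaging_direction": (90.0, -10.0),           # planned imaging direction (yaw, pitch, degrees)
}
```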
  • The input device 2005 is an input device that accepts input from the outside (for example, a keyboard, a mouse, a microphone, switches, buttons, sensors, a joystick, or a ball controller).
  • The output device 2006 is an output device that performs output to the outside (for example, a display, a speaker, or an LED lamp).
  • The input device 2005 and the output device 2006 may be integrated (for example, as a touch panel).
  • FIG. 4 is a diagram showing an example of the functional configuration of the flight system 1. The server device 20 includes an acquisition unit 21, a detection unit 22, a specifying unit 23, and an output unit 24.
  • The acquisition unit 21 acquires various data from the flying object 10 via the communication network 2.
  • The data acquired by the acquisition unit 21 includes various behavior data such as the position and attitude of the flying object 10, image data captured by the imaging device of the flying object 10, and time data, measured by the timekeeping device, indicating when each image was captured.
  • The detection unit 22 detects, based on the above time data, the imaging time corresponding to image data that does not satisfy the image quality standard among the image data acquired by the acquisition unit 21.
  • The image quality standard is, for example, a threshold on a level quantified by an image recognition algorithm for an event such as defocus, image blur, backlight, or haze.
  • For example, the degree to which the subject is out of focus can be quantified from the output data of the autofocus control of the imaging device; when the shape or structure of the subject is known, the failure of image recognition to detect that shape or structure may also be quantified.
  • Likewise, overexposure or underexposure, which occurs when the brightness of the subject falls outside the dynamic range of the exposure control of the imaging device, can be quantified from the output data of the imaging device.
  • When such a quantified level does not reach the standard, the detection unit 22 determines that the image data does not satisfy the image quality standard. Since image data and time data correspond one to one, the detection unit 22 then detects the imaging time by referring to the time data corresponding to the image data determined not to satisfy the standard.
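  • One common way to quantify defocus, shown here purely as an illustrative stand-in for the quality standard described above (the patent does not prescribe this metric), is the variance of the image Laplacian:

```python
import cv2
import numpy as np

def is_sharp_enough(frame: bytes, threshold: float = 100.0) -> bool:
    # Score sharpness as the variance of the image Laplacian: a low variance
    # means few edges, i.e. a defocused or blurred image. The threshold value
    # is an assumption to be tuned per camera and subject.
    img = cv2.imdecode(np.frombuffer(frame, dtype=np.uint8), cv2.IMREAD_GRAYSCALE)
    return cv2.Laplacian(img, cv2.CV_64F).var() >= threshold
```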
  • The specifying unit 23 specifies the imaging position and imaging direction of the imaging device of the flying object 10a at the imaging time detected by the detection unit 22. Since the flying object 10a flies and captures images according to the flight schedule described above, the imaging position and imaging direction of an actual capture are uniquely determined by the scheduled imaging time. The specifying unit 23 therefore only has to find, in the flight schedule of the flying object 10a, the planned imaging position and planned imaging direction corresponding to the scheduled imaging time that equals the imaging time detected by the detection unit 22. That is, the specifying unit 23 specifies the imaging position and imaging direction of the imaging device of the flying object 10a at the detected imaging time based on the scheduled imaging times, planned imaging positions, and planned imaging directions of the flying object 10a, together with the time data corresponding to the image data that does not satisfy the quality standard.
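  • A minimal sketch of that schedule lookup, assuming the schedule-entry format sketched earlier and an illustrative matching tolerance:

```python
def specify_from_schedule(flight_schedule, imaging_time, tolerance_s=0.5):
    """Return the planned imaging position and direction whose scheduled
    imaging time matches the detected imaging time."""
    entry = min(flight_schedule,
                key=lambda e: abs(e["scheduled_time"] - imaging_time))
    if abs(entry["scheduled_time"] - imaging_time) > tolerance_s:
        raise LookupError("no scheduled imaging time matches the detected time")
    return entry["imaging_position"], entry["imaging_direction"]
```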
  • It is assumed here that the time measured by the timekeeping device of the flying object 10a agrees with the time axis of the flight schedule stored in the server device 20.
  • To that end, the timekeeping device of the flying object 10a needs to be synchronized with a timekeeping device of a node in the communication network 2, such as the server device 20.
  • The output unit 24 outputs information about the imaging position and imaging direction specified by the specifying unit 23.
  • In this embodiment, the output unit 24 outputs the information on the imaging position and imaging direction to the flying object 10b and instructs it to perform re-imaging at that imaging position and in that imaging direction.
  • In the following, where the server device 20 is described as performing processing, this specifically means that the processor 2001 executes the processing by loading predetermined software (a program) onto hardware such as the processor 2001 and the memory 2002, performing calculations, and controlling communication by the communication device 2004 and reading and/or writing of data in the memory 2002 and the storage 2003. The same applies to the flying object 10.
  • The flying object 10a flies and performs imaging according to its flight schedule under the control of the server device 20.
  • This imaging may capture a moving image or still images.
  • In the case of a moving image, the processor 1001 adds time data indicating the imaging time to each piece of image data corresponding to a so-called frame; in the case of still images, it adds time data indicating the imaging time to each still image.
  • These image data and time data are transmitted from the communication device 1004 of the flying object 10a to the server device 20 via the communication network 2.
  • The acquisition unit 21 of the server device 20 acquires various data, including these image data and time data, from the flying object 10a via the communication network 2 (step S11).
  • The detection unit 22 detects image data that does not satisfy the image quality standard among the image data acquired from the flying object 10a, and further detects the imaging time corresponding to that image data based on the corresponding time data (step S12).
  • The specifying unit 23 specifies the imaging position and imaging direction of the flying object 10a at the detected imaging time based on the flight schedule of the flying object 10a (step S13). That is, the specifying unit 23 finds in the flight schedule the scheduled imaging time equal to the detected imaging time, and takes the planned imaging position and planned imaging direction corresponding to that scheduled imaging time as the imaging position and imaging direction of the flying object 10a at the detected imaging time.
  • The output unit 24 outputs information on the specified imaging position and imaging direction to the flying object 10b via the communication network 2, and instructs the flying object 10b to perform re-imaging at that imaging position and in that imaging direction (step S14).
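  • Steps S11 to S14 could be tied together as in the following sketch, reusing the functions defined above; the acquire call and the flight_schedule attribute are hypothetical names:

```python
def run_inspection_cycle(server, body_a, body_b):
    images = server.acquire(body_a)                              # step S11: image and time data
    failed_times = detect_failed_times(images, is_sharp_enough)  # step S12: detect failures
    for t in failed_times:
        pose = specify_from_schedule(server.flight_schedule, t)  # step S13: specify pose
        output_reimaging_info(pose, body_b)                      # step S14: instruct re-imaging
```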
  • Upon receiving this instruction, the flying object 10b flies to the imaging position and images the subject in the imaging direction at that position.
  • The image data obtained by this re-imaging is transmitted from the communication device 1004 of the flying object 10b to the server device 20 via the communication network 2.
  • When the acquisition unit 21 of the server device 20 acquires various data, including these image data and time data, from the flying object 10b via the communication network 2, it determines whether the image quality standard is satisfied. If the quality standard is not met, the output unit 24 again instructs the flying object 10b to re-image at the same imaging position and in the same imaging direction. Such re-imaging instructions to the flying object 10b are repeated until the quality standard is met.
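  • A sketch of that repetition; the capture call is a hypothetical name, and the bound on attempts is a safety assumption where the text simply repeats until the standard is met:

```python
def reimage_until_ok(pose, second_body, max_tries=5):
    # Repeat the re-imaging instruction at the same position and direction
    # until the returned image data meets the quality standard.
    for _ in range(max_tries):
        frame = second_body.capture(pose)  # hypothetical capture-and-return call
        if is_sharp_enough(frame):
            return frame
    raise RuntimeError("quality standard not met after repeated re-imaging")
```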
  • The above processing is repeated until the imaging process for the subject ends (step S15; YES).
  • As described above, when the image of a subject captured by the imaging device mounted on the flying object 10 does not meet the quality standard, the subject can be re-imaged at the same imaging position and in the same imaging direction as at the time of the original imaging.
  • Modification 1: In the above embodiment, the specifying unit 23 specifies the imaging position and imaging direction of the flying object 10a corresponding to the imaging time detected by the detection unit 22 based on the flight schedule.
  • Instead, the specifying unit 23 may specify the imaging position and imaging direction of the flying object 10a based on the position and orientation of the flying object 10a acquired from the flying object 10a by the acquisition unit 21. That is, when the flying object 10a transmits the image data and time data to the server device 20, it also transmits information indicating the position of the flying object 10a measured by the positioning device 1009 at the time of imaging and the orientation of the flying object 10a detected by the sensor 1008.
  • The position of the flying object 10a is the position of the imaging device, and there is a fixed correlation between the orientation of the flying object 10a and the imaging direction. The specifying unit 23 of the server device 20 therefore specifies the imaging position and imaging direction of the imaging device based on the position and orientation measured by the flying object 10a at the time indicated by the time data corresponding to the image data that does not satisfy the image quality standard.
  • Modification 2: When the subject bears a marker, such as a sign with a predetermined figure, the marker may be used to specify the imaging position and imaging direction for re-imaging. Specifically, the specifying unit 23 specifies the imaging position and imaging direction of the flying object 10a based on the position of the marker obtained by analyzing the marker attached to the subject in the image data, and the position measured by the positioning device 1009 of the flying object 10a at the time indicated by the time data corresponding to the image data that does not satisfy the quality standard.
  • Here, the imaging position of the flying object 10a at the imaging time detected by the detection unit 22 is the position measured by the positioning device 1009 of the flying object 10a at the time indicated by the time data corresponding to the image data that does not satisfy the quality standard, and the imaging direction is the direction from that measured position toward the position of the marker obtained by analyzing the marker attached to the subject in the image data.
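  • As an illustrative sketch, the imaging direction from the measured position toward the analyzed marker position could be computed as follows, assuming local Cartesian coordinates:

```python
import math

def direction_to_marker(drone_pos, marker_pos):
    """Bearing (yaw, pitch in degrees) from the drone's measured position
    toward the marker position obtained by image analysis."""
    dx, dy, dz = (m - d for m, d in zip(marker_pos, drone_pos))
    yaw = math.degrees(math.atan2(dy, dx))
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return yaw, pitch
```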
  • Modification 3: Each image captured by the flying object 10a may be given a priority based on its importance or the like. The priority is stored in advance in association with the scheduled imaging time, planned imaging position, and planned imaging direction in the flight schedule.
  • In this case, the output unit 24 may output information on the specified imaging position and imaging direction only for image data whose priority is equal to or higher than a threshold among the image data that does not satisfy the image quality standard, and not output that information for image data whose priority is below the threshold.
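  • A sketch of that priority filter; the field names are illustrative assumptions:

```python
def poses_to_output(failed_entries, priority_threshold):
    # Output the specified imaging position and direction only for failed
    # image data whose priority is at or above the threshold.
    return [(e["imaging_position"], e["imaging_direction"])
            for e in failed_entries
            if e["priority"] >= priority_threshold]
```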
  • Modification 4: In the above embodiment, the output unit 24 outputs the information on the imaging position and imaging direction to the flying object 10b and instructs it to re-image at that imaging position and in that imaging direction.
  • However, the output unit 24 only needs to output information on the imaging position and imaging direction specified by the specifying unit 23.
  • For example, an operator may maneuver the flying object 10b according to the output imaging position and imaging direction so that the flying object 10b performs re-imaging at that imaging position and in that imaging direction.
  • Until it receives an instruction for re-imaging, the flying object 10b may image the subject in a supplementary manner or may simply stand by. Although one of the flying objects 10a and 10b performs the first imaging and the other performs the re-imaging, their roles may be swapped. Further, although the moving body in the above embodiment is a flying object, the moving body that performs imaging in the present invention is not limited to a flying device and may be a device that moves by any means.
  • The information processing device may be implemented in a server device as in the above embodiment, or at least part of its functions may be implemented in another device, such as a flying object or a control device for maneuvering a flying object.
  • Each functional block may be realized by one device that is physically or logically coupled, or by two or more physically or logically separate devices connected directly or indirectly (for example, by wire or wirelessly).
  • A functional block may also be realized by combining software with the one device or the plurality of devices.
  • Functions include judging, deciding, determining, calculating, computing, processing, deriving, investigating, searching, confirming, receiving, transmitting, outputting, accessing, resolving, selecting, choosing, establishing, comparing, assuming, expecting, regarding, broadcasting, notifying, communicating, forwarding, configuring, reconfiguring, allocating, mapping, and assigning, but are not limited to these.
  • For example, a functional block that causes transmission to function may be called a transmission control unit (transmitting unit) or a transmitter.
  • In any case, as described above, the method of realizing each functional block is not particularly limited.
  • The server device and the like in the embodiments of the present disclosure may function as a computer that performs the processing of the present disclosure.
  • Each aspect/embodiment described in the present disclosure may be applied to at least one of LTE (Long Term Evolution), LTE-A (LTE-Advanced), SUPER 3G, IMT-Advanced, 4G (4th generation mobile communication system), 5G (5th generation mobile communication system), FRA (Future Radio Access), NR (New Radio), W-CDMA (registered trademark), GSM (registered trademark), CDMA2000, UMB (Ultra Mobile Broadband), IEEE 802.11 (Wi-Fi (registered trademark)), IEEE 802.16 (WiMAX (registered trademark)), IEEE 802.20, UWB (Ultra-WideBand), Bluetooth (registered trademark), systems using other appropriate systems, and next-generation systems extended based on these. A plurality of systems may also be applied in combination (for example, a combination of at least one of LTE and LTE-A with 5G).
  • Input and output information and the like may be stored in a specific location (for example, a memory) or managed using a management table. Input and output information and the like can be overwritten, updated, or appended. Output information and the like may be deleted. Input information and the like may be transmitted to another device.
  • A determination may be made by a value represented by one bit (0 or 1), by a Boolean value (true or false), or by a comparison of numerical values (for example, a comparison with a predetermined value).
  • Regardless of whether it is called software, firmware, middleware, microcode, hardware description language, or any other name, software should be interpreted broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, and the like.
  • Software, instructions, information, and the like may be transmitted and received via a transmission medium.
  • For example, when software is transmitted from a website, a server, or another remote source using at least one of wired technologies (such as coaxial cable, fiber-optic cable, twisted pair, and digital subscriber line (DSL)) and wireless technologies (such as infrared and microwave), at least one of these wired and wireless technologies is included within the definition of a transmission medium.
  • The information, signals, and the like described in the present disclosure may be represented using any of a variety of different techniques.
  • For example, data, instructions, commands, information, signals, bits, symbols, chips, and the like that may be referred to throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination of these.
  • The terms described in the present disclosure and the terms necessary for understanding the present disclosure may be replaced with terms having the same or similar meanings.
  • The information, parameters, and the like described in the present disclosure may be expressed using absolute values, relative values from predetermined values, or other corresponding information.
  • References to elements using designations such as "first" and "second" as used in the present disclosure do not generally limit the quantity or order of those elements. These designations may be used in the present disclosure as a convenient way of distinguishing between two or more elements. Thus, references to first and second elements do not mean that only two elements may be employed or that the first element must in some way precede the second element.
  • Each of the above devices may be replaced with a "means", a "circuit", a "device", or the like.
  • The phrase "A and B are different" may mean "A and B are different from each other". It may also mean "A and B are each different from C". Terms such as "separate" and "coupled" may be interpreted in the same way as "different".
  • 1: Flight system
  • 2: Communication network
  • 10: Flying object
  • 1001: Processor
  • 1002: Memory
  • 1003: Storage
  • 1004: Communication device
  • 1005: Input device
  • 1006: Output device
  • 1007: Flight device
  • 1008: Sensor
  • 1009: Positioning device
  • 20: Server device
  • 21: Acquisition unit
  • 22: Detection unit
  • 23: Specifying unit
  • 24: Output unit
  • 2002: Memory
  • 2004: Communication device
  • 2005: Input device
  • 2006: Output device

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Mechanical Engineering (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A server device 20 controls a flying object 10 so that it flies to within close range of a building to be inspected and captures images of the building. The flying object 10 transmits image data of the building to the server device 20, and the server device 20 uses the image data to check whether the building has any defects, by image analysis, visual inspection by an observer, or the like. When the building is imaged, a flying object 10a and a flying object 10b fly in formation at a certain separation distance: the flying object 10a first captures images while flying along the building according to a flight schedule, and if any of the resulting image data does not meet a quality criterion, the flying object 10b captures an image of the part of the building corresponding to that image data.

Description

Information processing device
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2019-9919 (特開2019-9919号公報)
FIG. 1 is a diagram showing an example of the configuration of the flight system 1 according to an embodiment for carrying out the present invention. FIG. 2 is a diagram showing the hardware configuration of the flying object 10 according to the embodiment. FIG. 3 is a diagram showing the hardware configuration of the server device 20 according to the embodiment. FIG. 4 is a diagram showing an example of the functional configuration of the flight system 1 according to the embodiment. FIG. 5 is a flowchart showing an example of the operation of the server device 20 according to the embodiment.
[実施形態]
 図1は、飛行システム1の構成の一例を示す図である。飛行システム1は、被写体の点検、監視又は観察等を遠隔で行うためのシステムである。ここでいう被写体は、例えば建造物等の人工物や植物や地形等の自然物であってもよいし、災害発生時における災害状況又は被災者であってもよい。本実施形態では被写体として、建造物を想定している。
[Embodiment]
FIG. 1 is a diagram showing an example of the configuration of the flight system 1. The flight system 1 is a system for remotely inspecting, monitoring, or observing a subject. The subject referred to here may be, for example, an artificial object such as a building or a natural object such as a plant or terrain, or may be a disaster situation or a victim at the time of a disaster. In this embodiment, a building is assumed as a subject.
 図1に示すように、飛行システム1は、例えばドローンと呼ばれる複数の飛行体10a,10bと、サーバ装置20と、これらを通信可能に接続する通信網2とを備える。サーバ装置20は、飛行体10a,10bを制御する情報処理装置として機能する。通信網2は、例えばLTE(Long Term Evolution)等の無線通信網である。 As shown in FIG. 1, the flight system 1 includes, for example, a plurality of flying objects 10a and 10b called drones, a server device 20, and a communication network 2 that connects these in a communicable manner. The server device 20 functions as an information processing device that controls the flying objects 10a and 10b. The communication network 2 is, for example, a wireless communication network such as LTE (Long Term Evolution).
 飛行システム1は、飛行体10a(本発明における第1の移動体の一例)及び飛行体10b(本発明における第2の移動体の一例)という少なくとも2以上の飛行体を備える。以降の説明において、これら複数の飛行体を区別しないときは、飛行体10と総称する。飛行体10は、図示せぬ操縦者による操縦装置の操作に応じて飛行(いわゆる手動操縦飛行)する飛行体であってもよいし、サーバ装置20による制御の下で自律的に飛行(いわゆる自動操縦飛行)する飛行体であってもよいし、これらの手動操縦飛行及び自動操縦飛行を併用する飛行体であってもよい。本実施形態では、自動操縦飛行を行う飛行体の例で説明する。 The flight system 1 includes at least two or more flying objects, a flying object 10a (an example of a first moving body in the present invention) and a flying object 10b (an example of a second moving body in the present invention). In the following description, when these plurality of flying objects are not distinguished, they are collectively referred to as the flying object 10. The flying object 10 may be a flying object that flies in response to an operation of the control device by an operator (so-called manual flight), which is not shown, or autonomously flies under the control of the server device 20 (so-called automatic flight). It may be an air vehicle that performs maneuvering flight), or it may be an air vehicle that combines these manual flight and automatic flight. In this embodiment, an example of an air vehicle that performs autopilot flight will be described.
 サーバ装置20は、飛行体10を制御して、点検対象である建造物の至近距離まで飛行させ、その飛行体10によって建造物を撮像させる。飛行体10は通信網2経由で建造物の画像データをサーバ装置20に送信し、サーバ装置20において、その画像データを用いて画像解析や監視者の目視等により建造物に不備や故障がないかどうかが点検される。建造物の撮像においては、飛行体10a及び飛行体10bが或る離隔距離を隔てた編隊飛行を行いつつ、まず飛行体10aが飛行予定に従い建造物の周囲を飛行しながら撮像を行い、その画像データにおいて所定の品質基準を満たさない画像データがある場合は、直ちに、その品質基準を満たさない建造物の箇所に対して飛行体10bによる撮像が行われるようになっている。以下において、飛行体10bによる撮像を「再撮像」という。 The server device 20 controls the flying object 10 to fly to a close distance of the building to be inspected, and the flying object 10 images the building. The air vehicle 10 transmits image data of the building to the server device 20 via the communication network 2, and the server device 20 does not have any defects or failures in the building by image analysis using the image data, visual inspection by the observer, or the like. It is checked whether or not. In the imaging of a building, the flying object 10a and the flying object 10b perform a formation flight with a certain separation distance, and the flying object 10a first takes an image while flying around the building according to the flight schedule. When there is image data that does not meet the predetermined quality standard in the data, the flight object 10b immediately takes an image of the part of the building that does not meet the quality standard. Hereinafter, the imaging by the flying object 10b is referred to as "reimaging".
 図2は、飛行体10のハードウェア構成を示す図である。飛行体10は、物理的には、プロセッサ1001、メモリ1002、ストレージ1003、通信装置1004、入力装置1005、出力装置1006、飛行装置1007、センサ1008、測位装置1009及びこれらを接続するバスなどを含むコンピュータ装置として構成されている。これらの各装置は図示せぬ電池から供給される電力によって動作する。なお、以下の説明では、「装置」という文言は、回路、デバイス、ユニットなどに読み替えることができる。飛行体10のハードウェア構成は、図に示した各装置を1つ又は複数含むように構成されてもよいし、一部の装置を含まずに構成されてもよい。 FIG. 2 is a diagram showing a hardware configuration of the flying object 10. The flying object 10 physically includes a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a flight device 1007, a sensor 1008, a positioning device 1009, a bus connecting them, and the like. It is configured as a computer device. Each of these devices operates on power supplied by a battery (not shown). In the following description, the word "device" can be read as a circuit, device, unit, or the like. The hardware configuration of the aircraft body 10 may be configured to include one or more of the devices shown in the figure, or may be configured not to include some of the devices.
 飛行体10における各機能は、プロセッサ1001、メモリ1002などのハードウェア上に所定のソフトウェア(プログラム)を読み込ませることによって、プロセッサ1001が演算を行い、通信装置1004による通信を制御したり、メモリ1002及びストレージ1003におけるデータの読み出し及び書き込みの少なくとも一方を制御したりすることによって実現される。 For each function in the aircraft 10, the processor 1001 performs calculations by loading predetermined software (programs) on hardware such as the processor 1001 and the memory 1002, and controls communication by the communication device 1004, or the memory 1002. And by controlling at least one of reading and writing of data in the storage 1003.
 プロセッサ1001は、例えば、オペレーティングシステムを動作させてコンピュータ全体を制御する。プロセッサ1001は、周辺装置とのインターフェース、制御装置、演算装置、レジスタなどを含む中央処理装置(CPU:Central Processing Unit)によって構成されてもよい。また、例えばベースバンド信号処理部や呼処理部などがプロセッサ1001によって実現されてもよい。 The processor 1001 operates, for example, an operating system to control the entire computer. The processor 1001 may be configured by a central processing unit (CPU: Central Processing Unit) including an interface with a peripheral device, a control device, an arithmetic unit, a register, and the like. Further, for example, a baseband signal processing unit, a call processing unit, and the like may be realized by the processor 1001.
 プロセッサ1001は、プログラム(プログラムコード)、ソフトウェアモジュール、データなどを、ストレージ1003及び通信装置1004の少なくとも一方からメモリ1002に読み出し、これらに従って各種の処理を実行する。プログラムとしては、後述する動作の少なくとも一部をコンピュータに実行させるプログラムが用いられる。飛行体10の機能ブロックは、メモリ1002に格納され、プロセッサ1001において動作する制御プログラムによって実現されてもよい。各種の処理は、1つのプロセッサ1001によって実行されてもよいが、2以上のプロセッサ1001により同時又は逐次に実行されてもよい。プロセッサ1001は、1以上のチップによって実装されてもよい。なお、プログラムは、電気通信回線を介して通信網2から飛行体10に送信されてもよい。 The processor 1001 reads a program (program code), a software module, data, etc. from at least one of the storage 1003 and the communication device 1004 into the memory 1002, and executes various processes according to these. As the program, a program that causes a computer to execute at least a part of the operations described later is used. The functional block of the aircraft body 10 may be realized by a control program stored in the memory 1002 and operating in the processor 1001. Various processes may be executed by one processor 1001, but may be executed simultaneously or sequentially by two or more processors 1001. Processor 1001 may be implemented by one or more chips. The program may be transmitted from the communication network 2 to the aircraft 10 via a telecommunication line.
 メモリ1002は、コンピュータ読み取り可能な記録媒体であり、例えば、ROM(Read Only Memory)、EPROM(Erasable Programmable ROM)、EEPROM(Electrically Erasable Programmable ROM)、RAM(Random Access Memory)などの少なくとも1つによって構成されてもよい。メモリ1002は、レジスタ、キャッシュ、メインメモリ(主記憶装置)などと呼ばれてもよい。メモリ1002は、本実施形態に係る方法を実施するために実行可能なプログラム(プログラムコード)、ソフトウェアモジュールなどを保存することができる。 The memory 1002 is a computer-readable recording medium, and is composed of at least one such as a ROM (ReadOnlyMemory), an EPROM (ErasableProgrammableROM), an EEPROM (ElectricallyErasableProgrammableROM), and a RAM (RandomAccessMemory). May be done. The memory 1002 may be referred to as a register, a cache, a main memory (main storage device), or the like. The memory 1002 can store a program (program code), a software module, or the like that can be executed to carry out the method according to the present embodiment.
 ストレージ1003は、コンピュータ読み取り可能な記録媒体であり、例えば、CD-ROM(Compact Disc ROM)などの光ディスク、ハードディスクドライブ、フレキシブルディスク、光磁気ディスク(例えば、コンパクトディスク、デジタル多用途ディスク、Blu-ray(登録商標)ディスク)、スマートカード、フラッシュメモリ(例えば、カード、スティック、キードライブ)、フロッピー(登録商標)ディスク、磁気ストリップなどの少なくとも1つによって構成されてもよい。ストレージ1003は、補助記憶装置と呼ばれてもよい。ストレージ1003は、例えば飛行体10の識別情報(飛行体識別情報という)を記憶する。この飛行体識別情報は、サーバ装置20が飛行体10を識別して制御するために用いられる。 The storage 1003 is a computer-readable recording medium, and is, for example, an optical disk such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, an optical magnetic disk (for example, a compact disk, a digital versatile disk, or a Blu-ray). It may consist of at least one (registered trademark) disk), smart card, flash memory (eg, card, stick, key drive), floppy (registered trademark) disk, magnetic strip, and the like. The storage 1003 may be referred to as an auxiliary storage device. The storage 1003 stores, for example, identification information (referred to as flying object identification information) of the flying object 10. This aircraft identification information is used by the server device 20 to identify and control the aircraft 10.
 通信装置1004は、通信網2を介してコンピュータ間の通信を行うためのハードウェア(送受信デバイス)であり、例えばネットワークデバイス、ネットワークコントローラ、ネットワークカード、通信モジュールなどともいう。通信装置1004は、例えば周波数分割複信(FDD:Frequency Division Duplex)及び時分割複信(TDD:Time Division Duplex)の少なくとも一方を実現するために、高周波スイッチ、デュプレクサ、フィルタ、周波数シンセサイザなどを含んで構成されてもよい。例えば、送受信アンテナ、アンプ部、送受信部、伝送路インターフェースなどは、通信装置1004によって実現されてもよい。送受信部は、送信制御部と受信部とで、物理的に、又は論理的に分離された実装がなされてもよい。 The communication device 1004 is hardware (transmission / reception device) for communicating between computers via the communication network 2, and is also referred to as, for example, a network device, a network controller, a network card, a communication module, or the like. The communication device 1004 includes, for example, a high frequency switch, a duplexer, a filter, a frequency synthesizer, and the like in order to realize at least one of frequency division duplex (FDD: Frequency Division Duplex) and time division duplex (TDD: Time Division Duplex). It may be composed of. For example, the transmission / reception antenna, the amplifier unit, the transmission / reception unit, the transmission line interface, and the like may be realized by the communication device 1004. The transmission / reception unit may be physically or logically separated from each other by the transmission control unit and the reception unit.
 入力装置1005は、外部からの入力を受け付ける入力デバイス(例えば、キー、マイクロフォン、スイッチ、ボタンなど)であり、特に撮像装置を含む。出力装置1006は、外部への出力を実施する出力デバイス(例えば、ディスプレイ、スピーカー、LEDランプなど)である。 The input device 1005 is an input device (for example, a key, a microphone, a switch, a button, etc.) that receives an input from the outside, and particularly includes an image pickup device. The output device 1006 is an output device (for example, a display, a speaker, an LED lamp, etc.) that outputs to the outside.
 飛行装置1007は、飛行体10を空中で飛行させるための機構であり、例えばプロペラや、そのプロペラを駆動するためのモータ及び駆動機構を含む。 The flight device 1007 is a mechanism for flying the flying object 10 in the air, and includes, for example, a propeller, a motor for driving the propeller, and a drive mechanism.
 センサ1008は、例えばジャイロセンサ、加速度センサ、気圧(高度)センサ、磁気(方位)センサ等の、飛行体10の位置又は姿勢を計測するセンサ群のほか、時刻を計測する計時装置を含む。この計時装置は、特に、撮像装置による撮像がなされた時期を計測するために用いられる。具体的には、この計時装置は、撮像装置による撮像がなされた時期の基準となる時刻信号をプロセッサ1001に供給する。撮像装置によって撮像された画像データがプロセッサ1001に入力されると、プロセッサ1001は、計時装置から供給される時刻信号に基づいて、撮像装置による撮像がなされた時期を示す時期データ(タイムスタンプ)を画像データに付与する。なお、撮像装置による撮像がなされた時期を計測するための計時装置は、本実施形態ではセンサ1008が備えていたが、必ずしもそうである必要はなく、例えばプロセッサ1001、通信装置1004又は測位装置1009等の、飛行体10内の装置が備えていればよい。 The sensor 1008 includes a group of sensors for measuring the position or attitude of the flying object 10, such as a gyro sensor, an acceleration sensor, a pressure (altitude) sensor, and a magnetic (orientation) sensor, as well as a time measuring device for measuring the time. This timekeeping device is particularly used to measure the time when an image is taken by the image pickup device. Specifically, this timekeeping device supplies the processor 1001 with a time signal that serves as a reference for the time when the image is taken by the image pickup device. When the image data captured by the image pickup device is input to the processor 1001, the processor 1001 outputs time data (time stamp) indicating the time when the image pickup is performed by the image pickup device based on the time signal supplied from the time measuring device. Add to image data. Although the sensor 1008 is provided in the present embodiment as a timekeeping device for measuring the time when the image is taken by the image pickup device, it is not always necessary, for example, the processor 1001, the communication device 1004, or the positioning device 1009. It suffices if the device in the flying object 10 is provided.
 The positioning device 1009 measures the three-dimensional position of the flying object 10. The positioning device 1009 is a GPS receiver, and measures the position of the flying object 10 based on GPS signals received from a plurality of satellites.
 The devices such as the processor 1001 and the memory 1002 are connected by a bus for communicating information. The bus may be configured as a single bus, or as different buses between different devices.
 The flying object 10 may include hardware such as a microprocessor, a digital signal processor (DSP), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array), and some or all of the functional blocks may be realized by such hardware. For example, the processor 1001 may be implemented using at least one of these pieces of hardware.
 FIG. 3 is a diagram showing the hardware configuration of the server device 20. The server device 20 is physically configured as a computer device including a processor 2001, a memory 2002, a storage 2003, a communication device 2004, an input device 2005, an output device 2006, and a bus connecting them. Each function in the server device 20 is realized by loading predetermined software (a program) onto hardware such as the processor 2001 and the memory 2002, whereby the processor 2001 performs operations and controls communication by the communication device 2004 and/or the reading and writing of data in the memory 2002 and the storage 2003. The processor 2001, the memory 2002, the storage 2003, the communication device 2004, and the bus connecting them are, as hardware, the same as the processor 1001, the memory 1002, the storage 1003, the communication device 1004, and the connecting bus described for the flying object 10, so their description is omitted.
 The storage 2003 stores the flight schedules of the flying objects 10a and 10b. The flight schedule of the flying object 10a includes the scheduled flight positions and scheduled flight times of the flying object 10a on a flight route planned in advance based on the position of the building, together with information indicating the scheduled imaging positions, scheduled imaging directions, and scheduled imaging times of the flying object 10a. Here, a scheduled imaging direction is the direction in which the imaging device of the flying object 10a performs imaging at the corresponding scheduled imaging position. The flight schedule of the flying object 10b, on the other hand, includes the scheduled flight positions and scheduled flight times of the flying object 10b on a flight route planned in advance based on the position of the building. The position, direction, and timing of re-imaging by the flying object 10b are determined only when it is judged that imaging by the flying object 10a does not satisfy the quality standard, and are therefore not included in the flight schedule stored in advance in the storage 2003.
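 A flight schedule entry of the kind described above could, for example, be represented as in the following sketch; the field names and coordinate conventions are assumptions for illustration, not a data format defined by the disclosure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ScheduleEntry:
    scheduled_time: float                 # scheduled imaging time
    position: tuple[float, float, float]  # scheduled imaging position (x, y, z)
    direction: tuple[float, float]        # scheduled imaging direction (azimuth, elevation)

# The flight schedule of the flying object 10a: an ordered list of entries.
FlightSchedule = list[ScheduleEntry]
```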
 The input device 2005 is an input device that accepts input from the outside (for example, a keyboard, a mouse, a microphone, switches, buttons, sensors, a joystick, and a ball controller). The output device 2006 is an output device that performs output to the outside (for example, a display, a speaker, and an LED lamp). The input device 2005 and the output device 2006 may be integrated into a single component (for example, a touch panel).
 FIG. 4 is a diagram showing an example of the functional configuration of the flight system 1. In the server device 20, the acquisition unit 21 acquires various data from the flying object 10 via the communication network 2. The data acquired by the acquisition unit 21 include, for example, behavior data such as the position and attitude of the flying object 10, image data captured by the imaging device of the flying object 10, and time data, measured by the timekeeping device of the flying object 10, indicating when each image was captured.
 The detection unit 22 detects, based on the time data, the imaging time corresponding to image data, among the image data acquired by the acquisition unit 21, that does not satisfy the image quality standard. The image quality standard quantifies, with some image recognition algorithm, the level corresponding to a threshold for phenomena such as defocus, image blur, backlight, and haze in the image. For defocus, for example, the degree to which the subject is out of focus under the imaging device's autofocus control can be quantified from the output data of the imaging device. When the shape or structure of the subject is known, the failure of image recognition to detect that shape or structure may also be quantified. As for backlight, the degree of blown-out highlights or crushed shadows that occur when the brightness of the subject falls outside the dynamic range of the imaging device's exposure control can be quantified from the output data of the imaging device. When the value obtained by quantifying the acquired image data according to such an algorithm falls short of the level corresponding to the threshold, the detection unit 22 judges that the image data does not satisfy the image quality standard. Since image data and time data are in one-to-one correspondence, the detection unit 22 then refers to the time data corresponding to the image data judged not to satisfy the quality standard, and thereby detects its imaging time.
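 The threshold comparison performed by the detection unit 22 might look like the following minimal sketch, reusing the TaggedImage type from the earlier sketch; quality_score is a placeholder for whatever image recognition algorithm is used, and the threshold value is an assumption.

```python
QUALITY_THRESHOLD = 0.8  # assumed level corresponding to the threshold

def quality_score(payload: bytes) -> float:
    """Placeholder for the image recognition algorithm that quantifies
    defocus, blur, backlight, haze, etc. into a single level."""
    raise NotImplementedError

def failed_imaging_times(images: list[TaggedImage]) -> list[float]:
    # Image data and time data correspond one-to-one, so the timestamp of
    # each failing image directly yields the imaging time to be detected.
    return [img.captured_at for img in images
            if quality_score(img.payload) < QUALITY_THRESHOLD]
```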
 The specifying unit 23 specifies the imaging position and imaging direction of the imaging device of the flying object 10a at the imaging time detected by the detection unit 22. Since the flying object 10a flies and performs imaging according to the flight schedule described above, the imaging position and imaging direction actually used are uniquely determined by the scheduled imaging time. The specifying unit 23 therefore has only to specify, in the flight schedule of the flying object 10a, the imaging position and imaging direction corresponding to the scheduled imaging time that matches the imaging time detected by the detection unit 22. In other words, the specifying unit 23 specifies the imaging position and imaging direction of the imaging device of the flying object 10a at the detected imaging time, based on the scheduled imaging times, scheduled imaging positions, and scheduled imaging directions of the imaging device of the flying object 10a, and on the time data corresponding to the image data that does not satisfy the quality standard. Because the actual imaging position and imaging direction are specified from the scheduled imaging time in the flight schedule in this way, the time kept by the timekeeping device of the flying object 10a must be precisely aligned with the time axis of the flight schedule stored by the server device 20. That is, the timekeeping device of the flying object 10a needs to be synchronized with the timekeeping devices of nodes on the communication network 2, such as the server device 20.
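 One possible form of this schedule lookup is sketched below, reusing the ScheduleEntry and FlightSchedule types from the earlier sketch; the matching tolerance is an assumption introduced for illustration.

```python
def specify_pose(schedule: FlightSchedule, detected_time: float,
                 tolerance: float = 0.5) -> ScheduleEntry | None:
    # The flying object flies and images according to its schedule, so the
    # entry whose scheduled imaging time matches the detected imaging time
    # gives the actual imaging position and direction (clocks synchronized).
    best = min(schedule, key=lambda e: abs(e.scheduled_time - detected_time))
    if abs(best.scheduled_time - detected_time) <= tolerance:
        return best
    return None
```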
 The output unit 24 outputs information on the imaging position and imaging direction specified by the specifying unit 23. When the flying object 10 flies under automatic control, as in the present embodiment, the output unit 24 outputs the information on the imaging position and imaging direction to the flying object 10b and instructs it to perform re-imaging at that imaging position and in that imaging direction.
[Operation]
 Next, the operation of the server device 20 will be described. In the following description, when the server device 20 is described as the subject performing processing, this specifically means that the processing is executed by loading predetermined software (a program) onto hardware such as the processor 2001 and the memory 2002, whereby the processor 2001 performs operations and controls communication by the communication device 2004 and the reading and/or writing of data in the memory 2002 and the storage 2003. The same applies to the flying object 10.
 In FIG. 5, the flying object 10a flies and performs imaging according to the flight schedule under the control of the server device 20. This imaging may be imaging of moving images or of still images. In the case of moving images, the processor 1001 attaches time data indicating the imaging time to each piece of image data corresponding to a so-called frame. In the case of still images, the processor 1001 attaches time data indicating the imaging time to each still image. The image data and time data are transmitted from the communication device 1004 of the flying object 10a to the server device 20 via the communication network 2. The acquisition unit 21 of the server device 20 acquires various data, including the image data and time data, from the flying object 10a via the communication network 2 (step S11).
 Next, the detection unit 22 detects, among the image data acquired from the flying object 10a, image data that does not satisfy the image quality standard, and further detects the imaging time corresponding to that image data, based on the corresponding time data (step S12).
 Next, the specifying unit 23 specifies the imaging position and imaging direction of the flying object 10a at the detected imaging time, based on the flight schedule of the flying object 10a (step S13). That is, the specifying unit 23 finds in the flight schedule the scheduled imaging time that matches the detected imaging time, and specifies the scheduled imaging position and scheduled imaging direction corresponding to that scheduled imaging time as the imaging position and imaging direction of the flying object 10a at the detected imaging time.
 Next, the output unit 24 outputs the information on the specified imaging position and imaging direction to the flying object 10b via the communication network 2, and instructs it to perform re-imaging at that imaging position and in that imaging direction (step S14). In response to this instruction, the flying object 10b flies to the imaging position and images the subject in the imaging direction from that position. The image data from this re-imaging are transmitted from the communication device 1004 of the flying object 10b to the server device 20 via the communication network 2. When the acquisition unit 21 of the server device 20 acquires the various data, including the image data and time data, from the flying object 10b via the communication network 2, it judges whether the image quality standard is satisfied. If the quality standard is not satisfied, the output unit 24 instructs the flying object 10b to re-image once more at the same imaging position and in the same imaging direction. Such re-imaging instructions to the flying object 10b are repeated until the quality standard is satisfied.
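 Taken together, steps S11 to S14 amount to a retry loop of roughly the following shape; instruct_reimaging and await_image are hypothetical callables standing in for the network exchanges with the flying object 10b, and quality_score and QUALITY_THRESHOLD are reused from the earlier sketch.

```python
from typing import Callable

def reimage_until_ok(pose: ScheduleEntry,
                     instruct_reimaging: Callable[[ScheduleEntry], None],
                     await_image: Callable[[], TaggedImage]) -> TaggedImage:
    # Repeat the re-imaging instruction to the flying object 10b until the
    # returned image satisfies the quality standard.
    while True:
        instruct_reimaging(pose)  # output the imaging position/direction (step S14)
        image = await_image()     # acquire the re-imaged data (as in step S11)
        if quality_score(image.payload) >= QUALITY_THRESHOLD:
            return image
```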
 The above processing is repeated until the imaging process for the subject is completed (step S15; YES).
 According to the embodiment described above, when an image of the subject captured by the imaging device mounted on the flying object 10 does not satisfy the quality standard, the subject can be imaged again at the same imaging position and in the same imaging direction as at the time of the original imaging.
[Modifications]
 The present invention is not limited to the embodiments described above. The embodiments described above may be modified as follows. Two or more of the following modifications may also be combined.
[Modification 1]
 In the embodiment above, the specifying unit 23 specifies the imaging position and imaging direction of the flying object 10a corresponding to the imaging time detected by the detection unit 22 based on the flight schedule. Instead, the specifying unit 23 may specify the imaging position and imaging direction of the flying object 10a based on the position and orientation of the flying object 10a that the acquisition unit 21 acquires from the flying object 10a. That is, when the flying object 10a transmits the image data and time data to the server device 20, it also transmits information indicating the position of the flying object 10a measured by the positioning device 1009 at the time of imaging and the orientation (azimuth) of the flying object 10a detected by the sensor 1008. The position of the flying object 10a is the position of the imaging device, and there is a fixed correlation between the orientation of the flying object 10a and the imaging direction. The specifying unit 23 of the server device 20 therefore specifies the imaging position and imaging direction of the imaging device based on the position and orientation measured by the flying object 10a at the time indicated by the time data corresponding to the image data that does not satisfy the image quality standard.
[Modification 2]
 When the subject is provided with a marker or other sign consisting of a predetermined figure, for example, this sign may be used to specify the imaging position and imaging direction for re-imaging. Specifically, the specifying unit 23 specifies the imaging position and imaging direction of the flying object 10a based on the position of the sign obtained by analyzing the sign attached to the subject in the image data, and on the position measured by the positioning device 1009 of the flying object 10a at the time indicated by the time data corresponding to the image data that does not satisfy the quality standard. Here, the imaging position of the flying object 10a at the imaging time detected by the detection unit 22 is the position measured by the positioning device 1009 of the flying object 10a at the time indicated by that time data. The imaging direction of the flying object 10a at the detected imaging time is the direction from that measured position toward the position of the sign obtained by analyzing the sign attached to the subject in the image data.
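 The imaging direction described here is simply the normalized vector from the measured position toward the sign; the sketch below assumes, for simplicity, that both positions are already expressed in a common Cartesian frame.

```python
import math

def imaging_direction(measured_pos: tuple[float, float, float],
                      sign_pos: tuple[float, float, float]) -> tuple[float, float, float]:
    # Unit vector from the position measured by the positioning device 1009
    # toward the position of the sign obtained by analyzing the image.
    dx = sign_pos[0] - measured_pos[0]
    dy = sign_pos[1] - measured_pos[1]
    dz = sign_pos[2] - measured_pos[2]
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / norm, dy / norm, dz / norm)
```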
[Modification 3]
 Each image captured by the flying object 10a may be assigned some priority based on its importance or the like. This priority is stored in advance in association with the scheduled imaging time, scheduled imaging position, and scheduled imaging direction in the flight schedule. In such a case, the output unit 24 may output the information on the specified imaging position and imaging direction only for image data whose priority is equal to or higher than a threshold, among the image data that do not satisfy the image quality standard, and may refrain from outputting that information for image data whose priority is below the threshold.
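 As a sketch under assumed names and an assumed threshold value, this priority filtering by the output unit 24 might look like the following.

```python
PRIORITY_THRESHOLD = 5  # assumed priority threshold

def poses_to_output(failures: list[tuple[ScheduleEntry, int]]) -> list[ScheduleEntry]:
    # Output the specified position/direction only for failing images whose
    # priority, stored with the flight schedule, is at or above the threshold.
    return [pose for pose, priority in failures
            if priority >= PRIORITY_THRESHOLD]
```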
[Modification 4]
 In the case of autopilot flight, as in the embodiment above, the output unit 24 outputs the information on the imaging position and imaging direction to the flying object 10b and instructs it to re-image at that imaging position and in that imaging direction. In the case of manually piloted flight, by contrast, the output unit 24 need only output the information on the imaging position and imaging direction specified by the specifying unit 23. In this case, the pilot performs piloting according to the output imaging position and imaging direction, thereby causing the flying object 10b to perform re-imaging at that imaging position and in that imaging direction.
[Modification 5]
 While the flying object 10a is imaging the subject according to its flight schedule, the flying object 10b may be imaging the subject in a supplementary manner, or may be waiting for a re-imaging instruction. Although one of the flying objects 10a and 10b performs the initial imaging and the other performs the re-imaging, they may exchange roles. Further, although the moving body in the embodiment above is a flying object, in the present invention the moving body that performs imaging is not limited to a flying device and may be any device that moves by some means.
[Modification 6]
 The information processing device according to the present invention may be implemented in a server device as in the embodiment above, or at least some of its functions may be implemented in a flying object, in a control device that pilots the flying object, or in some other device.
[Other Modifications]
 The block diagrams used in the description of the embodiment above show blocks in functional units. These functional blocks (components) are realized by any combination of at least one of hardware and software. The method of realizing each functional block is not particularly limited. That is, each functional block may be realized using one device that is physically or logically coupled, or using two or more devices that are physically or logically separated and connected directly or indirectly (for example, by wire or wirelessly). A functional block may be realized by combining software with the one device or the plurality of devices.
 Functions include, but are not limited to, judging, deciding, determining, calculating, computing, processing, deriving, investigating, searching, confirming, receiving, transmitting, outputting, accessing, resolving, selecting, choosing, establishing, comparing, assuming, expecting, regarding, broadcasting, notifying, communicating, forwarding, configuring, reconfiguring, allocating, mapping, and assigning. For example, a functional block (component) that performs transmission is referred to as a transmission control unit (transmitting unit) or a transmitter. In any case, as described above, the method of realization is not particularly limited.
 For example, the server device according to an embodiment of the present disclosure may function as a computer that performs the processing of the present disclosure.
 Each aspect/embodiment described in the present disclosure may be applied to at least one of LTE (Long Term Evolution), LTE-A (LTE-Advanced), SUPER 3G, IMT-Advanced, 4G (4th generation mobile communication system), 5G (5th generation mobile communication system), FRA (Future Radio Access), NR (New Radio), W-CDMA (registered trademark), GSM (registered trademark), CDMA2000, UMB (Ultra Mobile Broadband), IEEE 802.11 (Wi-Fi (registered trademark)), IEEE 802.16 (WiMAX (registered trademark)), IEEE 802.20, UWB (Ultra-WideBand), Bluetooth (registered trademark), systems utilizing other appropriate systems, and next-generation systems extended on the basis of these. A plurality of systems may also be applied in combination (for example, a combination of at least one of LTE and LTE-A with 5G).
 The processing procedures, sequences, flowcharts, and the like of each aspect/embodiment described in the present disclosure may be reordered as long as no contradiction arises. For example, the methods described in the present disclosure present elements of various steps in an exemplary order, and are not limited to the specific order presented.
 Input and output information and the like may be stored in a specific location (for example, a memory) or managed using a management table. Input and output information and the like may be overwritten, updated, or appended. Output information and the like may be deleted. Input information and the like may be transmitted to another device.
 A determination may be made by a value represented by one bit (0 or 1), by a Boolean value (true or false), or by comparison of numerical values (for example, comparison with a predetermined value).
 Although the present disclosure has been described in detail above, it is evident to those skilled in the art that the present disclosure is not limited to the embodiments described herein. The present disclosure can be implemented in modified and altered forms without departing from the spirit and scope of the present disclosure as defined by the claims. The description of the present disclosure is therefore intended to be illustrative and has no restrictive meaning with respect to the present disclosure.
 Software, whether referred to as software, firmware, middleware, microcode, hardware description language, or by another name, should be interpreted broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executable files, execution threads, procedures, functions, and the like. Software, instructions, information, and the like may also be transmitted and received via a transmission medium. For example, when software is transmitted from a website, server, or other remote source using at least one of wired technology (such as coaxial cable, optical fiber cable, twisted pair, or digital subscriber line (DSL)) and wireless technology (such as infrared or microwave), at least one of these wired and wireless technologies is included within the definition of a transmission medium.
 The information, signals, and the like described in the present disclosure may be represented using any of a variety of different technologies. For example, data, instructions, commands, information, signals, bits, symbols, chips, and the like that may be referred to throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination thereof.
 The terms described in the present disclosure and the terms necessary for understanding the present disclosure may be replaced with terms having the same or similar meanings.
 The information, parameters, and the like described in the present disclosure may be expressed using absolute values, using values relative to a predetermined value, or using other corresponding information.
 The phrase "based on" as used in the present disclosure does not mean "based only on" unless explicitly stated otherwise. In other words, the phrase "based on" means both "based only on" and "based at least on".
 Any reference to elements using designations such as "first" and "second" as used in the present disclosure does not generally limit the quantity or order of those elements. These designations may be used in the present disclosure as a convenient way of distinguishing between two or more elements. Thus, a reference to first and second elements does not mean that only two elements may be employed, or that the first element must in some way precede the second element.
 The "unit" in the configuration of each device described above may be replaced with "means," "circuit," "device," or the like.
 Where "include," "including," and variations thereof are used in the present disclosure, these terms, like the term "comprising," are intended to be inclusive. Furthermore, the term "or" as used in the present disclosure is intended not to be an exclusive OR.
 In the present disclosure, where articles such as a, an, and the in English are added by translation, the present disclosure may include the case where the nouns following these articles are plural.
 In the present disclosure, the phrase "A and B are different" may mean "A and B are different from each other." The phrase may also mean "A and B are each different from C." Terms such as "separated" and "coupled" may be interpreted in the same way as "different".
1: flight system, 2: communication network, 10: flying object, 1001: processor, 1002: memory, 1003: storage, 1004: communication device, 1005: input device, 1006: output device, 1007: flight device, 1008: sensor, 1009: positioning device, 20: server device, 21: acquisition unit, 22: detection unit, 23: specifying unit, 24: output unit, 2001: processor, 2002: memory, 2003: storage, 2004: communication device, 2005: input device, 2006: output device.

Claims (6)

  1.  An information processing device comprising:
     an acquisition unit that acquires, from a first moving body including a timekeeping device and an imaging device, image data captured by the imaging device and time data, measured by the timekeeping device, indicating when each image was captured;
     a detection unit that detects, based on the time data, the imaging time corresponding to image data, among the acquired image data, that does not satisfy an image quality standard;
     a specifying unit that specifies the imaging position and imaging direction of the imaging device of the first moving body at the detected imaging time; and
     an output unit that outputs information on the specified imaging position and imaging direction.
  2.  The information processing device according to claim 1, wherein
     the output unit outputs the information on the imaging position and the imaging direction to a second moving body including an imaging device, and instructs the second moving body to perform imaging at the imaging position and in the imaging direction, and
     the acquisition unit acquires the captured image data from the second moving body in response to the instruction.
  3.  The information processing device according to claim 1 or 2, wherein the specifying unit specifies the imaging position and imaging direction of the imaging device of the first moving body based on the scheduled imaging times, scheduled imaging positions, and scheduled imaging directions of the imaging device included in the first moving body, and on the time data corresponding to the image data that does not satisfy the quality standard for captured images.
  4.  The information processing device according to claim 1 or 2, wherein the specifying unit specifies the imaging position and imaging direction of the imaging device of the first moving body based on the position and orientation of the first moving body measured by the first moving body at the time indicated by the time data corresponding to the image data that does not satisfy the image quality standard.
  5.  The information processing device according to claim 1 or 2, wherein the specifying unit specifies the imaging position and imaging direction of the imaging device of the first moving body based on the position obtained by analyzing a sign attached to a subject included in the image data, and on the position measured by the first moving body at the time indicated by the time data corresponding to the image data that does not satisfy the image quality standard.
  6.  The information processing device according to any one of claims 1 to 5, wherein the output unit outputs the information on the specified imaging position and imaging direction for image data whose priority is equal to or higher than a threshold, among the image data that do not satisfy the image quality standard.
PCT/JP2020/016308 2019-07-19 2020-04-13 Information processing device WO2021014699A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021534541A JP7208402B2 (en) 2019-07-19 2020-04-13 Information processing equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019133376 2019-07-19
JP2019-133376 2019-07-19

Publications (1)

Publication Number Publication Date
WO2021014699A1 true WO2021014699A1 (en) 2021-01-28

Family

ID=74194085

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/016308 WO2021014699A1 (en) 2019-07-19 2020-04-13 Information processing device

Country Status (2)

Country Link
JP (1) JP7208402B2 (en)
WO (1) WO2021014699A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005286379A (en) * 2004-03-26 2005-10-13 Fuji Photo Film Co Ltd Photographing support system and photographing support method
WO2019163118A1 (en) * 2018-02-26 2019-08-29 株式会社オプティム Computer system, drone control method, and program
WO2020004029A1 (en) * 2018-06-26 2020-01-02 ソニー株式会社 Control device, method, and program

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023089984A1 (en) * 2021-11-19 2023-05-25 富士フイルム株式会社 Moving vehicle, moving vehicle image-capturing system, and moving vehicle image-capturing method

Also Published As

Publication number Publication date
JP7208402B2 (en) 2023-01-18
JPWO2021014699A1 (en) 2021-01-28

Similar Documents

Publication Publication Date Title
JP7341991B2 (en) monitoring device
CN111728572B (en) Automatic endoscope equipment control system
TW201321059A (en) Camera ball turret having high bandwidth data transmission to external image processor
WO2020230371A1 (en) Control device, program, and control method
WO2021014699A1 (en) Information processing device
JP7079345B2 (en) Information processing equipment
JPWO2019093198A1 (en) Flight control device and flight control system
JP6999353B2 (en) Unmanned aerial vehicle and inspection system
WO2019087891A1 (en) Information processing device and flight control system
JP6945004B2 (en) Information processing device
JP7167341B2 (en) Information processing equipment
JP7285927B2 (en) Control device
WO2020262528A1 (en) Information processing device, and information processing method
WO2020189491A1 (en) Information processing device and information processing method
WO2020262529A1 (en) Information processing device, and information processing method
JP7499663B2 (en) Information processing device
US11794900B2 (en) Information processing apparatus
US20220148443A1 (en) Control device, program, and control method
JP7307593B2 (en) Information processing device and program
JP7060616B2 (en) Information processing equipment
WO2024057746A1 (en) Correction device
US20210360164A1 (en) Image control method and device, and mobile platform
US20210129988A1 (en) Sending apparatus and program
JP2021096699A (en) Flight route learning device, condition correcting device and flight route determining device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20844886

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021534541

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20844886

Country of ref document: EP

Kind code of ref document: A1