KR101648680B1 - Unmanned Aerial Vehicle - Google Patents

Unmanned Aerial Vehicle Download PDF

Info

Publication number
KR101648680B1
Authority
KR
South Korea
Prior art keywords
information
living
image data
organism
unmanned airplane
Prior art date
Application number
KR1020150162698A
Other languages
Korean (ko)
Inventor
이상돈
최용선
김민경
Original Assignee
이화여자대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 이화여자대학교 산학협력단 filed Critical 이화여자대학교 산학협력단
Priority to KR1020150162698A priority Critical patent/KR101648680B1/en
Application granted granted Critical
Publication of KR101648680B1 publication Critical patent/KR101648680B1/en

Links

Images

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C39/00: Aircraft not otherwise provided for
    • B64C39/02: Aircraft not otherwise provided for characterised by special use
    • B64C39/024: Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C27/00: Rotorcraft; Rotors peculiar thereto
    • B64C27/04: Helicopters
    • B64C27/08: Helicopters with two or more rotors
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D: EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D45/00: Aircraft indicators or protectors not otherwise provided for
    • G06K9/00778
    • G06K9/6204
    • B64C2201/127
    • B64C2201/141
    • B64C2201/145

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Toys (AREA)

Abstract

Provided is an unmanned aerial vehicle which analyzes the number and position of individuals of a living organism using image data containing the organism, and transmits the analyzed results to a control center. The unmanned aerial vehicle comprises: an image acquiring part which acquires first image data containing the body temperature information of the living things; a calculating part which extracts the outline of the first image data and calculates the number of individuals using the outline; and a communication part which receives satellite navigation information related to the unmanned aerial vehicle and transmits, to the control center, the position information of the living things determined on the basis of the satellite navigation information together with the number of individuals.

Description

Unmanned Aerial Vehicle

The present disclosure relates to an unmanned aerial vehicle, and more specifically to a drone that tracks living organisms and collects data related to them.

As science and technology develop, research on the individual species, population sizes, and habitats of living organisms is conducted using various electronic devices. However, most rare or endangered species live in wild environments that are difficult for humans to access, which makes field investigation difficult.

The Korea Meteorological Administration has published and used research on methods for detecting and predicting the movements of birds, bats, or insects using the weather radars that forecast rainfall and precipitation probability. However, observation using weather radar is limited to large-scale studies of habitats or migration, and the radar itself is costly.

According to one aspect of the present invention, there is provided an unmanned airplane which analyzes the number and location of individuals of a living organism using image data including the organism, and transmits the analyzed information to a control center. The unmanned airplane comprises an image acquisition unit for acquiring first image data including body temperature information of a living organism; a calculation unit for extracting contours from the first image data and calculating the number of living things using the contours; and a communication unit for receiving satellite navigation information related to the unmanned airplane and transmitting, to the control center, the location information of the living organism determined based on the satellite navigation information together with the number of individuals.

According to an embodiment, the calculation unit may classify, as the outline, pixels of the first image data whose gradient is equal to or greater than a predetermined threshold.

According to another embodiment, the unmanned airplane may further include a controller for moving the UAV so that a representative position of the creature is located at the central point of the first image data. In addition, the calculation unit may calculate the representative position of the creature using the position information of the pixels corresponding to the contour line. The calculation unit may also calculate the movement path of the living body using the satellite navigation information and the representative position, and the communication unit may transmit the movement path to the control center.

According to another embodiment, the unmanned airplane may further include a database storing identification information of the creature. In addition, the calculation unit may calculate at least one of the length and the width of the organism using the contour line, and compare the length and the width with the identification information.

According to another embodiment, the unmanned airplane may further include a database storing identification information of the creature. In addition, the image acquiring unit acquires second image data including the activity information of the organism, and the calculating unit may compare the color of the organism extracted using the second image data with the identification information. The calculation unit may also determine whether the organism is in flight using the second image data, calculate at least one of the wing length and the body length of the organism on the basis of the contour during flight, and compare it with the identification information. More specifically, the calculation unit determines at least one of the individual species candidates and the individual species information of the organism according to the result of the comparison, and the communication unit may transmit at least one of the individual species candidates and the individual species information to the control center.

According to another aspect, there is provided an unmanned airplane that communicates with a living observation device installed at a predetermined location to automatically collect and update data associated with living creatures. The unmanned airplane comprises a control unit for moving the UAV to the living organism observing apparatus installed at a pre-designated position, and a communication unit for transmitting a command signal that turns on the network connection of the living observing apparatus and for receiving data related to living creatures from the living observing apparatus. More specifically, the data may include at least one of thermal infrared image data, visible-region image data, temperature information, humidity information, and sound information.

According to an embodiment, the communication unit of the unmanned airplane may receive time tag information related to the living organism observed from the outline of the thermal infrared image data from the living organism observing apparatus.

According to another embodiment, the communication unit may receive residual battery information of the living observing apparatus from the living observing apparatus.

According to another embodiment, the communication unit may transmit a command signal related to the memory initialization of the living observing apparatus to the living observing apparatus.

BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a diagram illustrating the operation of a UAV that tracks a creature according to an exemplary embodiment.
FIG. 2 is a diagram illustrating an outline of image data extracted by an unmanned aerial vehicle according to an exemplary embodiment.
FIGS. 3A, 3B, and 3C are diagrams explaining how an unmanned airplane according to an exemplary embodiment controls its moving direction and moving speed according to the movement of living creatures.
FIG. 4 is a block diagram of an unmanned aerial vehicle according to an exemplary embodiment.
FIG. 5 is an exemplary view illustrating the operation of a UAV that transmits and receives data to and from a living observing apparatus according to an exemplary embodiment.
FIG. 6 is a flowchart illustrating a method of communication between an unmanned aerial vehicle and a living observation device according to an exemplary embodiment.

Specific structural or functional descriptions of the embodiments are set forth for illustration only, and the embodiments may be implemented with various changes and modifications. Accordingly, the embodiments are not limited to the particular forms disclosed, and the scope of the present disclosure includes all changes, equivalents, and alternatives falling within the technical idea.

The terms first or second, etc. may be used to describe various elements, but such terms should be interpreted solely for the purpose of distinguishing one element from another. For example, the first component may be referred to as a second component, and similarly, the second component may also be referred to as a first component.

It is to be understood that when an element is referred to as being "connected" to another element, it may be directly connected to the other element, or intervening elements may be present.

The singular expressions include plural expressions unless the context clearly dictates otherwise. In this specification, terms such as "comprises" or "having" are used to specify the presence of the described features, numbers, steps, operations, elements, or parts, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, parts, or combinations thereof.

Unless otherwise defined, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art. Terms such as those defined in commonly used dictionaries are to be interpreted as having a meaning consistent with their meaning in the context of the relevant art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined herein.

FIG. 1 is a diagram illustrating the operation of a UAV that tracks a creature according to an exemplary embodiment. Referring to FIG. 1, a UAV 100 that tracks living organisms and collects data related to them is shown. In the present specification, the UAV 100 refers to a flying vehicle that is not controlled by an onboard pilot but is controlled or adjusted remotely, and may be implemented in various forms such as a drone or a helicopter. Living things in this specification refer to all living beings, including various forms such as animals and plants.

Today, rare or endangered species, which have a relatively high research value compared to other life forms, tend to inhabit natural environments that are difficult for humans to access. More specifically, rare or endangered species may reside on uninhabited islands, in deep mountain valleys, on cliffs, or in swamps.

Conventionally, to collect data on the ecology of these organisms and carry out ecological studies, most ecologists have had to travel directly to remote islands or mountainous areas, and observing living things in this way consumed considerable resources and time. Electronic observation methods also exist, such as the use of weather radar or fixed observation cameras, but they are limited when an individual organism must be tracked directly.

When the UAV 100 according to the present embodiment is used, it can freely approach cliffs, rivers, or marshes that are difficult for people to reach. In addition, if the UAV 100 flies at an altitude above that at which the creature is active, its image data can be used to easily track a large number of living things at once. The UAV 100 can also accurately track organisms that fly long distances, such as seasonal migratory birds.

The specific operation of the UAV 100, which tracks the organism and collects the necessary data from the living observing apparatus, is described in more detail below with reference to the accompanying figures.

FIG. 2 is a diagram illustrating an outline of image data extracted by an unmanned aerial vehicle according to an exemplary embodiment. Referring to FIG. 2, image data 200 including life forms is shown. More specifically, the image data 200 is a thermal image obtained by the unmanned aerial vehicle and may include body temperature information of a living body. In this specification, a thermal image detects the radiant energy emitted by an object itself, independently of any external light, and displays an image reproduced from that radiant energy.

The unmanned airplane can set an object corresponding to a predetermined temperature range and determine whether image data 200 including the object has been acquired. More specifically, the unmanned airplane can pre-set the temperature range associated with the object using a database associated with the organism. Using the same database, the unmanned airplane can also pre-set the area associated with the creature as its flight area.

Illustratively, the unmanned airplane can track tree sparrows in urban areas. Tree sparrows are small brown birds found throughout Eurasia, with an average body temperature of about 43 °C. Accordingly, the unmanned airplane can use the database information associated with the sparrow to set an urban area as the flight area and track objects in the temperature range of 41 °C to 45 °C.

Assume that the unmanned airplane acquires image data 200 including an object in the 41 °C to 45 °C temperature range described above. The unmanned airplane may divide the acquired image data 200 into a first region corresponding to the temperature range and a second region outside it. The unmanned airplane can then extract the boundary between the first region and the second region as a contour line.
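
The region split described above can be sketched as a simple temperature mask. This is a minimal illustrative sketch, not the patent's implementation; the 41 °C to 45 °C range follows the sparrow example, and all names are hypothetical.

```python
# Illustrative sketch: divide a thermal frame into a first region inside
# the target temperature range (True) and a second region outside it
# (False). Temperatures are in degrees Celsius.

def split_regions(thermal, t_min=41.0, t_max=45.0):
    """Return a mask that is True for first-region (in-range) pixels."""
    return [[t_min <= px <= t_max for px in row] for row in thermal]

# A tiny 2x3 thermal frame: background around 20 degC, sparrows 42-44 degC.
frame = [
    [20.0, 42.5, 43.1],
    [21.0, 20.5, 44.0],
]
mask = split_regions(frame)
# mask -> [[False, True, True], [False, False, True]]
```

The boundary between True and False cells in such a mask is what the text above calls the contour line.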

Illustratively, the unmanned airplane can treat pixels at local maxima or minima of the pixel data in the acquired thermal image as components of the contour line. More specifically, the unmanned airplane may take as the contour line the edges along which the pixel value changes abruptly relative to neighboring pixels.

In addition, the unmanned airplane can use the extracted contours to calculate the number of living things. More specifically, the number of individuals can correspond to the number of closed curves in the set of extracted contours. The unmanned airplane can calculate the number of living things by counting the closed curves among the extracted contours, calculating the length of the long axis or short axis of each closed curve, and comparing it with the database of living creatures. In this specification, the long axis represents the farthest distance between pixels on a closed curve, and the short axis represents the closest distance between pixels on the closed curve in the direction perpendicular to the long axis.
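
One way to realize the closed-curve count is to count connected regions in a binary detection mask. The flood-fill sketch below is illustrative only (the patent does not specify an algorithm), and all names are hypothetical.

```python
# Illustrative sketch: count individuals as 4-connected regions ("closed
# curves") in a binary mask, using an iterative flood fill.

def count_individuals(mask):
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                count += 1  # found a new, unvisited region
                stack = [(y, x)]
                while stack:
                    cy, cx = stack.pop()
                    if 0 <= cy < h and 0 <= cx < w and mask[cy][cx] and not seen[cy][cx]:
                        seen[cy][cx] = True
                        stack += [(cy + 1, cx), (cy - 1, cx),
                                  (cy, cx + 1), (cy, cx - 1)]
    return count

# Two separate warm blobs -> two individuals.
demo = [
    [True, False, False, True],
    [True, False, False, False],
]
# count_individuals(demo) -> 2
```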

In one embodiment, the unmanned airplane can compare the pixel data of each pixel in the acquired image data 200 with that of its neighboring pixels. The unmanned airplane can then extract as the outline those edges where the difference between a pixel and its neighbors is greater than or equal to a predetermined threshold.
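
The neighbor-difference rule above can be sketched directly. This is a minimal illustration under the stated rule, not the patent's implementation; names are hypothetical.

```python
# Illustrative sketch: a pixel belongs to the outline when the absolute
# difference between its value and any 4-neighbor meets or exceeds a
# threshold.

def contour_mask(img, threshold):
    h, w = len(img), len(img[0])
    mask = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for dy, dx in ((0, 1), (1, 0), (0, -1), (-1, 0)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    if abs(img[y][x] - img[ny][nx]) >= threshold:
                        mask[y][x] = True
    return mask

# A single hot pixel in a cold background: the center and its four
# neighbors all sit on a sharp edge.
edges = contour_mask([[0, 0, 0], [0, 9, 0], [0, 0, 0]], 5)
# edges -> [[False, True, False], [True, True, True], [False, True, False]]
```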

In the embodiment shown in FIG. 2, the unmanned airplane can extract four objects 210, 220, 230, and 240 corresponding to the contours. The UAV can then compare the major-axis length of each extracted object 210, 220, 230, 240 with the sparrow length of 14 cm stored in the database. More specifically, the long axis in the present embodiment represents the longest line segment connecting pixels on the closed curve of each extracted object 210, 220, 230, 240. The UAV can determine whether the major-axis length of each extracted object is within a confidence range around the 14 cm sparrow length.
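
The confidence-range check can be sketched as a simple tolerance test. The patent does not state the width of the confidence range; the 20% tolerance below is an assumption for illustration, and all names are hypothetical.

```python
# Illustrative sketch: is a detected object's major-axis length within a
# confidence range of the species length stored in the database?
# (14 cm for the tree sparrow in the example; 20% tolerance is assumed.)

def matches_species(axis_cm, ref_cm=14.0, tolerance=0.2):
    return abs(axis_cm - ref_cm) <= ref_cm * tolerance

# A 13.5 cm object passes; a 40 cm object (e.g. noise) is rejected.
# matches_species(13.5) -> True
# matches_species(40.0) -> False
```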

In one embodiment, the unmanned airplane can remove the fourth object 240 as noise information that is not a sparrow, using the 14 cm sparrow length stored in the database. Illustratively, the fourth object may represent a bull on which a sparrow sits. According to this embodiment, by removing objects that fall outside the confidence range based on the length corresponding to the sparrow, the unmanned airplane can discard objects other than sparrows as noise information. Illustratively, the unmanned airplane can likewise remove objects such as hawks or sea gulls as noise information.

In addition, the UAV can judge the first object 210, the second object 220, and the third object 230, which lie within the confidence range of the 14 cm sparrow length, to be sparrows. Illustratively, suppose the unmanned airplane uses the third object 230 to determine whether the detected object is a sparrow. The unmanned airplane can calculate the body length (full length) of the flying third object 230, and can also calculate its wing length. More specifically, the wing length may indicate the wing span of the third object 230 with its wings spread to the maximum extent. The unmanned airplane can determine whether the third object 230 is a sparrow using at least one of its body length and maximum wing length. Accordingly, the unmanned airplane can extract the three closed curves corresponding to the three objects 210, 220, and 230 as contour lines representing sparrows.

According to the present embodiment, the unmanned airplane can extract only the outlines within a predetermined temperature range from the thermal image data 200 containing body temperature information, removing non-living objects in the image data 200 as noise information. In addition, by comparing the information about the organism stored in the database with the long-axis or short-axis lengths of the closed curves in the outline, other objects with a temperature range similar to that of the organism can also be removed as noise information. The user can thus confirm the presence and number of living creatures in an area of interest far more conveniently.

FIGS. 3A, 3B, and 3C are diagrams explaining how an unmanned airplane according to an exemplary embodiment controls its moving direction and moving speed according to the movement of living creatures. Referring to FIG. 3A, an unmanned airplane 320 observing a moving goose flock 310 is shown. If the unmanned airplane 320 flies at the same altitude as the goose flock 310, it will have great difficulty keeping the flock within an appropriate line of sight (LOS). Because the goose flock 310 is a wild life form, there is a limit to how far its flight direction or speed can be mechanically controlled or predicted.

However, FIG. 3A shows the UAV 320 according to the embodiment observing the goose flock 310 while flying at altitude H2, which differs from the flock's flight altitude H1 by a predetermined threshold distance or more. According to the embodiment shown in FIG. 3A, the UAV 320 can safely observe the goose flock 310 without risk of collision. In addition, at the flight altitude H2, the entire goose flock 310, including a plurality of geese, can be observed in a single frame of image data.

Referring to FIG. 3B, the image data 330 obtained by the UAV 320 at a reference time T1 and the contours extracted from the image data 330 are shown. Illustratively, the image data 330 may include the goose flock 310 to be observed by the UAV 320. The way the unmanned airplane 320 extracts contour lines from the image data 330 is the same as described with reference to FIG. 2, so a detailed description is omitted here.

The UAV 320 may calculate position data corresponding to the contour of the captured goose flock 310. More specifically, the position data may be two-dimensional position data defined along the X and Y axes. The UAV 320 can set the focal point of the obtained image data 330 as the origin (0, 0) and calculate (X, Y) position data for the pixels of the remaining contours.

The UAV 320 can calculate the representative position 331 of the goose flock 310 corresponding to the reference time T 1 using the position data of each of the plurality of pixels included in the outline. More specifically, the unmanned airplane 320 can calculate the representative position 331 using the following equation (1).

X̄ = (1/K) Σ Xk,  Ȳ = (1/K) Σ Yk,  where the sums run over k = 1 to K … (1)

In Equation (1), K is the total number of pixels included in the contour of the goose flock 310, and the representative position 331 is calculated as the arithmetic mean of the X and Y coordinates of those pixels. Illustratively, suppose that the representative position 331 corresponding to the reference time T1 is calculated as the origin (0, 0).
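
Equation (1) is a plain arithmetic mean over the contour pixels, and can be sketched directly. This is an illustration of the formula, not the patent's code; names are hypothetical.

```python
# Illustrative sketch of Equation (1): the representative position is the
# arithmetic mean of the (x, y) coordinates of the K contour pixels.

def representative_position(contour_pixels):
    k = len(contour_pixels)
    x_mean = sum(x for x, _ in contour_pixels) / k
    y_mean = sum(y for _, y in contour_pixels) / k
    return (x_mean, y_mean)

# Four contour pixels forming a square around (1, 1):
# representative_position([(0, 0), (2, 0), (0, 2), (2, 2)]) -> (1.0, 1.0)
```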

Referring to FIG. 3C, the image data 340 obtained by the UAV 320 at the reference time T2 and the contours extracted from the image data 340 are shown. Illustratively, the image data 340 may include the goose flock 310 observed by the UAV 320. As an example, the reference time T2 may be a time at which a predetermined interval has elapsed since the reference time T1.

Illustratively, assume that the unmanned airplane remained in place between the reference times T1 and T2. If the goose flock 310 moved between T1 and T2, the UAV 320 will obtain a contour with different position data in the image data 340. The difference in the position data indicates the direction and distance the goose flock 310 moved between T1 and T2. As before, the UAV 320 may calculate two-dimensional position data corresponding to the contour of the second image data 340, and can calculate the second representative position 341 of the goose flock 310 at the reference time T2 using Equation (1) and the second image data 340.

In addition, the UAV 320 calculates the difference between the first representative position 331 and the second representative position 341, and from this difference determines the direction and speed at which it must move. More specifically, the UAV 320 can fly so that the second representative position 341 returns to the origin (0, 0). The UAV 320 may also accumulate the trace drawn by the successive representative positions over time and generate it as the movement path of the goose flock 310. The UAV 320 may transmit the satellite information received from GPS satellites together with the generated movement path to the control center, reporting the location and movement route of the goose flock 310.
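
The step from representative-position difference to a direction and speed can be sketched as follows. This is an illustrative sketch in image-plane units under the assumption of a fixed time interval between T1 and T2; the names and the heading convention are hypothetical, not the patent's.

```python
import math

# Illustrative sketch: the displacement of the representative position
# between reference times T1 and T2 gives the direction (degrees from the
# +X axis) and the speed the UAV should fly to re-center the flock.

def movement_command(p1, p2, dt):
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    distance = math.hypot(dx, dy)          # how far the flock moved
    heading = math.degrees(math.atan2(dy, dx))  # which way it moved
    speed = distance / dt                  # required speed to keep up
    return heading, speed

# Flock drifted from (0, 0) to (3, 4) over 2 time units:
heading, speed = movement_command((0, 0), (3, 4), 2.0)
# speed -> 2.5, heading -> about 53.13 degrees
```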

Illustratively, FIGS. 3B and 3C show a configuration that calculates the representative position from two-dimensional pixel position data, but the same approach applies to three-dimensional position data. When the image data acquired by the UAV 320 is three-dimensional image data including a depth value, the UAV 320 can calculate a three-dimensional representative position including a Z value corresponding to the flight altitude. The UAV 320 may then control its own moving direction and speed according to the movement of the three-dimensional representative position over time.

FIG. 4 is a block diagram of an unmanned aerial vehicle according to an exemplary embodiment. Referring to FIG. 4, a block diagram of an unmanned airplane 400 that transmits data associated with an observed organism to a control center using acquired image data is shown. The unmanned airplane 400 may include an image acquisition unit 410, a calculation unit 420, a communication unit 430, a database 440, and a control unit 450.

The image acquiring unit 410 may acquire first image data including body temperature information of a living body. In addition, the image acquiring unit 410 may acquire second image data including color information of a living organism. In one embodiment, the image acquiring unit 410 may be implemented as an infrared camera that senses electromagnetic radiation in the 0.9 μm to 14 μm wavelength band and generates an image corresponding to the sensed radiant energy. In another embodiment, the image acquisition unit 410 may be implemented as a CCD (charge-coupled device) camera that converts a photographed image into an electrical signal and stores it in a memory device as digital data.

The calculation unit 420 may extract a contour line from the first image data and calculate the number of living things using the contour line. More specifically, the calculation unit 420 may extract as the outline those pixels of the first image data whose pixel-data difference from neighboring pixels is equal to or greater than a predetermined threshold.

In addition, the calculation unit 420 may calculate at least one of the length and the width of the living organism using the outline. More specifically, the calculation unit 420 may calculate the lengths of the major and minor axes of the closed curve corresponding to each organism. The long axis represents the farthest distance between pixels on the closed curve corresponding to the organism, and the short axis represents the closest distance between those pixels in the direction perpendicular to the long axis.
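
The long-axis definition above (farthest distance between contour pixels) can be sketched with a brute-force pairwise search. This is an illustration of the definition only, not the patent's algorithm; names are hypothetical, and the short axis is omitted for brevity.

```python
import math

# Illustrative sketch: the major axis of a closed curve is the farthest
# distance between any two of its contour pixels (O(n^2) brute force).

def long_axis(pixels):
    best = 0.0
    for i in range(len(pixels)):
        for j in range(i + 1, len(pixels)):
            best = max(best, math.dist(pixels[i], pixels[j]))
    return best

# Three contour pixels; the farthest pair is (0, 0) and (3, 4):
# long_axis([(0, 0), (3, 4), (1, 0)]) -> 5.0
```

In pixel units this length would still need to be converted to centimeters (e.g. from altitude and camera parameters) before comparison with the database.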

In one embodiment, the calculation unit 420 may acquire a depth value corresponding to the altitude at which the organism exists from the second image data, and determine whether the organism is in flight according to the depth value. In addition, the calculation unit 420 may calculate the wing length and body length of the creature during flight based on its contour. More specifically, the wing length of the creature may represent the wing span with the wings spread to the maximum extent.

In addition, the calculation unit 420 can extract the color information of the living organism from the second image data by using the outline of the living organism extracted using the first image data.

The communication unit 430 can receive the current position of the unmanned airplane as satellite navigation information. More specifically, the communication unit 430 can receive the current position information from GPS satellites and transmit it to the control center as position information associated with the creature. The communication unit 430 may also transmit to the control center at least one of the number of individuals and the length, width, and color information of the organism obtained from the first and second image data. Here, the control center is a place containing a control device for controlling or operating the unmanned airplane, and may include a portable computing device such as a user terminal or laptop as well as a computing device installed at a fixed point.

For example, the communication unit 430 may interface with the control center using at least one of WLAN (Wireless LAN), WiFi (Wireless Fidelity) Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee, and NFC (Near Field Communication).

The database 440 may store the identification information related to the creature and provide it to the calculation unit 420. The calculation unit 420 may use the identification information to determine whether the acquired data corresponds to the creature. Illustratively, the database 440 may store as identification information at least one of the living area of the creature, its average body temperature, color, body length, body width, and, for a bird, wing length.

In addition, the calculation unit 420 determines at least one of the individual species candidates and the individual species information of the organism according to the result of the comparison, and the communication unit 430 can transmit at least one of the individual species candidates and the individual species information to the control center.

The control unit 450 can control the flight of the UAV 400 using the representative position of the creature. More specifically, the calculation unit 420 may calculate the representative position of the creature using the position information of the pixels corresponding to the contour of the first image data. Illustratively, the calculation unit 420 may take the arithmetic mean of the position data of the contour pixels as the representative position, using Equation (1).

The control unit 450 calculates the moving direction and distance of the representative position over time, and moves the UAV 400 accordingly. Illustratively, the control unit 450 may move the UAV 400 so that the representative position coincides with the focal point of the first image data acquired by the image acquisition unit 410.

FIG. 5 is an exemplary view illustrating the operation of a UAV that transmits and receives data to and from living observation devices according to an exemplary embodiment of the present invention. Referring to FIG. 5, two living observation devices 521 and 522 installed at predetermined locations and an unmanned airplane 510 transmitting and receiving data are shown. The Ministry of Environment of the Republic of Korea designates wild animals and plants at risk of extinction as endangered species, and those likely to become endangered in the near future as rare species.

Therefore, scientists who investigate the ecology of such endangered or rare species face difficulties collecting relevant data within the limited time they are permitted to spend in protected areas.

However, each of the living observation devices 521 and 522 according to the present embodiment may acquire first image data including body temperature information of an organism and second image data including color information of the organism, and may include a communication interface for transmitting and receiving data.

The unmanned airplane 510 can access at least one of the living observation devices 521 and 522 within a predetermined communication distance to receive desired data. More specifically, the UAV 510 may access the living observation device 521, 522 and transmit a command signal that causes its network connection to be turned on. When the network connection of the living observation device 521, 522 is turned on, the unmanned airplane 510 can receive data associated with living organisms from it. The data may include at least one of thermal infrared image data, visible-region image data, temperature information, humidity information, and sound information.

In one embodiment, the first living observation device 521 may transmit, to the UAV 510, image data indicating whether an organism uses a moving path and the time at which the organism uses it. More specifically, the first living observation device 521 detects the appearance of the organism using the contour of the image data, and transmits time tag information related to the appearance of the organism to the UAV 510. In addition, the first living observation device 521 measures the noise of the road over time, and can transmit the organism's moving path inferred from the noise information to the UAV 510.
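The contour-based appearance detection and time tagging could be sketched as follows (the temperature threshold, frame format, and logging scheme are illustrative assumptions, not the patent's actual implementation):

```python
import time

def warm_pixels(frame, threshold_c=30.0):
    """Pixels of a thermal frame whose temperature exceeds a
    body-temperature threshold — a crude stand-in for contour extraction."""
    return [(r, c) for r, row in enumerate(frame)
            for c, t in enumerate(row) if t > threshold_c]

def detect_appearance(frame, log, threshold_c=30.0, now=None):
    """Append a time tag to `log` whenever an organism appears in `frame`,
    so the tags can later be transmitted to the UAV."""
    if warm_pixels(frame, threshold_c):
        log.append(now if now is not None else time.time())
    return log

log = []
cold = [[18.0, 19.5], [18.2, 19.0]]   # empty scene, ambient temperatures
warm = [[18.0, 36.5], [18.2, 37.0]]   # warm body enters the frame
detect_appearance(cold, log, now=100.0)
detect_appearance(warm, log, now=101.0)
print(log)  # → [101.0]
```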

In another embodiment, the second living observation device 522 may transmit information on the time at which an organism changes, together with temperature information and humidity information, to the UAV 510. Illustratively, the time at which the organism changes may indicate the time at which a plant flowers.

When the UAV 510 and at least one living observation device 521, 522 according to the present embodiment are used, a scientist can obtain information about organisms in a desired area from a desired location. Instead of visiting the living observation devices 521 and 522 directly to check the captured image data and to perform maintenance such as initializing the memory, the scientist can receive the data transmitted by the unmanned airplane 510 in the laboratory, and the living observation devices 521 and 522 can be maintained using the unmanned airplane 510, saving time and resources.

FIG. 6 is a flowchart illustrating a communication method between an unmanned aerial vehicle and a living observation device according to an exemplary embodiment of the present invention. Referring to FIG. 6, a communication method 600 between the UAV 610 and the living observation device 620 is shown. According to this embodiment, the unmanned airplane 610 can move to the location of the predetermined living observation device 620. More specifically, the unmanned airplane 610 can be steered by a user at the control center. Alternatively, the unmanned airplane 610 may automatically move to designated coordinates at a predetermined time according to a predetermined command.

When the unmanned airplane 610 approaches the living observation device 620 within a communication distance, it may transmit a command signal 631 instructing the living observation device 620 to turn on its network connection.

Accordingly, the living observation device 620 can transmit a response signal 632 corresponding to the received command signal 631 to the unmanned airplane 610. More specifically, the response signal 632 may be an acknowledgment (ack) signal for the previously transmitted command signal 631. In addition, the living observation device 620 may transmit identification (ID) information unique to the living observation device 620, used for the network connection, to the unmanned airplane 610 as part of the response signal 632.

The unmanned airplane 610 receives the response signal 632 and decodes the ID information of the living observation device 620 included in it to determine whether the living observation device 620 is one scheduled for data collection. When the determination shows that the living observation device 620 is scheduled, the UAV 610 transmits a data request signal 633, including an IP address for data upload and network resource allocation information, to the living observation device 620.

However, when the unmanned airplane 610 has not received the ack signal from the living observation device 620, the unmanned airplane 610 may retransmit the command signal 631 for turning on the network connection. In another embodiment, when the UAV 610 fails to receive an ack signal from the living observation device 620, the UAV 610 may report an abnormal state of the living observation device 620 to the control center.

The living observation device 620 receives the data request signal 633 including the network resource allocation information from the UAV 610, and transmits the data 634 to the unmanned airplane 610 or another external device using the network resources specified in the data request signal 633. As described above, the data 634 may include at least one of thermal infrared image data, visible-region image data, temperature information, humidity information, and sound information. In addition, the living observation device 620 can transmit its remaining battery information to the UAV together with the data.

When the transmission of the data 634 is completed, the unmanned airplane 610 may send a command signal 635 to the living observation device 620 acknowledging the completion of the transmission and instructing the living observation device 620 to turn off the network connection. In addition, the unmanned airplane 610 may send command signals associated with initializing the memory of the living observation device 620.
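The exchange of FIG. 6 (command signal 631 → response 632 with ID → data request 633 with resource allocation → data 634 → completion/network-off 635) can be sketched with simple dict-based messages. The message fields, class names, and device behavior below are illustrative assumptions, not taken from the patent:

```python
class LifeObservationDevice:
    """Toy model of the living observation device 620."""
    def __init__(self, device_id, data):
        self.device_id = device_id
        self.data = data
        self.network_on = False

    def handle(self, msg):
        if msg["type"] == "NETWORK_ON":        # command signal 631
            self.network_on = True
            return {"type": "ACK", "id": self.device_id}   # response 632
        if msg["type"] == "DATA_REQUEST":      # request 633 with resources
            return {"type": "DATA", "payload": self.data,
                    "battery_pct": 87}         # data 634 plus battery info
        if msg["type"] == "NETWORK_OFF":       # command signal 635
            self.network_on = False
            return {"type": "ACK", "id": self.device_id}

def collect(uav_schedule, device):
    """UAV-side sequence: wake the device, verify its ID against the
    collection schedule, request data, then shut the network down."""
    ack = device.handle({"type": "NETWORK_ON"})
    if ack is None or ack["id"] not in uav_schedule:
        return None                            # would report abnormal state
    reply = device.handle({"type": "DATA_REQUEST",
                           "upload_ip": "10.0.0.1", "slots": 4})
    device.handle({"type": "NETWORK_OFF"})
    return reply["payload"]

dev = LifeObservationDevice("OBS-521", {"temp_c": 21.4, "humidity": 0.63})
print(collect({"OBS-521", "OBS-522"}, dev))
```

A real implementation would also retransmit the wake command on a missed ack, as the description notes, and carry the messages over one of the radio interfaces listed earlier.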

The embodiments described above may be implemented in hardware components, software components, and/or a combination thereof. For example, the devices, methods, and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions. The processing device may execute an operating system (OS) and one or more software applications running on the operating system. The processing device may also access, store, manipulate, process, and generate data in response to execution of the software. For ease of understanding, the processing device may be described as being used singly, but those skilled in the art will recognize that it may include a plurality of processing elements and/or a plurality of types of processing elements. For example, the processing device may comprise a plurality of processors, or one processor and one controller. Other processing configurations, such as parallel processors, are also possible.

The software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired, or may command the processing device independently or collectively. The software and/or data may be embodied permanently or temporarily in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or in a transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to it. The software may be distributed over networked computer systems and stored or executed in a distributed manner. The software and data may be stored on one or more computer-readable recording media.

The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be specially designed and configured for the embodiments, or may be known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

Although the embodiments have been described with reference to the drawings, those skilled in the art will be able to apply various technical modifications and variations. For example, appropriate results may be achieved even if the described techniques are performed in a different order than described, and/or if components of the described systems, structures, devices, and circuits are combined in a different form than described, or replaced or substituted by other components or equivalents.

Claims (12)

Claims 1 to 6. (Deleted)

7. An unmanned aerial vehicle comprising:
an image acquiring unit configured to acquire first image data including body temperature information of an organism;
a calculation unit configured to extract a contour from the first image data and calculate the number of organisms using the contour;
a communication unit configured to receive satellite navigation information related to the unmanned aerial vehicle and transmit, to a control center, position information determined based on the satellite navigation information and the number of organisms; and
a database storing identification information of the organism,
wherein the image acquiring unit acquires second image data including activity information of the organism, and the calculation unit compares the color of the organism extracted using the second image data with the identification information, and compares at least one of the wing span and the body length of the organism, calculated based on the contour during flight of the organism, with the identification information.
8. The unmanned aerial vehicle of claim 7, wherein the calculation unit determines at least one of a candidate species and species information of the organism according to a result of the comparison, and the communication unit transmits the determined candidate species and/or species information to the control center.
9. An unmanned aerial vehicle comprising:
a control unit configured to move the unmanned aerial vehicle to a living observation device installed at a predetermined location; and
a communication unit configured to transmit a command signal that turns on the network connection of the living observation device and to receive data related to living organisms from the living observation device,
wherein the data includes at least one of thermal infrared image data, visible-region image data, temperature information, humidity information, and sound information.
10. The unmanned aerial vehicle of claim 9, wherein the communication unit receives, from the living observation device, time tag information related to the organism observed using the contour of the thermal infrared image data.
11. The unmanned aerial vehicle of claim 9, wherein the communication unit receives remaining battery information of the living observation device from the living observation device.
12. The unmanned aerial vehicle of claim 9, wherein the communication unit transmits a command signal related to memory initialization of the living observation device to the living observation device.
KR1020150162698A 2015-11-19 2015-11-19 Unmanned Aerial Vehicle KR101648680B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150162698A KR101648680B1 (en) 2015-11-19 2015-11-19 Unmanned Aerial Vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150162698A KR101648680B1 (en) 2015-11-19 2015-11-19 Unmanned Aerial Vehicle

Publications (1)

Publication Number Publication Date
KR101648680B1 true KR101648680B1 (en) 2016-08-16

Family

ID=56854659

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150162698A KR101648680B1 (en) 2015-11-19 2015-11-19 Unmanned Aerial Vehicle

Country Status (1)

Country Link
KR (1) KR101648680B1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040083661A (en) * 2003-03-24 2004-10-06 주식회사 에스원 Appratus and method for extracting subject in observing camera
KR20150031530A (en) * 2013-09-16 2015-03-25 주식회사 에스원 Method and apparatus for video surveillance by using surveillance apparatus of unmanned aerial vehicle
KR101536095B1 (en) * 2015-01-14 2015-07-13 농업회사법인 주식회사 에이치알제주 Grassland management system using drone
KR20150098485A (en) * 2014-02-20 2015-08-28 주식회사 콕스 Automatic focuser of thermal imaging camera and method thereof

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108089205A (en) * 2017-12-21 2018-05-29 成都大学 Unmanned aerial vehicle flight-control personnel positioning system
CN108089205B (en) * 2017-12-21 2021-02-02 成都大学 Unmanned aerial vehicle flight-control personnel positioning system

Similar Documents

Publication Publication Date Title
Alam et al. A survey of safe landing zone detection techniques for autonomous unmanned aerial vehicles (UAVs)
Nguyen et al. TrackerBots: Autonomous unmanned aerial vehicle for real‐time localization and tracking of multiple radio‐tagged animals
Cliff et al. Robotic ecology: Tracking small dynamic animals with an autonomous aerial vehicle
US20200026720A1 (en) Construction and update of elevation maps
US8615105B1 (en) Object tracking system
Alanezi et al. Livestock management with unmanned aerial vehicles: A review
US11912407B1 (en) Unmanned vehicle morphing
Kazmi et al. Adaptive surveying and early treatment of crops with a team of autonomous vehicles
WO2010129907A2 (en) Method and system for visual collision detection and estimation
CN109963465A (en) For the system and method via unmanned vehicle identity comprising the harmful organism in the region of crops
CN109996729A (en) For the system and method based on damaged crops detection via unmanned vehicle identity comprising the harmful organism in the region of crops
CN106647805A (en) Unmanned aerial vehicle, and method and device for autonomous flight of unmanned aerial vehicle
CN113950063B (en) Wireless communication network networking method, wireless communication network networking device, computer equipment and storage medium
Rizk et al. Toward AI-assisted UAV for human detection in search and rescue missions
Jensen et al. Tracking tagged fish with swarming unmanned aerial vehicles using fractional order potential fields and Kalman filtering
Buchelt et al. Exploring artificial intelligence for applications of drones in forest ecology and management
Bayram et al. Active localization of VHF collared animals with aerial robots
Deebak et al. Aerial and underwater drone communication: potentials and vulnerabilities
Minakshi et al. High-accuracy detection of malaria mosquito habitats using drone-based multispectral imagery and Artificial Intelligence (AI) algorithms in an agro-village peri-urban pastureland intervention site (Akonyibedo) in Unyama SubCounty, Gulu District, Northern Uganda
Ju et al. Investigation of an autonomous tracking system for localization of radio-tagged flying insects
KR101648680B1 (en) Unmanned Aerial Vehicle
Bhattacharya et al. IDeA: IoT-based autonomous aerial demarcation and path planning for precision agriculture with UAVs
Kumar et al. Safety wing for industry (SWI 2020)–an advanced unmanned aerial vehicle design for safety and security facility management in industries
Pal et al. A Comprehensive Review of AI-enabled Unmanned Aerial Vehicle: Trends, Vision, and Challenges
Teschner et al. Digital twin of drone-based protection of agricultural areas

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20190701

Year of fee payment: 4