KR101696088B1 - Method for recognizing object by ultrasound and apparatus therefor - Google Patents

Method for recognizing object by ultrasound and apparatus therefor Download PDF

Info

Publication number
KR101696088B1
Authority
KR
South Korea
Prior art keywords
image
array
ultrasound
column
ultrasonic
Prior art date
Application number
KR1020150113502A
Other languages
Korean (ko)
Inventor
유선철
조한길
구정회
조현우
표주현
Original Assignee
포항공과대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 포항공과대학교 산학협력단 filed Critical 포항공과대학교 산학협력단
Priority to KR1020150113502A priority Critical patent/KR101696088B1/en
Application granted granted Critical
Publication of KR101696088B1 publication Critical patent/KR101696088B1/en

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/8977Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using special techniques for image reconstruction, e.g. FFT, geometrical transformations, spatial deconvolution, time deconvolution
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/50Systems of measurement, based on relative movement of the target
    • G01S15/58Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S15/60Velocity or trajectory determination systems; Sense-of-movement determination systems wherein the transmitter and receiver are mounted on the moving object, e.g. for determining ground speed, drift angle, ground track
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52001Auxiliary means for detecting or identifying sonar signals or the like, e.g. sonar jamming signals
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52046Techniques for image enhancement involving transmitter or receiver

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Image Analysis (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

Provided are an ultrasound object recognition method and an apparatus therefor. According to an embodiment of the present invention, the ultrasound object recognition method comprises the following steps: 1) generating at least one template image in at least one direction with respect to an object to be detected; 2) emitting a plurality of ultrasound beams and obtaining an ultrasound image for each beam using the reflected waves; 3) obtaining a plurality of one-dimensional arrays for the region where the object exists in the template image and a plurality of one-dimensional arrays for the region where the object exists in the ultrasound image; 4) calculating the degree of agreement between the one-dimensional arrays of the template image and the one-dimensional arrays of the ultrasound image; and 5) determining whether the object is recognized based on a degree of scattering. In step 5), the degree of scattering of the index values of the one-dimensional arrays of the ultrasound image having the highest degree of agreement from step 4) is used.

Description

[0001] The present invention relates to a method and apparatus for recognizing an ultrasonic object.

The present invention relates to a method and apparatus for recognizing an ultrasonic object.

In general, underwater object recognition in shallow water is required in a variety of fields, such as underwater pipeline exploration, ocean surface ecological surveys, engineering, civil engineering, environmental monitoring, and disaster prevention.

In such environments, however, the turbidity of the water is generally high, and it is therefore difficult to obtain adequate visibility with an optical camera. An ultrasonic sensor, on the other hand, can cover tens of meters or more even in highly turbid water, making it more effective than an optical camera.

To recognize a target object with a detection device such as an underwater robot, a technique is used that evaluates the degree of agreement between the shape of the object to be recognized and the actually captured ultrasound image and thereby determines whether the captured object is the target.

Meanwhile, to evaluate the degree of agreement between the shape of an object and an ultrasound image, the main approach is to compare template images, which are sample ultrasound images of the object taken from various directions, with the ultrasound images captured during exploration.

The accuracy of the matching between the object and the image depends on how many template images can be secured. Since there is a practical limit to securing template images, ultrasound image simulators have been used to artificially generate ultrasound images of the object viewed from arbitrary angles. To compare such simulated images with the captured ultrasound image of the object, the ultrasound image has typically been simplified through segmentation.

However, it is difficult to determine the appropriate level of simplification when segmentation is used.

An embodiment of the present invention provides an ultrasonic object recognition method that determines whether a target object is present by directly comparing template images of the target object with the ultrasound image of the actually photographed object.

Another embodiment of the present invention provides an ultrasonic object recognition apparatus that determines whether a target object is present by comparing template images of the target object with the captured ultrasound image.

The method for recognizing an ultrasound object according to an embodiment of the present invention includes the steps of: 1) generating one or more template images for at least one direction with respect to an object to be searched; 2) transmitting a plurality of ultrasound beams and acquiring an ultrasound image using the reflected waves of each beam; 3) acquiring a plurality of one-dimensional arrays for the region in which the object exists in the template image, and acquiring a plurality of one-dimensional arrays for the region in which the object exists in the ultrasound image; 4) calculating the degree of agreement between the plurality of one-dimensional arrays of the template image and the plurality of one-dimensional arrays of the ultrasound image using the correlation function expressed by Equation 1 below, obtaining the averages of the maximum values of the degree of agreement, and obtaining the maximum among those averages; and 5) determining whether the object is recognized according to the scattering degree of the array associated with the maximum among the averages of the maximum values of the degree of agreement, determining that the object to be searched exists in the region of step 2) if the scattering degree is smaller than a predetermined threshold value, and determining that the object to be searched does not exist in the region of step 2) if the scattering degree is larger than the threshold value.
At this time, Equation (1) is as follows.
R_j,k(i) = Σ_{m=1}^{r_t} S_j(i + m − 1) · T_k(m)

(where S_j(i) is the j-th column array of the ultrasound image, T_k(i) is the k-th column array of the template image, R_j,k(i) is defined for 1 ≤ i ≤ r_s - r_t + 1, r_s is the length (number of rows) of the ultrasound image, and r_t is the length (number of rows) of the template image)
In addition, step 4) may comprise: a) obtaining c_t correlation functions by applying Equation 1 between the 1st through c_t-th column arrays of the template image and the x-th through (c_t + x - 1)-th column arrays of the ultrasound image (where 1 ≤ x ≤ c_s - c_t + 1); b) calculating the maximum value of each of the c_t correlation functions and obtaining M_x, the average of the calculated maximum values, for each of the c_s - c_t + 1 values of x; c) obtaining M_lmax, the maximum among the c_s - c_t + 1 values of M_x; and d) obtaining the array consisting of the l_max-th through (l_max + c_t - 1)-th column arrays of the ultrasound image. In this case, step 5) may further include determining whether the object is recognized according to the scattering degree of the array obtained in step d).

The ultrasound object searching apparatus according to an embodiment of the present invention may include: (1) an image acquisition unit that emits an ultrasound signal and receives the reflected ultrasound signal to acquire an ultrasound image; (2) an image generation unit that generates, through simulation, a template image of the object to be searched; and (3) a determination unit that acquires the images obtained from the image acquisition unit and the image generation unit, compares the one-dimensional arrays of the respective images, and determines whether the object is the search target.
The determination unit may include: (1) an array acquisition unit that acquires the one-dimensional arrays of the ultrasound image and the template image obtained from the image acquisition unit and the image generation unit; (2) a calculation unit that calculates the degree of agreement by comparing the one-dimensional arrays obtained by the array acquisition unit using the correlation function expressed by Equation 1, obtains the averages of the maximum values of the degree of agreement, and obtains the maximum among those averages; and (3) a comparison unit that determines that the object to be searched exists if the scattering degree is smaller than a predetermined threshold value and that the object to be searched does not exist if the scattering degree is larger than the threshold value.
At this time, Equation (1) is as follows.
R_j,k(i) = Σ_{m=1}^{r_t} S_j(i + m − 1) · T_k(m)

(where S_j(i) is the j-th column array of the ultrasound image, T_k(i) is the k-th column array of the template image, R_j,k(i) is defined for 1 ≤ i ≤ r_s - r_t + 1, r_s is the length (number of rows) of the ultrasound image, and r_t is the length (number of rows) of the template image)
The calculation unit obtains c_t correlation functions using Equation 1 between the 1st through c_t-th column arrays of the template image and the x-th through (c_t + x - 1)-th column arrays of the ultrasound image (where 1 ≤ x ≤ c_s - c_t + 1), calculates the maximum value of each of the c_t correlation functions to obtain M_x, the average of the calculated maximum values, for each of the c_s - c_t + 1 values of x, obtains M_lmax, the maximum among the c_s - c_t + 1 values of M_x, and obtains the array consisting of the l_max-th through (l_max + c_t - 1)-th column arrays of the ultrasound image. The comparison unit may then evaluate the scattering degree of that array.

The method of recognizing an ultrasonic object according to an embodiment of the present invention can be utilized as a core technology enabling an intelligent underwater robot to recognize and explore underwater objects.

In addition, the ultrasonic object recognition method according to an embodiment of the present invention makes it possible not only to identify a target object but also to verify the identification.

In addition, the ultrasonic object recognition apparatus according to an embodiment of the present invention can be used even at night or in dark conditions.

In addition, the ultrasonic object recognition apparatus according to an embodiment of the present invention can quickly determine an object.

FIG. 1 is a flowchart of an ultrasonic object recognition method according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating the column arrays of an ultrasound image according to an embodiment of the present invention.
FIG. 3 is a diagram showing (a) a graph of a column array of an ultrasound image and (b) a graph of a column array of a template image according to an embodiment of the present invention.
FIG. 4 is a graph illustrating a correlation function according to an embodiment of the present invention.
FIG. 5 is a diagram illustrating a column array of a template image translated along a column-array graph of an ultrasound image according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating a method of obtaining the maximum value among the averages of the maximum values of correlation functions according to an exemplary embodiment of the present invention.
FIG. 7 is a block diagram of an ultrasonic object recognition apparatus according to an embodiment of the present invention.

Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art to which the present invention pertains can readily practice them. The present invention may, however, be embodied in many different forms and is not limited to the embodiments described herein. Parts not related to the description are omitted for clarity, and the same or similar components are denoted by the same reference numerals throughout the specification.

As shown in FIG. 1, a method 100 for recognizing an ultrasonic object according to an embodiment of the present invention includes: generating at least one template image for at least one direction with respect to an object to be searched (S110); transmitting a plurality of ultrasound beams and acquiring an ultrasound image using the reflected waves of each beam (S120); acquiring a plurality of one-dimensional arrays for the region in which the object exists in the template image and a plurality of one-dimensional arrays for the region in which the object exists in the ultrasound image (S130); calculating the degree of agreement between the plurality of one-dimensional arrays of the template image and the plurality of one-dimensional arrays of the ultrasound image (S140); and determining whether the object is recognized (S150).

The method of recognizing an ultrasonic object according to the present invention can be applied to any object recognition that uses ultrasonic waves, without limitation. In an embodiment of the present invention, the case of detecting an underwater object is described, in which an underwater robot recognizes the object using sonar, a common piece of underwater search equipment based on ultrasonic waves.

First, a template image is generated for an object to be searched (step S110).

In order to search for the object, information that can be compared with it is required. Accordingly, in an embodiment of the present invention, template images are used as the information to be compared with the object to be searched. A template image is a sample ultrasound image of the object taken from a particular direction; template images may be obtained by photographing the object with ultrasound or generated indirectly with a simulator. Since a template image is a bitmap image, that is, a two-dimensional array in which a plurality of one-dimensional arrays are arranged side by side, it can be represented by a plurality of one-dimensional column arrays.

Next, a plurality of ultrasound beams are transmitted and an ultrasound image is acquired using the reflected waves of each beam (step S120).

To search for objects, the underwater robot transmits ultrasonic waves. As shown in FIG. 2, the transmitted waves may take the form of a plurality of beams that are narrow horizontally and spread in a wide fan shape vertically. The underwater robot acquires the ultrasound reflection signal reflected from any object lying in the traveling direction of each beam. Each acquired reflection signal can be expressed as a one-dimensional array, and arranging these one-dimensional arrays side by side forms a two-dimensional bitmap, which constitutes one ultrasound image. Accordingly, the ultrasound image can be represented by a plurality of one-dimensional column arrays.
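For illustration, a minimal sketch of how the per-beam one-dimensional reflection arrays form the two-dimensional ultrasound image follows; the beam count, sample count, and array names are illustrative assumptions, not values fixed by the method.

import numpy as np

# Illustrative sketch: each of c_s beams returns a 1-D echo array of r_s range samples.
rng = np.random.default_rng(0)
c_s, r_s = 96, 512                                   # number of beams (columns) and range bins (rows)
beam_echoes = [rng.random(r_s) for _ in range(c_s)]  # one 1-D reflection array per beam

# Arranging the 1-D arrays side by side yields the 2-D bitmap, i.e. the ultrasound image;
# column j of this bitmap is the column array S_j(i) referred to in the text.
sonar_image = np.stack(beam_echoes, axis=1)          # shape (r_s, c_s)
S_65 = sonar_image[:, 64]                            # 65th column array, S_65(i)
print(sonar_image.shape, S_65.shape)                 # prints: (512, 96) (512,)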

Next, a plurality of one-dimensional arrays are acquired for the region where the object exists in the template image acquired in step S110, and a plurality of one-dimensional arrays are acquired for the region where the object exists in the ultrasound image acquired in step S120 (step S130).

FIGS. 3A and 3B illustrate the 65th column array of an ultrasound image and the 26th column array of a template image, respectively, according to an embodiment of the present invention. The column arrays of the ultrasound image are longer than those of the template image because the ultrasound image covers the entire search range of the underwater robot, whereas the template image contains only the extent of the object.

The 65th column array of the ultrasound image is denoted S_65(i), as shown in FIG. 3A; the ultrasound image has r_s rows and c_s columns, and i ranges from 1 to r_s. The 26th column array of the template image is denoted T_26(i), as shown in FIG. 3B; the template image has r_t rows and c_t columns, and i ranges from 1 to r_t.

Next, the degree of agreement between a plurality of one-dimensional arrays of template images and a plurality of one-dimensional arrays of ultrasonic images is calculated (step S140).

As described above, the template image is generated with a simulator and contains only the object, whereas the ultrasound image is obtained by the underwater robot and covers its entire search range; the one-dimensional arrays of the template image are therefore shorter than those of the ultrasound image. By sliding a one-dimensional array of the template image along the corresponding one-dimensional array of the ultrasound image, the position with the highest degree of agreement can be found.

In order to calculate the degree of agreement between a one-dimensional array of the template image and a one-dimensional array of the ultrasound image, the correlation function expressed by the following Equation 1 can be used.

Equation 1

R_j,k(i) = Σ_{m=1}^{r_t} S_j(i + m − 1) · T_k(m)

(where S_j(i) is the j-th column array of the ultrasound image, T_k(i) is the k-th column array of the template image, r_s is the length (number of rows) of the ultrasound image, r_t is the length (number of rows) of the template image, and R_j,k(i) is defined for 1 ≤ i ≤ r_s - r_t + 1)
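A minimal sketch of the correlation computation follows, assuming the plain sliding inner-product form of Equation 1 given above; the array sizes, the embedding position, and the function name are illustrative.

import numpy as np

def correlation(s_j: np.ndarray, t_k: np.ndarray) -> np.ndarray:
    # Sliding correlation R_j,k(i) between an ultrasound column array s_j (length r_s)
    # and a template column array t_k (length r_t), evaluated for 1 <= i <= r_s - r_t + 1.
    # The plain inner-product form is an assumption made for this sketch.
    r_s, r_t = len(s_j), len(t_k)
    return np.array([np.dot(s_j[i:i + r_t], t_k) for i in range(r_s - r_t + 1)])

# Usage: embed the template column into an otherwise empty ultrasound column at
# 1-based index 157; the argmax of R recovers that translation.
rng = np.random.default_rng(1)
t_26 = rng.random(40)                 # T_26(i), r_t = 40
s_65 = np.zeros(512)                  # S_65(i), r_s = 512 (noise-free background)
s_65[156:196] = t_26
R = correlation(s_65, t_26)
print(int(np.argmax(R)) + 1)          # prints: 157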

Here, the 26th column array of the template image and the 65th column array of the ultrasound image according to an embodiment of the present invention are denoted T_26(i) and S_65(i), respectively, and the correlation function between S_65(i) and T_26(i) is denoted R_65,26(i).

The correlation function R_65,26(i) between the 65th column array of the ultrasound image and the 26th column array of the template image has the distribution shown in FIG. 4. As shown in FIG. 5, when T_26(i) of FIG. 3B is slid along S_65(i) of FIG. 3A, the correlation function attains its maximum at the index value 157; that is, the best-matching position is the one obtained by translating the template array by an index value of 157. In other words, computing the correlation yields both the best agreement value between the two arrays and the index value of the translation at which that agreement occurs. The template image is compared with the ultrasound image as follows.

The template image consists of c_t columns and therefore has c_t column arrays, and the ultrasound image consists of c_s columns and therefore has c_s column arrays. Accordingly, a total of c_t correlation functions can be obtained using the 1st through c_t-th column arrays of the template image and the 1st through c_t-th column arrays of the ultrasound image. Since each of the c_t correlation functions has a maximum value, M_1, the average of these maximum values, can be obtained.

Similarly, a total of c_t correlation functions are obtained using the 1st through c_t-th column arrays of the template image and the 2nd through (c_t + 1)-th column arrays of the ultrasound image, and M_2, the average of their maximum values, is obtained. The calculation is repeated in this way until M_{c_s-c_t+1} is obtained.

That is, a total of c_t correlation functions are obtained using the 1st through c_t-th column arrays of the template image and the (c_s - c_t + 1)-th through c_s-th column arrays of the ultrasound image, and M_{c_s-c_t+1}, the average of their maximum values, is obtained. In this way, the c_s - c_t + 1 averages M_1, M_2, ..., M_{c_s-c_t+1} of the maximum values of the correlation functions are obtained.

The larger the average of the maximum values, the higher the agreement between the column arrays of the template image and the corresponding column arrays of the ultrasound image. Therefore, the position at which the template image and the ultrasound image agree best corresponds to the largest of the c_s - c_t + 1 averages.

The largest of these averages is denoted M_lmax. M_lmax is the average of the maximum values of the c_t correlation functions obtained using the 1st through c_t-th column arrays of the template image and the l_max-th through (l_max + c_t - 1)-th column arrays of the ultrasound image. In other words, the object of the template image is judged most likely to be present between the l_max-th and (l_max + c_t - 1)-th column arrays of the ultrasound image.
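A sketch of this offset search follows, under the same assumed correlation form: for every horizontal offset x it computes M_x, the average of the maxima of the c_t correlation functions, and takes l_max as the offset with the largest average. All names, sizes, and test data are illustrative.

import numpy as np

def offset_scores(sonar_image: np.ndarray, template: np.ndarray):
    # For every horizontal offset x (1 <= x <= c_s - c_t + 1), correlate each template
    # column T_k with the ultrasound column S_(x+k-1), take the maximum of each of the
    # c_t correlation functions, and average them to obtain M_x. Returns all M_x and
    # l_max (1-based). The inner-product correlation form is an assumption.
    r_s, c_s = sonar_image.shape
    r_t, c_t = template.shape
    n_shifts = r_s - r_t + 1
    M = np.empty(c_s - c_t + 1)
    for x in range(c_s - c_t + 1):                   # 0-based offset; x + 1 in the text
        maxima = []
        for k in range(c_t):
            s, t = sonar_image[:, x + k], template[:, k]
            R = np.array([np.dot(s[i:i + r_t], t) for i in range(n_shifts)])
            maxima.append(R.max())
        M[x] = np.mean(maxima)                       # M_(x+1)
    l_max = int(np.argmax(M)) + 1                    # 1-based, as in the text
    return M, l_max

# Usage: a synthetic template embedded at columns 31-38 of an empty sonar image.
rng = np.random.default_rng(2)
template = rng.random((40, 8))                       # r_t = 40, c_t = 8
image = np.zeros((512, 96))                          # r_s = 512, c_s = 96
image[100:140, 30:38] = template
M, l_max = offset_scores(image, template)
print(l_max)                                         # prints: 31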

The region obtained in step S140 and illustrated in FIG. 6 is the region of the ultrasound image in which the object of the template image is most likely to appear. However, a region with the largest average is obtained even when the objects do not actually match: in that case all the averages are low, yet one of them is still the largest, so a candidate region is selected regardless. Therefore, additional verification is required for the region in which the object is judged possibly to exist.

For this verification, the match position is defined as the index value at which the correlation function between a template column array and an ultrasound column array attains its maximum. Thus, I_1 is defined as the index value at which the correlation function obtained from the 1st column array of the template image and the l_max-th column array of the ultrasound image attains its maximum, and I_2 as the index value at which the correlation function obtained from the 2nd column array of the template image and the (l_max + 1)-th column array of the ultrasound image attains its maximum. Repeating this up to I_{c_t}, the index value obtained from the c_t-th column array of the template image and the (l_max + c_t - 1)-th column array of the ultrasound image, yields a total of c_t index values.
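A sketch of collecting the index values I_1, ..., I_{c_t} at the winning offset l_max follows, again under the assumed inner-product correlation; the test data and names are illustrative.

import numpy as np

def match_index_array(sonar_image: np.ndarray, template: np.ndarray, l_max: int) -> np.ndarray:
    # At the winning offset l_max (1-based), record for each template column k the
    # index I_k at which the correlation function attains its maximum (1-based).
    # The inner-product correlation form is an assumption, as in the earlier sketches.
    r_s, _ = sonar_image.shape
    r_t, c_t = template.shape
    indices = np.empty(c_t, dtype=int)
    for k in range(c_t):
        s, t = sonar_image[:, (l_max - 1) + k], template[:, k]
        R = np.array([np.dot(s[i:i + r_t], t) for i in range(r_s - r_t + 1)])
        indices[k] = int(np.argmax(R)) + 1           # I_(k+1)
    return indices

# Usage: with the object embedded at rows 101-140 and columns 31-38, every index is 101.
rng = np.random.default_rng(2)
template = rng.random((40, 8))
image = np.zeros((512, 96))
image[100:140, 30:38] = template
print(match_index_array(image, template, l_max=31))  # prints: [101 101 101 101 101 101 101 101]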

Finally, it is determined whether or not the target object is recognized (step S150).

One index array can be constructed from the c_t index values obtained in step S140. When an actual object exists in the ultrasound image, each correlation maximum occurs at the point where the object exists; the maximum-value position indices constituting the index array therefore cluster around the position of the object, and the scattering of the indices is low.

On the other hand, when no actual object exists in the ultrasound image, the maximum-value position indices constituting the index array are randomly distributed because there is no common reference, and the scattering of the indices is high.

That is, when the scattering degree of the index array is equal to or larger than a predetermined threshold value, it can be determined that no object exists in the region of the ultrasound image acquired in step S120; when the scattering degree is smaller than the threshold value, it can be determined that the object exists in that region.

Although the scattering degree is used in this embodiment of the present invention, the invention is not limited thereto; the variance, standard deviation, or mean deviation, each of which measures scattering, may also be used.
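A minimal decision sketch follows, assuming the standard deviation of the index array as the scattering measure and an illustrative threshold (the method does not fix a particular threshold value):

import numpy as np

def object_recognized(index_array, threshold: float = 5.0) -> bool:
    # Low scattering of the maximum-value position indices means the maxima cluster
    # around one object position. The standard deviation and the threshold value are
    # illustrative choices; variance or mean deviation could be used instead.
    return float(np.std(index_array)) < threshold

print(object_recognized([101, 101, 102, 100, 101, 101, 102, 101]))  # clustered -> True
print(object_recognized([37, 250, 12, 410, 188, 325, 66, 471]))     # scattered -> False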

Meanwhile, since the template images are generated with a simulator, template images of the object can be obtained for all directions. Accordingly, if the target object is not identified in step S150, a template image for a new direction can be generated and the above steps repeated.

If the ultrasound image cannot be matched to the object even after comparison with template images for all directions, the object captured in the ultrasound image may be determined not to be the object being searched for. If, during the repeated comparisons with template images for different directions, the ultrasound image is matched, the object captured in the ultrasound image is determined to be the object being searched for and the search can be terminated; in that case, the direction from which the object is viewed is also identified.
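The overall flow of steps S110 to S150, iterating over template images for different viewing directions, might look like the following sketch; the helper logic is a compact version of the earlier sketches, and all names, thresholds, and synthetic data are illustrative assumptions rather than the claimed implementation.

import numpy as np

def recognize(sonar_image, template_images, threshold=5.0):
    # Iterate over template images for different viewing directions (S110) and stop as
    # soon as one of them yields a low-scattering match (S150). The inner loop is a
    # compact version of the earlier sketches; all names are illustrative.
    r_s, c_s = sonar_image.shape
    for direction, template in template_images.items():
        r_t, c_t = template.shape
        n_shifts = r_s - r_t + 1
        best_avg, best_indices = -np.inf, None
        for x in range(c_s - c_t + 1):               # step S140: search over offsets
            maxima, positions = [], []
            for k in range(c_t):
                s, t = sonar_image[:, x + k], template[:, k]
                R = np.array([np.dot(s[i:i + r_t], t) for i in range(n_shifts)])
                maxima.append(R.max())
                positions.append(int(np.argmax(R)) + 1)
            if np.mean(maxima) > best_avg:
                best_avg, best_indices = np.mean(maxima), positions
        if np.std(best_indices) < threshold:         # step S150: low scattering = match
            return direction
    return None                                      # no direction matched

# Usage: one synthetic view embedded in an otherwise empty sonar image.
rng = np.random.default_rng(3)
views = {"front": rng.random((40, 8)), "side": rng.random((40, 12))}
image = np.zeros((512, 96))
image[100:140, 30:38] = views["front"]
print(recognize(image, views))                       # prints: front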

The ultrasonic object recognition method 100 of FIG. 1 may be implemented through the ultrasonic object recognition apparatus 700 shown in FIG. Referring to FIG. 7, an ultrasonic object recognition apparatus 700 according to an embodiment of the present invention includes an image acquisition unit 710, an image generation unit 730, and a determination unit 750.

The image acquiring unit 710 transmits the ultrasound signal, and receives the ultrasound reflection signal for the ultrasound signal to acquire the ultrasound image. The ultrasound image acquired through the image acquisition unit 710 is transmitted to a determination unit 750 described later.

The image generating unit 730 generates a template image of the object to be searched through simulation. At this time, the search altitude of the object searching apparatus using the imaging sonar and the view angle of the image obtaining unit 710 can be used to generate an image through simulation. Meanwhile, the template image generated by the image generation unit 730 is transmitted to the determination unit 750.

The determination unit 750 receives the ultrasound image from the image acquisition unit 710 and acquires the template image generated by the image generation unit 730. The determination unit 750 that receives the ultrasound image and the template image compares the one-dimensional arrays of the respective images to determine whether the object is a search object. The determining unit 750 may further include an arrangement obtaining unit 751, a calculating unit 753, and a comparing unit 755.

The array acquisition unit 751 obtains the template image and the ultrasound image from the image generation unit 730 and the image acquisition unit 710, respectively, and obtains the one-dimensional arrays of each image. The one-dimensional arrays acquired are preferably column arrays.

The calculation unit 753 compares the one-dimensional arrays acquired by the array acquisition unit 751. For this comparison, the calculation unit 753 may use the following Equation 1.

Equation 1

R_j,k(i) = Σ_{m=1}^{r_t} S_j(i + m − 1) · T_k(m)

(where S_j(i) is the j-th column array of the ultrasound image, T_k(i) is the k-th column array of the template image, r_s is the length (number of rows) of the ultrasound image, r_t is the length (number of rows) of the template image, and R_j,k(i) is defined for 1 ≤ i ≤ r_s - r_t + 1)

The template image consists of c_t columns and therefore has c_t column arrays, and the ultrasound image consists of c_s columns and therefore has c_s column arrays. Accordingly, a total of c_t correlation functions can be obtained using the 1st through c_t-th column arrays of the template image and the 1st through c_t-th column arrays of the ultrasound image. Since each of the c_t correlation functions has a maximum value, M_1, the average of these maximum values, can be obtained.

Similarly, a total of c_t correlation functions are obtained using the 1st through c_t-th column arrays of the template image and the 2nd through (c_t + 1)-th column arrays of the ultrasound image, and M_2, the average of their maximum values, is obtained. The calculation is repeated in this way until M_{c_s-c_t+1} is obtained.

That is, a total of c_t correlation functions are obtained using the 1st through c_t-th column arrays of the template image and the (c_s - c_t + 1)-th through c_s-th column arrays of the ultrasound image, and M_{c_s-c_t+1}, the average of their maximum values, is obtained. In this way, the c_s - c_t + 1 averages M_1, M_2, ..., M_{c_s-c_t+1} of the maximum values of the correlation functions are obtained.

The calculation unit 753 obtains the c_s - c_t + 1 sets of correlation functions in this way, calculates the maximum value of each correlation function and the averages M_x of those maximum values, and obtains the index values at which the correlation functions of the best-matching offset attain their maxima.

The comparison unit 755 determines whether an object exists by using the index values calculated by the calculation unit 753.

The index values acquired by the calculation unit 753 can be organized into one index array. When an actual object exists in the ultrasound image, each correlation maximum occurs at the point where the object exists, so the maximum-value position indices cluster around the position of the object and the scattering degree is low.

On the other hand, when no actual object exists in the ultrasound image, the maximum-value position indices are randomly distributed because there is no common reference, and their scattering degree is high.

That is, when the scattering degree of the index array is equal to or larger than the predetermined threshold value, the comparison unit 755 can determine that no object exists in the candidate region; when the scattering degree is smaller than the threshold value, the comparison unit 755 can determine that the object exists in the candidate region.

Although the scattering degree is used in this embodiment of the present invention, the invention is not limited thereto; the variance, standard deviation, or mean deviation, each of which measures scattering, may also be used.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments. It will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

700: Ultrasonic object recognition apparatus
710: Image acquisition unit 730: Image generation unit
750: Determination unit 751: Array acquisition unit
753: Calculation unit 755: Comparison unit

Claims (12)

1. A method for recognizing an ultrasonic object, the method comprising:
1) generating at least one template image for at least one direction with respect to an object to be searched;
2) transmitting a plurality of ultrasound beams and acquiring ultrasound images using reflected waves for the respective ultrasound beams;
3) acquiring a plurality of one-dimensional arrays for the region where the object exists in the template image, and acquiring a plurality of one-dimensional arrays for the region in which the object exists in the ultrasound image;
4) calculating the degree of agreement between the plurality of one-dimensional arrays of the template image and the plurality of one-dimensional arrays of the ultrasound image using the correlation function expressed by Equation 1 below, obtaining the averages of the maximum values of the degree of agreement, obtaining the maximum among those averages, and obtaining the array associated with that maximum; and
5) determining whether the object is recognized according to the scattering degree of the array associated with the maximum among the averages of the maximum values of the degree of agreement, determining that the object to be searched exists in the region of step 2) when the scattering degree is smaller than a predetermined threshold value, and determining that the object to be searched does not exist in the region of step 2) when the scattering degree is larger than the threshold value.
Equation 1
R_j,k(i) = Σ_{m=1}^{r_t} S_j(i + m − 1) · T_k(m)

(where S_j(i) is the j-th column array of the ultrasound image, T_k(i) is the k-th column array of the template image, R_j,k(i) is defined for 1 ≤ i ≤ r_s - r_t + 1, r_s is the length (number of rows) of the ultrasound image, and r_t is the length (number of rows) of the template image)
2. The method according to claim 1,
wherein step 4) comprises:
a) obtaining c_t correlation functions using Equation 1 between the 1st through c_t-th column arrays of the template image and the x-th through (c_t + x - 1)-th column arrays of the ultrasound image (where 1 ≤ x ≤ c_s - c_t + 1);
b) calculating the maximum value of each of the c_t correlation functions and obtaining M_x, the average of the calculated maximum values, for each of the c_s - c_t + 1 values of x;
c) obtaining M_lmax, the maximum among the c_s - c_t + 1 values of M_x; and
d) obtaining the array consisting of the l_max-th through (l_max + c_t - 1)-th column arrays of the ultrasound image,
and wherein step 5) further comprises determining whether the object is recognized according to the scattering degree of the array obtained in step d).
Claims 3 to 5: (deleted)

6. An ultrasonic object searching apparatus, comprising:
An image acquiring unit that emits an ultrasonic signal, receives an ultrasonic reflection signal for the ultrasonic signal, and acquires an ultrasonic image;
An image generation unit for generating a template image which is a simulation image of an object to be searched through simulation; And
and a determination unit for acquiring the images obtained from the image acquisition unit and the image generation unit and determining whether the object is the search target by comparing the one-dimensional arrays of the respective images,
wherein the determination unit acquires the region in which the object exists and then determines whether the object is the search target, the determination unit comprising:
an array acquisition unit for acquiring the one-dimensional arrays of the ultrasound image and the template image obtained from the image acquisition unit and the image generation unit;
a calculation unit that calculates the degree of agreement by comparing the one-dimensional arrays obtained by the array acquisition unit using the correlation function expressed by Equation 1, obtains the averages of the maximum values of the degree of agreement, obtains the maximum among those averages, and obtains the array associated with that maximum; and
a comparison unit that determines that the object to be searched exists when the scattering degree of the array is smaller than a predetermined threshold value and that the object to be searched does not exist when the scattering degree is larger than the threshold value, thereby determining whether the object is matched.
Equation 1
R_j,k(i) = Σ_{m=1}^{r_t} S_j(i + m − 1) · T_k(m)

(where S_j(i) is the j-th column array of the ultrasound image, T_k(i) is the k-th column array of the template image, R_j,k(i) is defined for 1 ≤ i ≤ r_s - r_t + 1, r_s is the length (number of rows) of the ultrasound image, and r_t is the length (number of rows) of the template image)
7. The apparatus according to claim 6,
wherein the calculation unit obtains c_t correlation functions using Equation 1 between the 1st through c_t-th column arrays of the template image and the x-th through (c_t + x - 1)-th column arrays of the ultrasound image (where 1 ≤ x ≤ c_s - c_t + 1), calculates the maximum value of each of the c_t correlation functions to obtain M_x, the average of the calculated maximum values, for each of the c_s - c_t + 1 values of x, obtains M_lmax, the maximum among the c_s - c_t + 1 values of M_x, and obtains the array consisting of the l_max-th through (l_max + c_t - 1)-th column arrays of the ultrasound image,
and wherein the comparison unit evaluates the scattering degree of the array consisting of the l_max-th through (l_max + c_t - 1)-th column arrays of the ultrasound image.
Claims 8 to 12: (deleted)
KR1020150113502A 2015-08-11 2015-08-11 Method for recognizing object by ultrasound and apparatus therefor KR101696088B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150113502A KR101696088B1 (en) 2015-08-11 2015-08-11 Method for recognizing object by ultrasound and apparatus therefor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150113502A KR101696088B1 (en) 2015-08-11 2015-08-11 Method for recognizing object by ultrasound and apparatus therefor

Publications (1)

Publication Number Publication Date
KR101696088B1 2017-01-24

Family

ID=57993129

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150113502A KR101696088B1 (en) 2015-08-11 2015-08-11 Method for recognizing object by ultrasound and apparatus therefor

Country Status (1)

Country Link
KR (1) KR101696088B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111354039A (en) * 2018-12-20 2020-06-30 核动力运行研究所 Weld joint region extraction rapid algorithm based on B-scan image recognition

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0875418A (en) * 1994-09-02 1996-03-22 Nippon Telegr & Teleph Corp <Ntt> Automatic recognition method and automatic recognition device for object
JP2006234493A (en) * 2005-02-23 2006-09-07 Aisin Seiki Co Ltd Object recognizing device, and object recognition method
JP2007255979A (en) * 2006-03-22 2007-10-04 Nissan Motor Co Ltd Object detection method and object detector
JP2011053197A (en) * 2009-08-03 2011-03-17 Kansai Electric Power Co Inc:The Method and device for automatically recognizing object

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111354039A (en) * 2018-12-20 2020-06-30 核动力运行研究所 Weld joint region extraction rapid algorithm based on B-scan image recognition
CN111354039B (en) * 2018-12-20 2023-07-14 核动力运行研究所 Quick algorithm for extracting weld joint region based on B-scan image recognition

Similar Documents

Publication Publication Date Title
CN107392965B (en) Range finding method based on combination of deep learning and binocular stereo vision
Wang et al. A comparison of waveform processing algorithms for single-wavelength LiDAR bathymetry
KR101527876B1 (en) Method of real-time recognizing and tracking for underwater objects using sonar images
US6829197B2 (en) Acoustical imaging interferometer for detection of buried underwater objects
KR102186733B1 (en) 3D modeling method for undersea topography
Santos et al. Underwater place recognition using forward‐looking sonar images: A topological approach
Pyo et al. Beam slice-based recognition method for acoustic landmark with multi-beam forward looking sonar
CN112949380B (en) Intelligent underwater target identification system based on laser radar point cloud data
Chen et al. Probabilistic conic mixture model and its applications to mining spatial ground penetrating radar data
Cotter et al. Classification of broadband target spectra in the mesopelagic using physics-informed machine learning
CN115187666A (en) Deep learning and image processing combined side-scan sonar seabed elevation detection method
KR101696088B1 (en) Method for recognizing object by ultrasound and apparatus therefor
Dubrovinskaya et al. Anchorless underwater acoustic localization
Liang et al. MVCNN: A Deep Learning-Based Ocean–Land Waveform Classification Network for Single-Wavelength LiDAR Bathymetry
KR101696089B1 (en) Method and apparatus of finding object with imaging sonar
JP7133971B2 (en) 3D model generation device and 3D model generation method
CN115272461A (en) Seabed elevation detection method of side-scan sonar image based on priori knowledge
KR101943426B1 (en) Method, apparatus, computer program and computer readable recording medium for generating a drawing of an inner wall condition of a conduit, method, apparatus, computer program and computer readable recording medium for inspecting an inner wall condition of a conduit
Nathalie et al. Outlier detection for Multibeam echo sounder (MBES) data: From past to present
Pyo et al. Acoustic beam-based man-made underwater landmark detection method for multi-beam sonar
Hunt et al. Target detection in underwater LiDAR using machine learning to classify peak signals
Ozendi et al. An emprical point error model for TLS derived point clouds
Gubnitsky et al. A multispectral target detection in sonar imagery
KR101696087B1 (en) Method of object searching with supersonic wave and apparatus therefor
Muduli et al. A Review On Recent Advancements In Signal Processing and Sensing Technologies for AUVs

Legal Events

Date Code Title Description
GRNT Written decision to grant