KR20140120527A - Apparatus and method for matching stereo image - Google Patents

Info

Publication number
KR20140120527A
KR20140120527A (Application KR1020130036400A)
Authority
KR
South Korea
Prior art keywords
codewords
code word
brightness
occlusion
codebook
Prior art date
Application number
KR1020130036400A
Other languages
Korean (ko)
Inventor
신선미
강성일
홍현기
김주현
이성목
Original Assignee
삼성전기주식회사 (Samsung Electro-Mechanics Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전기주식회사 (Samsung Electro-Mechanics Co., Ltd.)
Priority to KR1020130036400A priority Critical patent/KR20140120527A/en
Publication of KR20140120527A publication Critical patent/KR20140120527A/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/167Synchronising or controlling image signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The present invention relates to a stereo image matching apparatus and a stereo image matching method. A stereo image matching apparatus according to an embodiment of the present invention includes: a parallax calculation block which calculates a parallax by matching the left and right images obtained from a stereo camera and detects the area in which the parallax is occluded (the occlusion area); a codebook storage block which stores a codebook in which each codeword for information including color, brightness, and depth in previous frames is recorded; and a codeword update block which generates each codeword for information including color, brightness, and depth in the current frame and updates the codewords in the current frame using the codebook in the occlusion area detected by the parallax calculation block. A corresponding stereo image matching method is also proposed.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a stereo image matching apparatus and a stereo image matching method. More particularly, the present invention relates to a stereo image matching apparatus and a stereo image matching method using a codebook.

A stereo camera system requires a large amount of computation to produce dense parallax information for a target scene, but it has the advantage of yielding distance information for the space.

One approach to extracting distance information, i.e., depth information, uses an infrared (IR) sensor: structured IR patterns or modulated IR signals are projected and analyzed to recover depth. This approach performs well indoors but degrades outdoors and near indoor windows, so an IR sensor has the disadvantage that it can be used only in a limited set of environments.

Another approach operates in outdoor environments and extracts depth information using an image sensor that is inexpensive compared to an IR sensor: depth information can be extracted with a stereo camera built from image sensors. Because a stereo camera is used, the method of matching the left and right images is critical. In stereo matching, similarity can be judged from the luminance information of the left and right images, or from their color information.

These methods can extract the depth information of an image, but they have the problem of generating noise when applied to real-time video.

Korean Patent Publication No. 10-2009-0087638 (published on Aug. 18, 2009)

In order to solve the above-mentioned problems, the present invention proposes a stereo matching technique that obtains a robust disparity result by using a codebook to reduce the parallax errors produced by stereo matching.

Specifically, a robust codebook-based stereo matching technique that incorporates surrounding background information is proposed, in order to realize stereo matching in real time and to solve the occlusion problem caused by foreground objects.

In order to solve the above-described problems, according to one aspect of the present invention, there is proposed a stereo image matching apparatus including: a parallax calculation block for calculating a parallax by matching the left and right images obtained from a stereo camera and detecting the area in which the parallax is occluded (the occlusion area); a codebook storage block storing a codebook in which each codeword for information including color, brightness, and depth in the previous frames is recorded; and a codeword update block for generating codewords for information including color, brightness, and depth in the current frame and for updating the codewords in the current frame using the codebook in the occlusion region detected by the parallax calculation block.

At this time, in one example, the codeword update block updates codewords using the codebook in the occlusion region, and in the non-occlusion region it may update codewords or add new codewords using the codebook, updating the codebook accordingly.

Also in this case, in one example, the codeword update block compares, for the occlusion region, the current codewords for color, brightness, or color-and-brightness information in the current frame against the previous codewords for the same information, and if the difference between the current codewords and the previous codewords is below the set threshold, takes the depth codeword recorded over the previous frames as the depth codeword of the current frame.

Further, according to one example, for the non-occlusion region, the codeword update block may compare the current codewords for one or more of color and brightness and for depth information in the current frame against the previous codewords for the same information, and if previous codewords within a preset range similar to the current codewords exist, average them with the current codewords to update the codeword and update the codebook.

At this time, in another example, if no previous codewords within the similarity range exist for the non-occlusion area, the codeword update block adds the current codewords of the current frame as new codewords and updates the codebook.

In one example, the parallax calculation block includes: a parallax calculator for calculating a parallax by matching the left and right images; and an occlusion detector for detecting the occlusion area, in which no parallax is found.

In addition, in one example, the stereo image matching apparatus may further include a stereoscopic image generation block that generates a stereoscopic image using the depth codewords updated or added in the current frame.

Next, in order to solve the above-described problem, according to another aspect of the present invention, there is provided a stereo image matching method including: a parallax calculation and area detection step of calculating a parallax by matching the left and right images acquired from a stereo camera and detecting the area in which the parallax is occluded (the occlusion area); and a codeword updating step of generating each codeword for information including color, brightness, and depth in the current frame and, for the detected occlusion region, updating the codewords in the current frame using a codebook in which each codeword for information including color, brightness, and depth in previous frames is recorded.

At this time, in one example, the codeword updating step may include: an occlusion codeword updating step of updating the codewords using the codebook for the occlusion area; a non-occlusion codeword updating step of updating codewords or adding new codewords using the codebook for the non-occlusion area; and an updating step of updating the codebook in accordance with the updated or added codewords.

Also in this case, in one example, in the occlusion codeword updating step, the current codewords for color, brightness, or color-and-brightness information in the current frame are compared, for the occlusion region, against the previous codewords for the same information, and if the difference between the current codewords and the previous codewords is below the set threshold, the depth codeword recorded over the previous frames can be taken as the depth codeword of the current frame.

In another example, in the non-occlusion codeword updating step, the current codewords for one or more of color and brightness and for depth information in the current frame are compared, for the non-occlusion region, against the previous codewords for the same information, and if previous codewords within a predetermined range similar to the current codewords exist, the codeword is updated by averaging.

At this time, according to another example, in the non-occlusion codeword updating step, if no previous codewords within the similarity range exist for the non-occlusion region, the current codewords of the current frame can be added as new codewords.

Further, according to one example, the parallax calculation and area detection step includes: calculating a parallax by matching the left and right images; and detecting the occlusion area, in which no parallax is found.

In another example, the stereo image matching method may further include, after the codeword updating step, a stereoscopic image generation step of generating a stereoscopic image using the depth codewords updated or added in the current frame.

According to embodiments of the present invention, errors in the parallax obtained through stereo matching can be reduced. That is, based on the more robust stereo matching result using the codebook, an improved disparity result and the corresponding depth information can be obtained.

It is apparent that those of ordinary skill in the art can derive various effects not directly mentioned herein from the various configurations according to the embodiments of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram schematically illustrating a stereo image matching apparatus according to an embodiment of the present invention.
FIG. 2 is a block diagram schematically illustrating a stereo image matching apparatus according to another embodiment of the present invention.
FIG. 3 is a block diagram schematically illustrating a stereo image matching apparatus according to another embodiment of the present invention.
FIG. 4 is a flowchart schematically illustrating a stereo image matching method according to another embodiment of the present invention.
FIG. 5 is a flowchart schematically illustrating a stereo image matching method according to another embodiment of the present invention.
FIG. 6 is a flowchart schematically illustrating a stereo image matching method according to another embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the description, the same reference numerals denote the same components, and detailed descriptions of matters apparent to those skilled in the art may be omitted.

As used herein, unless an element is described as being 'directly' connected, coupled, or disposed with respect to another element, it may be not only directly connected, coupled, or disposed but also indirectly connected, coupled, or disposed with one or more intervening elements.

Although singular expressions are used in this specification, they may represent a plurality of the corresponding elements unless doing so is contrary to, obviously different from, or inconsistent with the concept of the invention. Terms such as "including", "having", and "comprising" in this specification are to be understood as open-ended and do not exclude the presence or addition of one or more other elements or combinations thereof.

A stereo image matching apparatus according to an embodiment of the present invention will now be described in detail with reference to the drawings. Reference numerals not shown in a given drawing may refer to the same components in other drawings.

FIG. 1 is a block diagram schematically showing a stereo image matching apparatus according to an embodiment of the present invention, and FIGS. 2 and 3 are block diagrams schematically showing stereo image matching apparatuses according to other embodiments of the present invention.

Referring to FIGS. 1 to 3, a stereo image matching apparatus according to an embodiment of the present invention includes a parallax calculation block 10, a codebook storage block 30, and a codeword update block 50. In addition, referring to FIG. 2, the stereo image matching apparatus according to one example may further include a stereoscopic image generation block 70. Each component is examined in detail below. The technique according to the present invention can be applied to implement element technologies in the computer vision field for effectively analyzing human gestures; for example, it can be used to develop an intuitive gesture-based interface using a stereo camera. The present invention develops an interactive interface using a stereo vision system to obtain distance information about a target scene and a user. More specifically, stereo matching is implemented in real time by parallel processing using CUDA programming on computer graphics hardware (GPU), and a robust codebook-based stereo matching algorithm that incorporates surrounding background information is proposed in order to solve the occlusion problem caused by foreground objects and the like.

Referring to FIGS. 1 to 3, the parallax calculation block 10 calculates a parallax by matching the left and right images obtained from a stereo camera (not shown), and detects the area in which the parallax is occluded (the occlusion area). Various occlusions occur due to foreground objects in the target scene, and the accuracy of the disparity is strongly influenced by the inter-image computation performed for stereo image matching. One existing approach fills a detected outlier region, where many outliers exist because of occlusion and the like, with reliable neighboring parallax information by iterative region voting; however, stable performance is difficult to expect when the region is too wide or its distance differs greatly from the surrounding area. Accordingly, in the present embodiment, the continuity of the two disparity maps obtained by stereo matching of the left and right images is first checked to distinguish the occluded areas.

The occlusion region is the area that, owing to the left-right parallax of the stereo pair, is covered by the foreground in one of the two images; the non-occlusion region, by contrast, is the area covered by the foreground in neither stereo image. That is, in the non-occlusion region the left-right parallax of the stereo image can be calculated and detected, whereas in the occlusion region it cannot. Here, the disparity of a point is the horizontal offset between its positions in the left and right images when the stereo pair is matched; from the parallax information detected during matching of the stereo images, the depth (distance) information of the corresponding point can be calculated.
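The disparity-to-depth relationship described above can be sketched as follows. This is the standard triangulation formula for a rectified stereo pair, not text from the patent; the focal-length and baseline parameters are illustrative assumptions.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulate depth from a disparity value.

    For a rectified stereo pair, a point's disparity d (the horizontal
    offset between its left- and right-image positions) relates to its
    depth Z by Z = f * B / d, where f is the focal length in pixels and
    B the camera baseline in meters.
    """
    if disparity_px <= 0:
        return None  # no parallax detected (e.g. occlusion): depth unknown
    return focal_px * baseline_m / disparity_px
```

For example, with an assumed focal length of 500 px and a 10 cm baseline, a 10 px disparity corresponds to a depth of 0.5 m.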

For example, referring to FIG. 3, in one example, the parallax calculation block 10 may include a parallax calculator 11 and an occlusion detector 13. The parallax calculator 11 calculates the parallax by matching the left and right images obtained from the stereo camera (not shown). The occlusion detector 13 then extracts the area in which the parallax was detected by the parallax calculator 11, and detects the occlusion area, in which no parallax was detected.
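The patent does not spell out how the continuity of the two disparity maps is checked. A common realization of such a check, shown here as a hedged sketch over one scan line, is the left-right consistency (cross-check) test; all names and the tolerance value are illustrative assumptions.

```python
def detect_occlusion(disp_left, disp_right, tol=1):
    """Left-right consistency check on one scan line.

    A pixel x in the left disparity map with disparity d should map to
    pixel x - d in the right disparity map with (roughly) the same
    disparity. Pixels failing this check are marked occluded, i.e. the
    parallax could not be confirmed there.
    """
    width = len(disp_left)
    occluded = [False] * width
    for x in range(width):
        d = disp_left[x]
        xr = x - d  # corresponding position in the right image
        if xr < 0 or xr >= width or abs(disp_right[xr] - d) > tol:
            occluded[x] = True
    return occluded
```

Pixels whose mapped position falls outside the image, or whose disparities disagree beyond the tolerance, form the occlusion map handed to the codeword update stage.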

Next, referring to FIGS. 1 to 3, the codebook storage block 30 of the stereo image matching apparatus stores the codebook 30. The codebook storage block may be, for example, a memory element. In the codebook 30, each codeword for information including color, brightness, and depth in the previous frames is recorded. The codewords recorded in the codebook 30 may be updated or added every frame. For example, codewords over previous frames of a predetermined interval may be recorded in the codebook 30. The information recorded in a codeword further includes the corresponding frame number. A codeword not referenced for a predetermined period, for example for more than 10 frames, may be deleted from the codebook 30 in order to manage the codebook storage block, which is the storage memory, efficiently.

A conventional codebook extracts the foreground region from the target scene under the assumption that the background and the camera are stationary; however, it has the disadvantage of being affected when the background changes or noise is present. To address this, distance information is stored together with the existing color values, and spatial and temporal continuity of objects is considered.

For example, for any point P, the codebook 30 proposed in one example may record, over previous frames of a set arbitrary interval (for example, frames prior to occlusion by a foreground object), the color values (R, G, B), the maximum and minimum brightness values (Imin, Imax), the depth value, the corresponding coordinate value, the number of the frame in which the codeword was last referenced (the corresponding frame number), and the like. That is, a codeword is stored for each pixel of each frame, and may consist of the frame number, the color information of the pixel, the maximum and minimum brightness values, the distance information, and the like.
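As a rough illustration (not the patent's own data layout), the codeword fields just listed, together with the 10-frame deletion rule mentioned for the codebook storage block, might be modeled as follows; all field and function names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Codeword:
    """One codebook entry for a pixel, following the fields listed above."""
    frame: int     # frame number in which the codeword was last referenced
    r: float       # color values (R, G, B)
    g: float
    b: float
    i_min: float   # minimum brightness (lower bound of the weighted range)
    i_max: float   # maximum brightness (upper bound of the weighted range)
    depth: float   # distance information from stereo matching

def prune_codebook(codewords, current_frame, max_age=10):
    """Drop codewords not referenced for more than `max_age` frames,
    keeping the storage memory of the codebook block small."""
    return [cw for cw in codewords if current_frame - cw.frame <= max_age]
```

A per-pixel codebook would then be a list of such entries, pruned each frame.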

Next, the codeword update block 50 of the stereo image matching apparatus will be described in detail with reference to FIGS. 1 to 3. The codeword update block 50 may generate each codeword for information including color, brightness, and depth in the current frame. The codeword update block 50 can then update the codewords in the current frame using the codebook 30 in the occlusion area detected by the parallax calculation block 10.

The codeword update block 50 may generate codewords based on a reference image selected from among the disparity detection area, the occlusion map area, and the stereo images calculated and detected through the parallax calculation block 10.

Referring to FIG. 3, 5, and/or 6, in one example, the codeword update block 50 updates the codewords using the codebook 30 in the occlusion region, and in the non-occlusion region it may update codewords or add new codewords using the codebook 30. In addition, the codeword update block 50 may update the codebook 30 with the updated or added codewords. That is, the codeword information updated or added in the current frame is written back to the codebook 30, and can be used as the recorded codeword information in the next frame. The codebook update can be performed every frame.

For example, in the non-occlusion region, where it is judged that reliable distance information has been obtained, the distance information input in the current frame (frame n) is output as it is, and the stored information is replaced and updated with the codewords of frame n. In the occlusion region, on the other hand, the codewords from before frame n are compared with the codewords of frame n, and the output is corrected to the distance information of the stored codewords whose color information and brightness values are similar. The pre-frame-n codewords in the codebook 30 that are compared with the codewords of the current frame n may be the codewords over frames of a predetermined arbitrary interval.

In the non-occlusion area, the method of updating a codeword using the codebook 30 and the method of adding the codeword of the current frame as a new codeword may be performed selectively or both. Updating a codeword using the codebook 30 in the non-occlusion region may differ from doing so in the occlusion region: in the occlusion area the left-right parallax of the stereo image cannot be calculated or detected, so the depth codeword of the current frame cannot be generated, and the depth information of the current frame therefore cannot be reflected in the update. In the non-occlusion region, by contrast, the left-right parallax can be calculated and detected in the current frame, so the update can reflect the depth codeword of the current frame.

Specifically, in one example, for the occlusion region, the codeword update block 50 compares the current codewords for color, brightness, or color-and-brightness information in the current frame against the previous codewords for the same information over previous frames of a set interval. If the difference between the current codewords and the previous codewords is below the set threshold, the codeword update block 50 takes the depth codeword recorded over the previous frames as the depth codeword of the current frame. The codeword brightness values stored in the codebook 30 may be maximum and minimum brightness values set according to a predetermined weight range. The previous-frame codewords compared with the current codewords may be the codewords over frames of a predetermined interval, and the average of those codewords can be compared with the codewords of the current frame; that is, it can be judged whether the difference between the codewords of the current frame and the averaged codewords over the preset interval of previous frames is below the threshold.

For example, when the occlusion region is detected by checking the continuity of the two disparity maps in the parallax calculation block 10, and a point P is judged from the continuity check to belong to the occlusion (covered) region, the codeword update block 50 compares the color and/or brightness values stored in the codebook 30 with those of the current frame. For instance, the R, G, and B values of the current frame are compared with the R, G, and B values stored in the codebook 30, and if the difference between the two is less than the threshold, the stored depth can be taken as the final depth information of the point P in the current frame. Likewise, the color value of the current frame is compared with the stored color value, and when the input brightness value I lies between the minimum and maximum brightness values stored in the codebook 30, the codeword depth value is retrieved; checking whether the input I falls within the stored minimum-maximum range can return 0 or 1, and on success the codeword depth value recorded in the codebook 30 is obtained. The minimum and maximum brightness values stored in the codebook 30 are threshold range values, set by assigning a weight range to the brightness of the corresponding point in a previous frame. The stored color and brightness values compared with those of the current frame are the color and brightness values over previous frames of an arbitrary interval stored in the codebook 30.
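The occlusion-region lookup just described — color difference below a threshold, input brightness inside the stored minimum-maximum range, stored depth returned as the final depth — can be sketched as below. The threshold value and all parameter names are assumptions for illustration, not values from the patent.

```python
def occlusion_depth(cur_rgb, cur_i, cw_rgb, cw_i_min, cw_i_max, cw_depth,
                    color_thresh=30.0):
    """Recover depth for an occluded pixel from a stored codeword.

    The current frame's (R, G, B) is compared with the codeword's; if the
    accumulated difference is below the threshold AND the input brightness
    I lies inside the stored [i_min, i_max] range, the codeword's depth is
    returned as the pixel's final depth for this frame. Otherwise None.
    """
    color_diff = sum(abs(c - s) for c, s in zip(cur_rgb, cw_rgb))
    in_range = cw_i_min <= cur_i <= cw_i_max  # the 0/1 range check
    if color_diff < color_thresh and in_range:
        return cw_depth
    return None
```

When several stored codewords match, one would pick the closest in color, consistent with the "similar color and brightness" selection described above.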

Further, according to one example, for the non-occlusion region, the codeword update block 50 may compare the current codewords for one or more of color and brightness and for depth information in the current frame against the previous codewords for the same information stored in the codebook 30. If previous codewords within a preset range similar to the current codewords exist, the codeword update block 50 averages the current codewords with those previous codewords to update the codeword, and updates the codebook 30 accordingly. That is, the averaged codewords become the codewords of the current frame and are simultaneously updated and stored in the codebook 30. For the non-occlusion region, unlike the occlusion region, the stored codewords averaged with the codewords of the current frame need not be the codewords over the whole interval of frames; they may be the codewords of one or more previous frames, within the set arbitrary interval, that fall in the similarity range relative to the current frame. In other words, the codewords of the current frame can be averaged with the codewords of previous frames judged to belong to the similarity range, thereby updating the codewords.

For example, if no previous codewords within the similarity range exist for the non-occlusion region, the codeword update block 50 adds the current codewords of the current frame as new codewords and updates the codebook 30.

In other words, when no occlusion exists, it is judged that reliable distance information has been obtained, and the input distance value is used as the final information as it is, while the codebook 30 is updated as follows. Concretely, a codeword similar to the input distance value, satisfying the conditions of minimum color distance and brightness distribution, is sought in the codebook 30. The color and depth values of a codeword satisfying these conditions are averaged with the input information of the current frame to update the codeword of the codebook 30, and the updated codebook 30 is used in the next frame. When a codeword is updated through averaging, the values of the current frame and the values of the previous frame recorded in the codebook 30 may be averaged. If the conditions are not satisfied, a new codeword including the color, depth, a constant brightness distribution, and the current frame number is added at the corresponding position.
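The non-occlusion update-or-add logic described above might look like the following sketch, with codewords kept as plain dictionaries. The similarity threshold and the constant brightness margin are illustrative assumptions, not values from the patent.

```python
def update_non_occluded(codebook, cur, frame, color_thresh=30.0):
    """Non-occlusion update: average with a similar stored codeword,
    or append the current observation as a new codeword.

    `codebook` is a list of dicts with keys 'r','g','b','depth','i_min',
    'i_max','frame'; `cur` is the current observation with keys
    'r','g','b','i','depth'. Names are hypothetical, for illustration.
    """
    for cw in codebook:
        color_diff = (abs(cur['r'] - cw['r']) + abs(cur['g'] - cw['g'])
                      + abs(cur['b'] - cw['b']))
        if color_diff < color_thresh and cw['i_min'] <= cur['i'] <= cw['i_max']:
            # Similar codeword found: average color and depth with the
            # current frame's values and refresh the reference frame number.
            for k in ('r', 'g', 'b', 'depth'):
                cw[k] = (cw[k] + cur[k]) / 2.0
            cw['frame'] = frame
            return codebook
    # No similar codeword: add the current values as a new codeword, with
    # a constant brightness margin around the observed brightness.
    codebook.append({'r': cur['r'], 'g': cur['g'], 'b': cur['b'],
                     'depth': cur['depth'],
                     'i_min': 0.9 * cur['i'], 'i_max': 1.1 * cur['i'],
                     'frame': frame})
    return codebook
```

Either branch leaves the codebook ready for use in the next frame, matching the update-every-frame scheme described earlier.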

In addition, for example, a 3x3 median filter can be applied to obtain a smoother interpolated parallax map, and discontinuous noise components can be removed by weighting distances more heavily at similar color values along a one-dimensional scan line. Because a region with similar color values is highly likely to have similar actual distance values, the distance value can be corrected according to the color similarity within the filter: if the color values within the filter are similar but the distance value contains an error, the distance value is corrected in consideration of the color similarity.
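The 3x3 median filtering of the disparity map can be sketched as below; this is only the plain median step (border pixels are simply copied, and the color-similarity weighting along the scan line is omitted).

```python
def median_filter_3x3(disp):
    """Apply a 3x3 median filter to a disparity map (list of lists),
    smoothing the interpolated parallax map; border pixels are kept
    unchanged and the input map is not modified."""
    h, w = len(disp), len(disp[0])
    out = [row[:] for row in disp]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(disp[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]  # median of the 9 window values
    return out
```

A single-pixel disparity spike surrounded by consistent values is replaced by the neighborhood median, which is the discontinuous-noise suppression the paragraph describes.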

Further, in one example, the stereo image matching apparatus may further include a stereoscopic image generation block 70 for generating a stereoscopic image using the depth codewords updated or added in the current frame.

Next, a stereo image matching method according to another aspect of the present invention will be described in detail with reference to the following drawings. Here, the stereo image matching apparatuses according to the above-described embodiments and FIGS. 1 to 3 will be referred to, and redundant explanations may be omitted.

FIG. 4 is a flowchart schematically illustrating a stereo image matching method according to another embodiment of the present invention, FIG. 5 is a flowchart schematically illustrating a stereo image matching method according to another embodiment of the present invention, and FIG. 6 is a flowchart schematically illustrating a stereo image matching method according to yet another embodiment of the present invention.

Referring to FIGS. 4 to 6, a stereo image matching method according to an embodiment of the present invention includes a parallax calculation and area detection step (S100) and a codeword updating step (S300, S300'). In addition, referring to FIG. 6, a stereo image matching method according to one example may further include a stereoscopic image generation step (S500). Hereinafter, each step will be described in detail. The technique according to the present invention can be applied to implement element technologies in the computer vision field for effectively analyzing human gestures.

Referring to FIGS. 4 to 6, in the parallax calculation and area detection step (S100), the parallax is calculated by matching the left and right images obtained from the stereo camera (not shown), and the area in which the parallax is occluded (the occlusion area) is detected.

For example, referring to FIG. 6, the parallax calculation and area detection step (S100) may include a parallax calculation step (S110) and an occlusion area detection step (S130). In the parallax calculation step (S110), the left and right images obtained from the stereo camera (not shown) are matched to calculate the parallax. Then, in the occlusion area detection step (S130), the area in which the parallax was detected can be extracted, and the occlusion area, in which no parallax was detected, can be detected.

Next, referring to FIGS. 4 to 6, in the codeword updating step (S300, S300'), codewords for information including color, brightness, and depth are generated in the current frame, and for the detected occlusion region the codewords in the current frame are updated using the codebook 30, in which each codeword for information including color, brightness, and depth in previous frames is recorded. For example, referring to FIGS. 4 to 6, the codeword updating step (S300, S300') may include a codeword generation step (S310) and a step (S330, S330') of updating the codewords of the current frame. Referring to FIGS. 5 and 6, the codeword updating step (S300, S300') may further include an updating step (S350).

Referring to FIGS. 4 to 6, in the codeword generation step (S310), each codeword for information including color, brightness, and depth in the current frame may be generated. In the occlusion region, codewords for color and brightness can be generated, but because the left-right parallax of the stereo image cannot be recognized or detected there, a depth codeword cannot be generated from the current frame. Accordingly, in the step (S330, S330') of updating the codewords of the current frame, for the occlusion area detected in the occlusion area detection step (S130), the codewords of the current frame, for example the depth codeword, are updated using the codebook 30, in which each codeword for information including color, brightness, and depth in the previous frames is recorded.

For example, referring to FIG. 5 and/or FIG. 6, in one example, the codeword updating step (S300, S300') may include an occlusion codeword updating step (S331), a non-occlusion codeword updating step (S333), and an updating step (S350). Of course, the codeword updating step (S300, S300') may also include the codeword generation step (S310).

Referring to FIG. 5 and/or FIG. 6, in the occlusion code word updating step (S331), the code words can be updated using the codebook 30 for the occlusion area. That is, with respect to the occlusion area, the code words in the current frame can be updated using the codebook 30 in which each code word related to information including color, brightness, and depth in the previous frames is recorded.

For example, at this time, in one example, the occlusion code word updating step (S331) may compare the current code words related to the color, brightness, or color and brightness information in the current frame with the previous code words for the color, brightness, or color and brightness information in the previous frames stored in the codebook 30. Then, if, as a result of the comparison, the difference between the current code words and the previous code words is equal to or less than a set threshold value, the depth code word in the current frame can be updated with the depth code word from the previous frames.
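This occlusion-area rule can be sketched in a few lines. The distance measure and threshold below are illustrative assumptions (the patent only says "difference ... equal to or less than the set threshold value"), and code words are modeled as plain dictionaries with hypothetical keys:

```python
def update_occluded_codeword(current, previous, threshold=10.0):
    """Compare the color/brightness part of the current code word with the
    previous code word from the codebook; if the difference is at or below
    the threshold, carry the stored depth into the current frame, since no
    depth can be measured in the occlusion area itself."""
    diff = abs(current['brightness'] - previous['brightness'])
    diff += sum(abs(a - b) for a, b in zip(current['color'], previous['color']))
    if diff <= threshold:
        current['depth'] = previous['depth']
    return current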

Referring to FIGS. 5 and 6, in the non-occlusion code word updating step (S333), the codebook 30 can be used to update the code words with respect to the non-occlusion area, or new code words can be added. At this time, with respect to the non-occlusion area, the code words in the current frame can be updated using the codebook 30 in which the respective code words related to the information including color, brightness, and depth in the previous frames are recorded, or each code word for information including color, brightness, and depth in the current frame can be added to the codebook 30 as a new code word.

For example, in another example, in the non-occlusion code word updating step (S333), the current code words for at least one of color and brightness and the depth information in the current frame can be compared, with respect to the non-occlusion area, with the previous code words for at least one of color and brightness and the depth information in the previous frames stored in the codebook 30. At this time, if, as a result of the comparison, there is a previous code word within a predetermined range similar to the current code words, the code word can be updated by averaging.

In one example, if no previous code words within the similar range exist for the non-occlusion area, the non-occlusion code word updating step (S333) may add the current code words in the current frame to the codebook as new code words.
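The two non-occlusion branches, averaging against a similar stored code word or adding a new one, can be sketched together. The distance measure, similarity range, and dictionary keys below are illustrative assumptions, not the patent's specification:

```python
def update_non_occluded(current, codebook_entries, sim_range=10.0):
    """For a non-occluded pixel: if the codebook holds a previous code word
    within the similarity range, update it by averaging with the current
    observation; otherwise append the current code word as a new entry."""
    def dist(a, b):
        return (abs(a['brightness'] - b['brightness'])
                + abs(a['depth'] - b['depth'])
                + sum(abs(x - y) for x, y in zip(a['color'], b['color'])))

    for prev in codebook_entries:
        if dist(current, prev) <= sim_range:
            prev['brightness'] = (prev['brightness'] + current['brightness']) / 2
            prev['depth'] = (prev['depth'] + current['depth']) / 2
            prev['color'] = tuple((x + y) / 2
                                  for x, y in zip(prev['color'], current['color']))
            return codebook_entries
    codebook_entries.append(current)  # no similar previous code word: add new
    return codebook_entries
```

Averaging smooths the stored model over time, while the append branch lets the codebook grow to cover newly seen appearances, which matches the step S350 of updating the codebook afterward.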

Subsequently, referring to FIG. 5 and/or FIG. 6, in the updating step (S350), the codebook 30 may be updated according to the update or addition of code words.

Referring to FIG. 6, the stereo image matching method according to one example may further include a stereoscopic image generation step (S500) of generating a stereoscopic image using the depth code words in the current frame that are updated or added after the code word updating step.
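Step S500 is not detailed in the patent; a common way to render a view from per-pixel depth is simple depth-image-based rendering, shown here only as a hypothetical sketch (names and the baseline parameter are illustrative):

```python
import numpy as np

def synthesize_view(image, depth, baseline=8.0):
    """Very simple depth-image-based rendering: shift each pixel left by a
    disparity inversely proportional to its depth code word; pixels that
    receive no source remain 0 (holes)."""
    h, w = image.shape
    out = np.zeros_like(image)
    for y in range(h):
        for x in range(w):
            d = int(round(baseline / max(depth[y, x], 1e-6)))
            if 0 <= x - d < w:
                out[y, x - d] = image[y, x]
    return out
```

A production renderer would also fill the holes and handle overlapping writes by depth ordering, which this sketch omits.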

The foregoing embodiments and accompanying drawings are intended not to limit the scope of the present invention but to illustrate it, in order to facilitate understanding of the present invention by those skilled in the art. Embodiments according to various combinations of the above-described configurations can also be implemented by those skilled in the art from the foregoing detailed description. Accordingly, various embodiments of the present invention may be embodied in various forms without departing from their essential characteristics, and the scope of the present invention should be construed in accordance with the appended claims, including all alternatives and equivalents apparent to those skilled in the art.

10: Parallax calculation block 11: Parallax operator
13: Occlusion detector 30: Codebook or codebook storage block
50: code word update block 70: stereoscopic image generation block

Claims (14)

A stereo image matching apparatus comprising: a parallax operation block for calculating a parallax by matching the left and right images obtained from a stereo camera and detecting an area where the parallax is obscured (an occlusion area);
a codebook storage block storing a codebook in which each code word related to information including color, brightness, and depth in previous frames is recorded; and
a code word update block for generating respective code words for information including color, brightness, and depth in a current frame and updating the code words in the current frame using the codebook for the occlusion area detected in the parallax operation block.
The apparatus according to claim 1,
wherein the code word update block updates the code words using the codebook in the occlusion area, and updates the code words using the codebook or adds new code words in the non-occlusion area.
The apparatus of claim 2,
wherein the code word update block compares, with respect to the occlusion area, the current code words related to color, brightness, or color and brightness information in the current frame with the previous code words for color, brightness, or color and brightness information in the previous frames stored in the codebook, and updates the depth code word in the current frame with the depth code word from the previous frames if the difference between the current code words and the previous code words is equal to or less than a set threshold value.
The apparatus of claim 2,
wherein the code word update block compares, with respect to the non-occlusion area, the current code words for at least one of color and brightness and the depth information in the current frame with the previous code words for at least one of color and brightness and the depth information in the previous frames stored in the codebook, and updates the code word by averaging when a previous code word within a predetermined range similar to the current code words exists.
The apparatus of claim 4,
wherein the code word update block adds the current code words in the current frame to the codebook as new code words and updates the codebook if no previous code words within the similar range exist for the non-occlusion area.
The apparatus according to any one of claims 1 to 5,
wherein the parallax operation block comprises: a parallax operator for calculating the parallax by matching the left and right images; and an occlusion detector for detecting the parallax and detecting the occlusion area.
The apparatus according to any one of claims 1 to 5,
further comprising a stereoscopic image generation block for generating a stereoscopic image using the depth code words in the current frame that are updated or added.
A stereo image matching method comprising: a parallax calculation and area detection step of calculating a parallax by matching the left and right images obtained from a stereo camera and detecting an area where the parallax is obscured (an occlusion area); and
a code word updating step of generating respective code words for information including color, brightness, and depth in a current frame and, for the detected occlusion area, updating the code words in the current frame using a codebook in which each code word for information including color, brightness, and depth in previous frames is recorded.
The method of claim 8,
wherein the code word updating step comprises:
an occlusion code word updating step of updating the code words using the codebook with respect to the occlusion area;
a non-occlusion code word updating step of updating the code words or adding new code words using the codebook with respect to the non-occlusion area; and
an updating step of updating the codebook according to the update or addition of the code words.
The method of claim 9,
wherein, in the occlusion code word updating step, the current code words related to color, brightness, or color and brightness information in the current frame are compared, with respect to the occlusion area, with the previous code words for color, brightness, or color and brightness information in the previous frames stored in the codebook, and the depth code word in the current frame is updated with the depth code word from the previous frames if the difference between the current code words and the previous code words is equal to or less than a set threshold value.
The method of claim 9,
wherein, in the non-occlusion code word updating step, the current code words for at least one of color and brightness and the depth information in the current frame are compared, with respect to the non-occlusion area, with the previous code words for at least one of color and brightness and the depth information in the previous frames stored in the codebook, and the code word is updated by averaging when a previous code word within a predetermined range similar to the current code words exists.
The method of claim 11,
wherein the non-occlusion code word updating step adds the current code words in the current frame to the codebook as new code words if no previous code words within the similar range exist for the non-occlusion area.
The method according to any one of claims 8 to 12,
wherein the parallax calculation and area detection step comprises: a parallax calculation step of calculating the parallax by matching the left and right images; and an occlusion area detection step of detecting the parallax and detecting the occlusion area.
The method according to any one of claims 8 to 12,
further comprising a stereoscopic image generation step of generating a stereoscopic image using the depth code words in the current frame updated or added after the code word updating step.
KR1020130036400A 2013-04-03 2013-04-03 Apparatus and method for matchong stereo image KR20140120527A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130036400A KR20140120527A (en) 2013-04-03 2013-04-03 Apparatus and method for matchong stereo image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130036400A KR20140120527A (en) 2013-04-03 2013-04-03 Apparatus and method for matchong stereo image

Publications (1)

Publication Number Publication Date
KR20140120527A true KR20140120527A (en) 2014-10-14

Family

Family ID: 51992378

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130036400A KR20140120527A (en) 2013-04-03 2013-04-03 Apparatus and method for matchong stereo image

Country Status (1)

Country Link
KR (1) KR20140120527A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101889886B1 (en) * 2017-12-22 2018-08-21 세명대학교 산학협력단 Depth information generating method and apparatus
US10482620B2 (en) 2017-12-22 2019-11-19 Light And Math Inc. Method and device for producing depth information
KR101988551B1 (en) 2018-01-15 2019-06-12 충북대학교 산학협력단 Efficient object detection and matching system and method using stereo vision depth estimation
KR101999797B1 (en) 2018-01-15 2019-07-12 충북대학교 산학협력단 Stereo image feature matching system and method based on harris corner vector clustering algorithm
KR20210081527A (en) 2019-12-24 2021-07-02 동의대학교 산학협력단 Apparatus and method for improving the performance of stereo-based ROI detection algorithm

Similar Documents

Publication Publication Date Title
US11462028B2 (en) Information processing device and information processing method to generate a virtual object image based on change in state of object in real space
US11629964B2 (en) Navigation map updating method and apparatus and robot using the same
JP6244407B2 (en) Improved depth measurement quality
JP6554169B2 (en) Object recognition device and object recognition system
Yang et al. Depth hole filling using the depth distribution of neighboring regions of depth holes in the kinect sensor
CN109949347B (en) Human body tracking method, device, system, electronic equipment and storage medium
US11847796B2 (en) Calibrating cameras using human skeleton
KR100953076B1 (en) Multi-view matching method and device using foreground/background separation
US10957100B2 (en) Method and apparatus for generating 3D map of indoor space
CN104079912A (en) Image processing apparatus and image processing method
KR20140120527A (en) Apparatus and method for matchong stereo image
JP5027758B2 (en) Image monitoring device
JP2012506994A (en) Video processing method
KR101086274B1 (en) Apparatus and method for extracting depth information
US11861900B2 (en) Multi-view visual data damage detection
US11948312B2 (en) Object detection/tracking device, method, and program recording medium
KR102083293B1 (en) Object reconstruction apparatus using motion information and object reconstruction method using thereof
KR101700651B1 (en) Apparatus for tracking object using common route date based on position information
JP6962365B2 (en) Object detection system and program
KR102240570B1 (en) Method and apparatus for generating spanning tree,method and apparatus for stereo matching,method and apparatus for up-sampling,and method and apparatus for generating reference pixel
CN114757824B (en) Image splicing method, device, equipment and storage medium
WO2015150286A1 (en) Motion field estimation
KR101856257B1 (en) Apparatus for compensating disparity image and method thereof
JP7024401B2 (en) Information processing equipment, programs, and information processing methods
CN113673284B (en) Depth camera snapshot method, system, equipment and medium

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application