WO2019139303A1 - Procédé et dispositif de synthèse d'image - Google Patents

Procédé et dispositif de synthèse d'image

Info

Publication number
WO2019139303A1
Authority
WO
WIPO (PCT)
Prior art keywords
boundary
value
pixel
minimum error
frames
Prior art date
Application number
PCT/KR2019/000111
Other languages
English (en)
Korean (ko)
Inventor
유성열
김준식
김규헌
강전호
Original Assignee
삼성전자 주식회사
경희대학교 산학협력단
Priority date
Filing date
Publication date
Application filed by 삼성전자 주식회사, 경희대학교 산학협력단
Publication of WO2019139303A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/18 Image warping, e.g. rearranging pixels individually
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T 7/50 Depth or shape recovery
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20212 Image combination
    • G06T 2207/20221 Image fusion; Image merging

Definitions

  • Various embodiments of the present disclosure are directed to a method and apparatus for composing images with different viewpoints into a single image.
  • One image acquired through a camera has a limited angle of view. Accordingly, a variety of image synthesis techniques have been studied to generate images having a wider angle of view than a single image, thereby providing a more realistic experience.
  • Feature-point extraction algorithms used in such image synthesis techniques include, for example, the scale-invariant feature transform (SIFT), speeded-up robust features (SURF), and features from accelerated segment test (FAST).
  • A stitching technique enables the generation of panoramic images without visual errors by sequentially applying the algorithms of the image synthesis technique.
  • the stitching technique has evolved with significant improvements in accuracy.
  • the improvement of the accuracy may lead to a decrease in processing speed as the amount of computation for image processing is increased.
  • Such a decrease in the processing speed may limit the use range of the panoramic image generation technique.
  • Various embodiments of the present disclosure provide an image synthesis method and apparatus capable of securing versatility by balancing processing speed and accuracy in image synthesis.
  • Various embodiments of the present disclosure provide an image synthesizing method and apparatus for synthesizing images having various different viewpoints to generate a single panoramic image.
  • According to various embodiments, a method of synthesizing images includes: aligning a plurality of frames based on feature points extracted from the plurality of frames corresponding respectively to a plurality of images; determining an overlap region including the feature points corresponding to each other among the aligned frames; determining a weight for each pixel based on depth information of the overlap region; generating a minimum error matrix by applying the per-pixel weight to the difference between pixel values of the frames included in the overlap region; generating a boundary using the minimum error matrix; and compositing the aligned frames based on the generated boundary.
  • According to various embodiments, an image synthesizing apparatus includes an interface for acquiring a plurality of images and at least one processor configured to: align a plurality of frames based on feature points extracted from the plurality of frames corresponding respectively to the plurality of images; determine an overlap region including the feature points corresponding to each other among the aligned frames; determine a weight for each pixel based on depth information of the overlap region; generate a minimum error matrix by applying the per-pixel weight to the difference between pixel values of the frames included in the overlap region; generate a boundary using the minimum error matrix; and composite the aligned frames based on the generated boundary.
  • FIG. 1 is a block diagram of a video synthesizing apparatus according to an exemplary embodiment.
  • FIG. 2 (a) is a graph showing an example of a quantization function used for quantizing depth map values according to an embodiment
  • FIG. 2 (b) is a graph showing another example of a quantization function used for quantizing the depth map values according to an embodiment
  • FIG. 2 (c) is a graph showing another example of the quantization function used for quantizing the depth map values according to the embodiment
  • FIG. 2 (d) is a graph showing a further example of a quantization function used for quantizing depth map values according to an embodiment
  • Figure 3 (a) illustrates an overlap region for creating a bottom boundary in accordance with one embodiment
  • FIG. 3 (b) illustrates a process of generating a lower boundary according to an exemplary embodiment.
  • Figure 4 (a) illustrates an overlap region for creating an upper boundary in accordance with one embodiment
  • FIG. 4 (b) illustrates a process of creating a top boundary according to an embodiment
  • Figure 5 (a) shows an overlap region for creating a right-hand direction boundary according to one embodiment
  • Figure 7 (a) illustrates a first example of a minimum error boundary according to one embodiment
  • FIG. 7 (b) is a diagram illustrating a second example of a minimum error boundary according to one embodiment
  • FIG. 7C is a diagram illustrating a third example of a minimum error boundary according to one embodiment
  • Figure 7 (d) illustrates a fourth example of a minimum error boundary according to one embodiment
  • FIG. 7 (e) illustrates a fifth example of a minimum error boundary according to an embodiment
  • FIG. 7 (f) illustrates a sixth example of a minimum error boundary according to an embodiment
  • FIG. 8 (a) is a view illustrating an example of a panoramic image generated using an existing method
  • FIG. 8 (b) is a diagram illustrating an example of a panoramic image generated using a method according to an embodiment
  • FIG. 9 is a flowchart illustrating an image synthesizing method of the image synthesizing apparatus according to an exemplary embodiment
  • the present disclosure describes an image synthesis method and apparatus. Specifically, the present disclosure describes a method and apparatus for generating panoramic images by combining images having different viewpoints into one image.
  • a depth map may be used to generate a panoramic image. When the depth map is used as described above, it is possible to reduce an error due to image distortion or the like occurring during image synthesis.
  • FIG. 1 is a block diagram illustrating an image synthesizing apparatus according to an exemplary embodiment of the present invention.
  • the image synthesizing apparatus may include an interface unit and at least one processor.
  • the image synthesizing apparatus may include at least one camera.
  • FIG. 1 shows functional blocks configured for image processing in one processor included in the image synthesizing apparatus, for convenience of explanation in accordance with various embodiments of the present disclosure.
  • the functional blocks shown in Fig. 1 may be implemented by a program in one or a plurality of processors. Also, the functional blocks shown in Fig. 1 may be implemented by hardware such as a digital logic circuit.
  • a device for image processing may include a parser 110, a frame extractor 120, and an image processing module 100.
  • the image processing module 100 may include a minimum error matrix generator 130, a minimum error boundary generator 140, a minimum error boundary direction selector 150, and an image synthesizer 160.
  • the parser 110 may receive metadata related to each of the real images and the real images from one or a plurality of cameras.
  • the real images may include images captured by a plurality of cameras at different viewpoints, or images captured by one camera at a plurality of different viewpoints.
  • the real images may include two or more real images for implementing a 3D image.
  • the parser 110 extracts depth information and position information from each of the input real images, and generates the depth map using the extracted depth information.
  • the parser 110 provides the inputted real images to the frame extracting unit 120, and provides the generated depth map and the extracted position information to the image processing module 100.
  • the parser 110 parses metadata associated with each of the input real images.
  • the metadata may include various information related to each of the real images, for example, information defined by a virtual world map (VWM) received from a separate system.
  • the parser 110 may obtain a depth map of each of the real images based on the metadata.
  • the parser 110 may generate a matching position for the real images.
  • the parser 110 may extract feature points (key points) and descriptors from the real images, for example.
  • the parser 110 may use one or more algorithms to extract the feature points and descriptors.
  • the one or more algorithms may include, for example, an algorithm based on FAST (features from accelerated segment test), which emphasizes speed.
  • the parser 110 may match the feature points and descriptors extracted from each of the real images and determine the positional relationship between the matched feature points.
  • the parser 110 may detect the matched positions between the real images in consideration of the positional relationship.
  • the parser 110 can identify an area (hereinafter referred to as an overlap area) including corresponding feature points between the real images based on the detected matched positions.
  • the overlap area may be, for example, an area including the same subject in multiple frames corresponding to real images captured at different viewpoints.
  • the overlap area may be a region of a predetermined area connecting the upper and lower sides of a frame, a region of a predetermined area connecting the left and right sides of a frame, or a region of a predetermined area connecting the left (or right) side and the upper (or lower) side of a frame.
  • the parser 110 may provide the image processing module 100 with the position information on the overlap area and the depth map on the overlap area.
  • although the parser 110 has been described as performing the operation of generating the matching positions for the real images, this operation may instead be performed by a separate physical component, such as a matching position generator, rather than by the parser 110.
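  • A minimal sketch (not the patent's own implementation) of the matching step described above is shown below, assuming OpenCV is available: FAST-style keypoints and descriptors are matched between two frames, and the horizontal shift of the best matches is used to estimate the overlap region. All function and variable names are illustrative only.

```python
import cv2
import numpy as np

def estimate_overlap(frame_a: np.ndarray, frame_b: np.ndarray):
    """Estimate the horizontal offset and overlap width of frame_b against frame_a."""
    orb = cv2.ORB_create(nfeatures=1000)          # FAST-based keypoints with binary descriptors
    kp_a, des_a = orb.detectAndCompute(frame_a, None)
    kp_b, des_b = orb.detectAndCompute(frame_b, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)

    # The median horizontal displacement of the best matches approximates the
    # positional relationship between the two frames.
    shifts = [kp_a[m.queryIdx].pt[0] - kp_b[m.trainIdx].pt[0] for m in matches[:50]]
    x_offset = int(np.median(shifts))

    overlap_width = frame_a.shape[1] - x_offset   # columns shared by both frames
    return x_offset, overlap_width
```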
  • the frame extracting unit 120 may extract frames from each of the real images provided from the parser 110.
  • the frames may be, for example, units constituting a still image, or units constituting the image at a specific time point of a moving image.
  • the frame extracting unit 120 may provide the extracted frames to the image processing module 100.
  • the minimum error matrix generator 130, the minimum error boundary generator 140, the minimum error boundary direction selector 150 and the image synthesizer 160 included in the image processing module 100 can perform the following operations.
  • the minimum error matrix generator 130 may be provided with depth maps and position information extracted or generated from each of the frames extracted from each of the real images and the real images.
  • the minimum error matrix generator 130 may generate a minimum error matrix based on the provided frames, the depth map, and the position information.
  • the minimum error matrix may be, for example, a matrix that calculates a unique value corresponding to each of the pixels constituting one frame and defines the unique values calculated for those pixels.
  • the minimum error matrix generator 130 may obtain a minimum error matrix, for example, by the following equation (1).
  • In Equation (1), E(i, j) represents the depth-map-based minimum error matrix value at the position (i, j) corresponding to each of the pixels in the overlap region resulting from the combination of frame A and frame B.
  • O_A(i, j) represents the position in frame A corresponding to each of the pixels in the overlap region (hereinafter referred to as the first pixel position).
  • O_B(i, j) represents the position in frame B corresponding to each of the pixels in the overlap region (hereinafter referred to as the second pixel position).
  • D(i, j) represents the depth map value corresponding to each pixel in the overlap region according to the combination of frame A and frame B.
  • Quant() represents a quantization function.
  • Equation (1) proposes a scheme for obtaining a unique value defining each of the pixels included in the overlap region when two frames (frame A and frame B) are combined.
  • the unique values corresponding to each of the pixels included in the overlap region can be obtained by various schemes.
  • the quantization function in Equation (1) can be used to quantize the depth map values into a plurality of steps (quantization levels).
  • the quantization function may be defined such that, for example, the weight corresponding to a depth map value becomes larger as the depth, i.e. the distance between the camera and the object in the image, becomes greater.
  • with the above quantization function, it is possible to minimize an error that may occur when an object at a long distance is covered by a nearby object while the boundary is created.
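  • The exact form of Equation (1) is not reproduced in this text. Based on the definitions above, one plausible realization, sketched below with NumPy, weights the absolute difference between the co-located pixels O_A(i, j) and O_B(i, j) by the quantized depth value Quant(D(i, j)). The linear quantizer shown here is an assumption for illustration, not the patent's own function.

```python
import numpy as np

def quant(depth: np.ndarray, levels: int = 16) -> np.ndarray:
    """Example quantizer: larger 8-bit depth values (farther objects) map to
    larger weights, discretized into `levels` steps."""
    return np.floor(depth.astype(np.float64) / 256.0 * levels) + 1.0

def minimum_error_matrix(overlap_a: np.ndarray,
                         overlap_b: np.ndarray,
                         depth: np.ndarray) -> np.ndarray:
    """One plausible form of E(i, j): the per-pixel difference between the two
    overlapping frames, weighted by the quantized depth map value of that pixel."""
    diff = np.abs(overlap_a.astype(np.float64) - overlap_b.astype(np.float64))
    if diff.ndim == 3:                 # average colour channels if present
        diff = diff.mean(axis=2)
    return diff * quant(depth)
```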
  • when the minimum error matrix generator 130 obtains the minimum error matrix, it outputs the minimum error matrix to the minimum error boundary generator 140.
  • the minimum error matrix may define unique values indicating, for example, the characteristics of each of the pixels included in the overlap region.
  • the minimum error boundary generator 140 may generate a boundary for each boundary generation direction in the overlap area using the minimum error matrix provided from the minimum error matrix generator 130.
  • the boundaries define, for example, connections of pixels that are expected to have a low probability of error occurrence within the overlap region for compositing multiple frames based on a minimum error matrix. This will not only reduce the probability of errors occurring when composing multiple frames, but also reduce the amount of computation required to correct errors during compositing. For example, when two frames are synthesized, the boundaries at which the two frames are synthesized in the pixels with low probability of error are set, thereby reducing the amount of error due to composition and the amount of computation for error correction.
  • the starting position of a boundary may be defined on one of the four edges (upper edge, lower edge, left edge, right edge) of the rectangle constituting the overlap region, that is, the edge containing the pixel at which boundary generation starts.
  • for example, when the starting pixel position of the boundary is on the upper edge, the boundary generation direction may be the lower direction; when the starting pixel position is on the lower edge, the boundary generation direction may be the upper direction; and when the starting pixel position is on the left edge, the boundary generation direction may be the right direction.
  • S_ia(i, j), defined as in Equation (2), defines a boundary generated in the lower direction of the overlap region (hereinafter referred to as a lower-direction boundary); S_ib(i, j), defined as in Equation (3), defines a boundary generated in the upper direction of the overlap region (hereinafter referred to as an upper-direction boundary); and S_j(i, j), defined as in Equation (4), defines a boundary generated in the right direction of the overlap region (hereinafter referred to as a right-direction boundary).
  • E (i, j) commonly used in the equations (2), (3) and (4) represents a minimum error matrix.
  • the minimum error matrix E (i, j) may be provided by the minimum error matrix generator 130.
  • the minimum error boundary generator 140 performs a minimum error search operation to search for positions of pixels having a low probability of generating an error for each boundary generation direction in the overlapping area based on the minimum error matrix.
  • the boundaries can be generated for each boundary generation direction by referring, in the minimum error search, to the pixel values (the unique values defined for each pixel in the minimum error matrix) around the same row or column as well as in the next search row or column. This makes the boundary generation method more flexible and greatly increases the range of possible boundaries.
  • with the three boundaries generated in the lower, upper, and right directions, the frames can be synthesized considering boundaries in all directions that can be generated. Conventional panoramic imaging considers only the synthesis of left and right images, but the recently emerging 360-degree images require synthesizing images up, down, left, and right, so an omnidirectional synthesis technique is needed, and the multiple boundary directions are intended to support this.
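  • Equations (2) to (4) are not reproduced in this text, but the worked examples of FIGS. 3 to 5 below show each cell accumulating its own E value plus the minimum of its three neighbours in the previously processed row or column. The sketch below implements that accumulation for the lower direction; the upper and right directions can be obtained analogously by flipping or transposing the matrix before the same accumulation. This is an illustrative reading, not the patent's code.

```python
import numpy as np

def lower_direction_boundary(E: np.ndarray):
    """Accumulate errors downward: S(i, j) = E(i, j) + min of the three values
    above-left, above, and above-right, as in the FIG. 3 example. The seam is the
    column index of the minimum accumulated value in each row, as described below."""
    rows, cols = E.shape
    S = E.astype(np.float64).copy()
    for i in range(1, rows):
        for j in range(cols):
            lo, hi = max(j - 1, 0), min(j + 2, cols)
            S[i, j] += S[i - 1, lo:hi].min()
    seam = S.argmin(axis=1)            # one boundary pixel per row
    return S, seam

# Running the accumulation on the first two rows of the FIG. 3 example reproduces
# the updated second-row values (8, 8, 13, 20, 10, 13, 10, 5, 26).
E_example = np.array([[9, 4, 4, 9, 16, 25, 4, 1, 2],
                      [4, 4, 9, 16, 1, 9, 9, 4, 25]], dtype=float)
S_example, seam_example = lower_direction_boundary(E_example)
```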
  • when the upper-direction, lower-direction, and right-direction boundaries have been generated in the overlap area, the minimum error boundary generator 140 outputs them to the minimum error boundary direction selector 150.
  • the minimum error boundary direction selection unit 150 selects a boundary having the smallest error value among the boundaries generated in each direction of the overlap region. For this, the minimum error boundary direction selection unit 150 may use Equation (5) as follows.
  • In Equation (5), S(i) represents the error value of the generated boundary at its i-th pixel, N represents the number of pixels in the generated boundary, and SEAMavr represents the average error of the boundary.
  • when SEAMavr is used, the error that may be caused by the generated boundary can be evaluated numerically.
  • the SEAMavr for each direction boundary can be used as a measure for evaluating the boundary.
  • the minimum error boundary direction selection unit 150 may select a direction corresponding to a boundary having the smallest error among the average of the errors with respect to the boundaries of the respective directions.
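  • Equation (5) is not reproduced in this text; as described, it averages the boundary's error values over the N boundary pixels. The sketch below assumes the averaged values are the accumulated S values along each generated seam (one reasonable reading) and picks the direction with the smallest average; names are illustrative only.

```python
import numpy as np

def seam_avr(S: np.ndarray, seam: np.ndarray, per_row: bool = True) -> float:
    """SEAMavr: sum of the boundary's error values divided by the number of
    boundary pixels N."""
    if per_row:                                    # one column index per row
        values = S[np.arange(len(seam)), seam]
    else:                                          # one row index per column
        values = S[seam, np.arange(len(seam))]
    return float(values.sum() / len(seam))

def select_direction(candidates: dict) -> str:
    """Return the direction (e.g. 'lower', 'upper', 'right') whose boundary has
    the smallest SEAMavr; `candidates` maps a direction name to (S, seam, per_row)."""
    return min(candidates, key=lambda name: seam_avr(*candidates[name]))
```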
  • the minimum error boundary direction selection unit 150 outputs information on the boundary of the selected direction (i.e., the boundary having the minimum error) to the image composition unit 160.
  • the image synthesizer 160 synthesizes the images in the overlap area based on the information about the boundary input from the minimum error boundary direction selector 150.
  • the image synthesizer 160 synthesizes the real images in the left, right, up, and down directions on the basis of the boundary corresponding to the information, and outputs the panorama image.
  • the minimum error matrix generator 130, the minimum error boundary generator 140, the minimum error boundary direction selector 150, and the image synthesizer 160 are shown as separate components. However, it is to be understood that the components 130, 140, 150, and 160 may be implemented as at least one physical component 100, such as a controller or processor, in accordance with various embodiments.
  • FIG. 2 (a) is a graph showing an example of a quantization function used for quantizing depth map values according to an embodiment
  • FIG. 2 (b) is a graph showing another example of the quantization function used for quantizing depth map values according to an embodiment
  • FIG. 2 (c) is a graph showing yet another example of the quantization function used for quantizing depth map values according to an embodiment
  • FIG. 2 (d) is a graph showing a further example of the quantization function used for quantizing depth map values according to an embodiment.
  • the horizontal axis represents depth map values for each pixel expressed by a value between 0 and 255
  • the vertical axis represents the weight value corresponding to each depth map value
  • the quantization function may define the relationship between the depth map value and the weight value in various forms as shown in Figs. 2 (a) to 2 (d). As shown in Figs. 2 (a) to 2 (d), the quantization function increases the weight value corresponding to the depth map as the depth increases, that is, as the distance between the camera and the object in the image increases .
  • the weight values that can correspond to the depth map values are quantized into 16 levels.
  • the weight values are not limited thereto and may be used in various ways. When the above quantization function is used, it is possible to minimize an error that may occur when an object at a long distance is covered by a nearby object when creating a boundary.
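  • The curves of FIGS. 2 (a) to 2 (d) are not reproduced here; the sketch below only illustrates how monotonically increasing depth-to-weight mappings of different shapes can be discretized into 16 levels. The specific shapes are assumptions for illustration.

```python
import numpy as np

def quantize_depth(depth: np.ndarray, shape: str = "linear", levels: int = 16) -> np.ndarray:
    """Map 8-bit depth map values (0..255) to one of `levels` weight steps.
    All variants increase with depth, i.e. with camera-to-object distance."""
    x = depth.astype(np.float64) / 255.0      # normalise depth to [0, 1]
    if shape == "convex":
        y = x ** 2                            # weights grow slowly for near pixels
    elif shape == "concave":
        y = np.sqrt(x)                        # weights grow quickly for near pixels
    else:
        y = x                                 # linear growth
    return np.floor(y * (levels - 1)) + 1.0   # 16 discrete weight values (1..16)
```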
  • FIG. 3 (a) is a view illustrating an overlap area for creating a lower boundary according to an embodiment of the present invention
  • FIG. 3 (b) is a diagram illustrating a process of creating a lower boundary according to an embodiment.
  • the overlap region may be, for example, a rectangular region of 9 x 7 pixels, and a minimum error matrix value (i.e., E(i, j)) may be determined for each pixel position.
  • the minimum error matrix values are values to which weights corresponding to the depth map values of the corresponding pixel positions are applied, respectively. Therefore, the smaller the minimum error matrix value, the closer the distance between the camera and the object in the image is.
  • the E(1, j) values (9, 4, 4, 9, 16, 25, 4, 1, 2) included in the first row of the overlap region are used as the S_1a(1, j) values (9, 4, 4, 9, 16, 25, 4, 1, 2) as they are.
  • the E(2, j) values (4, 4, 9, 16, 1, 9, 9, 4, 25) located in the second row of the overlap region are updated to the S_2a(2, j) values (8, 8, 13, 20, 10, 13, 10, 5, 26).
  • when two adjacent rows (e.g., row 1 and row 2) are considered, the starting point for generating the lower-direction boundary values may be the lower of the two rows (e.g., row 2), which is where the actual updating of values begins.
  • for example, the value 4 of E(2, 2) at pixel position (2, 2) is added to the smallest value 4 among the values (9, 4, 4) of E(1, 1), E(1, 2), and E(1, 3) adjacent to it in the previous row, and the added result 8 becomes the value of S_2a(2, 2).
  • similarly, the value 4 of E(3, 3) at pixel position (3, 3) is added to the smallest value 8 among the three updated values (8, 13, 20) of S_2a(2, 2), S_2a(2, 3), and S_2a(2, 4) in the previous row, and the added result 12 becomes the value of S_3a(3, 3).
  • when all the updates to E(i, j) are completed in the above manner, a result as shown in FIG. 3 (b) is obtained.
  • the arrows shown in FIG. 3 (b) show the direction in which the values of E (i, j) are added to the corresponding pixels.
  • FIG. 3 (b) shows a case where the direction is the bottom direction of the overlap region.
  • from S_ia(i, j) as shown in FIG. 3 (b), a minimum value is detected for each row.
  • the pixels corresponding to the detected minimum value may be connected and determined as a bottom boundary.
  • pixels having the minimum values (1, 5, 6, 7, 8, 9, 10) in each of the first to seventh rows may be determined as pixels for the lower boundary.
  • FIG. 4A is a view showing an overlap area for creating an upper boundary according to an embodiment
  • FIG. 4B is a diagram illustrating a process of creating an upper boundary according to an embodiment.
  • the overlap region may be a rectangular region having 9 x 7 pixels, and minimum error matrix values (i.e., E (i, j)) are determined for each pixel position .
  • the minimum error matrix values are values to which weights corresponding to the depth map values of the corresponding pixel positions are applied, respectively. Therefore, the smaller the minimum error matrix value, the closer the distance between the camera and the object in the image is.
  • when two adjacent rows (e.g., row 6 and row 7) are considered, the starting point for generating the upper-direction boundary values may be the upper of the two rows (e.g., row 6), which is where the actual updating of values begins.
  • for example, the value 1 of E(6, 7) at pixel position (6, 7) is added to the smallest value 1 among the values of E(7, 6), E(7, 7), and E(7, 8) adjacent to it in the next row, and the added result becomes the value of S_6b(6, 7).
  • similarly, the value 4 of E(5, 4) at pixel position (5, 4) is added to the smallest value 5 among the three updated values (17, 26, 5) of S_6b(6, 3), S_6b(6, 4), and S_6b(6, 5) in the next row, and the added result updates the value of S_5b(5, 4) to 9.
  • from S_ib(i, j) as shown in FIG. 4 (b), a minimum value is detected for each row.
  • the pixels corresponding to the detected minimum value may be connected and determined as the upper direction boundary.
  • pixels having the minimum values (10, 9, 5, 4, 3, 2, 1) in each of the first to seventh rows may be determined as pixels for the upper direction boundary.
  • FIG. 5A is a diagram illustrating an overlap region for generating a right-direction boundary according to an embodiment
  • FIG. 5B is a diagram illustrating a process of generating a right-direction boundary according to an embodiment.
  • the overlap region may be, for example, a rectangular region of 9 x 7 pixels, and a minimum error matrix value (i.e., E(i, j)) may be determined for each pixel position.
  • the minimum error matrix values are values to which weights corresponding to the depth map values of the corresponding pixel positions are applied, respectively. Therefore, the smaller the minimum error matrix value, the closer the distance between the camera and the object in the image is.
  • the E(i, 2) values (4, 4, 16, 4, 25, 25, 4) located in the second column of the overlap region are updated to the S_2(i, 2) values (8, 8, 20, 8, 29, 29, 8).
  • when two adjacent columns (e.g., column 1 and column 2) are considered, the starting point for generating the right-direction boundary values may be the right of the two columns (e.g., column 2), which is where the actual updating of values begins.
  • for example, the value 4 of E(2, 2) at pixel position (2, 2) is added to the smallest value 4 among the values (9, 4, 4) of E(1, 1), E(2, 1), and E(3, 1) adjacent to it in the previous column, and the added result 8 becomes the value of S_2(2, 2).
  • similarly, the value 4 of E(3, 3) at pixel position (3, 3) is added to the smallest value 8 among the three updated values (8, 20, 8) of S_2(2, 2), S_2(3, 2), and S_2(4, 2) in the previous column, and the value of S_3(3, 3) is updated to 12.
  • when all the updates are completed in the above manner, a result as shown in FIG. 5 (b) is calculated.
  • the arrows shown in FIG. 5 (b) show the direction in which the value of E (i, j) is added to the pixel. In FIG. 5 (b), the direction is the rightward direction of the overlapping area.
  • from S_j(i, j) as shown in FIG. 5 (b), a minimum value is detected for each column.
  • the pixels corresponding to the detected minimum value may be connected and determined as a right-direction boundary.
  • pixels having the minimum values (4, 8, 9, 10, 14, 15, 9, 13, 17) in columns 1 to 9 can be determined as pixels for the right-direction boundary.
  • among the boundaries determined in this way, the boundary having the lowest probability of error occurrence based on Equation (5), for example the lower-direction boundary, can be selected.
  • the real images may be synthesized in the direction corresponding to the selected boundary to be generated as a panoramic image.
  • FIG. 6 is a diagram illustrating an overlap region according to one embodiment.
  • the overlap area 600 may be an area including the same object in the multiple frames 610 and 620 corresponding to real images captured at different viewpoints.
  • the overlap area may be a region of a predetermined area connecting the upper and lower sides of a frame, a region of a predetermined area connecting the left and right sides of a frame, or a region of a predetermined area connecting the left (or right) side and the upper (or lower) side of a frame.
  • in FIG. 6, the overlap area 600 has a rectangular shape as an area of a predetermined size connecting the left and right sides of the frame.
  • however, the shape of the overlap area 600 is not limited to this.
  • minimum error boundaries as shown in FIGS. 7 (a) to 7 (f) can be generated.
  • FIG. 7 (a) is a diagram illustrating a first example of a minimum error boundary according to an embodiment
  • FIG. 7 (b) is a diagram illustrating a second example of a minimum error boundary according to an embodiment
  • FIG. 7 (c) is a diagram illustrating a third example of a minimum error boundary according to an embodiment
  • FIG. 7 (d) is a diagram illustrating a fourth example of a minimum error boundary according to an embodiment
  • FIG. 7 (e) is a diagram illustrating a fifth example of a minimum error boundary according to an embodiment
  • FIG. 7 (f) is a diagram illustrating a sixth example of a minimum error boundary according to an embodiment.
  • when a scheme based on Equations (1) to (5) as described above is used, a minimum error boundary may be generated, and the shape of the minimum error boundary may take various forms, as shown in FIGS. 7 (a) to 7 (f).
  • FIG. 7 (a) shows a minimum error boundary of a shape extending from the upper end to the right end of the overlap region.
  • FIG. 7 (b) shows a minimum error boundary of a shape extending from the upper end to the lower end of the overlap region.
  • FIG. 7 (c) shows a minimum error boundary of a shape extending from the upper end to the left end of the overlap region.
  • FIG. 7 (d) shows a minimum error boundary of a shape extending from the right end to the lower end of the overlap region.
  • FIG. 7 (e) shows a minimum error boundary of a shape extending from the left end to the right end of the overlap region.
  • FIG. 7 (f) shows a minimum error boundary of a shape extending from the left end to the lower end of the overlap region.
  • since the minimum error boundaries of the overlap region can be generated in various manners, the corresponding real images can be synthesized in the left, right, up, and down directions with respect to the minimum error boundary, and panoramic images can be generated.
  • FIG. 8 (a) is a view showing an example of a panoramic image generated using a conventional method.
  • FIG. 8 (b) is a view showing an example of a panoramic image generated using a method according to an embodiment of the present disclosure.
  • when the real images are synthesized based on the minimum error boundaries as described above, errors such as the image distortion that occurs during image synthesis can be resolved.
  • FIG. 9 is a flowchart illustrating an image synthesizing method of an image synthesizing apparatus according to an embodiment.
  • the image synthesizing apparatus receives the real images and a depth map corresponding to each of the real images (operation 910).
  • the depth map of each real image may have the same or different size than the corresponding real image. If the overall system performance is considered, it is more advantageous to use a depth map having the same size as the corresponding real image. This is because, if the real image and the depth map have different sizes, there is a loss of information in the process of matching the sizes, which may affect the performance of the entire system.
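  • If the sizes do differ, a sketch like the one below (assuming OpenCV is available) could bring the depth map to the frame's resolution; nearest-neighbour interpolation avoids inventing intermediate depth values, though, as noted above, some information loss relative to a same-size map is unavoidable.

```python
import cv2

def match_depth_to_frame(depth, frame):
    """Resize the depth map to the frame's resolution when their sizes differ."""
    h, w = frame.shape[:2]
    if depth.shape[:2] == (h, w):
        return depth                         # same size: nothing to do (preferred case)
    return cv2.resize(depth, (w, h), interpolation=cv2.INTER_NEAREST)
```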
  • the image synthesizer performs an operation to generate a matching position of the real images (operation 920). That is, the image synthesizer performs a process of matching feature points and descriptors extracted from each of the real images, and determines a positional relationship between the matching feature points. In addition, the image synthesizing apparatus generates a position to be matched between the real images in consideration of the positional relationship, and confirms the overlapping region based on the generated position.
  • the image synthesizer generates a minimum error matrix based on the position information and the depth map of the overlap area (operation 930). As shown in Equation 1, the depth map value for each pixel based on the depth map is changed to a weight value and reflected in the minimum error matrix.
  • the image synthesizer generates a minimum error boundary using the minimum error matrix (operation 940). For example, the image synthesizer generates a boundary for each direction of the overlapping area based on Equations 2, 3, and 4, and selects a boundary where errors are least likely to occur among the generated boundaries, A minimum error boundary can be generated.
  • the image synthesizing apparatus determines whether to generate a composite image as a panorama image (operation 950). If it is determined that the composite image should be generated, the image synthesizer synthesizes the images in the overlap region based on the minimum error boundary to generate the panorama image (operation 960). The operations shown in FIG. 9 may be repeatedly performed for generating the panoramic image.
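  • The final composition step (operation 960) can be pictured with the sketch below: given a lower-direction boundary expressed as one column index per row, pixels on one side of the seam are taken from frame A and pixels on the other side from frame B. The side assignment (frame A to the left) is an assumption for illustration.

```python
import numpy as np

def composite_overlap(overlap_a: np.ndarray,
                      overlap_b: np.ndarray,
                      seam: np.ndarray) -> np.ndarray:
    """Merge the overlap regions of two aligned frames along a per-row seam."""
    out = overlap_b.copy()
    for i, j in enumerate(seam):
        out[i, :j + 1] = overlap_a[i, :j + 1]   # columns up to the seam come from frame A
    return out
```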

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

According to one embodiment, the present invention relates to an image synthesis device configured to align a plurality of frames on the basis of feature points extracted from the plurality of frames corresponding respectively to a plurality of images, to identify an overlap region including the mutually corresponding feature points among the aligned frames within a region including the aligned frames, to determine a weight for each pixel on the basis of depth information of the overlap region, to generate a minimum error matrix by applying the per-pixel weight to the difference between pixel values of the frames included in the overlap region, to generate a boundary by means of the minimum error matrix, and to synthesize the aligned frames on the basis of the generated boundary.
PCT/KR2019/000111 2018-01-12 2019-01-03 Procédé et dispositif de synthèse d'image WO2019139303A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2018-0004708 2018-01-12
KR1020180004708A KR102637913B1 (ko) 2018-01-12 2018-01-12 영상 합성 방법 및 장치

Publications (1)

Publication Number Publication Date
WO2019139303A1 (fr) 2019-07-18

Family

ID=67219760

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/000111 WO2019139303A1 (fr) 2018-01-12 2019-01-03 Procédé et dispositif de synthèse d'image

Country Status (2)

Country Link
KR (1) KR102637913B1 (fr)
WO (1) WO2019139303A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102630106B1 (ko) * 2021-04-01 2024-01-25 경희대학교 산학협력단 우선순위 사물 기반의 영상 정합 장치 및 그 방법


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9563953B2 (en) 2014-08-28 2017-02-07 Qualcomm Incorporated Systems and methods for determining a seam

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0686162A (ja) * 1992-08-31 1994-03-25 Sony Corp 映像信号合成装置およびその方法
KR100790890B1 (ko) * 2006-09-27 2008-01-02 삼성전자주식회사 파노라마 영상 생성장치 및 방법
JP5360190B2 (ja) * 2011-12-27 2013-12-04 カシオ計算機株式会社 画像処理装置、画像処理方法およびプログラム

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CAO, FANG ET AL.: "Seamless Image Stitching Using Optimized Boundary Matching for Gradient and Curvature", 2010 INTERNATIONAL SYMPOSIUM ON INTELLIGENCE INFORMATION PROCESSING AND TRUSTED COMPUTING, 28 October 2010 (2010-10-28), pages 495 - 498, XP031831824, Retrieved from the Internet <URL:https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=5663453> *
LEE, JUN-TAE ET AL.: "Stitching of heterogeneous images using depth information", 2013 ASIA-PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE, 29 October 2013 (2013-10-29), pages 1 - 4, XP032549824, Retrieved from the Internet <URL:https://ieeexplore.ieee.org/document/6694216> *

Also Published As

Publication number Publication date
KR102637913B1 (ko) 2024-02-20
KR20190086323A (ko) 2019-07-22

Similar Documents

Publication Publication Date Title
US20060120712A1 (en) Method and apparatus for processing image
EP3560188A1 (fr) Dispositif électronique de création d&#39;image panoramique ou d&#39;image animée et procédé associé
WO2020101103A1 (fr) Appareil et procédé de mesure de la vitesse d&#39;écoulement d&#39;un flux au moyen d&#39;un traitement d&#39;image d&#39;écoulement optique
WO2015005577A1 (fr) Appareil et procédé d&#39;estimation de pose d&#39;appareil photo
WO2013151270A1 (fr) Appareil et procédé de reconstruction d&#39;image tridimensionnelle à haute densité
WO2012161431A9 (fr) Procédé de génération d&#39;une image d&#39;une vue autour d&#39;un véhicule
EP0180446A2 (fr) Procédés pour la détection du mouvement dans des images de télévision
WO2017099510A1 (fr) Procédé permettant de segmenter une scène statique sur la base d&#39;informations statistiques d&#39;image et procédé s&#39;y rapportant
EP2329655A2 (fr) Appareil et procédé pour obtenir une image à haute résolution
WO2021221334A1 (fr) Dispositif de génération de palette de couleurs formée sur la base d&#39;informations gps et de signal lidar, et son procédé de commande
WO2019139303A1 (fr) Procédé et dispositif de synthèse d&#39;image
WO2018139810A1 (fr) Appareil de détection pour calculer des informations de position d&#39;un objet en mouvement, et procédé de détection l&#39;utilisant
WO2014010820A1 (fr) Procédé et appareil d&#39;estimation de mouvement d&#39;image à l&#39;aide d&#39;informations de disparité d&#39;une image multivue
WO2019098421A1 (fr) Dispositif de reconstruction d&#39;objet au moyen d&#39;informations de mouvement et procédé de reconstruction d&#39;objet l&#39;utilisant
WO2017213335A1 (fr) Procédé pour combiner des images en temps réel
WO2019194561A1 (fr) Procédé et système de reconnaissance d&#39;emplacement pour fournir une réalité augmentée dans un terminal mobile
WO2023136414A1 (fr) Dispositif terminal de collecte d&#39;informations pour collecter des informations concernant des objets routiers dangereux et procédé de fonctionnement associé
JP4605582B2 (ja) ステレオ画像認識装置及びその方法
WO2016104842A1 (fr) Système de reconnaissance d&#39;objet et procédé de prise en compte de distorsion de caméra
WO2019009579A1 (fr) Procédé et appareil de correspondance stéréo utilisant une interpolation à points de support
WO2017007047A1 (fr) Procédé et dispositif de compensation de la non-uniformité de la profondeur spatiale en utilisant une comparaison avec gigue
WO2019083068A1 (fr) Système d&#39;acquisition d&#39;informations tridimensionnelles à l&#39;aide d&#39;une pratique de lancement, et procédé de calcul de paramètres de caméra
WO2020171257A1 (fr) Procédé de traitement d&#39;image et dispositif correspondant
WO2020256517A2 (fr) Procédé et système de traitement de mappage de phase automatique basés sur des informations d&#39;image omnidirectionnelle
JPH11194027A (ja) 三次元座標計測装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19738114

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19738114

Country of ref document: EP

Kind code of ref document: A1