CN109598753B - Image processing method and device - Google Patents


Info

Publication number: CN109598753B
Application number: CN201811440471.9A
Authority: CN (China)
Prior art keywords: depth information, background object, pixel points, fitting, equation
Legal status: Active
Other languages: Chinese (zh)
Other versions: CN109598753A
Inventors: 尚砚娜, 杨汇成
Current Assignee: Lenovo Beijing Ltd
Original Assignee: Lenovo Beijing Ltd
Application filed by Lenovo Beijing Ltd
Priority to CN201811440471.9A
Publication of CN109598753A
Application granted
Publication of CN109598753B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/50: Depth or shape recovery
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10028: Range image; depth image; 3D point clouds


Abstract

The present disclosure provides an image processing method, including: acquiring an image to be processed, wherein the image to be processed comprises a foreground object and a background object; acquiring first depth information, wherein the first depth information represents the depths of a plurality of pixel points in the image corresponding to the surface of the foreground object; acquiring second depth information, wherein the second depth information represents the depths that those pixel points would have on the surface of the background object; and amplifying, for each of the plurality of pixel points, the difference between its first depth information and its second depth information. The present disclosure also provides an image processing apparatus.

Description

Image processing method and device
Technical Field
The present disclosure relates to an image processing method and an image processing apparatus.
Background
When region extraction is performed on an image with depth variation, if the depth of the background around the detected object varies greatly while the depth of the foreground varies only slightly relative to the surrounding background, the contrast between foreground and background in the image is low, which makes subsequent region extraction/detection very difficult.
For example, suppose characters are engraved on the surface of a ceramic cup and the cup is the detected object: the foreground in the image may be the characters on the cup, and the background may be the area around the characters.
Disclosure of Invention
An aspect of the present disclosure provides an image processing method that performs image processing by amplifying the difference between foreground depth and background depth, including: acquiring an image to be processed, wherein the image to be processed comprises a foreground object and a background object; acquiring first depth information, wherein the first depth information represents the depths of a plurality of pixel points in the image corresponding to the surface of the foreground object; acquiring second depth information, wherein the second depth information represents the depths that those pixel points would have on the surface of the background object; and amplifying, for each of the plurality of pixel points, the difference between its first depth information and its second depth information.
Optionally, acquiring the second depth information includes: determining a fitting equation for fitting the surface of the background object; and substituting each pixel point of the plurality of pixel points into the fitting equation to calculate the depth of the surface of the background object at that pixel point, thereby obtaining the second depth information.
Optionally, determining a fitting equation for fitting the surface of the background object includes: if the surface of the background object is a plane, determining a straight-line equation or a plane equation for fitting it; or, if the surface of the background object is a curved surface, determining a curve equation or a curved-surface equation for fitting it.
Optionally, determining a fitting equation for fitting the surface of the background object includes: determining a predetermined number of pixel points located on the surface of the background object; measuring the depth information corresponding to those pixel points; and fitting the surface of the background object using the pixel points and the measured depth information to obtain the fitting equation.
Optionally, the plurality of pixel points includes any one of the following: all pixel points located on the surface of the foreground object; a plurality of feature pixel points located on the surface of the foreground object; or all pixel points located on the edge where the foreground object meets the background object.
Another aspect of the present disclosure provides an image processing apparatus including: a first acquisition module, configured to acquire an image to be processed, wherein the image to be processed comprises a foreground object and a background object; a second acquisition module, configured to acquire first depth information, wherein the first depth information represents the depths of a plurality of pixel points in the image corresponding to the surface of the foreground object; a third acquisition module, configured to acquire second depth information, wherein the second depth information represents the depths that those pixel points would have on the surface of the background object; and a processing module, configured to amplify, for each of the plurality of pixel points, the difference between its first depth information and its second depth information.
Optionally, the third acquisition module includes: a determination unit, configured to determine a fitting equation for fitting the surface of the background object; and a calculation unit, configured to substitute each pixel point of the plurality of pixel points into the fitting equation and calculate the depth of the surface of the background object at that pixel point, thereby obtaining the second depth information.
Optionally, the determination unit is further configured to: if the surface of the background object is a plane, determine a straight-line equation or a plane equation for fitting it; or, if the surface of the background object is a curved surface, determine a curve equation or a curved-surface equation for fitting it.
Optionally, the determination unit includes: a determining subunit, configured to determine a predetermined number of pixel points located on the surface of the background object; a measuring subunit, configured to measure the depth information corresponding to those pixel points; and a fitting subunit, configured to fit the surface of the background object using the pixel points and the measured depth information to obtain the fitting equation.
Optionally, the plurality of pixel points includes any one of the following: all pixel points located on the surface of the foreground object; a plurality of feature pixel points located on the surface of the foreground object; or all pixel points located on the edge where the foreground object meets the background object.
Another aspect of the present disclosure provides a computer apparatus comprising: one or more processors; and a memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method described above.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions for implementing the method as described above when executed.
Another aspect of the disclosure provides a computer program comprising computer executable instructions for implementing the method as described above when executed.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
fig. 1 schematically illustrates an application scenario of an image processing method and apparatus according to an embodiment of the present disclosure;
FIG. 2 schematically shows a flow chart of an image processing method according to an embodiment of the present disclosure;
FIG. 3A schematically illustrates an effect diagram before enhancing an image according to an embodiment of the present disclosure;
FIG. 3B schematically shows an effect diagram after enhancing an image according to an embodiment of the disclosure;
FIG. 4 schematically illustrates determining a fitting equation for the surface of the background object, according to an embodiment of the disclosure;
fig. 5 schematically shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure;
FIG. 6 schematically shows a block diagram of a third acquisition module according to an embodiment of the disclosure;
FIG. 7 schematically shows a block diagram of a determination unit according to an embodiment of the disclosure; and
FIG. 8 schematically illustrates a block diagram of a computer system suitable for implementing the image processing method and apparatus according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B, and C, etc." is used, such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). Where a convention analogous to "at least one of A, B, or C, etc." is used, such a construction is likewise intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.).
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks. The techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable storage medium having instructions stored thereon for use by or in connection with an instruction execution system.
Embodiments of the present disclosure provide an image processing method that performs image processing by amplifying the difference between foreground depth and background depth, and an image processing apparatus to which the method can be applied. The method includes acquiring an image to be processed, wherein the image to be processed comprises a foreground object and a background object; acquiring first depth information, wherein the first depth information represents the depths of a plurality of pixel points in the image corresponding to the surface of the foreground object; acquiring second depth information, wherein the second depth information represents the depths that those pixel points would have on the surface of the background object; and amplifying, for each of the plurality of pixel points, the difference between its first depth information and its second depth information.
Fig. 1 schematically illustrates an application scenario of an image processing method and apparatus according to an embodiment of the present disclosure. It should be noted that fig. 1 is only an example of a scenario in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
When region extraction is performed on an image with depth variation, if the depth of the background around the detected object varies greatly while the depth of the foreground (also called the foreground object) varies only slightly relative to the surrounding background (also called the background object), the contrast between foreground and background in the image is low, which makes subsequent region extraction/detection very difficult.
In the application scenario shown in fig. 1, the depth of the background portion of the image clearly varies greatly, while the depth of the foreground portion varies only slightly relative to the surrounding background, so the contrast between background and foreground is small and image extraction/detection is difficult.
Fig. 2 schematically shows a flow chart of an image processing method according to an embodiment of the present disclosure.
As shown in fig. 2, the method includes operations S210 to S240, in which:
in operation S210, an image to be processed is acquired, where the image to be processed includes a foreground object and a background object.
It should be noted that, within an image, foreground and background are relative concepts; in general, some objects in the image may be designated as foreground objects and the remaining objects as background objects, according to the actual needs of the user.
For example, a defect/character in an image may be set as a foreground object, and objects other than the defect/character may be set as background objects.
In operation S220, first depth information is obtained, where the first depth information represents depths of a plurality of pixel points in the image corresponding to the surface of the foreground object.
That is, the first depth information characterizes the depth of the surface of the foreground object. Specifically, it can be determined by directly reading the depths of the pixel points corresponding to the surface of the foreground object.
It should be noted that the plurality of pixel points corresponding to the surface of the foreground object may include, but is not limited to, any one of the following: all pixel points located on the surface of the foreground object; a plurality of feature pixel points located on the surface of the foreground object (i.e., the subset of those pixel points that best represents the characteristics of the foreground surface); or all pixel points located on the edge where the foreground object meets the background object. The "all pixel points" and edge-pixel selections are sketched in the code below.
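By way of illustration only, the "all pixel points" and edge-pixel selections might be implemented as follows in Python; the mask name fg_mask, the helper candidate_pixels, and the use of binary erosion to find the edge are assumptions of this sketch rather than anything specified by the disclosure (the feature-pixel variant is application-specific and omitted):

    import numpy as np
    from scipy.ndimage import binary_erosion

    def candidate_pixels(fg_mask, mode="all"):
        """Select which foreground pixel points to process (illustrative only).

        fg_mask -- boolean 2D array marking the foreground object
        mode    -- "all":  every pixel point on the foreground surface
                   "edge": only pixel points where foreground meets background
        Returns (row, col) index arrays of the selected pixel points.
        """
        if mode == "edge":
            # An edge pixel is a foreground pixel that erosion removes,
            # i.e. one that directly touches the background.
            edge = fg_mask & ~binary_erosion(fg_mask)
            return np.nonzero(edge)
        return np.nonzero(fg_mask)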
In operation S230, second depth information is acquired, wherein the second depth information characterizes depths of the plurality of pixel points corresponding to the surface of the background object.
That is, the second depth information characterizes the depth of the surface of the background object; more precisely, it characterizes the depth of the part of that surface that is occluded by the foreground object. Since that part of the background surface is covered by the foreground object, its depth cannot be determined by directly reading the depths of the corresponding pixel points.
Specifically, in embodiments of the present disclosure, the second depth information may be determined by fitting an equation to the surface of the background object. How such an equation is fitted is described in detail in the embodiments below and is not repeated here.
In operation S240, for each of the plurality of pixel points, the difference between the corresponding first depth information and second depth information is amplified.
For example, assume the fitting equation of the surface of the background object is g, the point (x0, y0) lies on the surface of the background object, and the point (x0, y1) lies on the surface of the foreground object. Substituting x0 into equation g yields y0, while y1 can be obtained by measuring or reading the depth of the corresponding pixel point. Here y0 represents the second depth information and y1 the first depth information. Writing y2 = y1 - y0, y2 is the difference between the two pieces of depth information; amplifying y2 enhances the contrast between the foreground object and the background object. Specifically, y2 may be mapped proportionally onto the value range 0 to 255.
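A minimal Python sketch of this amplification step, assuming a dense depth map, a foreground mask, and an already-fitted background model (the names amplify_depth_contrast, depth, fg_mask, and bg_model are illustrative assumptions, not the disclosed implementation):

    import numpy as np

    def amplify_depth_contrast(depth, fg_mask, bg_model):
        """Amplify the foreground/background depth difference (operation S240).

        depth    -- 2D array of measured depths (source of the first depth information y1)
        fg_mask  -- boolean array marking the foreground pixel points
        bg_model -- callable (xs, ys) -> fitted background depths (second depth information y0)
        Returns an 8-bit image in which y2 = y1 - y0 is stretched over 0..255.
        """
        ys, xs = np.nonzero(fg_mask)
        y0 = bg_model(xs, ys)        # depth the background surface would have here
        y1 = depth[ys, xs]           # measured depth of the foreground surface
        y2 = y1 - y0                 # the difference to be amplified

        out = np.zeros(depth.shape, dtype=np.uint8)
        span = float(np.ptp(y2)) or 1.0   # guard against a constant difference
        out[ys, xs] = np.round((y2 - y2.min()) / span * 255).astype(np.uint8)
        return out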
Compared with the prior art, where image region extraction/detection under low foreground/background contrast is difficult and the corresponding regions may even be impossible to extract/detect accurately, embodiments of the present disclosure facilitate region extraction/detection by amplifying the contrast between foreground and background.
For example, the upper image in fig. 3A shows the image before enhancement, and the lower image in fig. 3A shows the result of segmenting it; the segmentation is clearly very poor. The upper image in fig. 3B shows the image after enhancement with the scheme provided by the present disclosure, and the lower image in fig. 3B shows the result of segmenting it; segmentation based on the enhanced image is clearly better than in fig. 3A.
The method of fig. 2 is further described with reference to fig. 4 in conjunction with specific embodiments.
As an alternative embodiment, acquiring the second depth information includes:
determining a fitting equation for fitting the surface of the background object; and
substituting each pixel point of the plurality of pixel points into the fitting equation to calculate the depth of the surface of the background object at that pixel point, thereby obtaining the second depth information.
As mentioned above, the second depth information characterizes the depth of the surface of the background object, more precisely the depth of the part of that surface that is occluded by the foreground object. Since that part of the surface is covered by the foreground object, its depth cannot be determined by directly reading the depths of the corresponding pixel points, nor by direct measurement.
Suppose the cross-section of the real objects corresponding to the foreground and background portions of an image is as shown in fig. 4, where the shaded portion corresponds to the foreground object in the image and the straight portion corresponds to the background object; a straight-line equation or a plane equation can then be fitted to the surface of the background object, as shown in fig. 4.
Taking the fitted straight-line equation as an example (a short code sketch follows this list):
(1) assume the fitting equation of the surface of the background object is y = kx + b;
(2) measure the depths of any two points on the part of the surface of the background object that is not occluded by the foreground object, e.g. (x2, y2) and (x3, y3);
(3) substitute (x2, y2) and (x3, y3) into y = kx + b to solve for k (say k = k1) and b (say b = b1), which determines the fitting equation y = k1·x + b1 for the surface of the background object;
(4) calculate the second depth information: for example, let y1 denote the depth at point 1, where x0 meets the surface of the foreground object, and let y0 denote the depth at point 2, where x0 meets the surface of the background object, point 1 and point 2 corresponding to the same pixel; substituting x0 into y = k1·x + b1 gives y0 = k1·x0 + b1, so (k1·x0 + b1) is one entry of the second depth information;
(5) the depth of the surface of the background object corresponding to every other pixel point can be obtained in the same way as in (4).
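A hedged Python sketch of steps (1) through (4); np.polyfit with degree 1 generalizes the two-point solution of step (3) to a least-squares fit, and the sample coordinates below are made-up values for illustration:

    import numpy as np

    def fit_background_line(samples):
        """Fit y = k*x + b to depth samples measured on the unoccluded
        background surface; with exactly two samples this reproduces the
        two-point solution of step (3).
        Returns the fitted equation as a callable x -> k1*x + b1."""
        xs, ys = np.asarray(samples, dtype=float).T
        k1, b1 = np.polyfit(xs, ys, deg=1)
        return lambda x: k1 * x + b1

    # Steps (2)-(4) with illustrative numbers: two measured points determine
    # the line; evaluating it at x0 yields the depth the background surface
    # would have behind the foreground, i.e. one entry of the second depth
    # information.
    g = fit_background_line([(2.0, 5.1), (8.0, 6.9)])   # (x2, y2), (x3, y3)
    y0 = g(4.5)                                         # y0 = k1*x0 + b1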
According to embodiments of the present disclosure, because the surface of the background object corresponding to the plurality of pixel points is covered by the foreground object and is therefore hard to measure directly, the depth of the background surface at those pixel points can be estimated by means of a fitting equation.
As an alternative embodiment, determining a fitting equation for fitting the surface of the background object includes:
if the surface of the background object is a plane, determining a straight-line equation or a plane equation for fitting it; or
if the surface of the background object is a curved surface, determining a curve equation or a curved-surface equation for fitting it.
That is, when fitting an equation to the surface of the background object, the type of the surface may be determined first and the corresponding equation fitted. If the surface of the background object is a plane, it can be fitted with either of two equations, a straight-line equation or a plane equation; if it is a curved surface, it can likewise be fitted with either of two equations, a curve equation or a curved-surface equation.
As an alternative embodiment, determining a fitting equation for fitting the surface of the background object includes:
determining a predetermined number of pixel points located on the surface of the background object;
measuring the depth information corresponding to the predetermined number of pixel points; and
fitting the surface of the background object using the predetermined number of pixel points and the measured depth information to obtain the fitting equation.
Taking a planar background surface as an example, the straight-line fit proceeds as follows: as shown in fig. 4, assume the line equation is y = kx + b, measure the depths of two points (x2, y2) and (x3, y3), and substitute them into y = kx + b to obtain the fitted straight-line equation for the surface of the background object. The plane fit proceeds as follows: assume the plane equation is Ax + By + Cz + D = 0, measure the depths z1, z2, z3 at three points (x1, y1), (x2, y2), (x3, y3), and substitute them into Ax + By + Cz + D = 0 to obtain the fitted plane equation for the surface of the background object.
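For the plane case, solving Ax + By + Cz + D = 0 with C != 0 is equivalent to fitting the explicit form z = a*x + b*y + c. A minimal sketch under that assumption, with hypothetical sample values (fit_background_plane and the coordinates are inventions of this sketch):

    import numpy as np

    def fit_background_plane(samples):
        """Fit z = a*x + b*y + c (Ax + By + Cz + D = 0 rewritten with C != 0)
        to (x, y, z) samples measured on the unoccluded background surface.
        Three non-collinear samples are the minimum; more are reconciled by
        least squares. Returns a callable (x, y) -> fitted depth z."""
        pts = np.asarray(samples, dtype=float)
        A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
        coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
        a, b, c = coeffs
        return lambda x, y: a * x + b * y + c

    # Three measured points define the plane; evaluating it at an occluded
    # pixel coordinate yields the second depth information at that point.
    plane = fit_background_plane([(0, 0, 4.0), (10, 0, 5.0), (0, 10, 4.5)])
    z0 = plane(3.0, 7.0)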
The procedure for fitting an equation when the surface of the background object is curved is similar to the planar case and is not repeated here.
According to embodiments of the present disclosure, a fitting equation suited to the surface of the background object can be chosen according to the type of surface (planar or curved), after which the depths of a few points are measured and substituted into it, finally yielding the fitting equation for the surface of the background object.
Fig. 5 schematically shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure.
As shown in fig. 5, the image processing apparatus 500 includes a first acquisition module 510, a second acquisition module 520, a third acquisition module 530, and a processing module 540. The image processing apparatus 500 may execute the method described above to achieve the purpose of enhancing the contrast between the foreground and the background in the image, thereby facilitating the image detection.
Specifically: the first acquisition module 510 is configured to acquire an image to be processed, wherein the image to be processed comprises a foreground object and a background object;
the second acquisition module 520 is configured to acquire first depth information, wherein the first depth information represents the depths of a plurality of pixel points in the image corresponding to the surface of the foreground object;
the third acquisition module 530 is configured to acquire second depth information, wherein the second depth information represents the depths that those pixel points would have on the surface of the background object; and
the processing module 540 is configured to amplify, for each of the plurality of pixel points, the difference between its first depth information and its second depth information.
Compared with the prior art, where image region extraction/detection under low foreground/background contrast is difficult and the corresponding regions may even be impossible to extract/detect accurately, embodiments of the present disclosure facilitate region extraction/detection by amplifying the contrast between foreground and background.
As an alternative embodiment, as shown in fig. 6, the third acquisition module 530 includes: a determination unit 610, configured to determine a fitting equation for fitting the surface of the background object; and a calculation unit 620, configured to substitute each pixel point of the plurality of pixel points into the fitting equation and calculate the depth of the surface of the background object at that pixel point, thereby obtaining the second depth information.
According to embodiments of the present disclosure, because the surface of the background object corresponding to the plurality of pixel points is covered by the foreground object and is therefore hard to measure directly, the depth of the background surface at those pixel points can be estimated by means of a fitting equation.
As an alternative embodiment, the determination unit is further configured to: if the surface of the background object is a plane, determine a straight-line equation or a plane equation for fitting it; or, if the surface of the background object is a curved surface, determine a curve equation or a curved-surface equation for fitting it.
As an alternative embodiment, as shown in fig. 7, the determination unit 610 includes: a determining subunit 710, configured to determine a predetermined number of pixel points located on the surface of the background object; a measuring subunit 720, configured to measure the depth information corresponding to the predetermined number of pixel points; and a fitting subunit 730, configured to fit the surface of the background object using the predetermined number of pixel points and the measured depth information to obtain the fitting equation.
According to embodiments of the present disclosure, a fitting equation suited to the surface of the background object can be chosen according to the type of surface (planar or curved), after which the depths of a few points are measured and substituted into it, finally yielding the fitting equation for the surface of the background object.
As an optional embodiment, the plurality of pixel points includes any one of the following: all pixel points located on the surface of the foreground object; a plurality of feature pixel points located on the surface of the foreground object; or all pixel points located on the edge where the foreground object meets the background object.
Any of the modules, units, sub-units, or at least part of the functionality of any of them according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules, units and sub-units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, units, sub-units according to the embodiments of the present disclosure may be implemented at least partially as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of three implementations of software, hardware, and firmware, or in any suitable combination of any of them. Alternatively, one or more of the modules, units, sub-units according to embodiments of the disclosure may be at least partially implemented as computer program modules, which, when executed, may perform the corresponding functions.
For example, any number of the first obtaining module 510, the second obtaining module 520, the third obtaining module 530 and the processing module 540 may be combined and implemented in one module, or any one of them may be split into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of the other modules and implemented in one module. According to an embodiment of the present disclosure, at least one of the first obtaining module 510, the second obtaining module 520, the third obtaining module 530 and the processing module 540 may be implemented at least partially as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented by hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or may be implemented by any one of three implementations of software, hardware and firmware, or any suitable combination of any of them. Alternatively, at least one of the first acquiring module 510, the second acquiring module 520, the third acquiring module 530 and the processing module 540 may be at least partially implemented as a computer program module, which, when executed, may perform a corresponding function.
FIG. 8 schematically illustrates a block diagram of a computer system suitable for implementing the image processing method and apparatus according to an embodiment of the present disclosure. The computer system illustrated in FIG. 8 is only one example and should not impose any limitations on the scope of use or functionality of embodiments of the disclosure.
As shown in fig. 8, computer system 800 includes a processor 810, a computer-readable storage medium 820. The computer system 800 may perform a method according to an embodiment of the disclosure.
In particular, processor 810 may include, for example, a general purpose microprocessor, an instruction set processor and/or related chip set and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 810 may also include on-board memory for caching purposes. Processor 810 may be a single processing unit or a plurality of processing units for performing different actions of a method flow according to embodiments of the disclosure.
Computer-readable storage medium 820, for example, may be a non-volatile computer-readable storage medium, specific examples including, but not limited to: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and so on.
The computer-readable storage medium 820 may include a computer program 821, which computer program 821 may include code/computer-executable instructions that, when executed by the processor 810, cause the processor 810 to perform a method according to an embodiment of the present disclosure, or any variation thereof.
The computer program 821 may be configured with, for example, computer program code comprising computer program modules. For example, in an example embodiment, the code in computer program 821 may include one or more program modules, including, for example, module 821A, module 821B, and so on. It should be noted that the division and number of modules are not fixed; those skilled in the art may use suitable program modules or combinations thereof according to the actual situation. When these program modules are executed by the processor 810, they enable the processor 810 to perform the method according to the embodiments of the present disclosure or any variation thereof.
According to an embodiment of the present disclosure, the processor 810 may perform a method according to an embodiment of the present disclosure or any variation thereof.
According to an embodiment of the present disclosure, at least one of the first obtaining module 510, the second obtaining module 520, the third obtaining module 530 and the processing module 540 may be implemented as a computer program module described with reference to fig. 8, which, when executed by the processor 810, may implement the respective operations described above.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that various combinations and/or combinations of features recited in the various embodiments and/or claims of the present disclosure can be made, even if such combinations or combinations are not expressly recited in the present disclosure. In particular, various combinations and/or combinations of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit or teaching of the present disclosure. All such combinations and/or associations are within the scope of the present disclosure.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.

Claims (10)

1. An image processing method comprising:
acquiring an image to be processed, wherein the image to be processed comprises a foreground object and a background object;
acquiring first depth information, wherein the first depth information represents the depth of a plurality of pixel points in the image corresponding to the surface of the foreground object;
acquiring second depth information, wherein the second depth information represents the depths of the plurality of pixel points corresponding to the surface of the background object and includes the depth of the part of that surface that is occluded by the foreground object; and
amplifying, for each of the plurality of pixel points, the difference between the corresponding first depth information and second depth information.
2. The method of claim 1, wherein the obtaining second depth information comprises:
determining a fitting equation for fitting a surface of the background object; and
substituting each pixel point of the plurality of pixel points into the fitting equation and calculating the depth of the surface of the background object at that pixel point to obtain the second depth information.
3. The method of claim 2, wherein the determining a fitting equation for fitting the surface of the background object comprises:
if the surface of the background object is a plane, determining a linear equation or a plane equation for fitting the surface of the background object; or
if the surface of the background object is a curved surface, determining a curve equation or a curved-surface equation for fitting the surface of the background object.
4. The method of claim 2, wherein the determining a fitting equation for fitting the surface of the background object comprises:
determining a predetermined number of pixel points located on a surface of the background object;
measuring the depth information corresponding to the predetermined number of pixel points; and
fitting the surface of the background object using the predetermined number of pixel points and the measured depth information to obtain the fitting equation.
5. The method of claim 1, wherein the plurality of pixel points comprises any one of the following:
all pixel points located on the surface of the foreground object;
a plurality of feature pixel points located on the surface of the foreground object; and
all pixel points located on the edge where the foreground object meets the background object.
6. An image processing apparatus comprising:
the device comprises a first acquisition module, a second acquisition module and a processing module, wherein the first acquisition module is used for acquiring an image to be processed, and the image to be processed comprises a foreground object and a background object;
the second obtaining module is used for obtaining first depth information, wherein the first depth information represents the depth of a plurality of pixel points in the image corresponding to the surface of the foreground object;
a third obtaining module, configured to obtain second depth information, where the second depth information represents a depth of the multiple pixel points corresponding to the surface of the background object, and the second depth information includes a depth representing a surface of a background object portion blocked by the foreground object; and
a processing module, configured to amplify, for each of the plurality of pixel points, a difference between corresponding depth information in the first depth information and the second depth information.
7. The apparatus of claim 6, wherein the third acquisition module comprises:
a determination unit for determining a fitting equation for fitting the surface of the background object; and
a calculation unit for substituting each pixel point of the plurality of pixel points into the fitting equation and calculating the depth of the surface of the background object at that pixel point to obtain the second depth information.
8. The apparatus of claim 7, wherein the determination unit is further configured to:
if the surface of the background object is a plane, determine a linear equation or a plane equation for fitting the surface of the background object; or
if the surface of the background object is a curved surface, determine a curve equation or a curved-surface equation for fitting the surface of the background object.
9. The apparatus of claim 7, wherein the determining unit comprises:
a determining subunit for determining a predetermined number of pixel points located on a surface of the background object;
a measuring subunit for measuring the depth information corresponding to the predetermined number of pixel points; and
a fitting subunit for fitting the surface of the background object using the predetermined number of pixel points and the measured depth information to obtain the fitting equation.
10. The apparatus of claim 6, wherein the plurality of pixel points comprises any one of the following:
all pixel points located on the surface of the foreground object;
a plurality of feature pixel points located on the surface of the foreground object; and
all pixel points located on the edge where the foreground object meets the background object.
Application CN201811440471.9A, filed 2018-11-28 (priority 2018-11-28): Image processing method and device. Granted as CN109598753B. Status: Active.

Priority Applications (1)

Application Number: CN201811440471.9A; Priority Date: 2018-11-28; Filing Date: 2018-11-28; Title: Image processing method and device


Publications (2)

Publication Number | Publication Date
CN109598753A | 2019-04-09
CN109598753B | 2021-02-19

Family ID: 65959832

Family Applications (1)

Application Number: CN201811440471.9A (Active); Title: Image processing method and device; Priority Date: 2018-11-28; Filing Date: 2018-11-28

Country Status (1)

Country: CN; Document: CN109598753B


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8903119B2 (en) * 2010-10-11 2014-12-02 Texas Instruments Incorporated Use of three-dimensional top-down views for business analytics
US8890936B2 (en) * 2010-10-12 2014-11-18 Texas Instruments Incorporated Utilizing depth information to create 3D tripwires in video

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101504771A (en) * 2009-03-20 2009-08-12 北京航空航天大学 Vision tracing method for non-parameterized model
CN102609934A (en) * 2011-12-22 2012-07-25 中国科学院自动化研究所 Multi-target segmenting and tracking method based on depth image
CN104246822A (en) * 2012-03-22 2014-12-24 高通股份有限公司 Image enhancement
CN103310231A (en) * 2013-06-24 2013-09-18 武汉烽火众智数字技术有限责任公司 Auto logo locating and identifying method
CN104657993A (en) * 2015-02-12 2015-05-27 北京格灵深瞳信息技术有限公司 Lens shielding detection method and device
CN106204617A (en) * 2016-07-21 2016-12-07 大连海事大学 Adapting to image binarization method based on residual image rectangular histogram cyclic shift

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"2D to 3D Image Conversion Based on Classification of Background Depth Profiles";Lin G S等;《PSIVT 2011: Advances in Image and Video Technology》;20111231;全文 *
"2D转3D中基于物体建模的深度图生成";李逸伦等;《北京:中国科技论文在线》;20161227;全文 *
"利用邊界特性改善立體影像的深度估測";邵俊棋;《臺北科技大學資訊工程系研究所學位論文》;20111231;全文 *

Also Published As

Publication Number | Publication Date
CN109598753A | 2019-04-09

Similar Documents

Publication Publication Date Title
CN109300190B (en) Three-dimensional data processing method, device, equipment and storage medium
KR102566727B1 (en) Apparatus, method, computer program. computer readable recording medium for image processing
US10846867B2 (en) Apparatus, method and image processing device for smoke detection in image
US8873835B2 (en) Methods and apparatus for correcting disparity maps using statistical analysis on local neighborhoods
US20180160102A1 (en) Method for 3d reconstruction of an environment of a mobile device, corresponding computer program product and device
US20210192761A1 (en) Image depth estimation method and device, readable storage medium, and electronic apparatus
KR102506264B1 (en) Apparatus, method, computer program. computer readable recording medium for image processing
JP2020038619A (en) Object detection method, device, and storage medium
US20190205231A1 (en) Method and terminal device for testing performance of gpu, and computer readable storage medium
US20150379371A1 (en) Object Detection Utilizing Geometric Information Fused With Image Data
US9990762B2 (en) Image processing apparatus and method
US20170061231A1 (en) Image processing device, image processing method, and computer-readable recording medium
KR102137263B1 (en) Image processing apparatus and method
KR102392060B1 (en) Shading method and system via quad merging
WO2015132817A1 (en) Edge detection device, edge detection method, and program
KR20170052634A (en) Depth map enhancement
US9384381B2 (en) Image processing device for extracting foreground object and image processing method thereof
CN110689134A (en) Method, apparatus, device and storage medium for performing machine learning process
CN109102026B (en) Vehicle image detection method, device and system
US8824778B2 (en) Systems and methods for depth map generation
JP6919764B2 (en) Radar image processing device, radar image processing method, and program
US20150169970A1 (en) Image processing apparatus and image processing method
CN109598753B (en) Image processing method and device
KR20200065590A (en) Method and apparatus for detecting lane center point for accurate road map
CN112102145B (en) Image processing method and device

Legal Events

Code | Description
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant