CN102393954B - Image processing apparatus, image processing method, and image processing program - Google Patents

Image processing apparatus, image processing method, and image processing program

Info

Publication number
CN102393954B
CN102393954B (application CN201110188702.3A)
Authority
CN
China
Prior art keywords
image
images
parts
characteristic
gradation conversion
Prior art date
Legal status
Expired - Fee Related
Application number
CN201110188702.3A
Other languages
Chinese (zh)
Other versions
CN102393954A (en)
Inventor
町田佳士
Current Assignee
Canon Inc
Original Assignee
Canon Inc
Priority date
Filing date
Publication date
Application filed by Canon Inc
Priority to CN201410841045.1A (CN104599234B)
Publication of CN102393954A
Application granted
Publication of CN102393954B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/40Picture signal circuits
    • H04N1/407Control or modification of tonal gradation or of extreme levels, e.g. background level
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/40Image enhancement or restoration using histogram techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • G06T5/92Dynamic range modification of images or parts thereof based on global image properties
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/30Transforming light or analogous information into electric information
    • H04N5/32Transforming X-rays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/32Indexing scheme for image data processing or generation, in general involving image mosaicing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10116X-ray image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Processing (AREA)

Abstract

An image processing apparatus includes an image acquisition unit configured to acquire a plurality of partial images obtained by shooting each of a plurality of shooting ranges into which a shooting region of an object is divided, a feature amount acquisition unit configured to acquire a feature amount of at least one of the partial images, a characteristic acquisition unit configured to acquire a gradation conversion processing characteristic based on the feature amount and the shooting region, and a conversion unit configured to convert, based on the processing characteristic, a gradation of an image of the shooting region of the object obtained by joining the partial images.

Description

Image processing apparatus, radiation imaging system, and image processing method
Technical field
The present invention relates to gradation conversion of images.
Background technology
Gradation conversion is a process that emphasizes a certain pixel-value band in an image relative to the other bands. It is used to match the gray scale of an image to that of a recording or display device, and to increase the number of gray levels allocated to a region of interest. This process prevents unnecessary loss of emphasized detail and loss of shadow detail in the image, and yields an image in which the region of interest is easier to examine.
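As a minimal illustration of this idea (not taken from the patent; the name `window_gradation` and the 8-bit output are our assumptions), the sketch below maps one pixel-value band onto the full output range so that the band of interest receives the most gray levels:

```python
import numpy as np

def window_gradation(img, low, high, out_max=255):
    # Map the pixel-value band [low, high] onto the full output range;
    # values outside the band clip to 0 or out_max, so the band of
    # interest receives the most output gray levels.
    img = np.asarray(img, dtype=np.float64)
    scaled = (img - low) / (high - low) * out_max
    return np.clip(scaled, 0, out_max).astype(np.uint8)
```

Pixels below `low` lose their distinction (they all become 0), which is exactly the trade-off gradation conversion makes in favor of the region of interest.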
On the other hand, there are imaging methods such as large-image shooting and divided shooting, in which the shooting region of a subject is divided into a plurality of shooting ranges and a plurality of partial images is obtained by shooting the region range by range. These methods are used to image large subjects that do not fit in a single shot, and an image of the entire subject can be obtained by joining the partial images obtained by large-image shooting. U.S. Patent No. 7,440,559 discusses techniques for converting the gray scale of an image obtained by large-image shooting, based either on the intensity histogram of the entire image or on a single partial image.
However, if the entire image is used as the analysis target, the processing load increases, which makes accurate analysis more difficult. On the other hand, if the gradation conversion of the large image is based on a single partial image, the other partial images are not taken into account, so the large image as a whole may not have a suitable gray scale.
Summary of the invention
According to an aspect of the present invention, an image processing apparatus includes: an image acquisition unit configured to acquire a plurality of partial images obtained by shooting each of a plurality of shooting ranges into which a shooting region of a subject is divided; a feature amount acquisition unit configured to acquire a feature amount of at least one of the partial images; a characteristic acquisition unit configured to acquire a gradation conversion processing characteristic based on the feature amount and at least one of the shooting region and a region-of-interest pixel value of the subject; and a conversion unit configured to convert, based on the gradation conversion processing characteristic, the gray scale of the image of the shooting region of the subject obtained by joining the partial images.
According to another aspect of the present invention, an image processing method includes the following steps: acquiring a plurality of partial images obtained by shooting each of a plurality of shooting ranges into which a shooting region of a subject is divided; acquiring a feature amount of at least one of the partial images; acquiring a gradation conversion processing characteristic based on the feature amount and at least one of the shooting region and a region-of-interest pixel value of the subject; and converting, based on the gradation conversion processing characteristic, the gray scale of the image of the shooting region of the subject obtained by joining the partial images.
Further features and aspects of the present invention will become apparent from the following detailed description of exemplary embodiments with reference to the attached drawings.
Accompanying drawing explanation
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features, and aspects of the invention and, together with the description, serve to explain the principles of the invention.
Fig. 1 is a structural diagram of a radiation imaging system according to a first exemplary embodiment of the present invention;
Fig. 2 illustrates the structure of an image processing apparatus according to the first exemplary embodiment;
Fig. 3 illustrates the structure of a feature amount acquisition unit according to the first exemplary embodiment;
Fig. 4 illustrates an example of a gradation conversion function generated by the image processing apparatus;
Fig. 5 is a flowchart illustrating the flow of processing performed by the radiation imaging system according to the first exemplary embodiment;
Fig. 6 illustrates the structure of an image processing apparatus according to a fifth exemplary embodiment of the present invention;
Fig. 7 is a flowchart illustrating the flow of processing performed by the radiation imaging system according to the fifth exemplary embodiment.
Embodiment
Various exemplary embodiments, features, and aspects of the present invention will now be described in detail with reference to the drawings.
A radiation imaging system 100 according to the first exemplary embodiment of the present invention will now be described with reference to Figs. 1 to 5.
First, the structure of the radiation imaging system 100 is described with reference to Fig. 1. As shown in Fig. 1, the radiation imaging system 100 includes a radiation source 101, a detector 103, an image processing apparatus 104, and a display unit 105. The radiation source 101 is an X-ray tube that emits radiation such as X-rays, and irradiates a subject 102 with a suitable radiation dose for a predetermined time. The detector 103 is an indirect-type flat panel detector (FPD) having a phosphor that converts the radiation into visible light and an image sensor that receives the visible light and converts it into an electric signal corresponding to the light quantity. An electric signal representing an image of the subject 102 can thus be obtained by the detector 103, which generates the image of the subject by applying known corrections to this signal. Since the transmittance of radiation depends on the material it passes through, the internal structure of the subject can be visualized from the image obtained by the detector 103. A direct-type FPD, which converts X-rays directly into an electric signal, can also be used as the detector 103.
The radiation imaging system 100 also supports large-image shooting, in which the shooting region of the subject 102 is divided into a plurality of shooting ranges and a plurality of partial images is obtained by shooting each range. Large-image shooting is an imaging method in which shots are repeated while the detector 103 is moved along the subject 102 and the irradiation direction of the radiation source 101 is changed by a driving unit (not shown). An image of a subject larger than the effective imaging area of the detector 103 can thus be obtained.
The image processing apparatus 104 generates a large image by joining the partial images obtained by large-image shooting. The term "large image" refers to an image obtained by joining the partial images obtained by large-image shooting. The image processing apparatus 104 also determines a gradation conversion processing characteristic by analyzing the partial images, and converts the gray scale of the large image. The image with the converted gray scale is then displayed on the display unit 105. Since gradation conversion can present image information important for diagnosis in an easily understandable way, an image that is easy to examine can be obtained. Such an image is particularly useful for diagnoses involving detailed study.
As hardware, the image processing apparatus 104 includes a central processing unit (CPU) 106, a random access memory (RAM) 107, a read-only memory (ROM) 108, a hard disk drive (HDD) 109, a network interface (I/F) 112, and a display unit interface 113. A keyboard 110 and a mouse 111 are connected to the image processing apparatus 104. Computer programs for realizing the functional blocks shown in Fig. 2 and for executing the processing described below are stored in the ROM 108 or the HDD 109, and are executed by the CPU 106 in the RAM 107 to realize the functions of the image processing apparatus 104. Although the CPU 106 is shown in Fig. 1 as a single block, the present invention is not limited to this; a plurality of CPUs 106 may be included.
The structure of the image processing apparatus 104 is now described in more detail with reference to Fig. 2. The image processing apparatus 104 includes a partial image acquisition unit 201, an image generation unit 202 for generating the large image, a gradation conversion unit 205, an output unit 206, a shooting region acquisition unit 207, a feature amount acquisition unit 208, a characteristic acquisition unit 209 for acquiring the gradation conversion processing characteristic, a control unit 210, and a storage unit 211.
The image generation unit 202 includes an image correction unit 203, which corrects the pixel values of the partial images acquired by the partial image acquisition unit 201, and an image combining unit 204, which positions the partial images and joins and combines them in their overlapping regions.
As shown in Fig. 3, the feature amount acquisition unit 208 includes a saturated pixel value calculation unit 301, a subject extraction unit 302, a maximum pixel value calculation unit 303, a minimum pixel value calculation unit 304, and a region-of-interest pixel value calculation unit 305. Based on the information about the shooting region acquired by the shooting region acquisition unit 207, these calculation units compute feature amounts by analyzing each partial image, after the images have been acquired by the partial image acquisition unit 201 and corrected by the image correction unit 203. The feature amounts obtained are, for each partial image, the saturated pixel value, the maximum pixel value in the subject, the minimum pixel value in the subject, and the region-of-interest pixel value. Since the feature amounts are computed by analyzing each partial image individually, the analysis is more precise than when the entire large image is analyzed, and the time required for the analysis can be shortened.
The characteristic acquisition unit 209 obtains the gradation conversion processing characteristic from the feature amounts using a method corresponding to the shooting region. This "processing characteristic" is, for example, a function or a look-up table used in gradation conversion. In this exemplary embodiment, a function is used for gradation conversion, so the parameters defining the gradation conversion function are acquired as the processing characteristic. Once the shooting region is determined, the general trend of the pixel values in the imaged object is known, so a gradation conversion suited to that shooting region can be performed. The method for obtaining the gradation conversion processing characteristic is described below.
The control unit 210 controls each of the above functions in a centralized manner.
The information about the region of interest that the feature amount acquisition unit 208 needs in order to obtain the region-of-interest pixel value is associated with the shooting region and stored in the storage unit 211; the feature amount acquisition unit 208 obtains the region-of-interest pixel value by referring to this information. Likewise, the name of the function that the characteristic acquisition unit 209 uses to obtain the gradation conversion parameters from the feature amounts is associated with the shooting region and stored in the storage unit 211. The function for obtaining these parameters can be executed by the characteristic acquisition unit 209, which refers to this information to obtain the gradation conversion function serving as the gradation conversion processing characteristic.
An overview of the processing performed by the characteristic acquisition unit 209 to generate the gradation conversion function is now given with reference to Fig. 4.
First, the saturated pixel value of the whole large image is determined from the saturated pixel values obtained from the partial images as feature amounts, by a method corresponding to the shooting region. Pixels having values equal to or greater than this saturated pixel value are then clipped to the maximum value in the image.
Next, the minimum pixel value of the whole large image is determined from the minimum pixel values of the partial images. The minimum pixel value of the large image need not be the smallest of these values; it may, for example, be the average of the minimum pixel values of the partial images. A suitable method is selected according to the shooting region. The maximum pixel value and the region-of-interest pixel value of the whole large image are determined in the same way, according to the shooting region.
Target values after conversion are then set for the minimum pixel value, the maximum pixel value, and the region-of-interest pixel value. These target values are set based on the number of output gray levels and on the degree to which the region of interest is to be emphasized. The targets can therefore be determined from the shooting region, or set by configuration without considering the shooting region. The gradation conversion function shown in Fig. 4 is thus determined.
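The construction just described — clip at the saturated value, then map the minimum, region-of-interest, and maximum pixel values to their targets — could be realized as a piecewise-linear look-up table. The sketch below is a hypothetical illustration, not the patent's implementation; the function names and the 12-bit input assumption are ours:

```python
import numpy as np

def build_gradation_lut(sat, vmin, v_roi, vmax, t_min, t_roi, t_max, bits_in=12):
    # Build a look-up table over every possible input level.  Levels at or
    # above the saturated value are first clipped, then a piecewise-linear
    # curve maps (vmin, v_roi, vmax) to the targets (t_min, t_roi, t_max).
    levels = np.arange(2 ** bits_in, dtype=np.float64)
    levels = np.minimum(levels, sat)                      # clip saturated pixels
    return np.interp(levels, [vmin, v_roi, vmax], [t_min, t_roi, t_max])

def apply_gradation(img, lut):
    # Index the LUT with integer pixel values to convert the gray scale.
    return lut[img]
```

Raising `t_roi` relative to a straight line between `t_min` and `t_max` steepens the curve around the region-of-interest pixel value, which is how the region of interest receives more output gray levels.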
The flow of processing performed by the radiation imaging system 100 is now described with reference to Fig. 5. First, in step S501, the shooting conditions for the subject are set based on input from the keyboard 110 and the mouse 111. They can also be set by receiving shooting order information from an external information system. The shooting conditions include information about the shooting region, i.e., which region of the subject is to be imaged, and are stored in the storage unit 211 of the image processing apparatus 104.
In step S502, the subject 102 is imaged by driving the radiation source 101 and the detector 103 according to the set shooting conditions. In this exemplary embodiment, if the shooting region of the subject 102 is larger than the area the detector 103 can capture, large-image shooting is performed by dividing the region into a plurality of shooting ranges and shooting repeatedly. Through large-image shooting, the detector 103 produces a plurality of partial images, each obtained by imaging a part of the shooting region of the subject.
In step S503, the partial image acquisition unit 201 of the image processing apparatus 104 acquires the partial images obtained by large-image shooting. As supplementary information, each partial image is associated with information indicating its shot number and with information indicating the total number of partial images obtained by the large-image shooting.
In step S504, the image correction unit 203 of the image generation unit 202 corrects the pixel values of the acquired partial images. As in conventional methods, the pixel values of each whole image can be shifted using the mean values of the overlapping regions so that the pixel values of the overlaps become roughly consistent. Here, "roughly consistent" need not be interpreted strictly; it can mean, for example, that the difference between representative pixel values of the common region — such as the mean pixel value of the common region — is below a predetermined threshold. The correction can also be performed by minimizing the difference between the histograms of the overlapping regions.
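A minimal sketch of the mean-based overlap correction described above (hypothetical names; it assumes the overlap masks are already known from positioning, and applies a simple additive shift):

```python
import numpy as np

def correct_offset(img_a, img_b, mask_a, mask_b):
    # Shift img_b so the mean pixel value of its overlap region matches
    # the mean of the corresponding overlap region in img_a.  mask_a and
    # mask_b are boolean masks selecting the common region in each image.
    diff = img_a[mask_a].mean() - img_b[mask_b].mean()
    return img_b + diff
```

After this shift the two overlap means agree exactly; whether residual differences fall below the predetermined threshold would then be checked separately.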
In step S505, the control unit 210 starts processing in the image combining unit 204 to combine the corrected partial images. The image combining unit 204 positions the captured partial images and combines them to generate the large image. A known combination method is used in which the contribution rate of each image changes stepwise with the distance from the joint in the region where the images overlap. The positioning may be performed by the image combining unit 204 based on image features, or adjusted manually by the user.
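The distance-dependent contribution rate could, for instance, be realized as a linear cross-fade across the overlap rows. This is a hedged sketch under that assumption, not the patent's actual combination method:

```python
import numpy as np

def blend_overlap(strip_a, strip_b):
    # Linearly cross-fade two overlapping strips: the contribution of
    # strip_a falls from 1 to 0 with the row's distance into the overlap,
    # while strip_b's contribution rises symmetrically.
    rows = strip_a.shape[0]
    w = np.linspace(1.0, 0.0, rows)[:, None]   # per-row weight for strip_a
    return w * strip_a + (1.0 - w) * strip_b
```

A gradual cross-fade hides small residual intensity differences at the joint that a hard cut between the two strips would make visible.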
The processing for determining the gradation conversion function in steps S506 to S509 is performed in parallel with the image combining processing started in step S505. This shortens the time from shooting until the large image with the converted gray scale is displayed on the display unit 105. Moreover, the partial images acquired by the partial image acquisition unit 201 can be analyzed one by one while the large-image shooting of the subject 102 is still in progress, shortening the processing time further.
In step S506, the shooting region acquisition unit 207 acquires the shooting region from the shooting condition information stored in the storage unit 211.
In step S507, the feature amount acquisition unit 208 obtains the feature amounts by analyzing the partial images whose pixel values have been corrected. The feature amounts obtained are the maximum pixel value in the subject, the minimum pixel value, the region-of-interest pixel value, and the saturated pixel value. These feature amounts are obtained for each partial image.
The saturated pixel value calculation unit 301 calculates the saturated pixel value of each partial image whose pixel values have been corrected. The saturated pixel value can be calculated using a method based on the sensor characteristics.
The subject extraction unit 302 extracts, from each partial image, the subject that remains after removing the region where X-rays entered the flat panel detector directly without passing through the subject, and the shielded region blocked by a collimator or the like. The extraction can use a method based on histogram analysis and two-dimensional analysis.
The maximum pixel value calculation unit 303 calculates the maximum value within the subject region. A method that computes a representative value from the histogram of the image can be used, but the present invention is not limited to this; any method for calculating a maximum pixel value can be applied.
The minimum pixel value calculation unit 304 calculates the minimum value within the subject region, in parallel with the maximum value calculation performed by the maximum pixel value calculation unit 303. Again, a method that computes a representative value from the histogram of the image can be used.
The region-of-interest pixel value calculation unit 305 calculates the region-of-interest pixel value of the subject. A representative value can be computed from the image histogram, or the region of interest can be extracted from the two-dimensional structure of the image and a statistic of that region taken as the representative value. For each shooting region, the information used to specify the region of interest is stored in the storage unit 211, and the region-of-interest pixel value calculation unit 305 refers to it. This processing is performed for each partial image. Through the processing in step S507, the four types of feature amounts are obtained for every partial image.
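A histogram-based representative value can be as simple as a percentile over the subject region, which is one plausible reading of the representative-value methods mentioned above. The sketch is illustrative only; the percentile choices and names are assumptions:

```python
import numpy as np

def subject_extrema(img, subject_mask, lo_pct=1.0, hi_pct=99.0):
    # Representative minimum/maximum of the subject region taken from the
    # histogram (as percentiles), so a few outlier pixels cannot dominate
    # the gradation parameters derived from them.
    vals = img[subject_mask]
    return np.percentile(vals, lo_pct), np.percentile(vals, hi_pct)
```

Using percentiles rather than the raw extremes is one way such a representative value stays stable against isolated defective or noisy pixels.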
In step S508, the characteristic acquisition unit 209 determines, based on the information about the shooting region, the method for obtaining the gradation conversion parameters from the feature amounts. As described above, the shooting region information and the information specifying the parameter acquisition method are associated with each other and stored in the storage unit 211; the characteristic acquisition unit 209 determines the parameter acquisition method by referring to this information.
First, the characteristic acquisition unit 209 obtains the saturated pixel value of the large image from the saturated pixel values of the partial images. For example, when the full length of the lower limbs is imaged, the minimum of the saturated pixel values obtained from the partial images is selected as the saturated pixel value of the combined image. Selecting the minimum makes it possible to clip pixels with values larger than this calculated value. Since the thickness of the lower limbs does not vary much along their length, the chance of mistakenly treating a normal pixel as saturated is very small, so clipping pixel values above the saturated value reduces the influence of saturated pixels. On the other hand, when the whole spine is imaged, possible methods include: calculating the mean of the saturated pixel values obtained for the selected partial images and using it as the saturated pixel value of the combined image; using the median of the per-image saturated pixel values; or using the maximum. If errors occur in the per-image saturated pixel value calculation, using the mean or median reduces their influence, which enables stable clipping. The mean or median methods are effective when conditions such as the radiation dose and the focus-to-sensor distance do not fluctuate between the captured partial images. When the whole spine is imaged, the subject thickness varies greatly, so using the maximum reduces the risk of mistakenly clipping normal pixels. An optimal saturated pixel value can thus be calculated by selecting the computation method according to the shooting conditions.
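The region-dependent choice of combination rule might be tabulated as follows. The region keys and rule assignments here are invented for illustration; in the embodiment such associations would come from the storage unit 211:

```python
import numpy as np

# Hypothetical mapping from shooting region to a combination rule for
# per-partial-image saturated pixel values (illustrative keys only).
COMBINE_SATURATED = {
    "lower_limb_full": lambda v: float(np.min(v)),     # thickness varies little
    "whole_spine":     lambda v: float(np.max(v)),     # thickness varies greatly
    "stable_dose":     lambda v: float(np.median(v)),  # robust to per-image errors
}

def combined_saturated_value(per_image_values, region):
    # Fall back to the mean when no rule is registered for the region.
    rule = COMBINE_SATURATED.get(region, lambda v: float(np.mean(v)))
    return rule(per_image_values)
```

The same dispatch pattern applies to the maximum, minimum, and region-of-interest values discussed next; only the registered rules differ.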
The characteristic acquisition unit 209 also obtains the maximum pixel value of the large image from the subject maximum values of the partial images. When a region whose thickness varies greatly, such as the whole spine, is imaged, the unit takes the largest of the per-image maximum pixel values as the maximum pixel value, and likewise the smallest of the per-image minimum pixel values as the minimum pixel value. For the whole spine, the difference between the thick and thin parts of the subject is large, so a gray scale reflecting this variation can be realized. On the other hand, when a region with little variation in subject thickness, such as the full length of the lower limbs, is imaged, the mean or median of the per-image maximum pixel values, and the mean or median of the per-image minimum pixel values, are calculated. If the mean or median is used, then even when errors occur in the maximum/minimum calculation for some partial images, their influence is reduced and stable gray-scale processing can be performed. By selecting the computation method according to the shooting region, subject maximum and minimum pixel values suited to the diagnostic purpose can be calculated.
The characteristic acquisition unit 209 can also select the median of the region-of-interest pixel values of the partial images and set it as the region-of-interest pixel value of the combined image. By selecting the median, even if an error occurs in the region-of-interest pixel value of some partial image, its influence is reduced. Alternatively, the mean of several selected region-of-interest pixel values can be used. By using a value that reflects the region-of-interest pixel values of multiple images, the influence of errors in the partial image analysis is likewise reduced. For example, when the full length of the lower limbs is imaged and the region of interest is the bone, the region-of-interest pixel values of the partial images do not differ greatly. In this case, setting the region-of-interest value of the large image by computing the median or mean makes it possible to apply suitable gradation conversion to the bone region of interest while reducing the influence of errors in the partial image analysis.
When the region-of-interest pixel values differ greatly, a method that selects the value from one partial image is used. For example, when the whole spine is imaged and the region of interest is the lumbar vertebrae, the region-of-interest pixel value of images mainly capturing the thoracic vertebrae may differ widely from that of images mainly capturing the lumbar vertebrae, depending on the partial image. In this case, the region-of-interest pixel value calculated from the image obtained by imaging the lumbar vertebrae is selected. Conversely, if the region of interest is the thoracic vertebrae, the gradation conversion function is obtained using the value from the image mainly capturing the thoracic vertebrae. Furthermore, when the whole body is imaged and the region of interest is the bone, the per-image region-of-interest pixel values often include one that differs greatly from the others. In this case, to exclude outliers from the partial image analysis, the maximum and minimum of the per-image region-of-interest pixel values can be discarded and the region of interest set from the mean of the remaining values. By selecting the computation method according to the shooting region, a region-of-interest pixel value suited to the diagnostic purpose can be calculated.
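The outlier-exclusion method described for whole-body imaging — discard the largest and smallest per-image values, then average the rest — can be sketched as a trimmed mean (hypothetical helper name):

```python
import numpy as np

def roi_value_trimmed(per_image_roi_values):
    # Discard the single largest and smallest per-image region-of-interest
    # values, then average the rest, so one failed per-image analysis
    # cannot skew the result.  Falls back to a plain mean when there are
    # too few values to trim.
    vals = np.sort(np.asarray(per_image_roi_values, dtype=np.float64))
    if vals.size <= 2:
        return float(vals.mean())
    return float(vals[1:-1].mean())
```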
In step S509, the characteristic acquisition unit 209 obtains the gradation conversion function based on the acquired parameters. Since this processing is the same as that described with reference to Fig. 4, its description is omitted here.
In step S510, the gradation conversion unit 205 performs the gradation conversion of the large image. When the partial-image combining processing started in step S505 is completed, the control unit 210 waits for completion of the processing for obtaining the gradation conversion parameters performed in step S509, and then instructs the gradation conversion unit 205 to start the gradation conversion of the large image. Based on this instruction, the gradation conversion unit 205 converts the gray scale of the large image according to the gradation conversion function.
In step S511, the output unit 206 outputs the image with the converted gray scale to the display unit 105, and the display unit 105 displays the image.
As described above, by analyzing each partial image obtained in large-image capturing to calculate the feature quantities, the analysis accuracy can be increased and the processing time shortened compared with analyzing the combined image. Furthermore, by combining the feature quantities with a method based on the imaging region to produce the gradation conversion function, a gradation conversion of the large image suited to each imaging region can be realized.
In the first exemplary embodiment, feature quantities of four types are calculated for each partial image. In a second exemplary embodiment of the present invention, however, only the feature quantity needed for obtaining the gradation conversion function, namely the region-of-interest pixel value, is acquired, and some of the processing for obtaining feature quantities from the partial images is omitted. Descriptions of components and processing identical to those of the first exemplary embodiment are omitted below; only the features distinctive to this exemplary embodiment are described.
The feature quantity acquiring unit 208 performs processing for determining, based on the information related to the imaging region, the types of feature quantities to be acquired from each image. For example, when imaging the full length of the lower limbs, if the region of interest is the bone portion, no large difference exists among the region-of-interest pixel values of the partial images. Accordingly, information specifying any one of the partial images showing the bone portion is associated with the information related to the imaging region and stored in the storage unit 211. The feature quantity acquiring unit 208 refers to this information and acquires the region-of-interest pixel value only from the specified partial image.
As another example, when imaging the whole spine, as described above, if the region of interest is the lumbar vertebrae, the region-of-interest pixel value of the image obtained by mainly imaging the thoracic vertebrae may differ greatly from that of the image obtained by mainly imaging the lumbar vertebrae. Therefore, if the lumbar vertebrae are set in advance as the region of interest, the region-of-interest pixel value is acquired only from the partial image obtained by imaging the lumbar vertebrae.
As yet another example, if a plurality of regions of interest, such as the lumbar vertebrae and the thoracic vertebrae, exist in the large image, and if the width of the pixel value distribution in each region of interest is large, it may be impossible to emphasize all of the regions of interest with a single image, or the image may become unnatural. Therefore, by acquiring the region-of-interest pixel value only from the partial image specified based on the imaging region and performing the gradation conversion based on this value, an unnatural gray scale can be prevented.
Thus, by performing the analysis processing only on the partial image specified based on the information related to the imaging region, the need to analyze all of the partial images is eliminated. Therefore, the processing time can be shortened and the processing load can be reduced.
Furthermore, in this exemplary embodiment, the feature quantity acquiring unit 208 extracts feature quantities from each partial image only in the regions that do not overlap other partial images. This is because, in large-image capturing, the imaging positions are usually adjusted so that the regions containing a large amount of diagnostically needed information do not overlap, so the analysis can omit the overlapping regions. In addition, in the exemplary embodiments described above, since the combining processing and the analysis processing of the partial images are performed in parallel, the pixel values of the overlapping regions change according to the combining processing, and the analysis accuracy may therefore deteriorate. By setting only the non-overlapping regions as the analysis target, the accuracy of the analysis processing for obtaining the gradation conversion function can be increased, and the processing time can be shortened.
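A minimal sketch of restricting feature extraction to the non-overlapping rows of a partial image, assuming the overlap sizes are known from the imaging geometry (the function names and the row-based layout are assumptions for illustration):

```python
def non_overlap_rows(image, overlap_top, overlap_bottom):
    """Keep only the rows of a partial image not shared with the
    neighbouring partial images, so feature extraction is unaffected
    by the blending done in the combining step. `image` is a list of
    rows of pixel values."""
    end = len(image) - overlap_bottom if overlap_bottom else len(image)
    return image[overlap_top:end]

def max_pixel(rows):
    """A simple example feature: the maximum pixel value in the region."""
    return max(max(r) for r in rows)
```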
In a third exemplary embodiment of the present invention, when imaging, for example, the whole spine, the regions of interest in one large image are divided into predetermined groups, such as the lumbar vertebrae and the thoracic vertebrae, and a gradation conversion function is obtained for each group. The grouping settings are associated with the information related to the imaging region and stored in advance in the storage unit 211. For example, when imaging the whole spine, since the pixel value distributions of the lumbar vertebrae and the thoracic vertebrae differ greatly, the lumbar vertebrae and the thoracic vertebrae are set as independent groups. Conversely, when imaging the full length of the lower limbs with the bone portion set as the region of interest, since the differences among the region-of-interest pixel values obtained from the plurality of partial images are not large, these regions of interest are set as the same group. The grouping may also be set manually for each imaging region using the keyboard 110 and the mouse 111. The feature quantity acquiring unit 208 refers to this information to acquire, from each partial image, the feature quantities for obtaining the gradation conversion function corresponding to each group. As in the first exemplary embodiment, the types of feature quantities include the saturated pixel value, the minimum pixel value of the subject, the maximum pixel value of the subject, and the region-of-interest pixel value. The characteristic acquiring unit 209 obtains the gradation conversion functions based on the acquired feature quantities.
A specific example of this processing is now described. For a large image of the whole spine and a large image of the full length of the lower limbs, the minimum and maximum pixel values of the large image are processed in the same manner as in the first exemplary embodiment. As for the region-of-interest pixel value, when imaging the whole spine, a gradation conversion function emphasizing the region-of-interest pixel value obtained from the lumbar vertebrae and a gradation conversion function emphasizing the region-of-interest pixel value obtained from the thoracic vertebrae are produced. On the other hand, when imaging the full length of the lower limbs, the pixel value of the bone portion is obtained from the partial images, and a gradation conversion function for emphasizing that pixel value is obtained.
Thus, by obtaining a gradation conversion function for each group, a gradation conversion function can be obtained for each set of regions of interest having similar pixel value distributions. Compared with producing a gradation conversion function for each partial image, when regions of interest with greatly different pixel value distributions exist among the partial images, an independent gradation conversion function corresponding to each region of interest is produced, so images that appropriately emphasize the different regions of interest can be generated. Furthermore, when regions of interest with similar pixel value distributions span a plurality of partial images, these regions of interest can be emphasized jointly with a single gradation conversion function, so the load of the function production processing performed by the characteristic acquiring unit 209 and of the gradation conversion unit 205 can be reduced. In addition, instead of grouping the regions of interest, the partial images themselves may be grouped.
In a fourth exemplary embodiment of the present invention, a plurality of gradation conversion functions are obtained based on the imaging region.
The large image obtained by large-image capturing can be regarded as having a wide image area containing a plurality of regions of interest, and because the image area is large, the dynamic range is also large. Therefore, a plurality of gradation conversion functions need to be prepared and used according to the diagnostic purpose.
Therefore, unlike the second exemplary embodiment but similarly to the first, in the fourth exemplary embodiment the saturated pixel value, the maximum and minimum pixel values of the subject, and the region-of-interest pixel value are obtained from each partial image, and a plurality of gradation conversion functions are obtained. In the fourth exemplary embodiment, regardless of the imaging region, one gradation conversion function is produced per region-of-interest pixel value, each using the value obtained from a partial image as the region-of-interest pixel value of the whole large image. In this case, since the types of feature quantities to be acquired need not be changed based on the information related to the imaging region, the apparatus configuration and the processing can be simplified.
In addition, the mean or median of the region-of-interest pixel values obtained from the partial images may be determined and used to obtain a gradation conversion function. When large-image capturing is selected, the user may wish to check the tendency of the whole large image. By using an image that shows the tendency of the whole image, a large image satisfying this requirement can be obtained.
In this exemplary embodiment, the gradation conversion processing by the gradation conversion unit 205 is omitted, and the large image before gradation conversion is displayed on the display unit 105. In addition, the produced gradation conversion functions are displayed in turn as options, in the order in which they are generated. The control unit 210 performs control so that the large image undergoes different gradation conversions based on selections input via the keyboard 110 and the mouse 111.
Thus, since the production of the gradation conversion functions is executed in parallel with the acquisition of the large image, the gradation conversion function options can be presented to the user more quickly than when production of the functions starts only after the large image has been generated.
In a fifth exemplary embodiment of the present invention, unlike the exemplary embodiments described above, the gradation conversion parameters are obtained based on the magnitude of the region-of-interest pixel values in the partial images, without using the information related to the imaging region.
The configuration of an image processing apparatus 600 is now described with reference to Fig. 6. It differs from the exemplary embodiments described above in that the image processing apparatus 600 has a region-of-interest specifying unit 608 and a judging unit 609.
Next, the flow of processing performed by the radiation imaging system 100 is described with reference to Fig. 7.
In step S701, the radiation source 101 emits radiation based on an external instruction. The detector 103 detects the radiation that has passed through the subject 102 and generates a radiation image of the subject.
In step S702, the partial image acquiring unit 601 of the image processing apparatus 600 acquires the radiation images from the detector 103 via the I/F 112. The control unit 611 sends the acquired images to the large image generation unit 602 and stores them in the storage unit 612.
In step S703, the image correction unit 603 performs correction processing in which the pixel values of the images are changed so that the pixel value levels of the regions common to the images become consistent. Since this processing eliminates the differences in pixel value level between the images, the subsequent processing of combining the common regions can make the pixel values of regions of the whole large image that should have the same pixel value substantially identical. Furthermore, since only the overlapping regions are the targets of the combining processing, there is the advantage that the feature quantities obtained from each partial image after this pixel value correction but before the combining processing remain substantially unchanged relative to the values that would be acquired from the large image after combining.
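The level correction of step S703 can be sketched as a simple offset that matches the mean pixel values of the shared overlap; this 1-D version with hypothetical function names only illustrates the idea (the patent does not specify the correction model, and any gain term is omitted here):

```python
def level_offset(img_a, img_b, overlap):
    """Offset to add to img_b so that the mean pixel value of its
    overlap with img_a matches img_a's; a minimal stand-in for the
    pixel-value-level correction of step S703."""
    mean_a = sum(img_a[-overlap:]) / overlap
    mean_b = sum(img_b[:overlap]) / overlap
    return mean_a - mean_b

def apply_offset(img, offset):
    """Shift every pixel of the partial image by the offset."""
    return [v + offset for v in img]
```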
In step S704, the control unit 611 starts, in parallel, the combining processing performed by the image combining unit 604 and the region-of-interest specifying processing performed by the region-of-interest specifying unit 608. These processes may be started simultaneously, or one may be started slightly before or after the other.
In step S705, the image combining unit 604 combines the common regions of the plurality of corrected images. In parallel with this processing, in step S706, the region-of-interest specifying unit 608 specifies the region of interest.
In step S707, the judging unit 609 calculates a value representing the width of the pixel values in the region of interest. In this exemplary embodiment, the judging unit 609 extracts the minimum and maximum pixel values in the region of interest, calculates the difference between them, and takes this difference as the pixel value width.
In step S708, the judging unit 609 judges whether the value representing the pixel value width is included in a predetermined range. The control unit 611 sets this predetermined range as the range from 0 to a threshold value stored in advance in the storage unit 612. If the judging unit 609 judges that the value of the pixel value width is included in the predetermined range (NO in step S708), the processing proceeds to step S709. In step S709, the region-of-interest specifying unit 608 sets the whole image as the region to be emphasized, and the function acquiring unit 610 obtains a function for converting the gray scale of the set region.
On the other hand, if the judging unit 609 judges in step S708 that the value of the pixel value width exceeds the predetermined range (YES in step S708), the processing proceeds to step S710. In step S710, the region-of-interest specifying unit 608 divides the image into a plurality of regions. This processing is performed by setting the number of regions with reference to a table that associates values of the pixel value width with numbers of regions; this table is stored in the storage unit 612. Alternatively, the processing can be performed by determining the pixel value width to be emphasized by each gradation conversion and dividing the image into regions suited to that width. Then, in step S710, the function acquiring unit 610 obtains a gradation conversion function corresponding to each region.
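The width test of steps S707–S710 and the alternative splitting rule can be sketched as follows; the per-region width parameter and the function names are assumptions (the patent equally allows a stored width-to-count table):

```python
def pixel_value_width(roi_pixels):
    """Width of the ROI's pixel values (step S707)."""
    return max(roi_pixels) - min(roi_pixels)

def plan_region_count(roi_pixels, threshold, width_per_region):
    """One region when the width fits the 0..threshold range
    (step S709); otherwise split so that each region spans at most
    `width_per_region` (the alternative rule of step S710)."""
    w = pixel_value_width(roi_pixels)
    if w <= threshold:
        return 1
    return -(-w // width_per_region)  # ceiling division
```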
If three or more regions are set by the region-of-interest specifying unit 608, regions with similar pixel value distributions can be grouped, as in the third exemplary embodiment, and the gradation conversion can be performed for each group. In this case, the region-of-interest specifying unit 608 serves as a grouping processing unit that groups regions having similar pixel value distributions; a known clustering technique can be used for this grouping. Therefore, the gray scale of the regions of interest can be converted appropriately while preventing the complexity caused by generating and displaying a separate image for each gradation conversion. On the other hand, when no grouping is performed, parameters for converting the gray scale of each region separately can be obtained. In this case, by generating and displaying an image for each separately converted region, the diagnostician can attend to each region.
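As one possible stand-in for the "known clustering technique" mentioned above, regions can be grouped by chaining representative pixel values that lie within a tolerance of each other in sorted order; the function name, the single representative value per region, and the tolerance rule are all assumptions for illustration:

```python
def group_regions(region_means, tol):
    """Group regions whose representative pixel values are within
    `tol` of their neighbour in sorted order. Returns groups of
    region indices, so each group can share one gradation conversion."""
    groups = []
    for idx, m in sorted(enumerate(region_means), key=lambda p: p[1]):
        if groups and m - groups[-1][-1][1] <= tol:
            groups[-1].append((idx, m))
        else:
            groups.append([(idx, m)])
    return [[i for i, _ in g] for g in groups]
```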
In step S711, the gradation conversion unit 605 converts the gray scale of the image based on the produced gradation conversion functions. If a plurality of gradation conversion functions are obtained in step S710, the same number of images with converted gray scale as functions is obtained. The gradation conversion unit 605 may perform the processing of applying these functions to the large image as soon as the functions are acquired. Alternatively, by performing the gradation conversions in turn based on user input, an easy-to-use system that avoids executing unnecessary processing can be provided.
In step S712, the output unit 606 displays the image with the corrected gray scale on the display unit 105. The diagnostician can make a diagnosis by checking the displayed image.
Therefore, the parameters to be used in the gradation conversion are obtained from the plurality of images corrected by the image correction unit 603, which makes it possible to obtain gradation conversion parameters substantially identical to those that would be obtained from the large image. Furthermore, since the images serving as the analysis target are divided, the analysis processing load is no larger than when they are analyzed as a single image. In addition, the combining processing of the images is performed in parallel with the acquisition of the gradation conversion parameters, which shortens the processing time. Therefore, a large image whose region of interest has an appropriately converted gray scale can be output more quickly after capturing, while the processing time is also shortened.
In addition, by specifying the region of interest and dividing it into at least one region, the gray scale can be converted while excluding the region outside the subject and less important regions within the subject. The gray scale can also be converted so as to emphasize the regions that the diagnostician should attend to. In particular, when the pixel value width of the region of interest is large, since the region of interest is divided into a plurality of regions and the image is generated after the emphasis processing, an image with an emphasized region of interest can be generated without damaging the gray-scale balance of the whole image. Moreover, compared with using the information related to the imaging region, a gray scale that more closely reflects the differences within the subject can be realized.
By performing the processing described in the first to third exemplary embodiments without using the information related to the imaging region, an appropriate gray scale reflecting the individual differences of each subject can be realized based on the information related to the region-of-interest pixel values. Conversely, such processing can also be performed using table information associated with the imaging region; in this case, a stable gray scale can be realized for each region. Furthermore, the information related to the imaging region and the information related to the region-of-interest pixel values can be used together.
According to the exemplary embodiments described above, by obtaining the feature quantities from the partial images and obtaining the gradation conversion function of the combined image, the time needed for the analysis processing can be shortened further compared with analyzing the whole large image. Therefore, even though the time allowed until the captured image is displayed is limited, good accuracy and a fast processing time can be realized simultaneously. Furthermore, by combining the feature quantities of the partial images based on the imaging region, accuracy and processing time comparable to those of analyzing the whole image can be realized. In addition, since the imaging region is associated with the combination method, a high-precision gradation conversion can be obtained in a short time merely by having the user specify the imaging region.
In addition, by performing the partial-image joining processing in parallel with the partial-image analysis processing for obtaining the gradation conversion, the processing time can be shortened. Since the joining processing includes analysis of the images, the processing load of correcting the pixel value levels between partial images and of aligning the pixels is high, and this processing takes time. From the viewpoint of preventing misdiagnosis, when the pixel value level correction and the alignment are performed manually to improve accuracy, even more time is needed than with automatic joining alone. On the other hand, the analysis processing for obtaining the gradation conversion processing characteristic also requires image analysis and therefore takes time; even if the processing characteristic can be set manually without analyzing the image, this still requires about as much time as automatic processing. In this exemplary embodiment, by performing the time-consuming joining processing and the gradation conversion characteristic acquisition processing in parallel, the time lag between capturing and display of the large image can be substantially reduced. Furthermore, each time a partial image of the subject is captured, the analysis of the captured partial image can be performed while the next partial image is being captured. Since time is needed for moving the X-ray tube and the flat panel detector between partial captures, performing the analysis processing in parallel with the divided captures can greatly shorten the display delay.
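The parallel joining/analysis flow described above can be sketched with two threads, assuming the partial images are already captured; the stitch and analysis bodies are trivial stand-ins for the real processing, and both only read the shared input, so no locking is needed:

```python
import threading

def stitch_and_analyze(partials):
    """Run a stand-in stitching step in one thread while the
    per-partial-image analysis runs in another, mirroring the
    parallel joining and gradation-analysis processing."""
    results = {}

    def stitch():
        # Placeholder for alignment + level correction + blending.
        results["stitched"] = [px for img in partials for px in img]

    def analyze():
        # Placeholder feature: one ROI value per partial image.
        results["roi_values"] = [max(img) for img in partials]

    threads = [threading.Thread(target=stitch),
               threading.Thread(target=analyze)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```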
By applying the above-described processing for obtaining the gradation conversion processing characteristic of the combined image to radiation images and X-ray images, the gray scale of the combined image for diagnosis can be set appropriately. In particular, since X-ray images are displayed in monochrome, an image whose gray scale has been appropriately adjusted enables the diagnostician to diagnose efficiently.
In the exemplary embodiments described above, the feature quantities are obtained by analyzing the pixel values of the partial images corrected by the transformation of the image correction unit 203. However, the gradation conversion parameter acquisition processing and the transformation correction processing can also be performed in parallel. In the production of the gradation conversion function by the characteristic acquiring unit 209, in addition to the pixel values of the feature quantities obtained by the feature quantity acquiring unit 208, the pixel value transformation amounts applied to the partial images by the image correction unit 203 can be used.
The gradation conversion need not necessarily be performed based on a function; a look-up table for gradation conversion may be used instead. In this case, the characteristic acquiring unit 209 obtains coefficients for converting each partial image in the large image, instead of parameters determining the form of a function. Furthermore, the gradation conversion processing characteristic can also be determined by fixing the curve shape of the gradation conversion and shifting this curve based on the feature quantities of the large image; for example, the curve may be shifted using the region-of-interest pixel value as a reference.
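A minimal sketch of the look-up-table variant: a fixed linear curve is shifted so that the region-of-interest pixel value lands at mid-gray. The curve shape and 8-bit range are assumptions for illustration; the text only fixes the idea of shifting a stored curve by a feature quantity:

```python
def build_lut(roi_value, max_in=255):
    """Build a LUT from a fixed linear curve shifted so that
    `roi_value` maps to mid-gray, clamped to [0, max_in]."""
    shift = roi_value - max_in // 2
    return [min(max(v - shift, 0), max_in) for v in range(max_in + 1)]

def apply_lut(pixels, lut):
    """Convert the gray scale by table lookup instead of a function."""
    return [lut[v] for v in pixels]
```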
The present invention can be embodied as an image processing system in which the processing performed by the image processing apparatus is distributed over a plurality of apparatuses. It can also be realized by distributing processing organized as a single functional block over a plurality of functional blocks. Furthermore, the present invention can be realized by an imaging apparatus in which the functions of the image processing apparatus and the display apparatus of the exemplary embodiments described above are included in the imaging apparatus or the detector. Alternatively, the present invention can be realized by assembling the processing organized as functional blocks into one or more hardware circuits.
The present invention also covers the case where, for example, an operating system (OS) running on a computing device performs part or all of the actual processing, and the functions of the exemplary embodiments described above are realized by that processing. Furthermore, the computer may include a plurality of CPUs; in this case, the present invention can be realized by distributing the processing over the plurality of CPUs. In addition, since the program code read from a storage medium itself realizes the functions of the exemplary embodiments, the storage medium storing the program or program code also constitutes the present invention.
The above description of the exemplary embodiments is one example of the image processing apparatus; the present invention is not limited to this description.
Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiments, and by a method whose steps are performed by the computer of the system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiments. For this purpose, the program is provided to the computer, for example, via a network or from recording media of various types serving as the memory device (e.g., a computer-readable medium).
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.

Claims (19)

1. An image processing apparatus for radiation images, comprising:
a condition acquiring unit configured to acquire a condition for performing radiation imaging, the condition including information representing an anatomic region to be imaged;
an image acquiring unit configured to acquire partial images obtained by respectively capturing subregions of a whole imaging area, wherein the whole imaging area corresponds to the acquired information representing the anatomic region to be imaged;
a correcting unit configured to obtain a correction value for changing pixel values of at least one of the partial images so as to reduce differences between pixel values of common regions of the partial images;
a feature quantity acquiring unit configured to acquire a feature quantity of at least one of the partial images according to the acquired information representing the anatomic region to be imaged, wherein the acquired feature quantity is used for gradation conversion processing;
a characteristic acquiring unit configured to obtain a gradation conversion processing characteristic based on the acquired correction value and the acquired feature quantity; and
an image processing unit configured to perform image processing, based on the acquired correction value and the gradation conversion processing characteristic, on an image obtained by joining the partial images so as to obtain a processed image, wherein a gray scale of the processed image is converted.
2. The image processing apparatus according to claim 1, further comprising a determining unit configured to determine, according to the acquired information representing the anatomic region, the feature quantity to be acquired from each of the partial images and the method for obtaining the gradation conversion processing characteristic from the acquired feature quantities.
3. The image processing apparatus according to claim 1, further comprising a determining unit configured to determine from which partial image a pixel value of a region of interest of a subject is to be acquired as the feature quantity.
4. The image processing apparatus according to claim 1, further comprising a determining unit configured to determine the processing for obtaining the gradation conversion processing characteristic based on a mean or a median of pixel values of regions of interest of a subject, wherein the regions of interest are obtained from the partial images.
5. The image processing apparatus according to claim 1, wherein the feature quantity acquiring unit is configured to acquire the feature quantity from a region of a partial image that does not overlap the other partial images.
6. The image processing apparatus according to claim 1, wherein the correcting unit is configured to change the pixel values of at least one of the partial images based on the acquired correction value so that the pixel values representing the common regions of the partial images become substantially consistent,
and wherein the characteristic acquiring unit is configured to obtain the gradation conversion processing characteristic based on at least one of the partial images.
7. The image processing apparatus according to claim 1, wherein the characteristic acquiring unit is configured to obtain a gradation conversion processing characteristic corresponding to each partial image,
wherein the image processing apparatus further comprises a control unit configured to display the gradation conversion processing characteristics as options together with the image obtained by joining the partial images, and
wherein the control unit is configured to, according to an input for selecting a gradation conversion processing characteristic, perform the gradation conversion of the selected processing characteristic on the image obtained by joining the partial images, and to display the processed image.
8. The image processing apparatus according to claim 1, wherein the feature quantity acquiring unit is configured to acquire, as the feature quantity, at least one of a maximum pixel value of a subject region shown in the partial images, a minimum pixel value thereof, and a pixel value of a region of interest.
9. The image processing apparatus according to claim 1, wherein the acquired gradation conversion processing characteristic is information representing one of a function used in the gradation conversion and a look-up table used in the gradation conversion.
10. The image processing apparatus according to claim 1, further comprising:
a generation unit configured to generate an image by joining the partial images; and
a control unit configured to control the generation unit and the characteristic acquiring unit,
wherein the characteristic acquiring unit is configured to analyze each of the partial images to obtain the gradation conversion processing characteristic, and
wherein the control unit is configured to control the characteristic acquiring unit so that the analysis is performed in parallel with the generation of the combined image by the generation unit.
11. The image processing apparatus according to claim 1, further comprising:
a processing unit configured to divide regions of interest in the generated combined image into a plurality of groups to be emphasized based on the same gradation conversion, according to a width of a pixel value distribution of the regions of interest,
wherein the characteristic acquiring unit is configured to obtain a gradation conversion processing characteristic corresponding to each of the groups.
12. The image processing apparatus according to claim 1, wherein the feature quantity acquiring unit is configured to acquire a plurality of types of feature quantities from at least one of the partial images.
13. The image processing apparatus according to claim 12, further comprising a determination unit configured to determine, based on the acquired information representing the anatomical region, which type of feature quantity acquired from which partial image is to be used to acquire the gradation conversion processing characteristic,
wherein the characteristic acquisition unit is configured to acquire the gradation conversion processing characteristic based on the determination by the determination unit.
14. A radiation imaging system, comprising:
the image processing apparatus according to claim 1;
a radiation source configured to emit radiation;
a detector configured to detect radiation that has been emitted from the radiation source and has passed through a subject, and to convert the detected radiation into an electric signal representing an image of the subject; and
a display unit configured to display the converted image.
15. An image processing method for a radiation image, comprising the following steps:
a condition acquisition step of acquiring a condition for performing radiation imaging, the condition including information representing an anatomical region to be imaged;
an image acquisition step of acquiring partial images obtained by respectively capturing subregions of an entire imaging area, the entire imaging area corresponding to the acquired information representing the anatomical region to be imaged;
a correction step of acquiring a correction value for changing a pixel value of at least one of the partial images so as to reduce a difference between pixel values of a common region of the partial images;
a feature quantity acquisition step of acquiring a feature quantity of at least one of the partial images according to the acquired information representing the anatomical region to be imaged, the acquired feature quantity being used for gradation conversion processing;
a characteristic acquisition step of acquiring a gradation conversion processing characteristic based on the acquired correction value and the acquired feature quantity; and
an image processing step of performing image processing on an image obtained by joining the partial images, based on the acquired correction value and the gradation conversion processing characteristic, to obtain a processed image whose gradation has been converted.
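The correction and joining steps of claim 15 can be sketched end to end. In this hypothetical example the correction value is simply the mean pixel-value difference in the overlapping region of two vertically adjacent partial images; the blending rule and all names are assumptions, not the patented algorithm.

```python
import numpy as np

def acquire_correction_value(upper, lower, overlap_rows):
    """Offset that reduces the pixel-value difference in the common
    (overlapping) region of two partial images (mean-difference sketch)."""
    return float(upper[-overlap_rows:].mean() - lower[:overlap_rows].mean())

def join_with_correction(upper, lower, overlap_rows):
    offset = acquire_correction_value(upper, lower, overlap_rows)
    lower = lower + offset                               # match lower to upper
    blended = 0.5 * (upper[-overlap_rows:] + lower[:overlap_rows])
    return np.vstack([upper[:-overlap_rows], blended, lower[overlap_rows:]])

upper = np.full((4, 2), 100.0)
upper[-1:] = 110.0                   # overlap row as seen in the upper image
lower = np.full((4, 2), 90.0)        # same anatomy, darker exposure
joined = join_with_correction(upper, lower, overlap_rows=1)
```

The gradation conversion of the image processing step would then be applied to `joined`, e.g. with a look-up table as sketched for claim 9.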
16. The image processing method according to claim 15, wherein
in the feature quantity acquisition step, a plurality of types of feature quantities are acquired from at least one of the partial images.
17. The image processing method according to claim 16, further comprising:
a determination step of determining, based on the acquired information representing the anatomical region, which type of feature quantity acquired from which partial image is to be used to acquire the gradation conversion processing characteristic,
wherein in the characteristic acquisition step, the gradation conversion processing characteristic is acquired based on the determination in the determination step.
18. An image processing apparatus for a radiation image, comprising:
a region acquisition unit configured to acquire information representing an anatomical region;
an image acquisition unit configured to acquire partial images obtained by respectively capturing subregions of an entire imaging area, the entire imaging area corresponding to the acquired information representing the anatomical region;
a correction unit configured to acquire a correction value for changing a pixel value of at least one of the partial images so as to reduce a difference between pixel values of a common region of the partial images;
a feature quantity acquisition unit configured to acquire a feature quantity of at least one of the partial images according to the acquired information representing the anatomical region, the acquired feature quantity being used for gradation conversion processing;
a characteristic acquisition unit configured to acquire a gradation conversion processing characteristic based on the acquired correction value and the acquired feature quantity; and
an image processing unit configured to perform image processing on an image based on the partial images, based on the acquired correction value and the gradation conversion processing characteristic, to obtain a processed image whose gradation has been converted.
19. An image processing method for a radiation image, comprising the steps of:
a region acquisition step of acquiring information representing an anatomical region;
an image acquisition step of acquiring partial images obtained by respectively capturing subregions of an entire imaging area, the entire imaging area corresponding to the acquired information representing the anatomical region;
a correction step of acquiring a correction value for changing a pixel value of at least one of the partial images so as to reduce a difference between pixel values of a common region of the partial images;
a feature quantity acquisition step of acquiring a feature quantity of at least one of the partial images according to the acquired information representing the anatomical region, the acquired feature quantity being used for gradation conversion processing;
a characteristic acquisition step of acquiring a gradation conversion processing characteristic based on the acquired correction value and the acquired feature quantity; and
an image processing step of performing image processing on an image based on the partial images, based on the acquired correction value and the gradation conversion processing characteristic, to obtain a processed image whose gradation has been converted.
CN201110188702.3A 2010-07-05 2011-07-05 Image processing apparatus, image processing method, and image processing program Expired - Fee Related CN102393954B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410841045.1A CN104599234B (en) 2010-07-05 2011-07-05 Image processing equipment, radiation imaging system and image processing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010-153224 2010-07-05
JP2010153224A JP5665393B2 (en) 2010-07-05 2010-07-05 Image processing apparatus, image processing method, and program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201410841045.1A Division CN104599234B (en) 2010-07-05 2011-07-05 Image processing equipment, radiation imaging system and image processing method

Publications (2)

Publication Number Publication Date
CN102393954A CN102393954A (en) 2012-03-28
CN102393954B true CN102393954B (en) 2015-01-28

Family

ID=44907666

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201110188702.3A Expired - Fee Related CN102393954B (en) 2010-07-05 2011-07-05 Image processing apparatus, image processing method, and image processing program
CN201410841045.1A Active CN104599234B (en) 2010-07-05 2011-07-05 Image processing equipment, radiation imaging system and image processing method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201410841045.1A Active CN104599234B (en) 2010-07-05 2011-07-05 Image processing equipment, radiation imaging system and image processing method

Country Status (7)

Country Link
US (1) US8823835B2 (en)
EP (1) EP2405395A1 (en)
JP (1) JP5665393B2 (en)
KR (2) KR20120003811A (en)
CN (2) CN102393954B (en)
BR (1) BRPI1103510A2 (en)
RU (1) RU2496142C2 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6016403B2 (en) * 2012-03-27 2016-10-26 キヤノン株式会社 Image processing apparatus and image processing method
JP6159618B2 (en) * 2012-08-22 2017-07-05 住友化学株式会社 Salt, resist composition and method for producing resist pattern
WO2014106777A1 (en) * 2013-01-03 2014-07-10 Aselsan Elektronik Sanayi Ve Ticaret Anonim Sirketi A system and method for contrast enhancement
US20160117797A1 (en) * 2013-06-06 2016-04-28 Hitachi, Ltd. Image Processing Apparatus and Image Processing Method
JP6179422B2 (en) * 2014-02-26 2017-08-16 株式会社島津製作所 X-ray equipment
JP5972347B2 (en) * 2014-12-11 2016-08-17 キヤノン株式会社 Image processing apparatus, image processing method, and program
KR102495199B1 (en) * 2016-09-29 2023-02-01 엘지디스플레이 주식회사 Display device
JP6980456B2 (en) * 2017-08-22 2021-12-15 キヤノン株式会社 Radiation imaging system
EP3547666B1 (en) * 2018-03-29 2023-03-01 Teledyne Dalsa B.V. Image sensor with artificial pixels and shielded pixels
JP7313862B2 (en) * 2019-03-29 2023-07-25 キヤノン株式会社 Information processing device, method, and program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1457024A (en) * 2002-05-10 2003-11-19 佳能株式会社 Greyscale transform process
CN1764243A (en) * 2004-10-19 2006-04-26 奥林巴斯株式会社 Image processing apparatus, image recording apparatus, and image processing method
CN1847977A (en) * 2005-04-12 2006-10-18 诺日士钢机株式会社 Gradation conversion calibration method and gradation conversion calibration module using the same
CN1849627A (en) * 2003-09-11 2006-10-18 松下电器产业株式会社 Visual processing device, visual processing method, visual processing program, and semiconductor device
CN101365039A (en) * 2007-08-06 2009-02-11 株式会社尼康 Image processing apparatus, imaging apparatus and image processing program

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04270474A (en) * 1991-01-09 1992-09-25 Nec Corp Gradation controller by attention areas
US6314198B1 (en) * 1996-09-25 2001-11-06 Canon Kabushiki Kaisha Radiographic, digital image processing system
JP3861405B2 (en) * 1997-10-09 2006-12-20 コニカミノルタホールディングス株式会社 Radiation image processing method and radiation image processing apparatus
JP4040222B2 (en) 1999-03-23 2008-01-30 富士フイルム株式会社 Radiographic image connection processing method and radiographic image processing apparatus
JP2000342567A (en) * 1999-06-07 2000-12-12 Fuji Photo Film Co Ltd Method and device for image processing and recording medium
JP4030680B2 (en) * 1999-06-07 2008-01-09 富士フイルム株式会社 Image processing method and apparatus, and recording medium
JP2001274974A (en) 2000-03-24 2001-10-05 Fuji Photo Film Co Ltd Connection processing method of radiation picture and radiation picture processor
US20020085743A1 (en) 2000-04-04 2002-07-04 Konica Corporation Image processing selecting method, image selecting method and image processing apparatus
JP2001351092A (en) 2000-04-04 2001-12-21 Konica Corp Image processing selecting method, image selecting method, and image processor
US7650044B2 (en) * 2001-07-30 2010-01-19 Cedara Software (Usa) Limited Methods and systems for intensity matching of a plurality of radiographic images
JP4230731B2 (en) 2002-07-29 2009-02-25 株式会社東芝 Digital image processing apparatus and X-ray diagnostic apparatus
JP4323770B2 (en) * 2002-10-18 2009-09-02 キヤノン株式会社 Image processing apparatus, image processing method, program, and recording medium
JP4980552B2 (en) * 2003-09-30 2012-07-18 コニカミノルタエムジー株式会社 Image processing method, image processing apparatus, and image processing program
US7440559B2 (en) 2003-10-22 2008-10-21 Nokia Corporation System and associated terminal, method and computer program product for controlling the flow of content
JP2005296332A (en) * 2004-04-12 2005-10-27 Toshiba Corp X-ray diagnosis apparatus, and device and method for generating image
US7412111B2 (en) * 2004-11-19 2008-08-12 General Electric Company Enhanced image processing method for the presentation of digitally-combined medical images
JP2006181149A (en) * 2004-12-28 2006-07-13 Canon Inc Radiograph processing apparatus and method, program and computer readable medium
US7239805B2 (en) * 2005-02-01 2007-07-03 Microsoft Corporation Method and system for combining multiple exposure images having scene and camera motion
CA2610262A1 (en) * 2005-06-07 2006-12-14 Thomson Licensing Content-based gaussian noise reduction for still image, video and film
RU2336655C1 (en) * 2006-12-08 2008-10-20 Общество с ограниченной ответственностью ООО "Юник Ай Сиз" Method of object and background areas selection in digital images
JP2009017020A (en) * 2007-07-02 2009-01-22 Nissan Motor Co Ltd Image processor and method for generating display image

Also Published As

Publication number Publication date
RU2496142C2 (en) 2013-10-20
KR101510726B1 (en) 2015-04-10
BRPI1103510A2 (en) 2014-06-03
CN104599234A (en) 2015-05-06
JP2012011132A (en) 2012-01-19
CN104599234B (en) 2017-11-14
KR20120003811A (en) 2012-01-11
KR20140097096A (en) 2014-08-06
US8823835B2 (en) 2014-09-02
RU2011127476A (en) 2013-01-10
CN102393954A (en) 2012-03-28
JP5665393B2 (en) 2015-02-04
EP2405395A1 (en) 2012-01-11
US20120002083A1 (en) 2012-01-05

Similar Documents

Publication Publication Date Title
CN102393954B (en) Image processing apparatus, image processing method, and image processing program
US9498180B2 (en) Detecting and quantifying patient motion during tomosynthesis scans
KR101367747B1 (en) Radiographic apparatus and control method for the same
JP6117183B2 (en) X-ray diagnostic imaging apparatus and control method for X-ray generation apparatus
US10779791B2 (en) System and method for mobile X-ray imaging
JP2009136376A (en) Image processing device and program thereof
WO2020036007A1 (en) Medical information processing device, medical information processing method, and program
US10231685B2 (en) Operating an x-ray system for examining an object
EP3370616B1 (en) Device for imaging an object
US8199995B2 (en) Sensitometric response mapping for radiological images
US10123757B2 (en) Image-processing device, radiation image capture system, image-processing method, and computer-readable storage medium
US8611498B2 (en) Image processing apparatus, image processing method, radiation imaging system, and computer-readable recording medium
JP5972347B2 (en) Image processing apparatus, image processing method, and program
KR101092207B1 (en) Method and Apparatus for Correcting Balance of Digital Radiation Detection Image having Nonlinear Response Characteristics
JP5908817B2 (en) Bone mineral content measuring apparatus and method
JP7370694B2 (en) Medical information processing device, medical information processing method, and program
US11145094B2 (en) Image reconstruction apparatus and image reconstruction method
WO2018018087A1 (en) A method and system for automating radiation dose parameters
JP2006263226A (en) Radiation image processing apparatus, the radiation image processing method, program, and computer readable media
JP2013173058A (en) Image processing apparatus and image processing method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20150128