CN106982327A - Image processing method and device - Google Patents
- Publication number
- CN106982327A CN106982327A CN201710210736.5A CN201710210736A CN106982327A CN 106982327 A CN106982327 A CN 106982327A CN 201710210736 A CN201710210736 A CN 201710210736A CN 106982327 A CN106982327 A CN 106982327A
- Authority
- CN
- China
- Prior art keywords
- wavelet
- wavelet band
- area
- band
- visible images
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/76—Circuitry for compensating brightness variation in the scene by influencing the image signals
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Processing (AREA)
- Studio Devices (AREA)
Abstract
The disclosure provides an image processing method and device to solve the prior-art problem that visible light images captured by a camera terminal suffer severe loss of detail. The method includes: obtaining a visible light image and a near-infrared image of a photographed object; determining a first region in the visible light image, the first region being a region of the visible light image that requires detail enhancement; performing wavelet decomposition on the first region to obtain a first wavelet subband, and performing wavelet decomposition on a second region of the near-infrared image to obtain a second wavelet subband, where the second region and the first region correspond to the same part of the photographed object; fusing the second wavelet subband into the first wavelet subband to obtain a fused first wavelet subband; and performing an inverse wavelet transform on the fused first wavelet subband to generate a detail-enhanced visible light image.
Description
Technical field
This disclosure relates to image processing field, in particular it relates to a kind of image processing method and device.
Background technology
In natural-light shooting scenes, the dynamic range of illumination intensity is very large. For example, outdoors on a clear day, owing to scattering by the transmission medium and by objects in the scene, the illumination dynamic range can reach 10^9, while the dynamic range that a professional-grade digital camera can capture is about 10^4. If the illumination dynamic range of the scene exceeds what the camera can capture, the camera records only part of the full intensity range, so in the camera image, regions whose intensity falls outside the camera's dynamic range lose data. For example, over-exposed or under-exposed regions appear in the image, and image detail is partly lost in those regions.
Summary of the invention
The purpose of the disclosure is to provide an image processing method and device to solve the prior-art problem that visible light images captured by a camera terminal suffer severe loss of detail.
To achieve these goals, the disclosure adopts the following technical solutions:
According to a first aspect of the disclosure, an image processing method is provided. The method includes:
obtaining a visible light image and a near-infrared image of a photographed object;
determining a first region in the visible light image, the first region being a region of the visible light image that requires detail enhancement;
performing wavelet decomposition on the first region to obtain a first wavelet subband, and performing wavelet decomposition on a second region of the near-infrared image to obtain a second wavelet subband, the second region and the first region corresponding to the same part of the photographed object;
fusing the second wavelet subband into the first wavelet subband to obtain a fused first wavelet subband;
performing an inverse wavelet transform on the fused first wavelet subband to generate a detail-enhanced visible light image.
Optionally, the first wavelet subband includes a first mean wavelet subband and the second wavelet subband includes a second mean wavelet subband, and fusing the second wavelet subband into the first wavelet subband includes:
fusing the second mean wavelet subband into the first mean wavelet subband according to a histogram matching algorithm.
Optionally, the first wavelet subband includes a first detail wavelet subband and the second wavelet subband includes a second detail wavelet subband, and fusing the second wavelet subband into the first wavelet subband includes:
fusing the second detail wavelet subband into the first detail wavelet subband according to a weighted-average algorithm.
Optionally, determining the first region in the visible light image includes:
determining the detail-loss severity of each pixel according to the pixel's position in the HSV color space of the visible light image;
determining the first region as the positions of the pixels whose detail-loss severity exceeds a threshold.
Optionally, determining the detail-loss severity of each pixel according to its position in the HSV color space of the visible light image includes:
obtaining the detail-loss severity Ws of the saturation (S) channel of each pixel in the HSV color space;
obtaining the detail-loss severity Wv of the value (V) channel of each pixel in the HSV color space;
calculating the detail-loss severity W of each pixel by the following formula:
W = Ws * Wv.
According to a second aspect of the disclosure, an image processing apparatus is provided. The apparatus includes:
an acquisition module, configured to obtain a visible light image and a near-infrared image of a photographed object;
a determining module, configured to determine a first region in the visible light image, the first region being a region of the visible light image that requires detail enhancement;
a wavelet decomposition module, configured to perform wavelet decomposition on the first region to obtain a first wavelet subband, and to perform wavelet decomposition on a second region of the near-infrared image to obtain a second wavelet subband, the second region and the first region corresponding to the same part of the photographed object;
a wavelet subband processing module, configured to fuse the second wavelet subband into the first wavelet subband to obtain a fused first wavelet subband;
an inverse wavelet transform module, configured to perform an inverse wavelet transform on the fused first wavelet subband to generate a detail-enhanced visible light image.
Optionally, the first wavelet subband includes a first mean wavelet subband and the second wavelet subband includes a second mean wavelet subband, and the wavelet subband processing module is configured to:
fuse the second mean wavelet subband into the first mean wavelet subband according to a histogram matching algorithm.
Optionally, the first wavelet subband includes a first detail wavelet subband and the second wavelet subband includes a second detail wavelet subband, and the wavelet subband processing module is configured to:
fuse the second detail wavelet subband into the first detail wavelet subband according to a weighted-average algorithm.
Optionally, the determining module includes:
a first determining module, configured to determine the detail-loss severity of each pixel according to the pixel's position in the HSV color space of the visible light image;
a second determining module, configured to determine the first region as the positions of the pixels whose detail-loss severity exceeds a threshold.
Optionally, the apparatus further includes:
a saturation acquisition module, configured to obtain the detail-loss severity Ws of the saturation (S) channel of each pixel in the HSV color space;
a value acquisition module, configured to obtain the detail-loss severity Wv of the value (V) channel of each pixel in the HSV color space;
a computing module, configured to calculate the detail-loss severity W of each pixel by the following formula:
W = Ws * Wv.
According to a third aspect of the disclosure, an image processing apparatus is provided. The apparatus includes:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
obtain a visible light image and a near-infrared image of a photographed object;
determine a first region in the visible light image, the first region being a region of the visible light image that requires detail enhancement;
perform wavelet decomposition on the first region to obtain a first wavelet subband, and perform wavelet decomposition on a second region of the near-infrared image to obtain a second wavelet subband, the second region and the first region corresponding to the same part of the photographed object;
fuse the second wavelet subband into the first wavelet subband to obtain a fused first wavelet subband; and
perform an inverse wavelet transform on the fused first wavelet subband to generate a detail-enhanced visible light image.
Through the above technical solution, after the visible light image and the near-infrared image of the photographed object are obtained, the region of the visible light image that requires detail enhancement is determined; wavelet decomposition is then performed on the visible light image and the near-infrared image of the photographed object respectively, the wavelet subband of the near-infrared image is fused into the wavelet subband of the visible light image, and an inverse wavelet transform is performed on the fused wavelet subband of the visible light image to generate a detail-enhanced visible light image. In this way, by fusing the detail information of the near-infrared image into the visible light image, the detail information of the visible light image is enhanced, which solves the prior-art problem that visible light images captured by a camera terminal suffer severe loss of detail.
Other features and advantages of the disclosure will be described in detail in the following detailed description.
Brief description of the drawings
The accompanying drawings are provided for a further understanding of the disclosure and constitute a part of the specification. Together with the following detailed description, they serve to explain the disclosure, but do not limit it. In the drawings:
Fig. 1 is a flowchart of an image processing method according to an exemplary embodiment.
Fig. 2 is a flowchart of another image processing method according to an exemplary embodiment.
Fig. 3 is a flowchart of another image processing method according to an exemplary embodiment.
Fig. 4 is a flowchart of another image processing method according to an exemplary embodiment.
Fig. 5 is a flowchart of another image processing method according to an exemplary embodiment.
Fig. 6A is a block diagram of an image processing apparatus 600 according to an exemplary embodiment.
Fig. 6B is a block diagram of another image processing apparatus 600 according to an exemplary embodiment.
Fig. 6C is a block diagram of another image processing apparatus 600 according to an exemplary embodiment.
Fig. 7 is a block diagram of an image processing apparatus 700 according to an exemplary embodiment.
Detailed description
Embodiments of the disclosure are described in detail below with reference to the accompanying drawings. It should be understood that the embodiments described here are merely intended to illustrate and explain the disclosure, not to limit it.
Fig. 1 is a flowchart of an image processing method according to an exemplary embodiment. As shown in Fig. 1, the method includes:
Step S101: obtain a visible light image and a near-infrared image of a photographed object.
Step S102: determine a first region in the visible light image, the first region being a region of the visible light image that requires detail enhancement.
Step S103: perform wavelet decomposition on the first region to obtain a first wavelet subband, and perform wavelet decomposition on a second region of the near-infrared image to obtain a second wavelet subband, the second region and the first region corresponding to the same part of the photographed object.
Step S104: fuse the second wavelet subband into the first wavelet subband to obtain a fused first wavelet subband.
Step S105: perform an inverse wavelet transform on the fused first wavelet subband to generate a detail-enhanced visible light image.
In recent years, near-infrared imaging has received wide attention in computer vision and computational photography, and notable research results have been achieved. Visible light has wavelengths of 400 nm to 700 nm, while near-infrared light has wavelengths of 700 nm to 1100 nm. Because of this difference in wavelength, the scattering properties of the two also differ significantly. In particular, when the transmission medium contains fog or other pollutants, the propagation of near-infrared light is affected very little, so the resulting near-infrared image is relatively sharp and retains more detail of distant objects. Moreover, while the colors of clouds and sky are very similar in a visible light image, their contrast is significantly higher in a near-infrared image. Under severe weather conditions, a near-infrared image is far less affected by atmospheric media and captures more scene detail than a visible light image, which makes it well suited to assisting fast and effective visibility enhancement of visible light images.
Through the above technical solution, after the visible light image and the near-infrared image of the photographed object are obtained, the region of the visible light image that requires detail enhancement is determined; wavelet decomposition is then performed on the visible light image and the near-infrared image of the photographed object respectively, the wavelet subband of the near-infrared image is fused into the wavelet subband of the visible light image, and an inverse wavelet transform is performed on the fused wavelet subband of the visible light image to generate a detail-enhanced visible light image. In this way, by fusing the detail information of the near-infrared image into the visible light image, the detail information of the visible light image is enhanced, which solves the prior-art problem that visible light images captured by a camera terminal suffer severe loss of detail.
Specifically, in step S101, the visible light image and the near-infrared image of the photographed object can be obtained with a dual-camera system: the system includes an ordinary RGB (red, green, blue) camera for capturing the visible light image and a near-infrared camera for capturing the near-infrared image.
In addition, in step S103, the wavelet decomposition may decompose the original image at different scales, and the wavelet function may be the Haar function or any other wavelet function suitable for image processing; the disclosure places no limit on this. Correspondingly, in step S105, the inverse wavelet transform may use the same function that was used for the wavelet decomposition in step S103.
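As a concrete illustration of steps S103 and S105, a single-level 2-D Haar decomposition and its inverse can be sketched with NumPy as below. This is a minimal sketch, not the disclosure's implementation: the function names, the orthonormal normalization, and the requirement of even region height and width are assumptions.

```python
import numpy as np

def haar_dwt2(region):
    # Single-level 2-D Haar decomposition of one image channel.
    # Row transform: pairwise sums/differences of adjacent columns.
    s = (region[:, 0::2] + region[:, 1::2]) / np.sqrt(2)
    d = (region[:, 0::2] - region[:, 1::2]) / np.sqrt(2)
    # Column transform on the row results gives four subbands:
    # cA is the mean (approximation) subband; cH, cV, cD carry detail.
    cA = (s[0::2] + s[1::2]) / np.sqrt(2)
    cV = (s[0::2] - s[1::2]) / np.sqrt(2)
    cH = (d[0::2] + d[1::2]) / np.sqrt(2)
    cD = (d[0::2] - d[1::2]) / np.sqrt(2)
    return cA, (cH, cV, cD)

def haar_idwt2(cA, bands):
    # Inverse transform; it must mirror the function used for decomposition.
    cH, cV, cD = bands
    s = np.empty((cA.shape[0] * 2, cA.shape[1]))
    d = np.empty_like(s)
    s[0::2], s[1::2] = (cA + cV) / np.sqrt(2), (cA - cV) / np.sqrt(2)
    d[0::2], d[1::2] = (cH + cD) / np.sqrt(2), (cH - cD) / np.sqrt(2)
    x = np.empty((s.shape[0], s.shape[1] * 2))
    x[:, 0::2], x[:, 1::2] = (s + d) / np.sqrt(2), (s - d) / np.sqrt(2)
    return x

# The round trip restores the region exactly (up to float precision).
region = np.arange(16, dtype=float).reshape(4, 4)
cA, bands = haar_dwt2(region)
assert np.allclose(haar_idwt2(cA, bands), region)
```

The perfect round trip is what makes step S105 well defined: any coefficients fused into the subbands in step S104 are carried back into the image domain by the inverse transform.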
Fig. 2 is a flowchart of another image processing method according to an exemplary embodiment. As shown in Fig. 2, the method includes:
Step S201: obtain a visible light image and a near-infrared image of a photographed object.
Step S202: determine a first region in the visible light image, the first region being a region of the visible light image that requires detail enhancement.
Step S203: perform wavelet decomposition on the first region to obtain a first wavelet subband, and perform wavelet decomposition on a second region of the near-infrared image to obtain a second wavelet subband, the second region and the first region corresponding to the same part of the photographed object.
The first wavelet subband includes a first mean wavelet subband, and the second wavelet subband includes a second mean wavelet subband.
Step S204: fuse the second mean wavelet subband into the first mean wavelet subband according to a histogram matching algorithm, obtaining a fused first wavelet subband.
Step S205: perform an inverse wavelet transform on the fused first wavelet subband to generate a detail-enhanced visible light image.
The second mean wavelet subband carries the contrast information of the near-infrared image. With the technical solution shown in this embodiment, the contrast information of the near-infrared image is fused into the visible light image, which enhances the contrast information of the visible light image and solves the prior-art problem that contrast is severely lost in the details of visible light images captured by a camera terminal.
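One way to realize the histogram matching of step S204 is rank-based matching: because the first and second regions cover the same part of the object, the two mean subbands have the same number of coefficients, so the second subband's values can be remapped onto the value distribution of the first. This is an illustrative sketch under that assumption; the disclosure does not fix the exact matching procedure, and the function name is invented here.

```python
import numpy as np

def histogram_match(source, reference):
    """Remap `source` (e.g. the second mean wavelet subband) so that its
    value distribution matches `reference` (the first mean wavelet subband),
    while preserving the rank order of the source coefficients."""
    src = source.ravel()
    order = np.argsort(src)
    matched = np.empty_like(src)
    # Assign the sorted reference values to the source ranks.
    matched[order] = np.sort(reference.ravel())
    return matched.reshape(source.shape)

nir_mean = np.array([[5.0, 1.0], [3.0, 7.0]])   # second mean subband
vis_mean = np.array([[0.2, 0.8], [0.4, 0.6]])   # first mean subband
fused = histogram_match(nir_mean, vis_mean)
# The fused subband keeps the NIR ordering but the visible-light histogram.
assert np.allclose(np.sort(fused.ravel()), np.sort(vis_mean.ravel()))
```

The result carries the near-infrared contrast structure (the ordering of coefficient values) while staying inside the brightness range of the visible light subband, which matches the purpose stated for this embodiment.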
Fig. 3 is a flowchart of another image processing method according to an exemplary embodiment. As shown in Fig. 3, the method includes:
Step S301: obtain a visible light image and a near-infrared image of a photographed object.
Step S302: determine a first region in the visible light image, the first region being a region of the visible light image that requires detail enhancement.
Step S303: perform wavelet decomposition on the first region to obtain a first wavelet subband, and perform wavelet decomposition on a second region of the near-infrared image to obtain a second wavelet subband, the second region and the first region corresponding to the same part of the photographed object.
The first wavelet subband includes a first detail wavelet subband, and the second wavelet subband includes a second detail wavelet subband.
Step S304: fuse the second detail wavelet subband into the first detail wavelet subband according to a weighted-average algorithm, obtaining a fused first wavelet subband.
Step S305: perform an inverse wavelet transform on the fused first wavelet subband to generate a detail-enhanced visible light image.
The second detail wavelet subband carries the texture information of the near-infrared image. With the technical solution shown in this embodiment, the texture information of the near-infrared image is fused into the visible light image, which enhances the texture information of the visible light image and solves the prior-art problem that texture details are severely lost in visible light images captured by a camera terminal.
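The weighted-average fusion of step S304 can be sketched as below. The weight alpha is an assumed parameter (the disclosure does not fix its value), and applying a single weight uniformly over each pair of corresponding detail subbands is likewise an assumption.

```python
import numpy as np

def fuse_detail(vis_band, nir_band, alpha=0.5):
    # Weighted average of corresponding detail wavelet subbands;
    # alpha is the weight given to the near-infrared subband.
    return (1.0 - alpha) * vis_band + alpha * nir_band

vis = np.zeros((2, 2))   # first detail subband (flat: texture was lost)
nir = np.ones((2, 2))    # second detail subband (carries NIR texture)
fused = fuse_detail(vis, nir, alpha=0.25)
assert np.allclose(fused, 0.25)
```

A larger alpha injects more near-infrared texture; alpha = 0 leaves the visible light subband unchanged, which is consistent with the later remark that well-rendered regions need no fusion.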
Fig. 4 is a flowchart of another image processing method according to an exemplary embodiment. As shown in Fig. 4, the method includes:
Step S401: obtain a visible light image and a near-infrared image of a photographed object.
Step S402: determine the detail-loss severity of each pixel according to the pixel's position in the HSV color space of the visible light image.
Step S403: determine the first region as the positions of the pixels whose detail-loss severity exceeds a threshold, the first region being a region of the visible light image that requires detail enhancement.
Regions of a visible light image with severe detail loss are often regions that are too bright, too dark, or low in color saturation. In the corresponding HSV (hue, saturation, value) color space, the color values of the pixels in these regions likewise have a saturation S that is too low, or a value V that is too bright or too dark. In a specific implementation, the average saturation of all pixels in the image can be taken as a baseline saturation, and the deviation of each pixel's saturation from the baseline can be computed; if the deviation exceeds a predetermined threshold, the pixel is considered a low-saturation pixel. The positions of such pixels form the region of the visible light image that requires detail enhancement.
Step S404: perform wavelet decomposition on the first region to obtain a first wavelet subband, and perform wavelet decomposition on a second region of the near-infrared image to obtain a second wavelet subband, the second region and the first region corresponding to the same part of the photographed object.
Step S405: fuse the second wavelet subband into the first wavelet subband to obtain a fused first wavelet subband.
Step S406: perform an inverse wavelet transform on the fused first wavelet subband to generate a detail-enhanced visible light image.
The region of the visible light image that requires detail enhancement is found through the HSV color space, and the detail information of the near-infrared image is then fused in to strengthen the detail information of that region, improving the detail rendition of the visible light image. It is worth noting that regions of the visible light image where detail is already well rendered do not need the detail information of the near-infrared image fused in.
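The baseline-saturation heuristic described for this embodiment can be sketched as follows; the threshold value is an assumed parameter, and saturation is assumed to be scaled to [0, 1].

```python
import numpy as np

def low_saturation_mask(sat, threshold=0.3):
    """Flag pixels whose saturation falls below the image-wide baseline
    by more than `threshold`. `sat` holds HSV saturation values in [0, 1]."""
    baseline = sat.mean()            # average saturation as the baseline
    return (baseline - sat) > threshold

sat = np.array([[0.9, 0.9],
                [0.9, 0.1]])
mask = low_saturation_mask(sat)      # baseline = 0.7
assert mask.tolist() == [[False, False], [False, True]]
```

Only the pixel far below the baseline is flagged; the True positions of the mask are the candidate first region for steps S404 to S406.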
Fig. 5 is a flowchart of another image processing method according to an exemplary embodiment. As shown in Fig. 5, the method includes:
Step S501: obtain a visible light image and a near-infrared image of a photographed object.
Step S502: obtain the detail-loss severity Ws of the saturation (S) channel of each pixel in the HSV color space.
Step S503: obtain the detail-loss severity Wv of the value (V) channel of each pixel in the HSV color space.
Step S504: calculate the detail-loss severity W of each pixel by the following formula:
W = Ws * Wv.
Step S505: determine the first region as the positions of the pixels whose detail-loss severity exceeds a threshold, the first region being a region of the visible light image that requires detail enhancement.
Step S506: perform wavelet decomposition on the first region to obtain a first wavelet subband, and perform wavelet decomposition on a second region of the near-infrared image to obtain a second wavelet subband, the second region and the first region corresponding to the same part of the photographed object.
Step S507: fuse the second wavelet subband into the first wavelet subband to obtain a fused first wavelet subband.
Step S508: perform an inverse wavelet transform on the fused first wavelet subband to generate a detail-enhanced visible light image.
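Steps S502 to S505 can be sketched as follows. The disclosure does not specify how Ws and Wv are derived from the S and V channels; the mappings below (low saturation scores high, and a value near 0 or 1 scores high) are illustrative assumptions, as are the threshold and the channel scaling to [0, 1].

```python
import numpy as np

def severity_map(s, v):
    # Assumed channel severities: low saturation and extreme (very dark
    # or very bright) value both indicate likely detail loss.
    ws = 1.0 - s                     # detail-loss severity of the S channel
    wv = np.abs(2.0 * v - 1.0)       # detail-loss severity of the V channel
    return ws * wv                   # W = Ws * Wv, per pixel

def first_region(s, v, threshold=0.5):
    # Pixels whose severity exceeds the threshold form the first region.
    return severity_map(s, v) > threshold

s = np.array([[0.0, 1.0]])           # saturation channel
v = np.array([[1.0, 0.5]])           # value channel
assert first_region(s, v).tolist() == [[True, False]]
```

The multiplicative combination means a pixel is selected only when both channels signal trouble: a washed-out highlight (low S, high V) scores high, while a well-saturated mid-tone pixel scores near zero.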
In summary, after the visible light image and the near-infrared image of the photographed object are obtained, the region of the visible light image that requires detail enhancement is determined; wavelet decomposition is then performed on the visible light image and the near-infrared image of the photographed object respectively, the wavelet subband of the near-infrared image is fused into the wavelet subband of the visible light image, and an inverse wavelet transform is performed on the fused wavelet subband of the visible light image to generate a detail-enhanced visible light image. In this way, by fusing the detail information of the near-infrared image into the visible light image, the detail information of the visible light image is enhanced, which solves the prior-art problem that visible light images captured by a camera terminal suffer severe loss of detail.
The following are device embodiments of the disclosure, which can be used to perform the method embodiments of the disclosure. For details not disclosed in the device embodiments, refer to the method embodiments of the disclosure.
Fig. 6A is a block diagram of an image processing apparatus 600 according to an exemplary embodiment. As shown in Fig. 6A, the apparatus includes:
an acquisition module 610, configured to obtain a visible light image and a near-infrared image of a photographed object;
a determining module 620, configured to determine a first region in the visible light image, the first region being a region of the visible light image that requires detail enhancement;
a wavelet decomposition module 630, configured to perform wavelet decomposition on the first region to obtain a first wavelet subband, and to perform wavelet decomposition on a second region of the near-infrared image to obtain a second wavelet subband, the second region and the first region corresponding to the same part of the photographed object;
a wavelet subband processing module 640, configured to fuse the second wavelet subband into the first wavelet subband to obtain a fused first wavelet subband;
an inverse wavelet transform module 650, configured to perform an inverse wavelet transform on the fused first wavelet subband to generate a detail-enhanced visible light image.
With the above apparatus, after the visible light image and the near-infrared image of the photographed object are obtained, the region of the visible light image that requires detail enhancement is determined; wavelet decomposition is then performed on the visible light image and the near-infrared image of the photographed object respectively, the wavelet subband of the near-infrared image is fused into the wavelet subband of the visible light image, and an inverse wavelet transform is performed on the fused wavelet subband of the visible light image to generate a detail-enhanced visible light image. In this way, by fusing the detail information of the near-infrared image into the visible light image, the detail information of the visible light image is enhanced, which solves the prior-art problem that visible light images captured by a camera terminal suffer severe loss of detail.
Optionally, on the basis of the image processing apparatus 600 shown in Fig. 6A, as shown in Fig. 6B, the determining module 620 includes:
a first determining module 621, configured to determine the detail-loss severity of each pixel according to the pixel's position in the HSV color space of the visible light image; and
a second determining module 622, configured to determine the first region as the positions of the pixels whose detail-loss severity exceeds a threshold.
Optionally, on the basis of the image processing apparatus 600 shown in Fig. 6B, as shown in Fig. 6C, the image processing apparatus 600 further includes: a saturation acquisition module 660, configured to obtain the detail-loss severity Ws of the saturation (S) channel of each pixel in the HSV color space; a value acquisition module 670, configured to obtain the detail-loss severity Wv of the value (V) channel of each pixel in the HSV color space; and a computing module 680, configured to calculate the detail-loss severity W of each pixel by the formula W = Ws * Wv.
The image processing apparatus 600 provided by the embodiments of the disclosure can be implemented in software, or in software plus hardware, either as a part of a camera terminal or as an independent device that can establish communication with the camera terminal.
With regard to the apparatus in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the related method, and will not be elaborated here.
Fig. 7 is a block diagram of an image processing apparatus 700 according to an exemplary embodiment. For example, the apparatus 700 can be applied to a camera terminal, and the camera terminal may be a digital camera, a smartphone, a notebook computer, a smart wearable device with a shooting function, a digital broadcast device, an intelligent monitoring device, a smart television, or the like.
Referring to Fig. 7, the apparatus 700 may include one or more of the following components: a processing component 702, a memory 704, a power component 706, a multimedia component 708, an audio component 710, an input/output (I/O) interface 712, a sensor component 714, and a communication component 716.
The processing component 702 generally controls the overall operation of the apparatus 700, such as operations associated with display, telephone calls, data communication, camera operation, and recording operation. The processing component 702 may include one or more processors 720 to execute instructions, so as to complete all or part of the steps of the image processing method described above. In addition, the processing component 702 may include one or more modules to facilitate interaction between the processing component 702 and other components. For example, the processing component 702 may include a multimedia module to facilitate interaction between the multimedia component 708 and the processing component 702.
The memory 704 is configured to store various types of data to support operation on the apparatus 700. Examples of such data include instructions for any application or method operated on the apparatus 700, contact data, phonebook data, messages, pictures, videos, and so on. The memory 704 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
The power component 706 supplies power to the various components of the apparatus 700. The power component 706 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 700.
The multimedia component 708 includes a screen that provides an output interface between the apparatus 700 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe. In some embodiments, the multimedia component 708 includes a front camera and/or a rear camera. When the apparatus 700 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focusing and optical zoom capabilities.
The audio component 710 is configured to output and/or input audio signals. For example, the audio component 710 includes a microphone (MIC). When the apparatus 700 is in an operating mode, such as a call mode, a recording mode, or a voice recognition mode, the microphone is configured to receive external audio signals. The received audio signals may be further stored in the memory 704 or transmitted via the communication component 716. In some embodiments, the audio component 710 further includes a speaker for outputting audio signals.
I/O interface 712 provides an interface between processing component 702 and peripheral interface modules, such as a keyboard, a click wheel, or buttons. These buttons may include, but are not limited to, a home button, volume buttons, a start button, and a lock button.
Sensor component 714 includes one or more sensors for providing status assessments of various aspects of device 700. For example, sensor component 714 may detect the open/closed state of device 700 and the relative positioning of components (e.g., the display and keypad of device 700); sensor component 714 may also detect a change in position of device 700 or a component of device 700, the presence or absence of user contact with device 700, the orientation or acceleration/deceleration of device 700, and a change in the temperature of device 700. Sensor component 714 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. Sensor component 714 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, sensor component 714 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
Communication component 716 is configured to facilitate wired or wireless communication between device 700 and other devices. Device 700 can access a wireless network based on a communication standard, such as Wi-Fi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, communication component 716 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, communication component 716 also includes a near-field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, device 700 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the image processing method described above.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions is also provided, such as memory 704 including instructions, where the instructions are executable by processor 720 of device 700 to perform the image processing method described above. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
Other embodiments of the disclosure will readily occur to those skilled in the art upon consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure that follow its general principles and include common knowledge or customary technical means in the art not disclosed herein. The specification and embodiments are to be considered exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It should be understood that the disclosure is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the disclosure is limited only by the appended claims.
Claims (11)
1. An image processing method, characterized in that the method comprises:
obtaining a visible light image and a near-infrared light image of a photographed subject;
determining a first area in the visible light image, the first area being an area of the visible light image that requires detail enhancement;
performing wavelet decomposition on the first area to obtain a first wavelet band, and performing wavelet decomposition on a second area in the near-infrared light image to obtain a second wavelet band, the second area and the first area corresponding to the same part of the photographed subject;
fusing the second wavelet band into the first wavelet band to obtain a fused first wavelet band; and
performing an inverse wavelet transform on the fused first wavelet band to generate a detail-enhanced visible light image.
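The decompose-and-reconstruct half of claim 1 can be sketched with a single-level 2-D Haar transform, the simplest wavelet basis. The patent does not specify which wavelet family is used, so the Haar choice and the helper names below are illustrative assumptions:

```python
import numpy as np

def haar_decompose(region):
    """Single-level 2-D Haar wavelet decomposition of an image region.

    Returns the average (approximation) band LL and the three detail
    bands LH, HL, HH; region height and width are assumed even.
    """
    a = np.asarray(region, dtype=np.float64)
    lo = (a[:, 0::2] + a[:, 1::2]) / 2.0   # row-wise averages
    hi = (a[:, 0::2] - a[:, 1::2]) / 2.0   # row-wise differences
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0
    return ll, lh, hl, hh

def haar_reconstruct(ll, lh, hl, hh):
    """Inverse wavelet transform: exact inverse of haar_decompose."""
    h, w = ll.shape
    lo = np.empty((2 * h, w))
    hi = np.empty((2 * h, w))
    lo[0::2, :], lo[1::2, :] = ll + lh, ll - lh
    hi[0::2, :], hi[1::2, :] = hl + hh, hl - hh
    out = np.empty((2 * h, 2 * w))
    out[:, 0::2], out[:, 1::2] = lo + hi, lo - hi
    return out
```

After the second wavelet band has been fused into the first band by the band-specific rules of claims 2 and 3, `haar_reconstruct` plays the role of the inverse wavelet transform in the final step.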
2. The method according to claim 1, characterized in that the first wavelet band includes a first average wavelet band and the second wavelet band includes a second average wavelet band, and fusing the second wavelet band into the first wavelet band includes:
fusing the second average wavelet band into the first average wavelet band according to a histogram matching algorithm.
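The histogram matching of claim 2 can be sketched as a rank-based remapping: each coefficient of the second (near-infrared) average band is replaced by the equal-rank coefficient of the first (visible-light) average band. This is one standard histogram-matching algorithm; the claim does not pin down the exact variant, and the function name is illustrative:

```python
import numpy as np

def match_histogram(second_avg, first_avg):
    """Rank-based histogram matching of one average wavelet band to another.

    Each sample of `second_avg` is replaced by the sample of `first_avg`
    holding the same rank, so the output's value distribution matches the
    first average band. Both bands cover the same area of the subject,
    so they are assumed to contain the same number of samples.
    """
    src = np.asarray(second_avg, dtype=np.float64).ravel()
    order = np.argsort(src, kind="stable")
    matched = np.empty_like(src)
    matched[order] = np.sort(np.asarray(first_avg, dtype=np.float64).ravel())
    return matched.reshape(np.shape(second_avg))
```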
3. The method according to claim 1 or 2, characterized in that the first wavelet band includes a first detail wavelet band and the second wavelet band includes a second detail wavelet band, and fusing the second wavelet band into the first wavelet band includes:
fusing the second detail wavelet band into the first detail wavelet band according to a weighted average algorithm.
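The weighted-average rule of claim 3 amounts to a per-coefficient blend of the two detail bands. The claim does not fix the weight, so the 0.5 default below is an assumption:

```python
import numpy as np

def fuse_detail_bands(first_detail, second_detail, weight=0.5):
    """Fuse the second (near-infrared) detail wavelet band into the
    first (visible-light) one by a weighted average.

    `weight` is the share kept from the first detail band; the
    remainder comes from the second band.
    """
    first = np.asarray(first_detail, dtype=np.float64)
    second = np.asarray(second_detail, dtype=np.float64)
    return weight * first + (1.0 - weight) * second
```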
4. The method according to claim 1 or 2, characterized in that determining the first area in the visible light image includes:
determining the detail-loss severity of each pixel according to the position of each pixel of the visible light image in the HSV color space of the visible light image; and
determining the positions of pixels whose detail-loss severity exceeds a threshold as the first area.
5. The method according to claim 4, characterized in that determining the detail-loss severity of each pixel according to the position of each pixel of the visible light image in the HSV color space of the visible light image includes:
obtaining a detail-loss severity Ws of the saturation (S) channel of each pixel in the HSV color space;
obtaining a detail-loss severity Wv of the value (V) channel of each pixel in the HSV color space; and
calculating the detail-loss severity W of each pixel by the following formula:
W = Ws * Wv.
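Claims 4 and 5 fix only the product W = Ws * Wv, not the form of Ws and Wv. The sketch below assumes simple linear ramps, with severity rising as saturation falls (washed-out color) and as brightness nears overexposure; the ramp shapes, the reference points `s_ref` and `v_ref`, and the area threshold are all illustrative assumptions:

```python
import colorsys

def loss_severity(r, g, b, s_ref=0.2, v_ref=0.85):
    """Detail-loss severity W of one pixel (RGB components in [0, 1]).

    Ws grows as the saturation channel falls below s_ref; Wv grows as
    the value channel rises above v_ref. W = Ws * Wv as in claim 5.
    """
    _, s, v = colorsys.rgb_to_hsv(r, g, b)
    ws = max(0.0, (s_ref - s) / s_ref)
    wv = max(0.0, (v - v_ref) / (1.0 - v_ref))
    return ws * wv

def first_area(pixels, threshold=0.5):
    """Positions of pixels whose severity exceeds the threshold (claim 4)."""
    return [(y, x)
            for y, row in enumerate(pixels)
            for x, (r, g, b) in enumerate(row)
            if loss_severity(r, g, b) > threshold]
```

A pure-white pixel (zero saturation, maximum value) scores the highest severity, while a strongly saturated pixel scores zero, matching the intuition that overexposed, desaturated regions are where visible-light detail is lost.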
6. An image processing apparatus, characterized in that the apparatus comprises:
an acquisition module configured to obtain a visible light image and a near-infrared light image of a photographed subject;
a determining module configured to determine a first area in the visible light image, the first area being an area of the visible light image that requires detail enhancement;
a wavelet decomposition module configured to perform wavelet decomposition on the first area to obtain a first wavelet band, and to perform wavelet decomposition on a second area in the near-infrared light image to obtain a second wavelet band, the second area and the first area corresponding to the same part of the photographed subject;
a wavelet band processing module configured to fuse the second wavelet band into the first wavelet band to obtain a fused first wavelet band; and
an inverse wavelet transform module configured to perform an inverse wavelet transform on the fused first wavelet band to generate a detail-enhanced visible light image.
7. The apparatus according to claim 6, characterized in that the first wavelet band includes a first average wavelet band and the second wavelet band includes a second average wavelet band, and the wavelet band processing module is configured to:
fuse the second average wavelet band into the first average wavelet band according to a histogram matching algorithm.
8. The apparatus according to claim 6 or 7, characterized in that the first wavelet band includes a first detail wavelet band and the second wavelet band includes a second detail wavelet band, and the wavelet band processing module is configured to:
fuse the second detail wavelet band into the first detail wavelet band according to a weighted average algorithm.
9. The apparatus according to claim 6 or 7, characterized in that the determining module includes:
a first determining module configured to determine the detail-loss severity of each pixel according to the position of each pixel of the visible light image in the HSV color space of the visible light image; and
a second determining module configured to determine the positions of pixels whose detail-loss severity exceeds a threshold as the first area.
10. The apparatus according to claim 9, characterized in that the apparatus further includes:
a saturation acquisition module configured to obtain a detail-loss severity Ws of the saturation (S) channel of each pixel in the HSV color space;
a value acquisition module configured to obtain a detail-loss severity Wv of the value (V) channel of each pixel in the HSV color space; and
a calculation module configured to calculate the detail-loss severity W of each pixel by the following formula:
W = Ws * Wv.
11. An image processing apparatus, characterized in that the apparatus comprises:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
obtain a visible light image and a near-infrared light image of a photographed subject;
determine a first area in the visible light image, the first area being an area of the visible light image that requires detail enhancement;
perform wavelet decomposition on the first area to obtain a first wavelet band, and perform wavelet decomposition on a second area in the near-infrared light image to obtain a second wavelet band, the second area and the first area corresponding to the same part of the photographed subject;
fuse the second wavelet band into the first wavelet band to obtain a fused first wavelet band; and
perform an inverse wavelet transform on the fused first wavelet band to generate a detail-enhanced visible light image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710210736.5A CN106982327B (en) | 2017-03-31 | 2017-03-31 | Image processing method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106982327A true CN106982327A (en) | 2017-07-25 |
CN106982327B CN106982327B (en) | 2020-02-28 |
Family
ID=59343621
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710210736.5A Active CN106982327B (en) | 2017-03-31 | 2017-03-31 | Image processing method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106982327B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1925565A (en) * | 2006-09-15 | 2007-03-07 | 重庆大学 | Welding puddle image acquisition technology based on image coalescence and sensing system |
CN105342561A (en) * | 2015-10-09 | 2016-02-24 | 中国科学院自动化研究所 | Wireless voice-operated wearable molecular imaging navigation system |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108737728A (en) * | 2018-05-03 | 2018-11-02 | Oppo广东移动通信有限公司 | A kind of image capturing method, terminal and computer storage media |
CN109120869A (en) * | 2018-11-07 | 2019-01-01 | 深圳市道通智能航空技术有限公司 | Double light image integration methods, integration equipment and unmanned plane |
CN113507558A (en) * | 2020-03-24 | 2021-10-15 | 华为技术有限公司 | Method and device for removing image glare, terminal equipment and storage medium |
CN114073063A (en) * | 2020-05-27 | 2022-02-18 | 北京小米移动软件有限公司南京分公司 | Image processing method and device, camera assembly, electronic equipment and storage medium |
CN114073063B (en) * | 2020-05-27 | 2024-02-13 | 北京小米移动软件有限公司南京分公司 | Image processing method and device, camera assembly, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN106982327B (en) | 2020-02-28 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |