CN106303272B - Control method and control device - Google Patents
- Publication number: CN106303272B (application CN201610615988.1A)
- Authority
- CN
- China
- Prior art keywords
- exposure
- time
- exposure time
- region
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
Abstract
The invention discloses a control method for controlling an imaging device to perform imaging. The imaging device includes an image sensor. The control method includes: determining a highlight region and a dark region of a target scene; calculating a highlight exposure time and a dark exposure time according to the brightness of the highlight region and the dark region, respectively; determining a movement distance of the image sensor according to the boundary line between the highlight region and the dark region; determining a plurality of sub-exposure times; obtaining multiple frames of preview images corresponding to the highlight exposure time, the dark exposure time and the plurality of sub-exposure times, respectively; and synthesizing the multiple frames of preview images. The invention further discloses a control device. With the control method and the control device of the embodiments of the present invention, during imaging, a micro-electro-mechanical system is controlled to drive the image sensor to move, multiple exposures are performed, and the images are synthesized, so that a photograph with a gradient effect can be captured. The operation is simple, the movement accuracy is high, the image transitions are soft and natural, and the fun of taking photos with the electronic apparatus is also increased.
Description
Technical field
The present invention relates to imaging technology, and more particularly to a control method and a control device.
Background technology
Some special scenes are poorly rendered when photographed with an existing mobile phone camera, for example a scene composed of two parts with a large brightness contrast. In a sunset seascape, for instance, the high brightness of the sky and the low brightness of the sea form a strong luminance contrast, and it is difficult to select a single exposure value that gives every part of the image a suitable amount of exposure, resulting in poor image quality.
Summary of the invention
The present invention aims to solve at least one of the technical problems existing in the prior art. Therefore, the present invention provides a control method and a control device.
The control method of the embodiments of the present invention is used for controlling an imaging device to perform imaging. The imaging device includes an image sensor. The control method includes the following steps:

determining a highlight region and a dark region of a target scene;

calculating a highlight exposure time of the highlight region and a dark exposure time of the dark region according to the brightness of the highlight region and the dark region, respectively;

determining a movement distance of the image sensor according to the boundary line between the highlight region and the dark region, so that after movement the scene falls into the highlight region or the dark region;

determining a plurality of sub-exposure times according to the highlight exposure time, the dark exposure time and the movement distance;

controlling a micro-electro-mechanical system (MEMS) to drive the image sensor to move along a first direction perpendicular to the optical axis and obtaining multiple frames of preview images corresponding to the highlight exposure time, the dark exposure time and the plurality of sub-exposure times, respectively; and

synthesizing the multiple frames of preview images to form a final image.
In some embodiments, the step of determining the highlight region and the dark region of the target scene includes:

identifying the target scene using a scene recognition technique to determine the highlight region and the dark region of the target scene.

In some embodiments, the step of determining the plurality of sub-exposure times according to the highlight exposure time, the dark exposure time and the movement distance includes:

determining a plurality of sub-exposure positions of the image sensor according to the highlight exposure time, the dark exposure time and the movement distance; and

determining the corresponding plurality of sub-exposure times according to the plurality of sub-exposure positions.

In some embodiments, the step of controlling the MEMS to drive the image sensor to move along the first direction perpendicular to the optical axis and obtaining the multiple frames of preview images corresponding to the highlight exposure time, the dark exposure time and the plurality of sub-exposure times includes:

controlling the MEMS to drive the image sensor to move along the first direction according to the movement distance and obtaining the multiple frames of preview images corresponding to the highlight exposure time, the dark exposure time and the plurality of sub-exposure times, respectively.

In some embodiments, the step of synthesizing the multiple frames of preview images to form the final image includes:

selecting the preview image corresponding to the highlight exposure time as a base image; and

accumulating the same portions of the multiple frames of preview images and overlaying them on the base image to form the final image.
The control device of the embodiments of the present invention is used for controlling an imaging device to perform imaging. The imaging device includes an image sensor. The control device includes:

a first determining module, configured to determine a highlight region and a dark region of a target scene;

a computing module, configured to calculate a highlight exposure time of the highlight region and a dark exposure time of the dark region according to the brightness of the highlight region and the dark region, respectively;

a second determining module, configured to determine a movement distance of the image sensor according to the boundary line between the highlight region and the dark region, so that after movement the scene falls into the highlight region or the dark region;

a third determining module, configured to determine a plurality of sub-exposure times according to the highlight exposure time, the dark exposure time and the movement distance;

a control module, configured to control a MEMS to drive the image sensor to move along a first direction perpendicular to the optical axis and obtain multiple frames of preview images corresponding to the highlight exposure time, the dark exposure time and the plurality of sub-exposure times, respectively; and

a synthesis module, configured to synthesize the multiple frames of preview images to form a final image.

In some embodiments, the first determining module is configured to identify the target scene using a scene recognition technique to determine the highlight region and the dark region of the target scene.

In some embodiments, the third determining module is configured to determine a plurality of sub-exposure positions of the image sensor according to the highlight exposure time, the dark exposure time and the movement distance, and to determine the corresponding plurality of sub-exposure times according to the plurality of sub-exposure positions.

In some embodiments, the control module is configured to control the MEMS to drive the image sensor to move along the first direction according to the movement distance and obtain the multiple frames of preview images corresponding to the highlight exposure time, the dark exposure time and the plurality of sub-exposure times, respectively.

In some embodiments, the synthesis module is configured to select the preview image corresponding to the highlight exposure time as a base image, and to accumulate the same portions of the multiple frames of preview images and overlay them on the base image to form the final image.
With the control method and the control device of the embodiments of the present invention, during imaging, a micro-electro-mechanical system is controlled to drive the image sensor to move, multiple exposures are performed, and the images are synthesized, so that a photograph with a gradient effect can be captured. The operation is simple and convenient, the movement accuracy is high, the image transitions are soft and natural, and the fun of taking photos with the electronic apparatus is also increased.

Additional aspects and advantages of the present invention will be set forth in part in the following description, will become apparent in part from the following description, or will be learned through practice of the present invention.
Brief description of the drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily understood from the following description of embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a schematic flowchart of the control method of an embodiment of the present invention.

Fig. 2 is a functional block diagram of the control device of an embodiment of the present invention.

Fig. 3 is a schematic state diagram of the control method of an embodiment of the present invention.

Fig. 4 is a schematic structural diagram of the control device of an embodiment of the present invention.
Detailed description of the embodiments
Embodiments of the present invention are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which identical or similar reference numerals throughout denote identical or similar elements or elements having identical or similar functions. The embodiments described below with reference to the accompanying drawings are exemplary, are intended only to explain the embodiments of the present invention, and are not to be construed as limiting the embodiments of the present invention.
Referring to Fig. 1, the control method of an embodiment of the present invention is used for controlling an imaging device to perform imaging, the imaging device including an image sensor. The control method includes the steps of:

S10: determining a highlight region and a dark region of a target scene;

S20: calculating a highlight exposure time of the highlight region and a dark exposure time of the dark region according to the brightness of the highlight region and the dark region, respectively;

S30: determining a movement distance of the image sensor according to the boundary line between the highlight region and the dark region, so that after movement the scene falls into the highlight region or the dark region;

S40: determining a plurality of sub-exposure times according to the highlight exposure time, the dark exposure time and the movement distance;

S50: controlling a MEMS to drive the image sensor to move along a first direction perpendicular to the optical axis and obtaining multiple frames of preview images corresponding to the highlight exposure time, the dark exposure time and the plurality of sub-exposure times, respectively; and

S60: synthesizing the multiple frames of preview images to form a final image.
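The relationship between steps S20 and S40 can be sketched in code as follows. This is a minimal illustrative model, not the patent's implementation: the function names, the inverse-luminance exposure model and the evenly spaced sub-exposure times are all assumptions made for illustration.

```python
# Illustrative sketch of steps S20 and S40 (all names and the exposure
# model are hypothetical, not taken from the patent).
import numpy as np

def compute_exposure_times(bright_lum, dark_lum, target=0.5):
    """S20: a toy model where each region's exposure time is inversely
    proportional to its metered luminance."""
    t1 = target / bright_lum   # highlight exposure time (short)
    t2 = target / dark_lum     # dark exposure time (long)
    return t1, t2

def exposure_schedule(bright_lum, dark_lum, n_sub):
    """S20 + S40: the highlight time, n_sub intermediate sub-exposure
    times, and the dark time, in the order the moving sensor uses them."""
    t1, t2 = compute_exposure_times(bright_lum, dark_lum)
    # evenly spaced sub-exposure times between t1 and t2 (S40)
    return list(np.linspace(t1, t2, n_sub + 2))

schedule = exposure_schedule(bright_lum=0.8, dark_lum=0.2, n_sub=3)
# five increasing exposure times, from the highlight time to the dark time
```

Steps S50 and S60 would then capture one preview frame per entry of `schedule` while the MEMS steps the sensor, and synthesize the frames.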
Referring to Fig. 2, the control device 100 of an embodiment of the present invention includes a first determining module 10, a computing module 20, a second determining module 30, a third determining module 40, a control module 50 and a synthesis module 60. As an example, the control method of the embodiments of the present invention may be implemented by the control device 100 of the embodiments of the present invention, and may be applied in an electronic apparatus 1000 for controlling an imaging device 200 of the electronic apparatus 1000. The imaging device 200 includes an image sensor 210.

The image sensor 210 is configured to convert collected optical signals into electrical signals so as to form an image.
Step S10 of the control method of the embodiments of the present invention may be implemented by the first determining module 10, step S20 by the computing module 20, step S30 by the second determining module 30, step S40 by the third determining module 40, step S50 by the control module 50, and step S60 by the synthesis module 60. In other words, the first determining module 10 is configured to determine the highlight region and the dark region of the target scene. The computing module 20 is configured to calculate the highlight exposure time of the highlight region and the dark exposure time of the dark region according to the brightness of the highlight region and the dark region, respectively. The second determining module 30 is configured to determine the movement distance of the image sensor according to the boundary line between the highlight region and the dark region, so that after movement the scene falls into the highlight region or the dark region. The third determining module 40 is configured to determine the plurality of sub-exposure times according to the highlight exposure time, the dark exposure time and the movement distance. The control module 50 is configured to control the MEMS to drive the image sensor to move along the first direction perpendicular to the optical axis and obtain the multiple frames of preview images corresponding to the highlight exposure time, the dark exposure time and the plurality of sub-exposure times, respectively. The synthesis module 60 is configured to synthesize the multiple frames of preview images to form the final image.

Under normal circumstances, for a portable electronic apparatus 1000 such as a mobile phone, the design size is limited, and the camera module of the imaging device 200 is relatively simple compared with professional photographic equipment such as a single-lens reflex camera; its functions and performance are correspondingly weaker. Therefore, a user needs to perform complicated operations when carrying out certain special kinds of shooting, and the results are often unsatisfactory.
As an example, in some landscape scenes the scene mostly consists of two parts under different lighting conditions, that is, two parts with a large brightness contrast. In a sunset seascape, for instance, the high brightness of the sky portion and the low brightness of the sea portion form a rather strong luminance contrast. When shooting such a scene, the imaging device of an existing mobile phone performs only one light metering and then one exposure to obtain the image, so the resulting image exhibits overexposure in the high-brightness region and underexposure in the low-brightness region, and the overall image quality is poor. To shoot a better image of such a scene, auxiliary equipment such as a graduated filter is usually required, and the operation is relatively complicated.
Referring to Fig. 3, the control method of the embodiments of the present invention drives the image sensor 210 to move and performs multiple exposures of the highlight region and the dark region, so that the highlight region and the dark region each receive a suitable exposure time; through image synthesis, a shot with a gradient effect is then achieved, without any other auxiliary equipment during shooting.
In this way, with the control method, the control device 100 and the electronic apparatus 1000 of the embodiments of the present invention, during imaging, the micro-electro-mechanical system is controlled to drive the image sensor 210 to move, multiple exposures are performed and the images are synthesized, so that a photograph with a gradient effect can be captured. The operation is simple, the movement accuracy is high, the image transitions are soft and natural, and the fun of taking photos with the electronic apparatus 1000 is also increased.
Specifically, referring to Fig. 4, the imaging device 200 further includes a micro-electro-mechanical system (MEMS). The MEMS includes a fixed electrode, a floating electrode and a deformable connector. The floating electrode cooperates with the fixed electrode. The connector is fixedly connected to the fixed electrode and the floating electrode. The fixed electrode and the floating electrode are configured to generate an electrostatic force under the action of a driving voltage. The connector is configured to deform, under the action of the electrostatic force, along the direction of movement of the floating electrode, so as to allow the floating electrode to move and thereby drive the image sensor 210 to move.

Generally, with a suitable shape and size, the deformable wire can be made to deform under an external force only along the direction of movement of the floating electrode; within a certain range of external force (below a predetermined threshold), the deformation of the deformable wire is directly proportional to the magnitude of the external force, while in other directions the deformable wire remains rigid and is not easily deformed.
In some examples, the deformation of the deformable wire is less than or equal to 150 microns; in other words, the stroke of the actuator 220 is less than or equal to 150 microns.

Through the action of the electrostatic force, the MEMS can drive the image sensor 210 to move each time by a distance comparable to the pixel size. For example, in some instances, the pixel size of the image sensor 210 is 2 microns; by precisely controlling the magnitude of the electrostatic force, the image sensor 210 can be driven to move 2 microns at a time, that is, one pixel per movement.
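The pixel-step drive described above can be illustrated with a small sketch that converts a required sensor displacement into one-pixel MEMS steps, using the 2-micron pixel pitch and 150-micron stroke limit stated above; the function and constant names are hypothetical.

```python
# Hypothetical sketch: turning a required sensor displacement into
# pixel-sized MEMS steps (2-micron pitch, 150-micron stroke, as above).
PIXEL_PITCH_UM = 2.0
MAX_STROKE_UM = 150.0  # actuator stroke limit stated in the description

def steps_for_displacement(displacement_um):
    """Number of one-pixel MEMS steps needed, rejecting displacements
    beyond the actuator stroke."""
    if displacement_um > MAX_STROKE_UM:
        raise ValueError("displacement exceeds actuator stroke")
    return round(displacement_um / PIXEL_PITCH_UM)

n = steps_for_displacement(24.0)  # 24 microns -> 12 one-pixel steps
```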
In the shooting of an image, the image sensor 210 needs to receive a suitable amount of incoming light after the shutter opens, and then produce an image through its photoelectric response. Because the brightness of the highlight region and of the dark region contained in the target scene differ, the amount of incoming light suitable for imaging, in other words the exposure time, also differs. Therefore, the highlight region and the dark region of the target scene must first be determined, and the exposure times corresponding to the respective regions determined accordingly.

In some examples, the imaging device 200 can, in a scene shooting mode, identify the target scene using a scene recognition technique and thereby determine the highlight region and the dark region of the target scene.

In some embodiments, the step of identifying the target scene using a scene recognition technique may be implemented by the first determining module 10. In other words, the first determining module 10 is configured to identify the target scene using a scene recognition technique and thereby determine the highlight region and the dark region of the target scene.

In operation, the user sets the shooting mode to a scene mode and may, for example, select a landscape scene. After light metering, the first determining module 10 can determine whether the target scene is a shooting scene with strong contrast between bright and dark, and then determine the boundary line between the highlight region and the dark region.
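One plausible way to obtain the two regions and their boundary line from a metered luminance map is simple row-wise thresholding, assuming a roughly horizontal boundary such as a horizon. The patent does not specify the recognition algorithm; the threshold value and all names below are assumptions for illustration only.

```python
# Hypothetical region split from a metered luminance map (values in [0, 1]).
import numpy as np

def split_regions(luma, threshold=0.5):
    """Return the boundary row and the mean luminance of the highlight
    and dark regions, assuming a horizontal bright/dark boundary."""
    bright_mask = luma >= threshold
    row_is_bright = bright_mask.mean(axis=1) >= 0.5
    boundary_row = int(np.argmin(row_is_bright))  # first mostly-dark row
    mean_bright = float(luma[bright_mask].mean())
    mean_dark = float(luma[~bright_mask].mean())
    return boundary_row, mean_bright, mean_dark

# bright sky over dark sea: 3 bright rows above 5 dark rows
luma = np.vstack([np.full((3, 4), 0.8), np.full((5, 4), 0.2)])
boundary, mb, md = split_regions(luma)  # boundary row 3
```

The region means could then feed the per-region exposure time calculation of step S20.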
In some embodiments, step S40 includes:

determining a plurality of sub-exposure positions of the image sensor according to the highlight exposure time, the dark exposure time and the movement distance; and

determining the corresponding plurality of sub-exposure times according to the plurality of sub-exposure positions.

In some embodiments, the step of determining the plurality of sub-exposure positions of the image sensor according to the highlight exposure time, the dark exposure time and the movement distance, and the step of determining the corresponding plurality of sub-exposure times according to the plurality of sub-exposure positions, may be implemented by the third determining module 40. In other words, the third determining module 40 is configured to determine the plurality of sub-exposure positions of the image sensor according to the highlight exposure time, the dark exposure time and the movement distance, and to determine the corresponding plurality of sub-exposure times according to the plurality of sub-exposure positions.
It will be appreciated that in a shooting scene with a large contrast between bright and dark, the required exposure time becomes progressively longer from the highlight region to the dark region. In other words, compared with a single exposure, it is desirable to reduce the exposure time of the highlight region and increase the exposure time of the dark region.

To make the imaging effect softer and the transition more natural, a plurality of sub-exposure positions can be added between the highlight region and the dark region. The specific sub-positions can be determined according to the movement distance of the image sensor 210, that is, the change of position as the image sensor 210 moves from the highlight region to the dark region (or from the dark region to the highlight region), together with the respective exposure times of the highlight region and the dark region.

As an example, the moving direction of the image sensor 210 is perpendicular to the optical axis, and the boundary line between the highlight region and the dark region lies at the midpoint of the longitudinal dimension of the image sensor; in other words, the movement distance S of the image sensor 210 is 1/2 of the longitudinal dimension of the image sensor. The exposure times corresponding to the highlight region and the dark region are t1 and t2 respectively, where t2 is greater than t1.

Further, the difference between t2 and t1 is the travel time of the image sensor 210, and the time interval t of each movement of the image sensor 210 can then be calculated from the movement distance S, thereby determining the sub-exposure positions, where the movement distance may be expressed in units of pixels. Each sub-exposure time is then the sum of the preceding exposure time and the time interval t. In this way, the respective exposures can be completed as the sensor moves from the highlight region to the dark region. Moreover, since a plurality of sub-exposure positions are provided, the transition is more natural. Of course, the sensor may also move directly from the highlight region to the dark region, with the two parts exposed separately.
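The rule just described, where the travel time t2 - t1 is divided over the pixel steps and each sub-exposure time adds the interval t to the previous one, can be sketched as follows; the function name and sample values are assumptions.

```python
# Sketch of the sub-exposure schedule described above: the travel time
# t2 - t1 is split evenly over the S one-pixel steps, and each
# sub-exposure time is the previous exposure time plus the interval t.
def sub_exposure_schedule(t1, t2, s_pixels):
    assert t2 > t1
    t = (t2 - t1) / s_pixels          # time interval per one-pixel move
    times = [t1]
    for _ in range(s_pixels):
        times.append(times[-1] + t)   # previous exposure time + interval
    return times                      # ends at t2, the dark exposure time

times = sub_exposure_schedule(t1=0.01, t2=0.05, s_pixels=4)
# approximately [0.01, 0.02, 0.03, 0.04, 0.05] seconds
```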
Further, in some embodiments, step S50 includes:

controlling the MEMS to drive the image sensor to move along the first direction according to the movement distance and obtaining the multiple frames of preview images corresponding to the highlight exposure time, the dark exposure time and the plurality of sub-exposure times, respectively.

In some embodiments, the step of driving the image sensor to move along the first direction according to the movement distance and obtaining the multiple frames of preview images corresponding to the highlight exposure time, the dark exposure time and the plurality of sub-exposure times may be implemented by the control module 50. In other words, the control module 50 is configured to control the MEMS to drive the image sensor 210 to move along the first direction according to the movement distance and obtain the multiple frames of preview images corresponding to the highlight exposure time, the dark exposure time and the plurality of sub-exposure times, respectively.

In this way, a frame image with an exposure time suitable for the corresponding region can be obtained at each position, and the acquired frame images can be placed in the cache of the electronic apparatus 1000 to provide material for image synthesis.
Further, in some embodiments, step S60 includes:

selecting the preview image corresponding to the highlight exposure time as a base image; and

accumulating the same portions of the multiple frames of preview images and overlaying them on the base image to form the final image.

In some embodiments, the step of selecting the preview image corresponding to the highlight exposure time as the base image, and the step of accumulating the same portions of the multiple frames of preview images and overlaying them on the base image to form the final image, may be implemented by the synthesis module 60. In other words, the synthesis module 60 is configured to select the preview image corresponding to the highlight exposure time as the base image, and to accumulate the same portions of the multiple frames of preview images and overlay them on the base image to form the final image.
It should be noted that the selection of the base image can be determined according to the moving direction of the image sensor 210. For example, if the image sensor 210 moves from the highlight region to the dark region, the preview image corresponding to the highlight exposure time is selected as the base image; similarly, if the image sensor 210 moves from the dark region to the highlight region, the preview image corresponding to the dark exposure time is selected as the base image.

In the image synthesis process, the identical portions of the frame images are selected and accumulated, that is, the pixel brightness values are accumulated, and the result is overlaid on the base image to obtain the final image. In the final image thus obtained, the exposure value of each part matches its lighting environment, the exposure is appropriate from the highlight region to the dark region, the transition is naturally soft, and the gradient effect is good.
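The accumulation-and-overlay synthesis can be sketched as a toy example. This is only a minimal sketch under strong assumptions: the frames are taken as already registered, the "accumulation" of pixel brightness is normalized by the frame count to stay in range, and all names are hypothetical rather than the patent's.

```python
# Toy sketch of the synthesis step: accumulate the pixel brightness of
# the aligned frames and let the result cover the base image.
import numpy as np

def synthesize(frames):
    """frames: list of 2-D luminance arrays in [0, 1], already aligned.
    Accumulates (here, averages) the identical portions and returns the
    image that covers the base image."""
    stack = np.stack([f.astype(float) for f in frames])
    accumulated = stack.mean(axis=0)      # accumulate pixel brightness
    return np.clip(accumulated, 0.0, 1.0)

bright = np.full((4, 4), 0.9)   # frame exposed for the highlight region
dark = np.full((4, 4), 0.1)     # frame exposed for the dark region
result = synthesize([bright, dark])
```

A production pipeline would blend only the overlapping strips per sub-exposure position rather than whole frames, but the accumulate-then-cover idea is the same.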
In the description of the embodiments of the present invention, it is to be understood that terms indicating orientation or positional relationships, such as "center", "longitudinal", "transverse", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise" and "counterclockwise", are based on the orientations or positional relationships shown in the drawings, are used only to facilitate and simplify the description of the embodiments, and do not indicate or imply that the devices or elements referred to must have a specific orientation or be constructed and operated in a specific orientation; they are therefore not to be construed as limiting the embodiments of the present invention.

In addition, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of the technical features referred to. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more such features. In the description of the embodiments of the present invention, "a plurality of" means two or more, unless otherwise specifically defined.

In the description of the embodiments of the present invention, it should be noted that, unless otherwise explicitly specified and limited, the terms "mounted", "connected" and "coupled" are to be understood broadly; for example, a connection may be a fixed connection, a detachable connection or an integral connection; it may be a mechanical connection, an electrical connection or mutual communication; it may be a direct connection or an indirect connection through an intermediary; and it may be an internal communication between two elements or an interaction between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in the embodiments of the present invention can be understood according to the specific circumstances.
In the embodiments of the present invention, unless otherwise explicitly specified and limited, a first feature being "on" or "under" a second feature may include the first and second features being in direct contact, or the first and second features being in contact not directly but through another feature between them. Moreover, a first feature being "on", "above" or "over" a second feature includes the first feature being directly above or obliquely above the second feature, or merely indicates that the first feature is at a higher level than the second feature. A first feature being "under", "below" or "beneath" a second feature includes the first feature being directly below or obliquely below the second feature, or merely indicates that the first feature is at a lower level than the second feature.
The above disclosure provides many different embodiments or examples for implementing different structures of the embodiments of the present invention. To simplify the disclosure of the embodiments of the present invention, the components and arrangements of specific examples are described above. Of course, they are merely examples and are not intended to limit the present invention. In addition, the embodiments of the present invention may repeat reference numerals and/or reference letters in different examples; such repetition is for the purposes of simplicity and clarity and does not in itself indicate a relationship between the various embodiments and/or arrangements discussed. Furthermore, the embodiments of the present invention provide examples of various specific processes and materials, but those of ordinary skill in the art will recognize the applicability of other processes and/or the use of other materials.
In the description of this specification, reference to the terms "an embodiment", "some embodiments", "a schematic embodiment", "an example", "a specific example" or "some examples" means that a specific feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, the schematic representations of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials or characteristics described may be combined in a suitable manner in any one or more embodiments or examples.
Any process or method description in a flowchart or otherwise described herein may be understood as representing a module, segment or portion of code that includes one or more executable instructions for implementing the steps of a specific logical function or process, and the scope of the preferred embodiments of the present invention includes additional implementations in which functions may be executed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order according to the functions involved, as should be understood by those skilled in the art to which the embodiments of the present invention belong.
The logic and/or steps represented in the flowcharts or otherwise described herein may, for example, be considered an ordered list of executable instructions for implementing logical functions, and may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus or device (such as a computer-based system, a system including a processing module, or another system that can fetch and execute instructions from an instruction execution system, apparatus or device). For the purposes of this specification, a "computer-readable medium" may be any means that can contain, store, communicate, propagate or transmit the program for use by, or in connection with, the instruction execution system, apparatus or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection (an electronic device) having one or more wires, a portable computer diskette (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting or, if necessary, processing it in another suitable way, and then stored in a computer memory.
It should be understood that the parts of the embodiments of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented with software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one of the following technologies known in the art, or a combination thereof, may be used: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gates, a programmable gate array (PGA), a field programmable gate array (FPGA), and so on.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be accomplished by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, performs one of the steps of the method embodiments or a combination thereof.
In addition, the functional units in the various embodiments of the present invention may be integrated into one processing module, each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
The above-mentioned storage medium may be a read-only memory, a magnetic disk, an optical disc, or the like.
Although the embodiments of the present invention have been shown and described above, it is to be understood that the above embodiments are exemplary and should not be construed as limiting the present invention; those of ordinary skill in the art may change, modify, replace, and vary the above embodiments within the scope of the present invention.
Claims (8)
1. A control method for controlling an imaging device to perform imaging, wherein the imaging device comprises an image sensor, the control method comprising the following steps:
determining a bright region and a dark region of a target scene;
calculating, according to the brightness of the bright region and of the dark region, a bright-region exposure time for the bright region and a dark-region exposure time for the dark region, respectively;
determining a displacement of the image sensor according to the boundary between the bright region and the dark region, such that after the movement the scene falls within the bright region or the dark region;
determining a plurality of sub-exposure times according to the bright-region exposure time, the dark-region exposure time, and the displacement;
controlling a micro-electro-mechanical system (MEMS) to drive the image sensor to move along a first direction perpendicular to the optical axis, and obtaining, respectively, a plurality of frames of preview images corresponding to the bright-region exposure time, the dark-region exposure time, and the plurality of sub-exposure times; and
synthesizing the plurality of frames of preview images to form a final image;
wherein the step of determining a plurality of sub-exposure times according to the bright-region exposure time, the dark-region exposure time, and the displacement comprises:
determining a plurality of sub-exposure positions of the image sensor according to the bright-region exposure time, the dark-region exposure time, and the displacement; and
determining the corresponding plurality of sub-exposure times according to the plurality of sub-exposure positions;
wherein each sub-exposure time is equal to the sum of the preceding exposure time and the time interval during which the image sensor moves between adjacent sub-exposure positions.
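Purely as an illustration (not part of the claims), the sub-exposure-time rule of claim 1 — each sub-exposure time equals the preceding exposure time plus the sensor's travel interval between adjacent sub-exposure positions — can be sketched as follows. The evenly spaced positions, the `move_speed` parameter, and the function name are assumptions, since the claim does not fix how the sub-exposure positions are derived:

```python
def sub_exposure_times(bright_time, dark_time, displacement, num_subs, move_speed):
    """Sketch of the sub-exposure-time rule of claim 1 (hypothetical helper)."""
    # Assumption: evenly space num_subs sub-exposure positions along the
    # displacement (the patent does not fix this spacing).
    step = displacement / (num_subs + 1)
    positions = [step * (i + 1) for i in range(num_subs)]
    # Travel time between adjacent positions at the given sensor speed.
    interval = step / move_speed
    times = []
    prev = bright_time  # the first "preceding exposure" is the bright-region one
    for _ in positions:
        t = min(prev + interval, dark_time)  # never exceed the dark-region time
        times.append(t)
        prev = t
    return positions, times
```

Capping each sub-exposure at the dark-region time is a further assumption, reflecting that the sub-exposures in the patent lie between the shortest (bright) and longest (dark) exposures.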
2. The control method of claim 1, wherein the step of determining the bright region and the dark region of the target scene comprises:
identifying the target scene using a scene recognition technique to determine the bright region and the dark region of the target scene.
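As a minimal sketch of the region-splitting step of claim 2 (the patent leaves the concrete scene-recognition technique open), a simple luminance threshold can separate bright and dark pixels; the threshold value of 128 and the function name are arbitrary assumptions:

```python
import numpy as np

def split_bright_dark(gray, threshold=128):
    """Split a grayscale frame into bright- and dark-region masks.

    A stand-in for the scene-recognition step: a real implementation could
    use segmentation or histogram analysis instead of a fixed threshold."""
    bright_mask = gray >= threshold
    return bright_mask, ~bright_mask
```

The mean brightness of each masked region could then feed the bright-region and dark-region exposure-time calculation of claim 1.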
3. The control method of claim 1, wherein the step of controlling the MEMS to drive the image sensor to move along the first direction perpendicular to the optical axis and obtaining, respectively, the plurality of frames of preview images corresponding to the bright-region exposure time, the dark-region exposure time, and the plurality of sub-exposure times comprises:
controlling, according to the displacement, the MEMS to drive the image sensor to move along the first direction and obtaining, respectively, the plurality of frames of preview images corresponding to the bright-region exposure time, the dark-region exposure time, and the plurality of sub-exposure times.
4. The control method of claim 1, wherein the step of synthesizing the plurality of frames of preview images to form the final image comprises:
selecting the preview image corresponding to the bright-region exposure time as a base image; and
accumulating the identical portions of the plurality of frames of preview images and overlaying them onto the base image to form the final image.
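A hedged sketch of the synthesis of claim 4, with two interpretive assumptions made explicit: "identical portions" is read here as pixels on which all frames agree, and "accumulating" as averaging; the patent pins down neither operation:

```python
import numpy as np

def synthesize(frames, base_index=0):
    """Overlay the accumulated identical portions of the frames onto a base.

    frames[base_index] plays the role of the preview image captured at the
    bright-region exposure time (claim 4's base image)."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    base = stack[base_index]
    accumulated = stack.mean(axis=0)          # "accumulate" read as averaging
    same = np.all(stack == stack[0], axis=0)  # pixels identical in every frame
    final = np.where(same, accumulated, base) # cover the base where frames agree
    return final.astype(np.uint8)
```

On pixels where the frames differ (e.g. where only one sub-exposure captured detail), this sketch falls back to the base image rather than blending.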
5. A control device for controlling an imaging device to perform imaging, wherein the imaging device comprises an image sensor, the control device comprising:
a first determining module, configured to determine a bright region and a dark region of a target scene;
a calculating module, configured to calculate, according to the brightness of the bright region and of the dark region, a bright-region exposure time for the bright region and a dark-region exposure time for the dark region, respectively;
a second determining module, configured to determine a displacement of the image sensor according to the boundary between the bright region and the dark region, such that after the movement the scene falls within the bright region or the dark region;
a third determining module, configured to determine a plurality of sub-exposure times according to the bright-region exposure time, the dark-region exposure time, and the displacement;
a control module, configured to control a micro-electro-mechanical system (MEMS) to drive the image sensor to move along a first direction perpendicular to the optical axis and to obtain, respectively, a plurality of frames of preview images corresponding to the bright-region exposure time, the dark-region exposure time, and the plurality of sub-exposure times; and
a synthesizing module, configured to synthesize the plurality of frames of preview images to form a final image;
wherein the third determining module is configured to determine a plurality of sub-exposure positions of the image sensor according to the bright-region exposure time, the dark-region exposure time, and the displacement, and to determine the corresponding plurality of sub-exposure times according to the plurality of sub-exposure positions;
wherein each sub-exposure time is equal to the sum of the preceding exposure time and the time interval during which the image sensor moves between adjacent sub-exposure positions.
6. The control device of claim 5, wherein the first determining module is configured to identify the target scene using a scene recognition technique to determine the bright region and the dark region of the target scene.
7. The control device of claim 5, wherein the control module is configured to control, according to the displacement, the MEMS to drive the image sensor to move along the first direction and to obtain, respectively, the plurality of frames of preview images corresponding to the bright-region exposure time, the dark-region exposure time, and the plurality of sub-exposure times.
8. The control device of claim 5, wherein the synthesizing module is configured to select the preview image corresponding to the bright-region exposure time as a base image, and to accumulate the identical portions of the plurality of frames of preview images and overlay them onto the base image to form the final image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610615988.1A CN106303272B (en) | 2016-07-29 | 2016-07-29 | Control method and control device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106303272A CN106303272A (en) | 2017-01-04 |
CN106303272B true CN106303272B (en) | 2018-03-16 |
Family
ID=57663476
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610615988.1A Active CN106303272B (en) | 2016-07-29 | 2016-07-29 | Control method and control device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106303272B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107295270B (en) * | 2017-07-31 | 2020-01-24 | 爱博康电子(深圳)有限公司 | Image brightness value determination method and device, terminal and computer-readable storage medium |
CN107463052B (en) * | 2017-08-30 | 2020-08-11 | 北京小米移动软件有限公司 | Shooting exposure method and device |
CN111147739A (en) | 2018-03-27 | 2020-05-12 | 华为技术有限公司 | Photographing method, photographing device and mobile terminal |
CN114862722B (en) * | 2022-05-26 | 2023-03-24 | 广州市保伦电子有限公司 | Image brightness enhancement implementation method and processing terminal |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101278549A (en) * | 2005-10-04 | 2008-10-01 | 卢森特技术有限公司 | Multiple exposure optical imaging apparatus |
CN102098438A (en) * | 2009-12-15 | 2011-06-15 | 索尼公司 | Image capturing apparatus and image capturing method |
CN102104738A (en) * | 2009-12-18 | 2011-06-22 | 三星电子株式会社 | Multi-step exposed image acquisition method by electronic shutter and photographing apparatus using the same |
CN102131051A (en) * | 2010-12-28 | 2011-07-20 | 惠州Tcl移动通信有限公司 | Image pick-up equipment and image acquisition method and device thereof |
CN103379287A (en) * | 2012-04-13 | 2013-10-30 | 株式会社东芝 | Light receiver, light reception method and transmission system |
CN105516611A (en) * | 2014-10-08 | 2016-04-20 | 奥林巴斯株式会社 | An imaging device and a shooting method |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000278595A (en) * | 1999-03-26 | 2000-10-06 | Minolta Co Ltd | Digital camera and image pickup method |
CN1140291C (en) * | 2001-12-11 | 2004-03-03 | 张祖乾 | Stone-eliminating Jinyin medicine powder |
US20090174784A1 (en) * | 2008-01-08 | 2009-07-09 | Karlsson Sven-Olof | Camera having digital gray filtering and method of providing same |
Also Published As
Publication number | Publication date |
---|---|
CN106303272A (en) | 2017-01-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106303272B (en) | Control method and control device | |
CN103366352B (en) | Apparatus and method for producing an image with a blurred background | |
CN107959778B (en) | Imaging method and device based on dual camera | |
CA2969482C (en) | Method and apparatus for multiple technology depth map acquisition and fusion | |
JP6911192B2 (en) | Image processing methods, equipment and devices | |
CN105657289B (en) | Control method, control device and electronic device | |
CN101690243B (en) | Image pickup device and image pickup method | |
CN107835372A (en) | Imaging method, device, mobile terminal and storage medium based on dual camera | |
CN108322646A (en) | Image processing method, device, storage medium and electronic equipment | |
US7725019B2 (en) | Apparatus and method for deciding in-focus position of imaging lens | |
CN104604215A (en) | Image capture apparatus, image capture method and program | |
CN105827980B (en) | Focusing control method and device, imaging control method and device, and electronic device | |
KR20170135855A (en) | Automated generation of panning shots | |
CN106257917B (en) | Exposure control device and exposure control method | |
JP2008197531A (en) | Imaging apparatus | |
CN101959020A (en) | Imaging device and imaging method | |
CN102984461B (en) | Image capturing device and control method thereof | |
CN110324532A (en) | Image blurring method, device, storage medium, and electronic equipment | |
CN107087112A (en) | The control method and control device of dual camera | |
CN106254772A (en) | Multi-frame image synthesis method and device | |
CN104717422A (en) | Display apparatus and display method | |
CN108737696A (en) | Picture pick-up device, control method and non-transitory storage medium | |
CN106101567B (en) | Shooting dimming method, device, and mobile terminal | |
JP2009258610A (en) | Focal length detecting device, imaging apparatus, imaging method, camera, focusing device, and focusing method | |
CN107370962A (en) | High dynamic range image capture method, device, and terminal device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
CP01 | Change in the name or title of a patent holder | ||
CP01 | Change in the name or title of a patent holder |
Address after: No. 18 Haibin Road, Wusha, Chang'an Town, Dongguan 523860, Guangdong. Patentee after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp., Ltd. Address before: No. 18 Haibin Road, Wusha, Chang'an Town, Dongguan 523860, Guangdong. Patentee before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp., Ltd.