CN103945116A - Apparatus and method for processing image in mobile terminal having camera - Google Patents

Apparatus and method for processing image in mobile terminal having camera

Info

Publication number
CN103945116A
Authority
CN
China
Prior art keywords
image
images
described multiple
region
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410030654.9A
Other languages
Chinese (zh)
Other versions
CN103945116B (en)
Inventor
尹泳权
孙才植
金杓宰
罗进熙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of CN103945116A publication Critical patent/CN103945116A/en
Application granted granted Critical
Publication of CN103945116B publication Critical patent/CN103945116B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0075Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. increasing, the depth of field or depth of focus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/676Bracketing for image capture at varying focusing conditions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/673Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/743Bracketing, i.e. taking a series of images with varying exposure conditions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/951Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G02B7/38Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals measured at different points on the optical axis, e.g. focussing on two or more planes and comparing image data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)

Abstract

An apparatus and method process an image in a portable terminal having a camera. The apparatus obtains a final image from a plurality of images by performing a focusing function based on a plurality of images photographed by varying focus characteristics of an identical subject.

Description

Apparatus and method for processing an image in a mobile terminal having a camera
Technical field
The present disclosure relates to a method and apparatus for processing an image in a portable terminal having a camera, and more particularly, to a method and apparatus for obtaining a final image from a plurality of images photographed while varying the focus characteristics of the same subject, by performing a focusing function based on those images.
Background
In a conventional camera, using an all-in-focus or out-of-focus effect requires changing the depth of field of the lens by exchanging the lens or changing the F-number. However, a lens assembly structured for interchangeable lenses or a variable aperture is large, so such designs are difficult to apply to mobile phone cameras and compact cameras, which follow the trend toward miniaturization.
Alternatively, the out-of-focus effect can be produced by dividing an image obtained with a camera having a large depth of focus into a focused portion and a background using a segmentation technique and applying a blur filter to the background. However, limitations of the segmentation technique cause problems in separating the subject from the background, and applying the blur filter can degrade image quality, so it is difficult to obtain a high-quality photograph. The all-in-focus effect can be produced with EDoF (Extended Depth of Focus) technology, which expands the depth of focus through image processing, but this method is rarely used because of the resulting degradation of image quality. Further, a refocusing function can be provided by cameras that obtain light-path information, such as plenoptic cameras and array cameras, but these have difficulty obtaining high-resolution pictures.
Therefore, in keeping with the trend toward miniaturization, it is necessary to develop techniques that provide focusing functions such as all-in-focus, out-of-focus, and refocusing for mobile phone cameras and compact cameras.
Summary of the invention
To address the above-discussed deficiencies, it is a primary object to provide an apparatus and method for obtaining a final image from a plurality of images photographed while varying the focus characteristics of the same subject, by performing a focusing function based on the plurality of images.
According to the present disclosure, an apparatus for processing an image in a mobile terminal having a camera includes: a display unit; and a control unit configured to perform a focusing function based on a plurality of first images photographed while varying the focus characteristics of the same subject to obtain a third image from the plurality of first images, and to display the obtained third image through the display unit.
Further, according to the present disclosure, a method for processing an image in a mobile terminal having a camera includes: obtaining a third image from a plurality of first images photographed while varying the focus characteristics of the same subject, by performing a focusing function based on the plurality of first images; and displaying the obtained third image.
According to the present disclosure, when an image is captured, a plurality of images having different focus characteristics can be obtained by capturing while the focus characteristics are varied, and focusing functions such as all-in-focus, out-of-focus, and refocusing can be provided by performing a focusing function based on the obtained images.
Before undertaking the detailed description below, it may be advantageous to set forth definitions of certain words and phrases used throughout this patent document: the term "include" and its derivatives mean inclusion without limitation; the term "or" is inclusive, meaning and/or; the phrases "associated with" and "associated therewith," as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, or the like; and the term "controller" means any device, system, or part thereof that controls at least one operation, and such a device may be implemented in hardware, firmware or software, or some combination of at least two of the same. It should be noted that the functionality associated with any particular controller may be centralized or distributed, whether locally or remotely. Definitions for certain words and phrases are provided throughout this patent document, and those of ordinary skill in the art should understand that in many, if not most, instances such definitions apply to prior, as well as future, uses of the defined words and phrases.
Brief description of the drawings
For a more complete understanding of the present disclosure and its advantages, reference is now made to the following description taken in conjunction with the accompanying drawings, in which like reference numerals represent like parts:
Fig. 1 is a block diagram illustrating the internal structure of a portable terminal according to an embodiment of the present disclosure;
Fig. 2 is a flowchart illustrating a process of processing an image according to an embodiment of the present disclosure;
Fig. 3 illustrates an example of a photographing scene according to an embodiment of the present disclosure;
Fig. 4 illustrates variation of the lens position according to an embodiment of the present disclosure;
Fig. 5 is a flowchart illustrating a process of processing an image according to another embodiment of the present disclosure;
Fig. 6 is a flowchart illustrating in detail the process of selecting a plurality of second images in Fig. 5; and
Fig. 7 to Fig. 9 illustrate examples of selecting a plurality of second images according to another embodiment of the present disclosure.
Detailed description
Fig. 1 through Fig. 9, discussed below, and the various embodiments used to describe the principles of the present disclosure in this patent document are by way of illustration only and should not be construed in any way to limit the scope of the disclosure. Those skilled in the art will understand that the principles of the present disclosure may be implemented in any suitably arranged system or device. Hereinafter, embodiments of the present disclosure are described in detail with reference to the accompanying drawings. Throughout the drawings, the same reference symbols are used to indicate the same or similar parts. Detailed descriptions of well-known functions and structures incorporated herein may be omitted to avoid obscuring the subject matter of the present disclosure.
Fig. 1 is a block diagram illustrating the internal structure of a portable terminal 100 according to an embodiment of the present disclosure.
Referring to Fig. 1, the portable terminal 100 according to the present disclosure can include a camera unit 110, an audio processing unit 120, a display unit 130, a storage unit 140, and a control unit 150.
The camera unit 110 performs the function of receiving a video signal. The camera unit 110 is configured with a lens 111 and an image sensor 113, and processes frame images such as still images and moving images obtained by the image sensor 113 in a communication mode or a photographing mode. A frame image processed by the camera unit 110 can be output through the display unit 130. Further, a frame image processed by the camera unit 110 can be stored in the storage unit 140 or transmitted to the outside through a wireless communication unit (not shown).
The camera unit 110 can be configured with more than one camera according to the specification of the portable terminal 100. For example, the portable terminal 100 can have two cameras disposed respectively at the same side as the display unit 130 and at the opposite side of the display unit 130.
The audio processing unit 120 can be configured with a codec, and the codec can be configured with a data codec for processing packet data and an audio codec for processing an audio signal such as voice. The audio processing unit 120 converts a digital audio signal into an analog audio signal through the audio codec and plays it through a speaker (SPK), and converts an analog audio signal input through a microphone (MIC) into a digital audio signal through the audio codec.
The display unit 130 visually provides the user with a variety of information, such as menus, input data, and function settings of the portable terminal 100. The display unit 130 performs the function of outputting a booting screen, a standby screen, a menu screen, a communication screen, and an application screen of the portable terminal 100.
The display unit 130 can be configured with an LCD (Liquid Crystal Display), OLED (Organic Light Emitting Diode), AMOLED (Active Matrix Organic Light Emitting Diode), flexible display, or 3-dimensional display.
The storage unit 140 serves to store programs and data required for the operation of the portable terminal 100, and can be divided into a program area and a data area. The program area can store a program for controlling the general operation of the portable terminal 100, an operating system (OS) for booting the portable terminal 100, application programs required for playing multimedia content, and application programs for other optional functions of the portable terminal 100, such as a voice dialog function, a camera function, a sound playing function, and a still or moving image playing function. The data area is an area that stores data generated by using the portable terminal 100, and can store still images, moving images, a phone book, and audio data.
The control unit 150 controls the overall operation of each component of the portable terminal 100. If an image photographing function provided by the portable terminal 100 is executed, the control unit 150 enters an image photographing mode by controlling the camera unit 110. Further, the control unit 150 can photograph an image according to a user operation and store the photographed image in the storage unit 140.
In particular, when capturing a plurality of first images while changing the focus characteristics of the same subject, the control unit 150 according to the present disclosure obtains a plurality of first images having different focus characteristics. In other words, when capturing an image through the camera unit 110, the control unit 150 photographs images while repeatedly changing the distance between the image sensor 113 and the lens 111, and thereby obtains the plurality of first images having different focus characteristics. Further, the control unit 150 can obtain a third image from the plurality of first images by performing a focusing function based on the plurality of first images. To this end, the control unit 150 according to the present disclosure can be configured with a lens position controller 151, an image processing unit 153, and an image storage unit 155.
Hereinafter, an image processing operation according to an embodiment of the present disclosure is described with reference to Fig. 2 to Fig. 4.
Fig. 2 is a flowchart illustrating a process of processing an image according to an embodiment of the present disclosure, Fig. 3 illustrates an example of a photographing scene according to an embodiment of the present disclosure, and Fig. 4 illustrates variation of the lens position according to an embodiment of the present disclosure.
In Fig. 2, if the portable terminal enters the image photographing mode (S210), then, when an image is captured according to a user operation, the lens position controller 151 changes the position of the lens 111 installed in the camera unit 110 to obtain a plurality of first images having different focus characteristics (S220). In other words, the lens position controller 151 can obtain the plurality of first images by photographing while repeatedly changing the position of the lens 111, thereby adjusting the distance between the lens 111 and the image sensor 113.
Referring to Fig. 3, when capturing a scene in which objects are ordered by their distance from the camera unit 110 (object 3, object 2, object 1), the lens position controller 151 can obtain images while the focal plane (FP) is adjusted by changing the position of the lens 111 in the camera unit 110. In other words, as the distance between the lens 111 and the image sensor 113 increases, the focal plane (FP) moves closer to the camera unit 110. Meanwhile, the farther an object is located from the focal plane (FP), the more blurred its image becomes. Referring to Fig. 4, if image capture starts (st) according to a user operation, the lens position controller 151 obtains a plurality of images having different focus characteristics by adjusting the distance between the lens 111 and the image sensor 113 (D1, D2, ..., Dn).
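The focus bracketing described above amounts to a simple capture loop over lens-to-sensor distances. The sketch below is illustrative only: the `Camera` object and its `set_lens_distance()` and `capture()` methods are hypothetical placeholders for whatever driver interface the camera unit 110 actually exposes, which this disclosure does not specify.

```python
def capture_focus_bracket(camera, distances):
    """Capture one frame per lens-to-sensor distance (D1 ... Dn).

    `camera` is assumed to expose set_lens_distance() and capture();
    both names are placeholders for the real camera driver API.
    """
    first_images = []
    for d in distances:
        camera.set_lens_distance(d)   # move the lens, shifting the focal plane (FP)
        frame = camera.capture()      # grab a frame with this focus characteristic
        first_images.append((d, frame))
    return first_images
```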
Returning to Fig. 2, the image processing unit 153 obtains a third image from the plurality of first images by performing a focusing function based on the plurality of first images captured while the position of the lens 111 was changed by the lens position controller 151 (S230).
In more detail, the image processing unit 153 can obtain the third image from the plurality of first images by comparing the plurality of first images with each other and synthesizing the focused images of the plurality of subjects appearing in the plurality of first images. In other words, for an image containing a plurality of objects, the image processing unit 153 can perform the all-in-focus function by obtaining, from the plurality of first images, the images in which each object is in focus and synthesizing the images focused on each object.
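As an illustration of such an all-in-focus composition, the sketch below picks, for every pixel, the value from the frame that is locally sharpest. It assumes grayscale frames stacked in a NumPy array and uses the absolute Laplacian response as the sharpness measure; that measure and the smoothing window are illustrative choices, not details fixed by the disclosure.

```python
import numpy as np
from scipy import ndimage

def all_in_focus(stack):
    """stack: array of shape (n_frames, H, W), one grayscale frame per lens position."""
    # Per-pixel sharpness of each frame: absolute Laplacian response (illustrative choice).
    sharpness = np.stack([np.abs(ndimage.laplace(f.astype(float))) for f in stack])
    # Smooth the sharpness maps so the per-pixel selection is locally consistent.
    sharpness = np.stack([ndimage.uniform_filter(s, size=9) for s in sharpness])
    best = np.argmax(sharpness, axis=0)        # index of the sharpest frame at each pixel
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]             # composite with every object in focus
```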
Further, the image processing unit 153 can obtain the third image by comparing and analyzing the plurality of first images with each other, synthesizing the first image selected by the user from the plurality of first images as a focused image, and synthesizing the remaining first images as unfocused images. In other words, the image processing unit 153 can perform the out-of-focus function and the refocusing function by rendering the object selected by the user from among the plurality of objects as focused and rendering the remaining objects as unfocused.
Here, the image processing unit 153 can obtain the third image from the plurality of first images after applying a blur filter to the remaining first images. In other words, the image processing unit 153 can increase the degree of blur of the remaining objects so that the image portion selected by the user appears sharper.
Further, when applying the blur filter, the image processing unit 153 can apply a different blur filter to each of the remaining images. In other words, the image processing unit 153 does not apply an identical blur filter to the remaining images, but applies a different blur filter to each remaining image.
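A minimal way to picture this per-image blurring is given below. The sketch assumes a Gaussian blur whose strength grows with how far each remaining frame's lens position lies from the selected frame's; both the Gaussian kernel and the scaling rule are illustrative assumptions rather than the specific blur filters used by the image processing unit 153.

```python
import numpy as np
from scipy import ndimage

def blur_remaining(frames, lens_positions, selected_idx, base_sigma=1.5):
    """Apply a different Gaussian blur to every frame except the user-selected one."""
    out = []
    for i, frame in enumerate(frames):
        if i == selected_idx:
            out.append(frame)                  # keep the selected frame sharp
            continue
        # Blur more strongly the farther this lens position is from the selected one.
        distance = abs(lens_positions[i] - lens_positions[selected_idx])
        sigma = base_sigma * (1 + distance)
        out.append(ndimage.gaussian_filter(frame.astype(float), sigma=sigma))
    return out
```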
Subsequently, the image processing unit 153 can display the obtained third image through the display unit 130 (S240). Here, the image storage unit 155 can merge the plurality of first images into one file format and store it in the storage unit 140. For example, the plurality of images obtained by changing the focus characteristics for the same subject can be included in one image file stored in the storage unit 140. The image processing unit 153 can perform the refocusing function by using the plurality of first images having different focus characteristics that have been merged and stored in the file format. In other words, the image processing unit 153 can perform the refocusing function by using the plurality of images included in one image file stored in the storage unit 140 and applying the out-of-focus function only to the image selected by the user.
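To make the stored-file refocusing idea concrete, the sketch below merges the focus stack and its lens distances into a single container and later re-renders it with a newly selected frame kept sharp. The `.npz` container and the function names are assumptions made purely for illustration; the disclosure does not specify the merged file format. The example reuses the `blur_remaining()` sketch shown earlier.

```python
import numpy as np

def save_focus_stack(path, frames, lens_distances):
    """Merge the first images and their lens distances into one file (illustrative format)."""
    # `path` should end with ".npz" so save and load refer to the same file.
    np.savez(path, frames=np.stack(frames), lens_distances=np.asarray(lens_distances))

def refocus_from_file(path, selected_idx):
    """Reload the stored stack and render the newly selected frame sharp, blurring the rest."""
    data = np.load(path)
    frames = list(data["frames"])
    distances = list(data["lens_distances"])
    return blur_remaining(frames, distances, selected_idx)
```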
Hereinafter, a method of processing an image according to another embodiment of the present disclosure is described with reference to Fig. 5.
Fig. 5 is a flowchart illustrating a process of processing an image according to another embodiment of the present disclosure.
The process of processing an image according to this embodiment is substantially similar to that of the previous embodiment; therefore only the differences are described hereinafter.
Referring to Fig. 5, if the portable terminal 100 enters the image photographing mode (S510), then, when an image is captured according to a user operation, the lens position controller 151 obtains a plurality of first images having different focus characteristics by changing the position of the lens 111 installed in the camera unit 110 (S520).
Subsequently, when the position of the lens 111 has been changed by the lens position controller 151, the image processing unit 153 selects a plurality of second images from the plurality of first images (S530).
Then, the image processing unit 153 obtains a third image from the plurality of second images by performing a focusing function based on the plurality of second images selected from the plurality of first images (S540). In other words, the image processing unit 153 can perform the all-in-focus, out-of-focus, or refocusing function based on the plurality of second images.
Subsequently, the image processing unit 153 can display the obtained third image through the display unit 130 (S550). Here, the image storage unit 155 can merge the plurality of second images into one file format and store them in the storage unit 140.
Hereinafter, the process of selecting the plurality of second images according to another embodiment of the present disclosure is described in further detail with reference to Fig. 6 to Fig. 9.
Fig. 6 is a flowchart illustrating in detail the process of selecting the plurality of second images in Fig. 5, and Fig. 7 to Fig. 9 illustrate examples of selecting the plurality of second images according to another embodiment of the present disclosure. The process shown in Fig. 6 is an example of one embodiment of step S530 in Fig. 5.
Referring to Fig. 6, the image processing unit 153 divides each of the plurality of first images into a predetermined plurality of regions (S610). As shown in Fig. 7, the image processing unit 153 can divide a first image (IM) into "m × n" regions (IR) according to a predetermined size.
Subsequently, the image processing unit 153 calculates an edge value for each region of the divided plurality of first images (S620). The edge value represents the average sharpness of each region; the larger the edge value, the sharper the region. As shown in Fig. 8, the image processing unit 153 calculates the edge value of each region (IR) for each of the divided first images (IM_1, IM_2, ..., IM_n). For example, the image processing unit 153 computes differential values from the differences between each pixel forming a region (IR) and its neighboring pixels, and calculates the edge value of the region (IR) based on the computed differential values.
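As one way to picture step S620, the sketch below splits a grayscale frame into m × n regions and uses the mean absolute difference between neighbouring pixels as each region's edge value; the exact differential operator is an illustrative assumption rather than a detail fixed by the disclosure.

```python
import numpy as np

def region_edge_values(image, m, n):
    """Return an (m, n) array with one edge value per region of a grayscale image."""
    img = image.astype(float)
    # Differences between each pixel and its right/bottom neighbours.
    grad = np.zeros_like(img)
    grad[:, :-1] += np.abs(np.diff(img, axis=1))
    grad[:-1, :] += np.abs(np.diff(img, axis=0))
    h, w = img.shape
    edges = np.zeros((m, n))
    for i in range(m):
        for j in range(n):
            block = grad[i * h // m:(i + 1) * h // m, j * w // n:(j + 1) * w // n]
            edges[i, j] = block.mean()   # average sharpness of this region (IR)
    return edges
```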
Then, based on the calculated edge value of each region, the image processing unit 153 obtains the lens position focused on each region (S630). For example, the image processing unit 153 compares the edge value of a region (IR) with the corresponding edge values calculated for the plurality of first images (IM_1, IM_2, ..., IM_n), and can obtain, from the plurality of first images, the position of the lens 111 corresponding to the image having the highest edge value. The image processing unit 153 can obtain the lens position focused on each region by repeating this process for every region (IR). In other words, the image processing unit 153 can obtain, for each region (IR), the distance between the lens 111 and the image sensor 113 at which that region is in focus.
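Building on the previous sketch, step S630 can then be read as picking, for every region, the lens distance whose frame produced the highest edge value there; this is only an illustrative rendering of the comparison the disclosure describes.

```python
import numpy as np

def focused_distance_per_region(frames, lens_distances, m, n):
    """For each region, return the lens distance whose frame has the highest edge value."""
    # One (m, n) edge map per captured frame, using region_edge_values() from above.
    edge_maps = np.stack([region_edge_values(f, m, n) for f in frames])   # (n_frames, m, n)
    best_frame = np.argmax(edge_maps, axis=0)                             # (m, n) frame indices
    return np.asarray(lens_distances)[best_frame]                         # (m, n) focused distances
```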
Subsequently, the image processing unit 153 selects a plurality of second images from the plurality of first images based on the number of regions focused at each lens position (S640). In more detail, from the lens-to-sensor distance information obtained for each region (IR), the image processing unit 153 can obtain the number of regions that are in focus at each position of the lens 111. For example, the image processing unit 153 can obtain the histogram shown in Fig. 9. If the distance between the lens 111 and the image sensor 113 is 'Dn', the captured image (IM_n) has about 8 focused regions; if the distance is 'D2', the captured image (IM_2) has about 2 focused regions; and if the distance is 'D1', the captured image (IM_1) has about 6 focused regions.
The image processing unit 153 can select the plurality of second images from the plurality of first images based on the number of focused regions corresponding to each position of the lens 111. Here, the image processing unit 153 can determine the number of second images to select from the plurality of first images by using information such as the total number of divided regions (IR), the number of objects included in the image, and the number of focused regions. For example, if the number of objects included in the image is 3, the image processing unit 153 can select 4 second images from the plurality of first images. Further, the image processing unit 153 can select, as the second images, the images having the largest numbers of focused regions. Of course, the number of second images selected from the plurality of first images can also be predetermined.
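Step S640 can then be pictured as a histogram over those per-region distances followed by a top-k selection, as in the sketch below; here k is passed in directly, whereas the disclosure lets it be derived from the number of objects, the number of regions, or a predetermined value.

```python
import numpy as np
from collections import Counter

def select_second_images(frames, lens_distances, m, n, k):
    """Pick the k frames whose lens positions focus the largest numbers of regions."""
    per_region = focused_distance_per_region(frames, lens_distances, m, n)
    counts = Counter(per_region.ravel().tolist())            # focused-region count per lens distance
    ranked = sorted(counts, key=counts.get, reverse=True)    # lens distances, most regions first
    chosen = set(ranked[:k])
    return [f for f, d in zip(frames, lens_distances) if d in chosen]
```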
The image processing unit 153 can also select the plurality of second images from the plurality of first images based on the number of focused regions corresponding to each position of the lens 111 and a predetermined importance factor. Here, the importance factor includes the distance from the center of the image, the type of object, and the degree of color change. In general, a user takes a photograph with the main object positioned at the center of the image; therefore a region closer to the center of the image has a higher importance-factor value, and a region farther from the center has a lower value. Using this characteristic, the image processing unit 153 can weight the focused regions corresponding to each position of the lens 111. The image processing unit 153 can then select the plurality of second images from the plurality of first images based on the weighted number of focused regions corresponding to each lens position.
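The centre-weighting variant can be sketched by weighting each region's vote by its distance from the image centre before counting; the linear fall-off used here is an illustrative stand-in for the importance factor, which the disclosure describes only qualitatively (distance from the centre, object type, degree of colour change).

```python
import numpy as np

def weighted_region_counts(per_region_distances):
    """Sum centre-weighted votes per lens distance instead of raw region counts."""
    m, n = per_region_distances.shape
    ci, cj = (m - 1) / 2.0, (n - 1) / 2.0
    weights = {}
    for i in range(m):
        for j in range(n):
            # Regions near the image centre count more (illustrative linear fall-off).
            dist = np.hypot((i - ci) / max(ci, 1.0), (j - cj) / max(cj, 1.0))
            w = max(0.0, 1.0 - 0.5 * dist)
            d = float(per_region_distances[i, j])
            weights[d] = weights.get(d, 0.0) + w
    return weights   # lens distance -> weighted number of focused regions
```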
As described above, when an image is captured, a plurality of images having different focus characteristics can be obtained, and focusing functions such as all-in-focus, out-of-focus, and refocusing can be provided by performing a focusing function based on the obtained images.
The present disclosure has described the control unit 150, the lens position controller 151, the image processing unit 153, and the image storage unit 155 as separate blocks each performing a different function. However, this is only for convenience of description, and the functions need not be divided as described above. For example, a specific function performed by the lens position controller 151 and the image processing unit 153 can be performed by the control unit 150.
Further, a method has been described in which a plurality of first images having different focus characteristics are obtained when an image is captured and a third image is obtained by performing a focusing function based on the plurality of obtained images, but the present disclosure is not limited thereto. According to another embodiment, the plurality of first images obtained when capturing an image can be merged and stored as one image file, and if the stored image file is selected according to a user operation, a focusing function can be performed, according to the user operation, based on the plurality of first images included in the image file.
Although the present disclosure has been described with exemplary embodiments, various changes and modifications may be suggested to one skilled in the art. It is intended that the present disclosure encompass such changes and modifications as fall within the scope of the appended claims.

Claims (15)

1. An apparatus for processing an image in a mobile terminal having a camera, the apparatus comprising:
a display unit; and
a control unit configured to perform a focusing function based on a plurality of first images photographed while varying focus characteristics of a same subject to obtain a third image from the plurality of first images, and to cause the display unit to display the obtained third image.
2. The apparatus of claim 1, further comprising:
a camera unit including an image sensor and a lens,
wherein the control unit is configured to obtain the plurality of first images having different focus characteristics by changing a distance between the image sensor and the lens when producing an image using the camera unit.
3. The apparatus of claim 2, wherein the control unit is configured to divide each of the plurality of first images into a predetermined plurality of regions, calculate an edge value of each region for each of the plurality of first images, obtain, based on the calculated edge value of each region, a lens position focused on each region, select a plurality of second images from the plurality of first images based on a number of regions focused by the lens at each position, and obtain the third image based on the plurality of second images.
4. The apparatus of claim 3, wherein the control unit is configured to select the plurality of second images from the plurality of first images based on the number of regions focused by the lens at each position and a predetermined importance factor.
5. The apparatus of claim 3, further comprising:
a storage unit,
wherein the control unit is configured to store the plurality of first images or the plurality of second images in the storage unit in a file format in which the plurality of first images or the plurality of second images are merged.
6. The apparatus of claim 1, wherein the control unit is configured to obtain the third image from the plurality of first images by comparing the plurality of first images with each other and synthesizing focused images selected from the plurality of first images according to subjects shown in the plurality of first images.
7. The apparatus of claim 1, wherein the plurality of first images comprises one or more images focused on different objects, and wherein the third image comprises portions focused on the different objects, produced from at least a portion of the one or more images.
8. A method for processing an image in a mobile terminal having a camera, the method comprising:
obtaining a third image by performing a focusing function based on a plurality of first images photographed while varying focus characteristics of a same subject; and
displaying the obtained third image.
9. The method of claim 8, further comprising:
obtaining the plurality of first images having different focus characteristics by repeatedly changing a distance between an image sensor and a lens when producing an image using the camera.
10. The method of claim 9, wherein obtaining the third image comprises:
dividing each of the plurality of first images into a predetermined plurality of regions;
calculating an edge value for each region of the plurality of first images;
obtaining, based on the calculated edge value of each region, a lens position focused on each region;
selecting a plurality of second images from the plurality of first images based on a number of regions focused by the lens at each position; and
obtaining the third image based on the selected plurality of second images.
11. The method of claim 10, wherein selecting the plurality of second images comprises: selecting the plurality of second images from the plurality of first images based on the number of regions focused by the lens at each position and a predetermined importance factor.
12. The method of claim 8, wherein obtaining the third image comprises: obtaining the third image from the plurality of first images by comparing the plurality of first images with each other and synthesizing focused images selected from the plurality of first images according to subjects shown in the plurality of first images.
13. The method of claim 8, wherein obtaining the third image comprises: obtaining the third image from the plurality of first images by comparing the plurality of first images with each other, synthesizing a portion of the plurality of first images selected by a user as a focused image, and synthesizing the remaining plurality of first images as unfocused images.
14. The method of claim 13, wherein obtaining the third image comprises: obtaining the third image from the plurality of first images after applying a blur filter to the remaining plurality of first images.
15. The method of claim 14, wherein obtaining the third image comprises: applying, when applying the blur filter, a different blur filter to each image corresponding to the remaining plurality of first images.
CN201410030654.9A 2013-01-23 2014-01-22 Apparatus and method for processing an image in a mobile terminal having a camera Active CN103945116B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2013-0007252 2013-01-23
KR1020130007252A KR102022892B1 (en) 2013-01-23 2013-01-23 Apparatus and method for processing image of mobile terminal comprising camera

Publications (2)

Publication Number Publication Date
CN103945116A true CN103945116A (en) 2014-07-23
CN103945116B CN103945116B (en) 2018-06-22

Family

ID=50030063

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410030654.9A Active CN103945116B (en) Apparatus and method for processing an image in a mobile terminal having a camera

Country Status (4)

Country Link
US (1) US9167150B2 (en)
EP (1) EP2760197B1 (en)
KR (1) KR102022892B1 (en)
CN (1) CN103945116B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107862243A (en) * 2016-09-21 2018-03-30 佳能株式会社 Search apparatus, image pickup apparatus including the search apparatus, and search method
CN108234867A (en) * 2017-12-21 2018-06-29 维沃移动通信有限公司 Image processing method and mobile terminal
WO2019033970A1 (en) * 2017-08-17 2019-02-21 捷开通讯(深圳)有限公司 Method for image virtualization, mobile device, and storage device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6325841B2 (en) * 2014-02-27 2018-05-16 オリンパス株式会社 Imaging apparatus, imaging method, and program
CN105578045A (en) 2015-12-23 2016-05-11 努比亚技术有限公司 Terminal and shooting method of terminal
CN106937045B (en) * 2017-02-23 2020-08-14 华为机器有限公司 Display method of preview image, terminal equipment and computer storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060061678A1 (en) * 2004-09-17 2006-03-23 Casio Computer Co., Ltd. Digital cameras and image pickup methods
CN101090442A (en) * 2006-06-13 2007-12-19 三星电子株式会社 Method and apparatus for taking images using mobile communication terminal with plurality of camera lenses
US20080259176A1 (en) * 2007-04-20 2008-10-23 Fujifilm Corporation Image pickup apparatus, image processing apparatus, image pickup method, and image processing method
JP2008271241A (en) * 2007-04-20 2008-11-06 Fujifilm Corp Imaging apparatus, image processing apparatus, imaging method, and image processing method
US20090169122A1 (en) * 2007-12-27 2009-07-02 Motorola, Inc. Method and apparatus for focusing on objects at different distances for one image
CN101480040A (en) * 2006-07-07 2009-07-08 索尼爱立信移动通讯有限公司 Active autofocus window
US20110135208A1 (en) * 2009-12-03 2011-06-09 Qualcomm Incorporated Digital image combining to produce optical effects
US20110187900A1 (en) * 2010-02-01 2011-08-04 Samsung Electronics Co., Ltd. Digital image processing apparatus, an image processing method, and a recording medium storing the image processing method
CN102273213A (en) * 2009-01-29 2011-12-07 三星电子株式会社 Image data obtaining method and apparatus therefor

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7084910B2 (en) * 2002-02-08 2006-08-01 Hewlett-Packard Development Company, L.P. System and method for using multiple images in a digital image capture device
JP4747568B2 (en) * 2004-12-03 2011-08-17 カシオ計算機株式会社 Camera apparatus and photographing control program
JP4135726B2 (en) * 2005-04-20 2008-08-20 オムロン株式会社 Manufacturing condition setting system, manufacturing condition setting method, control program, and computer-readable recording medium recording the same
JP5094070B2 (en) * 2006-07-25 2012-12-12 キヤノン株式会社 Imaging apparatus, imaging method, program, and storage medium
JP5028574B2 (en) * 2007-08-10 2012-09-19 株式会社メガチップス Digital camera system
US8154647B2 (en) * 2008-03-05 2012-04-10 Applied Minds, Llc Automated extended depth of field imaging apparatus and method
US8675085B2 (en) * 2010-07-14 2014-03-18 James Randall Beckers Camera that combines images of different scene depths
JP5144724B2 (en) * 2010-07-15 2013-02-13 富士フイルム株式会社 Imaging apparatus, image processing apparatus, imaging method, and image processing method

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060061678A1 (en) * 2004-09-17 2006-03-23 Casio Computer Co., Ltd. Digital cameras and image pickup methods
CN101090442A (en) * 2006-06-13 2007-12-19 三星电子株式会社 Method and apparatus for taking images using mobile communication terminal with plurality of camera lenses
CN101480040A (en) * 2006-07-07 2009-07-08 索尼爱立信移动通讯有限公司 Active autofocus window
US20080259176A1 (en) * 2007-04-20 2008-10-23 Fujifilm Corporation Image pickup apparatus, image processing apparatus, image pickup method, and image processing method
JP2008271241A (en) * 2007-04-20 2008-11-06 Fujifilm Corp Imaging apparatus, image processing apparatus, imaging method, and image processing method
US20090169122A1 (en) * 2007-12-27 2009-07-02 Motorola, Inc. Method and apparatus for focusing on objects at different distances for one image
CN102273213A (en) * 2009-01-29 2011-12-07 三星电子株式会社 Image data obtaining method and apparatus therefor
US20110135208A1 (en) * 2009-12-03 2011-06-09 Qualcomm Incorporated Digital image combining to produce optical effects
US20110187900A1 (en) * 2010-02-01 2011-08-04 Samsung Electronics Co., Ltd. Digital image processing apparatus, an image processing method, and a recording medium storing the image processing method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107862243A (en) * 2016-09-21 2018-03-30 佳能株式会社 Search apparatus, image pickup apparatus including the search apparatus, and search method
CN107862243B (en) * 2016-09-21 2021-10-19 佳能株式会社 Search apparatus, image pickup apparatus including the same, and search method
WO2019033970A1 (en) * 2017-08-17 2019-02-21 捷开通讯(深圳)有限公司 Method for image virtualization, mobile device, and storage device
CN108234867A (en) * 2017-12-21 2018-06-29 维沃移动通信有限公司 Image processing method and mobile terminal

Also Published As

Publication number Publication date
US9167150B2 (en) 2015-10-20
US20140204236A1 (en) 2014-07-24
KR102022892B1 (en) 2019-11-04
CN103945116B (en) 2018-06-22
KR20140094791A (en) 2014-07-31
EP2760197B1 (en) 2017-03-08
EP2760197A1 (en) 2014-07-30

Similar Documents

Publication Publication Date Title
US20170064174A1 (en) Image shooting terminal and image shooting method
KR102338576B1 (en) Electronic device which stores depth information associating with image in accordance with Property of depth information acquired using image and the controlling method thereof
CN107950018B (en) Image generation method and system, and computer readable medium
US9692959B2 (en) Image processing apparatus and method
US9179070B2 (en) Method for adjusting focus position and electronic apparatus
JP5567235B2 (en) Image processing apparatus, photographing apparatus, program, and image processing method
TWI706379B (en) Method, apparatus and electronic device for image processing and storage medium thereof
CN103945116A (en) Apparatus and method for processing image in mobile terminal having camera
CN112714255B (en) Shooting method and device, electronic equipment and readable storage medium
CN104660909A (en) Image acquisition method, image acquisition device and terminal
US20140168371A1 (en) Image processing apparatus and image refocusing method
CN112637515B (en) Shooting method and device and electronic equipment
CN113411498A (en) Image shooting method, mobile terminal and storage medium
CN112738397A (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN108986117B (en) Video image segmentation method and device
CN112017137A (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN108259767B (en) Image processing method, image processing device, storage medium and electronic equipment
CN114390197A (en) Shooting method and device, electronic equipment and readable storage medium
CN113989387A (en) Camera shooting parameter adjusting method and device and electronic equipment
CN115623313A (en) Image processing method, image processing apparatus, electronic device, and storage medium
TW201911853A (en) Dual-camera image pick-up apparatus and image capturing method thereof
CN107085841B (en) Picture zooming processing method and terminal
CN112565586A (en) Automatic focusing method and device
CN110933300B (en) Image processing method and electronic terminal equipment
CN104994294A (en) Shooting method of multiple wide-angle lenses and mobile terminal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant