WO2022188007A1 - Image processing method and electronic device - Google Patents

Image processing method and electronic device

Info

Publication number
WO2022188007A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
image
camera image
objects
focus distance
Prior art date
Application number
PCT/CN2021/079594
Other languages
French (fr)
Inventor
Remma SUGAWARA
Jun Luo
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to PCT/CN2021/079594
Priority to CN202180095508.2A
Publication of WO2022188007A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106: Processing image signals
    • H04N 13/128: Adjusting depth or disparity
    • H04N 13/20: Image signal generators
    • H04N 13/204: Image signal generators using stereoscopic image cameras
    • H04N 13/239: Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H04N 13/296: Synchronisation thereof; Control thereof
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/62: Control of parameters via user interfaces
    • H04N 23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/631: Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

An image processing method includes: acquiring depth information indicating distances between a camera and objects, the camera focusing on a subject among the objects and imaging the objects to acquire a camera image; changing a focus distance, which is a distance between the camera and an in-focus position of the camera, in accordance with a first user operation; and generating a bokeh on the camera image based on the depth information and the focus distance.

Description

IMAGE PROCESSING METHOD AND ELECTRONIC DEVICE

TECHNICAL FIELD
The present disclosure relates to an image processing method and an electronic device.
BACKGROUND
In recent years, a technique for generating bokeh in images of objects located in the foreground and the background of a subject has been applied to camera images obtained by imaging the subject with a camera having a deep depth of field, such as a smartphone camera.
When a subject is imaged using a camera with a deep depth of field such as a smartphone camera, an image that appears to be in focus from a close position to a distant position is obtained. Therefore, in order to obtain an image in which only the subject is clear and the foreground and the background of the subject are blurred, bokeh is generated by image processing.
In such a technique for generating bokeh, bokeh is generated based on depth information including a distance between the camera and the subject.
However, in conventional technologies, the focus distance, which is a distance between the camera and an in-focus position of the camera, is fixed when the subject is imaged.
Therefore, it is difficult to generate bokeh in an area that suits the user’s preference by means of a simple operation.
SUMMARY
The present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure needs to provide an image processing method and an electronic device.
In accordance with the present disclosure, an image processing method includes:
acquiring depth information indicating distances between a camera and objects, the camera focusing on a subject among the objects and imaging the objects to acquire a camera image;
changing a focus distance which is a distance between the camera and an in-focus position of the camera in accordance with a first user operation; and
generating a bokeh on the camera image based on the depth information and the focus distance.
In one example, the image processing method may further include changing the intensity of the bokeh in accordance with a second user operation.
In one example, the first user operation may be an operation of a first slider which is displayed by a display device, the display device displaying the camera image.
In one example, the second user operation may be an operation of a second slider which is displayed by a display device, the display device displaying the camera image.
In one example, the camera image may be a still image.
In one example, the camera image may be a moving image.
In one example, the generating the bokeh may be performed when the objects are imaged.
In one example, the generating the bokeh may be performed after the objects are imaged and may be performed based on a camera image stored in a memory.
In one example, the first user operation may be a single pressing operation of a button which is displayed by a display device, the display device displaying the camera image.
In one example, the camera image may be a moving image, and
the changing the focus distance may be performed within a predetermined range of the focus distance in response to the pressing operation during imaging of the moving image.
In one example, the predetermined range may be a range from a minimum value to a maximum value of the focus distance.
In one example, the image processing method may further include selecting a plurality of targets from the camera image in accordance with a third user operation, the camera image being displayed by a display device, wherein
the changing the focus distance may be performed so as to focus, one by one, on selected targets.
In one example, the third user operation may be tapping the targets in the camera image.
In one example, the image processing method may further include tracking the selected targets, wherein
the changing the focus distance may be performed based on a tracking result of the selected targets.
In accordance with the present disclosure, an electronic device includes:
a camera which focuses on a subject among objects and which images the objects to acquire a camera image; and
a processor, wherein the processor is configured to
acquire depth information indicating distances between the camera and the objects,
change a focus distance which is a distance between the camera and an in-focus position of the camera in accordance with a first user operation, and
generate a bokeh on the camera image based on the depth information and the focus distance.
BRIEF DESCRIPTION OF THE DRAWINGS
These and/or other aspects and advantages of the embodiments of the present disclosure will become apparent and more readily appreciated from the following descriptions made with reference to the drawings, in which:
FIG. 1 is a diagram showing a configuration example of an electronic device capable of implementing the image processing method according to an embodiment of the present disclosure;
FIG. 2 is a flowchart showing the image processing method according to an embodiment of the present disclosure;
FIG. 3 is a diagram which shows setting a focus distance on a short distance side in the image processing method shown in FIG. 2;
FIG. 4 is a diagram which shows setting a focus distance on a long distance side in the image processing method shown in FIG. 2;
FIG. 5 is a flowchart showing an image processing method according to an embodiment of the present disclosure;
FIG. 6 is a flowchart showing an image processing method according to an embodiment of the present disclosure;
FIG. 7 is a diagram which shows changing the focus distance from the minimum value to the maximum value in the image processing method shown in FIG. 6;
FIG. 8 is a flowchart showing an image processing method according to an embodiment of the present disclosure;
FIG. 9 is a diagram showing a plurality of targets in the image processing method shown in FIG. 8;
FIG. 10 is a diagram which shows changing the focus distance in accordance with a plurality of targets in the image processing method shown in FIG. 8, and
FIG. 11 is a diagram which shows tracking a target in the image processing method shown in FIG. 8.
DETAILED DESCRIPTION
Embodiments of the present disclosure will be described in detail and examples of the embodiments will be illustrated in the accompanying drawings. The same or similar elements and elements having the same or similar functions are denoted by like reference numerals throughout the descriptions. The embodiments described herein with reference to the drawings are explanatory and aim to illustrate the present disclosure, but they shall not be construed to limit the present disclosure.
(First Embodiment)
FIG. 1 is a diagram showing a configuration example of an electronic device capable of implementing the image processing method according to the first embodiment of the present disclosure. In the example shown in FIG. 1, the electronic device 100 includes a stereo camera module 10 as an example of a camera, a range sensor module 20, and an image signal processor 30 as an example of a processor. The image signal processor 30 controls the stereo camera module 10 and the range sensor module 20, and processes camera image data acquired from the stereo camera module 10.
In the example shown in FIG. 1, the stereo camera module 10 includes a master camera module 11 as an example of a camera and a slave camera module 12 to be used for binocular stereo viewing. The master camera module 11 includes a first lens 11a that is capable of focusing on a subject, a first image sensor 11b that detects an image inputted via the first lens 11a, and a first image sensor driver 11c that drives the first image sensor 11b. The master camera module 11 focuses on the subject among objects within a viewing angle of the master camera module 11 and images the objects to acquire a master camera image as an example of a camera image.
In the example shown in FIG. 1, the slave camera module 12 includes a second lens 12a that is capable of focusing on a subject, a second image sensor 12b that detects an image inputted via the second lens 12a, and a second image sensor driver 12c that drives the second image sensor 12b. The slave camera module 12 focuses on the subject among the objects and images the objects to acquire a slave camera image.
The range sensor module 20 includes a lens 20a, a range sensor 20b, a range sensor driver 20c and a projector 20d, as shown in FIG. 1. The projector 20d emits pulsed light toward the objects including a subject and the range sensor 20b detects reflection light from the objects through the lens 20a. The range sensor module 20 acquires time of flight (ToF) depth information (ToF depth value) based on the time from emitting the pulsed light to receiving the reflection light. The resolution of the ToF depth information detected by the range sensor module 20 is lower than the resolution of stereo depth information of a stereo image that is acquired based on the master camera image and the slave camera image.
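As an illustration of the time-of-flight relation described above, the sketch below converts per-pixel round-trip times into distances: the pulse travels from the projector to the object and back, so the one-way distance is half the round-trip path. This is a minimal Python sketch; the function and variable names are illustrative, not part of the disclosure.

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_depth(round_trip_time_s: np.ndarray) -> np.ndarray:
    """Convert per-pixel round-trip times into one-way distances.

    The pulse travels camera -> object -> camera, so the distance to
    the object is half the round-trip path length.
    """
    return 0.5 * SPEED_OF_LIGHT * round_trip_time_s

# Example: a coarse 4x4 ToF sensor reporting ~6.67 ns round trips,
# i.e. objects roughly 1 m away (low resolution, as noted above).
times = np.full((4, 4), 6.67e-9)
print(tof_depth(times))  # ~1.0 m per pixel
```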
The image signal processor 30 controls the master camera module 11, the slave camera module 12, and the range sensor module 20. The image signal processor 30 generates bokeh on the master camera image based on the master camera image, the slave camera image, and the ToF depth information. Specifically, the image signal processor 30 may, for example, correct the stereo depth information based on the ToF depth information, the stereo depth information being obtained by stereo processing of the master camera image and the slave camera image. The stereo depth information may indicate a value corresponding to a deviation in a horizontal direction (x direction) between corresponding pixels of the master camera image and the slave camera image. The image signal processor 30 may generate bokeh based on the corrected stereo depth information. The bokeh may be generated using a Gaussian filter having a standard deviation σ corresponding to the corrected stereo depth information.
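The passage above leaves the stereo geometry and the depth-to-σ mapping unspecified. The sketch below fills them in with common defaults that are assumptions, not part of the disclosure: the standard pinhole relation depth = focal length × baseline / disparity, and a σ that grows linearly with a pixel's distance from the in-focus plane. A grayscale image is assumed for simplicity.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def stereo_depth_from_disparity(disparity_px, focal_px, baseline_m):
    # Assumed pinhole-stereo relation; the disclosure only states that the
    # stereo value corresponds to the horizontal deviation between
    # corresponding pixels of the master and slave images.
    return focal_px * baseline_m / np.maximum(disparity_px, 1e-6)

def gaussian_bokeh(gray_image, depth_m, focus_m, strength=2.0):
    """Blur each pixel with a Gaussian whose standard deviation grows
    with the pixel's distance from the in-focus plane (one plausible
    reading of 'a standard deviation corresponding to the corrected
    stereo depth')."""
    sigma_map = strength * np.abs(depth_m - focus_m)
    out = np.empty_like(gray_image, dtype=float)
    # Coarse approximation: quantize sigma and reuse one blur per level.
    for s in np.unique(np.round(sigma_map, 1)):
        mask = np.round(sigma_map, 1) == s
        out[mask] = gaussian_filter(gray_image.astype(float), sigma=float(s))[mask]
    return out
```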
Furthermore, as shown in FIG. 1, the electronic device 100 includes a global navigation satellite system (GNSS) module 40, a wireless communication module 41, a CODEC 42, a speaker 43, a microphone 44, a display module 45, an input module 46, an inertial measurement unit (IMU) 47, a main processor 48, and a memory 49.
The GNSS module 40 measures the current position of the electronic device 100. The wireless communication module 41 performs wireless communications with the Internet. The CODEC 42 bidirectionally performs encoding and decoding, using a predetermined encoding/decoding method. The speaker 43 outputs a sound in accordance with sound data decoded by the CODEC 42. The microphone 44 outputs sound data to the CODEC 42 based on inputted sound. The display module 45 displays predefined information. The input module 46 receives a user's input. The IMU 47 detects the angular velocity and the acceleration of the electronic device 100.
The main processor 48 controls the global navigation satellite system (GNSS) module 40, the wireless communication module 41, the CODEC 42, the speaker 43, the microphone 44, the display module 45, the input module 46, and the IMU 47. The memory 49 stores a program and data required for the image signal processor 30 to control the stereo camera module 10 and the range sensor module 20, acquired image data, and a program and data required for the main processor 48 to control the electronic device 100.
The memory 49 includes a computer readable storage medium having a computer program stored thereon, the computer program being executed by the image signal processor 30 or the main processor 48 to implement an image processing method of the present disclosure.
For example, the image processing method includes acquiring depth information indicating distances between a camera and objects, the camera focusing on a subject among the objects and imaging the objects to acquire a camera image. The method further includes changing a focus distance which is a distance between the camera and an in-focus position of the camera in accordance with a first user operation. The method further includes generating a bokeh on the camera image based on the depth information and the focus distance.
The electronic device 100 having the above-described configuration is a mobile phone such as a smartphone in this embodiment, but may be another type of electronic device including the camera modules 11 and 12.
Next, the image processing method according to the first embodiment of the present disclosure will be described with reference to FIGS. 2 to 4. FIG. 2 is a flowchart showing an image processing method according to the first embodiment of the present disclosure. FIG. 3 is a diagram which shows setting a focus distance on a short distance (near) side in the image processing method shown in FIG. 2. FIG. 4 is a diagram which shows setting a focus distance on a long distance (far) side in the image processing method shown in FIG. 2.
After recording of a moving image of the subject is started, the image signal processor 30 first inputs the master camera image and the slave camera image from the stereo camera module 10 (step S1) .
Next, the image signal processor 30 acquires depth information, which indicates distances between the master camera module 11 and the objects within the viewing angle of the master camera module 11, based on the input master camera image and the input slave camera image (step S2). As mentioned above, the depth information may be obtained by correcting the stereo depth information based on the ToF depth information. In acquiring the depth information, the subject and the objects other than the subject may be classified by matting (segmentation) using AI technology, and the depth information may be acquired for each classified object. However, the specific mode of the depth information is not particularly limited as long as it indicates distances between the master camera module 11 and the objects located within the viewing angle of the master camera module 11.
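The correction of the high-resolution stereo depth by the low-resolution ToF depth is not spelled out above; a simple per-block rescaling is one way to realize it. The sketch below is an assumed fusion rule for illustration only, and it assumes the stereo map dimensions are integer multiples of the ToF map dimensions.

```python
import numpy as np

def correct_stereo_with_tof(stereo_depth, tof_depth):
    """Rescale each block of the stereo depth map so that its mean
    matches the corresponding low-resolution ToF reading."""
    h, w = stereo_depth.shape
    th, tw = tof_depth.shape
    bh, bw = h // th, w // tw  # assumes exact divisibility
    corrected = stereo_depth.astype(float).copy()
    for i in range(th):
        for j in range(tw):
            block = corrected[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            block *= tof_depth[i, j] / max(block.mean(), 1e-6)  # in-place view
    return corrected
```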
Next, the image signal processor 30 acquires positions of a first slider SL1 and a second slider SL2 shown in FIG. 3 (step S3). As shown in FIG. 3, the first slider SL1 is a graphical user interface (GUI) displayed by the display module 45. The image signal processor 30 changes the focus distance, which indicates the distance between the master camera module 11 and an in-focus position of the master camera module 11, in response to a left-right drag operation (first user operation) of the first slider SL1. By changing the focus distance, the image signal processor 30 changes the subject to be focused and changes the region in which bokeh is generated. The variable range of the focus distance may correspond to the range of the distances between the master camera module 11 and the objects indicated by the depth information. As shown in FIG. 3, the second slider SL2 is also a graphical user interface (GUI) displayed by the display module 45. The image signal processor 30 changes the intensity of the bokeh in response to a vertical drag operation (second user operation) of the second slider SL2. The image signal processor 30 may change the intensity of the bokeh, for example, by changing the standard deviation σ of a Gaussian filter according to the displacement of the second slider SL2.
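A minimal sketch of how the two slider positions could drive step S4 follows; the linear mappings and the σ upper bound are assumed values, not taken from the disclosure.

```python
def focus_from_slider(pos, depth_min_m, depth_max_m):
    """First slider (step S4): map a position in [0, 1] onto the variable
    range of the focus distance, i.e. the depths present in the scene."""
    return depth_min_m + pos * (depth_max_m - depth_min_m)

def sigma_from_slider(pos, sigma_max=8.0):
    """Second slider (step S4): map a position in [0, 1] onto the Gaussian
    standard deviation that sets the bokeh intensity."""
    return pos * sigma_max

# Example: slider SL1 at 25% of a 0.4 m to 6.0 m scene -> 1.8 m focus;
# slider SL2 at 50% -> sigma = 4.0.
print(focus_from_slider(0.25, 0.4, 6.0), sigma_from_slider(0.5))
```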
Next, the image signal processor 30 updates the focus distance and the intensity of the bokeh based on the acquired positions of the first slider SL1 and the second slider SL2 (step S4) .
Next, the image signal processor 30 generates the bokeh on the master camera image based on the focus distance and the intensity of the bokeh (step S5). Specifically, the image signal processor 30 generates the bokeh, whose intensity corresponds to the position of the second slider SL2, on images of objects other than the subject, the subject being located at the focus distance. On the other hand, the image signal processor 30 does not generate the bokeh on the image of the subject. For example, in FIG. 3, a person 101 is set as the focused subject since the focus distance is set on the short distance side. As a result, in FIG. 3, the bokeh is generated on an image of the background 102 of the person 101. On the other hand, in FIG. 4, the background 102 is set as the focused subject since the focus distance is set on the long distance side. As a result, in FIG. 4, the bokeh is generated on an image of the person 101.
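Step S5 can be pictured as a depth-masked blur: pixels whose depth lies near the focus distance stay sharp, and everything else is blurred. The sketch below assumes a grayscale image and an in-focus tolerance tol_m that the disclosure does not specify.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def apply_bokeh(gray_image, depth_m, focus_m, sigma, tol_m=0.3):
    """Step S5 sketch: blur objects away from the focus distance and
    leave the in-focus subject untouched (tol_m is an assumed value)."""
    blurred = gaussian_filter(gray_image.astype(float), sigma=sigma)
    in_focus = np.abs(depth_m - focus_m) <= tol_m
    return np.where(in_focus, gray_image.astype(float), blurred)
```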
Next, the image signal processor 30 determines whether an instruction to stop the recording is issued (step S6) .
When the instruction to stop the recording is issued (step S6: Yes) , the image signal processor 30 saves the recorded moving image data in the memory 49 to end the process (step S7) .
On the other hand, when the instruction to stop the recording is not issued (step S6: No) , the image signal processor 30 repeats the input of the master camera image and of the slave camera image (step S1) .
Although FIGS. 2 to 4 describe an example of imaging a moving image of the subject, the first embodiment can also be applied to imaging a still image of the subject.
According to the first embodiment, by changing the focus distance with the operation of the first slider SL1 (first user operation), it is possible to generate the bokeh in a region suited to the user's preference by means of a simple operation. Further, according to the first embodiment, by operating the second slider SL2 (second user operation), it is possible to generate the bokeh at an intensity suited to the user's preference by means of a simple operation.
(Second Embodiment)
Next, with reference to FIG. 5, the image processing method according to the second embodiment of the present disclosure will be described focusing on the differences from the first embodiment.
FIG. 5 is a flowchart showing an image processing method according to the second embodiment of the present disclosure. In the first embodiment, an example in which the bokeh is generated on the master camera image when the subject is imaged has been described. In contrast, in the second embodiment, the bokeh is generated on the master camera image stored in the memory 49 after the subject is imaged.
In the flowchart of FIG. 5, it is premised that the master camera image is stored in the memory 49 in association with its corresponding depth information.
Under such a premise, first, the image signal processor 30 reads the master camera image selected by the user and the depth information corresponding to the master camera image from the memory 49 (step S11) . The master camera image may be either a moving image or a still image.
After step S11, steps S3 to S5 shown in FIG. 2 are performed. Since the depth information is stored in the memory 49, the acquisition of the depth information described in step S2 of FIG. 2 is not performed.
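As a sketch of this flow (step S11, then steps S3 to S5), the snippet below reuses the helper functions sketched earlier in this description; the .npy storage format and the file names are assumptions for illustration only.

```python
import numpy as np

# Step S11: read a stored master camera image and its depth map
# (assumed to have been saved together at record time as .npy arrays).
image = np.load("master_image.npy")
depth = np.load("master_depth.npy")

# Steps S3 to S5 on the stored data, reusing the sketches above.
focus_m = focus_from_slider(0.7, depth.min(), depth.max())
result = apply_bokeh(image, depth, focus_m, sigma=sigma_from_slider(0.5))

# Step S71: save the determined bokeh image.
np.save("bokeh_image.npy", result)
```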
After step S5, the image signal processor 30 determines whether a bokeh image determination operation (for example, a shutter button pressing operation) is performed (step S61) .
When the bokeh image determination operation is performed (step S61: Yes), the image signal processor 30 saves the determined bokeh image in the memory 49 (step S71).
On the other hand, when the bokeh image determination operation is not performed (step S61: No) , the image signal processor 30 reads the master camera image and the depth information newly selected by the user from the memory 49 (step S11) .
According to the second embodiment, even after the subject is imaged, the bokeh can be generated in a region suited to the user's preference by means of a simple operation.
(Third Embodiment)
Next, with reference to FIGS. 6 and 7, the image processing method according to the third embodiment of the present disclosure will be described focusing on the differences from the first embodiment.
FIG. 6 is a flowchart showing an image processing method according to the third embodiment of the present disclosure. FIG. 7 is a diagram which shows changing the focus distance from the minimum value to the maximum value in the image processing method shown in FIG. 6.
In the first embodiment, an example in which the focus distance is changed in accordance with the operation of the first slider SL1 has been described. In contrast, in the third embodiment, the focus distance is changed from a minimum value to a maximum value in accordance with a single pressing operation of a shutter button (first user operation) .
Specifically, as shown in FIG. 6, after the acquisition of the depth information (step S2), the image signal processor 30 determines whether the shutter button B is pressed (step S21). As shown in FIG. 7, the shutter button B is a graphical user interface (GUI) displayed by the display module 45.
When the shutter button B is pressed (step S21: Yes) , the image signal processor 30 starts automatically changing the focus distance from the minimum value to the maximum value (step S22) . The automatic change of the focus distance is performed gradually over a certain period of time, for example.
On the other hand, when the shutter button B is not pressed (step S21: No) , the image signal processor 30 repeats determining whether the shutter button B is pressed (step S21) .
Next, in the process of automatic change of the focus distance, the image signal processor 30 generates the bokeh on the image of an object other than the subject in the master camera image, the subject corresponding to the current focus distance (step S5) .
Next, the image signal processor 30 determines whether the focus distance has reached the maximum value (step S23) .
When the focus distance has reached the maximum value (step S23: Yes) , the image signal processor 30 ends the automatic change of the focus distance and proceeds to step S6 described in FIG. 2.
On the other hand, when the focus distance has not reached the maximum value (step S23: No) , the image signal processor 30 continues the automatic change of the focus distance (step S24) .
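The loop of steps S22 to S24 amounts to a timed sweep of the focus distance. The sketch below assumes a sweep duration and frame rate that the disclosure leaves open.

```python
import numpy as np

def focus_sweep(min_m, max_m, duration_s=3.0, fps=30.0):
    """Steps S22/S24 sketch: yield one focus distance per video frame,
    moving gradually from the minimum to the maximum value."""
    n_frames = int(duration_s * fps)
    yield from np.linspace(min_m, max_m, n_frames)

# One shutter press starts the sweep; step S5 is applied per frame.
for focus_m in focus_sweep(0.5, 5.0):
    pass  # e.g. apply_bokeh(frame, depth, focus_m, sigma) for each frame
```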
According to the third embodiment, as shown in FIG. 7, it is possible to record the moving image while gradually changing the region where the bokeh is generated by pressing the shutter button B only once.
(Fourth Embodiment)
Next, with reference to FIGS. 8 to 11, the image processing method according to the fourth embodiment of the present disclosure will be described focusing on the differences from the first embodiment.
FIG. 8 is a flowchart showing an image processing method according to the fourth embodiment of the present disclosure. FIG. 9 is a diagram showing a plurality of targets in the image processing method shown in FIG. 8. FIG. 10 is a diagram which shows changing the focus distance according to a plurality of targets in the image processing method shown in FIG. 8. FIG. 11 is a diagram which shows tracking a target in the image processing method shown in FIG. 8.
In the fourth embodiment, the focus distance is automatically changed to focus, one by one, on a plurality of targets, the targets being selected by the user from the master camera image.
Specifically, as shown in FIG. 8, the image signal processor 30 selects targets on which the user performed tap operations (third user operation) in the master camera image displayed by the display module 45 (step S31). In the example shown in FIG. 9, the first target T1 and the second target T2 are selected.
Next, the image signal processor 30 acquires the depth information in the same manner as in FIG. 2 (step S2) .
Next, the image signal processor 30 tracks each of the selected targets using multi-object tracking (step S32). In the process of tracking the targets, the image signal processor 30 updates distance information indicating the distance between the master camera module 11 and each target. The distance information may be acquired based on the stereo camera image or the ToF depth information described above. In the example shown in FIG. 11, the first target T1, which moves as time t changes from 0 to N, is tracked.
Next, the image signal processor 30 automatically changes the focus distance to focus, one by one, on the selected targets T1, T2 (step S33) . The automatic change of the focus distance is performed gradually over a certain period of time, for example.
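Steps S32 and S33 together behave like an automated rack focus over the tracked targets. The sketch below eases the focus distance from one target's depth to the next; the dwell time is an assumed value, and per-frame target depths would in practice come from the tracker of step S32.

```python
import numpy as np

def rack_focus(target_depths_m, dwell_s=1.5, fps=30.0):
    """Step S33 sketch: visit the selected targets one by one, moving the
    focus distance gradually between consecutive target depths (which the
    tracker of step S32 would keep updated as the targets move)."""
    frames_per_leg = int(dwell_s * fps)
    for start_m, end_m in zip(target_depths_m, target_depths_m[1:]):
        yield from np.linspace(start_m, end_m, frames_per_leg)

# Example: first target T1 at 1.2 m, second target T2 at 3.4 m.
for focus_m in rack_focus([1.2, 3.4]):
    pass  # step S5: blur every target except the currently focused one
```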
Next, the image signal processor 30 generates the bokeh on the image of a target other than a currently focused target (step S5). In the example shown in FIG. 10, the first target T1 and the second target T2 are focused, one by one, with the passage of time. That is, in the example shown in FIG. 10, the image of the second target T2 and the image of the first target T1 are blurred, one by one, with the passage of time.
According to the fourth embodiment, during imaging of the moving image, the image of the target on which the bokeh is generated can be changed while focusing, one by one, on a plurality of targets.
In the description of embodiments of the present disclosure, it is to be understood that terms such as "central", "longitudinal", "transverse", "length", "width", "thickness", "upper", "lower", "front", "rear", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise" and "counterclockwise" should be construed to refer to the orientation or the position as described or as shown in the drawings in discussion. These relative terms are only used to simplify the description of the present disclosure, and do not indicate or imply that the device or element referred to must have a particular orientation, or must be constructed or operated in a particular orientation. Thus, these terms cannot be construed to limit the present disclosure.
In addition, terms such as "first" and "second" are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features. Thus, a feature defined as "first" and "second" may comprise one or more of this feature. In the description of the present disclosure, "a plurality of" means “two or more than two” , unless otherwise specified.
In the description of embodiments of the present disclosure, unless specified or limited otherwise, the terms "mounted" , "connected" , "coupled" and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements which can be understood by those skilled in the art according to specific situations.
In the embodiments of the present disclosure, unless specified or limited otherwise, a structure in which a first feature is "on" or "below" a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween. Furthermore, a first feature "on" , "above" or "on top of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "on" , "above" or "on top of" the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature "below" , "under" or "on bottom of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "below" , "under" or "on bottom of" the second feature, or just means that the first feature is at a height lower than that of the second feature.
Various embodiments and examples are provided in the above description to implement different structures of the present disclosure. In order to simplify the present disclosure, certain elements and settings are described above. However, these elements and settings are only examples and are not intended to limit the present disclosure. In addition, reference numbers and/or reference letters may be repeated in different examples in the present disclosure. This repetition is for the purposes of simplification and clarity and does not indicate a relation between the different embodiments and/or settings discussed. Furthermore, examples of different processes and materials are provided in the present disclosure. However, it would be appreciated by those skilled in the art that other processes and/or materials may also be applied.
Reference throughout this specification to "an embodiment", "some embodiments", "an exemplary embodiment", "an example", "a specific example" or "some examples" means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. Thus, the appearances of the above phrases throughout this specification are not necessarily referring to the same embodiment or example of the present disclosure. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in one or more embodiments or examples.
Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of code of executable instructions for achieving specific logical functions or steps in the process. The scope of a preferred embodiment of the present disclosure includes other implementations in which, as should be understood by those skilled in the art, functions may be performed in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
The logic and/or steps described in other manners herein or shown in the flow chart, for example, a particular sequence table of executable instructions for realizing the logical function, may be embodied in any computer readable medium, to be used by an instruction execution system, device or equipment (such as a system based on computers, a system comprising processors, or another system capable of obtaining instructions from the instruction execution system, device or equipment and executing the instructions), or to be used in combination with the instruction execution system, device or equipment. As to this specification, a "computer readable medium" may be any device capable of including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment. More specific examples of the computer readable medium include, but are not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or a flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer readable medium may even be paper or another appropriate medium on which the programs can be printed, because the paper or other medium may be optically scanned and then edited, interpreted or otherwise processed as necessary to obtain the programs electronically, after which the programs may be stored in computer memories.
It should be understood that each part of the present disclosure may be realized by hardware, software, firmware or a combination thereof. In the above embodiments, a plurality of steps or methods may be realized by software or firmware stored in a memory and executed by an appropriate instruction execution system. For example, if realized by hardware, as in another embodiment, the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function upon a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA), a field programmable gate array (FPGA), etc.
Those skilled in the art shall understand that all or part of the steps in the above exemplifying methods of the present disclosure may be achieved by instructing the related hardware with programs. The programs may be stored in a computer readable storage medium, and, when run on a computer, the programs perform one or a combination of the steps in the method embodiments of the present disclosure.
In addition, each functional unit of the embodiments of the present disclosure may be integrated in a processing module, or the units may exist physically separate, or two or more units may be integrated in a processing module. The integrated module may be realized in the form of hardware or in the form of a software functional module. When the integrated module is realized in the form of a software functional module and is sold or used as a standalone product, it may be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, a CD, etc.
Although embodiments of the present disclosure have been shown and described, it should be appreciated by those skilled in the art that the embodiments are explanatory and cannot be construed to limit the present disclosure, and changes, modifications, alternatives and variations can be made in the embodiments without departing from the scope of the present disclosure.

Claims (15)

  1. An image processing method comprising:
    acquiring depth information indicating distances between a camera and objects, the camera focusing on a subject among the objects and imaging the objects to acquire a camera image;
    changing a focus distance which is a distance between the camera and an in-focus position of the camera in accordance with a first user operation; and
    generating a bokeh on the camera image based on the depth information and the focus distance.
  2. The method according to claim 1, further comprising changing an intensity of the bokeh in accordance with a second user operation.
  3. The method according to claim 1 or 2, wherein the first user operation is an operation of a first slider which is displayed by a display device, the display device displaying the camera image.
  4. The method according to claim 2, wherein the second user operation is an operation of a second slider which is displayed by a display device, the display device displaying the camera image.
  5. The method according to claim 1, wherein the camera image is a still image.
  6. The method according to claim 1, wherein the camera image is a moving image.
  7. The method according to claim 1, wherein the generating the bokeh is performed when the objects are imaged.
  8. The method according to claim 1, wherein the generating the bokeh is performed after the objects are imaged and is performed based on a camera image stored in a memory.
  9. The method according to claim 1 or 2, wherein the first user operation is a single pressing operation of a button which is displayed by a display device, the display device displaying the camera image.
  10. The method according to claim 9, wherein
    the camera image is a moving image, and
    the changing the focus distance is performed within a predetermined range of the focus distance in response to the pressing operation during imaging of the moving image.
  11. The method according to claim 10, wherein the predetermined range is a range from a minimum value to a maximum value of the focus distance.
  12. The method according to claim 1, further comprising selecting a plurality of targets from the camera image in accordance with a third user operation, the camera image being displayed by a display device, wherein
    the changing the focus distance is performed so as to focus, one by one, on selected targets.
  13. The method according to claim 12, wherein the third user operation is tapping the targets in the camera image.
  14. The method according to claim 12 or 13, further comprising tracking the selected targets, wherein
    the changing the focus distance is performed based on a tracking result of the selected targets.
  15. An electronic device comprising:
    a camera which focuses on a subject among objects, and which images the objects to acquire a camera image; and
    a processor, wherein the processor is configured to
    acquire depth information indicating distances between the camera and the objects,
    change a focus distance which is a distance between the camera and an in-focus position of the camera in accordance with a first user operation, and
    generate a bokeh on the camera image based on the depth information and the focus distance.
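For illustration only (this sketch is not part of the claims or the original disclosure; all identifiers, such as RefocusState and render, are hypothetical, and a crude shift-and-average blur stands in for a real lens model), the method of claims 1 to 4 can be read as a small event-driven pipeline in which two sliders drive the focus distance and the bokeh intensity:

    import numpy as np
    from dataclasses import dataclass

    @dataclass
    class RefocusState:
        focus_distance: float = 1.0  # distance to the in-focus plane
        intensity: float = 1.0       # bokeh strength

    def on_first_slider(state: RefocusState, value: float) -> None:
        # First user operation (claims 1 and 3): change the focus distance.
        state.focus_distance = value

    def on_second_slider(state: RefocusState, value: float) -> None:
        # Second user operation (claims 2 and 4): change the bokeh intensity.
        state.intensity = value

    def render(camera_image, depth_info, state):
        # Generate the bokeh from the depth information and the focus
        # distance: the farther a pixel lies from the focus plane, the
        # more it is blended toward a blurred copy of the camera image.
        blurred = sum(np.roll(np.roll(camera_image.astype(np.float32), dy, 0), dx, 1)
                      for dy in (-3, 0, 3) for dx in (-3, 0, 3)) / 9.0
        w = np.clip(np.abs(depth_info - state.focus_distance) * state.intensity,
                    0.0, 1.0)[..., None]
        return (camera_image * (1.0 - w) + blurred * w).astype(camera_image.dtype)

Because only the state changes between renders, the same stored camera image and depth information can be refocused after capture (claim 8) just as readily as during imaging (claim 7).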
PCT/CN2021/079594 2021-03-08 2021-03-08 Image processing method and electronic device WO2022188007A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2021/079594 WO2022188007A1 (en) 2021-03-08 2021-03-08 Image processing method and electronic device
CN202180095508.2A CN117121499A (en) 2021-03-08 2021-03-08 Image processing method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/079594 WO2022188007A1 (en) 2021-03-08 2021-03-08 Image processing method and electronic device

Publications (1)

Publication Number Publication Date
WO2022188007A1 (en)

Family

ID=83227311

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/079594 WO2022188007A1 (en) 2021-03-08 2021-03-08 Image processing method and electronic device

Country Status (2)

Country Link
CN (1) CN117121499A (en)
WO (1) WO2022188007A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103188434A (en) * 2011-12-31 2013-07-03 Lenovo (Beijing) Co., Ltd. Method and device of image collection
CN103546682A (en) * 2012-07-09 2014-01-29 Samsung Electronics Co., Ltd. Camera device and method for processing image
CN104657936A (en) * 2013-11-15 2015-05-27 HTC Corporation Method, electronic device and medium for adjusting depth values
US20150373257A1 (en) * 2013-02-21 2015-12-24 NEC Corporation Image processing device, image processing method and permanent computer-readable medium
US20160035068A1 (en) * 2014-08-04 2016-02-04 Adobe Systems Incorporated Dynamic Motion Path Blur Techniques
CN109598699A (en) * 2017-09-29 2019-04-09 InterDigital CE Patent Holdings User interface for manipulating light-field images

Also Published As

Publication number Publication date
CN117121499A (en) 2023-11-24

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 21929490; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 21929490; Country of ref document: EP; Kind code of ref document: A1