CN110602392A - Control method, imaging module, electronic device and computer-readable storage medium - Google Patents

Control method, imaging module, electronic device and computer-readable storage medium

Info

Publication number
CN110602392A
CN110602392A
Authority
CN
China
Prior art keywords
pixel
image
area
camera
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910829325.3A
Other languages
Chinese (zh)
Other versions
CN110602392B (en)
Inventor
赵正涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910829325.3A priority Critical patent/CN110602392B/en
Publication of CN110602392A publication Critical patent/CN110602392A/en
Application granted granted Critical
Publication of CN110602392B publication Critical patent/CN110602392B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses an imaging module, a control method thereof, an electronic device and a computer-readable storage medium. The imaging module comprises a first camera and a second camera whose fields of view at least partially overlap. The control method comprises the following steps: controlling the first camera and the second camera to shoot a first image and a second image, respectively; cropping the first image to obtain a cropped image; acquiring a first region of the cropped image whose scene is the same as that of the second image, and acquiring a second region of the second image whose scene is the same as that of the cropped image; acquiring the pixel values of all pixel points in the first region and the second region; and processing the cropped image according to the pixel values of the pixel points in the first region and the pixel values of the corresponding pixel points in the second region. With the imaging module, the control method of the imaging module, the electronic device and the computer-readable storage medium, the pixel points of the image with the larger field of view can be processed according to the pixel values of the regions of the two images that share the same scene, so as to obtain a sharper digital zoom image.

Description

Control method, imaging module, electronic device and computer-readable storage medium
Technical Field
The present disclosure relates to the field of imaging technologies, and in particular, to a control method, an imaging module, an electronic device, and a computer-readable storage medium.
Background
In the related art, because users have varied requirements such as shooting large scenes and shooting distant subjects, an electronic device may be provided with a plurality of cameras (for example, a wide-angle camera and a telephoto camera) to meet those requirements. However, the cameras on an electronic device are generally fixed-focus cameras. When the zoom factor selected by the user lies between the zoom factors of two cameras, the camera with the smaller zoom factor is generally used to shoot in a digital zoom mode, and digital zoom discards a great deal of detail information, so that the captured image is not sharp.
Disclosure of Invention
Embodiments of the application provide a control method, an imaging module, an electronic device and a computer-readable storage medium.
The control method of the embodiment of the application is used for controlling an imaging module. The imaging module comprises a plurality of cameras, the plurality of cameras comprise a first camera and a second camera, the zoom factor of the first camera is smaller than the zoom factor of the second camera, and the field of view of the second camera at least partially coincides with the field of view of the first camera. The control method comprises the following steps: controlling the first camera to shoot a first image and controlling the second camera to shoot a second image; cropping the first image according to the current zoom factor of the imaging module to obtain a cropped image; acquiring a first region of the cropped image whose scene is the same as the scene in the second image, and acquiring a second region of the second image whose scene is the same as the scene in the cropped image; acquiring the pixel values of all pixel points in the first region and the second region; and processing the cropped image according to the pixel values of the pixel points in the first region and the pixel values of the corresponding pixel points in the second region.
The imaging module of the embodiment of the application comprises a plurality of cameras and a processor. The plurality of cameras comprise a first camera and a second camera, the zoom factor of the first camera is smaller than the zoom factor of the second camera, and the field of view of the second camera at least partially coincides with the field of view of the first camera. The processor is configured to: control the first camera to shoot a first image and control the second camera to shoot a second image; crop the first image according to the current zoom factor of the imaging module to obtain a cropped image; acquire a first region of the cropped image whose scene is the same as the scene in the second image, and acquire a second region of the second image whose scene is the same as the scene in the cropped image; acquire the pixel values of all pixel points in the first region and the second region; and process the cropped image according to the pixel values of the pixel points in the first region and the pixel values of the corresponding pixel points in the second region.
The electronic device of the embodiment of the application comprises a housing and an imaging module. The imaging module comprises a plurality of cameras and a processor, the plurality of cameras comprise a first camera and a second camera, the zoom factor of the first camera is smaller than the zoom factor of the second camera, and the field of view of the second camera at least partially coincides with the field of view of the first camera. The processor is configured to: control the first camera to shoot a first image and control the second camera to shoot a second image; crop the first image according to the current zoom factor of the imaging module to obtain a cropped image; acquire a first region of the cropped image whose scene is the same as the scene in the second image, and acquire a second region of the second image whose scene is the same as the scene in the cropped image; acquire the pixel values of all pixel points in the first region and the second region; and process the cropped image according to the pixel values of the pixel points in the first region and the pixel values of the corresponding pixel points in the second region. The imaging module is mounted on the housing.
The computer-readable storage medium of the embodiment of the application comprises a computer program for use in conjunction with an electronic device, the computer program being executable by a processor to perform the control method described above. The electronic device comprises a housing and an imaging module, the imaging module comprises a plurality of cameras and a processor, the plurality of cameras comprise a first camera and a second camera, the zoom factor of the first camera is smaller than the zoom factor of the second camera, and the field of view of the second camera at least partially coincides with the field of view of the first camera. The processor is configured to: control the first camera to shoot a first image and control the second camera to shoot a second image; crop the first image according to the current zoom factor of the imaging module to obtain a cropped image; acquire a first region of the cropped image whose scene is the same as the scene in the second image, and acquire a second region of the second image whose scene is the same as the scene in the cropped image; acquire the pixel values of all pixel points in the first region and the second region; and process the cropped image according to the pixel values of the pixel points in the first region and the pixel values of the corresponding pixel points in the second region. The imaging module is mounted on the housing.
According to the control method, the imaging module, the electronic device and the computer-readable storage medium, the first camera and the second camera, whose fields of view at least partially overlap, are controlled to shoot images simultaneously; the first image, shot by the first camera with the smaller zoom factor, is cropped according to the current zoom factor; and the cropped first image is processed according to the pixel values of the regions in which the scenes of the two images are the same, so that a sharper digital zoom image is obtained.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow chart of a control method according to certain embodiments of the present application;
FIG. 2 is a schematic structural diagram of an electronic device according to some embodiments of the present application;
FIG. 3 is a schematic illustration of a first image, a cropped image, a first region, a second image and a second region according to some embodiments of the present application;
FIGS. 4 and 5 are schematic flow charts of control methods according to certain embodiments of the present application;
FIG. 6 is a schematic illustration of a second image of some embodiments of the present application;
FIGS. 7-9 are schematic flow charts of control methods according to certain embodiments of the present application;
FIG. 10 is a schematic diagram of a connection between a computer-readable storage medium and an electronic device according to some embodiments of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application and should not be construed as limiting the present application.
Referring to fig. 1, fig. 2 and fig. 3, the control method of the embodiment of the present application is used for controlling an imaging module 100, where the imaging module 100 includes a plurality of cameras. The plurality of cameras include a first camera 10 and a second camera 20, and the zoom factor of the first camera 10 is smaller than the zoom factor of the second camera 20. The field of view of the second camera 20 and the field of view of the first camera 10 at least partially coincide. The control method comprises the following steps:
01, controlling the first camera 10 to shoot a first image i1 and controlling the second camera 20 to shoot a second image i2;
02, cropping the first image i1 according to the current zoom factor of the imaging module 100 to obtain a cropped image i3;
03, acquiring a first region i4 in which the scene in the cropped image i3 is the same as the scene in the second image i2, and acquiring a second region i5 in which the scene in the second image i2 is the same as the scene in the cropped image i3;
04, acquiring pixel values of all pixel points of the first region i4 and the second region i5;
05, processing the cropped image i3 according to the pixel values of the pixel points in the first region i4 and the pixel values of the corresponding pixel points in the second region i5.
The imaging module 100 of the embodiment of the present application includes a plurality of cameras and a processor 30. The plurality of cameras include a first camera 10 and a second camera 20, the zoom factor of the first camera 10 is smaller than the zoom factor of the second camera 20, and the field of view of the second camera 20 and the field of view of the first camera 10 at least partially coincide. The imaging module 100 may be configured to implement the control method of the embodiment of the present application, and step 01, step 02, step 03, step 04 and step 05 may all be implemented by the processor 30. That is, the processor 30 may be configured to: control the first camera 10 to shoot a first image i1 and control the second camera 20 to shoot a second image i2; crop the first image i1 according to the current zoom factor of the imaging module 100 to obtain a cropped image i3; acquire a first region i4 in which the scene in the cropped image i3 is the same as the scene in the second image i2, and acquire a second region i5 in which the scene in the second image i2 is the same as the scene in the cropped image i3; acquire pixel values of all pixel points of the first region i4 and the second region i5; and process the cropped image i3 according to the pixel values of the pixel points in the first region i4 and the pixel values of the corresponding pixel points in the second region i5.
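By way of a non-limiting illustration only, the following Python sketch outlines how steps 01-05 could be chained together on grayscale NumPy arrays. The helper names find_common_region and fuse, the centre-crop strategy and the region bookkeeping are assumptions made for the sketch and are not part of the disclosure.

```python
import numpy as np

def crop_by_zoom(first_image: np.ndarray, first_zoom: float, current_zoom: float) -> np.ndarray:
    """Step 02: keep the central part of the first image so that its field of
    view corresponds to the current zoom factor (plain digital zoom)."""
    h, w = first_image.shape[:2]
    scale = first_zoom / current_zoom                # e.g. 1.0 / 3.0 keeps the central third
    ch, cw = max(1, int(h * scale)), max(1, int(w * scale))
    top, left = (h - ch) // 2, (w - cw) // 2
    return first_image[top:top + ch, left:left + cw]

def control_method(first_image, second_image, first_zoom, current_zoom,
                   find_common_region, fuse):
    """Steps 02-05 for two already captured images (step 01)."""
    cropped = crop_by_zoom(first_image, first_zoom, current_zoom)              # step 02
    (top, left, rh, rw), region2 = find_common_region(cropped, second_image)   # step 03
    region1 = cropped[top:top + rh, left:left + rw]                            # first region i4
    fused = fuse(region1, region2)                                             # steps 04-05
    out = cropped.copy()
    out[top:top + rh, left:left + rw] = fused        # write the processed pixels back
    return out
```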
Referring to fig. 2, the electronic device 1000 according to the embodiment of the present application includes an imaging module 100 and a housing 40, and the imaging module 100 may be disposed inside the housing 40. The housing 40 can protect the imaging module 100. The electronic device 1000 may be any electronic device provided with the imaging module 100, such as a mobile phone or a tablet computer, which is not limited herein. In the embodiment of the present application, the electronic device 1000 is a mobile phone. When a user uses the electronic device 1000 to capture an image, the imaging module 100 is activated, and the user can change the current zoom factor of the imaging module 100 according to the shooting scene to obtain the desired image. The current zoom factor can then be obtained by the processor 30. In one embodiment, the user can zoom by shrinking or enlarging the picture on the shooting interface: when the picture is shrunk, the current zoom factor decreases; when the picture is enlarged, the current zoom factor increases. In another embodiment, zoom icons may be provided on the shooting interface, and the user changes the current zoom factor by tapping an icon. For example, two icons may be provided, one for increasing the zoom factor and one for decreasing it, and the user taps the corresponding icon as needed. A zoom bar may also be provided on the shooting interface, and the user can tap or slide on the zoom bar to change the current zoom factor. In the embodiment of the present application, the processor 30 may obtain an operation instruction of the user and derive the current zoom factor of the imaging module 100 from the operation instruction.
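As a minimal, purely illustrative sketch of how such an operation instruction might be mapped to the current zoom factor (the action names, the 0.1 step per icon tap and the 1.0-5.0 range are assumptions, not values from the disclosure):

```python
def update_zoom(current_zoom: float, action: str, value: float = 0.0,
                min_zoom: float = 1.0, max_zoom: float = 5.0) -> float:
    """Derive the new current zoom factor from a user operation instruction."""
    if action == "pinch":            # value > 1 enlarges the picture, value < 1 shrinks it
        new_zoom = current_zoom * value
    elif action == "icon_plus":      # tap on the zoom-in icon
        new_zoom = current_zoom + 0.1
    elif action == "icon_minus":     # tap on the zoom-out icon
        new_zoom = current_zoom - 0.1
    elif action == "slider":         # position on the zoom bar, already a zoom factor
        new_zoom = value
    else:
        new_zoom = current_zoom
    return min(max(new_zoom, min_zoom), max_zoom)    # keep within the supported range
```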
According to the control method, the imaging module 100 and the electronic device 1000 of the embodiment of the application, the first camera 10 and the second camera 20, whose fields of view at least partially overlap, are controlled to shoot images simultaneously; the first image i1, shot by the first camera 10 with the smaller zoom factor, is cropped according to the current zoom factor; and the cropped first image i1 is processed according to the pixel values of the regions in which the scenes of the two images are the same, so that a sharper digital zoom image is obtained.
In some embodiments, the corresponding camera is controlled to output the preview image according to the current zoom factor of the imaging module 100. For example, when the zoom factor of the first camera 10 is 3.0, the zoom factor of the second camera 20 is 5.0, and the current zoom factor of the imaging module 100 acquired by the processor 30 is 4.0, the first camera 10 is controlled to acquire a preview image, and the preview image is displayed on the display screen of the electronic device 1000. The current zoom factor of the imaging module 100 in the embodiment of the present application may be the current zoom factor of the first camera 10.
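A one-line sketch of this preview selection, using the 3.0 and 5.0 zoom factors of the example above (a hypothetical helper, not part of the disclosure):

```python
def select_preview_camera(current_zoom: float, first_zoom: float = 3.0,
                          second_zoom: float = 5.0) -> str:
    """Pick the camera that supplies the preview for the current zoom factor.

    With first_zoom = 3.0 and second_zoom = 5.0, a current zoom factor of 4.0
    falls between the two, so the first camera provides the preview image.
    """
    return "second_camera" if current_zoom >= second_zoom else "first_camera"
```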
In some embodiments, the imaging module 100 may include two cameras or more than two cameras (e.g., three cameras). For example, the imaging module 100 may include a wide-angle camera and a telephoto camera; or the imaging module 100 may include a wide camera, a main camera, a telephoto camera, and the like. The first camera 10 and the second camera 20 are disposed on the same side, for example, the first camera 10 and the second camera 20 are both front-facing cameras, or the first camera 10 and the second camera 20 are both rear-facing cameras, or the first camera 10 and the second camera 20 are both located on the same side of the electronic device 1000. It should be noted that the field of view of the first camera 10 and the field of view of the second camera 20 at least partially overlap, that is, the full-size image of the first camera 10 and the full-size image of the second camera 20 have the same area, where the same area refers to the same scene corresponding to the two areas. For example, the field of view of the second camera 20 may be entirely within the field of view of the first camera 10.
Referring to fig. 3 and 4, in some embodiments, the zoom factor of the first camera 10 is a first zoom factor, and the zoom factor of the second camera 20 is a second zoom factor. The control method further comprises the following steps:
06, judging whether the current zoom factor of the imaging module 100 is between the first zoom factor and the second zoom factor;
when the current zoom factor of the imaging module 100 is between the first zoom factor and the second zoom factor, proceeding to step 01 (controlling the first camera 10 to shoot the first image i1, and controlling the second camera 20 to shoot the second image i2);
07, when the current zoom factor of the imaging module 100 is not between the first zoom factor and the second zoom factor, judging whether the current zoom factor of the imaging module 100 is the first zoom factor;
071, when the current zoom factor of the imaging module 100 is the first zoom factor, controlling the first camera 10 to shoot the first image i1;
072, when the current zoom factor of the imaging module 100 is not the first zoom factor, controlling the second camera 20 to shoot the second image i2.
Referring to fig. 2, step 06, step 07, step 071 and step 072 of the control method according to the embodiment of the present application may be implemented by the processor 30 of the imaging module 100. That is, the processor 30 may be configured to: judge whether the current zoom factor of the imaging module 100 is between the first zoom factor and the second zoom factor; when the current zoom factor of the imaging module 100 is between the first zoom factor and the second zoom factor, proceed to step 01 (controlling the first camera 10 to shoot the first image i1, and controlling the second camera 20 to shoot the second image i2); when the current zoom factor of the imaging module 100 is not between the first zoom factor and the second zoom factor, judge whether the current zoom factor of the imaging module 100 is the first zoom factor; when the current zoom factor of the imaging module 100 is the first zoom factor, control the first camera 10 to shoot a first image i1, where the first image i1 is the image previewed by the user; and when the current zoom factor of the imaging module 100 is not the first zoom factor, control the second camera 20 to shoot a second image i2, where the second image i2 is the image previewed by the user.
For example, the imaging module 100 includes a main camera and a telephoto camera, the first camera 10 may be the main camera, and the second camera 20 may be the telephoto camera. The zoom factor of the main camera is 1.0 and the zoom factor of the telephoto camera is 5.0. When the current zoom factor of the imaging module 100 is 3.0, 3.0 is neither the zoom factor of the main camera nor the zoom factor of the telephoto camera, and lies between 1.0 and 5.0. At this time, the processor 30 may control the main camera to shoot one image as the first image i1, the first image i1 being a full-size image shot by the main camera at a zoom factor of 1.0, and control the telephoto camera to shoot one image as the second image i2, the second image i2 being a full-size image shot by the telephoto camera at a zoom factor of 5.0. The first image i1 is then cropped by the processor 30 according to the current zoom factor 3.0 of the imaging module 100 (first camera 10) to obtain a cropped image i3. The cropping may be an enlarging operation performed on a certain area of the first image i1 according to an operation instruction of the user on the electronic device; that is, the processor 30 cuts away the other areas of the first image i1 according to the operation instruction and keeps only the area required by the user, and this area is the cropped image i3. Since the scene within the field of view of the telephoto camera and the scene within the field of view of the main camera are at least partially identical, the processor 30 acquires the regions of the cropped image i3 and the second image i2 that contain the same scene, namely the first region i4 in which the scene in the cropped image i3 is identical to the scene in the second image i2, and the second region i5 in which the scene in the second image i2 is identical to the scene in the cropped image i3. The processor 30 then acquires the pixel values of the pixel points in the first region i4 and the second region i5, and processes the cropped image i3 according to the pixel values of the pixel points in the first region i4 and the pixel values of the corresponding pixel points in the second region i5. When the current zoom factor of the imaging module 100 is 1.0, which is the zoom factor of the main camera, the main camera is directly controlled to shoot images and the telephoto camera may be controlled to be turned off. Likewise, when the current zoom factor of the imaging module 100 is 5.0, the telephoto camera is directly controlled to shoot images and the main camera may be controlled to be turned off.
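The decision of steps 06, 07, 071 and 072 can be summarised by the following sketch, using the 1.0x main camera and 5.0x telephoto camera of the example (an illustrative assumption, not the only possible implementation):

```python
def decide_capture(current_zoom: float, first_zoom: float = 1.0,
                   second_zoom: float = 5.0) -> dict:
    """Steps 06-072: decide which cameras actually capture an image."""
    if first_zoom < current_zoom < second_zoom:
        # e.g. current zoom factor 3.0: both cameras shoot and the images are fused.
        return {"first_camera": True, "second_camera": True}
    if current_zoom == first_zoom:
        # exactly the main camera's zoom factor: the telephoto camera may stay off.
        return {"first_camera": True, "second_camera": False}
    # otherwise (e.g. current zoom factor 5.0) only the telephoto camera shoots.
    return {"first_camera": False, "second_camera": True}
```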
Referring to fig. 3 and 5, in some embodiments, step 05 includes:
051, acquiring a difference value between the pixel value of a pixel point in the first region i4 and the pixel value of the corresponding pixel point in the second region i5;
052, judging whether the difference value is within a preset pixel difference interval;
053, when the difference value is within the preset pixel difference interval, averaging the pixel value of the first region i4 and the pixel value of the second region i5 to obtain an average pixel value, and taking the average pixel value as the pixel value of the cropped image i3;
054, when the difference value is not within the preset pixel difference interval, judging whether the difference value is less than the lower limit of the preset pixel difference interval;
055, when the difference value is less than the lower limit of the preset pixel difference interval, taking the pixel value of the second region i5 as the pixel value of the cropped image i3;
056, when the difference value is greater than the upper limit of the preset pixel difference interval, taking the pixel value of the first region i4 as the pixel value of the cropped image i3.
Referring to fig. 2, in the imaging module 100 of the present embodiment, step 051, step 052, step 053, step 054, step 055 and step 056 may be implemented by the processor 30. That is, the processor 30 may be configured to: acquire a difference value between the pixel value of a pixel point in the first region i4 and the pixel value of the corresponding pixel point in the second region i5; judge whether the difference value is within a preset pixel difference interval; when the difference value is within the preset pixel difference interval, average the pixel value of the first region i4 and the pixel value of the second region i5 to obtain an average pixel value, and take the average pixel value as the pixel value of the cropped image i3; when the difference value is not within the preset pixel difference interval and is smaller than the lower limit of the preset pixel difference interval, take the pixel value of the second region i5 as the pixel value of the cropped image i3; and when the difference value is not within the preset pixel difference interval and is greater than the upper limit of the preset pixel difference interval, take the pixel value of the first region i4 as the pixel value of the cropped image i3.
It should be noted that the pixel values referred to above and below should be understood as the luminance or chrominance (color difference) values of the pixel points.
In some embodiments, the predetermined pixel difference interval may be obtained according to a plurality of experiments and then stored in the storage element of the imaging module 100. In other embodiments, the preset pixel difference interval may be set by the user according to the preference of the user. Of course, a plurality of different pixel difference intervals may be preset in the imaging module 100, and then the user selects different pixel difference intervals to work according to different requirements, so as to obtain a better image.
Referring to fig. 3, for example, the preset pixel difference interval is [60, 80], the pixel value of one pixel point p66 in the first region i4 is 20, and the pixel value of the pixel point p6'6' in the second region i5 corresponding to the pixel point p66 in the first region i4 is 60. The difference between the pixel value of the pixel point p6'6' in the second region i5 and the pixel value of the pixel point p66 in the first region i4 is 60 - 20 = 40. This difference is not within the preset pixel difference interval and is smaller than the lower limit 60 of the preset pixel difference interval [60, 80], so the pixel value of the pixel point p6'6' in the second region i5 is used as the pixel value of the cropped image i3 at the pixel point p66; that is, the pixel value of the cropped image i3 at the pixel point p66 is 60. Because the difference between the pixel values of the two pixel points is small, that is, the luminance of the two images at this pixel point does not differ much, and because the second region i5 belongs to the second image i2, which is shot by the second camera 20 whose zoom factor (and therefore focal length) is larger so that it images a distant scene more clearly, using the pixel values of the pixel points of the second region i5 as the pixel values of the cropped image i3 at those pixel points makes the cropped image i3 sharper.
For another example, the preset pixel difference interval is [60, 80], the pixel value of one pixel point p66 in the first region i4 is 20, and the pixel value of the corresponding pixel point p6'6' in the second region i5 is 150. The difference between the two is 150 - 20 = 130, which is not within the preset pixel difference interval and is greater than the upper limit 80 of the preset pixel difference interval [60, 80], so the pixel value of the pixel point p66 in the first region i4 is used as the pixel value of the cropped image i3 at the pixel point p66. Because the difference between the pixel value of the pixel point in the first region i4 and that of the pixel point in the second region i5 is too large, processing the cropped image i3 with the pixel value of the pixel point in the second region i5 could make the brightness of the cropped image i3 abnormal. Therefore, when the difference is greater than the upper limit of the preset pixel difference interval, the pixel value of the pixel point in the first region i4 is left unprocessed, which better avoids brightness abnormalities in the cropped image i3.
For a further example, the preset pixel difference interval is [60, 80], the pixel value of one pixel point p66 in the first region i4 is 20, and the pixel value of the corresponding pixel point p6'6' in the second region i5 is 90. The difference between the two is 90 - 20 = 70, which is within the preset pixel difference interval. In this case the pixel value of the pixel point p66 in the first region i4 and the pixel value of the pixel point p6'6' in the second region i5 are averaged, the average pixel value is (20 + 90) / 2 = 55, and the average pixel value 55 is taken as the pixel value of the cropped image i3 at the pixel point p66. When the difference between the pixel value of a pixel point in the first region i4 and the pixel value of the corresponding pixel point in the second region i5 is within the preset pixel difference interval, the luminance difference between the two pixel points is relatively large, and directly using the pixel value of the pixel point in the second region i5 as the pixel value of the cropped image i3 might make the brightness of the cropped image i3 abnormal at that pixel point. On the other hand, since the second region i5 comes from the second image i2, which is shot by the second camera, its pixel points are sharper than the corresponding pixel points of the first region i4, and leaving the pixel value of the first region i4 unprocessed would lose the information of the second region i5. Therefore, the pixel value of the pixel point in the first region i4 and the pixel value of the corresponding pixel point in the second region i5 are averaged, and the average is used as the pixel value of the corresponding pixel point of the cropped image i3, which makes the cropped image i3 sharper.
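The three examples above can be summarised by the following per-pixel sketch (a minimal illustration on grayscale NumPy arrays; the interval [60, 80] is the example value, the difference is taken as second region minus first region as in the worked arithmetic, and integer averaging is an assumption of the sketch):

```python
import numpy as np

def fuse(region1: np.ndarray, region2: np.ndarray,
         lower: int = 60, upper: int = 80) -> np.ndarray:
    """Per-pixel rule of steps 051-056 for two same-sized grayscale regions."""
    r1 = region1.astype(np.int32)
    r2 = region2.astype(np.int32)
    diff = r2 - r1                                   # e.g. 60-20=40, 150-20=130, 90-20=70
    out = r1.copy()                                  # default: keep the first region's pixel
    below = diff < lower                             # small difference: trust the sharper tele pixel
    inside = (diff >= lower) & (diff <= upper)       # moderate difference: average the two pixels
    out[below] = r2[below]
    out[inside] = (r1[inside] + r2[inside]) // 2
    return out.astype(region1.dtype)

# The three example pixels from the text (first-region value 20 in each case):
r1 = np.array([[20, 20, 20]], dtype=np.uint8)
r2 = np.array([[60, 150, 90]], dtype=np.uint8)
print(fuse(r1, r2))                                  # [[60 20 55]]
```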
In some embodiments, the number of pixel points in the first region i4 and the number of pixel points in the second region i5 may be equal; of course, the two numbers may also be different. For example, referring to fig. 3 and 6, the number of pixel points in the second region i5 may be greater than the number of pixel points in the first region i4.
Referring to fig. 3, fig. 6 and fig. 7, the control method according to the embodiment of the present application further includes:
08, judging whether the number of pixel points in the second region i5 is greater than the number of pixel points in the first region i4;
09, when the number of pixel points in the second region i5 is greater than the number of pixel points in the first region i4, compressing the second region i5 according to the number of pixel points in the second region i5 and the number of pixel points in the first region i4 to obtain a compressed image i6;
step 04 comprises:
041, acquiring pixel values of all pixel points of the first region i4 and the compressed image i6;
step 05 comprises:
057, processing the cropped image i3 according to the pixel values of the pixel points in the first region i4 and the pixel values of the corresponding pixel points in the compressed image i6;
when the number of pixel points in the second region i5 is equal to the number of pixel points in the first region i4, proceeding to step 04 (acquiring the pixel values of all pixel points of the first region i4 and the second region i5).
Referring to fig. 2, step 08, step 09, step 041 and step 057 may all be implemented by the processor 30. That is, the processor 30 may be configured to: judge whether the number of pixel points in the second region i5 is greater than the number of pixel points in the first region i4; when the number of pixel points in the second region i5 is greater than the number of pixel points in the first region i4, compress the second region i5 according to the number of pixel points in the second region i5 and the number of pixel points in the first region i4 to obtain a compressed image i6; acquire pixel values of all pixel points of the first region i4 and the compressed image i6; process the cropped image i3 according to the pixel values of the pixel points in the first region i4 and the pixel values of the corresponding pixel points in the compressed image i6; and when the number of pixel points in the second region i5 is equal to the number of pixel points in the first region i4, proceed to step 04 (acquiring the pixel values of all pixel points of the first region i4 and the second region i5).
Since the second zoom factor is greater than the first zoom factor, the range of the scene captured by the second camera 20 is smaller than the range of the scene captured by the first camera 10. When the resolutions of the first camera 10 and the second camera 20 are the same (or the difference between their resolutions is smaller than a predetermined resolution, or the resolution of the second camera 20 is higher than that of the first camera 10), the number of pixel points of the second image i2 corresponding to a certain scene is greater than the number of pixel points of the first image i1 corresponding to the same scene; that is, the number of pixel points of the second region i5 may be greater than the number of pixel points of the first region i4. In this case, the second region i5 may be compressed to obtain a compressed image i6 so that the compressed image i6 and the first region i4 have the same number of pixel points, forming a one-to-one correspondence of pixel points between the two regions. The pixel values of all pixel points of the compressed image i6 and the first region i4 are then acquired, and the cropped image i3 is processed according to the pixel values of the pixel points in the first region i4 and the pixel values of the corresponding pixel points in the compressed image i6. The processing here may be similar to the processing of the cropped image i3 according to the pixel values of the pixel points in the first region i4 and the pixel values of the corresponding pixel points in the second region i5 described above. For example, the difference between the pixel value of a pixel point in the first region i4 and the pixel value of the corresponding pixel point in the compressed image i6 is computed; when the difference is within the preset pixel difference interval, the two pixel values are averaged to obtain an average pixel value, and this average pixel value is the pixel value of the cropped image i3 at that pixel point; when the difference is smaller than the lower limit of the preset pixel difference interval, the pixel value of the pixel point of the compressed image i6 is taken as the pixel value of the pixel point of the cropped image i3; and when the difference is greater than the upper limit of the preset pixel difference interval, the pixel value of the pixel point of the first region i4 is directly taken as the pixel value of the pixel point of the cropped image i3, that is, that pixel point of the cropped image i3 is not processed.
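A compact sketch of this branch (steps 08, 09, 041 and 057), where fuse is the per-pixel rule sketched above and compress_to_match stands for the compression of step 09, for example the block averaging sketched further below; both helpers are assumptions of the sketch, not names from the disclosure:

```python
import numpy as np

def fuse_with_compression(region1: np.ndarray, region2: np.ndarray,
                          fuse, compress_to_match) -> np.ndarray:
    """Steps 08, 09, 041 and 057 in one flow.

    region1: first region i4 (from the cropped image)
    region2: second region i5 (from the second image), possibly with more pixels
    """
    if region2.size > region1.size:                          # step 08
        region2 = compress_to_match(region2, region1.shape)  # step 09 -> compressed image i6
    # with matching pixel counts the fusion rule applies unchanged (steps 041/057)
    return fuse(region1, region2)
```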
Referring to fig. 3 and 8, in the control method according to the embodiment of the present application, step 09 includes:
091, compressing the number of pixel points of the second region i5 according to the ratio of the number of pixel points of the second region i5 to the number of pixel points of the first region i4, so that the number of pixel points of the compressed image i6 is the same as the number of pixel points of the first region i4.
Referring to fig. 2, in the imaging module 100 of the embodiment of the present application, step 091 may be implemented by the processor 30. That is, the processor 30 may be configured to: compress the number of pixel points of the second region i5 according to the ratio of the number of pixel points of the second region i5 to the number of pixel points of the first region i4, so that the number of pixel points of the compressed image i6 is the same as the number of pixel points of the first region i4.
Since the number of pixel points in the second region i5 is greater than the number of pixel points in the first region i4, the number of pixel points in the second region i5 needs to be compressed so that it becomes the same as the number of pixel points in the first region i4. That is to say, the pixel points of the compressed image i6 obtained by compressing the second region i5 correspond one to one to the pixel points of the first region i4. For example, the first region i4 has 1,000,000 pixel points and the second region i5 has 4,000,000 pixel points, so the ratio of the 4,000,000 pixel points of the second region i5 to the 1,000,000 pixel points of the first region i4 is 4:1; that is, the pixel points of the second region i5 are compressed at the ratio 4:1. One way of compressing is to select 4 pixel points of the second region i5 and compress them into 1 pixel point, and then use this compressed pixel point as a pixel point of the compressed image i6.
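The 4:1 ratio of the example corresponds to one output pixel per 2 x 2 block of the second region. A small, purely illustrative helper for deriving that per-axis block size from the two pixel counts:

```python
import math

def block_size_from_ratio(pixels_region2: int, pixels_region1: int) -> int:
    """Turn the pixel-count ratio of step 091 into a per-axis block size.

    1,000,000 pixels in the first region and 4,000,000 pixels in the second
    region give a 4:1 ratio, i.e. every 2 x 2 block of the second region is
    compressed into one pixel of the compressed image.
    """
    ratio = round(pixels_region2 / pixels_region1)   # e.g. 4_000_000 / 1_000_000 = 4
    side = math.isqrt(ratio)
    if side * side != ratio:
        raise ValueError("pixel-count ratio is not a perfect square")
    return side

print(block_size_from_ratio(4_000_000, 1_000_000))   # 2
```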
Referring to fig. 3, 6 and 9, in the control method according to the embodiment of the present application, step 09 includes:
092, averaging the pixel values of the pixel points in the second region i5 to obtain the pixel values of the compressed image i6.
Referring to fig. 2, in the imaging module 100 of the embodiment of the present application, step 092 may be implemented by the processor 30. That is, the processor 30 may be configured to: average the pixel values of a plurality of pixel points in the second region i5 according to the ratio of the number of pixel points in the first region i4 to the number of pixel points in the second region i5 to obtain the pixel values of the compressed image i6.
It should be noted that averaging the pixel values of a plurality of pixel points in the second region i5 means selecting, according to the ratio of the numbers of pixel points in the first region i4 and the second region i5, a group of pixel points in the second region i5 whose count equals that ratio, and averaging the selected pixel points. For example, the first region i4 has 4,000,000 pixel points and the second region i5 has 16,000,000 pixel points, and the ratio of 16,000,000 to 4,000,000 is 4:1. According to this ratio, 4 adjacent pixel points are selected from the second region i5, their pixel values are averaged to obtain an average pixel value, and this average pixel value is used as the pixel value of one pixel point of the compressed image i6; this continues until all 16,000,000 pixel points of the second region i5 have been compressed, yielding the 4,000,000 pixel points of the compressed image i6. Then the difference between the pixel value of each pixel point of the compressed image i6 and the pixel value of the corresponding pixel point of the first region i4 is computed, and the processing mode of the cropped image i3 at that pixel point (replacement with the average value, direct replacement, or no processing) is selected according to the relation between the difference and the preset pixel difference interval.
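A minimal sketch of this block averaging (step 092) on a grayscale NumPy array; the 2 x 2 block size follows from the 4:1 example, and the trimming of any non-divisible border is an assumption of the sketch:

```python
import numpy as np

def compress_to_match(region2: np.ndarray, target_shape: tuple) -> np.ndarray:
    """Step 092: block-average the second region down to the first region's size.

    With 16,000,000 pixels in the second region and 4,000,000 in the first
    region (ratio 4:1), every 2 x 2 group of adjacent pixels of the second
    region is averaged into one pixel of the compressed image i6.
    """
    th, tw = target_shape[:2]
    fh, fw = region2.shape[0] // th, region2.shape[1] // tw   # block size per axis
    trimmed = region2[:fh * th, :fw * tw].astype(np.float64)
    blocks = trimmed.reshape(th, fh, tw, fw)
    return blocks.mean(axis=(1, 3)).astype(region2.dtype)

# A 4x4 region compressed to 2x2: each output pixel is the mean of a 2x2 block.
demo = np.arange(16, dtype=np.float64).reshape(4, 4)
print(compress_to_match(demo, (2, 2)))               # [[ 2.5  4.5] [10.5 12.5]]
```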
Referring to fig. 2, 3 and 10, a computer-readable storage medium 2000 of an embodiment of the present application includes a stored computer program 50, and the computer program 50 can implement the control method of any one of the above embodiments when being executed by the processor 30. For example, when the computer program is executed by the processor 30, the control method implemented includes: controlling the first camera 10 to shoot a first image i1 and controlling the second camera 20 to shoot a second image i2; cropping the first image i1 according to the current zoom factor of the imaging module 100 to obtain a cropped image i3; acquiring a first region i4 in which the scene in the cropped image i3 is the same as the scene in the second image i2, and acquiring a second region i5 in which the scene in the second image i2 is the same as the scene in the cropped image i3; acquiring pixel values of all pixel points of the first region i4 and the second region i5; and processing the cropped image i3 according to the pixel values of the pixel points in the first region i4 and the pixel values of the corresponding pixel points in the second region i5.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first", "second" and "first" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (16)

1. A control method is used for controlling an imaging module, wherein the imaging module comprises a plurality of cameras, the plurality of cameras comprise a first camera and a second camera, the zoom multiple of the first camera is smaller than that of the second camera, and the field of view of the second camera is at least partially overlapped with that of the first camera; the control method comprises the following steps:
controlling the first camera to shoot a first image and controlling the second camera to shoot a second image;
cutting the first image according to the current zoom multiple of the imaging module to obtain a cut image;
acquiring a first area of the scene in the cut image, which is the same as the scene in the second image, and acquiring a second area of the scene in the second image, which is the same as the scene in the cut image;
acquiring pixel values of all pixel points of the first area and the second area;
and processing the cut image according to the pixel values of the pixel points in the first region and the pixel values of the corresponding pixel points in the second region.
2. The control method according to claim 1, wherein the zoom factor of the first camera is a first zoom factor, and the zoom factor of the second camera is a second zoom factor, the control method further comprising:
and when the current zoom multiple of the imaging module is between the first zoom multiple and the second zoom multiple, controlling the first camera to shoot a first image and controlling the second camera to shoot a second image.
3. The control method according to claim 1, wherein the processing the cropped image according to the pixel values of the pixel points in the first region and the pixel values of the corresponding pixel points in the second region includes:
acquiring a difference value between the pixel value of the pixel point in the first area and the pixel value of the corresponding pixel point in the second area;
when the difference value is within a preset pixel difference interval, averaging the pixel values of the pixels in the first area and the pixel values of the corresponding pixels in the second area to obtain an average pixel value, and taking the average pixel value as the pixel value of the corresponding pixel in the cut image;
and when the difference value is not within the preset pixel difference interval, taking the pixel value of the pixel point of the first area or the pixel value of the corresponding pixel point of the second area as the pixel value of the corresponding pixel point of the cut image.
4. The method according to claim 3, wherein when the difference is not within the preset pixel difference interval, taking the pixel value of the pixel point in the first region or the pixel value of the corresponding pixel point in the second region as the pixel value of the corresponding pixel point of the cropped image includes:
when the difference value is smaller than the lower limit of the preset pixel difference interval, taking the pixel value of the corresponding pixel point of the second area as the pixel value of the corresponding pixel point of the cut image;
and when the difference value is larger than the upper limit of the preset pixel difference interval, taking the pixel value of the pixel point of the first area as the pixel value of the corresponding pixel point of the cut image.
5. The control method according to claim 1, wherein the number of pixels of the second area is greater than the number of pixels of the first area, the control method further comprising:
compressing the second area according to the number of the pixel points of the second area and the number of the pixel points of the first area to obtain a compressed image;
the obtaining of the pixel value of each pixel point of the first region and the second region includes:
acquiring pixel values of the first region and each pixel point of the compressed image;
the processing the cut image according to the pixel values of the pixel points in the first region and the pixel values of the pixel points corresponding to the second region includes:
and processing the cut image according to the pixel values of the pixel points in the first region and the pixel values of the corresponding pixel points of the compressed image.
6. The control method according to claim 5, wherein compressing the second region according to the number of pixels in the second region and the number of pixels in the first region to obtain a compressed image comprises:
and compressing the number of the pixel points of the second area according to the ratio of the number of the pixel points of the second area to the number of the pixel points of the first area so as to enable the number of the pixel points of the compressed image to be the same as that of the pixel points of the first area.
7. The control method according to claim 6, wherein the compressing the second region to obtain the compressed image comprises:
and averaging the pixel values of the plurality of pixel points in the second area to obtain the pixel value of the corresponding pixel point of the compressed image.
8. An imaging module, characterized in that the imaging module comprises a plurality of cameras and a processor, the plurality of cameras comprise a first camera and a second camera, the zoom multiple of the first camera is smaller than the zoom multiple of the second camera, and the field of view of the second camera is at least partially overlapped with the field of view of the first camera; the processor is configured to:
controlling the first camera to shoot a first image and controlling the second camera to shoot a second image;
cutting the first image according to the current zoom multiple of the imaging module to obtain a cut image;
acquiring a first area of the scene in the cut image, which is the same as the scene in the second image, and acquiring a second area of the scene in the second image, which is the same as the scene in the cut image;
acquiring pixel values of all pixel points of the first area and the second area;
and processing the cut image according to the pixel values of the pixel points in the first region and the pixel values of the corresponding pixel points in the second region.
9. The imaging module of claim 8, wherein the zoom factor of the first camera is a first zoom factor, wherein the zoom factor of the second camera is a second zoom factor, and wherein the processor is further configured to:
and when the current zoom multiple of the imaging module is between the first zoom multiple and the second zoom multiple, controlling the first camera to shoot a first image and controlling the second camera to shoot a second image.
10. The imaging module of claim 8, wherein the processor is configured to:
acquiring a difference value between the pixel value of the pixel point in the first area and the pixel value of the corresponding pixel point in the second area;
when the difference value is within a preset pixel difference interval, averaging the pixel values of the pixels in the first area and the pixel values of the corresponding pixels in the second area to obtain an average pixel value, and taking the average pixel value as the pixel value of the corresponding pixel in the cut image;
and when the difference value is not within the preset pixel difference interval, taking the pixel value of the pixel point of the first area or the pixel value of the corresponding pixel point of the second area as the pixel value of the corresponding pixel point of the cut image.
11. The imaging module of claim 10, wherein the processor is configured to:
when the difference value is smaller than the lower limit of the preset pixel difference interval, taking the pixel value of the corresponding pixel point of the second area as the pixel value of the corresponding pixel point of the cut image;
and when the difference value is larger than the upper limit of the preset pixel difference interval, taking the pixel value of the pixel point of the first area as the pixel value of the corresponding pixel point of the cut image.
12. The imaging module of claim 8, wherein the number of pixels in the second region is greater than the number of pixels in the first region, and wherein the processor is further configured to:
compressing the second area according to the number of the pixel points of the second area and the number of the pixel points of the first area to obtain a compressed image;
acquiring pixel values of the first region and each pixel point of the compressed image;
and processing the cut image according to the pixel values of the pixel points in the first region and the pixel values of the corresponding pixel points of the compressed image.
13. The imaging module of claim 12, wherein the processor is configured to:
and compressing the number of the pixel points of the second area according to the ratio of the number of the pixel points of the second area to the number of the pixel points of the first area so as to enable the number of the pixel points of the compressed image to be the same as that of the pixel points of the first area.
14. The imaging module of claim 13, wherein the processor is configured to:
and averaging the pixel values of the plurality of pixel points in the second area to obtain the pixel value of the corresponding pixel point of the compressed image.
15. An electronic device, comprising the imaging module of any one of claims 8 to 14 and a housing, wherein the imaging module is disposed in the housing.
16. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the control method of any one of claims 1 to 7.
CN201910829325.3A 2019-09-03 2019-09-03 Control method, imaging module, electronic device and computer-readable storage medium Active CN110602392B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910829325.3A CN110602392B (en) 2019-09-03 2019-09-03 Control method, imaging module, electronic device and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910829325.3A CN110602392B (en) 2019-09-03 2019-09-03 Control method, imaging module, electronic device and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN110602392A true CN110602392A (en) 2019-12-20
CN110602392B CN110602392B (en) 2021-10-01

Family

ID=68857193

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910829325.3A Active CN110602392B (en) 2019-09-03 2019-09-03 Control method, imaging module, electronic device and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN110602392B (en)

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170208248A1 (en) * 2005-06-03 2017-07-20 Craig P. Mowry System and apparatus for increasing quality and efficiency of film capture and methods of use thereof
CN101472075A (en) * 2007-12-28 2009-07-01 佳能株式会社 Image pickup apparatus and control method thereof
CN102075679A (en) * 2010-11-18 2011-05-25 无锡中星微电子有限公司 Method and device for acquiring image
US20140185145A1 (en) * 2011-05-30 2014-07-03 Yohei Takano Zoom lens, imaging device and information device
CN103856719A (en) * 2014-03-26 2014-06-11 深圳市金立通信设备有限公司 Photographing method and terminal
CN103986867A (en) * 2014-04-24 2014-08-13 宇龙计算机通信科技(深圳)有限公司 Image shooting terminal and image shooting method
CN106131397A (en) * 2016-06-21 2016-11-16 维沃移动通信有限公司 A kind of method that multi-medium data shows and electronic equipment
US20180292627A1 (en) * 2017-04-11 2018-10-11 Canon Kabushiki Kaisha Zoom lens and image pickup apparatus
CN107277360A (en) * 2017-07-17 2017-10-20 惠州Tcl移动通信有限公司 A kind of dual camera switching carries out method, mobile terminal and the storage device of zoom
CN107566742A (en) * 2017-10-27 2018-01-09 广东欧珀移动通信有限公司 Image pickup method, device, storage medium and electronic equipment
CN110012213A (en) * 2017-11-16 2019-07-12 佳能株式会社 Imaging-control apparatus and recording medium
CN107835372A (en) * 2017-11-30 2018-03-23 广东欧珀移动通信有限公司 Imaging method, device, mobile terminal and storage medium based on dual camera
CN207835595U (en) * 2017-12-14 2018-09-07 信利光电股份有限公司 A kind of dual camera module and terminal
CN108391035A (en) * 2018-03-26 2018-08-10 华为技术有限公司 A kind of image pickup method, device and equipment
CN208609085U (en) * 2018-08-31 2019-03-15 北京小米移动软件有限公司 Camera module and electronic equipment
CN110072070A (en) * 2019-03-18 2019-07-30 华为技术有限公司 A kind of multichannel kinescope method and equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113099143A (en) * 2021-03-29 2021-07-09 南昌欧菲光电技术有限公司 Image processing method and device, electronic equipment and storage medium
CN114915728A (en) * 2022-05-23 2022-08-16 普联技术有限公司 Method for zooming multi-view camera and multi-view camera

Also Published As

Publication number Publication date
CN110602392B (en) 2021-10-01

Similar Documents

Publication Publication Date Title
US11039059B2 (en) Imaging capturing device and imaging capturing method
CN110572581B (en) Zoom blurring image acquisition method and device based on terminal equipment
EP3328055B1 (en) Control method, control device and electronic device
CN111294517B (en) Image processing method and mobile terminal
US8937644B2 (en) Stereoscopic image capture
US9313400B2 (en) Linking-up photographing system and control method for linked-up cameras thereof
US20050134719A1 (en) Display device with automatic area of importance display
KR20180035899A (en) Dual Aperture Zoom Digital Camera User Interface
JP2008061148A (en) Imaging apparatus, zoom information display method, and zoom information display program
US20060133791A1 (en) Image pickup apparatus with autofocus function
CN110602392B (en) Control method, imaging module, electronic device and computer-readable storage medium
EP2280553A2 (en) Steroscopic Image Display Display Apparatus, Method, Recording Medium And Image Pickup Apparatus
CN115812312A (en) Image acquisition method, terminal device and computer-readable storage medium
JP6116436B2 (en) Image processing apparatus and image processing method
CN114449174A (en) Shooting method and device and electronic equipment
CN112529778B (en) Image stitching method and device of multi-camera equipment, storage medium and terminal
CN108810326B (en) Photographing method and device and mobile terminal
JP4788172B2 (en) Imaging apparatus and program
JP2000217022A (en) Electronic still camera and its image data recording and reproducing method
CN113473018B (en) Video shooting method and device, shooting terminal and storage medium
CN112019735B (en) Shooting method and device, storage medium and electronic device
CN113037988B (en) Zoom method, electronic device, and computer-readable storage medium
CN113905170A (en) Zoom control method, device, storage medium and electronic device
CN113630555B (en) Shooting method, shooting device, terminal and readable storage medium
CN113994656B (en) Control method, imaging module, electronic device, and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant