CN114979487B - Image processing method and device, electronic equipment and storage medium - Google Patents

Image processing method and device, electronic equipment and storage medium

Info

Publication number
CN114979487B
Authority
CN
China
Prior art keywords
image data
acquisition device
overlapping
boundary
marks
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210586416.0A
Other languages
Chinese (zh)
Other versions
CN114979487A (en)
Inventor
He Feng (贺峰)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority claimed from CN202210586416.0A
Publication of CN114979487A
Application granted
Publication of CN114979487B


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application disclose an image processing method, an image processing device, an electronic device, and a storage medium. In response to an image acquisition instruction, the image processing device acquires first image data captured by a first acquisition device for a target scene and second image data captured by a second acquisition device for the same target scene, the viewing-angle range of the first acquisition device being larger than that of the second acquisition device. The image data of the overlapping viewing angles of the first image data and the second image data, and the image data of their non-overlapping viewing angles, are then displayed in preview, wherein the image data of the overlapping viewing angles are displayed with a first display parameter and the image data of the non-overlapping viewing angles with a second display parameter.

Description

Image processing method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method, an image processing device, an electronic device, and a storage medium.
Background
When an electronic device equipped with a camera captures images, the captured images are displayed on a preview interface so that the user can determine the desired framing from the preview content. However, the content of current preview interfaces is relatively monotonous, and their intelligence is limited.
Disclosure of Invention
The present application provides an image processing method, an image processing device, an electronic device, and a storage medium, including the following technical solutions:
An image processing method, the method comprising:
acquiring, in response to an image acquisition instruction, first image data acquired by a first acquisition device for a target scene and second image data acquired by a second acquisition device for the target scene; the viewing-angle range of the first acquisition device is larger than that of the second acquisition device;
Previewing image data of overlapping view angles of the first image data and the second image data, and image data of non-overlapping view angles of the first image data and the second image data; wherein the image data of the overlapping view angle is displayed with a first display parameter; the image data of the non-overlapping viewing angles is displayed with a second display parameter.
In the above method, preferably, the image data of the overlapping view angle is the second image data, and the image data of the non-overlapping view angle belongs to the first image data; or alternatively
The image data of the overlapping view angle and the image data of the non-overlapping view angle both belong to the first image data.
In the above method, preferably, the previewing the image data of the overlapping view angle of the first image data and the second image data, and the image data of the non-overlapping view angle of the first image data and the second image data includes:
Determining a matching region in the first image data, which matches the second image data;
fusing the second image data into the first image data, and previewing the first image data fused with the second image data; the second image data is positioned in the matching area in the first image data, the second image data is displayed with the first display parameter, and the non-matching area in the first image data is displayed with the second display parameter; or alternatively
Displaying the first image data in a preview mode; and displaying the matching area by the first display parameter, and displaying the non-matching area in the first image data by the second display parameter.
The above method, preferably, the fusing the second image data into the first image data includes:
adjusting the size of the second image data to the size of the matching region;
overlaying the second image data to the matching region; each pixel point in the second image data covers a corresponding matching pixel point in the matching region.
In the above method, preferably, the first display parameter and the second display parameter represent different display effects of a target dimension; the target dimension includes any one of the following dimensions: transparency, clarity, brightness, color.
In the above method, preferably, a plurality of first marks are displayed in a preview display area, and different first marks are used for indicating different view ranges in the first image data;
wherein the view range indicated by one of the two first marks is a sub-range of the view range indicated by the other first mark.
The above method, preferably, further comprises:
outputting prompt information if the boundary of the matching region in the first image data that matches the second image data does not overlap the boundary of the framing range indicated by any one of the first marks; the prompt information prompts adjusting the acquisition range of the second acquisition device so that the boundary of the matching region overlaps the boundary of the framing range indicated by one of the first marks;
Or alternatively
And if the boundary of the matching area matched with the second image data in the first image data is not overlapped with the boundary of the framing range indicated by any one of the first marks, adjusting the acquisition range of the second acquisition device so that the boundary of the matching area matched with the second image data in the first image data is overlapped with the boundary of the framing range indicated by any one of the first marks.
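The boundary-overlap condition above can be sketched as follows. This is a minimal illustration only, assuming rectangular regions given as (top, left, bottom, right) tuples and a pixel tolerance; the function name and the tolerance value are not part of the application:

```python
def needs_adjustment(matching, marks, tol=2):
    """Return True if the matching region's boundary overlaps no first
    mark's framing-range boundary (within `tol` pixels), i.e. prompt
    information should be output or the second device auto-adjusted."""
    top, left, bottom, right = matching
    for mt, ml, mb, mr in marks:
        if (abs(top - mt) <= tol and abs(left - ml) <= tol
                and abs(bottom - mb) <= tol and abs(right - mr) <= tol):
            return False  # boundaries overlap a mark: no adjustment needed
    return True
```

Whether the result triggers a prompt or an automatic adjustment of the second acquisition device corresponds to the two alternatives described above.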
The above method, preferably, further comprises:
And hiding the plurality of first marks in response to a first mark hiding instruction when the boundary of a matching area matched with the second image data in the first image data overlaps with the boundary of a framing range indicated by any one of the first marks.
The above method, preferably, further comprises:
And hiding the first image data in response to a first image data hiding instruction, and previewing and displaying the second image data.
An image processing apparatus, the apparatus comprising:
The acquisition module is used for responding to the image acquisition instruction, acquiring first image data acquired by the first acquisition device aiming at a target scene and second image data acquired by the second acquisition device aiming at the target scene; the view angle range of the first acquisition device is larger than that of the second acquisition device;
an output module for previewing image data of overlapping view angles of the first image data and the second image data, and image data of non-overlapping view angles of the first image data and the second image data; wherein the image data of the overlapping view angle is displayed with a first display parameter; the image data of the non-overlapping viewing angles is displayed with a second display parameter.
An electronic device, comprising:
A memory for storing a program;
A processor for calling and executing the program in the memory, and implementing the respective steps of the image processing method according to any one of the above by executing the program.
A readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the image processing method according to any one of the above.
According to the above solutions, the image processing method and device, the electronic device, and the storage medium respond to an image acquisition instruction by acquiring first image data captured by the first acquisition device for the target scene and second image data captured by the second acquisition device for the target scene, the viewing-angle range of the first acquisition device being larger than that of the second acquisition device. The image data of the overlapping viewing angles of the two and the image data of their non-overlapping viewing angles are displayed in preview, the former with a first display parameter and the latter with a second display parameter. During acquisition, image data are thus collected by two acquisition devices with different viewing-angle ranges; during preview, the overlapping-angle and non-overlapping-angle image data are displayed with different display parameters. This increases the diversity of the information shown on the preview interface, improves the intelligence of the electronic device, facilitates the user's subsequent shooting operations, and improves the user's shooting experience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed for the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart of an implementation of an image processing method according to an embodiment of the present application;
FIG. 2 is a flowchart of an implementation of preview display image data provided by an embodiment of the present application;
FIG. 3 is a flowchart of another implementation of preview display image data provided by an embodiment of the present application;
FIG. 4 is an exemplary diagram of a preview display interface of an electronic device according to an embodiment of the present application;
FIG. 5 shows three examples of unreasonable composition provided by embodiments of the present application;
fig. 6 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in other sequences than those illustrated herein.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without any inventive effort, are intended to be within the scope of the application.
The image processing method provided by the embodiments of the present application can be used in an electronic device provided with at least two image acquisition devices, denoted as a first acquisition device and a second acquisition device for convenience of distinction and description. The two image acquisition devices are located on the same face of the electronic device, so that they can capture images of the same target scene. The viewing-angle range of the first acquisition device is larger than that of the second acquisition device.
As an example, the first acquisition device may be an acquisition device with a wide-angle lens, while the second acquisition device is an acquisition device with a standard lens. The wide-angle lens has a shorter focal length and a larger viewing angle than the standard lens.
As an example, the first acquisition device may be an acquisition device with a super-wide-angle lens, while the second acquisition device is an acquisition device with a standard lens. The super-wide-angle lens has a shorter focal length and a larger viewing angle than the standard lens.
As an example, the first acquisition device may be an acquisition device with a super-wide-angle lens, while the second acquisition device is an acquisition device with a wide-angle lens. The super-wide-angle lens has a shorter focal length and a larger viewing angle than the wide-angle lens.
Referring to fig. 1, a flowchart for implementing an image processing method according to an embodiment of the present application may include:
Step S101: and responding to the image acquisition instruction, acquiring first image data acquired by the first acquisition device aiming at the target scene and second image data acquired by the second acquisition device aiming at the target scene.
The image acquisition instruction can be triggered manually by a user, for example, the user clicks an icon of the photographing application software to trigger the image acquisition instruction so that the photographing application software can acquire an image;
Or the user clicks a certain interactive key in the non-photographing application software to trigger an image acquisition instruction so that the non-photographing application software can acquire the image.
Or the image acquisition instructions may be automatically triggered by the non-photographing application program according to its processing logic so that the non-photographing application can acquire the image.
When the image data are acquired, the two image acquisition devices with different viewing-angle ranges are started simultaneously and both capture image data of the same target scene (denoted as target scene A). The image data acquired by the first acquisition device for target scene A are denoted as first image data, and the image data acquired by the second acquisition device for target scene A are denoted as second image data.
Step S102: previewing and displaying the image data of the overlapping view angles of the first image data and the second image data and the image data of the non-overlapping view angles of the first image data and the second image data; wherein the image data of the overlapping view angles are displayed with a first display parameter; the image data of the non-overlapping viewing angles is displayed with the second display parameter.
After the image acquisition device acquires the image, the acquired image is usually subjected to preview display through a preview interface so as to be convenient for a user to check. When the application is used for preview display, the image data of the overlapped visual angles and the image data of the non-overlapped visual angles of the image data acquired by the two acquisition devices are displayed with different display parameters. The object represented by the image data of the overlapped view angles is an object which is located in the view angle range of the first acquisition device and the view angle range of the second acquisition device in the target scene A, and the object represented by the image data of the non-overlapped view angles is an object which is located in the view angle range of the first acquisition device and is located outside the view angle range of the second acquisition device in the target scene A.
Because both the image data of the overlapping viewing angles and the image data of the non-overlapping viewing angles are displayed, the user can view the objects within the viewing-angle range of the first acquisition device as well as those within the viewing-angle range of the second acquisition device. Moreover, since the objects inside the viewing-angle range of the second acquisition device and those outside it are displayed with different display parameters, the user can easily distinguish which objects fall within the range of each acquisition device.
According to the application, the image data of the overlapped view angles and the image data of the non-overlapped view angles of the image data acquired by the acquisition devices of the two different view angle ranges are displayed by different display parameters, so that the diversity of display information of the preview interface is increased, the intelligence of the electronic equipment is improved, the follow-up shooting operation of a user is facilitated, and the shooting experience of the user is improved.
In an alternative embodiment, the image data of overlapping view is second image data, and the image data of non-overlapping view belongs to the first image data. That is, in the present application, the second image data may be regarded as the image data of the overlapping view angle, and the image data of the non-overlapping view angle is located only in the first image data, and therefore, the image data of the non-overlapping view angle belongs to the first image data.
Or alternatively
The image data of the overlapping view and the image data of the non-overlapping view both belong to the first image data. That is, the image data of the overlapping view angle in the present application may be determined from the first image data.
In an alternative embodiment, an implementation flowchart of the foregoing preview display of the image data of the overlapping view angle of the first image data and the second image data, and the image data of the non-overlapping view angle of the first image data and the second image data is shown in fig. 2, may include:
step S201: a matching region is determined in the first image data that matches the second image data.
The matching region in the first image data that matches the second image data may be larger than, smaller than, or the same size as the second image data. Therefore, the second image data may be used as a template image, and the matching region may be determined in the first image data by searching for a matching block with a multi-scale template matching method.
The size of the second image data may be the size of a preview frame of the second image data.
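The multi-scale template matching described above could be sketched as follows. This is a NumPy-only illustration under assumptions not stated in the application (brute-force search, normalized cross-correlation as the matching score, nearest-neighbour scaling); a practical implementation would typically rely on an optimized library routine such as OpenCV's template matching:

```python
import numpy as np

def match_region(wide, tele, scales=(0.5, 0.75, 1.0)):
    """Search `wide` (first image data) for the block that best matches
    `tele` (second image data), trying several template scales.
    Returns (top, left, height, width) of the best-matching block."""
    best = (-np.inf, None)
    for s in scales:
        h = max(1, int(tele.shape[0] * s))
        w = max(1, int(tele.shape[1] * s))
        # Nearest-neighbour resize of the template to this scale.
        rows = np.arange(h) * tele.shape[0] // h
        cols = np.arange(w) * tele.shape[1] // w
        tmpl = tele[rows][:, cols].astype(float)
        tmpl -= tmpl.mean()
        for top in range(wide.shape[0] - h + 1):
            for left in range(wide.shape[1] - w + 1):
                block = wide[top:top + h, left:left + w].astype(float)
                block -= block.mean()
                denom = np.linalg.norm(tmpl) * np.linalg.norm(block)
                score = (tmpl * block).sum() / denom if denom else -np.inf
                if score > best[0]:
                    best = (score, (top, left, h, w))
    return best[1]
```

The returned rectangle plays the role of the matching region used in the following steps.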
Step S202: and fusing the second image data into the first image data, wherein the second image data is positioned in the matching area in the first image data.
As an example, the second image data may be resized to the size of the matching region described above, and then overlaid onto the matching region described above, wherein each pixel point in the second image data overlays a corresponding matching pixel point in the matching region.
In the embodiment of the application, the size of the preview frame of the second image data is adjustable. The size of the second image data may be adjusted to the size of the matching area, and in one embodiment, the size of the preview frame of the second image data may be adjusted to the size of the matching area. In particular, during adjustment, the aspect ratio of the preview frame can be kept unchanged.
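A minimal sketch of the fusion step described above, resizing the second image data to the matching region and covering each matching pixel; nearest-neighbour resizing is an illustrative assumption, as the application does not specify the resizing method:

```python
import numpy as np

def fuse(wide, tele, region):
    """Resize `tele` to the matching region of `wide` and overlay it, so
    that each pixel of the second image data covers its matching pixel."""
    top, left, h, w = region
    rows = np.arange(h) * tele.shape[0] // h   # nearest-neighbour resize
    cols = np.arange(w) * tele.shape[1] // w   # to the region's size
    fused = wide.copy()
    fused[top:top + h, left:left + w] = tele[rows][:, cols]
    return fused
```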
Step S203: preview displaying the first image data fused with the second image data; the second image data is displayed with the first display parameter, and the non-matching region in the first image data is displayed with the second display parameter.
In an alternative embodiment, an implementation flowchart of the foregoing preview display of the image data of the overlapping view angle of the first image data and the second image data, and the image data of the non-overlapping view angle of the first image data and the second image data is shown in fig. 3, may include:
Step 301: a matching region is determined in the first image data that matches the second image data.
The specific implementation process may refer to step S201, and will not be described herein.
Step 303: displaying the first image data in a preview mode; the matching area is displayed with a first display parameter, and the non-matching area in the first image data is displayed with a second display parameter.
In the embodiment of the application, after the matching area is determined in the first image data, the first image data is directly previewed and displayed, the matching area is displayed with a first display parameter, and the non-matching area is displayed with a second display parameter, so that the matching area and the non-matching area in the first image data show different display effects.
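The two-parameter preview of step 303 can be illustrated with brightness as the target dimension; the gain values and function name below are illustrative assumptions, not part of the application:

```python
import numpy as np

def preview(wide, region, first_gain=1.0, second_gain=0.4):
    """Display the matching region with the first display parameter and
    the non-matching area of the first image data with the second display
    parameter (here the target dimension is brightness, via a gain)."""
    top, left, h, w = region
    out = wide.astype(float) * second_gain      # non-matching area dimmed
    out[top:top + h, left:left + w] = wide[top:top + h, left:left + w] * first_gain
    return np.clip(out, 0, 255).astype(np.uint8)
```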
In an alternative embodiment, the first display parameter and the second display parameter characterize different display effects of the target dimension; the target dimension comprises any one of the following dimensions: transparency, clarity, brightness, color.
Optionally, the display effect of the target dimension represented by the first display parameter is better than the display effect of the target dimension represented by the second display parameter.
As an example, the first display parameter is determined by the application software that triggered the image acquisition instruction, so that the display effect of the target dimension represented by the first display parameter is suitable for viewing by a user; the second display parameter is obtained by adjusting the first display parameter, so that the display effect of the target dimension represented by the second display parameter is worse than that represented by the first display parameter.
As an example, the transparency characterized by the first display parameter is less than or equal to the first transparency, and the transparency characterized by the second display parameter is greater than the second transparency; the second transparency is greater than or equal to the first transparency.
As an example, the first display parameter characterizes a definition that is greater than the first definition and the second display parameter characterizes a definition that is less than the second definition; the second sharpness is less than or equal to the first sharpness.
As an example, the first display parameter characterizes a luminance that is greater than the first luminance value and less than the second luminance value; the brightness value represented by the second display parameter is smaller than the third brightness value; the third luminance value is less than or equal to the first luminance value.
As an example, the color characterized by the first display parameter is a color determined based on white balance; the second display parameter characterizes a color having a specific color temperature (e.g. red, blue or orange, etc.), i.e. a color that does not meet the white balance requirements.
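Taking transparency as the target dimension, the difference between the two display parameters can be sketched as simple alpha compositing; the function and parameter names, and the convention that 0 means opaque, are illustrative assumptions:

```python
import numpy as np

def with_transparency(img, background, alpha):
    """Render `img` over `background` with transparency `alpha`
    (0 = opaque, 1 = fully transparent), the target dimension here."""
    a = np.clip(alpha, 0.0, 1.0)
    return ((1 - a) * img.astype(float) + a * background.astype(float)).astype(np.uint8)
```

Overlapping-angle image data would then use a transparency at or below the first transparency, and non-overlapping-angle data a transparency above the second transparency.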
In an alternative embodiment, a plurality of marks (first marks for convenience of description and distinction) may also be displayed in the preview display area, with different first marks being used to indicate different viewing ranges in the first image data.
Wherein the view range indicated by one of the two first marks is a sub-range of the view range indicated by the other first mark.
Fig. 4 is a schematic diagram of a preview display interface of an electronic device according to an embodiment of the present application. The brighter area inside the middle dash-dot frame is the image data of the overlapping viewing angles, and the remaining area of the preview display interface is the image data of the non-overlapping viewing angles.
In this example, the dashed boxes are first marks, and a total of 5 dashed boxes, that is, 5 first marks, are numbered 1 to 5 in sequence. Wherein,
The view range indicated by the number 5 first mark is a sub-range of the view range indicated by the number 4 first mark, and is also a sub-range of the view range indicated by the number 3 first mark, a sub-range of the view range indicated by the number 2 first mark, and a sub-range of the view range indicated by the number 1 first mark.
The view range indicated by the number 4 first mark is a sub-range of the view range indicated by the number 3 first mark, a sub-range of the view range indicated by the number 2 first mark, and a sub-range of the view range indicated by the number 1 first mark.
The view range indicated by the number 3 first mark is a sub-range of the view range indicated by the number 2 first mark and a sub-range of the view range indicated by the number 1 first mark.
The view range indicated by the first mark No. 2 is a sub-range of the view range indicated by the first mark No. 1.
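The nesting relation of the first marks can be illustrated with centred rectangles. The application determines the marks from composition principles (e.g. via a marking model), so the centred, evenly stepped layout below is purely an assumption for illustration:

```python
def nested_marks(height, width, count=5, step=0.1):
    """Generate `count` centred framing rectangles, each a sub-range of
    the previous one (like the numbered dashed frames in FIG. 4).
    Returns (top, left, bottom, right) tuples, outermost first."""
    marks = []
    for i in range(count):
        dy = int(height * step * i)
        dx = int(width * step * i)
        marks.append((dy, dx, height - dy, width - dx))
    return marks
```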
Optionally, the overlapped view-finding range and the non-overlapped view-finding range in the view-finding ranges indicated by any two adjacent first marks are displayed through different display parameters, so that the overlapped view-finding range and the non-overlapped view-finding range show gradual change effect and/or stereoscopic effect.
For example, in fig. 4, two first marks with adjacent numbers are adjacent first marks: the No. 5 and No. 4 first marks are adjacent, the No. 4 and No. 3 first marks are adjacent, the No. 3 and No. 2 first marks are adjacent, and the No. 2 and No. 1 first marks are adjacent.
Taking the No. 5 and No. 4 first marks as an example: the overlapping framing range of the two (denoted as the first overlapping framing range for convenience of description and distinction) is the area inside the No. 5 dashed frame, and the non-overlapping framing range (denoted as the first non-overlapping framing range) is the area inside the No. 4 dashed frame but outside the No. 5 dashed frame; the first overlapping framing range and the first non-overlapping framing range need to be displayed with different display parameters. Similarly, taking the No. 4 and No. 3 first marks as an example: the overlapping framing range of the two (denoted as the second overlapping framing range) is the area inside the No. 4 dashed frame, and the non-overlapping framing range (denoted as the second non-overlapping framing range) is the area inside the No. 3 dashed frame but outside the No. 4 dashed frame; the second overlapping framing range and the second non-overlapping framing range need to be displayed with different display parameters. In one embodiment, since the first non-overlapping framing range is immediately adjacent to the second non-overlapping framing range, these two ranges may also be displayed with different display parameters.
As an example, different display parameters of overlapping and non-overlapping ones of the view ranges indicated by any adjacent two of the first marks characterize different brightnesses or different grayscales.
Optionally, the display parameter of the overlapping framing range in the framing ranges indicated by any two adjacent first marks is larger than that of the non-overlapping framing range, or smaller than it, so that the display content within each first-mark range presents a gradual-change effect in the preview interface; in some scenes, this gradual change can produce a stereoscopic effect.
As an example, the display parameters may be such that the brightness of the overlapping view range in the view ranges indicated by any two adjacent first marks is greater than the brightness of the non-overlapping view range; or the brightness of the overlapping view range is smaller than that of the non-overlapping view range; or the grayscale of the overlapping view range is greater than that of the non-overlapping view range; or the grayscale of the overlapping view range is smaller than that of the non-overlapping view range.
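The gradual-change rendering described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the rectangle representation of each first mark and the per-level brightness factor `step` are assumptions made for the example.

```python
import numpy as np

def render_graded_preview(image, mark_rects, step=0.25):
    """Darken each successively narrower view range so the preview
    shows the gradual-change effect across the nested first marks.

    image:      H x W x 3 uint8 preview frame.
    mark_rects: (x0, y0, x1, y1) rectangles ordered from the widest
                view range to the narrowest (each nested in the last).
    step:       per-level brightness reduction (an assumed parameter).
    """
    out = image.astype(np.float32)
    for x0, y0, x1, y1 in mark_rects:
        # Inside each deeper mark, the overlapping range is rendered
        # with a lower brightness than the surrounding range.
        out[y0:y1, x0:x1] *= (1.0 - step)
    return np.clip(np.rint(out), 0, 255).astype(np.uint8)
```

Because the factors multiply, a pixel inside k nested marks ends up at brightness (1 − step)^k of the original, which is exactly the per-ring gradient the embodiment describes.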
In practical application, some users struggle with composition when photographing. For example, when photographing a portrait, the image edge may cut across the neck, wrist or ankle; fig. 5 shows three such examples of unreasonable composition. In order to improve the intelligence of the electronic device and improve the shooting experience of the user, in the embodiment of the application, after the first image data is acquired, a plurality of first marks conforming to composition principles can be determined in the first image data and displayed in the preview display area; that is, different first marks correspond to different composition principles.
As an example, the first image data may be input into a pre-trained marking model, resulting in a plurality of first marks output by the marking model.
The samples used for training the marking model are images, and the label of each image is a set of pre-annotated marking frames conforming to different composition principles. When training the marking model, a sample is input into the marking model to obtain the marking frames output by the model for that sample, and the model is trained with the goal of making the output marking frames approach the labelled marking frames of the sample, until a training ending condition is met.
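The training loop above can be sketched with a deliberately tiny stand-in model. Everything here is an illustrative assumption — the patent does not specify the model architecture, loss, or stopping criterion — so a linear map from an image feature vector to box coordinates with a mean-squared-error loss is used purely to show the "train until the ending condition is met" structure.

```python
import numpy as np

def train_marking_model(samples, labels, lr=0.05, tol=1e-10, max_iter=5000):
    """Toy stand-in for the marking-model training described above.

    samples: n x d feature vectors extracted from training images.
    labels:  n x 4 pre-annotated marking frames (x0, y0, x1, y1).
    Returns the learned d x 4 weight matrix.
    """
    n, d = samples.shape
    w = np.zeros((d, labels.shape[1]))
    prev_loss = np.inf
    for _ in range(max_iter):
        pred = samples @ w                     # marking frames output by the model
        loss = np.mean((pred - labels) ** 2)   # distance to the labelled frames
        if prev_loss - loss < tol:             # training ending condition
            break
        prev_loss = loss
        grad = 2.0 * samples.T @ (pred - labels) / n
        w -= lr * grad                         # move outputs toward the labels
    return w
```

A real marking model would be a detection network trained on annotated composition boxes; the loop structure — predict frames, compare to labels, update, stop on convergence — is the part this sketch shares with the description.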
By outputting first marks corresponding to different composition rules, the user can be assisted in composition when shooting an image, the intelligence of the electronic equipment is improved, and the shooting experience of the user is improved.
In an alternative embodiment, if the boundary of the matching area in the first image data, which matches the second image data, does not overlap with the boundary of the framing range indicated by any one of the first marks, it is indicated that the second image data does not conform to any one of the composition principles, and therefore, a prompt message is output.
That is, if the second image data does not match the image data within the framing range indicated by any one of the second marks in the first image data, the hint information is output.
The prompt information indicates that the acquisition range of the second acquisition device is adjusted so that the boundary of the matching area matched with the second image data in the first image data overlaps with the boundary of the framing range indicated by any one of the first marks.
After the prompt information is obtained, when the acquisition range of the second acquisition device is adjusted, the user can adjust the focal length of the second acquisition device, move the electronic equipment in the direction away from the target scene A, or move the electronic equipment in the direction close to the target scene A until the boundary of the matching area matched with the second image data in the first image data overlaps with the boundary of the framing range indicated by any one of the first marks.
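The boundary check that decides whether to output the prompt information can be sketched as below. The rectangle representation of the matching area and the marks, and the pixel tolerance for "overlapping" boundaries, are assumptions made for the example; the prompt wording is likewise illustrative.

```python
def composition_prompt(match_rect, mark_rects, tol=4):
    """Return a hint string when the boundary of the matching area does
    not overlap the framing-range boundary of any first mark.

    match_rect: (x0, y0, x1, y1) of the area in the first image data
                that matches the second image data.
    mark_rects: framing ranges indicated by the first marks.
    tol:        pixel tolerance for boundary overlap (assumed value).
    """
    for rect in mark_rects:
        if all(abs(a - b) <= tol for a, b in zip(match_rect, rect)):
            return None  # boundaries overlap: a composition rule is met
    return ("Adjust the acquisition range of the second acquisition "
            "device (e.g. its focal length), or move toward or away "
            "from the scene, until the frame aligns with a mark.")
```

The caller would re-run this check each time the second image data (or the set of first marks) changes, clearing the prompt once `None` is returned.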
When the electronic device is moved, the first image data acquired by the first acquisition device also changes; at this time, the first marks are re-determined and the re-determined first marks are displayed in the preview.
In an alternative embodiment, if the boundary of the matching area in the first image data, which matches the second image data, does not overlap with the boundary of the framing range indicated by any one of the first marks, which indicates that the second image data does not conform to any one of the composition principles, the acquisition range of the second acquisition device may be automatically adjusted (for example, the focal length of the second acquisition device is adjusted) so that the boundary of the matching area in the first image data, which matches the second image data, overlaps with the boundary of the framing range indicated by any one of the first marks.
Further, the user may also manually adjust the acquisition range of the second acquisition device, such as adjusting the focal length of the second acquisition device, or moving the electronic device away from the target scene a, or moving the electronic device toward the target scene a, or the like, to obtain second image data overlapping with the boundary of the framing range indicated by the other first marks in the first image data.
In an alternative embodiment, in a case where the boundary of the matching area in the first image data that matches the second image data overlaps the boundary of the framing range indicated by any one of the first marks, the plurality of first marks are hidden in response to the first mark hiding instruction.
As shown in fig. 4, the boundary of the matching region in the first image data, which matches the second image data, overlaps the boundary of the framing range indicated by the first mark No. 3.
That is, in the case where the second image data conforms to a certain composition rule, the user can perform a designation operation to trigger the first mark hiding instruction to hide the plurality of first marks, so that the user views a neat picture.
In an alternative embodiment, the first image data is hidden and the second image data is preview displayed in response to the first image data hiding instruction.
As an example, the user may trigger the first image data hiding instruction at any time; that is, the first image data hiding instruction may be generated and responded to regardless of whether the boundary of the matching area matching the second image data in the first image data overlaps with the boundary of the framing range indicated by any one of the first marks. The electronic device then hides the first image data and previews the second image data on the entire preview display interface (that is, the interface in fig. 4 in which the image data of the overlapping view angle and the image data of the non-overlapping view angle are displayed).
As another example, the first image data hiding instruction may be triggered only in the case where the boundary of the matching area matching the second image data in the first image data overlaps with the boundary of the framing range indicated by any one of the first marks; in that case the electronic device hides the first image data and previews only the second image data.
The aspect ratio of the second image data may remain unchanged while the second image data is displayed in preview throughout the preview display interface.
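Keeping the aspect ratio unchanged while filling the preview display interface amounts to a standard letterbox fit, which can be sketched as follows; the centring of the scaled image is an assumption, since the embodiment only requires the aspect ratio to be preserved.

```python
def fit_preserve_aspect(src_w, src_h, dst_w, dst_h):
    """Compute where to draw the second image data on the full preview
    display so that its aspect ratio remains unchanged.

    Returns (x, y, w, h): the top-left corner and size of the scaled
    image inside a dst_w x dst_h display, centred with letterboxing.
    """
    scale = min(dst_w / src_w, dst_h / src_h)  # largest scale that still fits
    w, h = round(src_w * scale), round(src_h * scale)
    return ((dst_w - w) // 2, (dst_h - h) // 2, w, h)
```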
In an alternative embodiment, after the user determines that the image is to be saved, an image saving instruction may be triggered, and in response to the image saving instruction, the second image data acquired by the second acquisition device in the preview display is saved.
Corresponding to the method embodiment, the embodiment of the present application further provides an image processing apparatus, and a schematic structural diagram of the image processing apparatus provided in the embodiment of the present application is shown in fig. 6, which may include:
An acquisition module 601 and an output module 602; wherein,
The acquiring module 601 is configured to acquire first image data acquired by a first acquiring device for a target scene and second image data acquired by a second acquiring device for the target scene in response to an image acquisition instruction; the view angle range of the first acquisition device is larger than that of the second acquisition device;
The output module 602 is configured to preview and display image data of overlapping view angles of the first image data and the second image data, and image data of non-overlapping view angles of the first image data and the second image data; wherein the image data of the overlapping view angle is displayed with a first display parameter; the image data of the non-overlapping viewing angles is displayed with a second display parameter.
According to the image processing device provided by the embodiment of the application, the image data of the overlapped view angles and the image data of the non-overlapped view angles of the image data acquired by the acquisition devices of the two different view angle ranges are displayed by different display parameters, so that the diversity of display information of a preview interface is increased, the intelligence of the electronic equipment is improved, the follow-up shooting operation of a user is facilitated, and the shooting experience of the user is improved.
In an optional embodiment, the image data of the overlapping view angle is the second image data, and the image data of the non-overlapping view angle belongs to the first image data; or alternatively
The image data of the overlapping view angle and the image data of the non-overlapping view angle both belong to the first image data.
In an alternative embodiment, the output module 602 includes:
a matching module for determining a matching region in the first image data that matches the second image data;
a fusion module, configured to fuse the second image data into the first image data, and preview and display the first image data fused with the second image data; the second image data is positioned in the matching area in the first image data, the second image data is displayed with the first display parameter, and the non-matching area in the first image data is displayed with the second display parameter; or alternatively
The preview display module is used for displaying the first image data in a preview mode; and displaying the matching area by the first display parameter, and displaying the non-matching area in the first image data by the second display parameter.
In an alternative embodiment, the fusion module is configured to:
adjusting the size of the second image data to the size of the matching region;
overlaying the second image data to the matching region; each pixel point in the second image data covers a corresponding matching pixel point in the matching region.
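The fusion module's two steps — resize the second image data to the matching region's size, then cover each matching pixel — can be sketched as below. The nearest-neighbour resize is an assumption for brevity; the embodiment only requires that the size be adjusted and every pixel of the matching region be covered.

```python
import numpy as np

def fuse_into_matching_region(first, second, region):
    """Fuse the second image data into the matching region of the first.

    first:  H x W x C array (first image data).
    second: h x w x C array (second image data).
    region: (x0, y0, x1, y1) matching area inside `first`.
    """
    x0, y0, x1, y1 = region
    rh, rw = y1 - y0, x1 - x0
    sh, sw = second.shape[:2]
    # Step 1: adjust the size of the second image data to the region size
    # (nearest-neighbour index maps from region pixels to source pixels).
    rows = np.arange(rh) * sh // rh
    cols = np.arange(rw) * sw // rw
    resized = second[rows[:, None], cols[None, :]]
    # Step 2: overlay — each pixel in the region is covered by the
    # corresponding pixel of the resized second image data.
    fused = first.copy()
    fused[y0:y1, x0:x1] = resized
    return fused
```

After fusion, the caller can apply the first display parameter to the region and the second display parameter to the rest of `fused`, matching the preview behaviour described for the fusion module.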
In an alternative embodiment, the first display parameter and the second display parameter characterize different display effects of a target dimension; the target dimension includes any one of the following dimensions: transparency, clarity, brightness, color.
In an alternative embodiment, the output module is further configured to display a plurality of first marks in a preview display area, and different first marks are used to indicate different viewing ranges in the first image data;
wherein the view range indicated by one of the two first marks is a sub-range of the view range indicated by the other first mark.
In an alternative embodiment, the output module is further configured to:
Outputting prompt information if the boundary of the matching area matched with the second image data in the first image data is not overlapped with the boundary of the view finding range indicated by any one of the first marks; the prompt information indicates that the acquisition range of the second acquisition device is adjusted so that the boundary of a matching area matched with the second image data in the first image data overlaps with the boundary of a framing range indicated by any one of the first marks;
In an alternative embodiment, the image processing apparatus further includes:
and the adjusting module is used for adjusting the acquisition range of the second acquisition device if the boundary of the matching area matched with the second image data in the first image data is not overlapped with the boundary of the framing range indicated by any one of the first marks, so that the boundary of the matching area matched with the second image data in the first image data is overlapped with the boundary of the framing range indicated by any one of the first marks.
In an alternative embodiment, the output module 602 is further configured to:
And hiding the plurality of first marks in response to a first mark hiding instruction when the boundary of a matching area matched with the second image data in the first image data overlaps with the boundary of a framing range indicated by any one of the first marks.
In an alternative embodiment, the output module 602 is further configured to:
And hiding the first image data in response to a first image data hiding instruction, and previewing and displaying the second image data.
Corresponding to the method embodiment, the application further provides an electronic device, and a schematic structural diagram of the electronic device is shown in fig. 7, which may include: at least one processor 1, at least one communication interface 2, at least one memory 3 and at least one communication bus 4.
In the embodiment of the present application, the number of the processor 1, the communication interface 2, the memory 3 and the communication bus 4 is at least one, and the processor 1, the communication interface 2 and the memory 3 complete communication with each other through the communication bus 4.
The processor 1 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present application, etc.
The memory 3 may comprise a high-speed RAM, and may also comprise a non-volatile memory or the like, such as at least one disk memory.
Wherein the memory 3 stores a program, the processor 1 may call the program stored in the memory 3, the program being for:
in response to the image acquisition instruction,
Acquiring first image data acquired by a first acquisition device aiming at a target scene and second image data acquired by a second acquisition device aiming at the target scene; the view angle range of the first acquisition device is larger than that of the second acquisition device;
Previewing image data of overlapping view angles of the first image data and the second image data, and image data of non-overlapping view angles of the first image data and the second image data; wherein the image data of the overlapping view angle is displayed with a first display parameter; the image data of the non-overlapping viewing angles is displayed with a second display parameter.
Alternatively, the refinement function and the extension function of the program may be described with reference to the above.
The embodiment of the present application also provides a storage medium storing a program adapted to be executed by a processor, the program being configured to:
in response to the image acquisition instruction,
Acquiring first image data acquired by a first acquisition device aiming at a target scene and second image data acquired by a second acquisition device aiming at the target scene; the view angle range of the first acquisition device is larger than that of the second acquisition device;
Previewing image data of overlapping view angles of the first image data and the second image data, and image data of non-overlapping view angles of the first image data and the second image data; wherein the image data of the overlapping view angle is displayed with a first display parameter; the image data of the non-overlapping viewing angles is displayed with a second display parameter.
Alternatively, the refinement function and the extension function of the program may be described with reference to the above.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the coupling or direct coupling or communication connection shown or discussed between components may be an indirect coupling or communication connection via some interfaces, devices or units, and may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
It should be understood that in the embodiments of the present application, the claims, the various embodiments, and the features may be combined with each other, so as to solve the foregoing technical problems.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application may be embodied essentially, or in a part contributing to the prior art, or in part, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. An image processing method, the method comprising:
responding to an image acquisition instruction, and simultaneously starting a first acquisition device and a second acquisition device;
acquiring first image data acquired by a first acquisition device aiming at a target scene and second image data acquired by a second acquisition device aiming at the target scene; the view angle range of the first acquisition device is larger than that of the second acquisition device;
Previewing image data of overlapping view angles of the first image data and the second image data, and image data of non-overlapping view angles of the first image data and the second image data; wherein the image data of the overlapping view angle is displayed with a first display parameter; the image data of the non-overlapping viewing angles is displayed with a second display parameter.
2. The method of claim 1, the image data of the overlapping view being the second image data, the image data of the non-overlapping view belonging to the first image data; or alternatively
The image data of the overlapping view angle and the image data of the non-overlapping view angle both belong to the first image data.
3. The method of claim 1, the previewing displaying image data of overlapping perspectives of the first image data and the second image data, and image data of non-overlapping perspectives of the first image data and the second image data, comprising:
Determining a matching region in the first image data, which matches the second image data;
fusing the second image data into the first image data, and previewing the first image data fused with the second image data; the second image data is positioned in the matching area in the first image data, the second image data is displayed with the first display parameter, and the non-matching area in the first image data is displayed with the second display parameter; or alternatively
Displaying the first image data in a preview mode; and displaying the matching area by the first display parameter, and displaying the non-matching area in the first image data by the second display parameter.
4. A method according to claim 3, the fusing the second image data into the first image data comprising:
adjusting the size of the second image data to the size of the matching region;
overlaying the second image data to the matching region; each pixel point in the second image data covers a corresponding matching pixel point in the matching region.
5. The method of claim 1, the first display parameter and the second display parameter characterizing different display effects of a target dimension; the target dimension includes any one of the following dimensions: transparency, clarity, brightness, color.
6. The method of any of claims 1-5, displaying a plurality of first indicia in a preview display area, different first indicia being used to indicate different viewing ranges in the first image data;
wherein the view range indicated by one of the two first marks is a sub-range of the view range indicated by the other first mark.
7. The method of claim 6, further comprising:
Outputting prompt information if the boundary of the matching area matched with the second image data in the first image data is not overlapped with the boundary of the view finding range indicated by any one of the first marks; the prompt information indicates that the acquisition range of the second acquisition device is adjusted so that the boundary of a matching area matched with the second image data in the first image data overlaps with the boundary of a framing range indicated by any one of the first marks;
Or alternatively
And if the boundary of the matching area matched with the second image data in the first image data is not overlapped with the boundary of the framing range indicated by any one of the first marks, adjusting the acquisition range of the second acquisition device so that the boundary of the matching area matched with the second image data in the first image data is overlapped with the boundary of the framing range indicated by any one of the first marks.
8. The method of claim 6, further comprising:
And hiding the plurality of first marks in response to a first mark hiding instruction when the boundary of a matching area matched with the second image data in the first image data overlaps with the boundary of a framing range indicated by any one of the first marks.
9. The method of any of claims 1-5, further comprising:
And hiding the first image data in response to a first image data hiding instruction, and previewing and displaying the second image data.
10. An image processing apparatus, the apparatus comprising:
The acquisition module is used for responding to an image acquisition instruction, and simultaneously starting the first acquisition device and the second acquisition device to acquire first image data acquired by the first acquisition device aiming at a target scene and second image data acquired by the second acquisition device aiming at the target scene; the view angle range of the first acquisition device is larger than that of the second acquisition device;
an output module for previewing image data of overlapping view angles of the first image data and the second image data, and image data of non-overlapping view angles of the first image data and the second image data; wherein the image data of the overlapping view angle is displayed with a first display parameter; the image data of the non-overlapping viewing angles is displayed with a second display parameter.
CN202210586416.0A 2022-05-27 2022-05-27 Image processing method and device, electronic equipment and storage medium Active CN114979487B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210586416.0A CN114979487B (en) 2022-05-27 2022-05-27 Image processing method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210586416.0A CN114979487B (en) 2022-05-27 2022-05-27 Image processing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114979487A CN114979487A (en) 2022-08-30
CN114979487B true CN114979487B (en) 2024-06-18

Family

ID=82956524

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210586416.0A Active CN114979487B (en) 2022-05-27 2022-05-27 Image processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114979487B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024119318A1 (en) * 2022-12-05 2024-06-13 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Imaging device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017128914A1 (en) * 2016-01-27 2017-08-03 努比亚技术有限公司 Photographing method and device
CN112584034A (en) * 2019-09-30 2021-03-30 虹软科技股份有限公司 Image processing method, image processing device and electronic equipment applying same
CN112995500A (en) * 2020-12-30 2021-06-18 维沃移动通信(杭州)有限公司 Shooting method, shooting device, electronic equipment and medium
WO2021169669A1 (en) * 2020-02-28 2021-09-02 Oppo广东移动通信有限公司 Photography composition method, terminal, and storage medium

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0546718A (en) * 1991-08-14 1993-02-26 Kowa Co Picture editing device
JP4298407B2 (en) * 2002-09-30 2009-07-22 キヤノン株式会社 Video composition apparatus and video composition method
CN1992811A (en) * 2005-12-30 2007-07-04 摩托罗拉公司 Method and system for displaying adjacent image in the preview window of camera
KR101599881B1 (en) * 2009-06-30 2016-03-04 삼성전자주식회사 Digital image signal processing apparatus method for controlling the apparatus and medium for recording the method
JP5915514B2 (en) * 2012-12-21 2016-05-11 カシオ計算機株式会社 Image processing apparatus, image processing method, and program
KR20160078023A (en) * 2014-12-24 2016-07-04 삼성전자주식회사 Apparatus and method for controlling display
CN104580897B (en) * 2014-12-25 2018-02-06 魅族科技(中国)有限公司 Shoot find a view method and terminal
US10955657B2 (en) * 2018-12-20 2021-03-23 Acclarent, Inc. Endoscope with dual image sensors
EP4029450A4 (en) * 2019-09-10 2022-11-02 FUJIFILM Corporation Image inspection device, console, and radiation imaging system
CN112541858A (en) * 2019-09-20 2021-03-23 华为技术有限公司 Video image enhancement method, device, equipment, chip and storage medium
CN111541845B (en) * 2020-04-30 2022-06-24 维沃移动通信(杭州)有限公司 Image processing method and device and electronic equipment
CN113709428A (en) * 2020-05-20 2021-11-26 中强光电股份有限公司 Projection system and projection method
CN114113185A (en) * 2020-08-31 2022-03-01 中国科学院生物物理研究所 Imaging method for realizing zoom scanning of scanning electron microscope
CN114531551B (en) * 2021-12-31 2023-12-26 联想(北京)有限公司 Image processing method and device, electronic equipment and storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017128914A1 (en) * 2016-01-27 2017-08-03 努比亚技术有限公司 Photographing method and device
CN112584034A (en) * 2019-09-30 2021-03-30 虹软科技股份有限公司 Image processing method, image processing device and electronic equipment applying same
WO2021169669A1 (en) * 2020-02-28 2021-09-02 Oppo广东移动通信有限公司 Photography composition method, terminal, and storage medium
CN112995500A (en) * 2020-12-30 2021-06-18 维沃移动通信(杭州)有限公司 Shooting method, shooting device, electronic equipment and medium

Also Published As

Publication number Publication date
CN114979487A (en) 2022-08-30

Similar Documents

Publication Publication Date Title
CN112135046B (en) Video shooting method, video shooting device and electronic equipment
CN112565589B (en) Photographing preview method and device, storage medium and electronic equipment
US7349020B2 (en) System and method for displaying an image composition template
WO2017016030A1 (en) Image processing method and terminal
CN111654635A (en) Shooting parameter adjusting method and device and electronic equipment
EP3664016B1 (en) Image detection method and apparatus, and terminal
US11132770B2 (en) Image processing methods and apparatuses, computer readable storage media and electronic devices
CN106412458A (en) Image processing method and apparatus
US10885720B2 (en) Virtual display method, device, electronic apparatus and computer readable storage medium
US10116859B2 (en) Image processing apparatus and image processing method that present assist information to assist photographing
CN114979487B (en) Image processing method and device, electronic equipment and storage medium
CN105635568A (en) Image processing method in mobile terminal and mobile terminal
CN106650583B (en) Method for detecting human face, device and terminal device
JP6594666B2 (en) Imaging auxiliary device, imaging device, and imaging auxiliary method
CN108337427B (en) Image processing method and electronic equipment
CN113194256A (en) Shooting method, shooting device, electronic equipment and storage medium
CN106815237B (en) Search method, search device, user terminal and search server
CN110177216B (en) Image processing method, image processing device, mobile terminal and storage medium
CN104994282B (en) A kind of big visual angle camera control method and user terminal
JP6497030B2 (en) Imaging system, information processing apparatus, imaging method, program, storage medium
WO2019215797A1 (en) Composition advice system, composition advice method, camera and program
CN114339029B (en) Shooting method and device and electronic equipment
JP2020177605A (en) Image processing device
JP2020101897A (en) Information processing apparatus, information processing method and program
CN112653841B (en) Shooting method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant