CN116055712B - Method, device, chip, electronic equipment and medium for determining film forming rate - Google Patents


Info

Publication number
CN116055712B
CN116055712B
Authority
CN
China
Prior art keywords
image
scene
imaging quality
information
photographing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210983147.1A
Other languages
Chinese (zh)
Other versions
CN116055712A (en)
Inventor
刘宜恩
陈雪飞
陈祥
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202210983147.1A
Publication of CN116055712A
Application granted
Publication of CN116055712B
Legal status: Active


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/002Diagnosis, testing or measuring for television systems or their details for television cameras


Abstract

The embodiment of the application provides a method, an apparatus, a chip, an electronic device, and a medium for determining a film forming rate. The method includes: configuring a first shooting scene according to first scene information, where the first scene information includes configuration items affecting imaging quality; acquiring a first image shot by a shooting device in the first shooting scene; acquiring imaging quality information of the first image, where the imaging quality information is used to reflect the imaging quality of an image; and determining a first film forming rate according to the imaging quality information of the first image. The embodiment of the application can determine the film forming rate automatically, solving the problem that determining the film forming rate manually is time-consuming and labor-intensive.

Description

Method, device, chip, electronic equipment and medium for determining film forming rate
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method, an apparatus, a chip, an electronic device, and a medium for determining a film forming rate.
Background
As photographing devices (such as smartphones) and their imaging systems become ever more convenient, users photograph more and more frequently in daily life. However, film forming may fail when a user photographs with a photographing device. It is therefore necessary to determine the film forming rate of the photographing device.
Currently, the film forming rate is determined manually, but that implementation is time-consuming and labor-intensive.
Disclosure of Invention
The embodiment of the application provides a method, an apparatus, a chip, an electronic device, and a medium for determining a film forming rate, which can automatically determine the film forming rate of a shooting device and solve the problem that determining the film forming rate manually is time-consuming and labor-intensive.
In a first aspect, an embodiment of the present application provides a method for determining a film forming rate, including: configuring a first shooting scene according to first scene information, where the first scene information includes configuration items affecting imaging quality; acquiring a first image shot by a shooting device in the first shooting scene; acquiring imaging quality information of the first image, where the imaging quality information is used to reflect the imaging quality of the image; and determining a first film forming rate according to the imaging quality information of the first image.
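As a rough illustration, the first-aspect steps reduce to counting how many captured images pass an imaging-quality check. The sketch below assumes a scalar quality score and a pass threshold, neither of which is fixed by this application; the function and parameter names are hypothetical:

```python
# Hypothetical sketch only: this application does not specify how
# imaging quality information maps to pass/fail. A scalar score and a
# fixed threshold are assumed here.

def determine_film_forming_rate(images, quality_of, threshold=0.5):
    """Return the fraction of captured images whose imaging quality passes."""
    if not images:
        return 0.0
    passed = sum(1 for img in images if quality_of(img) >= threshold)
    return passed / len(images)
```

For example, with quality scores 0.2, 0.6, and 0.9 and the default threshold, two of the three images pass, giving a rate of 2/3.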
Optionally, the method further includes: configuring a second shooting scene according to second scene information, where the second scene information includes configuration items affecting imaging quality; acquiring a second image shot by the shooting device in the second shooting scene; acquiring imaging quality information of the second image; determining a second film forming rate according to the imaging quality information of the second image; comparing the first film forming rate with the second film forming rate to obtain a comparison result; and outputting the comparison result.
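The optional comparison step can be sketched as follows; the textual form of the comparison result is an assumption, since this application does not specify an output format:

```python
# Hypothetical sketch: compares the film forming rates obtained in two
# shooting scenes. The result strings are illustrative, not taken from
# this application.

def compare_film_forming_rates(first_rate, second_rate):
    """Return a human-readable comparison result for two film forming rates."""
    if first_rate > second_rate:
        return "first scene higher"
    if first_rate < second_rate:
        return "second scene higher"
    return "equal"
```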
Optionally, the method further includes: configuring a third shooting scene according to third scene information, where the third scene information includes configuration items affecting imaging quality; acquiring a third image shot by the shooting device in the third shooting scene; and acquiring imaging quality information of the third image.
In this case, determining the first film forming rate according to the imaging quality information of the first image includes: determining the first film forming rate according to the imaging quality information of the first image and the imaging quality information of the third image.
Optionally, the configuration item includes: configuration information of at least one of an illumination environment in which the photographing apparatus is located, a shake mode of the photographing apparatus, a zoom magnification of the photographing apparatus, an apparatus type of the photographing apparatus, an operating system version of the photographing apparatus, and a photographing mode of the photographing apparatus.
Optionally, the imaging quality information includes: at least one of sharpness and blurriness.
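This application does not prescribe how sharpness or blurriness is computed. One common proxy, shown here purely as an assumption, is the variance of the image Laplacian: higher variance indicates more high-frequency detail (a sharper image), and low variance suggests blur.

```python
import numpy as np

# Illustrative only: Laplacian variance is one common sharpness proxy,
# not a metric specified by this application.
def laplacian_sharpness(gray):
    """Variance of the discrete Laplacian of a 2-D grayscale array."""
    k = np.array([[0., 1., 0.],
                  [1., -4., 1.],
                  [0., 1., 0.]])
    h, w = gray.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(3):          # 'valid' 3x3 convolution, no padding
        for j in range(3):
            out += k[i, j] * gray[i:i + h - 2, j:j + w - 2]
    return float(out.var())
```

A flat (featureless) image yields zero Laplacian variance, while a high-contrast pattern yields a large one.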
Optionally, the configuration item includes: configuration information of at least one of a zoom magnification of the photographing apparatus and an apparatus type of the photographing apparatus;
the configuring the first photographing scene includes: executing a first operation, wherein the first image comprises an image of the image card, and the relative position relationship between the image of the image card and the first image is the same as a preset first relative position relationship; the image card is located in the image acquisition range of the shooting equipment.
Optionally, the method further includes: classifying and storing the first image according to the first scene information corresponding to the first image. In this case, determining the first film forming rate according to the imaging quality information of the first image includes: determining the first film forming rate according to the imaging quality information of the first image and the imaging quality information of the other images in the image class that includes the first image.
Optionally, the method further includes: classifying the first image into a first image set corresponding to the shooting device, where the first image set further includes images shot by the shooting device in other shooting scenes.
In this case, determining the first film forming rate according to the imaging quality information of the first image includes: determining the first film forming rate according to the imaging quality information of the images in the first image set, where the first film forming rate is the film forming rate of the shooting device.
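Grouping images per shooting device and computing each device's film forming rate can be sketched as below; the record format (a device identifier paired with a scalar quality score) and the threshold are assumptions for illustration, not details of this application:

```python
from collections import defaultdict

# Hypothetical sketch: classify images into per-device image sets and
# compute each device's film forming rate. The (device_id, score)
# record format is an assumption.

def film_forming_rates_by_device(records, threshold=0.5):
    """records: iterable of (device_id, quality_score) pairs.
    Returns {device_id: fraction of that device's images that pass}."""
    groups = defaultdict(list)
    for device_id, score in records:
        groups[device_id].append(score)
    return {d: sum(s >= threshold for s in scores) / len(scores)
            for d, scores in groups.items()}
```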
In a second aspect, an embodiment of the present application provides an apparatus for determining a film forming rate, including: a configuration module configured to configure a first shooting scene according to first scene information, where the first scene information includes configuration items affecting imaging quality; a first acquisition module configured to acquire a first image shot by the shooting device in the first shooting scene; a second acquisition module configured to acquire imaging quality information of the first image, where the imaging quality information is used to reflect the imaging quality of the image; and a determining module configured to determine a first film forming rate according to the imaging quality information of the first image.
In a third aspect, an embodiment of the present application provides an electronic chip, including: a processor for executing computer program instructions stored on a memory, wherein the computer program instructions, when executed by the processor, trigger the electronic chip to perform the method according to any of the first aspects.
In a fourth aspect, embodiments of the present application provide an electronic device comprising a memory for storing computer program instructions, a processor for executing the computer program instructions, and communication means, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform a method as in any of the first aspects.
In a fifth aspect, embodiments of the present application provide a computer-readable storage medium having a computer program stored therein, which when run on a computer, causes the computer to perform the method as in any of the first aspects.
In a sixth aspect, embodiments of the present application provide a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method as in any of the first aspects.
The embodiment of the application can automatically determine the film forming rate, and solves the problem of time and labor consumption when the film forming rate is manually determined.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. The drawings in the following description show only some embodiments of the present application; a person skilled in the art may obtain other drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 2 is a flowchart of a method for determining a film forming rate according to an embodiment of the present application;
Fig. 3 is a flowchart of another method for determining a film forming rate according to an embodiment of the present application;
Fig. 4 is a flowchart of yet another method for determining a film forming rate according to an embodiment of the present application;
Fig. 5 is a block schematic diagram of a film forming rate determination system according to an embodiment of the present application;
Fig. 6 is a flowchart of another method for determining a film forming rate according to an embodiment of the present application.
Detailed Description
For a better understanding of the technical solutions of the present application, embodiments of the present application are described in detail below with reference to the accompanying drawings.
It should be understood that the described embodiments are merely some, but not all, of the embodiments of the present application. All other embodiments, based on the embodiments herein, which would be apparent to one of ordinary skill in the art without making any inventive effort, are intended to be within the scope of the present application.
The terminology used in the embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the term "at least one" as used herein means one or more, and "a plurality" means two or more. The term "and/or" as used herein is merely one association relationship describing the associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone. Wherein A, B may be singular or plural. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship. "at least one of the following" and the like means any combination of these items, including any combination of single or plural items. For example, at least one of a, b and c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or plural.
It should be understood that although the terms first, second, etc. may be used in embodiments of the present application to describe the set threshold values, these set threshold values should not be limited to these terms. These terms are only used to distinguish the set thresholds from each other. For example, a first set threshold may also be referred to as a second set threshold, and similarly, a second set threshold may also be referred to as a first set threshold, without departing from the scope of embodiments of the present application.
The terminology used in the description section of the present application is for the purpose of describing particular embodiments of the present application only and is not intended to be limiting of the present application.
The method for determining the film forming rate according to any of the embodiments of the present application may be applied to the electronic device 100 shown in fig. 1. Fig. 1 shows a schematic configuration of an electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A is of various types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a capacitive pressure sensor comprising at least two parallel plates with conductive material. The capacitance between the electrodes changes when a force is applied to the pressure sensor 180A. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the touch operation intensity according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location, but at different touch operation strengths, may correspond to different operation instructions. For example: and executing an instruction for checking the short message when the touch operation with the touch operation intensity smaller than the first pressure threshold acts on the short message application icon. And executing an instruction for newly creating the short message when the touch operation with the touch operation intensity being greater than or equal to the first pressure threshold acts on the short message application icon.
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation. The magnetic sensor 180D includes a hall sensor. The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus. The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The ambient light sensor 180L is used to sense ambient light level. The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc. The temperature sensor 180J is for detecting temperature.
The touch sensor 180K, also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195, to achieve contact with and separation from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. The same SIM card interface 195 may be used to insert multiple cards simultaneously. The types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to realize functions such as calling and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it. The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
Before introducing embodiments of the present application, some key terms will first be described.
The shutter lag (Shutterlag) may refer to the period of time from when the camera shutter is pressed to when the camera starts exposure. The duration of this period is not fixed.
As photographing devices (devices used for photographing, such as smartphones, video cameras, and digital cameras) and their imaging systems become ever more convenient, users photograph more and more frequently in daily life. However, when a user photographs with a photographing device, film forming may still fail, i.e., the shot may not yield a usable image.
Taking a smartphone as an example, users often use the smartphone to shoot in various extreme scenes (such as a medium-to-low-brightness night scene, extremely dim indoor light, a sunset, a brightly lit stadium, and the like). When shooting in these extreme scenes, the photographing algorithm needs a longer processing time, which lengthens the shutter lag; the user's hand may shake during the long handheld capture, or the user may lower the phone too quickly, thinking the shot is already finished, causing film forming to fail.
Thus, it is necessary to determine the film forming rate. For example, the film forming rate of a photographing device may be determined, and based on the film forming rates of different photographing devices, the photographing imaging effect of the different photographing devices may be learned or verified.
The film forming rate could be determined manually, but this is time-consuming and labor-intensive. Moreover, manually evaluating whether film forming fails is highly subjective, making it inconvenient to determine the film forming rate objectively and accurately.
In the following, some possible application scenarios of the embodiments of the present application are described.
Application scenario 1: the film forming rates of different smartphones can be compared to determine which smartphone has a better film forming effect (also called photographing imaging effect).
Application scenario 2: the film forming rates of smartphones from different manufacturers can be compared to determine which manufacturer's smartphones have a better film forming effect.
Application scenario 3: the film forming rates of different operating system versions (or of different manufacturers' operating systems) can be compared to determine which version (or which manufacturer's operating system) has a better film forming effect.
Application scenario 4: the film forming rate of the current smartphone can be learned. Combined with an expected film forming rate requirement, it may also be determined whether the measured film forming rate meets the expectation.
Application scenario 5: the film forming rates of different photographing modes can be compared to determine in which photographing mode the smartphone under test has a better film forming effect.
Application scenario 6: the film forming rates under different values of a certain configuration item may be compared to determine whether the comparison result meets expectations.
The configuration item may be configuration information of any one of: the illumination environment in which the photographing device is located, the shake mode of the photographing device, the zoom magnification of the photographing device, the device type of the photographing device, the operating system version of the photographing device, the photographing mode of the photographing device, and the like. For example, the configuration information of the illumination environment may be a bright light environment, a dark light environment, etc. If desired, the resulting test data may also be reused in the other application scenarios for the film forming rate (e.g., application scenarios 1 to 5 above).
The embodiment of the present application is not limited to the above-listed application scenarios, and may be applied to other feasible application scenarios not shown in the present embodiment.
Next, the technology of the embodiment of the present application will be described.
As shown in fig. 2, an embodiment of the present application provides a method for determining a film forming rate, which may include the following steps 201 to 204. The execution subject of the method may be an electronic device such as a personal computer, a smartphone, or a server.
Step 201, configuring a first shooting scene according to first scene information, wherein the first scene information comprises configuration items affecting imaging quality.
In one embodiment, the configuration items may include: configuration information of at least one of an illumination environment in which the photographing apparatus is located, a shake mode of the photographing apparatus, a zoom magnification of the photographing apparatus, an apparatus type of the photographing apparatus, an operating system version of the photographing apparatus, and a photographing mode of the photographing apparatus.
For example, in one embodiment, the first scene information includes the following configuration items: bright light environment, high frequency dithering, main camera, equipment manufacturer X, operating system version X, night scene mode.
For another example, in another embodiment, the first scene information includes the following configuration items: dim light environment, low frequency jitter, main shot, night scene mode.
In the following, some configuration categories affecting imaging quality are explained, respectively.
1) For the configuration category of illumination environment:
the configuration item may include configuration information of a configuration category of the lighting environment in which the photographing apparatus is located, taking into consideration influences of different lighting environments on imaging quality. For example, in the context information, the configuration information of the configuration category may relate to light intensity, light color, shooting angle, and the like.
Taking the effect of light intensity on imaging quality as an example, in one embodiment, the configuration information for the configuration class may be bright light (or intense light), dark light (or weak light). Therefore, at least two photographing scenes can be preset, the illumination environment of one photographing scene is a bright light environment, and the illumination environment of the other photographing scene is a weak light environment.
The light source sequence can be preset, and different illumination environments can be obtained through switching of the light sources in the light source sequence.
2) For the configuration category of shake mode (dither pattern):
the above configuration items may include configuration information of a configuration category of the shake mode of the photographing apparatus in consideration of the influence of different shake modes on the imaging quality. For example, in the context information, the configuration information of the configuration category may relate to jitter frequency, jitter amplitude, and the like.
Taking the effect of the dithering frequency on imaging quality as an example, in one embodiment the configuration information of this configuration category may be high frequency or low frequency. Thus, at least two photographing scenes can be preset, where the dithering mode of one photographing scene is high-frequency dithering and the dithering mode of the other is low-frequency dithering.
The jitter sequence may be preset, and the different jitter modes may be obtained by switching the jitter modes in the jitter sequence.
3) For the configuration category of zoom magnification:
the above configuration items may include configuration information of the configuration category of the zoom magnification of the photographing device, considering the influence of different zoom magnifications (or focal lengths) on imaging quality. For example, in the scene information, the configuration information of this configuration category may be main camera (i.e., the primary camera, with a zoom magnification of 1.0×), telephoto, or the like.
Based on the above, at least two photographing scenes may be preset, where the zoom magnification of one photographing scene is main camera and the zoom magnification of the other photographing scene is telephoto.
4) For the configuration category of device type:
the above configuration items may include configuration information of a configuration category of the device type of the photographing device, taking into consideration the influence of different device types on the imaging quality. Illustratively, in the context information, the configuration information of the configuration category may relate to vendor, model, etc.
Taking the influence of the vendor of the smart phone on the imaging quality as an example, in one embodiment, the configuration information of the configuration class may be vendor 1 and vendor 2. Thus, at least two photographing scenes can be preset, wherein the equipment type of one photographing scene is the smart phone of manufacturer 1, and the equipment type of the other photographing scene is the smart phone of manufacturer 2.
For switching of different shooting devices, the shooting devices can be manually switched, and the corresponding control system can also be used for automatically switching the shooting devices.
In one embodiment, when the shooting device is manually switched, the scene information may also include configuration information of a configuration category of a device type. In this case, the electronic device may read the device information of the current photographing device and verify whether the current photographing device matches with the configuration information of the configuration class, which is the device type, in the current scene information. If so, triggering the shooting equipment to execute shooting operation. If the two types of the shooting equipment are not matched, prompt information can be output to prompt that the used shooting equipment does not accord with the current scene information, so that a user can manually replace the correct shooting equipment.
In other embodiments, when the photographing device is manually switched, the scene information may not include configuration information of the device type, that is, the electronic device defaults to the current photographing device.
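The match-or-prompt check described above (read the device information, verify it against the device-type configuration in the current scene information, then either trigger photographing or prompt the user) can be sketched as follows. This is only an illustrative sketch; the field names and device strings are assumptions, not part of the patent.

```python
def verify_device(current_device, scene_info):
    """Check whether the connected photographing device matches the
    device-type configuration item in the current scene information.

    Returns (ok, message): ok is True when photographing may be triggered,
    False when the user should be prompted to switch devices."""
    expected = scene_info.get("device_type")  # hypothetical key name
    if expected is None or current_device == expected:
        # No device-type item configured, or the device matches: proceed
        return True, "trigger photographing operation"
    # Mismatch: output prompt information for the user
    return False, (f"device mismatch: expected {expected}, got {current_device}; "
                   "please switch to the correct photographing device")

ok, message = verify_device("vendor 1 phone", {"device_type": "vendor 2 phone"})
```

When the scene information contains no device-type item (the default-to-current-device case in the next paragraph), the check simply passes.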
5) For the configuration category of operating system version:
the configuration item may include configuration information of the configuration category of the operating system version of the photographing device, considering the influence of different operating system versions on imaging quality. Illustratively, in the scene information, the configuration information of this configuration category may be version 1, version 2, and the like. Version 1, version 2, and so on may be different versions of an operating system installable on the same photographing device. For example, version 1 is the version number of an earlier operating system, and version 2 is the version number of a later one.
For switching of different operating system versions, the switching can be performed manually or automatically by a corresponding control system.
For version switching, the version of the operating system can be updated for the same shooting device to realize version switching, or two shooting devices of the same type, in which operating systems of different versions are respectively installed, can be directly switched to realize version switching.
In one embodiment, when the version is manually switched, the scene information may also include configuration information of the configuration type of the operating system version. In this case, the electronic device may read the operating system related information of the current photographing device, and verify whether the operating system related information matches with the configuration information of the configuration category of the operating system version in the current scene information. If so, triggering the shooting equipment to execute shooting operation. If the operation system version does not accord with the current scene information, a prompt message can be output to prompt the user to manually replace the correct operation system.
In other embodiments, when the version is manually switched, the scene information may not include configuration information of the configuration type of the operating system version, that is, the electronic device defaults to the current operating system installation without errors.
6) For the configuration category of photographing mode:
the configuration item may include configuration information of a configuration category of a photographing mode of the photographing apparatus in consideration of influence of different photographing modes on imaging quality. For example, in the scene information, the configuration information of the configuration category may include a portrait photographing mode, a night scene photographing mode, and the like.
Based on the above, at least two photographing scenes can be preset, wherein the photographing mode of one photographing scene is a portrait photographing mode, and the photographing mode of the other photographing scene is a night scene photographing mode.
The scene information used for configuring the photographing scene may be any combination of the configuration information of the six configuration categories, and the specific combination mode may be selected as required.
For example, in one embodiment, the scene information may include configuration information of configuration categories such as an illumination environment in which the photographing apparatus is located, a shake mode of the photographing apparatus, a zoom magnification of the photographing apparatus, an apparatus type of the photographing apparatus, an operating system version of the photographing apparatus, and a photographing mode of the photographing apparatus. This embodiment can be applied at least to any one of the application scenarios 1 to 6 described above.
For example, in another embodiment, the scene information may include configuration information of configuration categories of an illumination environment in which the photographing apparatus is located, a shake mode of the photographing apparatus, a zoom magnification of the photographing apparatus, and a photographing mode of the photographing apparatus. This embodiment can be applied at least to any one of the application scenarios 1 to 6 described above.
The configuration category related to the scene information can be set as required, and the embodiment is not limited thereto, and other possible examples are not described herein.
Illustratively, in one embodiment, taking an example in which the scene information includes information of an illumination environment in which the photographing apparatus is located and information of a shake mode (specifically, a shake frequency) of the photographing apparatus, one or more of the following scene information 1 to scene information 4 may be preset:
scene information 1: a bright light environment, high frequency dithering pattern;
scene information 2: a bright light environment, a low frequency dithering pattern;
scene information 3: a dim light environment, high frequency dithering pattern;
scene information 4: dim light environment, low frequency dither pattern.
The configuration information of other configuration categories (such as jitter amplitude, photographing mode, etc.) in the scene information is consistent, for example, the configuration information is all default values.
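Scene information 1 to 4 above are simply the Cartesian product of the two configuration categories, and a test harness might enumerate them in that same order. The following is a minimal illustrative sketch; the category names and values are assumptions chosen to mirror the example, not identifiers from the patent.

```python
from itertools import product

# Illustrative configuration categories and their values
illumination = ["bright", "dark"]
shake_frequency = ["high", "low"]

# Each combination is one piece of scene information used to configure a scene;
# the enumeration order matches scene information 1 to 4 in the text.
scene_infos = [
    {"illumination": ill, "shake_frequency": freq}
    for ill, freq in product(illumination, shake_frequency)
]

for i, info in enumerate(scene_infos, start=1):
    print(f"scene information {i}: {info}")
```

Adding a third configuration category (e.g., the device manufacturer) to the product would yield eight combinations, mirroring scene information 5 to 12 below.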
In this embodiment, each piece of preset scene information may be used as the first scene information, so that an image corresponding to each piece of scene information may be obtained.
In one implementation, the film forming rate of the photographing device may be calculated based on the images corresponding to all of the scene information.
In another implementation, the film forming rate for the bright light environment may be calculated based on the images corresponding to scene information 1-2, and the film forming rate for the dark light environment may be calculated based on the images corresponding to scene information 3-4.
In a possible further implementation, the film forming rate corresponding to the high-frequency dither pattern may be calculated based on the images corresponding to scene information 1 and 3, and the film forming rate corresponding to the low-frequency dither pattern may be calculated based on the images corresponding to scene information 2 and 4.
For example, in another embodiment, taking as an example that the scene information includes information of an illumination environment in which the photographing apparatus is located, information of a shake mode of the photographing apparatus (specifically, a shake frequency), and information of an apparatus type of the photographing apparatus (specifically, a manufacturer of the apparatus), one or more of the following scene information 5 to scene information 12 may be preset.
Scene information 5: a bright light environment, a high frequency dithering mode, vendor 1;
scene information 6: a bright light environment, a low frequency dithering pattern, vendor 1;
scene information 7: dim light environment, high frequency dithering mode, vendor 1;
scene information 8: dim light environment, low frequency dithering mode, vendor 1;
scene information 9: a bright light environment, a high frequency dithering mode, vendor 2;
scene information 10: a bright light environment, a low frequency dithering pattern, vendor 2;
scene information 11: dim light environment, high frequency dithering mode, vendor 2;
scene information 12: dim light environment, low frequency dither pattern, vendor 2.
The configuration information of other configuration categories (such as jitter amplitude, photographing mode, etc.) in the scene information is consistent, for example, the configuration information is all default values.
When the first scene information is any one of the scene information 5 to the scene information 8, one or more shooting devices of the manufacturer 1 are used for executing shooting operation, and when the first scene information is any one of the scene information 9 to the scene information 12, one or more shooting devices of the manufacturer 2 are used for executing shooting operation.
In this embodiment, each piece of preset scene information may be used as the first scene information, so that an image corresponding to each piece of scene information may be obtained.
In one possible implementation, the film forming rate of the photographing devices of vendor 1 may be calculated based on the images corresponding to scene information 5 to scene information 8, and the film forming rate of the photographing devices of vendor 2 may be calculated based on the images corresponding to scene information 9 to scene information 12.
Step 202, acquiring a first image shot by a shooting device in a first shooting scene.
The first shooting scene is configured according to the first scene information, and a first image shot in the first shooting scene corresponds to the first scene information. The images may be stored in a classified manner according to scene information corresponding to the images.
In one embodiment, taking a photographing device such as a smartphone as an example, a personal computer connected to the smartphone may configure a corresponding photographing scene according to scene information and, after the photographing scene is configured, trigger or control the smartphone to perform a photographing operation and obtain the image photographed by the smartphone. When the smartphone performs a photographing operation in the first photographing scene, the personal computer can obtain a first image corresponding to the first scene information.
As for the execution of photographing operations, a preset number of photographing operations may be executed in the current photographing scene, correspondingly obtaining the preset number of images. The preset number is a positive integer and may, for example, be set to 1, 5, 10, 50, etc. as required, which is not limited in this embodiment.
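The preset-number capture loop described above might be sketched as follows. This is purely illustrative: `trigger_capture` stands in for whatever mechanism the harness actually uses to make the device take a photo (e.g., a control API), and is a hypothetical hook, not an API from the patent.

```python
def capture_images(trigger_capture, scene_info, preset_count=10):
    """Trigger `preset_count` photographing operations in the current scene
    and tag each resulting image with the scene information it belongs to,
    so the images can later be classified and stored by scene."""
    images = []
    for _ in range(preset_count):
        image = trigger_capture()  # hypothetical device-control hook
        images.append({"scene_info": scene_info, "image": image})
    return images

# Usage with a stub trigger; a real harness would talk to the device
captured = capture_images(lambda: b"raw-bytes", {"illumination": "dark"}, preset_count=5)
```

Tagging each image with its scene information at capture time is what enables the classified storage and per-configuration analysis described later.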
In step 203, imaging quality information of the first image is acquired, wherein the imaging quality information is used to reflect the imaging quality of the image.
Reflecting the imaging quality may also be understood as measuring the imaging quality or the image picture quality.
In one embodiment, the imaging quality information may include at least one of sharpness and degree of blur (such as motion blur, MotionBlur).
The definition can be used for reflecting the definition degree of each detail texture and the image edge of the image.
Motion blur (MotionBlur) can be used to reflect the degree of blur of a moving scene in an image.
In one embodiment, the sharpness of the image may be calculated using a gradient operator based contour extraction method.
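One simple realization of a gradient-based sharpness measure is the mean gradient magnitude of the grayscale image: strong edges produce large gradients, so sharper images score higher. This is an illustrative sketch of that idea, not the specific contour extraction method of the patent.

```python
import numpy as np

def sharpness(gray):
    """Mean gradient magnitude of a grayscale image.
    Higher values mean stronger edges, i.e., a sharper image."""
    gy, gx = np.gradient(gray.astype(float))  # per-axis finite differences
    return float(np.mean(np.hypot(gx, gy)))

# A checkerboard (hard edges everywhere) should score higher than a flat image
checker = np.indices((32, 32)).sum(axis=0) % 2
flat = np.full((32, 32), 0.5)
```

In practice the score for each captured image would be compared against a preset sharpness threshold, as described in step 204 below.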
Step 204, determining a first film forming rate according to the imaging quality information of the first image.
The film forming rate may be, for example, the probability that a captured image is usable.
In one embodiment, the first film forming rate may be the film forming rate of the photographing device.
In another embodiment, the first film forming rate may be the film forming rate of the manufacturer of the photographing device.
In yet another embodiment, the first film forming rate may be the film forming rate corresponding to the operating system version of the photographing device.
In still another embodiment, the first film forming rate may be the film forming rate corresponding to any configuration item in the first scene information.
For example, if the first scene information includes information of the illumination environment in which the photographing device is located, and the information is a dark light environment, the first film forming rate may be the film forming rate corresponding to the dark light environment.
For another example, if the first scene information includes information of the shake mode of the photographing device (for example, a certain dither pattern), the first film forming rate may be the film forming rate corresponding to that shake mode.
For another example, if the first scene information includes information of the zoom magnification of the photographing device, and the information is main camera, the first film forming rate may be the film forming rate corresponding to the main camera.
For another example, if the first scene information includes information of the photographing mode of the photographing device, and the information is a portrait mode, the first film forming rate may be the film forming rate corresponding to the portrait mode.
In one embodiment, taking sharpness as the imaging quality information, the sharpness of the first image may be compared with a preset sharpness threshold; if the sharpness of the first image reaches the threshold, the first image may be considered usable, otherwise unusable. The film forming rate can then be calculated based on the total number of images and the number of usable images. For example, the film forming rate may be the number of usable images as a percentage of the total number of images.
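The thresholding rule just described can be written down directly. A minimal sketch follows; the sharpness values and the 0.5 threshold are illustrative only.

```python
def film_forming_rate(sharpness_values, threshold):
    """Fraction of images whose sharpness reaches the threshold,
    i.e., the number of usable images over the total number of images."""
    usable = sum(1 for s in sharpness_values if s >= threshold)
    return usable / len(sharpness_values)

# Three of the four illustrative images reach the threshold
rate = film_forming_rate([0.9, 0.2, 0.8, 0.7], threshold=0.5)
```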
The method for determining the film forming rate provided in this embodiment can determine the film forming rate of the photographing device objectively and automatically throughout the whole process, thereby solving the time-consuming and labor-intensive problem of determining the film forming rate manually.
Considering that a photographing device has a corresponding manufacturer and device model and is provided with an operating system of a specific version, the first film forming rate determined in this embodiment may be not only the film forming rate of the photographing device, but also the film forming rate of the device model of the photographing device, the film forming rate of the manufacturer corresponding to the photographing device, or the film forming rate of the operating system version corresponding to the photographing device.
In order to learn the film forming effect of different photographing devices, of photographing devices from different manufacturers, of photographing devices of different models from the same manufacturer, or of different operating system versions, in one embodiment, referring to fig. 3, the method for determining the film forming rate shown in fig. 2 may further include the following steps 301 to 306:
step 301, configuring a second shooting scene according to second scene information, wherein the second scene information comprises configuration items affecting imaging quality.
The configuration items included in the first scene information and the configuration items included in the second scene information may be identical, may be partially identical, or may be completely different.
The category of the configuration item included in the first scene information and the category of the configuration item included in the second scene information may be identical, may be partially identical, or may be completely different.
Step 302, acquiring a second image shot by the shooting device in a second shooting scene.
The second shooting scene is configured according to second scene information, and a second image shot in the second shooting scene corresponds to the second scene information.
In step 303, imaging quality information of the second image is acquired.
Step 304, determining a second film forming rate according to the imaging quality information of the second image.
In step 305, the first film forming rate and the second film forming rate are compared to obtain a comparison result.
And step 306, outputting a comparison result.
The steps of acquiring the first image and the second image may be performed successively, with the imaging quality information of the images then calculated together, so as to determine and compare the film forming rates. Alternatively, the first image may be acquired first to determine the first film forming rate, and the second image then acquired to determine the second film forming rate, after which the film forming rates are compared. The execution order of these steps is not limited in this embodiment.
In one embodiment, the photographing apparatus that performs the photographing operation in the second photographing scene is the same apparatus as the photographing apparatus that performs the photographing operation in the first photographing scene.
For example, when the first film forming rate is the film forming rate of one photographing mode and the second film forming rate is the film forming rate of another photographing mode, the two photographing devices may be the same device or different devices.
Here, "the same device" may be literally the same physical device, or different devices of the same model from the same production batch.
In another embodiment, the photographing apparatus that performs the photographing operation in the second photographing scene is a different apparatus from the photographing apparatus that performs the photographing operation in the first photographing scene.
For example, when the first film forming rate is the film forming rate of one photographing device and the second film forming rate is the film forming rate of another photographing device, the two photographing devices are different devices.
The first scene information and the second scene information may be part or all of a plurality of scene information set in advance.
This embodiment can automatically compare the first film forming rate with the second film forming rate and output a comparison result for the user to view. For example, taking the first film forming rate as that of smartphone 1 and the second film forming rate as that of smartphone 2, where the first is greater than the second, the comparison result may be "the film forming effect of smartphone 1 is better than that of smartphone 2", "the film forming rate of smartphone 1 is greater than that of smartphone 2", and so on.
In other embodiments, the first and second film forming rates may also be directly output for the user to view.
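Steps 305 and 306 (comparing the two film forming rates and outputting the comparison result) can be sketched as follows; the device names and rate values are illustrative, and the wording of the result message is one possible choice.

```python
def compare_rates(name_a, rate_a, name_b, rate_b):
    """Compare two film forming rates and return a human-readable result."""
    if rate_a > rate_b:
        return f"{name_a} has a better film forming effect than {name_b}"
    if rate_a < rate_b:
        return f"{name_b} has a better film forming effect than {name_a}"
    return f"{name_a} and {name_b} have the same film forming rate"

result = compare_rates("smartphone 1", 0.92, "smartphone 2", 0.85)
```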
For example, to learn the film forming effect of photographing devices from different manufacturers (manufacturer 1 and manufacturer 2), the film forming rate of manufacturer 1's photographing device is obtained by photographing with the device of manufacturer 1, and the film forming rate of manufacturer 2's photographing device is obtained by photographing with the device of manufacturer 2. Based on the obtained film forming rates, the film forming effects of the two manufacturers' photographing devices can be compared.
Based on this, the first scene information may be, for example, any one of scene information 5 to 8 above, and the second scene information any one of scene information 9 to 12 above.
Since a photographing device has a certain version of operating system installed, in order to learn the film forming effect of different operating system versions, in one implementation the operating system version may be switched by switching the photographing device. In this case, the devices to be switched may be of the same model and from the same production batch, so that the only substantive difference between them is the installed operating system version; switching the photographing device is then equivalent to switching the operating system version.
In another implementation, the operating system version may also be switched by updating the operating system based on the same shooting device.
As described in steps 201 and 202, this embodiment configures a corresponding first shooting scene according to the first scene information, and the image shot in the first shooting scene corresponds to the first scene information. According to the scene information corresponding to each shot image, the images can be classified and stored, so that operations such as film forming rate analysis, image effect verification, and viewing of image-related data can be performed on demand based on the classified and stored images.
As such, in one embodiment, the method for determining a film forming rate shown in fig. 2 may further include: classifying and storing the first image according to the first scene information corresponding to the first image.
The determining the first film forming rate according to the imaging quality information of the first image then includes: determining the first film forming rate according to the imaging quality information of the first image and the imaging quality information of the other images in the image class that includes the first image.
The embodiment of the application can record the corresponding scene information of the image when the image is stored. Based on this, the stored images may be classified as needed according to various configuration items in the scene information.
Taking configuration information of a device type as an example, assuming that the configuration information of the device type includes the device 1 and the device 2, according to the configuration information of the device type in the scene information corresponding to the image, all images corresponding to the device 1 and all images corresponding to the device 2 can be separated, then the film forming rate of the device 1 is determined according to all images corresponding to the device 1, the film forming rate of the device 2 is determined according to all images corresponding to the device 2, and then the film forming rates of the two devices are compared to determine which device has better film forming effect.
Taking configuration items including configuration information of the illumination environment as an example, assume that the illumination environment information includes bright light and dark light. According to the illumination environment information in the scene information corresponding to each image, all images corresponding to bright light and all images corresponding to dark light can be separated. The film forming rate under bright light is then determined from all images corresponding to bright light, and the film forming rate under dark light from all images corresponding to dark light.
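The grouping-and-rate computation described above can be sketched as follows. The record layout, field names, and sharpness threshold are illustrative assumptions, not taken from this application:

```python
# Hedged sketch: classify stored images by one scene-information item and
# compute a per-class film forming rate. All names/values are assumptions.
SHARPNESS_THRESHOLD = 0.6  # assumed pass/fail cut-off for a "formed" image

def classify_by_item(images, item):
    """Group image records by one configuration item of their scene info."""
    classes = {}
    for img in images:
        classes.setdefault(img["scene"][item], []).append(img)
    return classes

def film_forming_rate(image_class):
    """Fraction of a class's images whose sharpness meets the threshold."""
    passed = sum(1 for img in image_class
                 if img["sharpness"] >= SHARPNESS_THRESHOLD)
    return passed / len(image_class)

images = [
    {"scene": {"light": "bright"}, "sharpness": 0.9},
    {"scene": {"light": "bright"}, "sharpness": 0.8},
    {"scene": {"light": "bright"}, "sharpness": 0.4},
    {"scene": {"light": "dark"},   "sharpness": 0.7},
    {"scene": {"light": "dark"},   "sharpness": 0.3},
]
rates = {light: film_forming_rate(cls)
         for light, cls in classify_by_item(images, "light").items()}
```

The same `classify_by_item` call works for any other configuration item (device type, photographing mode, and so on) simply by changing the `item` key.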
Illustratively, the film forming rate under bright light and the film forming rate under dark light may be compared to determine whether the comparison meets an expected requirement (assuming the expected requirement is that the film forming rate under bright light is higher than that under dark light, possibly by a required minimum difference). If the expected requirement is met, the verification passes. Likewise, the film forming rates of different photographing devices, of different types of photographing devices, and so on may be compared according to the corresponding images; based on such comparisons, it can be accurately determined which device has the better film forming effect.
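A minimal sketch of the bright-versus-dark expected-requirement check above; the function name and minimum-difference value are illustrative assumptions:

```python
# Hedged sketch of the expected-requirement check: the bright-light film
# forming rate should exceed the dark-light rate by at least a minimum
# difference. The 0.1 default is an assumption for illustration only.
def meets_expected_requirement(rate_bright, rate_dark, min_diff=0.1):
    return (rate_bright - rate_dark) >= min_diff

result_pass = meets_expected_requirement(0.9, 0.6)    # difference 0.3
result_fail = meets_expected_requirement(0.7, 0.65)   # difference 0.05
```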
For example, if for the same photographing device the difference between the film forming rate under bright light and the film forming rate under dark light is smaller than expected, the photographing device can be considered to better mitigate the problem of poor film forming effect under dark light.
Taking the configuration information of the photographing mode as an example, assume that it includes a portrait photographing mode and a night scene photographing mode. According to the photographing mode information in the scene information corresponding to each image, all images corresponding to the portrait photographing mode and all images corresponding to the night scene photographing mode can be separated. The film forming rate of the portrait photographing mode is then determined from all images corresponding to that mode, and the film forming rate of the night scene photographing mode from all images corresponding to that mode.
For example, the film forming rates in the two photographing modes may be compared to determine in which photographing mode the current photographing device has the better film forming effect.
It can be seen that the method for determining the film forming rate provided in this embodiment can determine not only the film forming rate of different photographing devices, but also the film forming rate for other configuration categories, for example, the film forming rate in different illumination environments, in different photographing modes, in different shake modes (specifically, at different shake frequencies, different shake amplitudes, etc.), and so on.
In order to obtain the film forming rate of the current photographing device, the images may be classified and stored by photographing device according to the scene information corresponding to each captured image.
In another implementation manner, the scene information corresponding to each image need not be known or recorded; as long as all images obtained while the same photographing device is photographing are classified into the same class by device, that class of images corresponds to that photographing device.
Based on this, in one embodiment, the method may further comprise: classifying the first image into a first image set corresponding to the photographing device in step 202, where the first image set further includes images photographed by the photographing device under other photographing scenes;
the determining the first film forming rate according to the imaging quality information of the first image comprises the following steps: and determining a first film forming rate according to the imaging quality information of the images in the first image set, wherein the first film forming rate is the film forming rate of the shooting equipment.
In this embodiment, in the image set of the same photographing apparatus, there may be respective images photographed by the photographing apparatus in more than one photographing scene.
If there is only one photographing scene, i.e. the first photographing scene, the steps 201 to 202 are executed to obtain images photographed by the current photographing device in the first photographing scene, where the images are all classified into the same image set corresponding to the photographing device. Further, in step 204, the film rate of the photographing apparatus may be calculated according to the definition of some or all of the images in the image set corresponding to the photographing apparatus.
For example, if there are two photographing scenes, i.e., the first photographing scene and the second photographing scene, the steps 201 to 202 may be executed for the scene information corresponding to the two photographing scenes, respectively, so as to obtain the images photographed by the same photographing device under the respective photographing scenes, where the images are all categorized into the same image set corresponding to the photographing device. Further, in step 204, the film rate of the photographing apparatus may be calculated according to the definition of some or all of the images in the image set corresponding to the photographing apparatus.
In this embodiment, a plurality of photographing scenes (including the first photographing scene and the second photographing scene) may be set to basically achieve a full-coverage effect of the scenes, and the film rate is determined under the condition of covering the full scene, so that accuracy and persuasion for evaluating the film rate may be improved.
Possibly, in combination with configuration items (such as configuration information of illumination, shake, etc.) affecting the imaging quality, one or more photographing scenes may be preset. The shooting operation can be performed to shoot images in each preset shooting scene.
In one implementation, only one photographing scene may be set, and one or more images may be photographed under the same photographing scene.
In another implementation, multiple photo scenes may be provided, which may substantially cover most or all of the photo scenes in reality. One or more images may be taken in the same scene.
In one embodiment of the present application, referring to fig. 4, the method for determining a film rate shown in fig. 2 may further include the following steps 401 to 404:
step 401, configuring a third shooting scene according to third scene information, wherein the third scene information includes configuration items affecting imaging quality.
The configuration items included in the first scene information and the configuration items included in the third scene information may be partially the same or may be completely different.
The category of the configuration item included in the first scene information and the category of the configuration item included in the second scene information may be identical, may be partially identical, or may be completely different.
In step 402, a third image captured by the capturing device in a third capturing scene is obtained.
The third shooting scene is configured according to third scene information, and a third image shot in the third shooting scene corresponds to the third scene information.
In step 403, imaging quality information of the third image is acquired.
The determining the first film forming rate according to the imaging quality information of the first image comprises the following steps: step 404, determining a first film forming rate according to the imaging quality information of the first image and the imaging quality information of the third image.
In one embodiment, the photographing apparatus that performs the photographing operation in the third photographing scene is the same apparatus as the photographing apparatus that performs the photographing operation in the first photographing scene. For example, when the first film rate is the film rate of the photographing apparatus, the two photographing apparatuses are the same apparatus.
Here, the same device may be the very same physical device, or may be different devices of the same production batch.
In another embodiment, the photographing apparatus that performs the photographing operation in the third photographing scene is a different apparatus from the photographing apparatus that performs the photographing operation in the first photographing scene. For example, when the first film rate is the film rate of a certain shooting mode, the two shooting devices may be the same device or different devices.
Illustratively, the first scene information and the third scene information may be any two of the above-described scene information 1 to 4, and the first film rate may be a film rate of the photographing apparatus.
Illustratively, the first scene information and the third scene information may be any two of the above-described scene information 5 to 8, and the first film forming rate may be the film forming rate of vendor 1.
Illustratively, the first scene information and the third scene information may be any two of the above-described scene information 9 to 12, and the first film forming rate may be the film forming rate of vendor 2.
Illustratively, the first scene information and the third scene information may be any two of the above-described scene information 5, 7, 9, 11, and the first film forming rate may be the film forming rate corresponding to the high-frequency shake mode.
In the case of presetting a plurality of scene information, in one implementation manner, corresponding shooting scenes can be configured for each scene information in turn. After the photographing scene is configured, photographing operation can be performed in the photographing scene. After the photographing operation under the current photographing scene is completed, the next photographing scene can be configured.
For example, 3 pieces of scene information, namely scene information 1, scene information 2 and scene information 3, are preset. The image is shot firstly for scene information 1, then for scene information 2, and finally for scene information 3.
For example, after the photographing operation is completed in the photographing scene corresponding to scene information 2, it may be determined that scene information 2 is not the last scene information. The next scene information, namely scene information 3, is then taken as the first scene information, and step 201 is executed again to configure the photographing scene corresponding to scene information 3 and perform the photographing operation in that scene.
In the case of presetting a plurality of scene information, in another implementation manner, one of the preset scene information may be randomly taken to capture an image until each of the preset scene information is used to capture the image. This embodiment is not limited thereto.
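The two traversal orders described above (sequential, and random until every preset scene is used) can be sketched as follows. All function and parameter names are assumptions; `configure_and_shoot` stands in for configuring a scene and performing the photographing operation in it:

```python
import random

def capture_sequential(scene_infos, configure_and_shoot):
    """Configure each preset scene in order and shoot in it, moving to the
    next scene only after the current one is finished."""
    for info in scene_infos:
        configure_and_shoot(info)

def capture_random(scene_infos, configure_and_shoot, seed=None):
    """Randomly pick each preset scene exactly once until all are used."""
    order = list(scene_infos)
    random.Random(seed).shuffle(order)
    for info in order:
        configure_and_shoot(info)

shot_log = []
capture_sequential(["scene 1", "scene 2", "scene 3"], shot_log.append)
```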
The embodiment determines the film forming rate under the condition of covering the whole scene, and can improve the accuracy and persuasion of evaluating the film forming rate.
In the case where photographing devices of different zoom magnifications and different device types need to be switched, switching the zoom magnification or the device type may change the framing composition. An operation for realizing a standardized composition may therefore be performed, so that the images photographed by photographing devices of different device types at different zoom magnifications keep a consistent composition, all with the standardized composition effect.
The standardized composition effect facilitates contrast analysis among images photographed in different photographing scenes, eliminates the adverse effect that inconsistent composition among such images would have on determining the film forming rate, and thus helps determine the film forming rate accurately.
Thus, in one embodiment, the configuration items include configuration information of at least one of the zoom magnification of the photographing device and the device type of the photographing device; and configuring the first photographing scene includes executing a first operation, wherein the first image comprises an image of the test chart, and the relative positional relationship between the image of the test chart and the first image is the same as a preset first relative positional relationship; the test chart is located in the image acquisition range of the photographing device.
In the standardized composition effect, the preset first relative positional relationship may be, for example, that the position of the test chart's image within the first image is fixed, that the area ratio of the test chart's image to the first image is fixed, and so on.
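The standardized-composition condition could be verified programmatically, for example as below. The frame/box representation, expected center, expected area ratio, and tolerance are all illustrative assumptions:

```python
# Hedged sketch: check that the chart's bounding box sits at a fixed position
# in the frame and occupies a fixed area ratio, within a tolerance.
def composition_ok(frame_w, frame_h, box, expect_center=(0.5, 0.5),
                   expect_ratio=0.25, tol=0.05):
    """`box` is (x, y, w, h) of the chart's image within the frame."""
    x, y, w, h = box
    cx = (x + w / 2) / frame_w              # normalized box center x
    cy = (y + h / 2) / frame_h              # normalized box center y
    ratio = (w * h) / (frame_w * frame_h)   # area ratio of box to frame
    return (abs(cx - expect_center[0]) <= tol and
            abs(cy - expect_center[1]) <= tol and
            abs(ratio - expect_ratio) <= tol)
```

For a 1000x1000 frame, a centered 500x500 box satisfies the default expectations, while the same box pushed into a corner does not.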
As shown in fig. 5, one embodiment of the present application provides another method for determining a film rate. Referring to fig. 5, the method may include the following steps 501 to 507.
Step 501, configuring a corresponding photographing scene according to preset scene information, wherein the scene information comprises illumination environment information, shake mode information, zoom ratio information and smart phone information.
Before configuring the first photographing scene, the user may set up the test chart in advance so that it is located in the image acquisition range of the smartphone, that is, the image shot by the smartphone includes the image of the test chart; mount the smartphone on the phone mounting position of the shake simulation control system; connect the smartphone to a personal computer; and then power on the personal computer and the smartphone.
The information of the mounted smartphone is consistent with the smartphone information in the current scene information. When the scene information is replaced to change the photographing scene, the mounted smartphone is replaced accordingly.
The various steps in the embodiment shown in fig. 5 may be performed by the personal computer.
Because the embodiment of the application relates to the change of the zoom magnification and the smart phone, in the process of configuring the current photographing scene, standardized composition can be carried out according to the zoom magnification information and the smart phone information in the current scene information.
Because the embodiment of the application relates to the change of the illumination environment and the dithering mode, in the process of configuring the current photographing scene, the light control system can be controlled to configure the corresponding illumination environment and the dithering simulation control system can be controlled to configure the corresponding dithering mode according to the illumination environment information and the dithering mode information in the current scene information.
After the personal computer is configured with the photographing scene, the smart phone can be triggered to execute photographing operation to start image acquisition.
In one embodiment, the personal computer may control the smart phone to end the shooting operation in the current shooting scene when determining that the smart phone shoots a preset number of images. In another embodiment, the smartphone may end the photographing operation when it determines that a preset number of images have been photographed.
Step 502, obtaining an image corresponding to scene information, which is shot by a smart phone corresponding to the scene information in a shooting scene.
In the embodiment shown in fig. 5, after each photographing scene is configured, the personal computer may acquire each image photographed by the corresponding smart phone under the photographing scene.
In other embodiments, the personal computer may also obtain, after each photographing scene is configured, each image photographed by each smart phone under the corresponding photographing scene (or under each photographing scene by the same smart phone).
Step 503, determining whether the image capturing under the shooting scene corresponding to each preset scene information is completed, if yes, executing step 505, otherwise executing step 504.
Step 504, another preset scene information is acquired, and step 501 is executed again according to the acquired scene information.
In one implementation, the preset scene information may be sequentially arranged, and a corresponding photographing scene may be configured for each scene information in order from first to last. Thus, step 503 may determine whether the current scene information is the last scene information; if so, image capturing in all photographing scenes is completed, and if not, it is not yet completed. The other scene information in step 504 may then be the scene information following the current scene information.
The images shot by the smart phone can be automatically led out to the personal computer.
And 505, classifying and storing the acquired images according to the intelligent mobile phone information in the scene information corresponding to the images.
Images corresponding to the same smart phone information can be classified into a category, and the naming of the category of images can be the smart phone information. Based on the images, the film rate corresponding to the smart phone information can be determined.
In one implementation, the smart phone may automatically record a corresponding exposure time when capturing an image, and the exposure time and the captured image may be classified and stored together.
The exposure times of different smartphones may be correspondingly different. The exposure time can be used as auxiliary information for evaluating the film forming rate, so that the understanding and judgment of the film forming rate by a user are enhanced.
For example, when comparing the film forming rate of smartphone 1 with that of smartphone 2, the user can check the exposure times of the two smartphones. If the exposure times are consistent, the comparison of the two film forming rates can be considered more accurate, since consistent exposure times eliminate the influence of exposure time as a factor in the film forming rate (unlike the case of inconsistent exposure times).
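A minimal sketch of the exposure-time consistency check suggested above; the function name, unit, and tolerance are illustrative assumptions:

```python
# Hedged sketch: two film forming rates are treated as directly comparable
# only when the recorded exposure times (assumed milliseconds) match within
# a tolerance, ruling out exposure time as a confounding factor.
def rates_directly_comparable(exposure_ms_a, exposure_ms_b, tol_ms=0.5):
    return abs(exposure_ms_a - exposure_ms_b) <= tol_ms

comparable = rates_directly_comparable(33.3, 33.3)
not_comparable = rates_directly_comparable(33.3, 66.6)
```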
Step 506, calculating the sharpness of the stored image using a contour extraction method based on gradient operators.
For example, the image sharpness may be calculated using a contour extraction method based on Sobel gradient operators.
In other embodiments of the present application, other algorithms may be used to calculate image sharpness. For example, an image resolution algorithm (such as an SFR algorithm) may be used, where SFR stands for spatial frequency response.
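As an illustrative stand-in for the gradient-operator sharpness calculation of step 506 (not this application's exact contour-extraction method), the mean Sobel gradient magnitude over the image interior can serve as a sharpness score:

```python
# Hedged sketch: sharpness as mean Sobel gradient magnitude. `gray` is a
# list of rows of floats; higher scores mean sharper edges. This stands in
# for a gradient-operator contour extraction, not the patented algorithm.
def sobel_sharpness(gray):
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # Sobel x kernel
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # Sobel y kernel
    h, w = len(gray), len(gray[0])
    total, count = 0.0, 0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[i][j] * gray[y - 1 + i][x - 1 + j]
                     for i in range(3) for j in range(3))
            gy = sum(ky[i][j] * gray[y - 1 + i][x - 1 + j]
                     for i in range(3) for j in range(3))
            total += (gx * gx + gy * gy) ** 0.5  # gradient magnitude
            count += 1
    return total / count
```

A flat image scores zero; an image with a vertical step edge scores higher, matching the intuition that sharper (less motion-blurred) images have stronger gradients.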
In one embodiment, the calculated sharpness may be saved to a table. In other embodiments, the calculated sharpness may be saved to other types of files. This embodiment is not limited thereto.
And step 507, calculating the film forming rate of the corresponding smart phone according to the definition of each image in each type of image.
Alternatively, a film rate statistical analysis system for calculating the film forming rate may be configured, and step 507 may be performed based on that system.
Each class of images corresponds to the same scene information, namely the same smartphone information, and the film forming rate of the corresponding smartphone can be calculated from the sharpness of each image in that class.
Alternatively, corresponding output may be generated based on the calculated film forming rate for viewing by the user.
In one embodiment, the calculated film forming rate may be saved to a table. In other embodiments, the calculated film forming rate may be saved to other types of files. This embodiment is not limited thereto.
After the film forming rate of each smartphone is obtained, the smartphones and their film forming rates can be sorted by film forming rate, so that a user can quickly and intuitively compare the film forming effects of different smartphones.
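The per-phone rate computation and ranking of steps 505 to 507 can be sketched together as below. The sharpness threshold and all names are illustrative assumptions:

```python
# Hedged sketch: compute each phone's film forming rate from its images'
# sharpness scores, then sort descending. Threshold value is an assumption.
def rank_phones(sharpness_by_phone, threshold=100.0):
    rates = {
        phone: sum(s >= threshold for s in scores) / len(scores)
        for phone, scores in sharpness_by_phone.items()
    }
    return sorted(rates.items(), key=lambda kv: kv[1], reverse=True)

ranking = rank_phones({
    "phone_a": [120.0, 95.0, 130.0],   # 2 of 3 meet the threshold
    "phone_b": [150.0, 140.0, 110.0],  # 3 of 3 meet the threshold
})
```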
Based on the implementation of the embodiment of the present application, the film forming rates of different smartphones can be evaluated automatically and objectively, and the evaluation can cover the full range of photographing scenes so as to support accurate evaluation of the film forming rate.
In one embodiment, the method of determining a rate of film forming provided by any of the embodiments of the present application may be performed using a film forming rate determination system 600 as shown in fig. 6.
Referring to fig. 6, the film rate determination system 600 may at least include: a light control system 601, a jitter simulation control system 602, an automatic photographing system 603, a data automatic saving system 604, a definition automatic calculation system 605 and a film rate statistical analysis system 606.
The light control system 601 may be configured to provide an illumination environment corresponding to the preset scene information, where the photographing apparatus is located, so that the photographing apparatus photographs under the illumination environment.
The shake simulation control system 602 may be used to provide a shake pattern of the photographing apparatus corresponding to preset scene information so that the photographing apparatus photographs in the shake pattern.
For example, the photographing apparatus may be mounted on a corresponding mounting position of the shake simulation control system 602 so that the photographing apparatus shakes in a desired shake pattern. This implementation may simulate the jitter situation when the user holds the photographing apparatus.
In one embodiment, the automated photographing system 603 may be located in a photographing apparatus for performing photographing operations.
In other embodiments, the automatic photographing system 603 may also be located outside the photographing apparatus, for triggering the photographing apparatus to perform photographing operations.
The photographing device may perform a photographing operation to photograph a captured image, and the data automatic saving system 604 may be used to save the photographed image.
The sharpness auto-computing system 605 may be used to compute the sharpness of a captured image.
The film rate statistical analysis system 606 may be used to calculate a film rate based on the calculated sharpness.
In one embodiment, a personal computer of a user may be connected to the photographing apparatus, and the personal computer may perform some or all of the following operations by executing a preset program: the light control system 601 is controlled to provide a required illumination environment, the shake simulation control system 602 is controlled to provide a required shake mode, and the shooting equipment is triggered to execute shooting operation, save shot images, calculate definition and calculate film rate.
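A hedged end-to-end sketch of how a controlling personal computer might drive the subsystems of system 600; every class below is a trivial stand-in (an assumption) for the corresponding subsystem, not an actual interface from this application:

```python
# Stand-ins for the subsystems of system 600 (all assumptions):
class LightControl:        # light control system 601
    def set(self, mode): self.mode = mode
class ShakeControl:        # shake simulation control system 602
    def set(self, mode): self.mode = mode
class Camera:              # automatic photographing system 603
    def shoot(self): return 0.8   # fake image: a pre-scored sharpness value
class Storage:             # data automatic saving system 604
    def __init__(self): self.items = []
    def save(self, info, img): self.items.append((info, img))

def run_pipeline(scene_infos, lights, shaker, camera, storage,
                 score=lambda img: img, threshold=0.5):
    """Configure each scene, shoot and save, then compute the film forming
    rate: systems 605 (sharpness) and 606 (statistical analysis)."""
    for info in scene_infos:
        lights.set(info["light"])
        shaker.set(info["shake"])
        storage.save(info, camera.shoot())
    scores = [score(img) for _, img in storage.items]           # system 605
    return sum(s >= threshold for s in scores) / len(scores)    # system 606
```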
An embodiment of the present application further provides a device for determining a film rate, including: the configuration module is used for configuring a first shooting scene according to first scene information, wherein the first scene information comprises configuration items influencing imaging quality; the first acquisition module is used for acquiring a first image shot by the shooting equipment in a first shooting scene; the second acquisition module is used for acquiring imaging quality information of the first image, wherein the imaging quality information is used for reflecting the imaging quality of the image; and the determining module is used for determining the first film forming rate according to the imaging quality information of the first image.
One embodiment of the present application also provides an electronic chip, which is mounted in an electronic device (UE), the electronic chip including: a processor for executing computer program instructions stored on a memory, wherein the computer program instructions, when executed by the processor, trigger an electronic chip to perform the method steps provided by any of the method embodiments of the present application.
An embodiment of the present application further proposes a terminal device, which includes a communication module, a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the terminal device to execute the method steps provided by any of the method embodiments of the present application.
An embodiment of the present application further proposes a server device comprising a communication module, a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the server device to perform the method steps provided by any of the method embodiments of the present application.
An embodiment of the present application also provides an electronic device, including a plurality of antennas, a memory for storing computer program instructions, a processor for executing the computer program instructions, and a communication means (such as a communication module that may enable 5G communication based on an NR protocol), where the computer program instructions, when executed by the processor, trigger the electronic device to perform the method steps provided by any of the method embodiments of the present application.
In particular, in an embodiment of the present application, one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the apparatus, cause the apparatus to perform the method steps described in the embodiments of the present application.
Specifically, in an embodiment of the present application, the processor of the electronic device may be a System On Chip (SOC), where the processor may include a central processing unit (Central Processing Unit, CPU), and may further include other types of processors. Specifically, in an embodiment of the present application, the processor of the electronic device may be a PWM control chip.
In particular, in an embodiment of the present application, the processor may include, for example, a CPU, DSP (digital signal processor ) or microcontroller, and may further include a GPU (graphics processing unit, graphics processor), an embedded Neural network processor (Neural-network Process Units, NPU), and an image signal processor (Image Signal Processing, ISP), where the processor may further include a necessary hardware accelerator or logic processing hardware circuit, such as an ASIC, or one or more integrated circuits for controlling the execution of the program of the present application, and so on. Further, the processor may have a function of operating one or more software programs, which may be stored in a storage medium.
In particular, in an embodiment of the present application, the memory of the electronic device may be a read-only memory (ROM), other type of static storage device capable of storing static information and instructions, a random access memory (random access memory, RAM) or other type of dynamic storage device capable of storing information and instructions, an electrically erasable programmable read-only memory (electrically erasable programmable read-only memory, EEPROM), a compact disc read-only memory (compact disc read-only memory, CD-ROM) or other optical disc storage, optical disc storage (including compact disc, laser disc, optical disc, digital versatile disc, blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any computer readable medium capable of carrying or storing desired program code in the form of instructions or data structures and capable of being accessed by a computer.
In particular, in an embodiment of the present application, the processor and the memory may be combined into a processing device, though more commonly they are separate components; the processor is configured to execute the program code stored in the memory to implement the method described in the embodiments of the present application. In particular, the memory may also be integrated into the processor, or may be separate from the processor.
Further, the devices, apparatuses, modules illustrated in the embodiments of the present application may be implemented by a computer chip or entity, or by a product having a certain function.
It will be apparent to those skilled in the art that embodiments of the present application may be provided as a method, apparatus, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media having computer-usable program code embodied therein.
In several embodiments provided herein, any of the functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application.
Specifically, in an embodiment of the present application, there is further provided a computer readable storage medium, where a computer program is stored, when the computer program is executed on a computer, to cause the computer to perform the method steps provided in the embodiments of the present application.
An embodiment of the present application also provides a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method steps provided by the embodiments of the present application.
The description of embodiments herein is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices), and computer program products according to embodiments herein. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the elements is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices, or units, which may be in electrical, mechanical, or other forms.
The units described as separate parts may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purposes of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units, if implemented in the form of software functional units, may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
In the embodiments of the present application, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in this application are described in a progressive manner; for identical or similar parts among the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the other embodiments. In particular, the device embodiments are described relatively briefly since they are substantially similar to the method embodiments; for relevant details, reference is made to the description of the method embodiments.
Those of ordinary skill in the art will appreciate that the various elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
The foregoing description is merely of preferred embodiments of the present invention and is not intended to limit the invention to the precise form disclosed; any modifications, equivalents, improvements, and alternatives falling within the spirit and principles of the present invention are intended to be included within the scope of the present invention.

Claims (11)

1. A method for determining a film forming rate, comprising:
configuring a first shooting scene according to first scene information, wherein the first scene information comprises a configuration item affecting imaging quality, and the configuration item comprises a shake mode of shooting equipment;
acquiring a first image shot by shooting equipment in the first shooting scene;
acquiring imaging quality information of the first image, wherein the imaging quality information is used for reflecting the imaging quality of the image;
determining a first film forming rate according to the imaging quality information of the first image;
configuring a second shooting scene according to second scene information, wherein the second scene information comprises configuration items influencing imaging quality;
acquiring a second image shot by shooting equipment in the second shooting scene;
acquiring imaging quality information of the second image;
determining a second film forming rate according to the imaging quality information of the second image;
comparing the first film forming rate with the second film forming rate to obtain a comparison result;
outputting the comparison result;
the first film forming rate is the film forming rate of the first configuration item in the first scene information;
the second film forming rate is the film forming rate of the second configuration item in the second scene information;
the first configuration item and the second configuration item belong to the same configuration category;
the configuration category includes: any one of illumination environment in which the photographing apparatus is located, shake mode of the photographing apparatus, zoom magnification of the photographing apparatus, apparatus type of the photographing apparatus, operating system version of the photographing apparatus, photographing mode of the photographing apparatus;
the method further comprises the steps of: classifying the first image into an image set corresponding to the first configuration item, wherein the image set further comprises images shot in other shooting scenes, and scene information of the other shooting scenes comprises the first configuration item;
the determining a first film forming rate according to the imaging quality information of the first image includes: determining the first film forming rate according to the imaging quality information of the images in the image set corresponding to the first configuration item;
the method further comprises the steps of: classifying the second image into an image set corresponding to the second configuration item, wherein the image set further comprises images shot in other shooting scenes, and scene information of the other shooting scenes comprises the second configuration item;
the determining a second film forming rate according to the imaging quality information of the second image comprises the following steps: determining the second film forming rate according to the imaging quality information of the images in the image set corresponding to the second configuration item.
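The comparison procedure recited in claim 1 (score images per configuration item, derive a film forming rate per item, then compare two items of the same category) can be sketched as follows. This is an illustrative reading only: the `film_forming_rate` definition as a pass-rate over a quality threshold, the 0.6 threshold, and the shake-mode score values are all assumptions not specified by the claims.

```python
# Hypothetical sketch of the claimed flow: two scene configurations that
# differ in one configuration item (here: shake mode) are each shot,
# every image gets an imaging-quality score, and each item's "film
# forming rate" is the fraction of its images that pass a quality mark.
QUALITY_THRESHOLD = 0.6  # assumed pass mark; the patent does not fix one

def film_forming_rate(quality_scores):
    """Fraction of images whose imaging-quality score passes the threshold."""
    if not quality_scores:
        return 0.0
    passed = sum(1 for s in quality_scores if s >= QUALITY_THRESHOLD)
    return passed / len(quality_scores)

def compare_configurations(scores_by_item):
    """Map each configuration item of one category to its film forming
    rate, then report the item with the higher rate as the comparison
    result to be output."""
    rates = {item: film_forming_rate(s) for item, s in scores_by_item.items()}
    best = max(rates, key=rates.get)
    return rates, best

# Illustrative scores for two shake modes (same configuration category).
scores = {
    "shake_mode_off": [0.9, 0.8, 0.7, 0.5],
    "shake_mode_on":  [0.4, 0.7, 0.3, 0.5],
}
rates, best = compare_configurations(scores)
```

With these made-up scores, the "off" mode passes 3 of 4 images and the "on" mode 1 of 4, so the output comparison result favors `shake_mode_off`.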
2. The method according to claim 1, wherein the method further comprises:
configuring a third photographing scene according to third scene information, wherein the third scene information comprises configuration items affecting imaging quality;
acquiring a third image shot by shooting equipment in the third shooting scene;
acquiring imaging quality information of the third image;
the determining a first film forming rate according to the imaging quality information of the first image includes:
determining a first film forming rate according to the imaging quality information of the first image and the imaging quality information of the third image.
3. The method of claim 1, wherein the configuration item comprises: configuration information of at least one of an illumination environment in which the photographing apparatus is located, a zoom magnification of the photographing apparatus, an apparatus type of the photographing apparatus, an operating system version of the photographing apparatus, and a photographing mode of the photographing apparatus.
4. The method of claim 1, wherein the imaging quality information comprises: at least one of sharpness and blurriness.
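Claim 4 leaves the quality metrics abstract. One common no-reference sharpness measure that fits this role is the variance of the image's discrete Laplacian (stronger edge response gives higher variance; a blur score can be read as its opposite). The kernel, function name, and synthetic test image below are assumptions for illustration; the patent does not prescribe any particular metric.

```python
import numpy as np

def laplacian_variance(gray):
    """No-reference sharpness score: variance of the 3x3 discrete Laplacian.
    Sharper images have stronger edge responses, hence higher variance."""
    k = np.array([[0., 1., 0.],
                  [1., -4., 1.],
                  [0., 1., 0.]])
    h, w = gray.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(3):            # manual 'valid'-mode 2-D convolution
        for j in range(3):
            out += k[i, j] * gray[i:i + h - 2, j:j + w - 2]
    return float(out.var())

# Synthetic check: a noisy image versus a 3x3 box-blurred copy of it.
rng = np.random.default_rng(0)
sharp = rng.random((32, 32))
blur = np.zeros((30, 30))
for i in range(3):
    for j in range(3):
        blur += sharp[i:i + 30, j:j + 30]
blur /= 9.0                       # box blur suppresses high frequencies
```

Averaging removes high-frequency detail, so the blurred copy scores a much lower Laplacian variance than the original; a per-image score like this is what the claimed method would threshold or aggregate into a film forming rate.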
5. The method of claim 1, wherein the configuration item comprises: configuration information of at least one of a zoom magnification of the photographing apparatus and an apparatus type of the photographing apparatus;
the configuring the first photographing scene includes:
executing a first operation, wherein the first image comprises an image of a graphic card, and the relative position relationship between the image of the graphic card and the first image is the same as a preset first relative position relationship;
the graphic card is located in an image acquisition range of the shooting equipment.
6. The method according to claim 1, wherein the method further comprises: classifying and storing the first image according to the first scene information corresponding to the first image.
7. The method according to claim 1, wherein the method further comprises: classifying the first image into a first image set corresponding to the shooting device, wherein the first image set further comprises images shot by the shooting device under other shooting scenes;
The determining a first film forming rate according to the imaging quality information of the first image includes:
determining a first film forming rate according to the imaging quality information of the images in the first image set, wherein the first film forming rate is the film forming rate of the shooting equipment.
8. A film forming rate determining apparatus, comprising:
the configuration module is used for configuring a first shooting scene according to first scene information, wherein the first scene information comprises configuration items affecting imaging quality, and the configuration items comprise a shake mode of shooting equipment;
the first acquisition module is used for acquiring a first image shot by the shooting equipment in the first shooting scene;
the second acquisition module is used for acquiring imaging quality information of the first image, wherein the imaging quality information is used for reflecting the imaging quality of the image;
the determining module is used for determining a first film forming rate according to the imaging quality information of the first image;
the film forming rate determining apparatus further includes means for performing:
configuring a second shooting scene according to second scene information, wherein the second scene information comprises configuration items influencing imaging quality;
acquiring a second image shot by shooting equipment in the second shooting scene;
acquiring imaging quality information of the second image;
determining a second film forming rate according to the imaging quality information of the second image;
comparing the first film forming rate with the second film forming rate to obtain a comparison result;
outputting the comparison result;
the first film forming rate is the film forming rate of the first configuration item in the first scene information;
the second film forming rate is the film forming rate of the second configuration item in the second scene information;
the first configuration item and the second configuration item belong to the same configuration category;
the configuration category includes: any one of illumination environment in which the photographing apparatus is located, shake mode of the photographing apparatus, zoom magnification of the photographing apparatus, apparatus type of the photographing apparatus, operating system version of the photographing apparatus, photographing mode of the photographing apparatus;
the film forming rate determining apparatus is further configured to: classify the first image into an image set corresponding to the first configuration item, wherein the image set further comprises images shot in other shooting scenes, and scene information of the other shooting scenes comprises the first configuration item;
the determining a first film forming rate according to the imaging quality information of the first image includes: determining the first film forming rate according to the imaging quality information of the images in the image set corresponding to the first configuration item;
the film forming rate determining apparatus is further configured to: classify the second image into an image set corresponding to the second configuration item, wherein the image set further comprises images shot in other shooting scenes, and scene information of the other shooting scenes comprises the second configuration item;
the determining a second film forming rate according to the imaging quality information of the second image comprises the following steps: determining the second film forming rate according to the imaging quality information of the images in the image set corresponding to the second configuration item.
9. An electronic chip, comprising:
a processor for executing computer program instructions stored on a memory, wherein the computer program instructions, when executed by the processor, trigger the electronic chip to perform the method of any of claims 1-7.
10. An electronic device comprising a memory for storing computer program instructions, a processor for executing the computer program instructions, and communication means, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method of any of claims 1-7.
11. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when run on a computer, causes the computer to perform the method according to any of claims 1-7.
CN202210983147.1A 2022-08-16 2022-08-16 Method, device, chip, electronic equipment and medium for determining film forming rate Active CN116055712B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210983147.1A CN116055712B (en) 2022-08-16 2022-08-16 Method, device, chip, electronic equipment and medium for determining film forming rate

Publications (2)

Publication Number Publication Date
CN116055712A CN116055712A (en) 2023-05-02
CN116055712B true CN116055712B (en) 2024-04-05

Family

ID=86126005

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210983147.1A Active CN116055712B (en) 2022-08-16 2022-08-16 Method, device, chip, electronic equipment and medium for determining film forming rate

Country Status (1)

Country Link
CN (1) CN116055712B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116843683B (en) * 2023-08-30 2024-03-05 荣耀终端有限公司 Equipment imaging definition evaluation method, system and device
CN117474926A (en) * 2023-12-28 2024-01-30 荣耀终端有限公司 Image detection method and device

Citations (12)

Publication number Priority date Publication date Assignee Title
JP2007306108A (en) * 2006-05-09 2007-11-22 Nippon Telegr & Teleph Corp <Ntt> Image quality estimation apparatus, method, and program
CN203827466U (en) * 2013-12-25 2014-09-10 广州计量检测技术研究院 Apparatus for comprehensive inspection of image quality of digital camera
CN109089041A (en) * 2018-08-22 2018-12-25 Oppo广东移动通信有限公司 Recognition methods, device, electronic equipment and the storage medium of photographed scene
CN110493595A (en) * 2019-09-30 2019-11-22 腾讯科技(深圳)有限公司 The detection method and device of camera, storage medium and electronic device
CN111932521A (en) * 2020-08-13 2020-11-13 Oppo(重庆)智能科技有限公司 Image quality testing method and device, server and computer readable storage medium
CN111953964A (en) * 2020-07-29 2020-11-17 欧菲微电子技术有限公司 Ambiguity detection method, electronic device and storage medium
WO2020238775A1 (en) * 2019-05-28 2020-12-03 华为技术有限公司 Scene recognition method, scene recognition device, and electronic apparatus
CN112712564A (en) * 2020-12-01 2021-04-27 珠海格力电器股份有限公司 Camera shooting method and device, storage medium and electronic device
CN112954214A (en) * 2021-02-10 2021-06-11 维沃移动通信有限公司 Shooting method and device, electronic equipment and storage medium
WO2021179186A1 (en) * 2020-03-10 2021-09-16 华为技术有限公司 Focusing method and apparatus, and electronic device
CN113596440A (en) * 2021-07-20 2021-11-02 杭州海康威视数字技术股份有限公司 System and method for calculating anti-shake performance of camera
CN114640798A (en) * 2022-05-09 2022-06-17 荣耀终端有限公司 Image processing method, electronic device, and computer storage medium

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
KR102368625B1 (en) * 2015-07-23 2022-03-02 삼성전자주식회사 Digital photographing apparatus and the method for the same

Also Published As

Publication number Publication date
CN116055712A (en) 2023-05-02

Similar Documents

Publication Publication Date Title
CN116055712B (en) Method, device, chip, electronic equipment and medium for determining film forming rate
CN114946169B (en) Image acquisition method and device
WO2019052329A1 (en) Facial recognition method and related product
CN113973173B (en) Image synthesis method and electronic equipment
CN114119758B (en) Method for acquiring vehicle pose, electronic device and computer-readable storage medium
CN114422340B (en) Log reporting method, electronic equipment and storage medium
CN104349033A (en) Self-timer light supplement method, self-timer light supplement device and electronic equipment
CN114095666B (en) Photographing method, electronic device, and computer-readable storage medium
CN115601244B (en) Image processing method and device and electronic equipment
CN112446252A (en) Image recognition method and electronic equipment
CN114490174B (en) File system detection method, electronic device and computer readable storage medium
CN114727220A (en) Equipment searching method and electronic equipment
CN112188094B (en) Image processing method and device, computer readable medium and terminal equipment
CN116708751B (en) Method and device for determining photographing duration and electronic equipment
CN115686182B (en) Processing method of augmented reality video and electronic equipment
CN115631250A (en) Image processing method and electronic equipment
CN116051450B (en) Glare information acquisition method, device, chip, electronic equipment and medium
CN116703741B (en) Image contrast generation method and device and electronic equipment
CN116095477B (en) Focusing processing system, method, equipment and storage medium
CN116896626B (en) Method and device for detecting video motion blur degree
CN116055872B (en) Image acquisition method, electronic device, and computer-readable storage medium
CN115526786B (en) Image processing method and related device
CN115705663B (en) Image processing method and electronic equipment
CN117119314B (en) Image processing method and related electronic equipment
CN116048769B (en) Memory recycling method and device and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant