CN114007009B - Electronic device and image processing method - Google Patents

Electronic device and image processing method

Info

Publication number
CN114007009B
Authority
CN
China
Prior art keywords
state information
image
image data
image signal
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010682841.0A
Other languages
Chinese (zh)
Other versions
CN114007009A (en)
Inventor
杨平平
方攀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010682841.0A
Publication of CN114007009A
Application granted
Publication of CN114007009B
Legal status: Active (current)
Anticipated expiration


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/665Control of cameras or camera modules involving internal camera communication with the image sensor, e.g. synchronising or multiplexing SSIS control signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Studio Devices (AREA)

Abstract

The embodiments of the application provide an electronic device and an image processing method. An image signal preprocessor is added: it collects statistics of state information on the original image data acquired by the image sensor to obtain first state information, and preprocesses the original image data to obtain preprocessed image data. The image signal processor then collects statistics of state information again on the preprocessed image data to obtain second state information. Finally, the application processor fuses the first state information collected by the image signal preprocessor with the second state information collected by the image signal processor to obtain target state information, and updates the image acquisition parameters of the image sensor using the target state information. In this way, the accuracy with which the electronic device automatically configures image acquisition parameters can be improved.

Description

Electronic device and image processing method
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an electronic device and an image processing method.
Background
Currently, the quality of the shooting function has become a key measure of the performance of an electronic device (such as a smartphone or tablet computer). To simplify user operation, electronic devices typically configure image acquisition parameters (e.g., exposure parameters, focus parameters) automatically. However, in the related art, the accuracy with which the electronic device automatically configures the image acquisition parameters is poor.
Disclosure of Invention
The application provides an electronic device and an image processing method, which can improve the accuracy with which the electronic device automatically configures image acquisition parameters.
The application discloses an electronic device, comprising:
the image sensor is used for acquiring image data according to the configured image acquisition parameters;
an image signal preprocessor, configured to collect, from the image data, statistics of state information used for updating the image acquisition parameters of the image sensor to obtain first state information, and to preprocess the image data to obtain preprocessed image data;
an image signal processor, configured to collect, from the preprocessed image data, statistics of state information used for updating the image acquisition parameters of the image sensor to obtain second state information;
and an application processor, configured to fuse the first state information and the second state information to obtain target state information, and to update the image acquisition parameters of the image sensor according to the target state information.
The embodiments of the application also disclose an image processing method applicable to an electronic device, where the electronic device includes an image sensor, an image signal preprocessor, an image signal processor, and an application processor. The image processing method includes the following steps:
acquiring image data by the image sensor according to the configured image acquisition parameters;
collecting, by the image signal preprocessor, statistics of state information in the image data used for updating the image acquisition parameters of the image sensor to obtain first state information, and preprocessing the image data to obtain preprocessed image data;
collecting, by the image signal processor, statistics of state information in the preprocessed image data used for updating the image acquisition parameters of the image sensor to obtain second state information;
and fusing, by the application processor, the first state information and the second state information to obtain target state information, and updating the image acquisition parameters of the image sensor according to the target state information.
In the embodiments of the application, an image signal preprocessor is added. The image signal preprocessor collects statistics of state information on the original image data acquired by the image sensor to obtain first state information, and preprocesses the original image data to obtain preprocessed image data. The image signal processor then collects statistics of state information again on the preprocessed image data to obtain second state information. Finally, the application processor fuses the first state information collected by the image signal preprocessor with the second state information collected by the image signal processor to obtain target state information, and updates the image acquisition parameters of the image sensor using the target state information. In this way, the accuracy with which the electronic device automatically configures image acquisition parameters can be improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the description of the embodiments will be briefly described below.
Fig. 1 is a schematic diagram of a first structure of an electronic device according to an embodiment of the present application.
Fig. 2 is a schematic diagram of the structure of the image signal preprocessor in fig. 1.
Fig. 3 is a schematic diagram of a data flow according to an embodiment of the present application.
Fig. 4 is a schematic diagram of a second structure of an electronic device according to an embodiment of the present application.
Fig. 5 is a schematic diagram showing connection between the image signal preprocessor and the application processor in fig. 4.
Fig. 6 is a flowchart of an image processing method according to an embodiment of the present application.
Detailed Description
The technical solutions provided by the embodiments of the application can be applied to various scenarios requiring data communication, and the embodiments of the application do not limit these scenarios.
Referring to fig. 1, fig. 1 is a schematic diagram of a first structure of an electronic device 100 according to an embodiment of the application. The electronic device 100 may be a mobile electronic device such as a smartphone, tablet computer, palmtop computer, or notebook computer, or a non-mobile electronic device such as a desktop computer or server, and includes an image sensor 110, an image signal preprocessor 120, an image signal processor 130, and an application processor 140.
The image sensor 110, or photosensitive element, is a device that converts an optical signal into an electrical signal. Unlike "point" photosensitive elements such as photodiodes or phototransistors, the image sensor 110 divides the optical image it senses into a plurality of small units and converts each unit into a usable electrical signal to obtain original image data. It should be noted that the embodiments of the application do not limit the type of the image sensor 110, which may be a complementary metal oxide semiconductor (CMOS) image sensor, a charge-coupled device (CCD) image sensor, or the like.
The image signal processor 130 processes the image data collected by the image sensor 110 to improve its quality. For example, the image signal processor 130 can perform optimization processing such as white balance correction, strong-light suppression, backlight compensation, color enhancement, and lens shading correction on the image data; of course, optimization processing not listed in the application may also be included.
Compared with the image signal processor 130, the image signal preprocessor 120 performs differentiated processing before the image signal processor 130 handles the image data; this can be regarded as preprocessing, such as dead-pixel correction, temporal noise reduction, 3D noise reduction, linearization, and black level correction, and may of course also include optimization processing not listed in the application.
The application processor 140 is a general purpose processor such as a processor designed based on the ARM architecture.
In the embodiment of the present application, the image sensor 110 is connected to the image signal preprocessor 120, and is configured to collect image data according to the configured image collection parameters, and transmit the collected image data to the image signal preprocessor 120. It should be noted that, the image data collected by the image sensor 110 is image data in a RAW format, and the collected image data in the RAW format is transferred to the image signal preprocessor 120.
The image acquisition parameters include, but are not limited to, exposure parameters, focusing parameters, white balance parameters and the like. For example, before starting the acquisition of image data, the application processor 140 configures the initial exposure parameters to the image sensor 110 so that the image sensor 110 acquires image data according to the initial exposure parameters and transmits the acquired image data to the image signal preprocessor 120.
It should be noted that, in the embodiments of the application, the connection manner between the image signal preprocessor 120 and the image sensor 110 is not limited; for example, the image signal preprocessor 120 and the image sensor 110 may be connected through a Mobile Industry Processor Interface (MIPI).
When transmitting the image data to the image signal preprocessor 120, the image sensor 110 encapsulates the image data into a plurality of image data packets and transmits these packets to the image signal preprocessor 120. Each image data packet includes a header field, a trailer field, and a data field; the header and trailer fields carry necessary control information such as synchronization information, address information, and error control information, while the data field carries the actual image content.
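For illustration only, the sketch below shows one way such packetization could look; the field names, payload size, and markers are assumptions for clarity and do not reflect the actual MIPI packet layout.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ImageDataPacket:
    """Hypothetical packet layout: header/trailer carry control information, data carries pixels."""
    header: bytes    # e.g. synchronization, address and error-control information
    data: bytes      # a slice of the RAW image content
    trailer: bytes   # e.g. checksum / end-of-packet marker

def packetize(raw_frame: bytes, payload_size: int = 4096) -> List[ImageDataPacket]:
    """Split a RAW frame into packets, as the sensor might before transferring them."""
    packets = []
    for offset in range(0, len(raw_frame), payload_size):
        chunk = raw_frame[offset:offset + payload_size]
        packets.append(ImageDataPacket(header=b"SOF", data=chunk, trailer=b"EOF"))
    return packets
```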
On the other hand, the image signal preprocessor 120 receives image data from the image sensor 110. In addition, the image signal preprocessor 120 is further connected to the image signal processor 130, where in the embodiment of the application, the connection mode between the image signal processor 130 and the image signal preprocessor 120 is not limited, for example, the image signal processor 130 and the image signal preprocessor 120 may also be connected through MIPI.
After receiving the image data transmitted from the image sensor 110, the image signal preprocessor 120 uses the image data to calculate the state information required by the application processor 140 to update the image acquisition parameters of the image sensor 110, including but not limited to brightness information, sharpness information, and contrast information, and records the state information calculated from the image data as the first state information. In addition, the image signal preprocessor 120 preprocesses the image data from the image sensor 110 according to a configured preprocessing strategy to improve its image quality, thereby obtaining preprocessed image data. It should be noted that the preprocessing performed by the image signal preprocessor 120 does not change the format of the image data, i.e. the preprocessed image data is still in the RAW format.
After finishing the preprocessing of the image data and obtaining the preprocessed image data, the image signal preprocessor 120 transmits the preprocessed image data to the image signal processor 130.
Upon receiving the preprocessed image data from the image signal preprocessor 120, the image signal processor 130 uses the preprocessed image data to calculate the state information required by the application processor 140 to update the image acquisition parameters of the image sensor 110, including but not limited to brightness information, sharpness information, and contrast information, and records the state information calculated from the preprocessed image data as the second state information.
Further, for the first state information, the image signal preprocessor 120 may directly transmit the first state information to the application processor 140 or transmit the first state information to the application processor 140 via the image signal processor 130.
After the image signal preprocessor 120 obtains the first state information and the image signal processor 130 obtains the second state information, the application processor 140 fuses the first state information and the second state information according to the configured fusion strategy to obtain new state information, which is recorded as the target state information. The target state information thus carries, to a certain extent, state information from both before and after the preprocessing of the image data, so it reflects the shooting state of the actual shooting scene more accurately, and the image acquisition parameters calculated from it are more accurate. Accordingly, the application processor 140 calculates new image acquisition parameters from the fused target state information according to a configured image acquisition parameter calculation algorithm (such as an automatic exposure algorithm, an automatic white balance algorithm, or an automatic focusing algorithm).
The application processor 140 is further connected to the image sensor 110 and controls it to start and stop capturing image data. After the new image acquisition parameters are calculated, the application processor 140 updates the image acquisition parameters of the image sensor 110 to the newly calculated values. The image acquisition parameters of the image sensor 110 are updated continuously in this way until they converge, at which point the optimal image acquisition effect is obtained.
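The closed control loop described above can be sketched as follows; the sensor/processor objects and their methods (capture, collect_statistics, preprocess, fuse, compute_parameters, update_parameters) are hypothetical placeholders for the operations described in this embodiment, and the convergence test is a simplified assumption.

```python
def run_acquisition_loop(sensor, pre_isp, isp, ap, max_iters=30, eps=1e-3):
    """Closed loop: capture -> statistics -> fuse -> recompute parameters -> update sensor."""
    params = ap.initial_parameters()               # e.g. initial exposure / focus / white balance
    for _ in range(max_iters):
        raw = sensor.capture(params)               # image data in RAW format
        s1 = pre_isp.collect_statistics(raw)       # first state information
        pre = pre_isp.preprocess(raw)              # preprocessed image data (still RAW)
        s2 = isp.collect_statistics(pre)           # second state information
        target = ap.fuse(s1, s2)                   # target state information
        new_params = ap.compute_parameters(target) # e.g. auto-exposure / AWB / AF algorithms
        if ap.parameter_distance(new_params, params) < eps:
            break                                  # parameters have converged
        params = new_params
        sensor.update_parameters(params)
    return params
```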
It should be noted that, in the embodiment of the present application, the connection manner between the application processor 140 and the image sensor 110 is not limited specifically, and may be configured by one of ordinary skill in the art according to the actual situation, for example, in the embodiment of the present application, the application processor 140 and the image sensor 110 are connected through an I2C bus.
Compared with the related art, the application adds the image signal preprocessor 120. The image signal preprocessor 120 collects statistics of state information on the original image data acquired by the image sensor 110 to obtain first state information, and preprocesses the original image data to obtain preprocessed image data. The image signal processor 130 then collects statistics of state information on the preprocessed image data to obtain second state information. Finally, the application processor 140 fuses the first state information collected by the image signal preprocessor 120 with the second state information collected by the image signal processor 130 to obtain target state information, and updates the image acquisition parameters of the image sensor 110 using the target state information. In this way, the accuracy with which the electronic device automatically configures image acquisition parameters can be improved.
Optionally, referring to fig. 2, the image signal preprocessor 120 includes:
an image signal processing unit 1201, configured to collect statistics of the first state information from the image data and to perform first preprocessing on the image data; and
a neural network processing unit 1202, configured to perform second preprocessing on the image data after the first preprocessing, to obtain the preprocessed image data.
The image signal processing unit 1201 is connected to the image sensor 110 and is configured to collect statistics of state information on the image data from the image sensor 110, thereby obtaining the first state information. The image signal processing unit 1201 is further configured to perform first preprocessing on the image data according to a configured preprocessing policy. It should be noted that the embodiments of the application do not specifically limit the first preprocessing performed by the image signal processing unit 1201; it includes, but is not limited to, dead-pixel correction, temporal noise reduction, 3D noise reduction, linearization, and black level correction, and may of course also include optimization processing not listed in the application.
The neural network processing unit 1202 is configured to perform second preprocessing on the image data after the first preprocessing, to obtain the preprocessed image data. A plurality of neural-network-based algorithms (such as a video night-scene algorithm, a video HDR algorithm, a video blurring algorithm, a video noise reduction algorithm, and a video super-resolution algorithm) are solidified in the neural network processing unit 1202. After the image signal processing unit 1201 finishes the first preprocessing of the image data, the neural network processing unit 1202 invokes the corresponding neural network algorithm according to the configured preprocessing policy to perform the second preprocessing on the image data after the first preprocessing, thereby obtaining the preprocessed image data.
In plain terms, the image signal processing unit 1201 performs preliminary optimization on the image data using non-AI image optimization methods, and the neural network processing unit 1202 then further optimizes the preliminarily optimized image data using AI image optimization methods.
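A minimal sketch of this two-stage split is given below, assuming black-level correction stands in for the non-AI first preprocessing and an arbitrary neural model stands in for the solidified second preprocessing; both choices are illustrative assumptions.

```python
import numpy as np

def first_preprocess(raw: np.ndarray, black_level: int = 64) -> np.ndarray:
    """Non-AI stage (image signal processing unit 1201): e.g. black-level correction."""
    return np.clip(raw.astype(np.int32) - black_level, 0, None).astype(raw.dtype)

def second_preprocess(raw: np.ndarray, nn_model) -> np.ndarray:
    """AI stage (neural network processing unit 1202): e.g. a solidified noise-reduction network."""
    return nn_model(raw)  # nn_model is a hypothetical callable neural network

def preprocess(raw: np.ndarray, nn_model) -> np.ndarray:
    """Pipeline: preliminary non-AI optimization, then further AI optimization."""
    return second_preprocess(first_preprocess(raw), nn_model)
```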
Optionally, in an embodiment, the application processor 140 is configured to:
acquiring a first weight value corresponding to the first state information and acquiring a second weight value corresponding to the second state information;
and carrying out weighted summation according to the first state information, the first weight value, the second state information and the second weight value to obtain the target state information.
The application further provides an optional fusion strategy for the first state information and the second state information. The application processor 140 fuses the first state information and the second state information by weighted summation and uses the weighted result as the fused target state information, which can be expressed as:
S' = a*S1 + b*S2
where S' denotes the fused target state information, S1 denotes the first state information, S2 denotes the second state information, a denotes the weight assigned to the first state information, and b denotes the weight assigned to the second state information.
It should be noted that the embodiments of the application do not specifically limit how the weights of the first state information and the second state information are configured, and they can be set by a person of ordinary skill in the art according to actual needs. For example, static weights may be configured for the first state information and the second state information, i.e. fixed weights may be allocated in advance; alternatively, dynamic weights may be configured, i.e. the weights of the first state information and the second state information change dynamically according to a specified condition.
For example, the weight of the first state information may be configured to be 0.9, and the weight of the second state information may be configured to be 0.1;
for another example, the weights of the first state information and the second state information may be allocated dynamically according to the preprocessing performed by the image signal preprocessor: the greater the difference between the image data before and after the preprocessing, the greater the weight configured for the first state information and the smaller the weight configured for the second state information; the smaller the difference, the smaller the weight configured for the first state information and the greater the weight configured for the second state information.
It should be noted that the first state information and the second state information each include state information of a plurality of different dimensions, and the application processor 140 applies the above fusion policy to each dimension separately, for example fusing the brightness in the first state information with the brightness in the second state information, fusing the sharpness in the first state information with the sharpness in the second state information, and so on.
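As an illustration of this per-dimension weighted fusion, here is a minimal sketch; the dimension names and the difference-based dynamic weighting follow the examples above, while the concrete numbers and the normalization are assumptions.

```python
def fuse_state(s1: dict, s2: dict, a: float = 0.9, b: float = 0.1) -> dict:
    """Fuse first and second state information dimension by dimension: S' = a*S1 + b*S2."""
    return {dim: a * s1[dim] + b * s2[dim] for dim in s1.keys() & s2.keys()}

def dynamic_weights(image_difference: float, max_difference: float = 1.0):
    """Larger pre/post-preprocessing difference -> larger weight for the first state information."""
    a = min(image_difference / max_difference, 1.0)
    return a, 1.0 - a

# Example: brightness / sharpness / contrast statistics from the two processors
s1 = {"brightness": 0.52, "sharpness": 0.71, "contrast": 0.40}
s2 = {"brightness": 0.47, "sharpness": 0.78, "contrast": 0.44}
target = fuse_state(s1, s2, *dynamic_weights(image_difference=0.3))
```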
Optionally, in an embodiment, the image signal preprocessor 120 is further configured to transmit the first state information and the preprocessed image data to the image signal processor 130;
the image signal processor 130 is further configured to post-process the preprocessed image data to obtain post-processed image data; and
to transmit the post-processed image data, the first state information, the second state information, and the first indication information to the application processor 140;
the application processor 140 is further configured to fuse the first state information and the second state information according to the first indication information to obtain the target state information.
Referring to fig. 3, in this embodiment of the application, the image signal preprocessor 120 obtains by statistics the first state information of the image data from the image sensor 110, preprocesses the image data to obtain the preprocessed image data, and then transmits the first state information and the preprocessed image data to the image signal processor 130. For example, the image signal preprocessor 120 transmits the first state information and the preprocessed image data to the image signal processor 130 over the MIPI interface between them.
After receiving the first state information and the pre-processed image data from the image signal pre-processor 120, the image signal processor 130 performs post-processing on the pre-processed image data to obtain post-processed image data in addition to performing state statistics on the pre-processed image data to obtain second state information.
The post-processing performed by the image signal processor 130 is not specifically limited in the embodiments of the application, subject to the constraint that it is differentiated from the preprocessing of the image signal preprocessor 120 (i.e., an optimization already performed by the image signal preprocessor 120 is not performed again by the image signal processor 130), and can be configured by a person of ordinary skill in the art according to actual needs. For example, the post-processing performed by the image signal processor 130 may include strong-light suppression, backlight compensation, color enhancement, lens shading correction, and the like, and may also include optimization processing not listed in the application.
After finishing the post-processing of the preprocessed image data to obtain the post-processed image data and the statistics of the second state information, the image signal processor 130 generates the first indication information and sends the post-processed image data, the first state information, the second state information, and the first indication information to the application processor 140.
Accordingly, the application processor 140 fuses the first state information and the second state information according to the first indication information to obtain the target state information.
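A sketch of this synchronous path is shown below; the object interfaces and the content of the first indication information are illustrative assumptions rather than an actual driver interface.

```python
def synchronous_transfer(pre_isp, isp, ap, raw_frame):
    """First transmission mode: everything reaches the AP through the image signal processor."""
    s1 = pre_isp.collect_statistics(raw_frame)     # first state information
    pre = pre_isp.preprocess(raw_frame)            # preprocessed image data
    isp.receive(pre, s1)                           # preprocessor hands both items to the ISP
    s2 = isp.collect_statistics(pre)               # second state information
    post = isp.postprocess(pre)                    # post-processed image data
    first_indication = {"fusion_required": True}   # generated once post-processing and statistics finish
    ap.receive(post, s1, s2, first_indication)     # ISP forwards everything to the AP
    if first_indication["fusion_required"]:
        return ap.fuse(s1, s2)                     # target state information
```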
Optionally, in an embodiment, the application processor 140 is further configured to:
when the post-processed image data is dynamic image data, previewing the post-processed image data and/or performing video encoding based on the post-processed image data; or,
when the post-processed image data is a static image, performing image encoding based on the post-processed image data.
It should be noted that the image type does not change with the processing of the image data: if the original image data is a static image, the corresponding preprocessed/post-processed image data is also a static image; if the original image data is a dynamic image, the corresponding preprocessed/post-processed image data is also a dynamic image. A static image is a single-frame image captured in real time, whereas a dynamic image is a frame in an image sequence acquired during preview or during video recording.
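A small dispatch sketch follows, under the assumption that the application processor can distinguish dynamic from static frames via a flag carried with the data; the method names are hypothetical.

```python
def handle_post_processed(ap, post_image, is_dynamic: bool, recording: bool):
    """Dynamic frames feed the preview and/or the video encoder; static frames go to the image encoder."""
    if is_dynamic:
        ap.preview(post_image)
        if recording:
            ap.video_encode(post_image)   # e.g. append the frame to the current video stream
    else:
        ap.image_encode(post_image)       # e.g. encode the single captured frame to JPEG
```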
Optionally, in an embodiment, the image signal preprocessor 120 is further configured to transmit the first status information and the preprocessed image data to the image signal processor 130 when the current transmission mode is configured as the first transmission mode.
It should be noted that two alternative transmission modes, namely, the first transmission mode and the second transmission mode, are provided in the embodiment of the present application.
When the current transmission mode is configured as the first transmission mode (also called the synchronous transmission mode), the image signal preprocessor 120 transmits the first state information and the preprocessed image data to the image signal processor 130, so that the first state information reaches the application processor 140 via the image signal processor 130. In addition, the image signal processor 130 transmits the post-processed image data obtained by post-processing the preprocessed image data, the second state information obtained by statistics, and the generated first indication information to the application processor 140 together with the first state information. For details, reference may be made to the related description in the above embodiments, which is not repeated here.
Optionally, in an embodiment, the image signal preprocessor 120 is further configured to send the first status information and the second indication information to the application processor 140 when the current transmission mode is configured as the second transmission mode;
the application processor 140 is further configured to fuse the first state information and the second state information according to the second indication information to obtain the target state information.
Referring to fig. 4, in the embodiment of the application, the image signal preprocessor 120 is further connected to the application processor 140, and the image signal preprocessor 120 directly transmits the first status information to the application processor 140 when the current transmission mode is configured as the second transmission mode.
On the other hand, the image signal preprocessor 120 also transmits preprocessed image data obtained by preprocessing the aforementioned image data to the image signal processor 130. Accordingly, the image signal processor 130 performs a post-process on the pre-processed image data to obtain post-processed image data, in addition to performing a state statistic on the pre-processed image data to obtain second state information.
The post-processing performed by the image signal processor 130 is not specifically limited in the embodiments of the application, subject to the constraint that it is differentiated from the preprocessing of the image signal preprocessor 120 (i.e., an optimization already performed by the image signal preprocessor 120 is not performed again by the image signal processor 130), and can be configured by a person of ordinary skill in the art according to actual needs. For example, the post-processing performed by the image signal processor 130 may include strong-light suppression, backlight compensation, color enhancement, lens shading correction, and the like, and may also include optimization processing not listed in the application.
The image signal processor 130 performs post-processing on the pre-processed image data to obtain post-processed image data, and sends the post-processed image data and the second state information to the application processor 140 after obtaining the second state information by statistics.
The application processor 140 is further configured to fuse the first state information and the second state information according to the second indication information to obtain the target state information.
Optionally, referring to fig. 5, a first connection and a second connection are established between the image signal pre-processor 120 and the application processor 140, and the image signal pre-processor 120 is further configured to send the first status information to the application processor 140 through the first connection, and send the second indication information to the application processor 140 through the second connection.
For example, the first connection between the image signal preprocessor 120 and the application processor 140 is established via a Serial Peripheral Interface (SPI), and the second connection is established via a general-purpose input/output (GPIO) interface.
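The second transmission mode could be sketched as follows; the spi_link/gpio_line handles, the serialization, and the pulse-based signalling are assumptions for illustration only, not an actual SPI or GPIO API.

```python
import json

def direct_transfer(pre_isp, spi_link, gpio_line):
    """Second transmission mode: the preprocessor bypasses the ISP for its statistics."""
    s1 = pre_isp.latest_statistics()          # first state information (e.g. brightness, sharpness, ...)
    spi_link.send(json.dumps(s1).encode())    # first connection (SPI): carries the first state information
    gpio_line.pulse()                         # second connection (GPIO): the second indication information
```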
Optionally, in an embodiment, the image signal preprocessor 120 is further configured to:
analyzing according to the historical image data to obtain an analysis result;
and configuring the transmission mode as a first mode or a second mode according to the analysis result.
The historical image data is image data acquired before the current image data. The image signal preprocessor 120 analyzes the historical image data according to the configured analysis policy to obtain an analysis result, and then configures the transmission mode as the first mode or the second mode according to the analysis result.
Illustratively, the historical image data is image data that the image sensor 110 acquired before acquiring the current image data; since the historical image data and the current image data are acquired continuously, their image content is substantially the same. The image signal preprocessor 120 obtains the preprocessing durations of a preset number of frames of historical image data, predicts the preprocessing duration of the current image data from these preset-number preprocessing durations, and determines whether the predicted preprocessing duration reaches a preset duration: if it reaches the preset duration, the transmission mode is configured as the second mode; if it does not, the transmission mode is configured as the first mode. The preset duration can be set by a person of ordinary skill in the art according to actual needs.
For example, when the preprocessing duration of the image data is predicted according to the preset number of preprocessing durations, the image signal preprocessor 120 may calculate an average processing duration of the preset number of preprocessing durations as the preprocessing duration of the image data.
For another example, when predicting the preprocessing duration of the current image data from the preset number of preprocessing durations, the image signal preprocessor 120 may perform a weighted summation over the preset number of preprocessing durations and use the resulting value as the predicted preprocessing duration, where the earlier the corresponding historical image data was acquired, the smaller its weight.
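A minimal sketch of this duration prediction and mode selection is given below; the linear recency weighting and the 33 ms threshold are assumptions used only for illustration.

```python
def predict_preprocess_duration(history_ms, use_weighted=True):
    """Predict the preprocessing duration of the next frame from recent historical durations.
    history_ms is ordered from oldest to newest; older frames get smaller weights."""
    if not history_ms:
        return 0.0
    if not use_weighted:
        return sum(history_ms) / len(history_ms)            # simple average
    weights = list(range(1, len(history_ms) + 1))           # older frame -> smaller weight
    return sum(w * t for w, t in zip(weights, history_ms)) / sum(weights)

def choose_transmission_mode(history_ms, threshold_ms=33.0):
    """Long predicted preprocessing -> second (direct) mode; otherwise first (synchronous) mode."""
    predicted = predict_preprocess_duration(history_ms)
    return "second" if predicted >= threshold_ms else "first"

# Example: preprocessing durations of the five most recent frames, in milliseconds
mode = choose_transmission_mode([28.0, 30.5, 31.2, 34.8, 36.1])
```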
Optionally, in an embodiment, there are a plurality of image sensors 110, and the application processor 140 is further configured to update the image acquisition parameters of each image sensor 110 synchronously.
In this embodiment of the application, new image acquisition parameters are obtained for each image sensor 110 according to the image acquisition parameter calculation method provided in any of the above embodiments, and the image acquisition parameters of all the image sensors are then updated synchronously according to their respective new values.
The application also provides an image processing method applied to the electronic device provided above. Referring to fig. 6 and fig. 1, the flow of the image processing method may be as follows:
in 201, image data is acquired by the image sensor 110 according to the configured image acquisition parameters;
in 202, the image signal preprocessor 120 collects statistics of state information in the image data used for updating the image acquisition parameters of the image sensor 110 to obtain first state information, and preprocesses the image data to obtain preprocessed image data;
in 203, the image signal processor 130 collects statistics of state information in the preprocessed image data used for updating the image acquisition parameters of the image sensor 110 to obtain second state information;
in 204, the application processor 140 fuses the first state information and the second state information to obtain target state information, and updates the image acquisition parameters of the image sensor 110 according to the target state information.
In the embodiment of the present application, the image sensor 110 is connected to the image signal preprocessor 120, and is configured to collect image data according to the configured image collection parameters, and transmit the collected image data to the image signal preprocessor 120. It should be noted that, the image data collected by the image sensor 110 is image data in a RAW format, and the collected image data in the RAW format is transferred to the image signal preprocessor 120.
The image acquisition parameters include, but are not limited to, exposure parameters, focusing parameters, white balance parameters and the like. For example, before starting the acquisition of image data, the application processor 140 configures the initial exposure parameters to the image sensor 110 so that the image sensor 110 acquires image data according to the initial exposure parameters and transmits the acquired image data to the image signal preprocessor 120.
It should be noted that, in the embodiments of the application, the connection manner between the image signal preprocessor 120 and the image sensor 110 is not limited; for example, the image signal preprocessor 120 and the image sensor 110 may be connected through a Mobile Industry Processor Interface (MIPI).
When transmitting the image data to the image signal preprocessor 120, the image sensor 110 encapsulates the image data into a plurality of image data packets and transmits the image data packets to the image signal preprocessor. The image data packet includes a header field, a trailer field, and a data field, where the header field and the trailer field are used to fill some necessary control information, such as synchronization information, address information, error control information, and the like, and the data field is used to fill actual image content.
On the other hand, the image signal preprocessor 120 receives image data from the image sensor 110. In addition, the image signal preprocessor 120 is further connected to the image signal processor 130, where in the embodiment of the application, the connection mode between the image signal processor 130 and the image signal preprocessor 120 is not limited, for example, the image signal processor 130 and the image signal preprocessor 120 may also be connected through MIPI.
The image signal pre-processor 120, after receiving the image data transmitted from the image sensor 110, uses the image data to calculate the state information required by the application processor 140 to update the image acquisition parameters of the image sensor 110, including but not limited to brightness information, sharpness information, contrast information, and the like, and records the state information calculated using the image data as the first state information. In addition, the image signal preprocessor 120 further preprocesses the image data from the image sensor 110 according to a configured preprocessing strategy, improves the image quality of the image data, and accordingly obtains preprocessed image data. It should be noted that the preprocessing of the image data by the image signal preprocessor 120 does not change the format of the image data, i.e. the preprocessed image data obtained by the preprocessing is still in the RAW format.
After finishing the preprocessing of the image data and obtaining the preprocessed image data, the image signal preprocessor 120 transmits the preprocessed image data to the image signal processor 130.
The image signal processor 130, upon receiving the pre-processed image data from the image signal pre-processor 120, uses the pre-processed image data to calculate state information including, but not limited to, brightness information, sharpness information, contrast information, etc., required for the application processor 140 to update the image acquisition parameters of the image sensor 110, and records the state information calculated using the pre-processed image data as second state information.
Further, for the first state information, the image signal preprocessor 120 may directly transmit the first state information to the application processor 140 or transmit the first state information to the application processor 140 via the image signal processor 130.
After the image signal preprocessor 120 counts to obtain the first state information and the image signal processor 130 counts to obtain the second state information, the application processor 140 further fuses the first state information and the second state information to obtain new state information according to the configured fusing strategy, and marks the new state information as the target state information. It can be seen that the target state information carries state information before and after image data preprocessing to a certain extent, so that shooting states aiming at actual shooting scenes can be more accurately reflected, and image acquisition parameters calculated by using the target state information are more accurate. Accordingly, the application processor 140 calculates a new image acquisition parameter according to a configured image acquisition parameter calculation algorithm (such as an automatic exposure algorithm, an automatic white balance algorithm, an automatic focusing algorithm, etc.), using the fused target state information.
The application processor 140 is further connected to the image sensor 110 for controlling the image sensor 110 to start and end capturing image data. After the new image acquisition parameters are calculated, the application processor updates the image acquisition parameters of the image sensor 110 to the newly calculated image acquisition parameters. The image acquisition parameters of the image sensor 110 are continuously updated until the image acquisition parameters of the image sensor 110 are converged, so that the optimal image acquisition effect is obtained.
It should be noted that, in the embodiment of the present application, the connection manner between the application processor 140 and the image sensor 110 is not limited specifically, and may be configured by one of ordinary skill in the art according to the actual situation, for example, in the embodiment of the present application, the application processor 140 and the image sensor 110 are connected through an I2C bus.
Optionally, in an embodiment, the fusing the first state information and the second state information by the application processor 140 to obtain the target state information includes:
acquiring, by the application processor 140, a first weight value corresponding to the first state information, and acquiring a second weight value corresponding to the second state information;
and obtaining the target state information by the application processor 140 performing a weighted summation according to the first state information, the first weight value, the second state information, and the second weight value.
The application further provides an optional fusion strategy for the first state information and the second state information. The application processor 140 fuses the first state information and the second state information by weighted summation and uses the weighted result as the fused target state information, which can be expressed as:
S' = a*S1 + b*S2
where S' denotes the fused target state information, S1 denotes the first state information, S2 denotes the second state information, a denotes the weight assigned to the first state information, and b denotes the weight assigned to the second state information.
It should be noted that the embodiments of the application do not specifically limit how the weights of the first state information and the second state information are configured, and they can be set by a person of ordinary skill in the art according to actual needs. For example, static weights may be configured for the first state information and the second state information, i.e. fixed weights may be allocated in advance; alternatively, dynamic weights may be configured, i.e. the weights of the first state information and the second state information change dynamically according to a specified condition.
For example, the weight of the first state information may be configured to be 0.9, and the weight of the second state information may be configured to be 0.1;
for another example, the weights of the first state information and the second state information may be allocated dynamically according to the preprocessing performed by the image signal preprocessor: the greater the difference between the image data before and after the preprocessing, the greater the weight configured for the first state information and the smaller the weight configured for the second state information; the smaller the difference, the smaller the weight configured for the first state information and the greater the weight configured for the second state information.
It should be noted that the first state information and the second state information each include state information of a plurality of different dimensions, and the application processor 140 applies the above fusion policy to each dimension separately, for example fusing the brightness in the first state information with the brightness in the second state information, fusing the sharpness in the first state information with the sharpness in the second state information, and so on.
Optionally, in an embodiment, the image processing method provided by the present application further includes:
transmitting the first state information and the pre-processed image data to the image signal processor 130 through the image signal pre-processor 120;
post-processing the pre-processed image data by the image signal processor 130 to obtain post-processed image data;
transmitting the post-processed image data, the first state information, the second state information, and the first indication information to the application processor 140 through the image signal processor 130;
the target state information is obtained by the application processor 140 fusing the first state information and the second state information according to the first indication information.
Referring to fig. 3, in the embodiment of the application, the image signal preprocessor 120 performs statistics to obtain first status information of the image data from the image sensor 110, and performs preprocessing on the image data to obtain preprocessed image data, and then transmits the first status information and the preprocessed image data to the image signal processor 130. For example, the image signal preprocessor 120 transmits the first state information obtained by statistics and the preprocessed image data obtained by preprocessing to the image signal processor 130 based on the MIPI interface between it and the image signal processor 130.
After receiving the first state information and the pre-processed image data from the image signal pre-processor 120, the image signal processor 130 performs post-processing on the pre-processed image data to obtain post-processed image data in addition to performing state statistics on the pre-processed image data to obtain second state information.
The post-processing performed by the image signal processor 130 is not specifically limited in the embodiments of the application, subject to the constraint that it is differentiated from the preprocessing of the image signal preprocessor 120 (i.e., an optimization already performed by the image signal preprocessor 120 is not performed again by the image signal processor 130), and can be configured by a person of ordinary skill in the art according to actual needs. For example, the post-processing performed by the image signal processor 130 may include strong-light suppression, backlight compensation, color enhancement, lens shading correction, and the like, and may also include optimization processing not listed in the application.
The image signal processor 130 generates first indication information after finishing post-processing of the pre-processed image data and obtaining the post-processed image data, and statistics of the second state information, and sends the post-processed image data, the first state information, the second state information, and the first indication information to the application processor 140.
Accordingly, the application processor 140 fuses the first state information and the second state information according to the first indication information to obtain the target state information.
Optionally, in an embodiment, the image processing method provided by the present application further includes:
when the post-processed image data is dynamic image data, previewing the post-processed image data and/or performing video encoding based on the post-processed image data by the application processor 140; or,
when the post-processed image data is a static image, performing image encoding by the application processor 140 based on the post-processed image data.
It should be noted that the image type does not change with the processing of the image data: if the original image data is a static image, the corresponding preprocessed/post-processed image data is also a static image; if the original image data is a dynamic image, the corresponding preprocessed/post-processed image data is also a dynamic image. A static image is a single-frame image captured in real time, whereas a dynamic image is a frame in an image sequence acquired during preview or during video recording.
Optionally, in an embodiment, transmitting the first status information and the pre-processed image data to the image signal processor 130 by the image signal pre-processor 120 includes:
when the current transmission mode is configured as the first transmission mode, the first status information and the pre-processed image data are transmitted to the image signal processor 130 through the image signal pre-processor 120.
It should be noted that two alternative transmission modes, namely, the first transmission mode and the second transmission mode, are provided in the embodiment of the present application.
When the current transmission mode is configured as the first transmission mode (also called the synchronous transmission mode), the image signal preprocessor 120 transmits the first state information and the preprocessed image data to the image signal processor 130, so that the first state information reaches the application processor 140 via the image signal processor 130. In addition, the image signal processor 130 transmits the post-processed image data obtained by post-processing the preprocessed image data, the second state information obtained by statistics, and the generated first indication information to the application processor 140 together with the first state information. For details, reference may be made to the related description in the above embodiments, which is not repeated here.
Optionally, in an embodiment, the image processing method provided by the present application further includes:
when the current transmission mode is configured as the second transmission mode, transmitting the first status information and the second indication information to the application processor 140 through the image signal preprocessor 120;
the target state information is obtained by fusing the first state information and the second state information according to the second instruction information by the application processor 140.
Referring to fig. 4, in the embodiment of the application, the image signal preprocessor 120 is further connected to the application processor 140, and the image signal preprocessor 120 directly transmits the first status information to the application processor 140 when the current transmission mode is configured as the second transmission mode.
On the other hand, the image signal preprocessor 120 also transmits preprocessed image data obtained by preprocessing the aforementioned image data to the image signal processor 130. Accordingly, the image signal processor 130 performs a post-process on the pre-processed image data to obtain post-processed image data, in addition to performing a state statistic on the pre-processed image data to obtain second state information.
The post-processing performed by the image signal processor 130 is not specifically limited in the embodiments of the application, subject to the constraint that it is differentiated from the preprocessing of the image signal preprocessor 120 (i.e., an optimization already performed by the image signal preprocessor 120 is not performed again by the image signal processor 130), and can be configured by a person of ordinary skill in the art according to actual needs. For example, the post-processing performed by the image signal processor 130 may include strong-light suppression, backlight compensation, color enhancement, lens shading correction, and the like, and may also include optimization processing not listed in the application.
The image signal processor 130 performs post-processing on the pre-processed image data to obtain post-processed image data, and sends the post-processed image data and the second state information to the application processor 140 after obtaining the second state information by statistics.
The application processor 140 is further configured to fuse the first state information and the second state information according to the second indication information to obtain the target state information.
Optionally, in an embodiment, the image processing method provided by the present application further includes:
analyzing according to the historical image data to obtain an analysis result;
and configuring the transmission mode as a first mode or a second mode according to the analysis result.
The historical image data is image data acquired before the current image data. The image signal preprocessor 120 analyzes the historical image data according to the configured analysis policy to obtain an analysis result, and then configures the transmission mode as the first mode or the second mode according to the analysis result.
Illustratively, the historical image data is image data acquired by the image sensor 110 before the image data; since the historical image data and the image data are acquired continuously, their image content is substantially the same. The image signal preprocessor 120 obtains the preprocessing durations of a preset number of frames of historical image data, predicts the preprocessing duration of the image data from this preset number of preprocessing durations, and judges whether the predicted preprocessing duration reaches a preset duration: if it does, the transmission mode is configured as the second transmission mode; if it does not, the transmission mode is configured as the first transmission mode. The preset duration can be set by a person of ordinary skill in the art according to actual needs.
For example, when predicting the preprocessing duration of the image data from the preset number of preprocessing durations, the image signal preprocessor 120 may calculate the average of the preset number of preprocessing durations and use it as the predicted preprocessing duration of the image data.
For another example, when predicting the preprocessing duration of the image data from the preset number of preprocessing durations, the image signal preprocessor 120 may perform a weighted summation of the preset number of preprocessing durations and use the resulting weighted sum as the predicted preprocessing duration of the image data, where, among the preset number of preprocessing durations, the earlier the corresponding historical image data was acquired relative to the image data, the smaller its weight.
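The two prediction strategies and the mode-selection threshold described above can be sketched in Python as follows; the preset duration value, the weighting scheme, and the function names are assumptions for illustration only.

def predict_average(durations):
    """Predicted preprocessing duration as the plain average of the preset
    number of historical preprocessing durations."""
    return sum(durations) / len(durations)

def predict_weighted(durations):
    """Recency-weighted prediction: durations are ordered from oldest to newest,
    and older historical image data receives a smaller weight."""
    weights = range(1, len(durations) + 1)  # older frames get smaller weights
    total = sum(weights)
    return sum(w * d for w, d in zip(weights, durations)) / total

def choose_transmission_mode(durations, preset_duration_ms=8.0):
    """Configure the transmission mode from the predicted preprocessing duration:
    second mode if it reaches the preset duration, otherwise first mode."""
    predicted = predict_weighted(durations)
    return "second" if predicted >= preset_duration_ms else "first"

# Example: preprocessing durations (ms) of the last four frames, oldest first.
history_ms = [6.5, 7.0, 8.5, 9.0]
print(choose_transmission_mode(history_ms))  # -> 'second'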
Optionally, in an embodiment, there are a plurality of image sensors 110, and the image processing method provided by the present application further includes:
synchronously updating, by the application processor 140, the image acquisition parameters of each image sensor 110.
In this embodiment of the present application, new image acquisition parameters are obtained for each image sensor 110 according to the calculation method of the image acquisition parameters provided in any of the above embodiments, and the image acquisition parameters of all the image sensors 110 are then updated synchronously with their respective new values.
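A minimal sketch of such a synchronous update is given below, assuming a hypothetical sensor interface and parameter names; the mapping from the target state information to new parameters is only a placeholder and is not the calculation method of the present application.

class ImageSensor:
    def __init__(self, sensor_id):
        self.sensor_id = sensor_id
        self.params = {"exposure_time_us": 10000, "analog_gain": 1.0}

    def apply(self, new_params):
        self.params.update(new_params)

def compute_new_params(target_state):
    """Derive new image acquisition parameters from the target state information
    (placeholder mapping for illustration only)."""
    return {
        "exposure_time_us": int(10000 * (0.5 / max(target_state["exposure"], 1e-6))),
        "analog_gain": target_state.get("gain", 1.0),
    }

def update_all_sensors_synchronously(sensors, per_sensor_target_states):
    """Compute new parameters for every sensor first, then apply them in one pass,
    so all image sensors are updated for the same upcoming frame."""
    new_params = {s.sensor_id: compute_new_params(per_sensor_target_states[s.sensor_id])
                  for s in sensors}
    for s in sensors:  # apply together, only after all values are computed
        s.apply(new_params[s.sensor_id])

sensors = [ImageSensor("main"), ImageSensor("wide")]
states = {"main": {"exposure": 0.45, "gain": 1.08},
          "wide": {"exposure": 0.40, "gain": 1.02}}
update_all_sensors_synchronously(sensors, states)
print({s.sensor_id: s.params for s in sensors})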
The electronic device and the image processing method provided by the embodiments of the present application are described in detail above. Specific examples are used herein to illustrate the principles and implementations of the present application, and the description of the above embodiments is intended only to aid understanding of the present application. Meanwhile, since those skilled in the art may make changes to the specific implementations and application scope in light of the ideas of the present application, the contents of this description should not be construed as limiting the present application.

Claims (10)

1. An electronic device, comprising:
the image sensor is used for acquiring image data according to the configured image acquisition parameters;
an image signal preprocessor, configured to perform statistics on state information in the image data for updating the image acquisition parameters of the image sensor to obtain first state information, and to preprocess the image data to obtain preprocessed image data;
an image signal processor, configured to perform statistics on state information in the preprocessed image data for updating the image acquisition parameters of the image sensor to obtain second state information;
and the application processor is used for fusing the first state information and the second state information to obtain target state information and updating the image acquisition parameters of the image sensor according to the target state information.
2. The electronic device of claim 1, wherein the application processor is configured to:
acquiring a first weight value corresponding to the first state information and acquiring a second weight value corresponding to the second state information;
and carrying out weighted summation according to the first state information, the first weight value, the second state information and the second weight value to obtain the target state information.
3. The electronic device of claim 1, wherein the image signal preprocessor is further configured to transmit the first state information and the preprocessed image data to the image signal processor;
the image signal processor is further configured to post-process the preprocessed image data to obtain post-processed image data; and
transmit the post-processed image data, the first state information, the second state information, and first indication information to the application processor;
the application processor is further configured to fuse the first state information and the second state information according to the first indication information to obtain the target state information.
4. The electronic device of claim 3, wherein the image signal preprocessor is further configured to transmit the first state information and the preprocessed image data to the image signal processor when a current transmission mode is configured as a first transmission mode.
5. The electronic device of claim 4, wherein the image signal preprocessor is further configured to send the first state information and second indication information to the application processor when a current transmission mode is configured as a second transmission mode;
the application processor is further configured to fuse the first state information and the second state information according to the second indication information to obtain the target state information.
6. The electronic device of claim 4 or 5, wherein the image signal preprocessor is further configured to:
analyze the historical image data to obtain an analysis result; and
configure the transmission mode as the first transmission mode or the second transmission mode according to the analysis result.
7. The electronic device of any one of claims 1 to 5, wherein there are a plurality of image sensors, and the application processor is further configured to synchronously update the image acquisition parameters of each image sensor.
8. An image processing method applied to an electronic device, wherein the electronic device comprises an image sensor, an image signal preprocessor, an image signal processor and an application processor, the image processing method comprising:
Acquiring image data by the image sensor according to the configured image acquisition parameters;
performing statistics, by the image signal preprocessor, on state information in the image data for updating the image acquisition parameters of the image sensor to obtain first state information, and preprocessing the image data to obtain preprocessed image data;
performing statistics, by the image signal processor, on state information in the preprocessed image data for updating the image acquisition parameters of the image sensor to obtain second state information;
and fusing the first state information and the second state information through the application processor to obtain target state information, and updating image acquisition parameters of the image sensor according to the target state information.
9. The image processing method according to claim 8, characterized by further comprising:
transmitting, by the image signal preprocessor, the first state information and the preprocessed image data to the image signal processor when the current transmission mode is configured as a first transmission mode;
post-processing, by the image signal processor, the preprocessed image data to obtain post-processed image data; and sending the post-processed image data, the first state information, the second state information, and first indication information to the application processor;
and fusing, by the application processor, the first state information and the second state information according to the first indication information to obtain the target state information.
10. The image processing method according to claim 8, characterized by further comprising:
transmitting, by the image signal preprocessor, the first state information and second indication information to the application processor when the current transmission mode is configured as a second transmission mode;
and fusing, by the application processor, the first state information and the second state information according to the second indication information to obtain the target state information.
CN202010682841.0A 2020-07-15 2020-07-15 Electronic device and image processing method Active CN114007009B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010682841.0A CN114007009B (en) 2020-07-15 2020-07-15 Electronic device and image processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010682841.0A CN114007009B (en) 2020-07-15 2020-07-15 Electronic device and image processing method

Publications (2)

Publication Number Publication Date
CN114007009A (en) 2022-02-01
CN114007009B (en) 2023-08-18

Family

ID=79920183

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010682841.0A Active CN114007009B (en) 2020-07-15 2020-07-15 Electronic device and image processing method

Country Status (1)

Country Link
CN (1) CN114007009B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107147837A (en) * 2017-06-30 2017-09-08 维沃移动通信有限公司 The method to set up and mobile terminal of a kind of acquisition parameters
CN108370414A (en) * 2016-10-29 2018-08-03 华为技术有限公司 A kind of image pickup method and terminal
WO2019056242A1 (en) * 2017-09-21 2019-03-28 深圳传音通讯有限公司 Camera photographing parameter setting method for smart terminal, setting device, and smart terminal
CN110022420A (en) * 2019-03-13 2019-07-16 华中科技大学 A kind of image scanning system based on CIS, method and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9363446B2 (en) * 2013-04-15 2016-06-07 Htc Corporation Automatic exposure control for sequential images

Also Published As

Publication number Publication date
CN114007009A (en) 2022-02-01

Similar Documents

Publication Publication Date Title
CN109005366B (en) Night scene shooting processing method and device for camera module, electronic equipment and storage medium
CN110072051B (en) Image processing method and device based on multi-frame images
CN109842753B (en) Camera anti-shake system, camera anti-shake method, electronic device and storage medium
US11532076B2 (en) Image processing method, electronic device and storage medium
US20110043663A1 (en) Imaging terminal, display terminal, display method, and imaging system
US20180007292A1 (en) Imaging device, imaging method, and image processing device
CN111107276B (en) Information processing apparatus, control method thereof, storage medium, and imaging system
CN101815175B (en) Information processing apparatus and method for controlling information processing apparatus, imaging apparatus, and method for correcting images
EP2141654A1 (en) System and method for efficiently performing image processing operations
CN110290325B (en) Image processing method, image processing device, storage medium and electronic equipment
CN112822371B (en) Image processing chip, application processing chip, data statistical system and method
KR20140080815A (en) Photographing apparatus, method for controlling the same, and computer-readable storage medium
JP4499908B2 (en) Electronic camera system, electronic camera, server computer, and photographing condition correction method
CN104811601B (en) A kind of method and apparatus for showing preview image
CN114007009B (en) Electronic device and image processing method
EP4033750A1 (en) Method and device for processing image, and storage medium
CN112702588B (en) Dual-mode image signal processor and dual-mode image signal processing system
CN109379535A (en) Image pickup method and device, electronic equipment, computer readable storage medium
US20160198077A1 (en) Imaging apparatus, video data transmitting apparatus, video data transmitting and receiving system, image processing method, and program
CN114143471A (en) Image processing method, system, mobile terminal and computer readable storage medium
CN113766142B (en) Image processing apparatus, image signal preprocessing module, device, and processing method
JP2022045567A (en) Imaging control device, imaging control method, and program
CN113792708B (en) ARM-based remote target clear imaging system and method
CN113873142A (en) Multimedia processing chip, electronic device and dynamic image processing method
CN113747145B (en) Image processing circuit, electronic apparatus, and image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant