CN111491101B - Image processor, image processing method, photographing device, and electronic apparatus - Google Patents


Info

Publication number
CN111491101B
Authority
CN
China
Prior art keywords
image
algorithm
module
processing
scene detection
Prior art date
Legal status
Active
Application number
CN202010313755.2A
Other languages
Chinese (zh)
Other versions
CN111491101A
Inventor
***
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010313755.2A
Publication of CN111491101A
Application granted
Publication of CN111491101B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/60 Control of cameras or camera modules
    • H04N23/64 Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • H04N23/80 Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The application discloses an image processor, an image processing method, a shooting device, and an electronic device. The image processor comprises a hardware abstraction module, an application program module, and an algorithm post-processing module. The hardware abstraction module is used for receiving a RAW image, converting the RAW image into a YUV image, and transmitting the YUV image. The application program module is used for connecting with the hardware abstraction module. The algorithm post-processing module is connected with the hardware abstraction module through the application program module. The algorithm post-processing module stores a scene detection algorithm and is used for processing the YUV image with the scene detection algorithm to obtain a scene detection result. Because the algorithm post-processing module, rather than the hardware abstraction module, performs the scene detection processing on the YUV image, no flow truncation is needed in the algorithm framework of the hardware abstraction module; the coupling between the scene detection processing and the hardware abstraction module is therefore greatly reduced, which helps lower design difficulty and research and development cost.

Description

Image processor, image processing method, photographing device, and electronic apparatus
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processor, an image processing method, a photographing apparatus, and an electronic device.
Background
Currently, scene detection is performed inside the hardware abstraction module (Hardware Abstraction Layer, HAL), so the hardware abstraction module and the scene detection processing are tightly coupled. When related functions of a product need to change, many adaptive changes must be made to the internal flow of the hardware abstraction module; this coupling creates great design difficulty and consumes research and development cost.
Disclosure of Invention
The embodiment of the application provides an image processor, an image processing method, a shooting device and electronic equipment.
The image processor of the embodiment of the application comprises a hardware abstraction module, an application program module and an algorithm post-processing module. The hardware abstraction module is used for receiving a RAW image, converting the RAW image into a YUV image and transmitting the YUV image. The application module is used for being connected with the hardware abstraction module. The algorithm post-processing module is connected with the hardware abstraction module through the application program module. The algorithm post-processing module is internally stored with a scene detection algorithm and is used for processing the YUV image by adopting the scene detection algorithm to obtain a scene detection result.
The image processing method of the embodiment of the application comprises the following steps: the hardware abstraction module receives a RAW image, converts the RAW image into a YUV image and transmits the YUV image to an application program module; and the algorithm post-processing module processes the YUV image by adopting a scene detection algorithm to obtain a scene detection result.
The shooting device comprises an image processor and an image sensor, wherein the image sensor is connected with the image processor. The image processor comprises a hardware abstraction module, an application program module and an algorithm post-processing module. The hardware abstraction module is used for receiving a RAW image, converting the RAW image into a YUV image and transmitting the YUV image. The application program module is used for being connected with the hardware abstraction module. The algorithm post-processing module is connected with the hardware abstraction module through the application program module. The algorithm post-processing module is internally stored with a scene detection algorithm and is used for processing the YUV image by adopting the scene detection algorithm to obtain a scene detection result.
The electronic equipment of the embodiment of the application comprises a shooting device and a shell, wherein the shooting device is combined with the shell. The shooting device comprises an image processor and an image sensor, and the image sensor is connected with the image processor. The image processor comprises a hardware abstraction module, an application program module and an algorithm post-processing module. The hardware abstraction module is used for receiving a RAW image, converting the RAW image into a YUV image and transmitting the YUV image. The application program module is used for being connected with the hardware abstraction module. The algorithm post-processing module is connected with the hardware abstraction module through the application program module. The algorithm post-processing module is internally stored with a scene detection algorithm and is used for processing the YUV image by adopting the scene detection algorithm to obtain a scene detection result.
In the image processor, the image processing method, the shooting device, and the electronic device described above, the hardware abstraction module does not perform scene detection algorithm processing on the YUV image; the algorithm post-processing module does. The scene detection algorithm is therefore implemented externally, without flow truncation of the hardware abstraction module's own algorithm framework, which greatly reduces the coupling between the scene detection processing and the hardware abstraction module, reduces the difficulty of both the design process and design changes, and lowers research and development cost.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic view of a camera according to some embodiments of the present application;
FIG. 2 is a schematic structural diagram of an electronic device according to some embodiments of the present application;
FIG. 3 is a schematic illustration of a preview algorithm of some embodiments of the present application;
FIG. 4 is a schematic diagram of a scene detection algorithm according to some embodiments of the present application;
FIG. 5 is a schematic diagram of the communication of an algorithm post-processing module and an application module of some embodiments of the present application;
FIG. 6 is a schematic view of a camera according to some embodiments of the present application;
FIG. 7 is a schematic diagram of the communication of an algorithm post-processing module with a hardware abstraction module, an application module, according to some embodiments of the present application;
FIG. 8 is a schematic diagram of an algorithmic post-processing module in accordance with certain embodiments of the present application;
FIG. 9 is a schematic view of a camera according to some embodiments of the present application;
fig. 10 to 17 are schematic flow charts of image processing methods according to some embodiments of the present application.
Detailed Description
Reference will now be made in detail to the embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the accompanying drawings are illustrative and are only for the purpose of explaining the present application and are not to be construed as limiting the present application.
The following disclosure provides many different embodiments or examples for implementing different configurations of embodiments of the application. In order to simplify the disclosure of the embodiments of the present application, the components and arrangements of specific examples are described below. Of course, they are merely examples and are not intended to limit the present application.
Referring to fig. 1 and fig. 2, an electronic device 1000 is provided. The electronic apparatus 1000 includes the photographing device 100 and the housing 200, and the photographing device 100 is combined with the housing 200. The housing 200 may serve as a mounting carrier for functional elements of the electronic apparatus 1000. The housing 200 may provide protection against dust, falling, water, etc. for functional elements, such as a display screen, the camera 100, a receiver, etc.
Referring to fig. 1 and fig. 2, the present application further provides a camera 100. The photographing apparatus 100 includes an image processor 10 and an image sensor 20. The image processor 10 is connected to the image sensor 20. The image sensor 20 may include an image acquisition unit (sensor) 22 and a RAW image data unit (IFE) 24. The image acquisition unit 22 may be configured to receive light to acquire image data (a RAW image), and the RAW image data unit 24 may be configured to transmit the image data acquired by the image acquisition unit 22 to the image processor 10. The RAW image data unit 24 may process the RAW image acquired by the image acquisition unit 22 and output the processed RAW image to the image processor 10.
Referring to fig. 1, an image processor 10 is further provided in the present embodiment. The image processor 10 includes a hardware abstraction module 12, an application program module (APP) 14, and an algorithm post-processing module (APS) 16. The hardware abstraction module 12 is configured to receive a RAW image, convert the RAW image into a YUV image, and transmit the YUV image. The application module 14 is used to interface with the hardware abstraction module 12. The algorithm post-processing module 16 is adapted to interface with the hardware abstraction module 12 via the application module 14. The algorithm post-processing module 16 stores a scene detection algorithm therein, and is configured to process the YUV image by using the scene detection algorithm to obtain a scene detection result.
In the image processor 10, the image processing method, the photographing device 100, and the electronic device 1000 according to the embodiment of the present application, the hardware abstraction module 12 does not perform scene detection algorithm processing on the YUV image, but the algorithm post-processing module 16 performs scene detection algorithm processing on the YUV image, so that the scene detection algorithm is implemented externally without performing flow truncation on the algorithm framework of the hardware abstraction module 12 itself, thereby greatly reducing the degree of coupling between the scene detection processing and the hardware abstraction module 12, facilitating reduction of difficulty in the design process and difficulty in design change, and reducing the development cost.
The hardware abstraction module 12 is configured to receive a RAW image, convert the RAW image into a YUV image, and transmit the RAW image and/or the YUV image. The hardware abstraction module 12 may be connected to the image sensor 20. Specifically, the hardware abstraction module 12 may include a buffer queue 122 connected to the image sensor 20, a RAW-to-RGB processing unit (BPS) 124, and a denoising and YUV post-processing unit (Image Process Engine, IPE) 126 connected to the application module 14. The buffer queue 122 is used for buffering the RAW image from the image sensor 20 and transmitting the RAW image to the algorithm post-processing module 16 through the application module 14. The RAW-to-RGB processing unit 124 is configured to convert the RAW image from the buffer queue 122 into an RGB image. The denoising and YUV post-processing unit 126 is configured to process the RGB image to obtain a YUV image and transmit the YUV image to the algorithm post-processing module 16 through the application module 14. The hardware abstraction module 12 may also transmit metadata of the image data; the metadata includes 3A information (automatic exposure control AE, automatic focus control AF, automatic white balance control AWB), picture information (e.g., image width and height), exposure parameters (aperture size, shutter speed, and sensitivity (ISO)), and the like, and the metadata may assist the post-photographing processing (e.g., at least one of beauty processing, filter processing, rotation processing, watermarking processing, blurring processing, HDR processing, and multi-frame processing) of the RAW image and/or the YUV image. In one embodiment, the metadata includes sensitivity (ISO) information, according to which the brightness of the RAW image and/or the YUV image can be adjusted, thereby implementing post-photographing processing related to the adjusted brightness.
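To make the data flow above concrete, the following is a minimal C++ sketch of the three stages described in this paragraph — buffer queue 122, RAW-to-RGB unit 124, and denoising/YUV unit 126 — together with a per-frame metadata struct. All type and function names (RawImage, rawToRgb, etc.) are illustrative assumptions, not the patent's implementation; the demosaic and denoise steps are stubbed.

```cpp
#include <cstdint>
#include <queue>
#include <utility>
#include <vector>

struct Metadata {            // 3A and exposure information carried with a frame
    float aperture;          // aperture size
    float shutterSpeed;      // shutter speed in seconds
    int   iso;               // sensitivity (ISO)
    int   width, height;     // picture information
};

struct RawImage { std::vector<uint16_t> bayer; Metadata meta; };
struct RgbImage { std::vector<uint8_t>  rgb;   Metadata meta; };
struct YuvImage { std::vector<uint8_t>  yuv;   Metadata meta; };

// Buffer queue 122: caches RAW frames arriving from the image sensor.
class BufferQueue {
public:
    void push(RawImage img) { q_.push(std::move(img)); }
    bool pop(RawImage& out) {
        if (q_.empty()) return false;
        out = std::move(q_.front());
        q_.pop();
        return true;
    }
private:
    std::queue<RawImage> q_;
};

// BPS 124: converts a RAW (Bayer) image into an RGB image (demosaic stubbed).
RgbImage rawToRgb(const RawImage& raw) {
    RgbImage rgb;
    rgb.meta = raw.meta;
    rgb.rgb.resize(raw.bayer.size() * 3);   // three channels per pixel
    return rgb;
}

// IPE 126: denoises and produces the YUV image handed onward to the
// application module / algorithm post-processing module (denoise stubbed).
YuvImage rgbToYuv(const RgbImage& rgb) {
    YuvImage yuv;
    yuv.meta = rgb.meta;
    yuv.yuv.resize(rgb.rgb.size() / 2);     // e.g. NV12: 1.5 bytes per pixel
    return yuv;
}
```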
Because the hardware abstraction module 12 does not perform post-photographing processing on the RAW image and/or the YUV image (for example, only receiving the RAW image, converting the RAW image into the YUV image, and transmitting the RAW image and/or the YUV image), the image processing algorithm for post-photographing processing does not need to perform flow truncation on the algorithm framework of the hardware abstraction module 12 itself, and only needs to be externally compatible, so that the design difficulty is reduced.
In the related art, an Application Program Interface (API) establishes the hardware abstraction module 12 as a pipeline (pipeline), and since the establishment of the pipeline requires a lot of time and memory, all the pipelines used for the working mode corresponding to the camera are established first when the camera is started, and in order to implement various image processing algorithms, a lot of pipelines (for example, more than three pipelines) are generally required to be established, which may cause the startup of the camera to consume a lot of time and occupy a lot of memory. The hardware abstraction module 12 according to the embodiment of the present application does not perform post-photographing processing on the RAW image and/or the YUV image, and therefore, the hardware abstraction module 12 only needs to establish a small number of (for example, one or two) pipelines, and does not need to establish a large number of pipelines, so that the memory can be saved, and the starting speed of the camera can be increased.
The application module 14 is used to interface with the hardware abstraction module 12. The application module 14 may be configured to generate control commands according to user input and send the control commands to the image sensor 20 through the hardware abstraction module 12 to control the image sensor 20 to operate accordingly. The application module 14 may run as a 64-bit process, and the static link library (lib) of the post-photographing image processing algorithms may be built as 64-bit to improve execution speed. After receiving the RAW image and/or the YUV image transmitted by the hardware abstraction module 12, the application module 14 may perform post-photographing processing on the RAW image and/or the YUV image itself, or may transmit the RAW image and/or the YUV image to the algorithm post-processing module 16 for post-photographing processing there. It is also possible for the application module 14 to perform some post-photographing processing (e.g., beauty processing, filter processing, rotation processing, watermarking processing, blurring processing, etc.) while the algorithm post-processing module 16 performs other post-photographing processing (e.g., HDR processing, multi-frame processing, etc.). In the embodiment of the present application, the application module 14 transmits the RAW and/or YUV images to the algorithm post-processing module 16 for post-photographing processing.
The algorithm post-processing module 16 is connected to the hardware abstraction module 12 through the application program module 14, and at least a scene detection algorithm is stored in the algorithm post-processing module 16. The algorithm post-processing module 16 may also store other image processing algorithms (e.g., a preview algorithm, a beauty processing algorithm, a filter processing algorithm, a rotation processing algorithm, a watermarking processing algorithm, a blurring processing algorithm, an HDR processing algorithm, a multi-frame processing algorithm, etc.). The algorithm post-processing module 16 is configured to process the YUV image with the scene detection algorithm to implement scene detection, and may process the RAW image and/or the YUV image with the other image processing algorithms to implement other post-photographing processing. Since both the scene detection processing of the YUV image and the post-photographing processing of the RAW image and/or the YUV image can be realized by the algorithm post-processing module 16, no flow truncation is required in the hardware abstraction module 12's own algorithm framework; only external compatibility is needed, which reduces design difficulty. Moreover, because scene detection and the other post-photographing processing are concentrated in the algorithm post-processing module 16, that module has a single, focused function, which makes fast porting, simple addition of a new image processing algorithm, and modification of an existing image processing algorithm easier to achieve. Alternatively, the application module 14 may perform the scene detection algorithm and some post-photographing processing (e.g., preview algorithm processing, beauty processing, filter processing, rotation processing, watermarking processing, blurring processing, etc.) while the algorithm post-processing module 16 performs the rest (e.g., HDR processing, multi-frame processing, etc.); in that case the application module 14 also stores the scene detection algorithm and other image processing algorithms (e.g., at least one of a beauty processing algorithm, a filter processing algorithm, a rotation processing algorithm, a watermarking processing algorithm, a blurring processing algorithm, an HDR processing algorithm, and a multi-frame processing algorithm), and may process the RAW image and/or the YUV image accordingly. Since the post-photographing processing of the RAW image and/or the YUV image is then realized by the application module 14 and the algorithm post-processing module 16 together, flow truncation of the hardware abstraction module 12's algorithm architecture is still unnecessary, and the design difficulty is likewise greatly reduced.
When the algorithm post-processing module 16 processes only YUV images (e.g., all of its image processing algorithms operate on YUV images), the hardware abstraction module 12 may transmit only YUV images; when the algorithm post-processing module 16 processes both RAW and YUV images, the hardware abstraction module 12 may transmit both the RAW image and the YUV image.
Referring to fig. 1 and 3, in some embodiments, a preview algorithm is also stored in the algorithm post-processing module 16, and the algorithm post-processing module 16 is further configured to perform preview algorithm processing on the YUV image, copy the YUV image from the preview algorithm, and provide the copied YUV image for scene detection algorithm processing. Copying the YUV image out of the preview algorithm processing makes it easy for the algorithm post-processing module 16 to subsequently perform scene detection on the copy.
Specifically, referring to fig. 1 and 3, the preview algorithm may proceed as follows. First, a YUV image is acquired; then, when the scene detection algorithm needs to run, the acquired YUV image is copied and the copy is sent to the process running the scene detection algorithm of the algorithm post-processing module 16; then, the core algorithm of the preview algorithm is executed; finally, the preview algorithm is encapsulated. The core algorithm of the preview algorithm may include: (1) compressing the acquired YUV image; (2) sending the compressed YUV image to the encoding unit 162; (3) receiving the JPG image sent back by the encoding unit 162 (the encoding unit 162 converts the YUV image into a JPG image); (4) sending the JPG image to the application module 14, where the preview display of the JPG image is controlled by the application module 14.
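The preview flow above can be sketched as a single function whose steps mirror (1)-(4), with the copy hook for scene detection firing before the core preview work. This is an assumed illustration; the function names and std::function parameters are not from the patent.

```cpp
#include <cstdint>
#include <functional>
#include <vector>

using Buffer = std::vector<uint8_t>;

void runPreview(const Buffer& yuv,
                bool sceneDetectionNeeded,
                const std::function<void(Buffer)>& sendToSceneDetection,
                const std::function<Buffer(const Buffer&)>& encodeToJpg,
                const std::function<void(const Buffer&)>& displayInApp) {
    // Copy the YUV image to the scene detection process only when the scene
    // detection algorithm needs to run (e.g. on every Nth frame).
    if (sceneDetectionNeeded) {
        sendToSceneDetection(Buffer(yuv));   // explicit copy
    }
    // Core preview algorithm, step (1): compress the YUV image
    // (stubbed here as a buffer half the original size).
    Buffer compressed(yuv.size() / 2);
    // Steps (2)-(3): encoding unit 162 converts the YUV image into a JPG.
    Buffer jpg = encodeToJpg(compressed);
    // Step (4): hand the JPG to the application module for preview display.
    displayInApp(jpg);
}
```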
A preview algorithm may be stored in the algorithm post-processing module 16 of the image processor 10, the photographing device 100, and the electronic device 1000 according to the embodiment of the present application, so that a preview image can be obtained after photographing; after photographing is completed, the preview image is displayed in a certain area of the display screen of the electronic device 1000, which accords with user habits and helps improve user experience.
referring to fig. 1 and 4, in some embodiments, the post-algorithm processing module 16 is further configured to receive the copied YUV image, extract parameters of the YUV image, perform scene detection algorithm processing according to the parameters to obtain a scene detection result, and encapsulate the scene detection result.
Specifically, the flow of the scene detection algorithm in the algorithm post-processing module 16 may include: receiving the copied YUV image; extracting parameters of the YUV image; executing the core algorithm of the scene detection algorithm according to the parameters to obtain a scene detection result; and encapsulating the scene detection result and sending it to the application module 14. The algorithm post-processing module 16 receives the YUV image copied and sent out by the preview algorithm, then extracts image parameters (such as pixel size and image format) of the YUV image for the subsequent core algorithm processing. Next, the scene detection core algorithm is executed to obtain a scene detection result. Finally, the scene detection result is encapsulated and sent to the application module 14 or the hardware abstraction module 12. Because the scene detection algorithm includes receiving the copied YUV image, extracting its parameters, performing scene detection according to the parameters, and encapsulating the result, the scene type of the photographed image can be identified, which allows the image processor 10, the photographing device 100, and the electronic device 1000 according to the embodiment of the present application to perform other processing according to the scene detection result.
The scene detection core algorithm may include edge extraction, blur detection, and brightness detection, among others. For example, the core algorithm may extract scene edges in the YUV image, compare the extracted edges with scene models in a database, and take the scene model with the highest similarity as the scene detection result, that is, the type of the scene. The scene detection result may include buildings, natural scenery, figures, objects, food, and the like. The core algorithm may also include blur detection; for example, when the scene in the YUV images is detected to be blurred with long afterimages, it is judged to be a fast-moving scene. The scene detection result may include a fast-moving scene, a slow-moving scene, a static scene, and the like. The core algorithm may further include brightness detection; for example, when the average pixel brightness of the YUV image exceeds a certain threshold, the scene is judged to be an outdoor sunny scene; or a threshold may be set such that a pixel whose gray value exceeds it is judged overexposed, and when the number of overexposed pixels in the YUV image exceeds a certain count, the scene is likewise judged to be outdoors on a sunny day. The scene detection result may include outdoor sunny day, outdoor cloudy day, indoor dim-light environment, and the like.
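As one concrete (assumed) instance of the brightness-detection branch described above, the sketch below scans the luma plane of a YUV image, computes the average brightness and the fraction of overexposed pixels, and maps both to a lighting scene. All thresholds are illustrative, not values from the patent.

```cpp
#include <cstdint>
#include <string>
#include <vector>

std::string detectLightingScene(const std::vector<uint8_t>& lumaPlane) {
    constexpr int    kOverexposedGray = 240;   // per-pixel overexposure level
    constexpr double kSunnyAvg        = 170;   // average-brightness threshold
    constexpr double kOverexposedFrac = 0.05;  // allowed overexposed fraction

    long sum = 0, overexposed = 0;
    for (uint8_t y : lumaPlane) {
        sum += y;
        if (y > kOverexposedGray) ++overexposed;
    }
    double avg  = lumaPlane.empty() ? 0 : double(sum) / lumaPlane.size();
    double frac = lumaPlane.empty() ? 0 : double(overexposed) / lumaPlane.size();

    // Either a high average brightness or many overexposed pixels is taken
    // as evidence of an outdoor sunny-day scene.
    if (avg > kSunnyAvg || frac > kOverexposedFrac) return "outdoor sunny";
    if (avg > 80) return "outdoor cloudy";
    return "indoor dim light";
}
```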
In some embodiments, the scene detection result may be called back directly to the application module 14. Calling the result back directly reduces the number of data transmission nodes and thus improves transmission efficiency, so the processing speed of the image processor 10, the photographing device 100, and the electronic device 1000 according to the embodiment of the present application can be increased as a whole.
Referring to fig. 5, in other embodiments, the scene detection result may be called back to the application module 14 through the communication service module 141. By providing the communication service module 141 inside the application module 14, part of the receiving functions of the application module 14 are packaged in the communication service module 141, which favors independent packaging of the application module 14's functions and thus improves the stability of the image processor 10 during operation.
Specifically, referring to fig. 5, after the post-algorithm processing module 16 executes the scene detection algorithm to obtain a scene detection result, the scene detection result is encapsulated and called back to the application module 14 through the communication service module 141 of the application module 14.
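One plausible shape for this callback path is an observer-style communication service object inside the application module, as sketched below. The SceneDetectionResult fields and all method names here are assumptions for illustration, not the patent's interface.

```cpp
#include <functional>
#include <iostream>
#include <string>
#include <utility>

struct SceneDetectionResult {
    std::string sceneType;   // e.g. "building", "portrait", "food"
    float confidence;
};

// Communication service module 141: the part of the application module that
// receives callbacks, packaged independently for stability.
class CommunicationService {
public:
    using Handler = std::function<void(const SceneDetectionResult&)>;
    explicit CommunicationService(Handler h) : handler_(std::move(h)) {}
    // Entry point the algorithm post-processing module calls back into.
    void onSceneDetected(const SceneDetectionResult& r) { handler_(r); }
private:
    Handler handler_;
};

int main() {
    CommunicationService comm([](const SceneDetectionResult& r) {
        std::cout << "APP received scene: " << r.sceneType << "\n";
    });
    // The algorithm post-processing module encapsulates its result and
    // calls back through the communication service module.
    comm.onSceneDetected({"natural scenery", 0.92f});
}
```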
With continued reference to fig. 5, the algorithm post-processing module 16 of the embodiment of the present application may perform preview algorithm processing on the YUV image, copy the YUV image from the preview algorithm, and provide the copy for scene detection algorithm processing. In fig. 5, the algorithm post-processing module 16 may first start running the preview algorithm and then start running the scene detection algorithm; after the preview algorithm obtains the YUV image, the YUV image is copied and sent to the process of the scene detection algorithm, so that the scene detection algorithm can receive the YUV image and perform subsequent processing to obtain a scene detection result. Because the YUV image is copied out of the preview algorithm processing, the scene detection algorithm does not need to obtain the YUV image separately from the application module 14 or the hardware abstraction module 12; the number of image data transmission interfaces and the length of the image data transmission channel are reduced, transmission efficiency is improved, and the normal operation of the scene detection algorithm is unaffected. Therefore, the power consumption of the image processor 10, the photographing device 100, and the electronic device 1000 according to the embodiment of the present application can be reduced and the processing speed increased.
According to general user habits, a preview image needs to be generated after each shot, but scene detection is not necessarily required for every shot. Thus, in some embodiments, the operation of copying the YUV image may be performed once every one or several frames, and scene detection likewise runs once every one or several frames of YUV images. In other embodiments, the operation of copying the YUV image may be performed only in a specific photographing mode (e.g., a smart-recognition photographing mode). Running the scene detection algorithm once every one or several frames of YUV images reduces the power consumption of the image processor 10, the photographing device 100, and the electronic device 1000 according to the embodiment of the present application and increases the processing speed of the image processor 10.
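The every-Nth-frame policy can be captured by a tiny scheduler such as the assumed sketch below; the interval would be a tunable of a real implementation rather than a value fixed by the patent.

```cpp
#include <cstdint>

class SceneDetectionScheduler {
public:
    explicit SceneDetectionScheduler(uint32_t interval) : interval_(interval) {}
    // Returns true when this frame should be copied out for scene detection.
    bool shouldRun() { return (frameIndex_++ % interval_) == 0; }
private:
    uint32_t interval_;
    uint64_t frameIndex_ = 0;
};

// Usage: with interval 4, scene detection runs on frames 0, 4, 8, ...
// SceneDetectionScheduler sched(4);
// if (sched.shouldRun()) { /* copy YUV and dispatch to scene detection */ }
```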
Referring to fig. 1 and 5, in some embodiments, the application module 14 is configured to receive the YUV image transmitted from the hardware abstraction module 12 and transmit the YUV image to the algorithm post-processing module 16 for preview algorithm processing, which helps reduce the coupling between the preview algorithm processing and the hardware abstraction module 12, thereby reducing the difficulty of design and design modification and further lowering development cost.
Referring to fig. 6 and 7, in another embodiment, the hardware abstraction module 12 is directly connected to the algorithm post-processing module 16 and transmits the YUV image to it for preview algorithm processing, which reduces the number of data transmission nodes and shortens the data transmission distance, improving transmission efficiency and thereby increasing the overall processing speed of the image processor 10, the photographing device 100, and the electronic device 1000 according to the embodiment of the present application.
Referring to fig. 5 or fig. 7, the algorithm post-processing module 16 may include an encoding unit 162 configured to convert the YUV image into a JPG image (or a JPEG image, etc.). Specifically, when the algorithm post-processing module 16 processes a YUV image, the encoding unit 162 may directly encode the YUV image into a JPG image, thereby increasing the output speed of the image. When the algorithm post-processing module 16 processes a RAW image, it may transmit the RAW image on which post-photographing processing has been performed back to the hardware abstraction module 12 through the application module 14, for example back to the RAW-to-RGB processing unit 124; the RAW-to-RGB processing unit 124 converts the returned RAW image into an RGB image, the denoising and YUV post-processing unit 126 converts the RGB image into a YUV image, and the YUV image is transmitted again to the encoding unit 162 of the algorithm post-processing module 16 to be converted into a JPG image. In some embodiments, the algorithm post-processing module 16 may instead transmit the processed RAW image back to the buffer queue 122 through the application module 14; the returned RAW image then passes through the RAW-to-RGB processing unit 124 and the denoising and YUV post-processing unit 126 to form a YUV image, which is transmitted to the encoding unit 162 to form the JPG image. After the JPG image is formed, the algorithm post-processing module 16 may transfer the JPG image to memory for storage.
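The two encoding paths in this paragraph amount to a small dispatch on the processed image's format, sketched below with stubbed conversion functions (names and stub behavior are assumptions):

```cpp
#include <cstdint>
#include <vector>

using Buffer = std::vector<uint8_t>;

// Stub for encoding unit 162 (YUV -> JPG); a real build would call a codec.
Buffer encodeToJpg(const Buffer& yuv) { return Buffer(yuv.size() / 10); }
// Stub for the HAL round trip (BPS 124 RAW->RGB, then IPE 126 RGB->YUV).
Buffer halRawToYuv(const Buffer& raw) { return Buffer(raw.size()); }

enum class Format { RAW, YUV };

Buffer producePhoto(const Buffer& processed, Format fmt) {
    if (fmt == Format::YUV) {
        // Direct path: the encoding unit encodes the YUV result immediately,
        // which increases the output speed of the image.
        return encodeToJpg(processed);
    }
    // RAW path: the processed RAW image goes back through the hardware
    // abstraction module via the application module, becomes YUV again,
    // and only then reaches the encoding unit.
    return encodeToJpg(halRawToYuv(processed));
}
```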
Referring to fig. 1 and 5, in some embodiments, the algorithm post-processing module 16 includes an encoding unit 162, the encoding unit 162 may be configured to convert the YUV image into a JPG image, and the application module 14 is configured to perform a first operation on the JPG image according to a scene detection result.
In particular, in some embodiments, the first operation may include displaying the scene detection result on a display screen of the electronic device 1000 in which the image processor 10 is located. For example, the electronic device 1000 may be a mobile phone; after the phone takes an image and the application module 14 receives the scene detection result, the scene detection result is displayed alongside the taken image on the screen, and the two are stored together so that the user can later classify and sort the images on the phone. When the scene detection result is a building, "building" is displayed alongside the taken image on the screen and stored with it; when the scene detection result is natural scenery, the taken image is displayed together with "natural scenery" and the two are stored together; similarly, the scene detection result may also be a portrait, an object, a character, and the like. By detecting and displaying the scene, the electronic device 1000 according to the embodiment of the present application lets the user recognize the classification of the photographed object in real time after shooting. By detecting and storing the scene, when the user needs to find a certain image among a large number of stored images, the user can search by scene category and thus find the required image more quickly.
In other embodiments, the first operation may include selecting the photographing mode of the photographing apparatus 100 according to the scene detection result. The photographing mode may include a portrait mode, a scenery mode, and an animal mode. The photographing apparatus 100 may include a lens and the image sensor 20. For example, when the scene detection result is a person, the application module 14 controls the photographing mode of the photographing apparatus 100 to be the portrait mode. In the portrait mode, the focusing mode of the lens is single-point focusing, which helps highlight the person in the RAW image collected by the image sensor 20. When the scene detection result is scenery, the application module 14 controls the photographing mode to be the scenery mode. In the scenery mode, the focusing mode of the lens is multi-point focusing, which helps the RAW image acquired by the image sensor 20 be imaged clearly. When the scene detection result is outdoors on a sunny day, the application module 14 controls the photographing mode to be a sunny-day mode. In the sunny-day mode, the exposure aperture of the lens is smaller because the brightness of a sunny-day shooting environment is higher, which helps the image sensor 20 acquire a RAW image with more suitable brightness. When the scene detection result is outdoors on a cloudy day, the application module 14 controls the photographing mode to be a cloudy-day mode. In the cloudy-day mode, the exposure aperture of the lens is larger because the brightness of a cloudy-day shooting environment is lower, which likewise helps the image sensor 20 acquire a RAW image with more suitable brightness.
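The mode-selection logic of this first operation reduces to a mapping from scene detection result to photographing mode, as in the assumed sketch below; the enum values and strings mirror the examples in this paragraph but are not from the patent.

```cpp
#include <string>

enum class PhotoMode { Portrait, Scenery, Sunny, Cloudy, Default };

PhotoMode selectMode(const std::string& scene) {
    if (scene == "portrait")        return PhotoMode::Portrait; // single-point focus
    if (scene == "natural scenery") return PhotoMode::Scenery;  // multi-point focus
    if (scene == "outdoor sunny")   return PhotoMode::Sunny;    // smaller aperture
    if (scene == "outdoor cloudy")  return PhotoMode::Cloudy;   // larger aperture
    return PhotoMode::Default;
}
```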
In still other embodiments, the first operation may include performing image processing on the JPG image. The image processing may include beauty processing, skin-smoothing processing, and food filter processing. For example, when the scene detection result is food, the application module 14 applies food filter processing to the JPG image received from the algorithm post-processing module 16; the food filter processing may include increasing the color saturation of the JPG image so that the food in the JPG image looks more appealing. When the scene detection result is a portrait, the application module 14 applies skin-smoothing processing to the JPG image received from the algorithm post-processing module 16; the skin-smoothing processing may include increasing the smoothness of partial areas of the JPG image so that the skin of the portrait in the JPG image looks better.
Referring to fig. 6 and 7, in some embodiments, the hardware abstraction module 12 may be configured to perform a second operation on the YUV image according to the scene detection result. In fig. 6 and 7, the hardware abstraction module 12 may send the YUV image or the RAW image directly to the algorithm post-processing module 16; the scene detection result of the algorithm post-processing module 16's scene detection algorithm may be called back to the hardware abstraction module 12, and the hardware abstraction module 12 may then perform the second operation on the YUV image according to the scene detection result. Specifically, in some embodiments, the second operation may include adjusting the shooting parameters of the photographing device 100. For example, when the scene detection result is outdoors on a sunny day, the hardware abstraction module 12 reduces the exposure aperture of the lens of the photographing device 100 to reduce the amount of incoming light during shooting, which helps the image sensor 20 acquire a RAW image with more suitable brightness and thereby improves imaging quality. When the scene detection result is outdoors on a cloudy day, the hardware abstraction module 12 increases the exposure aperture of the lens to increase the amount of incoming light, likewise helping the image sensor 20 acquire a RAW image with more suitable brightness. When the scene detection result is a moving scene, the hardware abstraction module 12 shortens the shutter time of the photographing device 100 to reduce the exposure duration, thereby reducing blur and afterimage in the RAW image acquired by the image sensor 20 and helping improve imaging quality.
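The second operation can likewise be sketched as a mapping from scene result to shooting-parameter adjustments; the struct and scaling factors below are illustrative assumptions, not parameters from the patent.

```cpp
#include <string>

struct ShootingParams {
    float apertureScale = 1.0f;  // <1 narrows the exposure aperture
    float shutterScale  = 1.0f;  // <1 shortens the shutter (exposure) time
};

ShootingParams adjustForScene(const std::string& scene) {
    ShootingParams p;
    if (scene == "outdoor sunny") {
        p.apertureScale = 0.7f;  // reduce light intake in bright conditions
    } else if (scene == "outdoor cloudy") {
        p.apertureScale = 1.3f;  // admit more light in dim conditions
    } else if (scene == "moving") {
        p.shutterScale = 0.5f;   // faster shutter to cut blur and afterimage
    }
    return p;
}
```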
In other embodiments, the second operation may further include performing some image processing on the image, for example, when the scene detection result is an indoor dim light environment, the hardware abstraction module 12 performs brightness enhancement processing on the YUV image or the RAW image to enhance the imaging effect.
In still other embodiments, the image sensor 20 may include a gyroscope, and the hardware abstraction module 12 may send the frame number suggestion to the application module 14 according to the jitter condition of the gyroscope, the scene detection result, and the like, for example, when the jitter detected by the gyroscope is large and the scene detection result is a moving scene, the frame number suggestion sent by the hardware abstraction module 12 to the application module 14 may be: more frames are suggested to better realize post-photographing processing; when the jitter detected by the gyroscope is small and the scene detection result is a static scene, the frame number suggestion sent by the hardware abstraction module 12 to the application module 14 may be: fewer frames are suggested to reduce the amount of data transmission. That is, the number of frames that the hardware abstraction module 12 suggests to the application module 14 may be positively correlated to the jitter degree detected by the gyroscope and correlated to the scene detection result, so that the application module 14 can control the image sensor 20 to capture images at a more appropriate number of frames during the shooting process, and control the image processor 10 to process images at a more appropriate number of frames, which is beneficial to improve the imaging quality or improve the processing efficiency.
In still other embodiments, the image sensor 20 may include a gyroscope, and the hardware abstraction module 12 may also send an algorithm suggestion to the application module 14 according to the jitter condition of the gyroscope and the scene detection result. For example, when the jitter detected by the gyroscope is large and the scene detection result is a moving scene, the algorithm suggestion sent by the hardware abstraction module 12 to the application module 14 may be multi-frame processing, so that the jitter can be eliminated by multi-frame processing; when the scene type in the scene detection result is a person, the algorithm suggestion may be beauty processing to beautify the person; when the scene type is landscape, the algorithm suggestion may be HDR processing to form a high-dynamic-range landscape image. The application module 14 sends a data request to the hardware abstraction module 12 according to the frame number suggestion and the algorithm suggestion, the hardware abstraction module 12 transmits the corresponding data according to the request, and the application module 14 passes the data to the algorithm post-processing module 16 for post-photographing processing. Because the hardware abstraction module 12 in the image processor 10, the image processing method, the photographing device 100, and the electronic device 1000 according to the embodiment of the present application can send algorithm suggestions to the application module 14 according to the gyroscope's jitter condition and the scene detection result, the application module 14 can apply a suitable algorithm to post-photographing processing, which helps improve imaging quality. A combined sketch of both suggestions follows.
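The two kinds of suggestions described in the previous two paragraphs — a frame count positively correlated with gyroscope jitter, and an algorithm keyed to the scene type — can be combined in one assumed helper. The linear jitter-to-frames mapping, the clamp range, and the thresholds are all illustrative.

```cpp
#include <algorithm>
#include <string>

struct Suggestion {
    int frames;              // suggested number of frames to capture
    std::string algorithm;   // suggested post-photographing algorithm
};

Suggestion suggest(float gyroJitter, const std::string& scene) {
    Suggestion s;
    // Frame count is positively correlated with the detected jitter:
    // an assumed linear mapping clamped into [1, 8] frames.
    s.frames = std::clamp(1 + static_cast<int>(gyroJitter * 10.0f), 1, 8);
    if (gyroJitter > 0.5f && scene == "moving") s.algorithm = "multi-frame";
    else if (scene == "portrait")               s.algorithm = "beautify";
    else if (scene == "landscape")              s.algorithm = "HDR";
    else                                        s.algorithm = "none";
    return s;
}
```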
After the image sensor 20 performs one shot (exposure imaging), the shot data (RAW image) is transmitted to the hardware abstraction module 12, and after the RAW image and/or YUV image corresponding to the shot data is received by the post-algorithm processing module 16, the image sensor 20 can perform the next shot, or the image sensor 20 can be turned off, or the application module 14 can exit the application interface. Since the post-photographing processing is implemented by the post-algorithm processing module 16, after the RAW image and/or the YUV image corresponding to the photographing data is transmitted to the post-algorithm processing module 16, the post-photographing processing can be implemented only by the post-algorithm processing module 16, and at this time, the image sensor 20 and the application program module 14 may not participate in the post-photographing processing, so that the image sensor 20 can be turned off or perform the next photographing, and the application program module 14 can be turned off or quit the application interface. In this way, the photographing apparatus 100 can achieve snapshot, and the application module 14 can be closed or the application interface can be exited when the post-photographing processing is performed by the post-algorithm processing module 16, so that some other operations (for example, operations unrelated to the photographing apparatus 100, such as browsing a web page, watching a video, making a call, etc.) can be performed on the electronic device 1000, so that the user does not need to spend a lot of time waiting for the completion of the post-photographing processing, and the user can use the electronic device 1000 conveniently.
Referring to fig. 8, the algorithm post-processing module 16 may include a logic processing calling layer 164, an algorithm module interface layer 166, and an algorithm processing layer 168. The logic processing call layer 164 is used to communicate with the application module 14. The algorithm module interface layer 166 is used to maintain the algorithm interface. The algorithm processing layer 168 includes at least one image processing algorithm. The algorithm module interface layer 166 is used for performing at least one of registration, logout, call and callback on the image processing algorithm of the algorithm processing layer 168 through the algorithm interface.
The logic processing calling layer 164 may include a thread queue, and after receiving the post-photographing processing task of the RAW image and/or the YUV image, the algorithm post-processing module 16 may cache the post-photographing processing task in the thread queue for processing, where the thread queue may cache a plurality of post-photographing processing tasks, and thus, a snapshot (i.e., a snapshot mechanism) may be implemented by the logic processing calling layer 164. The logical process calling layer 164 may receive an instruction such as an initialization (init) or process (process) transmitted from the application module 14, and store the corresponding instruction and data in the thread queue. The logic processing call layer 164 makes a call of specific logic (i.e., a specific logic call combination) according to the task in the thread queue. Logical process call layer 164 may also pass the thumbnail (thumbnail) obtained by the process back to application module 14 for display (i.e., thumbnail display). In the description of the embodiments of the present application, "a plurality" means two or more unless specifically defined otherwise.
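A thread queue of this kind is conventionally a worker thread draining a mutex-protected queue, as in the assumed sketch below; tasks posted while an earlier task is still processing simply wait their turn, which is what enables the snapshot (burst) mechanism. The class and method names are illustrative.

```cpp
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <utility>

class TaskQueue {
public:
    TaskQueue() : worker_([this] { run(); }) {}
    ~TaskQueue() {
        {
            std::lock_guard<std::mutex> lk(m_);
            done_ = true;
        }
        cv_.notify_one();
        worker_.join();
    }
    // Called by the application module: buffer one post-photographing task
    // (e.g. an init or process command with its data).
    void post(std::function<void()> task) {
        {
            std::lock_guard<std::mutex> lk(m_);
            q_.push(std::move(task));
        }
        cv_.notify_one();
    }
private:
    void run() {
        for (;;) {
            std::function<void()> task;
            {
                std::unique_lock<std::mutex> lk(m_);
                cv_.wait(lk, [this] { return done_ || !q_.empty(); });
                if (done_ && q_.empty()) return;
                task = std::move(q_.front());
                q_.pop();
            }
            task();   // run one buffered post-photographing task
        }
    }
    std::queue<std::function<void()>> q_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;     // declared before worker_ so it is set first
    std::thread worker_;    // started last, after all state is initialized
};
```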
The algorithm module interface layer 166 is used for calling the algorithm interface; the calling command can also be stored in the thread queue, and the algorithm processing layer 168, on receiving a calling command from the thread queue, can parse the command's parameters to obtain the image processing algorithm to be called. When the algorithm module interface layer 166 registers an image processing algorithm, a new image processing algorithm is added to the algorithm processing layer 168; when it logs out an image processing algorithm, one of the algorithms in the algorithm processing layer 168 is deleted; when it calls an image processing algorithm, one of the algorithms in the algorithm processing layer 168 is invoked; when it calls back, the data and status after algorithm processing are transmitted back to the application module 14. A unified interface can be adopted to realize registration, logout, call, callback, and similar operations. Each image processing algorithm in the algorithm processing layer 168 is independent, so these operations can be performed on each algorithm conveniently, as in the sketch below.
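The unified interface could look like the assumed registry below, where registration inserts an algorithm, logout erases it, and call runs it and hands the processed data to a callback. This is a sketch under those assumptions, not the patent's interface definition.

```cpp
#include <functional>
#include <map>
#include <string>
#include <utility>
#include <vector>

using Buffer   = std::vector<unsigned char>;
using Algo     = std::function<Buffer(const Buffer&)>;
using Callback = std::function<void(const Buffer&)>;

class AlgorithmRegistry {
public:
    // Registration: add a new image processing algorithm to the
    // algorithm processing layer.
    void registerAlgo(const std::string& name, Algo a) {
        algos_[name] = std::move(a);
    }
    // Logout: delete one of the algorithms from the processing layer.
    void unregisterAlgo(const std::string& name) { algos_.erase(name); }
    // Call: run one algorithm and call the processed data back to the
    // application module through the supplied callback.
    bool call(const std::string& name, const Buffer& in, const Callback& cb) {
        auto it = algos_.find(name);
        if (it == algos_.end()) return false;
        cb(it->second(in));
        return true;
    }
private:
    std::map<std::string, Algo> algos_;   // each algorithm stays independent
};
```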
Referring to fig. 9, in some embodiments, the image processor 10 further includes a camera service module 18. The hardware abstraction module 12 is connected to the application module 14 through the camera service module 18. The camera service module 18 encapsulates the RAW image and/or the YUV image and transmits the encapsulated RAW image and/or YUV image to the application module 14, and transmits the RAW image returned by the application module 14 to the hardware abstraction module 12. In this way, by encapsulating the image by the camera service module 18, the efficiency of image transmission can be improved, and the security of image transmission can be improved. When the image processor 10 includes the camera service module 18, the path of data (images, metadata, etc.) transmission in the image processor 10 may be adapted, i.e., data transmitted between the hardware abstraction module 12 and the application module 14 need to pass through the camera service module 18. For example, when the hardware abstraction module 12 transmits the RAW image and/or the YUV image to the application module 14, the hardware abstraction module 12 first transmits the RAW image and/or the YUV image to the camera service module 18, and the camera service module 18 encapsulates the RAW image and/or the YUV image and transmits the encapsulated RAW image and/or YUV image to the application module 14. For another example, when the hardware abstraction module 12 transmits metadata to the application program module 14, the hardware abstraction module 12 first transmits the metadata to the camera service module 18, and the camera service module 18 encapsulates the metadata and transmits the encapsulated metadata to the application program module 14. For another example, when the hardware abstraction module 12 transmits the frame number suggestion to the application module 14, the hardware abstraction module 12 first transmits the frame number suggestion to the camera service module 18, and the camera service module 18 encapsulates the frame number suggestion and transmits the encapsulated frame number suggestion to the application module 14. For another example, when the hardware abstraction module 12 transmits the algorithm suggestion to the application module 14, the hardware abstraction module 12 first transmits the algorithm suggestion to the camera service module 18, and the camera service module 18 encapsulates the algorithm suggestion and transmits the encapsulated algorithm suggestion to the application module 14. Of course, in some embodiments, the hardware abstraction module 12 may transmit the sensitivity information, the jitter condition of the gyroscope, the AR scene detection result, and the like to the camera service module 18, and the camera service module 18 obtains the frame number suggestion and/or the algorithm suggestion according to the sensitivity information, the jitter condition of the gyroscope, the AR scene detection result, and the like, and then transmits the frame number suggestion and/or the algorithm suggestion to the application module 14.
In the description of the embodiments of the present application, it should be noted that, unless otherwise explicitly specified or limited, the term "mounted" is to be interpreted broadly, e.g., as being either fixedly attached, detachably attached, or integrally attached; may be mechanically connected, may be electrically connected or may be in communication with each other; either directly or indirectly through intervening media, either internally or in any other relationship. Specific meanings of the above terms in the embodiments of the present application can be understood by those of ordinary skill in the art according to specific situations.
Referring to fig. 1 and 10, an image processing method according to an embodiment of the present disclosure includes:
01: the hardware abstraction module 12 receives the RAW image, converts the RAW image into a YUV image, and transmits the YUV image to the application module 14; and
02: the post-algorithm processing module 16 processes the YUV image using a scene detection algorithm to obtain a scene detection result.
The image processing method according to the embodiment of the present application may be used in the image processor 10 according to the embodiment of the present application; in other words, the image processing method may be implemented by the image processor 10, wherein step 01 may be implemented by the hardware abstraction module 12 and step 02 may be implemented by the algorithm post-processing module 16.
In the image processing method according to the embodiment of the present application, the hardware abstraction module 12 does not perform scene detection algorithm processing on the YUV image, but the algorithm post-processing module 16 performs scene detection algorithm processing on the YUV image, so that the scene detection algorithm is implemented externally without performing flow truncation on the algorithm framework of the hardware abstraction module 12 itself, thereby greatly reducing the degree of coupling between the scene detection processing and the hardware abstraction module 12, facilitating reduction of difficulty in the design process and difficulty in design change, and reducing the development cost.
Referring to fig. 1 and 11, in some embodiments, the post-algorithm processing module 16 further stores a preview algorithm therein, and the image processing method further includes:
03: the algorithm post-processing module 16 executes preview algorithm processing on the YUV image;
04: the algorithm post-processing module 16 copies the YUV image from the preview algorithm;
the post-algorithm processing module 16 processes the YUV image by using a scene detection algorithm to obtain a scene detection result, including:
021: the algorithm post-processing module 16 processes the copied YUV image using a scene detection algorithm to obtain a scene detection result.
Wherein steps 03, 04, and 021 may be implemented by the post-algorithm processing module 16.
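The copy in step 04 decouples scene detection from the preview stream: the preview path keeps running while the scene detection algorithm works on a snapshot of the frame. A minimal sketch under these assumptions (all names hypothetical):

```python
import numpy as np

class AlgoPostProcessing:
    """Toy post-processing module with a preview path and a detection path."""

    def preview(self, yuv: np.ndarray) -> np.ndarray:
        # Step 03 (placeholder): mild vertical smoothing of luma for display
        out = yuv.copy()
        out[..., 0] = (out[..., 0] + np.roll(out[..., 0], 1, axis=0)) / 2.0
        return out

    def scene_detect(self, yuv: np.ndarray) -> str:
        # Stand-in for step 021; a fuller sketch appears later in this section
        return "night" if yuv[..., 0].mean() < 0.2 else "daylight"

    def run(self, yuv: np.ndarray):
        shown = self.preview(yuv)                   # step 03
        snapshot = yuv.copy()                       # step 04: copy from preview
        return shown, self.scene_detect(snapshot)   # step 021 on the copy

frame = np.random.rand(480, 640, 3).astype(np.float32)
_, scene = AlgoPostProcessing().run(frame)
print(scene)
```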
Referring to fig. 1 and 12, in some embodiments, the image processing method further includes:
05: the application module 14 receives the YUV image transmitted from the hardware abstraction module 12;
06: the application module 14 transmits the YUV image to the post-algorithm processing module 16 for preview algorithm processing.
Wherein steps 05 and 06 may be implemented by the application module 14.
Referring to fig. 1 and 13, in some embodiments, the image processing method further includes:
07: the hardware abstraction module 12 directly transmits the YUV image to the post-algorithm processing module 16 for preview algorithm processing.
Wherein step 07 may be implemented by the hardware abstraction module 12.
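Steps 05/06 (fig. 12) and step 07 (fig. 13) describe two alternative routes by which the same YUV frame reaches the preview algorithm. A toy dispatcher showing both routes, with hypothetical module stubs:

```python
class PostModule:
    def preview(self, yuv: bytes) -> None:
        print("preview processing", len(yuv), "bytes")

class AppModule:
    def __init__(self, post_module: PostModule):
        self.post = post_module

    def forward(self, yuv: bytes) -> None:
        # Steps 05 + 06: the app module receives the frame and passes it on
        self.post.preview(yuv)

post = PostModule()
app = AppModule(post)
yuv = bytes(640 * 480 * 3 // 2)   # dummy NV21-sized buffer

app.forward(yuv)   # route of fig. 12: HAL -> application module -> post-processing
post.preview(yuv)  # route of fig. 13 (step 07): HAL -> post-processing directly
```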
Referring to fig. 1 and 14, in some embodiments, the post-algorithm processing module 16 processes the copied YUV image with a scene detection algorithm to obtain a scene detection result (021), including:
0211: receiving the copied YUV image;
0212: extracting parameters of the YUV image;
0213: executing scene detection algorithm processing according to the parameters to obtain a scene detection result; and
0214: encapsulating the scene detection result.
Wherein steps 0211, 0212, 0213, and 0214 may be implemented by the algorithm post-processing module 16.
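Steps 0211–0214 suggest an extract-then-classify structure. The sketch below follows it using the three cues named in the claims (edge extraction, blur detection, and brightness detection); the Laplacian operator, the thresholds, and the JSON envelope are all assumptions made for illustration.

```python
import json
import numpy as np

def laplacian(img: np.ndarray) -> np.ndarray:
    """4-neighbour Laplacian; serves as both an edge map and a blur cue."""
    out = -4.0 * img
    out[1:, :] += img[:-1, :]
    out[:-1, :] += img[1:, :]
    out[:, 1:] += img[:, :-1]
    out[:, :-1] += img[:, 1:]
    return out

def detect_scene(yuv: np.ndarray) -> str:
    luma = yuv[..., 0]                        # 0211: use the copied frame's Y plane
    edges = laplacian(luma)                   # 0212: extract parameters
    params = {
        "edge_density": float((np.abs(edges) > 0.1).mean()),
        "blur_metric": float(edges.var()),    # low variance suggests a blurry frame
        "brightness": float(luma.mean()),
    }
    if params["brightness"] < 0.2:            # 0213: classify (invented thresholds)
        scene = "night"
    elif params["edge_density"] > 0.3:
        scene = "text"
    else:
        scene = "generic"
    return json.dumps({"scene": scene, **params})  # 0214: encapsulate the result

yuv = np.random.rand(480, 640, 3).astype(np.float32)
print(detect_scene(yuv))
```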
Referring to fig. 1 and 15, in some embodiments, the image processing method further includes:
08: The algorithm post-processing module 16 calls the scene detection result back to the application module 14 through the communication service module 141.
Wherein step 08 can be implemented by the algorithm post-processing module 16.
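The callback of step 08 can be modeled as a simple registry inside the communication service module 141: the application module registers a handler, and the post-processing module pushes the encapsulated result through it. The class and method names below are hypothetical.

```python
from typing import Callable

class CommunicationService:
    """Toy model of the communication service module 141."""

    def __init__(self) -> None:
        self._callbacks: list[Callable[[str], None]] = []

    def register(self, cb: Callable[[str], None]) -> None:
        self._callbacks.append(cb)

    def callback(self, scene_result: str) -> None:
        for cb in self._callbacks:
            cb(scene_result)

comm = CommunicationService()
comm.register(lambda result: print("application module received:", result))
comm.callback('{"scene": "night"}')  # invoked by the post-processing module
```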
Referring to fig. 1 and 16, in some embodiments, the image processing method further includes:
09: The algorithm post-processing module 16 converts the YUV image into a JPG image;
010: the application module 14 performs a first operation on the JPG image according to the scene detection result.
Wherein step 09 can be implemented by the post-algorithm processing module 16, and step 010 can be implemented by the application module 14.
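Step 09 can be sketched as a full-range BT.601 YUV-to-RGB conversion followed by JPEG encoding (via Pillow, which is an assumption; the patent names no encoder). The first operation of step 010 is application-defined, so the example simply tags the output filename with the detected scene type.

```python
import numpy as np
from PIL import Image

def yuv_to_jpg(yuv: np.ndarray, scene: str, prefix: str = "shot") -> str:
    """Step 09: full-range BT.601 YUV (floats in [0, 1]) -> JPEG file."""
    y, u, v = yuv[..., 0], yuv[..., 1] - 0.5, yuv[..., 2] - 0.5
    r = y + 1.402 * v
    g = y - 0.344136 * u - 0.714136 * v
    b = y + 1.772 * u
    rgb = np.clip(np.stack([r, g, b], axis=-1) * 255.0, 0, 255).astype(np.uint8)
    path = f"{prefix}_{scene}.jpg"   # step 010 stand-in: tag the file by scene
    Image.fromarray(rgb).save(path, quality=90)
    return path

yuv = np.random.rand(480, 640, 3).astype(np.float32)
print(yuv_to_jpg(yuv, scene="night"))
```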
Referring to fig. 1 and 17, in some embodiments, the image processing method further includes:
011: the hardware abstraction module 12 performs a second operation on the YUV image according to the scene detection result.
Wherein step 011 can be implemented by the hardware abstraction module 12.
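Per the claims, the second operation adjusts the amount of light input, adjusts the shutter speed, and/or processes the image. A toy mapping from scene type to capture parameters follows; every value is invented, since a real hardware abstraction module would negotiate exposure with the sensor driver and the 3A (AE/AF/AWB) algorithms.

```python
def second_operation(scene: str) -> dict:
    """Toy HAL reaction to a scene detection result (step 011)."""
    if scene == "night":
        # More light: wide aperture, slow shutter, extra denoising pass
        return {"aperture": "f/1.8", "shutter_s": 1 / 15,
                "processing": "multi-frame noise reduction"}
    if scene == "sports":
        # Freeze motion: fast shutter
        return {"aperture": "f/2.8", "shutter_s": 1 / 500, "processing": None}
    return {"aperture": "f/2.2", "shutter_s": 1 / 100, "processing": None}

print(second_operation("night"))
```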
The explanation of the image processor 10 in the above embodiments is also applicable to the image processing method according to the embodiment of the present application, and will not be repeated here.
In the image processor 10, the image processing method, the photographing device 100, and the electronic device 1000 according to the embodiments of the present application, the hardware abstraction module 12 does not perform the scene detection algorithm processing on the YUV image; instead, the algorithm post-processing module 16 does. The scene detection algorithm is therefore implemented outside the hardware abstraction module 12, and the flow of the algorithm framework of the hardware abstraction module 12 itself need not be truncated, which greatly reduces the coupling between the scene detection processing and the hardware abstraction module 12, lowers the design difficulty and the difficulty of design changes, and reduces the development cost.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application includes further implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in the reverse order, depending on the functionality involved, as would be understood by those skilled in the art.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, may be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a system including a processing module, or another system that can fetch and execute the instructions from the instruction execution system, apparatus, or device. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). The computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be captured electronically, for example by optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the embodiments of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, they may be implemented with any one or a combination of the following techniques known in the art: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gates, a programmable gate array (PGA), a field-programmable gate array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps of the methods in the above embodiments may be implemented by program instructions directing the relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, performs one or a combination of the steps of the method embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
In the description herein, reference to "certain embodiments" or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment or example is included in at least one embodiment of the present application. In this specification, such schematic expressions do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics described may be combined in any suitable manner in any one or more embodiments.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (17)

1. An image processor, characterized in that the image processor comprises:
a hardware abstraction module for receiving a RAW image, converting the RAW image into a YUV image, and transmitting the YUV image;
an application module for interfacing with the hardware abstraction module; and
the post-algorithm processing module is connected with the hardware abstraction module through the application program module, a scene detection algorithm is stored in the post-algorithm processing module, the post-algorithm processing module is used for processing the YUV image transmitted from the hardware abstraction module by using the scene detection algorithm to obtain a scene detection result, and calling back the scene detection result to the hardware abstraction module so as to execute a second operation on the YUV image through the hardware abstraction module according to the scene detection result, the second operation comprises at least one of adjusting the light input quantity, adjusting the shutter speed, and processing the image, the scene detection result comprises the type of the scene, and the scene detection algorithm comprises at least one of edge extraction, blur detection, and brightness detection.
2. The image processor of claim 1, wherein the post-algorithm processing module further stores a preview algorithm therein, and wherein the post-algorithm processing module is further configured to perform a preview algorithm process on the YUV image, copy the YUV image from the preview algorithm, and provide the copied YUV image to the scene detection algorithm.
3. The image processor of claim 2, wherein the application module is configured to receive the YUV images transmitted from the hardware abstraction module and to transmit the YUV images to the post-algorithm processing module for processing by the preview algorithm.
4. The image processor of claim 2, wherein the hardware abstraction module is directly connected to the post-algorithm processing module, the hardware abstraction module being configured to transmit the YUV images to the post-algorithm processing module for processing by the preview algorithm.
5. The image processor of claim 2, wherein the post-algorithm processing module is further configured to receive the copied YUV image, extract parameters of the YUV image, perform the scene detection algorithm processing according to the parameters to obtain the scene detection result, and encapsulate the scene detection result.
6. The image processor of any one of claims 1-5, wherein the scene detection result is called back to the application module via a communication service module.
7. The image processor of claim 6, wherein the post-algorithm processing module comprises an encoding unit configured to convert the YUV images into JPG images, and the application module is configured to perform a first operation on the JPG images according to the scene detection result.
8. An image processing method, characterized in that the image processing method comprises:
the hardware abstraction module receives a RAW image, converts the RAW image into a YUV image and transmits the YUV image to an application program module; and
the algorithm post-processing module processes the YUV image transmitted from the hardware abstraction module by using a scene detection algorithm to obtain a scene detection result, and calls back the scene detection result to the hardware abstraction module to execute a second operation on the YUV image according to the scene detection result through the hardware abstraction module, wherein the second operation comprises at least one of adjusting the light input quantity, adjusting the shutter speed, and processing the image, the scene detection result comprises the type of a scene, and the scene detection algorithm comprises at least one of edge extraction, blur detection, and brightness detection.
9. The image processing method according to claim 8, wherein a preview algorithm is further stored in the post-algorithm processing module, and the image processing method further comprises:
the algorithm post-processing module executes preview algorithm processing on the YUV image;
the algorithm post-processing module copies the YUV image from the preview algorithm;
wherein the algorithm post-processing module processing the YUV image by using a scene detection algorithm to obtain a scene detection result comprises: the algorithm post-processing module processing the copied YUV image by using the scene detection algorithm to obtain the scene detection result.
10. The image processing method according to claim 9, characterized in that the image processing method further comprises:
the application program module receives the YUV image transmitted from the hardware abstraction module; and
the application program module transmits the YUV image to the algorithm post-processing module for the preview algorithm processing.
11. The image processing method according to claim 9, characterized in that the image processing method further comprises:
the hardware abstraction module directly transmits the YUV image to the algorithm post-processing module for the preview algorithm processing.
12. The image processing method of claim 9, wherein the post-algorithm processing module processing the copied YUV image with the scene detection algorithm to obtain the scene detection result comprises:
receiving the copied YUV image;
extracting parameters of the YUV image;
executing the scene detection algorithm processing according to the parameters to obtain the scene detection result; and
encapsulating the scene detection result.
13. The image processing method according to any one of claims 8 to 12, characterized in that the image processing method further comprises: the scene detection result is called back to the application program module through the communication service module.
14. The image processing method according to claim 13, characterized in that the image processing method further comprises:
the algorithm post-processing module converts the YUV image into a JPG image; and
the application program module executes a first operation on the JPG image according to the scene detection result.
15. The image processing method according to claim 13, characterized in that the image processing method further comprises:
the hardware abstraction module executes a second operation on the YUV image according to the scene detection result.
16. A photographing apparatus, characterized by comprising:
the image processor of any one of claims 1 to 7; and
an image sensor connected with the image processor.
17. An electronic device, characterized in that the electronic device comprises the photographing apparatus according to claim 16 and a housing, the photographing apparatus being combined with the housing.