CN117692790A - Image data processing method and related device

Image data processing method and related device

Info

Publication number
CN117692790A
Authority
CN
China
Prior art keywords
image
camera
format
spatial alignment
abstraction layer
Prior art date
Legal status
Pending
Application number
CN202310897862.8A
Other languages
Chinese (zh)
Inventor
王京
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202310897862.8A
Publication of CN117692790A
Legal status: Pending


Landscapes

  • Studio Devices (AREA)

Abstract

The application provides an image data processing method and a related device, in which an adapter is added between a camera hardware abstraction layer and an image processing algorithm module, and the adapter converts image data in a fixed format output by the camera hardware abstraction layer into data in the format required by the image processing algorithm module. In this way, the image processing algorithm is decoupled from the camera hardware abstraction layer of the system-on-chip platform, so that the image processing algorithm does not need to be customized for the system-on-chip platform, and the processing of image data can still be realized.

Description

Image data processing method and related device
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to an image data processing method and a related device.
Background
With the development of electronic devices such as mobile phones and tablet computers, shooting functions are increasingly important. In order to obtain better shooting experience, the electronic device can process the image stream through some image processing algorithms so as to improve the picture effect of the shot image.
Currently, image processing algorithms on electronic devices depend strongly on the system on a chip (SOC) platform of the electronic device, and developers must design the image processing algorithms specifically for the SOC platform adopted on the electronic device, which results in poor adaptability of the image processing algorithms.
Disclosure of Invention
The image data processing method and the related device provided in the application can decouple the image processing algorithm from a camera hardware abstraction layer of a system-on-chip platform, so that the image processing algorithm does not need to be customized for the system-on-chip platform, and the processing of image data can still be realized.
In a first aspect, the present application provides an image data processing method applied to an electronic device, where the electronic device has a plurality of cameras and an image signal processor; the plurality of cameras comprise a first camera and a second camera, the zoom magnification range of the first camera is different from the zoom magnification range of the second camera, the electronic device runs an operating system, the operating system comprises a hardware abstraction layer, and the hardware abstraction layer comprises a camera hardware abstraction layer, an adapter and a spatial alignment transformation algorithm module; the method comprises the following steps: converting a first image in a first format output by the camera hardware abstraction layer into a third image in a second format through the adapter, and converting a second image in the first format output by the camera hardware abstraction layer into a fourth image in the second format; the first image is shot by the first camera, the second image is shot by the second camera, and the first format is different from the second format; determining, by the spatial alignment transformation algorithm module, spatial alignment transformation parameters of a third format based on the third image of the second format and the fourth image of the second format; converting, by the adapter, the spatial alignment transformation parameters of the third format to spatial alignment transformation parameters of a fourth format, the third format being different from the fourth format; issuing the spatial alignment transformation parameters of the fourth format to the image signal processor through the camera hardware abstraction layer; performing, by the image signal processor, spatial alignment transformation processing on a fifth image shot by the second camera based on the spatial alignment transformation parameters of the fourth format to obtain a sixth image after spatial alignment transformation; and displaying the sixth image.
Therefore, the embodiment of the application provides an image data processing method in which an adapter is added between the camera hardware abstraction layer and the image processing algorithm module, and the adapter converts image data in a fixed format output by the camera hardware abstraction layer into data in the format required by the image processing algorithm module. In this way, the image processing algorithm is decoupled from the system-on-chip platform, so that the image processing algorithm does not need to be customized for the system-on-chip platform, and processing of image data can still be realized.
In one possible implementation, before acquiring, by the camera hardware abstraction layer, the first image captured by the first camera and the second image captured by the second camera, the method further includes: displaying an image with a first zoom magnification shot by the first camera, wherein the first zoom magnification is within a zoom magnification range of the first camera; a first operation of setting a zoom magnification to a second zoom magnification by a user is received, wherein the second zoom magnification is within a zoom magnification range of the second camera. The method for obtaining the first image shot by the first camera and the second image shot by the second camera through the camera hardware abstraction layer specifically comprises the following steps: and responding to the first operation, and acquiring a first image shot by the first camera and a second image shot by the second camera through the camera hardware abstraction layer.
In one possible implementation, the first format includes a port number and a node buffer structure, where the node buffer structure includes a buffer address of the image data, a width and a height of the image data, and a data type of the image data, where the data type includes YUV data or RAW data; and the port number of the first image is different from the port number of the second image.
In one possible implementation, the second format includes a camera type, a buffer type, and a frame structure; the buffer type is used for indicating the resolution ratio of the image data, and the camera type of the third image is different from that of the fourth image.
In one possible implementation, the third format includes clipping information and an affine transformation matrix.
In one possible implementation, the fourth format includes image correction adjustment data.
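The following is a minimal sketch of how the first to fourth formats described in the above implementations could be represented as data structures. All type and field names are illustrative assumptions for this description only, not definitions taken from any platform API.

```cpp
#include <cstdint>
#include <vector>

// First format: port number plus a node buffer structure, as output by the
// camera hardware abstraction layer. Names are hypothetical.
enum class DataType { kYuv, kRaw };

struct NodeBuffer {
    void*    bufferAddress;  // buffer address of the image data
    uint32_t width;          // width of the image data
    uint32_t height;         // height of the image data
    DataType dataType;       // YUV data or RAW data
};

struct FirstFormatImage {
    uint32_t   portNumber;   // differs between the first-camera and second-camera streams
    NodeBuffer nodeBuffer;
};

// Second format: camera type, buffer type and frame structure, as consumed by
// the spatial alignment transformation algorithm module.
enum class CameraType { kMain, kWideAngle };
enum class BufferType { kFullRes, kQuarterRes, kSixteenthRes };  // resolution of the image data

struct Frame {
    void*    data;
    uint32_t width;
    uint32_t height;
    DataType dataType;
};

struct SecondFormatImage {
    CameraType cameraType;   // differs between the third image and the fourth image
    BufferType bufferType;
    Frame      frame;
};

// Third format: cropping information and an affine transformation matrix.
struct CropInfo { uint32_t left, top, width, height; };

struct ThirdFormatParams {
    CropInfo crop;
    float    warp[3][3];     // affine transformation (warp) matrix
};

// Fourth format: image correction adjustment (ICA) data; the actual layout is
// defined by the system-on-chip platform, so an opaque blob stands in here.
struct FourthFormatParams {
    std::vector<uint8_t> icaData;
};
```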
In one possible implementation, the camera hardware abstraction layer includes a spatial alignment transformation node; the converting, by the adapter, the first image in the first format output by the camera hardware abstraction layer into a third image in the second format, and converting the second image in the first format output by the camera hardware abstraction layer into a fourth image in the second format specifically includes: converting, by the adapter, the first image in the first format output by the spatial alignment transformation node into a third image in the second format, and converting the second image in the first format output by the spatial alignment transformation node into a fourth image in the second format.
In one possible implementation, before converting, by the adapter, the first image in the first format output by the camera hardware abstraction layer into the third image in the second format, and converting, by the adapter, the second image in the first format output by the camera hardware abstraction layer into the fourth image in the second format, the method further includes: front-end processing is performed on the first image and the second image by the image signal processor.
In one possible implementation manner, the performing, by the image signal processor, spatial alignment transformation processing on the fifth image captured by the second camera based on the spatial alignment transformation parameter in the fourth format, to obtain a sixth image after spatial alignment transformation specifically includes: performing front-end processing on a fifth image shot by the second camera through the image signal processor; and performing spatial alignment transformation processing on the fifth image subjected to front-end processing based on the spatial alignment transformation parameters by the image signal processor to obtain the sixth image subjected to spatial alignment transformation.
In one possible implementation, the front-end processing includes one or more of color correction, downsampling, demosaicing, and statistics of 3A data.
In a second aspect, the present application provides an electronic device comprising a plurality of cameras, one or more processors, and one or more memories; wherein the plurality of cameras, the one or more memories are coupled to the one or more processors, the one or more memories for storing a computer program which, when executed by the one or more processors, causes the method of any of the possible implementations of the first aspect to be performed.
In a third aspect, the present application provides another electronic device comprising one or more functional modules for performing the method in any of the possible implementations of the first aspect.
In a fourth aspect, the present application provides a chip comprising processing circuitry and interface circuitry, the interface circuitry being configured to receive instructions and transmit them to the processing circuitry, and the processing circuitry being configured to execute the instructions to perform the method in any one of the possible implementations of the above aspect.
In a fifth aspect, the present application provides a computer readable storage medium storing a computer program which, when run on a processor of an electronic device, causes the method in any one of the possible implementations of the first aspect described above to be performed.
In a sixth aspect, embodiments of the present application provide a computer program product which, when run on a processor of an electronic device, causes a method in any one of the possible implementations of the above aspect to be performed.
Drawings
Fig. 1 is a schematic hardware structure of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic diagram of a software and hardware architecture of an electronic device according to an embodiment of the present application;
fig. 3A to fig. 3D illustrate a set of zoom magnification changes that cause camera switching according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a data transmission manner between a hardware abstraction layer of a camera and a super-image engine according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a hardware abstraction layer architecture according to an embodiment of the present application;
fig. 6 is a schematic functional block diagram of an image data processing method according to an embodiment of the present application;
fig. 7 is a flowchart of an image data processing method provided in an embodiment of the present application;
fig. 8 is a schematic architecture diagram of a hardware abstraction layer according to another embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and thoroughly described below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association relation of the associated objects and indicates that three relations may exist; for example, A and/or B may indicate the three cases where only A exists, both A and B exist, and only B exists. In addition, in the description of the embodiments of the present application, "plural" means two or more.
The terms "first," "second," and the like, are used below for descriptive purposes only and are not to be construed as implying or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature, and in the description of embodiments of the present application, unless otherwise indicated, the meaning of "a plurality" is two or more.
The following describes a hardware structure of an electronic device provided in an embodiment of the present application.
Fig. 1 is a schematic hardware structure of an electronic device according to an embodiment of the present application.
As shown in fig. 1, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a camera 193, a display 194, and the like. Wherein the sensor module 180 may include a pressure sensor 180A, a distance sensor 180F, a proximity light sensor 180G, a touch sensor 180K, an ambient light sensor 180L, etc.
It should be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD). The display panel may also be manufactured using an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise and illuminance of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, or the like. Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM). The random access memory may be read directly from and written to by the processor 110, may be used to store executable programs (e.g., machine instructions) for an operating system or other on-the-fly programs, may also be used to store data for users and applications, and the like. The nonvolatile memory may store executable programs, store data of users and applications, and the like, and may be loaded into the random access memory in advance for the processor 110 to directly read and write.
Program code corresponding to the image data processing method provided by the embodiment of the application may be stored in the nonvolatile memory. In a scenario where a camera application is running, the electronic device 100 may load the program code stored in the nonvolatile memory into the random access memory and then send it to the processor 110 for execution, thereby implementing the image data processing method.
The external memory interface 120 may be used to connect external non-volatile memory to enable expansion of the memory capabilities of the electronic device 100. The external nonvolatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, a file such as a captured video is stored in an external nonvolatile memory.
The electronic device 100 may implement audio functions through an audio module 170, an application processor, and the like.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
In particular, the audio module 170 may include a speaker 170A, a receiver 170B, a microphone 170C, and an earphone interface 170D. The speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals. The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals. The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. In embodiments of the present application, after beginning to record video, the electronic device 100 may encode the audio electrical signals from the microphone 170C to obtain a video soundtrack. The earphone interface 170D is used to connect a wired earphone.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the touch operation intensity according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus. In embodiments of the present application, the electronic device 100 may determine the object distance of the image using the distance sensor 180F.
The ambient light sensor 180L is for sensing ambient light illuminance. In embodiments of the present application, the electronic device 100 may determine the illuminance of the image using the ambient light sensor 180L.
The touch sensor 180K is also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display 194.
In the embodiment of the present application, the electronic device 100 may detect user operations such as tapping and sliding on the screen through the touch detection capability provided by the touch sensor 180K, so as to control the opening and closing of applications and controls.
The electronic device 100 may be a cell phone with a camera, a digital camera, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, a cellular phone, a personal digital assistant (personal digital assistant, PDA), an augmented reality (augmented reality, AR) device, a Virtual Reality (VR) device, an artificial intelligence (artificial intelligence, AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device, and the specific type of the electronic device 100 is not particularly limited in the embodiments of the present application.
Fig. 2 schematically illustrates a software and hardware architecture of an electronic device according to an embodiment of the present application.
As shown in fig. 2, the layered architecture divides the operating system into several layers, each with clear roles and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the operating system is divided into five layers, which are, from top to bottom, an application layer, a framework layer (FWK), a hardware abstraction layer (HAL), a kernel layer, and a hardware layer.
The application layer may include a series of application packages.
The application packages may include camera applications and the like.
The framework layer provides an application programming interface (application programming interface, API) and programming framework for application packages of the application layer. The application framework layer includes some predefined functions.
In some embodiments, the application framework layer may include a camera access interface, wherein the camera access interface may include camera management and camera devices. The camera access interface is used to provide an application programming interface and programming framework for camera applications.
The hardware abstraction layer is an interface layer between the application framework layer and the kernel layer, and provides a virtual hardware platform for the operating system.
In the embodiment of the application, the hardware abstraction layer may include a camera hardware abstraction layer (Camera HAL) and a super image engine (SIT). The SIT may include plug-ins (plug in) of a plurality of image processing algorithms, for example, a plug-in of a spatial alignment transformation (spatial alignment transform, SAT) algorithm (SAT plug in), a plug-in of a background blurring algorithm (background blurring plug in), and the like.
The Camera HAL can comprise calling interfaces of one or more cameras and interfaces of one or more image processing algorithms. The camera calling interfaces include a camera interface 1 (e.g., for a main camera), a camera interface 2 (e.g., for a wide-angle camera), and so on. The electronic device 100 may call each camera through the camera interfaces such as the camera interface 1 and the camera interface 2, and acquire image data of each camera.
The interfaces of the one or more image processing algorithms include a spatial alignment transformation interface, a background blurring interface, and the like. The interfaces of the image processing algorithms have different names on different system-on-chip platforms; for example, in the CamX architecture, the interface of an image processing algorithm may be referred to as a node, e.g., a SAT node. In the Camera HAL architecture of other system-on-chip platforms, the interface of the image processing algorithm may also be referred to as a session.
The various nodes and modules in the camera hardware abstraction layer are provided by the solution architecture designed by the vendor of the SoC platform on the electronic device. For example, the solution architecture may include a CamX architecture, etc.
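Because the same role is called a node on one platform and a session on another, a thin platform-neutral interface can be imagined between the Camera HAL side and the rest of the pipeline. The sketch below is only an assumption about what such an abstraction could look like; the names ImageStreamTransceiver, SendToAdapter and OnResultFromAdapter do not come from the CamX architecture or any vendor API.

```cpp
// Hypothetical platform-neutral view of the transceiver role. On a CamX-based
// platform it would be backed by a SAT node; on another vendor's Camera HAL it
// could be backed by a SAT session.
struct PlatformImage;  // platform-format image data (port number + node buffer)
struct IcaParams;      // platform-format processing result (image correction adjustment data)

class ImageStreamTransceiver {
public:
    virtual ~ImageStreamTransceiver() = default;
    // Collect image data in the Camera HAL and hand it to the adapter.
    virtual void SendToAdapter(const PlatformImage& image) = 0;
    // Receive the processing result that the adapter passes back into the Camera HAL.
    virtual void OnResultFromAdapter(const IcaParams& result) = 0;
};
```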
The kernel layer is a layer between the hardware layer and the hardware abstraction layer. The kernel layer includes drivers for various hardware. The kernel layer may include a camera driver, a digital signal processor driver, a graphics processor driver, and the like. The camera driver is used to drive the image sensors (e.g., the image sensor 1, the image sensor 2, a TOF sensor, etc.) of one or more cameras in the camera module to capture images and to drive the image signal processor to pre-process the images. The digital signal processor driver is used to drive the digital signal processor to process the image. The graphics processor driver is used to drive the graphics processor to process the image.
The hardware layer may include a camera module, an Image Signal Processor (ISP), a Digital Signal Processor (DSP), a Graphics Processor (GPU). The camera module may include therein an image sensor (image sensor) of a plurality of cameras, for example, the image sensor 1, the image sensor 2, and so on. Optionally, a time of flight (TOF) sensor, a multispectral sensor, etc. may be included in the camera module.
The ISP may include an image front end (image front end, IFE) module and an image processing engine (image processing engine, IPE) module. For a preview stream or a video stream, the IFE module can perform one or more front-end processing tasks, such as color correction, downsampling, demosaicing, and statistics of 3A data, on the image data output by a sensor in the camera module. The IPE module can perform one or more back-end processing tasks, such as hardware noise reduction, image cropping, software noise reduction, color processing, and detail enhancement, on the image data.
The nodes of the image processing algorithm can send the image data processed by the IFE and/or IPE to the plugin of the image processing algorithm in the SIT module for algorithm processing. The plug-in of the image processing algorithm may pass the processing results to the Camera HAL.
For example, the SAT node may acquire the image data collected by a plurality of cameras and processed by the IFE module, and send the image data to the SAT plug in in the SIT. The SAT plug in may determine spatial alignment transformation parameters (e.g., a cropping (crop) matrix and an affine transformation (warp) matrix) from the image data of the plurality of cameras processed by the IFE module through a spatial alignment transformation algorithm module (SAT Algo), and return the spatial alignment transformation parameters to the SAT node. The SAT node can provide the spatial alignment transformation parameters to the IPE module in the ISP, and the IPE module can perform spatial alignment transformation processing on the image stream acquired by the switched-to camera based on the spatial alignment transformation parameters, so that when the electronic device performs optical zooming through camera switching, the picture content and the size of the image have no obvious jump, and a smooth zoom effect is achieved.
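A simplified control-flow sketch of this round trip is given below. The function names RunSatAlgo and ApplySpatialAlignment are assumptions used only to illustrate the data flow; they are not CamX, SIT or ISP APIs.

```cpp
// Hypothetical round trip: SAT node -> SAT plug-in (SAT Algo) -> SAT node -> IPE.
struct Image {};            // one IFE-processed frame from one camera
struct SatParams {          // cropping information plus an affine (warp) matrix
    int   crop[4];          // left, top, width, height
    float warp[3][3];
};

// Assumed to be provided by the SAT plug-in in the SIT module.
SatParams RunSatAlgo(const Image& fromCurrentCamera, const Image& fromTargetCamera);

// Assumed to be provided by the IPE stage of the ISP.
void ApplySpatialAlignment(const SatParams& params);

void OnFramePair(const Image& current, const Image& target) {
    // The SAT node forwards the two IFE-processed streams to the plug-in ...
    SatParams params = RunSatAlgo(current, target);
    // ... and hands the returned crop/warp parameters to the IPE module, which
    // aligns the switched-to camera's stream so the zoom looks smooth.
    ApplySpatialAlignment(params);
}
```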
Fig. 3A-3D illustrate a set of zoom magnification transformations provided in embodiments of the present application that result in camera switching.
As shown in fig. 3A, the electronic device 100 may display a desktop 310, where a page with application icons is displayed in the desktop 310, where the page includes a plurality of application icons (e.g., a setup application icon, an application marketplace application icon, a gallery application icon, a browser application icon, etc.). A page indicator 313 is also displayed below the plurality of application icons to indicate the positional relationship of the currently displayed page with other pages. Below the page indicator 313, a tray area 311 is displayed. The tray area 311 includes a plurality of tray icons, such as a camera application icon 312, an address book application icon, a phone application icon, and an information application icon. The tray area 311 remains displayed at the time of page switching. In some embodiments, the page may also include a plurality of application icons and a page indicator 313, where the page indicator 313 may not be part of the page, and the tray icon may also be optional, which is not limited in this embodiment of the present application.
The electronic device 100 may receive an input (e.g., a click) by a user on the camera application icon 312, and in response to the input operation, the electronic device 100 may display a capture interface 320 as shown in fig. 3B.
As shown in fig. 3B, the shooting interface 320 may include a playback control 325A, a shooting control 325B, a camera conversion control 325C, a preview box 322, a zoom magnification control 323, one or more shooting mode controls (e.g., a large aperture shooting mode control 324A, a night scene shooting mode control 324B, a portrait shooting mode control 324C, a normal shooting mode control 324D, a normal video mode control 324E, a multi-mirror video mode control 324F, and a more mode control 324G).
As shown in fig. 3B, the control 324D of the normal photographing mode is selected, and the electronic device 100 is in the normal photographing mode. The preview frame 322 displays a preview screen 326 collected by the electronic device 100 through the camera in the normal photographing mode. The playback control 325A may be used to trigger the display of a captured image or video. The shooting control 325B is used to trigger saving of images captured by the camera. The camera conversion control 325C can be used to switch the camera of the electronic device 100 that acquires the image (e.g., from the front camera to the rear camera, or from the rear camera to the front camera). The zoom magnification control 323 may be used to set the zoom magnification at which the electronic device 100 takes a photograph or video. The zoom magnification control 323 may display the currently used zoom magnification (for example, 1x).
The electronic device 100 may receive user input (e.g., a click) on the zoom magnification control 323, in response to which the electronic device 100 may display a zoom magnification setting control 331 and a current zoom magnification control 332 as shown in fig. 3C.
As shown in fig. 3C, the electronic device 100 may display the zoom magnification setting control 331 and the current zoom magnification control 332 in place of the zoom magnification control 323 described above. The zoom magnification setting control 331 may display the zoom magnification range (e.g., 0.5x to 10x) supported by the normal photographing mode. The electronic device 100 may determine the zoom magnification set by the user based on user input (e.g., sliding) to the zoom magnification setting control 331. The current zoom magnification control 332 may be used to display the zoom magnification currently selected by the user on the zoom magnification setting control 331.
As shown in fig. 3D, the electronic device 100 may receive and respond to a user input (e.g., sliding) to the zoom magnification setting control 331, adjust the zoom magnification of the normal photographing mode (e.g., adjust the zoom magnification from 1x to 0.9x), and acquire a real-time preview screen 327 based on the zoom magnification (e.g., 0.9x) set by the user. The electronic device 100 may display the preview screen 327 in the preview box 322. Compared with the preview screen 326, the preview screen 327 has a smaller zoom magnification (for example, 0.9x), so the same subject occupies a smaller proportion of the picture in the preview screen 327 than in the preview screen 326.
In the above zoom magnification switching process, the zoom magnification ranges supported by different cameras on the electronic device 100 are different; for example, the zoom magnification range supported by the main camera is 1x to 10x, and the zoom magnification range supported by the wide-angle camera is 0.5x to 1x. When the zoom magnification is enlarged or reduced past a certain value, the electronic device 100 needs to switch the camera that captures the image stream. For example, when the zoom magnification is equal to or greater than 1x, the electronic device 100 may capture the image stream through the main camera; when the zoom magnification is adjusted to be less than 1x (e.g., 0.9x), the electronic device 100 may switch from the main camera to the wide-angle camera to capture the image stream. However, if the electronic device 100 switches the camera while capturing the image stream, the content of the displayed image easily jumps. Therefore, the electronic device 100 may process the image stream during camera switching by using a spatial alignment transform (SAT) algorithm, so as to achieve a smooth optical zoom effect and prevent jumps in the content of the image frame.
When the SAT algorithm processes an image stream during camera switching, multiple image data streams from multiple cameras need to be processed. During camera switching, a large amount of image data (for example, 10 to 20 pieces of image data) needs to be managed at the same time, which makes management of the image stream difficult. In addition, the SAT algorithm uses image data in the universal bandwidth compression (UBWC) format rather than image data in the YUV format, so the design of the image format structure needs to be different. Moreover, deploying the SAT algorithm on the electronic device 100 relies on the interface design provided by the system-on-chip platform at the camera hardware abstraction layer.
As shown in fig. 4, the SAT node of the Camera hardware abstraction layer (Camera HAL) of the system-on-chip platform A directly uses the data in the A format provided by the system-on-chip platform A and passes the data to the spatial alignment transformation algorithm module (SAT Algo) of the spatial alignment transformation plug-in (SAT plug in) in the SIT module. SAT Algo then converts the fixed-format image data provided by the camera hardware abstraction layer into the data it requires. Thus, SAT Algo must be customized according to the system-on-chip platform A employed on the electronic device 100, and the SAT plug-in cannot be decoupled from the system-on-chip platform A. In addition, the fixed-format image data provided by the camera hardware abstraction layer contains a lot of information that is not useful for SAT Algo, which wastes the performance and memory of the electronic device 100.
Therefore, the embodiment of the application provides an image data processing method in which an adapter (Adapter) is added between the camera hardware abstraction layer and the image processing algorithm module, and the adapter converts the image data in a fixed format output by the camera hardware abstraction layer into data in the format required by the image processing algorithm module. In this way, the image processing algorithm is decoupled from the system-on-chip platform, so that the image processing algorithm does not need to be customized for the system-on-chip platform, and processing of the image data can still be realized.
FIG. 5 illustrates an architecture diagram of a hardware abstraction layer provided in an embodiment of the present application.
As shown in fig. 5, the hardware abstraction layer of the electronic device 100 may include: a camera hardware abstraction layer (Camera HAL) of the system-on-chip platform A or a camera hardware abstraction layer of the system-on-chip platform B, an adapter (Adapter) 530, and a super image engine (SIT) 540.
The Camera HAL of the system-on-chip platform A may include a spatial alignment transformation node (SAT node) 510, and the spatial alignment transformation node 510 may include a transceiver module 511. The transceiver module 511 in the spatial alignment transformation node 510 may be used to collect image data in the Camera HAL of the system-on-chip platform A and pass it through the adapter 530 to the spatial alignment transformation plug-in 541 in the super image engine 540. The transceiver module 511 may also be configured to receive the processing result that the spatial alignment transformation plug-in 541 passes through the adapter 530 back into the Camera HAL of the system-on-chip platform A.
A spatial alignment transformation session (SAT session) 520 may be included in the Camera HAL of the system-on-chip platform B. The spatial alignment transformation session 520 may include a transceiver module 521. The transceiver module 521 in the spatial alignment transformation session 520 may be used to collect image data in the Camera HAL of the system-on-chip platform B and pass it through the adapter 530 to the spatial alignment transformation plug-in 541 in the super image engine 540. The transceiver module 521 may also be configured to receive the processing result that the spatial alignment transformation plug-in 541 passes through the adapter 530 back into the Camera HAL of the system-on-chip platform B.
The adapter 530 may include a conversion module 531 and a conversion module 532. The conversion module 531 may be configured to convert the image data in the platform format A1 sent by the transceiver module 511 of the spatial alignment transformation node 510 into image data in the private format C1, and send the image data to the spatial alignment transformation algorithm module 542 for spatial alignment transformation processing. The conversion module 531 may be further configured to convert the processing result in the private format C2, obtained by the spatial alignment transformation algorithm module 542 performing spatial alignment transformation processing on the image data in the private format C1, into a processing result in the platform format A2, and return the processing result in the platform format A2 to the transceiver module 511 of the spatial alignment transformation node 510. The processing result in the private format C2 may include cropping (crop) information and an affine transformation (warp) matrix. The processing result in the platform format A2 may be image correction adjustment (ICA) data.
The conversion module 532 may be configured to convert the image data in the platform format B1 sent by the transceiver module 521 of the spatial alignment transformation session 520 into image data in the private format C1, and provide the image data to the spatial alignment transformation algorithm module 542 for spatial alignment transformation processing. The conversion module 532 may be further configured to convert the processing result in the private format C2, obtained by the spatial alignment transformation algorithm module 542 performing spatial alignment transformation processing on the image data in the private format C1, into a processing result in the platform format B2, and return the processing result in the platform format B2 to the transceiver module 521 of the spatial alignment transformation session 520.
A spatial alignment transformation plug-in (SAT plug in) 541 may be included in the super image engine 540. A spatial alignment transformation algorithm module (SAT Algo) 542 may be included in the spatial alignment transformation plug-in 541. The spatial alignment transformation algorithm module 542 may perform spatial alignment transformation processing on the image data in the private format C1 transmitted by the adapter 530, to obtain a processing result in the private format C2. The spatial alignment transformation algorithm module 542 may output the processing result in the private format C2 to the conversion module 531 or the conversion module 532.
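A minimal sketch of the two-way conversion performed by the conversion modules is shown below, assuming one conversion module per system-on-chip platform. The type and method names (ConversionModule, ToPrivate, ToPlatform) are hypothetical; the point is only that SAT Algo sees nothing but the private formats C1 and C2.

```cpp
// Hypothetical private (platform-independent) formats exchanged with SAT Algo.
struct PrivateImage  {};   // private format C1: camera type, buffer type, frame
struct PrivateResult {};   // private format C2: cropping information + warp matrix

// One conversion module per platform: module 531 would instantiate this for
// platform formats A1/A2, module 532 for platform formats B1/B2.
template <typename PlatformImage, typename PlatformResult>
class ConversionModule {
public:
    virtual ~ConversionModule() = default;
    // Platform format A1/B1 -> private format C1, fed to the SAT algorithm module.
    virtual PrivateImage ToPrivate(const PlatformImage& in) const = 0;
    // Private format C2 -> platform format A2/B2 (e.g. ICA data), returned to the node/session.
    virtual PlatformResult ToPlatform(const PrivateResult& out) const = 0;
};
```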
Therefore, decoupling of the SIT from the system-on-chip platform can be realized, so that the SAT plug in and SAT Algo do not need to be changed when the SIT is ported across platforms, and image data processing can still be realized.
In some embodiments, the conversion module 531 may be provided in a camera hardware abstraction layer of the system-on-chip platform a. The conversion module 532 may also be disposed in the camera hardware abstraction layer of the system-on-chip platform B.
Fig. 6 shows a functional block diagram of an image data processing method in an embodiment of the present application.
As shown in fig. 6, the hardware abstraction layer of the electronic device 100 may include: a camera hardware abstraction layer (Camera HAL), the adapter 530, and the super image engine (SIT) 540. The Camera HAL may include the spatial alignment transformation node (SAT node) 510, and the spatial alignment transformation node 510 may include the transceiver module 511. The adapter 530 may include the conversion module 531. The conversion module 531 may include a data management module 5311 and an output management module 5312. The spatial alignment transformation plug-in (SAT plug in) 541 may be included in the super image engine 540. The spatial alignment transformation algorithm module (SAT Algo) 542 may be included in the spatial alignment transformation plug-in 541.
For the description of the spatial alignment transformation node 510, the transceiver module 511, the super image engine 540, the spatial alignment transformation plug-in 541, and the spatial alignment transformation algorithm module 542, reference may be made to the embodiment shown in fig. 5; details are not repeated here.
The concept of a port can be used in the Camera HAL to uniquely identify an image data stream. Accordingly, the platform format A output by the spatial alignment transformation node 510 to the data management module 5311 may include a port number (Port) and a node buffer (Node buffer) structure of the image data. The image data in the private data format may include the camera type (CamType), buffer type (BufferType), and frame (frame) structure of the image data stream.
The Node buffer structure includes buffer address of image data, width and height of image data, and data type of image data. The data types comprise YUV data and RAW data.
The buffer type (BufferType) may be used to indicate the resolution of the image data; for example, the resolution of the image data may be full resolution, 1/4 resolution, 1/16 resolution, or the like.
The data management module 5311 may first convert the port number of the image data into a camera identifier (CamID) and a port type (PortType). The data management module 5311 may then convert the camera identifier (CamID) into a camera type (CamType), convert the port type into a buffer type (BufferType), and encapsulate the Node buffer structure into a frame structure.
In one possible implementation, the data management module 5311 may directly convert the port number in the image data in the platform format A into a camera type (CamType) and a buffer type (BufferType), and encapsulate the Node buffer into a frame structure to obtain the image data in the private format, as sketched below. The data management module 5311 may send the private-format image data to the spatial alignment transformation algorithm module 542 in the spatial alignment transformation plug-in 541.
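The following sketch illustrates the kind of mapping the data management module performs. The port-number encoding and the enum values are assumptions made up for illustration; the real mapping is defined by the platform.

```cpp
#include <cstdint>
#include <stdexcept>

enum class CameraType { kMain, kWideAngle };
enum class BufferType { kFullRes, kQuarterRes, kSixteenthRes };

struct Mapped {
    CameraType camType;
    BufferType bufType;
};

// Hypothetical mapping: the port number uniquely identifies one image stream,
// so it can be split into a camera identifier and a port type, which in turn
// map to a camera type and a buffer type.
Mapped MapPort(uint32_t portNumber) {
    uint32_t camId    = portNumber / 10;   // assumed encoding, illustration only
    uint32_t portType = portNumber % 10;

    Mapped m;
    switch (camId) {
        case 0:  m.camType = CameraType::kMain;      break;
        case 1:  m.camType = CameraType::kWideAngle; break;
        default: throw std::runtime_error("unknown camera identifier");
    }
    switch (portType) {
        case 0:  m.bufType = BufferType::kFullRes;      break;
        case 1:  m.bufType = BufferType::kQuarterRes;   break;
        default: m.bufType = BufferType::kSixteenthRes; break;
    }
    return m;
}
```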
The spatial alignment transformation algorithm module 542 may perform spatial alignment transformation analysis according to the image data of the private formats of the two cameras before and after the switching, which is transmitted by the data management module 5311, to obtain a processing result of the private format. The spatial alignment transformation algorithm module 542 may output the processing result in the proprietary format to the output management module 5312 in the conversion module 531. The processing results in the private format include clipping (crop) information and affine transformation (warp) matrix, among others.
The output management module 5312 may convert the private-format processing result into a processing result in the platform format A. Specifically, the output management module 5312 may convert the crop information and the warp matrix in the private-format processing result into image correction adjustment (ICA) data as the processing result in the platform format A, and send the ICA data to the spatial alignment transformation node 510, as sketched below.
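A sketch of this conversion step is given below. The ICA layout is treated as an opaque blob because the actual image correction adjustment structure is defined by the system-on-chip platform; the packing shown here is purely illustrative.

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

struct CropInfo  { uint32_t left, top, width, height; };
struct SatResult { CropInfo crop; float warp[3][3]; };   // private-format processing result

// Placeholder for the platform's image correction adjustment (ICA) data.
struct IcaData { std::vector<uint8_t> payload; };

IcaData ToIca(const SatResult& result) {
    IcaData ica;
    ica.payload.resize(sizeof(result));
    std::memcpy(ica.payload.data(), &result, sizeof(result));  // illustrative packing only
    return ica;
}
```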
Upon receiving the ICA data, the spatial alignment transformation node 510 may provide the ICA data to the IPE module (not shown in fig. 6) in the ISP (not shown in fig. 6). The IPE module can perform spatial alignment transformation processing on the image stream acquired by the switched-to camera by using the ICA data, to complete smooth zoom switching among the multiple cameras. The spatial alignment transformation processing includes cropping (crop) and affine transformation (warp). The IPE module may pass the spatially aligned transformed image stream to the camera application through the camera driver, the camera hardware abstraction layer, and the framework layer. The camera application may present the cropped and affine-transformed image stream.
An image data processing method provided in an embodiment of the present application is described below.
Fig. 7 is a schematic flow chart of an image data processing method according to an embodiment of the present application.
As shown in fig. 7, the method may include the steps of:
s701, the camera 1 sends an image stream 1 acquired by the camera 1 to the ISP.
S702. The ISP performs front-end processing and back-end processing on the image stream 1.
The ISP may perform front-end processing on the image stream 1 through the IFE module, and perform back-end processing on the image stream 1 through the IPE module. The front-end processing may include one or more of color correction, downsampling, demosaicing, statistics of 3A data, and the like, among others. The back-end processing may include one or more of hardware noise reduction, clipping of images, software noise reduction, color processing, detail enhancement, and the like.
S703. The ISP sends the front-end processed and back-end processed image stream 1 to the camera application.
The ISP may send the front-end processed and back-end processed image stream 1 to the camera application sequentially through a camera driver (not shown in fig. 7), the Camera HAL, and a camera service (not shown in fig. 7).
S704, the camera application displays the image stream 1 after front-end processing and back-end processing.
The image stream 1 subjected to front-end processing and back-end processing comprises images with a first zoom magnification, which are shot by the camera 1, and the first zoom magnification is within a zoom magnification range of the first camera.
For example, as shown in fig. 3B described above, the zoom magnification is 1x. The front-end processed and back-end processed image stream 1 presented by the camera application may include a preview screen 326 captured by the camera 1.
S705. The camera application detects that the camera 2 needs to be turned on.
The camera application detects a first operation in which the user sets the zoom magnification to a second zoom magnification, where the second zoom magnification is within the zoom magnification range of the camera 2. When the camera application detects that the zoom magnification set by the user is no longer within the zoom magnification range of the camera 1 but falls within the zoom magnification range of the camera 2, the camera 2 may be turned on, as sketched below. For example, the camera 1 may be a main camera, the camera 2 may be a wide-angle camera, the zoom magnification range of the main camera is [1x,10x], and the zoom magnification range of the wide-angle camera is [0.5x,1x). When the user sets the zoom magnification to be less than 1x, the camera application determines to switch from the main camera to the wide-angle camera to acquire the image stream.
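The zoom ranges below match the example in this step (main camera [1x,10x], wide-angle camera [0.5x,1x)); the selection helper itself is an illustrative sketch rather than actual camera-application code.

```cpp
enum class Camera { kMain, kWideAngle };

// Pick the camera whose supported zoom range contains the requested magnification.
Camera SelectCamera(float zoomMagnification) {
    // Example ranges from the text: main camera [1x, 10x], wide-angle camera [0.5x, 1x).
    if (zoomMagnification >= 1.0f) {
        return Camera::kMain;
    }
    return Camera::kWideAngle;   // e.g. a zoom magnification of 0.9x triggers the switch
}
```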
S706, the camera application sends an opening instruction to the camera 2.
Among them, the camera application may send an on instruction to the camera 2 sequentially through a camera service (not shown in fig. 7), a camera hal, a camera driver (not shown in fig. 7).
S707, after receiving the opening instruction, the camera 2 can send the image stream 2 collected by the camera 2 to the ISP.
After receiving the start command, the camera 2 may collect the image stream 2 through an image sensor in the camera 2. The camera 2 may send the image stream 2 acquired by the camera 2 to the ISP for processing.
S708. The ISP performs front-end processing on the image stream 2.
Wherein the ISP can perform front-end processing on the image stream 2 through the IFE module.
S709. The ISP sends the front-end processed image stream 2 to the SAT node.
S710. The SAT node sends the image stream 2 in the first format to the adapter.
After receiving the front-end processed image stream 2, the SAT node may encapsulate the front-end processed image stream 2 according to the first format specified by the Camera HAL, to obtain the image stream 2 in the first format. Wherein the first format includes a Port number (Port) and a node buffer (node buffer) structure. The node buffer (node buffer) structure may include one or more of buffer address of image data, width and height of image data, data type of image data, and the like. Wherein the data type includes any one of YUV data, RAW data, and the like.
S711. The adapter converts the image stream 2 in the first format into an image stream 2 in the second format.
For example, the image stream 2 in the first format may include the second image in the first format, and the adapter may convert the second image in the first format into the fourth image in the second format. Thus, the image stream 2 in the second format includes the fourth image in the second format.
The second format may include a camera type, a buffer type (buffer type), and a frame (frame) structure, among others. The camera type includes any one of a main camera, a wide-angle camera, and the like. The buffer type (buffer type) is used to indicate the resolution of the image data, for example, the resolution of the image data may be any one of a full resolution, 1/4 resolution, 1/16 resolution, and the like.
S712. The adapter sends the image stream 2 in the second format to the SAT Algo.
S713, the camera 1 sends the image stream 3 acquired by the camera 1 to the ISP.
S714. The ISP performs front-end processing on the image stream 3.
The ISP may perform one or more front-end processing tasks, such as color correction, downsampling, demosaicing, and statistics of 3A data, on the image stream 3 through the IFE module.
S715. The ISP sends the front-end processed image stream 3 to the SAT node.
S716. The SAT node sends the image stream 3 in the first format to the adapter.
After receiving the front-end processed image stream 3, the SAT node may encapsulate the front-end processed image stream 3 according to the first format specified by the Camera HAL, to obtain the image stream 3 in the first format.
S717. The adapter converts the image stream 3 in the first format into an image stream 3 in the second format.
For example, the image stream 3 in the first format may include the first image in the first format, and the adapter may convert the first image in the first format into the third image in the second format. Thus, the image stream 3 in the second format includes the third image in the second format.
The second format may include a camera type, a buffer type (buffer type), and a frame (frame) structure, among others.
S718. The adapter sends the image stream 3 in the second format to the SAT Algo.
S719. The SAT Algo determines spatial alignment transformation parameters in a third format based on the image stream 2 in the second format and the image stream 3 in the second format.
Wherein the third format comprises clipping information (cropping) and an affine transformation (warp) matrix. The cropping information is used to crop the images in the image stream. The warp matrix is used to affine transform the images in the image stream.
S720. The SAT Algo sends the spatial alignment transformation parameters in the third format to the adapter.
S721. The adapter converts the spatial alignment transformation parameters in the third format into spatial alignment transformation parameters in a fourth format.
Wherein the fourth format includes Image Correction Adjustment (ICA) data.
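The actual layout of the image correction adjustment (ICA) data is defined by the system-on-chip platform and is not specified here; purely as an assumption-laden sketch, the conversion from the third format to the fourth format might repackage the crop and warp as follows, where IcaData and toIcaData are hypothetical names.

// Hypothetical sketch of converting the third-format parameters into
// platform-consumable ICA data. The real ICA structure is platform-defined.
struct IcaData {
    float    transform[3][3];        // per-frame transform to be applied by the ISP
    uint32_t cropLeft, cropTop;      // crop offsets
    uint32_t cropWidth, cropHeight;  // crop size
};

IcaData toIcaData(const SatResult& sat) {
    IcaData ica{};
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 3; ++c)
            ica.transform[r][c] = sat.warp[r][c];
    ica.cropLeft   = sat.crop.left;
    ica.cropTop    = sat.crop.top;
    ica.cropWidth  = sat.crop.width;
    ica.cropHeight = sat.crop.height;
    return ica;
}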
S722. The adapter sends the spatial alignment transformation parameters in the fourth format to the SAT node.
S723. The SAT node sends the spatial alignment transformation parameters in the fourth format to the ISP.
S724. The ISP performs spatial alignment transformation processing on the front-end processed image stream 2 based on the spatial alignment transformation parameters in the fourth format to obtain a spatially aligned transformed image stream.
The ISP may crop and affine transform the front-end processed image stream 2 according to the spatial alignment transformation parameters in the fourth format by using the IPE module, so as to obtain the spatially aligned transformed image stream.
For example, the front-end processed image stream 2 may include a fifth image. The ISP may perform spatial alignment transformation processing on the fifth image according to the spatial alignment transformation parameters in the fourth format by using the IPE module to obtain a sixth image, and send the sixth image to the camera application for display.
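For illustration only, the effect of the crop and warp on a single pixel coordinate can be sketched as below; the order of cropping before warping and the homogeneous form of the matrix are assumptions, and the actual IPE hardware path is considerably more involved.

// Minimal sketch of applying the crop and affine warp to one coordinate,
// reusing the SatResult sketch above.
struct Point { float x, y; };

Point applySat(const SatResult& sat, Point p) {
    // Shift the coordinate into the cropped region (crop-first order assumed).
    float cx = p.x - static_cast<float>(sat.crop.left);
    float cy = p.y - static_cast<float>(sat.crop.top);
    // Apply the affine part of the warp matrix.
    Point out;
    out.x = sat.warp[0][0] * cx + sat.warp[0][1] * cy + sat.warp[0][2];
    out.y = sat.warp[1][0] * cx + sat.warp[1][1] * cy + sat.warp[1][2];
    return out;
}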
S725. The ISP sends the spatially aligned transformed image stream to the camera application.
The ISP may send the spatially aligned transformed image stream to the camera application through the camera driver (not shown in FIG. 7), the Camera HAL, the Camera Service (not shown in FIG. 7), and the like.
S726. The camera application displays the spatially aligned transformed image stream.
In the embodiments of the present application, the camera 1 may be referred to as a first camera, and the camera 2 may be referred to as a second camera. In one example, the camera 1 may be a main camera and the camera 2 may be a wide-angle camera; in another example, the camera 1 may be a wide-angle camera and the camera 2 may be a main camera.
By the image data processing method provided in the embodiments of the present application, an adapter (Adapter) can be added between the camera hardware abstraction layer and the image processing algorithm module, and the image data in a fixed format output by the camera hardware abstraction layer can be converted by the adapter into data in the format required by the image processing algorithm. Therefore, decoupling of the image processing algorithm and the camera hardware abstraction layer can be realized, so that the image processing algorithm does not need to be customized for the system-on-chip platform, and processing of the image data can still be realized.
FIG. 8 illustrates an architecture diagram of a hardware abstraction layer provided in another embodiment of the subject application.
As shown in FIG. 8, the hardware abstraction layer of the electronic device 100 may include: a camera hardware abstraction layer (Camera HAL) of the system-on-chip platform A, an adapter 530, and a super image engine (SIT) 540.
The Camera HAL of the system-on-chip platform A may include a spatial alignment transformation node (SAT node) 510, a background blurring node 512, and the like. A transceiver module 511 may be included in the spatial alignment transformation node 510. A transceiver module 513 may be included in the background blurring node 512.
The transceiver module 511 in the spatial alignment transformation node 510 may be used to collect image data in the Camera HAL of the system-on-chip platform A and pass it through the adapter 530 to the spatial alignment transformation plug-in 541 in the super image engine 540. The transceiver module 511 may also be configured to receive the processing result that the super image engine 540 passes back, through the adapter 530, to the Camera HAL of the system-on-chip platform A.
The transceiver module 513 in the background blurring node 512 may be used to collect image data in the Camera HAL of the system-on-chip platform A and pass it through the adapter 530 to the background blurring plug-in 543 in the super image engine 540. The transceiver module 513 may also be configured to receive the processing result that the background blurring plug-in 543 passes back, through the adapter 530, to the Camera HAL of the system-on-chip platform A.
The adapter 530 may include a conversion module 531 and a conversion module 533. The conversion module 531 may be configured to convert the image data in the platform format A1 sent by the transceiver module 511 of the spatial alignment transformation node 510 into image data in the private format C1, and send the image data to the spatial alignment transformation algorithm module 542 for spatial alignment transformation processing. The conversion module 531 may be further configured to convert the processing result in the private format C2, which is obtained by the spatial alignment transformation algorithm module 542 performing the spatial alignment transformation on the image data in the private format C1, into a processing result in the platform format A2, and return the processing result in the platform format A2 to the transceiver module 511 of the spatial alignment transformation node 510. The processing result in the private format C2 may include clipping (crop) information and an affine transformation (warp) matrix. The processing result in the platform format A2 may be image correction adjustment (ICA) data.
The conversion module 533 may be configured to convert the image data in the platform format A1 sent by the transceiver module 513 of the background blurring node 512 into image data in the private format D1, and send the image data to the background blurring algorithm module 544 for background blurring processing. The conversion module 533 may be further configured to convert the processing result in the private format D2, which is obtained by the background blurring algorithm module 544 performing the background blurring on the image data in the private format D1, into a processing result in the platform format A3, and return the processing result in the platform format A3 to the transceiver module 513 of the background blurring node 512.
A plurality of image processing algorithm plug-ins may be included in the super image engine 540, and the plurality of image processing algorithm plug-ins include a spatial alignment transformation plug-in (SAT plug-in) 541 and a background blurring plug-in 543. A spatial alignment transformation algorithm module (SAT Algo) 542 may be included in the spatial alignment transformation plug-in 541. A background blurring algorithm module 544 may be included in the background blurring plug-in 543.
The spatial alignment transformation algorithm module 542 may perform spatial alignment transformation processing on the image data in the private format C1 transmitted by the conversion module 531 in the adapter 530 to obtain a processing result in the private format C2. The spatial alignment transformation algorithm module 542 may output the processing result in the private format C2 to the conversion module 531.
The background blurring algorithm module 544 may perform background blurring processing on the image data in the private format D1 transmitted by the conversion module 533 in the adapter 530 to obtain a processing result in the private format D2. The background blurring algorithm module 544 may output the processing result in the private format D2 to the conversion module 533.
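A rough structural sketch of such an adapter, with one conversion module per image processing algorithm, is given below. ConversionModule, Adapter::registerModule and Adapter::dispatch are hypothetical names, and the sketch reuses the FirstFormatImage type from the earlier illustration rather than any real platform definition.

#include <memory>
#include <string>
#include <unordered_map>

// Hypothetical interface for a per-algorithm conversion module: it converts
// platform-format data into the algorithm's private format, invokes the
// algorithm, and converts the result back into a platform-format result.
class ConversionModule {
public:
    virtual ~ConversionModule() = default;
    virtual void process(const FirstFormatImage& in, void* platformResult) = 0;
};

// Hypothetical adapter that routes each request to the conversion module
// registered for the corresponding algorithm (e.g. "SAT", "background_blur").
class Adapter {
public:
    void registerModule(const std::string& algo, std::unique_ptr<ConversionModule> module) {
        modules_[algo] = std::move(module);
    }
    void dispatch(const std::string& algo, const FirstFormatImage& in, void* platformResult) {
        auto it = modules_.find(algo);
        if (it != modules_.end()) {
            it->second->process(in, platformResult);
        }
    }
private:
    std::unordered_map<std::string, std::unique_ptr<ConversionModule>> modules_;
};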
In some embodiments, the conversion module 531 and the conversion module 533 may also be provided in the camera hardware abstraction layer of the system-on-chip platform a.
By the image data processing method provided in the embodiments of the present application, different conversion modules can be set in the adapter (Adapter) for different image processing algorithms, so that the image data in a fixed format output by the camera hardware abstraction layer is converted into the data in the format required by each of the different image processing algorithms. Therefore, decoupling of the image processing algorithms and the camera hardware abstraction layer can be realized, so that the image processing algorithms do not need to be customized for the system-on-chip platform, and processing of the image data can still be realized.
The embodiments of the present application also provide a computer readable storage medium storing a computer program, where the computer program can implement the steps in the above-mentioned method embodiments when executed by a processor.
Embodiments of the present application also provide a computer program product enabling an electronic device to carry out the steps of the various method embodiments described above when the computer program product is run on the electronic device.
The embodiments of the present application also provide a chip system, where the chip system includes a processor, the processor is coupled to a memory, and the processor executes a computer program stored in the memory to implement the steps of any of the method embodiments of the present application. The chip system can be a single chip or a chip module composed of a plurality of chips.
The term "User Interface (UI)" in the specification and drawings of the present application is a medium interface for interaction and information exchange between an application program or an operating system and a user, and it implements conversion between an internal form of information and a form acceptable to the user. The user interface of the application program is source code written in a specific computer language such as java, extensible markup language (extensible markup language, XML) and the like, the interface source code is analyzed and rendered on the terminal equipment, and finally the interface source code is presented as content which can be identified by a user, such as a picture, characters, buttons and the like. Controls (controls), also known as parts (widgets), are basic elements of a user interface, typical controls being toolbars (toolbars), menu bars (menu bars), text boxes (text boxes), buttons (buttons), scroll bars (scrollbars), pictures and text. The properties and content of the controls in the interface are defined by labels or nodes, such as XML specifies the controls contained in the interface by nodes of < Textview >, < ImgView >, < VideoView >, etc. One node corresponds to a control or attribute in the interface, and the node is rendered into visual content for a user after being analyzed and rendered. In addition, many applications, such as the interface of a hybrid application (hybrid application), typically include web pages. A web page, also referred to as a page, is understood to be a special control embedded in an application program interface, which is source code written in a specific computer language, such as hypertext markup language (hyper text markup language, HTML), cascading style sheets (cascading style sheets, CSS), java script (JavaScript, JS), etc., and which can be loaded and displayed as user-recognizable content by a browser or web page display component similar to the browser's functionality. The specific content contained in a web page is also defined by tags or nodes in the web page source code, such as HTML defines the elements and attributes of the web page by < p >, < img >, < video >, < canvas >.
A commonly used presentation form of the user interface is a graphical user interface (graphic user interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
In the above embodiments, the implementation may be realized in whole or in part by software, hardware, firmware, or any combination thereof. When software is used for implementation, the implementation may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (e.g., coaxial cable, optical fiber, digital subscriber line) or a wireless manner (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk), etc.
Those of ordinary skill in the art will appreciate that all or part of the processes in the above-described method embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium, and when the program is executed, the processes of the above-described method embodiments may be included. The aforementioned storage medium includes: a ROM, a random access memory (RAM), a magnetic disk, an optical disk, or the like.
The above embodiments are merely intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments or replace some technical features thereof with equivalents; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (14)

1. An image data processing method, characterized by being applied to an electronic device, wherein the electronic device is provided with a plurality of cameras and an image signal processor; the electronic device runs an operating system, the operating system comprises a hardware abstraction layer, and the hardware abstraction layer comprises a camera hardware abstraction layer, an adapter and a spatial alignment transformation algorithm module; the method comprises the following steps:
Converting, by the adapter, a first image in a first format output by the camera hardware abstraction layer into a third image in a second format, and converting the second image in the first format output by the camera hardware abstraction layer into a fourth image in the second format; the first image is shot by the first camera, the second image is shot by the second camera, and the first format is different from the second format;
determining, by the spatial alignment transformation algorithm module, spatial alignment transformation parameters of a third format based on the third image of the second format and the fourth image of the second format;
converting, by the adapter, the spatial alignment transformation parameters of the third format to spatial alignment transformation parameters of a fourth format, the third format being different from the fourth format;
issuing the spatial alignment transformation parameters of the fourth format to the image signal processor through the camera hardware abstraction layer;
performing spatial alignment transformation processing on a fifth image shot by the second camera based on the spatial alignment transformation parameters in the fourth format by an image signal processor to obtain a sixth image after spatial alignment transformation;
And displaying the sixth image.
2. The method of claim 1, wherein prior to acquiring, by the camera hardware abstraction layer, the first image captured by the first camera and the second image captured by the second camera, the method further comprises:
displaying an image with a first zoom magnification shot by the first camera, wherein the first zoom magnification is within a zoom magnification range of the first camera;
receiving a first operation of setting a zoom magnification to a second zoom magnification by a user, wherein the second zoom magnification is within a zoom magnification range of the second camera;
the obtaining, by the camera hardware abstraction layer, the first image captured by the first camera and the second image captured by the second camera specifically includes:
in response to the first operation, acquiring, by the camera hardware abstraction layer, the first image captured by the first camera and the second image captured by the second camera.
3. The method according to claim 1 or 2, wherein the first format includes a port number and a node buffer structure, the node buffer structure including a buffer address of image data, a width and a height of image data, and a data type of image data, the data type including YUV data or RAW data; wherein the port number of the first image is different from the port number of the second image.
4. A method according to any of claims 1-3, wherein the second format comprises a camera type, a buffer type and a frame structure; the buffer zone type is used for indicating the resolution ratio of the image data, and the camera type of the third image is different from the camera type of the fourth image.
5. The method of any one of claims 1-4, wherein the third format includes clipping information and an affine transformation matrix.
6. The method of any of claims 1-5, wherein the fourth format includes image correction adjustment data.
7. The method of any of claims 1-6, wherein the camera hardware abstraction layer includes a spatial alignment transformation node;
the converting, by the adapter, the first image in the first format output by the camera hardware abstraction layer into a third image in a second format, and converting the second image in the first format output by the camera hardware abstraction layer into a fourth image in the second format, specifically includes:
converting, by the adapter, the first image in the first format output by the spatial alignment transformation node into a third image in the second format, and converting the second image in the first format output by the spatial alignment transformation node into a fourth image in the second format.
8. The method of any of claims 1-7, wherein prior to converting, by the adapter, a first image in a first format output by the camera hardware abstraction layer to a third image in a second format, converting the second image in the first format output by the camera hardware abstraction layer to a fourth image in the second format, the method further comprises:
front-end processing is performed on the first image and the second image by the image signal processor.
9. The method according to any one of claims 1-8, wherein the performing, by the image signal processor, the spatial alignment transformation on the fifth image captured by the second camera based on the spatial alignment transformation parameters in the fourth format, to obtain a sixth image after the spatial alignment transformation, specifically includes:
performing front-end processing on a fifth image shot by the second camera through the image signal processor;
and performing spatial alignment transformation processing on the fifth image subjected to front-end processing based on spatial alignment transformation parameters by the image signal processor to obtain the sixth image subjected to spatial alignment transformation.
10. The method of claim 8 or 9, wherein the front-end processing comprises one or more of color correction, downsampling, demosaicing, statistical 3A data.
11. An electronic device comprising a plurality of cameras, one or more processors, and one or more memories; wherein the plurality of cameras, the one or more memories are coupled with the one or more processors, the one or more memories for storing a computer program that when executed by the one or more processors causes the method of any of claims 1-10 to be performed.
12. An electronic device comprising one or more functional modules configured to perform the method of any of claims 1-10.
13. A chip applied to an electronic device, characterized in that the chip comprises a processing circuit and an interface circuit, the interface circuit being configured to receive instructions and transmit them to the processing circuit, and the processing circuit being configured to execute the instructions to perform the method of any of claims 1-10.
14. A computer readable storage medium storing a computer program, which when run on a processor of an electronic device causes the method of any one of claims 1-10 to be performed.
CN202310897862.8A 2023-07-20 2023-07-20 Image data processing method and related device Pending CN117692790A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310897862.8A CN117692790A (en) 2023-07-20 2023-07-20 Image data processing method and related device

Publications (1)

Publication Number Publication Date
CN117692790A true CN117692790A (en) 2024-03-12

Family

ID=90127224

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310897862.8A Pending CN117692790A (en) 2023-07-20 2023-07-20 Image data processing method and related device

Country Status (1)

Country Link
CN (1) CN117692790A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113554718A (en) * 2020-04-24 2021-10-26 Oppo(重庆)智能科技有限公司 Image encoding method, storage medium, and electronic device
CN113727035A (en) * 2021-10-15 2021-11-30 Oppo广东移动通信有限公司 Image processing method, system, electronic device and storage medium
CN114125284A (en) * 2021-11-18 2022-03-01 Oppo广东移动通信有限公司 Image processing method, electronic device, and storage medium
WO2022262291A1 (en) * 2021-06-15 2022-12-22 荣耀终端有限公司 Image data calling method and system for application, and electronic device and storage medium
CN115550541A (en) * 2022-04-22 2022-12-30 荣耀终端有限公司 Camera parameter configuration method and electronic equipment

Similar Documents

Publication Publication Date Title
CN114205522B (en) Method for long-focus shooting and electronic equipment
CN113099146B (en) Video generation method and device and related equipment
WO2021190348A1 (en) Image processing method and electronic device
CN115359105B (en) Depth-of-field extended image generation method, device and storage medium
CN115484403B (en) Video recording method and related device
CN115883957B (en) Shooting mode recommendation method
EP4262226A1 (en) Photographing method and related device
CN115442517B (en) Image processing method, electronic device, and computer-readable storage medium
CN116055868B (en) Shooting method and related equipment
CN115567783A (en) Image processing method
CN115460343A (en) Image processing method, apparatus and storage medium
CN117692790A (en) Image data processing method and related device
CN115914860A (en) Shooting method and electronic equipment
CN116709042A (en) Image processing method and electronic equipment
CN114630152A (en) Parameter transmission method and device for image processor and storage medium
CN114945019A (en) Data transmission method, device and storage medium
CN116055863B (en) Control method of optical image stabilizing device of camera and electronic equipment
CN116757963B (en) Image processing method, electronic device, chip system and readable storage medium
WO2023035868A1 (en) Photographing method and electronic device
WO2024109213A1 (en) Shooting mode switching method and related apparatus
WO2024140472A1 (en) Video generation method and related device
CN115802144B (en) Video shooting method and related equipment
CN115631098B (en) Antireflection method and device
WO2023231696A1 (en) Photographing method and related device
WO2022206600A1 (en) Screen projection method and system, and related apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination