CN115550541A - Camera parameter configuration method and electronic equipment - Google Patents


Info

Publication number
CN115550541A
CN115550541A (application CN202210428351.7A)
Authority
CN
China
Prior art keywords
mode
camera
camera sensor
interface
parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210428351.7A
Other languages
Chinese (zh)
Other versions
CN115550541B (en)
Inventor
杜亚雯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202210428351.7A
Publication of CN115550541A
Application granted
Publication of CN115550541B
Current legal status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Environmental & Geological Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present application provides a camera parameter configuration method and an electronic device, relates to the field of terminal technologies, and improves the timeliness with which the electronic device switches the camera sensor's image output mode. The specific scheme is as follows: receiving a first operation, where the first operation instructs the device to open a camera application; in response to the first operation, sending the configuration parameters corresponding to a first output mode to the camera sensor; displaying a first interface, where the first interface is a photographing interface that includes a first image frame, and the first image frame is an image obtained by the camera sensor using the first output mode; sending, under a first condition, an identifier of a second output mode to the camera sensor, where the first condition includes: the electronic device is in a high dynamic range (HDR) scene, or the zoom parameter of the camera is greater than a first multiple; and displaying, in the first interface, a second image frame, the second image frame being an image obtained by the camera sensor using the second output mode.

Description

Camera parameter configuration method and electronic equipment
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a camera parameter configuration method and an electronic device.
Background
The photographing function has become a basic function of most electronic devices (e.g., mobile phones). As shooting functions have become ubiquitous, users' requirements for the shooting quality of electronic devices have grown ever higher. For an electronic device, instructing the camera sensor to enable the output mode adapted to each shooting scene can effectively improve shooting quality. For example, in a conventional scene, the Binning output mode is enabled; in a high dynamic range scene, the intra-field dual conversion gain (Idcg) output mode is enabled; and in a scene where the user indicates that the shot picture should be enlarged, the Quadra output mode is enabled.
Clearly, during actual shooting the scene may change quickly; however, the timeliness with which the output mode is switched accordingly remains poor, which in turn degrades the shooting quality of the electronic device.
Disclosure of Invention
The embodiments of the present application provide a camera parameter configuration method and an electronic device, which improve the switching efficiency of the image output mode and guarantee the shooting quality of the electronic device across different shooting scenes.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
In a first aspect, a camera parameter configuration method provided in an embodiment of the present application is applied to an electronic device, where the electronic device includes a camera and a camera sensor, and the method includes: receiving a first operation, where the first operation instructs the device to open a camera application; in response to the first operation, sending the configuration parameters corresponding to a first output mode to the camera sensor; displaying a first interface, where the first interface is a photographing interface that includes a first image frame, and the first image frame is an image obtained by the camera sensor using the first output mode; sending, under a first condition, an identifier of a second output mode to the camera sensor, where the first condition includes: the electronic device is in a high dynamic range (HDR) scene, or the zoom parameter of the camera is greater than a first multiple; and displaying, in the first interface, a second image frame, the second image frame being an image obtained by the camera sensor using the second output mode.
In the above embodiment, after the user instructs the electronic device to open the camera application, a photographing interface that includes the first image frame may be displayed. The first image frame is obtained by preprocessing a first raw image, and the first raw image is output by the camera sensor using the first output mode, such as the Binning output mode. Then, when the shooting scene changes, for example, the electronic device enters a high dynamic range (HDR) scene or the zoom parameter of the camera exceeds the first multiple, the electronic device may send an identifier of the second output mode to the camera sensor, instructing the camera sensor to switch output modes. That is, from detecting the change of shooting scene to completing the mode switch, only the identifier of the second output mode needs to be sent to the camera sensor; because the identifier carries little data, it is transmitted quickly, which effectively improves the timeliness with which the output mode tracks changes in the shooting scene.
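As an illustrative, non-authoritative sketch of the mechanism described above (the class, mode names, and parameter values here are hypothetical, not the patented implementation), the difference between the slow path (downloading a full parameter set) and the fast path (sending only a small mode identifier) can be modeled as follows:

```python
class CameraSensorModel:
    """Toy model of a sensor that caches per-mode configuration parameters."""

    def __init__(self):
        self._configs = {}      # mode id -> configuration parameters
        self.active_mode = None

    def write_config(self, mode_id, params):
        # Slow path: the full parameter set is transferred to the sensor.
        self._configs[mode_id] = params
        self.active_mode = mode_id

    def select_mode(self, mode_id):
        # Fast path: only a small identifier is sent; the sensor looks up
        # the parameters it already holds and switches immediately.
        if mode_id not in self._configs:
            raise KeyError(f"mode {mode_id} not configured yet")
        self.active_mode = mode_id

sensor = CameraSensorModel()
sensor.write_config("binning", {"resolution": (4000, 3000), "fps": 30})
sensor.write_config("idcg",    {"resolution": (4000, 3000), "fps": 30})
# Scene becomes HDR: switching needs only the identifier, not the parameters.
sensor.select_mode("idcg")
```

The point of the fast path is that once both modes' parameters are resident in the sensor, the scene-change case transmits only the identifier.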
In some embodiments, the method further includes: receiving a second operation, where the second operation instructs the device to display a second interface, the second interface being a first video recording interface; in response to the second operation, sending the configuration parameters corresponding to a third output mode to the camera sensor; displaying the second interface, where the second interface includes a third image frame, and the third image frame is an image obtained by the camera sensor using the third output mode; receiving a third operation, where the third operation instructs the device to display a third interface, the third interface being a second video recording interface, and the video recording parameters corresponding to the first and second video recording interfaces differ; in response to the third operation, sending an identifier of a fourth output mode to the camera sensor; and displaying the third interface, where the third interface includes a fourth image frame, and the fourth image frame is an image obtained by the camera sensor using the fourth output mode.
The first video recording interface and the second video recording interface belong to different camera modes. In the above embodiment, even when the camera mode changes, for example, switching from the camera mode of the first video recording interface to that of the second video recording interface, the switch to the adapted output mode can be instructed by sending the small identifier; that is, in shooting scenarios that involve switching between certain camera modes, the timeliness of output mode switching is still guaranteed.
In some embodiments, the video recording parameters include video resolution and video frame rate, and the second video recording interface is a recording interface for 4k dynamic frame rate video.
In some embodiments, before sending the configuration parameters corresponding to the first output mode to the camera sensor, the method further includes: sending a first data packet to the camera sensor, where the first data packet includes the configuration parameters corresponding to the first output mode and the configuration parameters corresponding to the second output mode. After sending the configuration parameters corresponding to the first output mode to the camera sensor, the method further includes: writing a first value into the first flag bit. Before sending the identifier of the second output mode to the camera sensor, the method further includes: determining that the value of the first flag bit is the first value.
In some embodiments, before sending the configuration parameters corresponding to the third output mode to the camera sensor, the method further includes: determining that the configuration parameters corresponding to the third output mode are contained in a second data packet, and sending the second data packet to the camera sensor if the second data packet has not yet been written to the camera sensor; writing a second value, different from the first value, into the first flag bit; and determining that the value of the first flag bit is the second value. After sending the configuration parameters corresponding to the third output mode to the camera sensor, the method further includes: writing the first value into the first flag bit again. Before sending the identifier of the fourth output mode to the camera sensor, the method further includes: determining that the second data packet contains the configuration parameters corresponding to the fourth output mode and that the value of the first flag bit is the first value.
The configuration parameters of the first and second output modes are the camera parameters corresponding to the photographing mode, the configuration parameters of the third output mode are the camera parameters corresponding to the video recording mode, and the configuration parameters of the fourth output mode are the camera parameters corresponding to the 4k dynamic frame rate mode. Only after the camera parameters are actually written to the camera sensor can the camera sensor use the corresponding output mode.
In the above embodiment, the camera parameters of the photographing mode, the video recording mode, and the 4k dynamic frame rate mode are encapsulated in separate data packets, such as the first data packet and the second data packet. In this way, the data volume of a single data packet is effectively bounded. In addition, the electronic device issues the first data packet to the camera sensor when the photographing mode is started, and issues the second data packet when the video recording mode or the 4k dynamic frame rate mode is started. Therefore, even if the storage space of the camera sensor is limited, the camera sensor can support output mode switching across multiple camera modes, improving the intelligence of the electronic device.
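The packet and flag-bit bookkeeping described in these embodiments can be sketched roughly as follows. This is a simplified model under assumptions: the class, the flag values, and the transmission log are invented for illustration and do not reflect the actual register protocol.

```python
FIRST_VALUE, SECOND_VALUE = 1, 0

class SensorDriver:
    """Driver-side bookkeeping: download a packet only when it is not
    already resident in the sensor; the flag bit records whether the
    resident packet's modes can be switched by identifier alone."""

    def __init__(self):
        self.resident_packet = None   # which data packet the sensor holds
        self.flag = SECOND_VALUE
        self.sent = []                # log of transmissions, for inspection

    def configure(self, packet_name, packet, mode_id):
        if self.resident_packet != packet_name:
            self.sent.append(("packet", packet_name))   # full download
            self.resident_packet = packet_name
            self.flag = SECOND_VALUE
        self.sent.append(("config", mode_id))           # activate one mode
        self.flag = FIRST_VALUE

    def switch_mode(self, packet, mode_id):
        # The fast switch is only legal when the needed parameters are
        # resident and the flag bit holds the first value.
        assert self.flag == FIRST_VALUE and mode_id in packet
        self.sent.append(("id", mode_id))               # identifier only

packet1 = {"binning": {}, "idcg": {}, "quadra": {}}     # photographing modes
drv = SensorDriver()
drv.configure("packet1", packet1, "binning")  # opening the camera app
drv.switch_mode(packet1, "idcg")              # HDR scene detected
```

Note how only the first call transmits a full packet; the subsequent scene change transmits the identifier alone.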
In some embodiments, after sending the configuration parameters corresponding to the first output mode to the camera sensor, the method further includes: the camera sensor of the electronic device is configured according to the configuration parameters corresponding to the first output mode; the camera sensor then restarts streaming and begins outputting images in the first output mode.
In some embodiments, after sending the identifier of the second output mode to the camera sensor, the method further includes: the camera sensor of the electronic device, in response to the identifier of the second output mode, accesses a first address, where the first address is the address at which the configuration parameters corresponding to the second output mode are stored in the camera sensor; the camera sensor loads the configuration parameters corresponding to the second output mode; and the camera sensor begins outputting images in the second output mode.
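The sensor-side view of this step can be sketched as a table lookup, with hypothetical register addresses (the addresses and field names are assumptions for illustration only): the mode identifier indexes a table of addresses at which the corresponding configuration parameters were previously stored.

```python
# Hypothetical mode id -> first address of the stored configuration.
MODE_ADDRESS_TABLE = {0x01: 0x1000, 0x02: 0x2000}

# Hypothetical sensor memory, populated earlier by a full packet download.
sensor_memory = {
    0x1000: {"mode": "binning", "analog_gain": 1.0},
    0x2000: {"mode": "idcg", "analog_gain": 2.0},
}

def on_mode_identifier(mode_id):
    """Resolve the identifier to a first address, then load the stored
    parameters for that mode and return the now-active configuration."""
    first_address = MODE_ADDRESS_TABLE[mode_id]
    return sensor_memory[first_address]

active = on_mode_identifier(0x02)   # identifier of the second output mode
```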
In some embodiments, in a case where the first condition is that the electronic device is in an HDR scene, the second output mode is the Idcg output mode, which is suitable for photographing.
In some embodiments, in a case where the first condition is that the zoom parameter of the camera is greater than the first multiple, the second output mode is the Quadra output mode, which is suitable for photographing.
In some embodiments, after the second image frame is displayed, the method further includes: sending, under a second condition, the identifier of the first output mode to the camera sensor, where the second condition includes: the electronic device is not in an HDR scene, and the zoom parameter of the camera is not greater than the first multiple; and displaying the first image frame on the first interface.
In a second aspect, an electronic device provided in an embodiment of the present application includes one or more processors and a memory. The memory is coupled to the processors and stores computer program code comprising computer instructions which, when executed by the one or more processors, cause the one or more processors to perform: receiving a first operation, where the first operation instructs the device to open a camera application; in response to the first operation, sending the configuration parameters corresponding to a first output mode to the camera sensor; displaying a first interface, where the first interface is a photographing interface that includes a first image frame, and the first image frame is an image obtained by the camera sensor using the first output mode; sending, under a first condition, an identifier of a second output mode to the camera sensor, where the first condition includes: the electronic device is in a high dynamic range (HDR) scene, or the zoom parameter of the camera is greater than a first multiple; and displaying, in the first interface, a second image frame, the second image frame being an image obtained by the camera sensor using the second output mode.
In some embodiments, the one or more processors are further configured to: receive a second operation, where the second operation instructs the device to display a second interface, the second interface being a first video recording interface; in response to the second operation, send the configuration parameters corresponding to a third output mode to the camera sensor; display the second interface, where the second interface includes a third image frame, and the third image frame is an image obtained by the camera sensor using the third output mode; receive a third operation, where the third operation instructs the device to display a third interface, the third interface being a second video recording interface; in response to the third operation, send an identifier of a fourth output mode to the camera sensor; and display the third interface, where the third interface includes a fourth image frame, and the fourth image frame is an image obtained by the camera sensor using the fourth output mode.
In some embodiments, the video recording parameters include a video resolution and a video frame rate, and the second video recording interface is a recording interface of a 4k dynamic frame rate video.
In some embodiments, the one or more processors are further configured to: send a first data packet to the camera sensor, where the first data packet includes the configuration parameters corresponding to the first output mode and the configuration parameters corresponding to the second output mode; after sending the configuration parameters corresponding to the first output mode to the camera sensor, write a first value into the first flag bit; and before sending the identifier of the second output mode to the camera sensor, determine that the value of the first flag bit is the first value.
In some embodiments, before sending the configuration parameters corresponding to the third output mode to the camera sensor, the one or more processors are further configured to: determine that the configuration parameters corresponding to the third output mode are contained in a second data packet, and send the second data packet to the camera sensor if the second data packet has not yet been written to the camera sensor; write a second value, different from the first value, into the first flag bit; and determine that the value of the first flag bit is the second value.
After sending the configuration parameters corresponding to the third output mode to the camera sensor, the one or more processors are further configured to: write the first value into the first flag bit again.
Before sending the identifier of the fourth output mode to the camera sensor, the one or more processors are further configured to: determine that the second data packet contains the configuration parameters corresponding to the fourth output mode and that the value of the first flag bit is the first value.
In some embodiments, after sending the configuration parameters corresponding to the first output mode to the camera sensor, the one or more processors are further configured to: configure the camera sensor according to the configuration parameters corresponding to the first output mode; and instruct the camera sensor to restart streaming and begin outputting images in the first output mode.
In some embodiments, after sending the identifier of the second output mode to the camera sensor, the one or more processors are further configured to: in response to the identifier of the second output mode, instruct the camera sensor to access a first address, where the first address is the address at which the configuration parameters corresponding to the second output mode are stored in the camera sensor; instruct the camera sensor to load the configuration parameters corresponding to the second output mode; and instruct the camera sensor to begin outputting images in the second output mode.
In some embodiments, in a case where the first condition is that the electronic device is in an HDR scene, the second output mode is the Idcg output mode, which is suitable for photographing.
In some embodiments, in a case where the first condition is that the zoom parameter of the camera is greater than the first multiple, the second output mode is the Quadra output mode, which is suitable for photographing.
In some embodiments, after the second image frame is displayed, the one or more processors are further configured to: send, under a second condition, the identifier of the first output mode to the camera sensor, where the second condition includes: the electronic device is not in an HDR scene, and the zoom parameter of the camera is not greater than the first multiple; and display the first image frame on the first interface.
In a third aspect, an embodiment of the present application provides a computer storage medium, which includes computer instructions, and when the computer instructions are executed on an electronic device, the electronic device is caused to perform the method in the first aspect and possible embodiments thereof.
In a fourth aspect, the present application provides a computer program product, which, when run on the above-mentioned electronic device, causes the electronic device to perform the method of the above-mentioned first aspect and its possible embodiments.
It can be understood that the electronic device, the computer-readable storage medium, and the computer program product provided in the foregoing aspects are all applied to the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods, which are not repeated here.
Drawings
Fig. 1 is a first diagram of the software and hardware structure of an electronic device according to an embodiment of the present application;
fig. 2 is a second diagram of the software and hardware structure of an electronic device according to an embodiment of the present application;
fig. 3 is a first flowchart of configuring camera parameters in some embodiments;
fig. 4 is a second flowchart of configuring camera parameters in some embodiments;
fig. 5 is a first schematic interface diagram of a mobile phone in a shooting scene according to an embodiment of the present application;
fig. 6 is a first signaling interaction diagram corresponding to a camera parameter configuration method according to an embodiment of the present application;
fig. 7 is a second signaling interaction diagram corresponding to a camera parameter configuration method according to an embodiment of the present application;
fig. 8 is a second schematic interface diagram of a mobile phone in a shooting scene according to an embodiment of the present application;
fig. 9 is a third signaling interaction diagram corresponding to a camera parameter configuration method according to an embodiment of the present application;
fig. 10 is a third schematic interface diagram of a mobile phone in a shooting scene according to an embodiment of the present application;
fig. 11 is a fourth signaling interaction diagram corresponding to a camera parameter configuration method according to an embodiment of the present application;
fig. 12 is a fourth schematic interface diagram of a mobile phone in a shooting scene according to an embodiment of the present application;
fig. 13 is a fifth signaling interaction diagram corresponding to a camera parameter configuration method according to an embodiment of the present application;
fig. 14 is a fifth schematic interface diagram of a mobile phone in a shooting scene according to an embodiment of the present application;
fig. 15 is a sixth signaling interaction diagram corresponding to a camera parameter configuration method according to an embodiment of the present application;
fig. 16 is a schematic composition diagram of a chip system according to an embodiment of the present application.
Detailed Description
In the following, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, "a plurality" means two or more unless otherwise specified.
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
The embodiment of the application provides a camera parameter configuration method which is applied to electronic equipment with a shooting function.
Illustratively, the electronic device may be a desktop computer, a laptop computer, a tablet computer, a handheld computer, a mobile phone, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a television, a VR device, an AR device, or the like.
As shown in fig. 1, the electronic device 100 may be divided, from top to bottom, into an application layer, an application framework layer (framework layer for short), a hardware abstraction layer (HAL), a kernel layer (also referred to as the driver layer), and a hardware layer, where each layer has a clear role and division of labor. The layers communicate with each other through software interfaces.
It is understood that fig. 1 is only an example; that is, the layers of the electronic device are not limited to those shown in fig. 1. For example, between the application framework layer and the HAL layer, the electronic device may further include an Android runtime layer, a system library (libraries) layer, and the like.
Illustratively, the application layer may include a series of application packages. As shown in fig. 1, the application layer may include a camera application. Of course, besides the camera application, other application packages may be included in the application layer, such as a gallery application, a video application, and so on.
Generally, an application is developed in the Java language and is implemented by calling the application programming interfaces (APIs) and programming framework provided by the application framework layer. Illustratively, the application framework layer includes some predefined functions.
As shown in fig. 1, the application framework layer may include a camera service that is available for a camera application to call to implement shooting-related functions. Of course, the application framework layer may further include a content provider, a resource manager, a notification manager, a window manager, a view system, a phone manager, and the like, and similarly, the camera application may also call the content provider, the resource manager, the notification manager, the window manager, the view system, and the like according to actual service requirements, which is not limited in this embodiment of the present application.
The kernel layer is the layer between hardware and software. As shown in fig. 1, the kernel layer contains at least a camera driver. The camera driver may be used to drive hardware modules with a shooting function, such as the camera sensor; in other words, the camera driver is responsible for data interaction with the camera sensor. Of course, the kernel layer may also include other drivers, such as an audio driver and a sensor driver, which is not limited in this embodiment of the present application.
In addition, the HAL layer may encapsulate a driver in the kernel layer and provide a calling interface to the application framework layer, shielding implementation details of the low-level hardware.
As shown in fig. 1, the HAL layer may include Camera HAL, a decision module and XML.
The Camera HAL is the core camera software framework and includes a Sensor node, an image processing module, an interface module, and the like. These are components of the image data and control instruction transmission pipeline in the Camera HAL, and different components serve different functions. For example, the Sensor node may be a control node facing the camera sensor, which it can control through the camera driver. For another example, the interface module may be a software interface facing the application framework layer, used for data interaction with that layer; of course, the interface module may also interact with other modules in the HAL (e.g., the decision module, the image processing module, and the Sensor node). For another example, the image processing module may process raw image data returned by the camera sensor. Illustratively, the image processing module may include an image front end (IFE) node and a Bayer processing segment (BPS) node, where the IFE node processes the preview stream captured by the camera sensor and the BPS node processes the photo stream. In addition, the image processing module may further include nodes with other image processing capabilities, for which reference may be made to the related art; details are not repeated here.
In addition, the decision module is a multi-camera decision module under the CamX-CHI architecture; according to scene information, the decision module can determine which camera sensor actually outputs images (such as the front or rear camera sensor) and the output mode that sensor uses. The output modes of the camera sensor are described in detail in subsequent embodiments.
In addition, the above XML may be used to transmit and store data. For example, the XML may obtain various types of configuration parameters from the memory of the electronic device, such as the configuration parameters required for the camera sensor to operate. Meanwhile, the XML also allows the Sensor node to query the stored configuration parameters.
In addition, fig. 1 also illustrates an example of a hardware module that can be driven in a hardware layer, such as a camera sensor. Of course, hardware modules not shown in fig. 1, such as a camera, a processor, a memory, etc., may also be included in the hardware layer.
As previously described, the camera sensor may support a variety of output modes, such as the Binning output mode, the Idcg output mode, and the Quadra output mode.
In the Binning output mode, after the camera sensor captures the original pixel array, the sensed charges of adjacent pixels in the array are summed and output as a single pixel. Because several adjacent pixels are combined into one, the raw image data that the camera sensor outputs to the camera driver has a lower resolution than the original pixel array, while the effective photosensitive area increases; thus, with the field of view (FOV) unchanged, sensitivity in low light improves. In general, the Binning output mode is also the default output mode.
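As a rough illustration of the charge-summing idea, the following sketch bins a 2x2 block into one output pixel. It is simplified to a single-channel array; a real sensor bins same-color pixels within the Bayer pattern, and the exact summing/averaging behavior is sensor-specific.

```python
def bin2x2(pixels):
    """Sum each 2x2 block of a single-channel array into one output pixel,
    quartering the resolution while increasing the effective charge per
    output pixel. Assumes even dimensions."""
    h, w = len(pixels), len(pixels[0])
    return [
        [pixels[r][c] + pixels[r][c + 1] + pixels[r + 1][c] + pixels[r + 1][c + 1]
         for c in range(0, w, 2)]
        for r in range(0, h, 2)
    ]

raw = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 10, 11, 12],
       [13, 14, 15, 16]]
binned = bin2x2(raw)   # 4x4 array -> 2x2 array
```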
In addition, in the Idcg image mode, the dynamic range of the camera sensor, which is the capability of the camera sensor to simultaneously represent highlight and shadow content in an image, can be increased. Wherein the larger the dynamic range of the camera sensor, the stronger the capability of indicating that the camera sensor can embody highlight and shadow content. In the Idcg image mode, the camera sensor synchronously acquires a high gain (HCG) image and a low gain (LCG) image corresponding to the same frame of original image data with the same exposure time. Then, the camera sensor fuses the high gain map and the low gain map into one frame of image as the original image data actually output to the camera driver.
The Idcg graph mode has a larger dynamic range than the Binning graph mode; of course, its power consumption is also higher, about 1.5 times that of the Binning graph mode.
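The HCG/LCG fusion described above can be sketched as follows. The blend rule, saturation threshold, and gain ratio below are assumptions chosen for illustration; the actual fusion algorithm parameters are among the DCG-related customization parameters and are not specified in this document.

```python
# Illustrative sketch of dual-conversion-gain fusion: the HCG frame
# preserves shadow detail, the LCG frame preserves highlights, and the
# two are merged into one frame with a larger dynamic range.
def fuse_dcg(hcg, lcg, gain_ratio=4.0, sat=255):
    fused = []
    for h_px, l_px in zip(hcg, lcg):
        if h_px >= sat:
            # HCG pixel clipped: use the LCG pixel, scaled by the
            # internal gain ratio so both frames share one scale
            fused.append(l_px * gain_ratio)
        else:
            # otherwise keep the lower-noise HCG pixel
            fused.append(float(h_px))
    return fused

hcg_row = [40, 255, 120]   # the middle pixel saturates in the HCG frame
lcg_row = [10, 90, 30]
fused_row = fuse_dcg(hcg_row, lcg_row)
```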
In the Quadra graph mode, the camera sensor outputs the original pixel array collected by the 4-cell sensor directly to the camera driver as the raw image data; that is, in the Quadra graph mode, the raw image data received by the camera driver has not undergone binning pixel synthesis. As a result, while the Quadra graph mode is enabled, the raw image data received by the camera driver cannot be directly recognized and processed. The raw image data first needs to be converted into a standard Bayer format, a process called remosaic. Illustratively, the raw image data received by the camera driver may be converted into remosaic image data by an image processing module in the Camera HAL. Compared with the raw image data obtained in the Binning graph mode, the remosaic image data has more pixels and higher definition, and is therefore better suited to shooting scenes in which the user indicates to enlarge the shooting picture (i.e., increases the zoom parameter).
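The remosaic step can be illustrated structurally. A 4-cell (Quad Bayer) array groups 2x2 same-color pixels; one classic nearest-neighbor remosaic swaps rows 1 and 2 and columns 1 and 2 inside every 4x4 tile, which turns the quad pattern into a standard RGGB Bayer pattern. Production remosaic algorithms interpolate rather than shuffle pixels; this sketch shows only the structural idea, not the algorithm used by the device.

```python
# Simplified pixel-shuffle remosaic: convert a Quad Bayer tile into a
# standard RGGB Bayer tile by swapping rows 1<->2 and columns 1<->2
# within each 4x4 tile.
def remosaic(quad):
    h, w = len(quad), len(quad[0])
    out = [row[:] for row in quad]
    for ty in range(0, h, 4):
        for tx in range(0, w, 4):
            for r in range(4):  # swap columns 1 and 2 of the tile
                out[ty + r][tx + 1], out[ty + r][tx + 2] = \
                    out[ty + r][tx + 2], out[ty + r][tx + 1]
            for c in range(4):  # swap rows 1 and 2 of the tile
                out[ty + 1][tx + c], out[ty + 2][tx + c] = \
                    out[ty + 2][tx + c], out[ty + 1][tx + c]
    return out

quad_tile = [list("RRGG"),   # Quad Bayer: 2x2 blocks of one color
             list("RRGG"),
             list("GGBB"),
             list("GGBB")]
bayer_tile = remosaic(quad_tile)  # standard RGGB alternation
```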
The various graph modes have different advantages and suit different scenes. For example, the Binning graph mode suits conventional shooting scenes, the Idcg graph mode suits backlit or highlight shooting scenes, and the Quadra graph mode suits shooting scenes in which the user indicates to increase the zoom parameter.
Obviously, during shooting, the electronic device switches between graph modes according to the shooting scene. In this way, the electronic device can capture high-quality image data (e.g., photos, videos) in different shooting scenes.
However, the customization parameters required by different graph modes differ. Therefore, when switching between graph modes, the electronic device needs to configure the corresponding customization parameters in the camera sensor. The customization parameters may include the field angle, image size, image aspect ratio, frame rate, and other parameters.
Illustratively, the other parameters may include one or more of: color, data transmission rate, exposure parameters, frame length in lines (Frame Length Lines, including vertical blanking), line length in pixel clocks (Line Length PCK, including horizontal blanking), cropping parameters, scaling parameters, clock frequency, phase-focusing parameters, pixel-merging mode, internal timing, effect-processing-related parameters, DCG-related parameters (e.g., the internal gain ratio of LCG to HCG, LCG/HCG image fusion algorithm parameters, etc.), and the like.
In addition, even for the same type of graph mode, the customization parameters to be configured differ between camera modes. The camera modes may include a photographing mode, a video recording mode, a 4k dynamic frame rate mode, and the like. The photographing mode is a function mode for taking photos, the video recording mode is a function mode for shooting videos, and the 4k dynamic frame rate mode is a function mode for shooting 4k dynamic frame rate video, where 4k dynamic frame rate video is video data with a resolution of 4k and a dynamically changeable frame rate. When different camera modes are enabled, the camera application may display different shooting interfaces.
Illustratively, the customized parameters corresponding to the mapping mode in each camera mode are shown in table 1 below:
TABLE 1
[Table 1 is rendered as images in the original document. The entries recoverable from the surrounding text are, for the photographing mode: Binning graph mode - field angle 78°, image size 3632x2720, aspect ratio 4:3, frame rate 30 fps, other parameters (e.g., scaling parameters); Idcg graph mode - field angle 78°, image size 3632x2720, aspect ratio 4:3, frame rate 30 fps, other parameters (e.g., DCG-related parameters); Quadra graph mode - the same shared parameters, with other parameters (e.g., cropping parameters). The entries for the video recording mode and the 4k dynamic frame rate mode are not recoverable here.]
The customization parameters corresponding to the Binning graph mode can be called as Binning customization parameters; the customization parameter corresponding to the Idcg map mode can be called Idcg customization parameter; the customization parameters corresponding to the Quadra map mode can be called Quadra customization parameters.
As can be seen from Table 1, in the same camera mode, the Binning customization parameters, the Idcg customization parameters, and the Quadra customization parameters share some parameters, such as the field angle, image size, image aspect ratio, and frame rate. Therefore, within the same camera mode, switching among the Binning graph mode, the Idcg graph mode, and the Quadra graph mode does not affect the consistency of the picture.
Of course, the Binning customization parameters, the Idcg customization parameters, and the Quadra customization parameters also differ. For example, the Binning customization parameters include scaling parameters, while the Quadra customization parameters do not. As another example, the value of the cropping parameter differs between the Binning customization parameters and the Quadra customization parameters. As another example, the Idcg customization parameters include DCG-related parameters, while neither the Binning customization parameters nor the Quadra customization parameters do. As another example, the internal timing in the Binning and Quadra customization parameters indicates that only the LCG image (or the HCG image) is output for one frame of original image, while the internal timing in the Idcg customization parameters indicates that the LCG image and the HCG image are output synchronously for the same frame.
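The shared/distinct structure described above can be modeled with dicts. The values below are illustrative only (the field angle, size, ratio, and frame rate come from Table 1 for the photographing mode; the mode-specific entries are assumed names): the shared keys keep the picture consistent across a switch, while the mode-specific keys differ.

```python
# Sketch: each set of customization parameters as a dict; shared keys
# (field angle, image size, aspect ratio, frame rate) are identical
# within one camera mode, mode-specific keys differ.
binning_params = {"fov": 78, "size": (3632, 2720), "ratio": "4:3",
                  "frame_rate": 30, "scaling": 1.0}
idcg_params    = {"fov": 78, "size": (3632, 2720), "ratio": "4:3",
                  "frame_rate": 30, "dcg_gain_ratio": 4.0}
quadra_params  = {"fov": 78, "size": (3632, 2720), "ratio": "4:3",
                  "frame_rate": 30, "crop": (0, 0)}

# parameters common to all three graph modes in the photographing mode
shared = set(binning_params) & set(idcg_params) & set(quadra_params)
```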
In addition, the graph modes corresponding to each camera mode in Table 1 are only examples; it is understood that each camera mode may correspond to more or fewer graph modes. For example, the video recording mode may correspond to the Quadra graph mode in addition to the Binning graph mode and the Idcg graph mode. As another example, the video recording mode may correspond to the Binning graph mode only.
For example, as shown in fig. 1, the camera application may transfer information such as the camera mode and zoom parameters selected by the user to the camera service of the application framework layer, and the camera service then transfers the information to the decision module through the interface module of the HAL layer. The decision module can thus determine the graph mode adapted to the current shooting scene according to the camera mode, the zoom parameter, and the received ambient lighting parameter, and notify the Sensor node. Finally, the Sensor node configures the customization parameters corresponding to that graph mode into the camera sensor through the camera driver; for example, the Sensor node instructs the camera driver to interact with the I2C interface of the camera sensor to write the customization parameters into the camera sensor.
Thus, as shown in fig. 2, after the customization parameters are configured, the camera sensor sends raw image data to the camera driver according to the graph mode corresponding to the customization parameters, for example, through the MIPI interface of the camera sensor. The camera driver then sends the raw image data to the image processing module, and an image frame is obtained after the image processing module processes the raw image data. It will be appreciated that when the camera application is in the capture preview phase, the resulting image frames may also be called preview frames, and when the camera application is in the actual capture phase, the resulting image frames may also be called camera frames. The image processing module may send the image frame to the camera application through the interface module and the camera service for display by the electronic device.
Of course, due to the limitations of the hardware platform of the electronic device, such as the system-on-chip (e.g., a Qualcomm chip) and the camera sensor, in the related art the process of configuring the parameters required by different graph modes is more complicated than the process described above. In addition, before the customization parameters are configured, common parameters need to be configured in the camera sensor. The common parameters may include the data transmission protocol, internal timing, interrupt frequency, and the like. In the following embodiments, the common parameters and the customization parameters may be collectively referred to as camera parameters.
As shown in fig. 3, in some embodiments, the process of configuring the camera parameters is as follows:
a1, the camera application receives operation 1 indicating a foreground run.
Illustratively, the above operation 1 may be an operation in which the user instructs the camera application to enter the foreground for running. For example, operation 1 may be an operation of clicking a camera-function shortcut key, clicking the camera application icon, or clicking the task window corresponding to the camera application in a multi-task interface; operation 1 may also be the user speaking a voice instruction to open the camera application or making a gesture instructing to open the camera application.
In addition, when the background applications do not include the camera application, instructing the camera application to enter foreground operation means instructing the camera application to start and run in the foreground. When the background applications include the camera application, instructing the camera application to enter foreground operation means instructing the camera application to switch from background operation to foreground operation.
In some examples, operation 1 above may trigger the camera application to invoke the camera service, i.e., the flow goes to A2.
A2, the camera application sends request information 1 to the camera service, wherein the request information 1 comprises a photographing mode identifier and a zooming parameter 1.
If, before operation 1 is received, the camera service is in a dormant state (for example, the background applications do not include the camera application), the request information 1 is used to wake up the camera service and instruct it to provide the corresponding service to the camera application. If, before operation 1 is received, the camera service is not dormant (for example, the background applications include the camera application), the request information 1 is used to instruct the camera service to provide the corresponding service to the camera application.
In addition, camera parameters, such as zoom parameters and an identification of a camera mode, are carried in the request information 1.
In the case where the user gives no special instruction, the zoom parameter carried in request information 1 may be zoom parameter 1, for example, 1x.
In addition, in the case that the background applications do not include the camera application before operation 1 is received, the identifier of the camera mode carried in request information 1 may indicate the photographing mode; that is, request information 1 includes the photographing mode identifier.
In the case where the camera application is included in the background applications before operation 1 is received, the identifier of the camera mode in request information 1 may indicate the camera mode that was enabled before the camera application entered background operation. If the photographing mode was enabled before the camera application switched to background operation, request information 1 includes the photographing mode identifier; if the video recording mode was enabled, request information 1 includes the video recording mode identifier.
Of course, in the following embodiments, the description is given by taking the case where the camera application is not included in the background application before the operation 1 is received. Thus, the request information 1 may include a photographing mode identification, a zoom parameter 1.
And A3, the camera service sends request information 2 to the interface module, wherein the request information 2 comprises a photographing mode identifier and a zooming parameter 1.
In some embodiments, the camera service generates request information 2 in response to request information 1. Request information 2 is used to request that the parameters be delivered downward. Request information 2 also includes the identifier of the camera mode and the zoom parameter, which are the same as those carried in request information 1; that is, request information 2 also includes the photographing mode identifier and zoom parameter 1.
In some embodiments, the camera service sends the request information 2 to the interface module of the camera HAL, so as to instruct the camera HAL to enable the camera sensor, and instruct the camera HAL to transmit the photographing mode identifier and the zooming parameter 1 carried in the request information 2 to the decision module, and instruct the decision module to decide the image drawing mode that the camera sensor needs to use.
And A4, the interface module sends the photographing mode identifier and the zooming parameter 1 to the decision module.
And A5, the decision module determines a Binning image drawing mode 1 corresponding to the photographing mode according to the zooming parameter 1.
In some embodiments, when the decision module receives the photographing mode identifier, the drawing mode that needs to be adopted may be selected from the drawing modes corresponding to the photographing mode. For convenience of description, the Binning image pattern corresponding to the photographing mode is simply referred to as Binning image pattern 1, the Idcg image pattern corresponding to the photographing mode is simply referred to as Idcg image pattern 1, and the Quadra image pattern corresponding to the photographing mode is simply referred to as Quadra image pattern 1.
In a scene where the photographing mode corresponds to a plurality of graph modes and the decision module is triggered by operation 1 to make a decision, the decision module may determine the graph mode to be adopted according to the received zoom parameter. For example, when the received zoom parameter is zoom parameter 1, the decision module defaults to Binning graph mode 1. As another example, when the received zoom parameter is greater than zoom parameter 1, the decision module defaults to Quadra graph mode 1.
A6, the decision module sends an identifier 1 to the Sensor node, and the identifier 1 corresponds to the Binning graph mode 1.
In some embodiments, the identifier 1 may be a serial number corresponding to the Binning graph mode 1. For example, the identifiers corresponding to different drawing modes can be configured in advance, and the identifiers of different drawing modes are different.
A7, the Sensor node reads the configuration parameter packet 1 from the XML.
In some embodiments, after receiving identifier 1, the Sensor node may send request information for querying the configuration parameter package to the XML. In this example, the XML stores only configuration parameter package 1, so the XML may return configuration parameter package 1 to the Sensor node.
After the Sensor node reads the configuration parameter packet 1, the flow proceeds to A8.
A8, the Sensor node sends configuration parameter packet 1 to the camera driver; configuration parameter packet 1 comprises common parameters and customization parameters corresponding to a plurality of camera modes, the customization parameters comprising Binning customization parameters and Idcg customization parameters.
A9, the camera driver sends a configuration parameter packet 1 to the camera sensor.
In some embodiments, the Sensor node may write configuration parameter packet 1 into the camera Sensor through the camera drive. The configuration parameter package 1 may include common parameters and customized parameters. The customization parameters include a plurality of customization parameters corresponding to the image mode, for example, the customization parameters may include a Binning customization parameter corresponding to the photographing mode, an Idcg customization parameter corresponding to the photographing mode, a Binning customization parameter corresponding to the video mode, a Binning customization parameter corresponding to the 4k dynamic frame rate mode, and the like.
It will be appreciated that, after configuration parameter packet 1 is written into the camera sensor, the graph modes to which the camera sensor can actually switch are determined by the customization parameters contained in configuration parameter packet 1. For example, if configuration parameter packet 1 includes the Binning customization parameters corresponding to the video recording mode, the graph modes to which the camera sensor can actually switch include the Binning graph mode in the video recording mode. As another example, if configuration parameter packet 1 does not include the Idcg customization parameters corresponding to the video recording mode, then even if the camera sensor supports the Idcg graph mode in the video recording mode, it cannot switch to that mode in actual operation.
Ideally, configuration parameter package 1 should include the customization parameters for all graph modes, so that the camera sensor can successfully switch to every graph mode it supports in practical application. In practice, however, the memory space inside the camera sensor is limited and cannot store a configuration parameter package containing all the customization parameters. In these embodiments, some customization parameters are therefore omitted from configuration parameter packet 1, so that after configuration parameter packet 1 is configured, the camera sensor cannot enable some graph modes. For example, when configuration parameter packet 1 does not include the Idcg customization parameters corresponding to the video recording mode, the Idcg graph mode in the video recording mode cannot be enabled.
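The trade-off above can be sketched as a packing problem. The parameter-set names, sizes, and memory budget below are invented for illustration; the document does not state the actual sizes or the selection policy.

```python
# Illustrative sketch of the sensor-memory constraint: the startup
# configuration package can hold only as many per-mode customization
# parameter sets as the sensor's memory budget allows, so some graph
# modes must be dropped.
def build_config_package(candidates, budget_bytes):
    """Keep customization sets in priority order until the
    sensor-side memory budget is exhausted."""
    package, used = [], 0
    for name, size in candidates:
        if used + size <= budget_bytes:
            package.append(name)
            used += size
    return package

candidates = [            # (customization set, assumed size in bytes)
    ("photo_binning", 600),
    ("photo_idcg", 700),
    ("video_binning", 600),
    ("video_idcg", 700),  # dropped below: exceeds the remaining budget
]
kept = build_config_package(candidates, budget_bytes=2000)
```

With this selection, a sensor configured from `kept` could never enable the Idcg graph mode in the video recording mode, which mirrors the limitation described in the text.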
And A10, configuring the camera sensor according to the common parameters.
In some embodiments, after the camera sensor receives the configuration parameter package 1, the common parameters in the configuration parameter package 1 may be loaded. After the camera Sensor has configured the common parameters, a notification, for example, referred to as notification 1, may be sent back to the Sensor node through the camera driver, and thus the process proceeds to a11.
A11, the Sensor node sends a Binning customizing parameter 1 to a camera driver, wherein the Binning customizing parameter 1 is a customizing parameter corresponding to a photographing mode and is used for realizing the Binning drawing mode 1.
Binning customization parameter 1 is the customization parameter corresponding to Binning graph mode 1. As shown in Table 1, it may include a field angle of 78°, an image size of 3632x2720, an aspect ratio of 4:3, a frame rate of 30 fps, and other parameters (e.g., scaling parameters), etc.
A12, the camera driver sends Binning customization parameters 1 to the camera sensor.
Illustratively, after the camera driver receives notification 1, the camera driver again writes Binning customization parameter 1 into the camera sensor.
And A13, configuring the camera sensor according to the Binning customization parameter 1.
And A14, the Sensor node sends a streaming instruction to the camera driver.
In some embodiments, after the camera sensor receives and configures the Binning customization parameter 1, a notification 2 is sent to the camera driver, which notification 2 is used to prompt that the Binning customization parameter 1 is configured to be completed. After the camera driver receives the notification 2, the camera driver may forward the notification 2 to the Sensor node, and the Sensor node determines whether the camera Sensor starts data acquisition, e.g., starts to acquire raw image data. After the Sensor node determines that the camera Sensor needs to start data acquisition, the Sensor node sends a streaming start instruction to the camera driver.
And A15, the camera driver sends the stream-on instruction to the camera sensor.
And A16, the camera sensor adopts Binning graph mode 1 and starts outputting images.
In some examples, in response to the stream-on instruction, the camera sensor may process the acquired original pixel array using Binning graph mode 1 to obtain raw image data meeting the requirements of Binning graph mode 1, and send the raw image data to the image processing module through the camera driver. In this way, when the photographing preview interface (that is, the application interface displayed after the electronic device receives operation 1) is displayed, the image frames output by the image processing module may be displayed.
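Steps A8 to A16 can be condensed into a short sketch: the common parameters are written first, then the customization parameters for the chosen graph mode, and only then is the stream-on instruction issued. The class and method names below are illustrative stand-ins, not the actual driver interface.

```python
# Condensed sketch of the configuration order in steps A8-A16.
class FakeCameraSensor:
    def __init__(self):
        self.registers = {}
        self.streaming = False

    def write(self, params):
        # stands in for register writes over the I2C interface
        self.registers.update(params)

    def stream_on(self):
        # the sensor may start outputting only after both the common
        # and the customization parameters have been configured
        assert "protocol" in self.registers and "fov" in self.registers
        self.streaming = True

sensor = FakeCameraSensor()
sensor.write({"protocol": "mipi", "interrupt_hz": 30})  # common parameters (A8-A10)
sensor.write({"fov": 78, "frame_rate": 30})             # Binning customization parameter 1 (A11-A13)
sensor.stream_on()                                      # stream-on instruction (A14-A16)
```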
While the photographing preview interface is displayed, the shooting scene of the electronic device may change, and so may the required graph mode. For example, the ambient lighting parameters collected by the camera sensor may indicate that the electronic device is currently in a high-light environment. As shown in fig. 4, the step of configuring the camera parameters further includes:
a17, the decision module obtains an ambient light parameter 1.
In some examples, during operation of the camera sensor, the light sensing element in the camera sensor may sense the ambient light of the environment in real time, obtain the ambient light parameter, and transmit the ambient light parameter to the decision module.
And A18, the decision module determines Idcg graph mode 1 corresponding to the photographing mode according to zoom parameter 1 and ambient lighting parameter 1.
In some embodiments, the decision module decides the graph mode actually used according to the received zoom parameter and the current ambient lighting parameter. The ambient lighting parameter may be used to determine whether the device is in a high dynamic range (HDR) scene. Illustratively, an HDR scene is determined when the ambient lighting parameters indicate that both high-brightness areas (e.g., sky) and low-brightness areas (e.g., ground) exist in the field of view of the camera. Whether high-brightness and low-brightness areas coexist may be identified by computing the difference between the highest and lowest luminance in the field of view and determining that they coexist when the difference exceeds a specified value. Of course, the specific determination rule may refer to the related art and is not described here.
In other words, the decision module can decide the graph mode actually used according to the zoom parameter and whether the device is in an HDR scene. Illustratively, in the photographing mode, when the zoom parameter is smaller than a first multiple (e.g., 2x) and the device is in an HDR scene, Idcg graph mode 1 is selected. Also illustratively, when the zoom parameter is smaller than 2x and the device is not in an HDR scene, Binning graph mode 1 is selected. Also illustratively, when the zoom parameter is not smaller than 2x, Quadra graph mode 1 is selected regardless of whether the device is in an HDR scene.
When the mobile phone is in an HDR scene or the zoom parameter of the camera of the mobile phone is greater than a first multiple, it may be said that the mobile phone currently satisfies a first condition. When the mobile phone is not in the HDR scene and the zoom parameter of the camera of the mobile phone is not greater than the first multiple, it may be said that the mobile phone currently satisfies the second condition.
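The decision rules above can be sketched directly. The first multiple (2x) and the luminance-difference HDR test come from the text; the specific luminance threshold below is an assumed value, since the document does not specify it.

```python
# Sketch of the photographing-mode decision: choose a graph mode from
# the zoom parameter and whether the ambient lighting indicates HDR.
def is_hdr_scene(max_lum, min_lum, threshold=180):
    # high- and low-brightness regions coexist when the luminance
    # spread in the field of view exceeds a specified value
    return (max_lum - min_lum) > threshold

def choose_graph_mode(zoom, hdr, first_multiple=2.0):
    if zoom >= first_multiple:
        return "quadra"              # zoomed in: more pixels needed
    return "idcg" if hdr else "binning"

mode_backlit = choose_graph_mode(1.0, is_hdr_scene(250, 20))   # HDR scene
mode_indoor  = choose_graph_mode(1.0, is_hdr_scene(120, 80))   # even lighting
mode_zoomed  = choose_graph_mode(3.0, is_hdr_scene(250, 20))   # high zoom
```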
If, before making a decision, the decision module has not received a new zoom parameter or a new camera mode identifier issued by the camera application, the decision module may evaluate, based on zoom parameter 1 and ambient lighting parameter 1, which of the graph modes corresponding to the photographing mode suits the current shooting scene. When ambient lighting parameter 1 indicates that the electronic device is in an HDR scene, the decision module may determine to adopt Idcg graph mode 1 corresponding to the photographing mode.
A19, the decision module sends an identifier 2 corresponding to the Idcg graph mode 1 to the Sensor node.
Identifier 2 is a preset identifier uniquely corresponding to Idcg graph mode 1. Identifier 2 is also associated with Idcg customization parameter 1, and this association may likewise be preset. In this way, the Sensor node can determine from identifier 2 the customization parameter to be sent to the camera driver, namely Idcg customization parameter 1. The process may then proceed to A20.
A20, the Sensor node sends an Idcg custom parameter 1 to the camera driver, wherein the Idcg custom parameter 1 is a custom parameter corresponding to the photographing mode and is used for realizing the Idcg image mode 1.
Idcg customization parameter 1 is the customization parameter corresponding to Idcg graph mode 1. As shown in Table 1, Idcg customization parameter 1 may include a field angle of 78°, an image size of 3632x2720, an aspect ratio of 4:3, a frame rate of 30 fps, and other parameters (e.g., DCG-related parameters), etc.
A21, the camera driver sends Idcg custom parameter 1 to the camera sensor.
A22, the camera sensor loads the received Idcg custom parameter 1.
After the camera sensor finishes loading the Idcg custom parameter 1, the camera sensor may implement a switch to the Idcg map mode to fit the current shooting scene.
A23, the camera sensor outputs images according to Idcg graph mode 1.
In some embodiments, the camera sensor may process the acquired original pixel array using Idcg graph mode 1 to obtain raw image data meeting the requirements of Idcg graph mode 1, and send the raw image data to the image processing module through the camera driver.
Obviously, each time the electronic device determines that the graph mode needs to be switched, it must wait a relatively long time before the switch is actually completed. This latency is essentially the time needed for the camera driver to receive Idcg customization parameter 1 in its entirety and write it into the camera sensor. It can be seen that, in the above embodiment, when the shooting scene changes, the timeliness of switching between graph modes is poor.
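The scale of this latency can be estimated with back-of-the-envelope arithmetic. The register count and bus speed below are assumed figures, not values from this document; the point is only that writing a full customization parameter set over I2C takes a perceptible amount of time.

```python
# Rough estimate of the switch latency: the time to push one
# customization parameter set over the I2C bus.
def i2c_write_time_ms(num_registers, bytes_per_write=3,
                      bus_hz=400_000, bits_per_byte=9):
    # 9 clock cycles per byte approximates the 8 data bits plus the
    # ACK bit on the I2C bus; addressing overhead is ignored here
    total_bits = num_registers * bytes_per_write * bits_per_byte
    return total_bits / bus_hz * 1000

# e.g. ~2000 register writes over a 400 kHz bus
latency_ms = i2c_write_time_ms(2000)
```

Under these assumed figures the write alone takes on the order of a hundred milliseconds, i.e., several preview frames at 30 fps, which is consistent with the poor switching timeliness described above.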
In addition, in the above embodiment, due to limitations of the internal program of the system-on-chip, configuration parameter packet 1 is written into the camera sensor only when operation 1 is received. Meanwhile, due to the memory space limitation of the camera sensor, configuration parameter package 1 cannot include the customization parameters required by all graph modes. For example, if configuration parameter packet 1 does not contain the Idcg customization parameters for the video recording mode, then after the camera sensor is configured with configuration parameter packet 1, it cannot switch to the Idcg graph mode in the video recording mode, which greatly affects the shooting performance of the electronic device.
The embodiment of the present application provides a camera parameter configuration method applied to an electronic device with a shooting function, such as a mobile phone. The method can solve the problems that the camera sensor can actually switch among only a few graph modes and that the timeliness of switching graph modes is poor.
The camera parameter configuration method provided in the embodiment of the present application is described below by taking an example in which the electronic device is a mobile phone.
In some embodiments, as shown in fig. 5, after the mobile phone is unlocked, a home interface 501 may be displayed. The home interface 501 includes an application icon 502 for the camera application. In a scenario in which the background applications of the mobile phone do not include the camera application, the mobile phone receives a click operation by the user on the application icon 502 and may first display a waiting interface, such as interface 503. Interface 503 may be the application interface corresponding to the photographing mode. When interface 503 is displayed, the camera sensor has not yet returned original image data; that is, there is no displayable image frame in interface 503.
In other embodiments, in a scene where the background application of the mobile phone includes the camera application, if the video recording mode is enabled before the camera application enters the background operation, the mobile phone receives a click operation of the application icon 502 by the user, and may also display a waiting interface, where the waiting interface is an application interface corresponding to the video recording mode.
In other possible embodiments, a long press operation of the application icon 502 by the user may be received during the display of the main interface 501 by the mobile phone. In this way, the handset can display a mode selection window with respect to the application icon 502, the mode selection window including mode controls that indicate the respective camera modes. In this scenario, the handset may receive a user operation on any mode control and determine the camera mode selected by the user. For example, the mobile phone may display a waiting interface corresponding to the video recording mode after receiving an operation of the mode control indicating the video recording mode by the user.
In addition, besides clicking the application icon 502, in a scene that the background application of the mobile phone includes a camera application, the mobile phone may be instructed to display the waiting interface in a manner of operating a multi-task interface. Of course, when the background application of the mobile phone does not include the camera application, the mobile phone may be instructed to display the waiting interface corresponding to the default camera mode by operating the camera shortcut key. In addition, whether the camera application needs to be opened or not may also be analyzed by recognizing a voice instruction spoken by the user, or by detecting a gesture motion or the like made by the user. For example, recognizing that the user speaks the keywords "camera," "capture," etc., it may be determined that the camera application needs to be opened and the corresponding wait interface is displayed. As another example, recognizing that the user makes a gesture motion associated with the camera application, it may also be determined that the camera application needs to be opened and a corresponding waiting interface is displayed. Additionally, gesture actions associated with the camera application may be preset.
Continuing with the example of the waiting interface 503 being displayed. During display of the interface 503, the handset may perform configuration of camera parameters to instruct the camera sensor to initiate acquisition of raw image data.
In some embodiments, as shown in fig. 6, the method may include:
s101, the camera application determines that the operation of the application icon 502 by the user is received.
In some embodiments, the operation may be a click operation of the application icon 502 by the user.
In other embodiments, the operation may also be a long-press operation of the application icon 502 by the user. Of course, the mobile phone may also display a mode selection window upon receiving a long press operation of the application icon 502 by the user. Mode controls indicating the respective camera modes are included in the mode selection window. The process may also proceed to S102 after the camera application determines that the user selects a mode control of any camera mode.
It is understood that S101 above is only an example. In actual use, the mobile phone may also cause the flow to enter S102 in response to other user operations. For example, the user operates a window 1 in the multi-task interface, where window 1 displays an application interface thumbnail of the camera application. As another example, the user clicks a shortcut entry of the camera application (e.g., a camera shortcut entry displayed in the negative one screen of the mobile phone). As yet another example, the mobile phone may cause the flow to enter S102 upon detecting that the user speaks a keyword related to the camera application, or performs a gesture related to the camera application, or the like.
The operations described above are all operations that can instruct to open the camera application (i.e., to bring the camera application into the foreground for running), and may also be referred to as a first operation.
And S102, the camera application sends request information 1 to the camera service, wherein the request information 1 comprises a photographing mode identifier and a zooming parameter 1.
The request information 1 is used to wake up the camera service and instruct the camera service to provide a corresponding service to the camera application. In addition, the request information 1 carries camera parameters, such as zoom parameters and an identification of a default camera mode.
The zoom parameter may default to 1 (that is, 1 time) in the absence of any special indication or setting.
In addition, when the background applications of the mobile phone did not include the camera application before S101, the camera mode identifier in request information 1 is the identifier of the photographing mode. When the background applications did include the camera application before S101, the camera mode identifier in request information 1 corresponds to the camera mode used before the camera application entered background operation. For example, if the camera application used the video recording mode before entering the background, the camera mode identifier in request information 1 is the identifier of the video recording mode; if it used the photographing mode, the identifier is that of the photographing mode. In the following embodiments, the description continues with the example in which request information 1 includes the identifier of the photographing mode.
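The selection of the camera mode identifier carried in request information 1 can be sketched as follows (a hypothetical Python illustration; the function and field names are assumptions, not the patent's actual implementation):

```python
DEFAULT_MODE = "photographing"  # default camera mode on a cold start

def build_request_info(background_mode=None, zoom=1.0):
    """Build "request information 1" sent from the camera application
    to the camera service.

    If the camera application was running in the background, reuse the
    camera mode it used before entering the background; otherwise fall
    back to the default photographing mode. The zoom parameter defaults
    to 1 time.
    """
    mode = background_mode if background_mode is not None else DEFAULT_MODE
    return {"camera_mode_id": mode, "zoom": zoom}

# Cold start: no camera application in the background.
request_1 = build_request_info()

# The camera application was backgrounded while in video recording mode.
request_2 = build_request_info(background_mode="video_recording")
```

With these assumptions, `request_1` carries the photographing mode identifier and `request_2` carries the video recording mode identifier, matching the two scenarios described above.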
S103, the camera service sends the photographing mode identification and the zooming parameter 1 to the interface module.
In some embodiments, the camera service sends the camera parameters from request information 1 to the interface module in response to the received request information 1. For example, if the request information 1 includes a shooting mode identifier and a zoom parameter 1, the shooting mode identifier and the zoom parameter 1 are sent to the interface module.
And S104, the interface module sends the photographing mode identifier and the zooming parameter 1 to the decision module.
And S105, the decision module determines a Binning image drawing mode 1 corresponding to the photographing mode according to the zooming parameter 1.
In some embodiments, the implementation principle of S105 may refer to A5, which is not described herein. The Binning drawing pattern 1 described above may also be referred to as a first drawing pattern.
S106, the decision module sends an identifier 1 to the Sensor node, and the identifier 1 corresponds to the Binning graph mode 1.
As an implementation manner, a correspondence table may be preconfigured in the mobile phone, used to indicate the correspondence among each image mode, its identifier, its configuration parameter packet, and its customized parameters. Both the Sensor node and the decision module can access this correspondence table. Illustratively, see Table 2 below:
TABLE 2
[Table 2 appears as an image in the original publication; it records, for each image mode, the corresponding identifier, configuration parameter packet, and customized parameter.]
Thus, after the decision module determines the required image mode (Binning image mode 1), it can send identifier 1 to the Sensor node. The Sensor node can then determine from Table 2 and identifier 1 that the camera sensor needs to be instructed to enable Binning image mode 1.
In addition, the content of the customization parameters corresponding to the graph modes in table 2 can refer to table 1, and is not repeated herein.
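The correspondence table can be sketched as a simple mapping (a hypothetical reconstruction: since Table 2 is only an image in the publication, its exact layout is assumed from the surrounding text):

```python
# identifier -> (image mode, configuration parameter packet, customized parameter)
CORRESPONDENCE_TABLE = {
    "identifier 1": ("Binning image mode 1", "configuration parameter packet 2",
                     "Binning customized parameter 1"),
    "identifier 2": ("Idcg image mode 1", "configuration parameter packet 2",
                     "Idcg customized parameter 1"),
    "identifier 3": ("Quadra image mode 1", "configuration parameter packet 2",
                     "Quadra customized parameter 1"),
}

def resolve(identifier):
    """Look up the image mode, packet, and customized parameter for an identifier,
    as both the decision module and the Sensor node are described as doing."""
    mode, packet, custom = CORRESPONDENCE_TABLE[identifier]
    return {"mode": mode, "packet": packet, "custom": custom}
```

For example, `resolve("identifier 1")` yields Binning image mode 1 together with the packet and customized parameter that realize it.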
And S107, the Sensor node determines a configuration parameter packet 2 corresponding to the identifier 1, wherein the configuration parameter packet 2 comprises public parameters and customized parameters corresponding to the photographing mode.
In some embodiments, a plurality of configuration parameter packets may be preconfigured in the memory of the mobile phone, each formed by combining the common parameters with the customized parameters corresponding to at least one camera mode. Illustratively, the customized parameters differ between configuration parameter packets. The image modes corresponding to the customized parameters in a configuration parameter packet are the image modes supported by that packet, also called the image modes corresponding to that packet. For example, configuration parameter packet 2 includes the common parameters and all the customized parameters corresponding to the photographing mode; that is, when the photographing mode supports the Binning, Idcg, and Quadra image modes, configuration parameter packet 2 may include the common parameters, Binning customized parameter 1, Idcg customized parameter 1, and Quadra customized parameter 1. As another example, configuration parameter packet 3 includes the common parameters, the customized parameters corresponding to the video recording mode, and the customized parameters corresponding to the 4k dynamic frame rate mode; that is, when both the video recording mode and the 4k dynamic frame rate mode support the Binning and Idcg image modes, configuration parameter packet 3 may include the common parameters, Binning customized parameter 2, Idcg customized parameter 2, Binning customized parameter 3, and Idcg customized parameter 3.
The configuration parameter packet 2 may be referred to as a first packet, and the configuration parameter packet 3 may be referred to as a second packet.
Therefore, the customized parameters corresponding to all the image modes do not need to be packed into one and the same configuration parameter packet. Compared with a complete configuration parameter packet in the related art (such as configuration parameter packet 1), the data volume of each configuration parameter packet is effectively controlled, while the customized parameters of every image mode still belong to some configuration parameter packet.
In addition, the above-mentioned configuration parameter package may be mirrored in the XML in the HAL layer, so that the configuration parameter package 2 and the configuration parameter package 3 may be included in the XML in the HAL layer. The Sensor node may read the above-mentioned configuration parameter packet 2 and configuration parameter packet 3 from XML.
In some embodiments, each configuration parameter packet may be pre-marked with its corresponding camera mode, so that a correspondence also exists between configuration parameter packets and identifiers. For example, configuration parameter packet 2 may be marked as corresponding to the photographing mode, and configuration parameter packet 3 as corresponding to the video recording mode and the 4k dynamic frame rate mode. After receiving identifier 1, the Sensor node may determine from the correspondence table (e.g., Table 2) and identifier 1 that configuration parameter packet 2 corresponds to identifier 1.
In other embodiments, the Sensor node may determine the Binning customized parameter 1 corresponding to the identifier 1 according to the corresponding relationship table (e.g., table 2) and the identifier 1, and then find the configuration parameter packet 2 including the Binning customized parameter 1 from all the configuration parameter packets, thereby determining that the configuration parameter packet 2 matches the identifier 1.
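The alternative lookup in the paragraph above can be sketched as follows (illustrative names only; `PACKETS` stands in for the XML-backed configuration parameter packets):

```python
# Contents of each configuration parameter packet (names assumed from the text).
PACKETS = {
    "configuration parameter packet 2": {
        "common parameters", "Binning customized parameter 1",
        "Idcg customized parameter 1", "Quadra customized parameter 1"},
    "configuration parameter packet 3": {
        "common parameters", "Binning customized parameter 2",
        "Idcg customized parameter 2", "Binning customized parameter 3",
        "Idcg customized parameter 3"},
}

# identifier -> customized parameter, as recorded in the correspondence table.
ID_TO_CUSTOM = {"identifier 1": "Binning customized parameter 1"}

def packet_for_identifier(identifier):
    """First resolve the identifier to its customized parameter, then scan
    all configuration parameter packets for the one containing it."""
    custom = ID_TO_CUSTOM[identifier]
    for name, contents in PACKETS.items():
        if custom in contents:
            return name
    raise KeyError(f"no packet contains {custom}")
```

Under these assumptions, identifier 1 resolves to Binning customized parameter 1, which is found inside configuration parameter packet 2.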
S108, the Sensor node reads the configuration parameter packet 2 from the XML.
And S109, the Sensor node sends a configuration parameter packet 2 to the camera driver.
S110, the camera driver sends a configuration parameter packet 2 to the camera sensor.
In some embodiments, the Sensor node sends the configuration parameter packet 2 to the camera Sensor through the camera driver and writes it into the memory space of the camera Sensor, so that the camera Sensor can load the data in the configuration parameter packet 2.
S111, the camera sensor performs configuration according to the common parameters.
In some embodiments, after the camera sensor receives and stores the configuration parameter package 2, the common parameters in the configuration parameter package 2 may be loaded to implement the initialized configuration.
S112, the camera sensor sends a completion notification 1 to the camera driver.
S113, the camera driver transmits a completion notification 1 to the Sensor node.
In addition, after the camera sensor receives configuration parameter packet 2 and configures the common parameters, it can send completion notification 1 to the Sensor node through the camera driver. Completion notification 1 prompts the Sensor node that the camera sensor has stored configuration parameter packet 2 and configured the common parameters.
S114, the Sensor node sets the value of the flag bit to 1, and marks configuration parameter packet 2 as issued.
In some embodiments, the mobile phone sets a global variable accessible by the Sensor node in the designated storage area, such as a flag bit, which is also called a first flag bit, and the value of the flag bit may be set to a first value (e.g., 0) or to a second value (e.g., 1), where the first value and the second value are different and refer to different meanings respectively.
After the Sensor node sends a configuration parameter packet (e.g., configuration parameter packet 2) to the camera sensor, it may set the flag bit to 1, indicating that the complete customized parameters of the enabled image mode (e.g., Binning customized parameter 1 for Binning image mode 1) have not yet been issued.
It should be noted that, owing to a limitation of the system chip, after the camera sensor stores the configuration parameter packet, the first time the Sensor node instructs the camera sensor to enable a given image mode (e.g., Binning image mode 1), the Sensor node must write the complete customized parameters of that mode (e.g., the complete Binning customized parameter 1) into the camera sensor. That is, when the flag bit is set to 1, the Sensor node recognizes that configuration parameter packet 2 has been issued but the complete Binning customized parameter 1 has not, so the flow proceeds to S115.
In addition, configuration parameter packet 2 may also be marked as issued, indicating that the configuration parameter packet actually stored and loaded in the camera sensor is configuration parameter packet 2.
Exemplarily, after the configuration parameter packet 2 is issued to the camera sensor, the corresponding relationship table is updated from table 2 to table 3:
TABLE 3
[Table 3 appears as an image in the original publication; it is Table 2 updated to additionally record that configuration parameter packet 2 is in the issued state.]
In other examples, after sending configuration parameter packet 2 to the camera Sensor, the Sensor node may write an identification, e.g., name, of configuration parameter packet 2 to the specified address. Thus, the Sensor node can determine the configuration parameter packet that has been sent to the camera Sensor by querying the specified address. Of course, after that, if the Sensor node sends another configuration parameter packet (e.g., configuration parameter packet 3) to the camera Sensor, the Sensor node may modify the identifier of the configuration parameter packet 2 in the designated address to the identifier of the configuration parameter packet 3.
S115, when the value of the flag bit is 1, the Sensor node sends Binning customized parameter 1 to the camera driver, where Binning customized parameter 1 is the customized parameter corresponding to the photographing mode and is used to realize Binning image mode 1.
The Binning customization parameter 1 is a customization parameter corresponding to the Binning drawing pattern 1, that is, a configuration parameter corresponding to the first drawing pattern, and specific contents thereof may be referred to table 1 and are not described herein again.
In some embodiments, the Sensor node may further determine that the image drawing mode adopted by the camera Sensor to be indicated is a Binning image drawing mode 1 according to the identifier 1 and the corresponding relationship table (e.g., table 3).
In addition, the value of the flag bit is set by the Sensor node, and the Sensor node can also query it at any time. The Sensor node can thus query whether the value of the flag bit is 1. When it is 1, the Sensor node needs to send the complete Binning customized parameter 1 to the camera sensor through the camera driver. That is, the Sensor node sends Binning customized parameter 1 to the camera driver, instructing the camera driver to forward it to the camera sensor.
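The flag-bit check can be sketched minimally as follows (an illustration only; the values 0 and 1 follow the text, the function and payload names are assumptions):

```python
def payload_for_switch(flag_bit, full_custom_params, identifier):
    """Decide what the Sensor node sends toward the camera sensor.

    flag_bit == 1: the configuration parameter packet was only just issued,
    so the complete customized parameters of the target image mode must be
    written once (as in S115). flag_bit == 0: the parameters already reside
    in the camera sensor, so the small identifier suffices (as in S206).
    """
    if flag_bit == 1:
        return ("complete parameters", full_custom_params)
    return ("identifier only", identifier)
```

The point of the design is the second branch: once the packet is resident, only a tiny identifier crosses the driver interface, which is what makes later mode switches fast.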
S116, the camera driver sends a Binning customization parameter 1 to the camera sensor.
S117, the camera sensor performs configuration according to Binning customized parameter 1.
In some embodiments, after the camera sensor receives and stores the Binning customization parameter 1, the Binning customization parameter 1 can be configured directly.
S118, the camera sensor sends a completion notification 2 to the camera driver.
S119, the camera driver sends a completion notification 2 to the Sensor node.
In some examples, after the camera sensor receives and configures Binning customized parameter 1, it sends completion notification 2 to the Sensor node through the camera driver. Completion notification 2 indicates that the camera sensor has finished configuring the customized parameters.
S120, the Sensor node changes the value of the flag bit to 0.
In some embodiments, the Sensor node changes the value of the flag bit to the first value; "0" above is only one example of the first value. When the value of the flag bit is 0, a subsequent switch to another image mode supported by configuration parameter packet 2 does not require the complete customized parameters to be issued again.
S121, the Sensor node sends a start-stream instruction to the camera driver.
In some embodiments, after the value of the flag bit is set to 0, it may be determined that a start-stream instruction needs to be sent to the camera driver.
S122, the camera driver sends the start-stream instruction to the camera sensor.
S123, the camera sensor outputs images in Binning image mode 1.
In some embodiments, after the camera driver receives the start-stream instruction, it forwards the instruction to the camera sensor, instructing the camera sensor to acquire raw image data in Binning image mode 1.
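The configuration sequence from S109 through S123 can be condensed into the following sketch (hypothetical names throughout; the real exchange runs through the camera driver, which is elided here):

```python
def configure_and_start(sensor, packet, custom_name):
    """Condensed S109-S123: issue the packet, set the flag, write the full
    customized parameters once, clear the flag, then start the stream."""
    log = []
    sensor["memory"] = dict(packet)      # S109-S111: packet issued and loaded
    flag_bit = 1                         # S114: full parameters not yet issued
    log.append("packet issued")
    if flag_bit == 1:                    # S115-S117: write complete parameters
        sensor["active"] = sensor["memory"][custom_name]
        flag_bit = 0                     # S120: later switches need only identifiers
        log.append("customized parameters written")
    sensor["streaming"] = True           # S121-S123: start-stream instruction
    log.append("stream started")
    return flag_bit, log

sensor = {}
flag_bit, log = configure_and_start(
    sensor,
    {"Binning customized parameter 1": "register values"},
    "Binning customized parameter 1",
)
```

After this sequence the flag bit is 0 and the sensor is streaming, which is exactly the precondition the later fast-switch steps (S201 onward) rely on.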
In some embodiments, as shown in fig. 7, the method further comprises:
S124, the camera sensor acquires raw image data in response to the start-stream instruction.
The image mode corresponding to the raw image data is Binning image mode 1.
S125, the camera sensor sends raw image data to the camera driver.
It can be understood that S124 and S125 above constitute the process of outputting images in Binning image mode 1 as indicated in S123.
S126, the camera driver sends the raw image data to the image processing module.
In some embodiments, the camera sensor passes the captured raw image data to the image processing module through the camera driver for processing by the image processing module.
And S127, processing the original image data by the image processing module to obtain an image frame.
In some embodiments, the principle of processing the raw image data by the image processing module may refer to related technologies, which are not described herein again.
S128, the image processing module transmits the image frame to the interface module.
S129, the interface module delivers the image frame to the camera service.
S130, the camera service delivers the image frames to the camera application.
In some embodiments, the image processing module passes the processed image frames to the camera application through the interface module and the camera service.
S131, the camera application displays the image frame.
In some embodiments, after the camera application receives the image frame, it may instruct the image frame to be displayed in the application interface of the photographing mode; as shown in fig. 5, the mobile phone may display interface 504, also referred to as a first interface. Illustratively, after the camera application receives an image frame (e.g., referred to as a first image frame), it may schedule a view system in the application framework layer, through which display of the image frame is achieved.
In some embodiments, after the mobile phone displays interface 504, even if neither an operation indicating a camera mode switch nor a zoom operation is received, the mobile phone may identify from the ambient lighting parameters whether it has entered an HDR scene. An HDR scene is a shooting scene in which HDR technology needs to be enabled, for example one whose shooting environment contains both a high-brightness region and a low-brightness region. After determining that it has entered an HDR scene, the mobile phone may continue by switching the image mode of the camera sensor.
For example, as shown in fig. 8, when the lens of the mobile phone is turned from indoors to outdoors, the mobile phone may switch from displaying interface 504 to displaying interface 801. The lens view then contains both a high-brightness region (sky) and a low-brightness region (ground). In this scenario, the mobile phone may determine from the perceived ambient lighting parameter that it is currently in an HDR scene, and may instruct the camera sensor to switch to Idcg image mode 1, that is, the Idcg image mode corresponding to the photographing mode. Of course, instructing the camera sensor to switch to Idcg image mode 1 requires the camera parameters to be configured again.
In some embodiments, as shown in fig. 9, the camera parameter configuration method further includes:
S201, the decision module obtains ambient illumination parameter 1.
In some embodiments, the decision module may acquire the real-time ambient illumination parameters through a camera sensor, and the specific implementation principle may refer to related technologies, which are not described herein again.
S202, the decision module determines Idcg image mode 1 corresponding to the photographing mode according to zoom parameter 1 and ambient illumination parameter 1.
In some embodiments, the camera application does not pass new zoom parameters and camera mode identification to the decision module when the handset does not receive an operation indicating a switch of camera mode and an operation indicating a modification of zoom parameters. In this way, the decision module can determine the required mapping mode based on the last obtained zoom parameter and camera mode identification.
For example, the zoom parameter and camera mode identifier last obtained by the decision module are zoom parameter 1 and the identifier of the photographing mode. The decision module can then evaluate, from the image modes corresponding to the photographing mode, the image mode suited to the current shooting scene according to zoom parameter 1 and ambient illumination parameter 1. For example, when ambient illumination parameter 1 indicates that the mobile phone is in an HDR scene, the decision module may determine to adopt Idcg image mode 1 corresponding to the photographing mode.
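The decision module's choice among the photographing mode's image modes can be sketched as follows (the threshold of 2 for the "first multiple" is taken from the example later in the text; the function name and boolean input are illustrative assumptions):

```python
FIRST_MULTIPLE = 2  # example value of the "first multiple" given in the text

def choose_image_mode(zoom_parameter, in_hdr_scene):
    """Pick among the image modes corresponding to the photographing mode,
    based on the last-known zoom parameter and the ambient illumination."""
    if zoom_parameter > FIRST_MULTIPLE:
        return "Quadra image mode 1"   # high zoom: keep the enlarged preview sharp
    if in_hdr_scene:
        return "Idcg image mode 1"     # HDR scene detected from ambient lighting
    return "Binning image mode 1"      # default image mode for photographing
```

A zoom of 1 time in an HDR scene thus yields Idcg image mode 1, while a zoom of 3 times yields Quadra image mode 1, mirroring the two switching scenarios the embodiments walk through.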
S203, the decision module sends identifier 2 to the Sensor node, where identifier 2 corresponds to Idcg image mode 1.
And S204, the Sensor node determines that the identifier 2 corresponds to the issued configuration parameter packet 2.
In some embodiments, the Sensor node may determine that identifier 2 corresponds to configuration parameter packet 2 through a correspondence table (e.g., table 3). Meanwhile, the configuration parameter packet 2 is recorded in table 3 as a issued state, so that the Sensor node can determine that the configuration parameter packet 2 corresponding to the identifier 2 has been sent to the camera Sensor. In this scenario, before the Sensor node indicates that the map-out mode corresponding to identifier 2 is enabled (Idcg map-out mode 1), the configuration parameter packet does not need to be re-issued to the camera Sensor.
In other embodiments, after determining that the identifier 2 corresponds to the configuration parameter packet 2, the Sensor node may further query a specified address, and when the identifier of the configuration parameter packet 2 is written in the specified address, determine that the configuration parameter packet 2 has been sent to the camera Sensor.
S205, the Sensor node determines that the value of the flag bit is 0.
In some embodiments, after S204, the Sensor node also queries the value of the flag bit. Upon determining that the value of the flag bit is 0, the flow may proceed to S206.
S206, the Sensor node sends an identifier of the Idcg customization parameter 1 to the camera driver, wherein the Idcg customization parameter 1 is a customization parameter corresponding to the photographing mode and is used for realizing the Idcg image mode 1.
In other embodiments, since Idcg customized parameter 1 is associated with Idcg image mode 1, the Sensor node may instead send an identifier of Idcg image mode 1 to the camera sensor through the camera driver, and the camera sensor may determine from that identifier that the Sensor node indicates enabling Idcg image mode 1. The following embodiments mainly take sending the identifier of Idcg customized parameter 1 as the example. In addition, Idcg image mode 1 is also referred to as a second image mode.
And S207, the camera driver instructs the camera sensor to access the register address 1 storing the Idcg custom parameter 1 according to the identification of the Idcg custom parameter 1.
The register address 1 may be a storage address of the Idcg customization parameter 1 in the camera sensor, which is also referred to as a first address. This register address 1 can be used to look up the Idcg custom parameter 1 already stored in the camera sensor. In addition, the Idcg custom parameter 1 stored in the camera sensor is from the configuration parameter package 2. After the camera sensor stores the configuration parameter packet 2, the storage addresses corresponding to the common parameters and the plurality of customized parameters in the configuration parameter packet 2 may be recorded.
In some embodiments, the Sensor node sends the identifier of Idcg customized parameter 1 to the camera sensor through the camera driver. It can be understood that the identifier may refer to Idcg customized parameter 1, for example, by its name. From this identifier, the camera sensor can recognize that the Sensor node requires Idcg customized parameter 1 to be loaded. In this way, the camera sensor may retrieve register address 1 from the recorded storage addresses in response to the identifier, then access register address 1, load the corresponding Idcg customized parameter 1, and perform the configuration, that is, the flow proceeds to S208.
In other embodiments, the camera Sensor receives the identity of Idcg map mode 1 and may recognize that the Sensor node indicates that Idcg map mode 1 is enabled. In this way, the camera sensor can access the register address 1 and load the Idcg custom parameter 1 indicated by the register address 1, thereby implementing configuration of the custom parameter and switching the image drawing mode of the camera sensor to Idcg image drawing mode 1.
S208, the camera sensor switches to load the Idcg custom parameter 1 indicated by the register address 1.
In some embodiments, after accessing register address 1, the camera sensor automatically loads Idcg custom parameter 1 indicated by register address 1 so that the camera sensor can automatically switch to Idcg map mode 1 without restarting the stream.
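The in-place switch can be sketched with a toy model (illustrative only; the "register addresses" are simulated with a dictionary, and nothing here reflects real sensor firmware):

```python
class CameraSensorModel:
    """Toy model of the camera sensor switching among stored customized parameters."""

    def __init__(self, configuration_parameter_packet):
        # S110-S111: store the packet and record where each parameter lives.
        self.memory = dict(configuration_parameter_packet)
        self.active_custom = None
        self.stream_restarts = 0  # stays 0: no restart is needed to switch

    def switch(self, custom_name):
        # S207-S208: dereference the recorded address and load in place.
        self.active_custom = self.memory[custom_name]
        return self.active_custom

sensor = CameraSensorModel({
    "Idcg customized parameter 1": "idcg register values",
    "Quadra customized parameter 1": "quadra register values",
})
sensor.switch("Idcg customized parameter 1")
```

The key property the model illustrates is that `stream_restarts` never increments: because every customized parameter from the packet is already resident, switching is just a local load.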
S209, the camera sensor outputs images in Idcg image mode 1.
In some embodiments, the implementation of S209 may refer to the foregoing S123 and is not repeated here. Then, referring to S124-S130, the camera application may obtain and display a second image frame, that is, an image obtained after the camera sensor has adopted Idcg image mode 1.
The data volume of the identifier is clearly smaller than that of the complete Idcg customized parameter 1, so sending the identifier of Idcg customized parameter 1 is faster than sending the complete parameter. That is, instructing the camera sensor to enable Idcg image mode 1 by sending the identifier of Idcg customized parameter 1 yields a faster switch than the related art.
In some embodiments, as shown in fig. 10, while displaying interface 504, the mobile phone may receive a zoom operation from the user, for example an operation instructing the camera's zoom parameter to be modified from 1 time to 3 times. In this way, when no camera mode switching operation is received, the mobile phone may determine, in response to the zoom operation, that the zoom parameter of the camera is greater than the first multiple, and may then configure the camera parameters again so that the camera sensor switches to the Quadra image mode.
As an implementation manner, as shown in fig. 11, the camera parameter configuration method may further include:
s301, the camera application determines that a zoom operation is received, and acquires zoom parameter 2.
In some embodiments, as shown in FIG. 10, a zoom bar 1001 is included in the interface 504. A sliding window 1002 is displayed on the zoom bar 1001. It can be understood that different position points in the zoom bar 1001 correspond to different zoom parameters. The zoom parameter indicated by the position point where the sliding window 1002 overlaps the zoom bar 1001 is the currently selected zoom parameter. In addition, the sliding window 1002 may also display the numerical value of the currently selected zoom parameter.
In some embodiments, when the mobile phone receives a sliding operation of the user on the zoom bar 1001, it may be determined that a zoom operation is received. The sliding operation may instruct the sliding window 1002 to adjust the position point overlapping with the zoom bar 1001, thereby instructing to modify the selected zoom parameter, and after the sliding operation of the user is finished, the mobile phone may obtain the zoom parameter 2. For example, as shown in fig. 10, the user instructs to modify the selected zoom parameter from 1 time to 3 times, and thus, 3 times is determined as the zoom parameter 2.
S302, the camera application sends zoom parameter 2 to the camera service.
S303, the camera service sends zoom parameter 2 to the interface module.
S304, the interface module sends the zooming parameter 2 to the decision module.
In some embodiments, after the camera application obtains the zoom parameter 2, the zoom parameter 2 may be passed through to the decision module by the camera service, the interface module, or the like.
S305, the decision module determines to adopt a Quadra image mode 1 corresponding to the photographing mode according to the zooming parameter 2.
Illustratively, when the camera mode is unchanged (for example, still the photographing mode) and the value of zoom parameter 2 is greater than the first multiple (e.g., 2 times), it is determined that Quadra image mode 1 is adopted. The first multiple may be preconfigured.
S306, the decision module sends an identifier 3 to the Sensor node, wherein the identifier 3 corresponds to the Quadra graph mode 1.
S307, the Sensor node determines that identifier 3 corresponds to the issued configuration parameter packet 2, and determines that the value of the flag bit is 0.
S308, the Sensor node sends an identifier of a Quadra customization parameter 1 to the camera driver, wherein the Quadra customization parameter 1 is a customization parameter corresponding to the photographing mode and is used for realizing a Quadra image mode 1.
In other embodiments, the Sensor node may alternatively send an identifier of Quadra image mode 1 to the camera driver.
S309, the camera driver instructs the camera sensor to access register address 2, where Quadra customized parameter 1 is stored, according to the identifier of Quadra customized parameter 1.
In other embodiments, the camera driver may instead instruct the camera sensor to access register address 2 according to the identifier of Quadra image mode 1.
S310, the camera sensor switches to load Quadra customized parameter 1 indicated by register address 2.
S311, the camera sensor begins outputting images in Quadra image mode 1.
In some embodiments, the implementation principle of S306 to S311 can refer to S203 to S209, which are not described herein again.
Of course, after the camera sensor switches to load Quadra customized parameter 1, the stream does not need to be restarted. In addition, after Quadra customized parameter 1 is loaded, the camera sensor outputs raw image data to the image processing module in Quadra image mode 1. Once Quadra image mode 1 is enabled, the image frames output by the image processing module contain more pixels than before, so even when the shooting preview picture is enlarged, the image frame displayed by the mobile phone remains clear. That is, after switching to Quadra image mode 1, the mobile phone may switch from displaying interface 504 to displaying interface 1003.
In summary, when the camera mode is unchanged, the Sensor node can switch between different image modes by sending only the identifier of the customized parameters or of the image mode. Because the identifier carries little data, it is transmitted faster; the customized parameters are not written into the camera Sensor repeatedly, and the switching efficiency of the image modes is improved.
In addition, in some special scenarios, for example, when the camera mode is unchanged but the value of the flag bit is 1 (an exception has occurred), the Sensor node may instruct the camera sensor to enable the corresponding image mode by issuing the complete customized parameters, modify the value of the flag bit to 0, and restart streaming after the camera Sensor has stored the customized parameters.
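The switching logic summarized above — send only a small identifier when the required configuration parameter packet is already in the sensor and the flag bit is 0, and fall back to issuing the complete customized parameters otherwise — can be sketched as follows. All class and method names (SensorNode, send_mode_identifier, and so on) are illustrative assumptions, not the actual implementation.

```python
class SensorNode:
    """Minimal sketch of the Sensor node's image-mode switching decision."""

    def __init__(self):
        self.flag_bit = 1          # 1: complete customized parameters must be sent next
        self.issued_packet = None  # configuration parameter packet currently in the sensor

    def switch_image_mode(self, mode_id, packet_of, driver):
        packet = packet_of[mode_id]  # correspondence-table lookup (e.g. Table 3)
        if packet == self.issued_packet and self.flag_bit == 0:
            # Normal case: the packet is already issued and the flag bit is 0,
            # so sending only the identifier avoids rewriting the customized
            # parameters into the sensor and speeds up the switch.
            driver.send_mode_identifier(mode_id)
        else:
            # Exception (flag bit is 1) or packet not yet issued: fall back to
            # sending the complete customized parameters, clear the flag bit,
            # then restart streaming.
            driver.send_full_custom_params(mode_id)
            self.flag_bit = 0
            driver.restart_stream()
```

The fast path corresponds to S306 to S311, and the fallback path to the special scenario described above.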
In other embodiments, as shown in fig. 12, the interface 504 displayed on the mobile phone further includes a control for instructing a switch to the video recording mode, for example, a control 1201. When the mobile phone receives an operation of the user on the control 1201, the mobile phone may display a shooting preview interface corresponding to the video recording mode, that is, the interface 1202, also referred to as a second interface. Camera parameters also need to be configured when the mobile phone switches from displaying interface 504 to displaying interface 1202.
Exemplarily, as shown in fig. 13, the camera parameter configuration method may further include:
S401, the camera application determines that an operation indicating that the video recording mode is enabled is received.
In some embodiments, as shown in fig. 12, while the mobile phone displays the interface 504, if an operation of the user on the control 1201, such as a click operation, is received, the camera application may determine that an operation indicating that the video recording mode is enabled has been received. The operation of enabling the video recording mode and displaying the video recording interface may be referred to as a second operation.
S402, the camera application sends the video recording mode identification and zoom parameter 1 to the camera service.
S403, the camera service sends the identifier of the video recording mode and the zoom parameter 1 to the interface module.
S404, the interface module sends the identifier of the video recording mode and the zoom parameter 1 to the decision module.
In some embodiments, the camera application, in response to an operation indicating that the video recording mode is enabled, may pass the identification of the video recording mode and a default zoom parameter (e.g., zoom parameter 1) corresponding to the video recording mode through the camera service, interface module, to the decision module.
S405, the decision module determines a Binning image mode 2 corresponding to the video recording mode according to the zoom parameter 1 and the current ambient illumination parameter.
In some embodiments, when the decision module receives the video recording mode identifier, an image mode to be adopted may be selected from image modes corresponding to the video recording mode. For convenience of description, a Binning image mode corresponding to the video recording mode is simply referred to as a Binning image mode 2, an Idcg image mode corresponding to the video recording mode is simply referred to as an Idcg image mode 2, and a Quadra image mode corresponding to the video recording mode is simply referred to as a Quadra image mode 2.
Illustratively, when the current ambient lighting parameter indicates that an HDR scene has not been entered, and zoom parameter 1 is less than 2 times, it may be determined to adopt Binning image mode 2. Also illustratively, Idcg image mode 2 may be adopted when the current ambient lighting parameter indicates that an HDR scene has been entered and zoom parameter 1 is less than 2 times. As still another example, when zoom parameter 1 is not less than 2 times, Quadra image mode 2 may be adopted.
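The three decision rules above can be sketched as a single function. The 2x threshold and the mode names are taken from the text; the function itself and its signature are assumptions.

```python
def pick_image_mode(in_hdr_scene: bool, zoom: float) -> str:
    """Choose the image mode for the video recording mode (illustrative)."""
    if zoom >= 2.0:
        return "Quadra image mode 2"   # zoom not less than 2x
    if in_hdr_scene:
        return "Idcg image mode 2"     # HDR scene, zoom below 2x
    return "Binning image mode 2"      # non-HDR scene, zoom below 2x
```

A decision module could apply the same shape of rule per camera mode, swapping in that mode's Binning/Idcg/Quadra variants.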
S406, the decision module sends identifier 4 to the Sensor node, wherein identifier 4 corresponds to Binning image mode 2.
In some embodiments, the implementation principle of S406 is the same as that of S106, and is not described herein again. In addition, Binning image mode 2 is also referred to as the third image mode.
S407, the Sensor node determines a configuration parameter packet 3 corresponding to identifier 4, wherein the configuration parameter packet 3 comprises the common parameters and the customized parameters corresponding to the video recording mode, and the configuration parameter packet 3 has not been sent.
In some embodiments, the Sensor node may determine that identifier 4 corresponds to the configuration parameter packet 3 through a correspondence table (e.g., table 3). However, as shown in table 3, it is configuration parameter packet 2, not configuration parameter packet 3, that is currently in the issued state, so the mobile phone is in the situation where the configuration parameter packet 3 has not been sent.
When the mobile phone has not sent the configuration parameter packet 3, the Sensor node needs to send the configuration parameter packet 3 to the camera Sensor before instructing the camera Sensor to enable the image mode (Binning image mode 2) corresponding to identifier 4, and thus the flow proceeds to S408.
S408, the Sensor node reads the configuration parameter packet 3 from the XML.
And S409, the Sensor node sends a configuration parameter packet 3 to the camera driver.
S410, the camera driver sends the configuration parameter packet 3 to the camera sensor.
In some embodiments, the Sensor node sends configuration parameter packet 3 to the camera Sensor through the camera driver.
S411, the camera sensor is configured according to the common parameters.
In some embodiments, after receiving the configuration parameter packet 3, the camera sensor performs configuration according to the common parameters in the configuration parameter packet 3.
S412, the camera sensor sends a completion notification 3 to the camera driver.
S413, the camera driver sends the completion notification 3 to the Sensor node.
Likewise, after receiving and configuring the common parameters in the configuration parameter packet 3, the camera Sensor sends a completion notification 3 to the Sensor node through the camera driver, where the completion notification 3 is used to prompt the Sensor node that the current camera Sensor has completed receiving and configuring the configuration parameter packet 3.
S414, the Sensor node resets the value of the flag bit to 1 and marks the configuration parameter packet 3 as being in the issued state.
In some embodiments, after the Sensor node sends any configuration parameter packet to the camera Sensor through the camera driver, the value of the flag bit needs to be set to 1 again, so as to prompt that the Sensor node must next issue the complete customized parameters of Binning image mode 2, that is, Binning customized parameter 2, to the camera Sensor.
In addition, after marking the configuration parameter packet 3 as the issued state, the issued state of the configuration parameter packet 2 may be cancelled. For example, the correspondence table may be updated from table 3 to table 4.
TABLE 4
(The contents of Table 4, an updated correspondence table of identifiers, configuration parameter packets, and issued states, are rendered as images in the original publication and are not reproduced here.)
In other embodiments, after the configuration parameter packet 3 is sent to the camera sensor, the configuration parameter packet 3 may instead be marked as issued by writing the identifier of the configuration parameter packet 3 into a specified address. That is, the embodiment of the present application does not specifically limit the manner of marking the issued state of a configuration parameter packet.
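Steps S408 to S414 — reading the packet from XML, forwarding it to the sensor through the driver, setting the flag bit back to 1, and replacing the previously issued packet — can be sketched as below. All helper names (issue_packet, xml_store.read, send_packet) are assumptions; the real parameter format is not given in the text.

```python
def issue_packet(node, packet_id, xml_store, driver):
    """Issue a configuration parameter packet to the camera sensor (sketch)."""
    packet = xml_store.read(packet_id)   # S408: read the packet from XML
    driver.send_packet(packet)           # S409-S410: forward it to the sensor
    # S414: after any packet is issued, the complete customized parameters must
    # be sent next, so the flag bit is set back to 1 ...
    node.flag_bit = 1
    # ... and the newly issued packet replaces the previous one, which cancels
    # the old packet's issued state (the Table 3 -> Table 4 update).
    node.issued_packet = packet_id
```

Writing the packet identifier into `node.issued_packet` plays the role of the specified-address marking variant mentioned above.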
S415, when the value of the flag bit is 1, the Sensor node sends a Binning customization parameter 2 to the camera driver.
In some embodiments, the above-mentioned Binning customized parameter 2 may also be referred to as a configuration parameter corresponding to the third drawing mode.
S416, the camera driver sends Binning customization parameters 2 to the camera sensor.
In some embodiments, the Sensor node sends the Binning custom parameter 2 to the camera Sensor through the camera drive. After the camera sensor has stored the Binning customized parameter 2, the flow advances to S417.
S417, the camera sensor is configured according to Binning customized parameter 2.
S418, the camera sensor sends a completion notification 4 to the camera driver.
S419, the camera driver transmits a completion notification 4 to the Sensor node.
In some examples, after the camera Sensor receives and configures the Binning custom parameter 2, a completion notification 4 may be sent to the Sensor node by the camera driver. The completion notification 4 is used to indicate that the camera sensor has configured the Binning customization parameters 2.
S420, the Sensor node changes the value of the flag bit to 0.
After determining that the camera Sensor has configured Binning customized parameter 2, the Sensor node may set the value of the flag bit to 0. This avoids the situation where, although no configuration parameter packet has been re-issued, the Sensor node would still instruct the camera Sensor to switch the image mode by sending the complete customized parameters, which would reduce the switching efficiency of subsequent image modes.
S421, the Sensor node determines that the streaming needs to be started again.
In some embodiments, when the Sensor node detects that the camera modes corresponding to the image modes before and after switching are different, it may determine that streaming needs to be started again. For example, in the above embodiment, the camera Sensor switches from Binning image mode 1 to Binning image mode 2; the photographing mode corresponding to Binning image mode 1 differs from the video recording mode corresponding to Binning image mode 2, so the Sensor node can determine that streaming needs to be started again.
In other embodiments, if the Sensor node records that the value of the flag bit changed from 1 to 0 during the switching process, it may also determine that streaming needs to be started again.
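The two restart conditions described above can be expressed as a small predicate (function and parameter names assumed):

```python
def needs_stream_restart(old_camera_mode: str,
                         new_camera_mode: str,
                         flag_went_1_to_0: bool) -> bool:
    """Streaming must be restarted if the camera mode changed across the
    image-mode switch, or if the flag bit transitioned from 1 to 0 (i.e. the
    complete customized parameters were just issued)."""
    return old_camera_mode != new_camera_mode or flag_went_1_to_0
```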
S422, the Sensor node sends a streaming starting instruction to the camera driver.
S423, the camera driver sends the streaming starting instruction to the camera sensor.
In some embodiments, the Sensor node sends a streaming starting instruction to the camera Sensor through the camera driver, instructing the camera Sensor to restart collecting raw image data; the image mode corresponding to the collected raw image data is Binning image mode 2.
S424, the camera sensor begins to output images in Binning image mode 2.
In some embodiments, the implementation principle of S424 may refer to the foregoing S123, and is not described herein again. Thus, referring to S124-S130 above, the camera application may obtain and display a third image frame, e.g., in the second interface. In addition, the third image frame is an image obtained after the camera sensor adopts the third image drawing mode.
In other embodiments, while the interface 1202 is displayed, as shown in FIG. 14, the cell phone may receive a user operation, such as a click operation, on a settings control 1401. In response to the operation, the cell phone may display the interface 1402. The interface 1402 may be an interface for instructing a user to manually configure parameters. The interface 1402 includes a configuration bar 1403 and a configuration bar 1404. The configuration bar 1403 is used to configure the resolution of the captured video, and the configuration bar 1404 is used to configure the video frame rate. The mobile phone can adjust the resolution to 4k according to the operation of the user on the configuration bar 1403. In addition, the handset may receive a user operation on the configuration bar 1404, setting the video frame rate to "auto". After the video resolution is configured to be 4k and the video frame rate is set to automatic, the handset determines that the user instructs the camera application to switch to the 4k dynamic frame rate mode, so the handset can display an interface for recording 4k dynamic frame rate video, referred to as a third interface. Similarly, the mobile phone needs to configure the camera parameters before displaying the third interface.
Exemplarily, as shown in fig. 15, the above camera parameter configuration method may further include:
S501, the camera application receives an operation indicating that the 4k dynamic frame rate mode is enabled.
In some embodiments, the operation indicating that the 4k dynamic frame rate mode is enabled may be referred to as a third operation.
S502, the camera application sends a 4k dynamic frame rate mode identification and a zoom parameter 1 to the camera service.
S503, the camera service sends the 4k dynamic frame rate mode identification and the zoom parameter 1 to the interface module.
S504, the interface module sends the 4k dynamic frame rate mode identifier and the zooming parameter 1 to the decision module.
S505, the decision module determines a Binning image mode 3 corresponding to the 4k dynamic frame rate mode according to the zoom parameter 1 and the current ambient illumination parameter.
S506, the decision module sends identifier 5 to the Sensor node, wherein identifier 5 corresponds to Binning image mode 3.
In some embodiments, the implementation principles of S502 to S506 may refer to S402 to S406, which are not described herein again. In addition, Binning image mode 3 may also be referred to as the fourth image mode.
S507, the Sensor node determines that identifier 5 corresponds to the issued configuration parameter packet 3, and the configuration parameter packet 3 further comprises the customized parameter corresponding to the 4k dynamic frame rate mode.
In some embodiments, the Sensor node may determine, through the correspondence table, that identifier 5 corresponds to a 4k dynamic frame rate mode, which also corresponds to configuration parameter packet 3. Meanwhile, the configuration parameter packet 3 is in a delivered state, that is, the configuration parameter packet 3 is indicated to be currently configured in the camera sensor, and the configuration parameter packet 3 does not need to be repeatedly delivered to the camera sensor through the camera drive.
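The check in S507 — identifier 5 maps to a packet that is already marked as issued, so the packet itself need not be re-sent — can be sketched as a lookup against the correspondence table. The function name and data shapes are assumptions.

```python
def packet_needs_sending(mode_id, packet_of, issued_packets):
    """Return True only if the packet backing this image mode has not yet
    been issued to the camera sensor (sketch of the S507 check)."""
    packet_id = packet_of[mode_id]           # correspondence-table lookup
    return packet_id not in issued_packets   # already issued -> skip re-sending
```

When this returns False, the Sensor node can go straight to querying the flag bit (S508) instead of repeating the packet transfer of S408 to S414.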
In some embodiments, after determining that the identifier 5 corresponds to the issued configuration parameter packet 3, the Sensor node further needs to query the value of the flag bit, so that the process may enter S508.
S508, the Sensor node determines that the value of the flag bit is 0.
When the value of the flag bit is 0, the Sensor node determines that the camera Sensor can be instructed to switch to Binning image mode 3 by sending the identifier of Binning customized parameter 3.
S509, the Sensor node sends an identifier of Binning customized parameter 3 to the camera driver, wherein Binning customized parameter 3 is the customized parameter corresponding to the 4k dynamic frame rate mode and is used for implementing Binning image mode 3.
In other embodiments, the Sensor node may instead send an identifier of Binning image mode 3 to the camera driver, that is, send an identifier of the fourth image mode.
S510, the camera driver instructs the camera sensor to access register address 3, which stores Binning customized parameter 3, according to the identifier of Binning customized parameter 3.
In some embodiments, register address 3 is the address at which Binning customized parameter 3 is stored in the camera sensor. The Binning customized parameter 3 indicated by register address 3 comes from the configuration parameter packet 3. The camera driver may instruct the camera sensor to access register address 3 and load the Binning customized parameter 3 indicated by it.
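A hypothetical register-level view of S510 and S511: the driver maps the identifier of a customized parameter to a register address, and the sensor switches to the parameters already stored at that address without them being re-transmitted over the driver link. The mapping table, address value, and class names are all illustrative assumptions.

```python
# Illustrative identifier -> register-address table (address value assumed).
PARAM_REGISTER_MAP = {
    "binning_custom_3": 0x0003,  # "register address 3" from the text
}

class CameraSensor:
    def __init__(self, registers):
        self.registers = registers   # address -> customized parameters stored on-sensor
        self.active_params = None

    def load_from(self, address):
        # S511: switch to the parameters already held at this address;
        # nothing needs to be sent over the driver link again.
        self.active_params = self.registers[address]

def driver_apply(sensor, param_id):
    # S510: the driver resolves the identifier to a register address and
    # instructs the sensor to load from it.
    sensor.load_from(PARAM_REGISTER_MAP[param_id])
```

This is what makes identifier-based switching fast: only the small identifier crosses the driver link, while the bulky parameters stay resident in sensor registers.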
S511, the camera sensor loads Binning customization parameter 3 indicated by register address 3.
S512, the camera sensor sends a completion notification 5 to the camera driver.
S513, the camera driver sends a completion notification 5 to the Sensor node.
In some embodiments, after the camera Sensor switches to load the Binning custom parameter 3, a completion notification 5 may be sent to the Sensor node by the camera driver, the completion notification 5 being used to notify that the loading for the Binning custom parameter 3 has been completed.
S514, the Sensor node determines that the streaming needs to be started again.
In some embodiments, after the Sensor node determines that the camera Sensor has loaded the Binning customized parameter 3, it needs to determine again whether the camera mode before and after the current image mode switching changes, and the implementation details may refer to the above S421, which is not described herein again.
S515, the Sensor node sends a streaming starting instruction to the camera driver.
S516, the camera driver sends the streaming starting instruction to the camera sensor.
S517, the camera sensor begins to output images in Binning image mode 3.
In some embodiments, S515 to S517 may refer to S422 to S424 in the foregoing embodiments, which are not described herein again. Thus, referring to S124-S130 above, the camera application may obtain and display the fourth image frame, e.g., display the fourth image frame in the third interface. The fourth image frame is an image obtained after the camera sensor adopts Binning image mode 3.
In addition, it can be understood that, in the video recording mode or the 4k dynamic frame rate mode, pairwise switching between the Binning image mode, the Idcg image mode, and the Quadra image mode (provided the corresponding Quadra customized parameters exist) may also be performed; the switching process may refer to the switching examples in the photographing mode, which are not described herein again.
In summary, with the method provided by the above embodiments, on the one hand, the data volume of a single configuration parameter packet is reduced, and the configuration parameter packet to be configured into the camera sensor is selected according to the camera mode. This avoids the limitation that the storage space of the camera sensor places on the size of the configuration parameter packet, so that more types of image mode switching can be implemented. On the other hand, when the camera mode is unchanged, the image mode is switched faster, which improves the efficiency with which the mobile phone adapts to the shooting scene.
In addition, the above embodiments mainly take the camera application in the shooting preview stage as an example; it can be understood that the method described in the above embodiments still applies when the camera application is in the actual shooting stage, and details are not described here again.
The embodiment of the present application further provides a chip system, which can be applied to the electronic device in the foregoing embodiments. As shown in fig. 16, the system-on-chip includes at least one processor 2201 and at least one interface circuit 2202. The processor 2201 may be a processor in the electronic device described above. The processor 2201 and the interface circuit 2202 may be interconnected by wires. The processor 2201 may receive and execute computer instructions from the memory of the electronic device described above via the interface circuit 2202. The computer instructions, when executed by the processor 2201, may cause the electronic device to perform the various steps in the embodiments described above. Of course, the chip system may further include other discrete devices, which is not specifically limited in this embodiment of the present application.
In some embodiments, it is clear to those skilled in the art from the foregoing description of the embodiments that, for convenience and simplicity of description, the above division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to perform all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or make a contribution to the prior art, or all or part of the technical solutions may be implemented in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: flash memory, removable hard drive, read only memory, random access memory, magnetic or optical disk, and the like.
The above description is only a specific implementation of the embodiments of the present application, but the scope of the embodiments of the present application is not limited thereto, and any changes or substitutions within the technical scope disclosed in the embodiments of the present application should be covered by the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. A camera parameter configuration method is applied to an electronic device, wherein the electronic device comprises a camera and a camera sensor, and the method comprises the following steps:
receiving a first operation, wherein the first operation is used for indicating to open a camera application;
in response to the first operation, sending configuration parameters corresponding to a first image mode to the camera sensor;
displaying a first interface, wherein the first interface is a photographing interface, the first interface comprises a first image frame, and the first image frame is an image obtained by the camera sensor in the first image mode;
sending, by the electronic device, an identifier of a second image mode to the camera sensor under a first condition, the first condition comprising: the electronic device is in a high dynamic range (HDR) scene, or a zoom parameter of the camera is greater than a first multiple;
displaying, in the first interface, a second image frame, the second image frame being an image obtained by the camera sensor in the second image mode.
2. The method of claim 1, further comprising:
receiving a second operation, wherein the second operation is used for instructing display of a second interface, and the second interface is a first video recording interface;
in response to the second operation, sending configuration parameters corresponding to a third image mode to the camera sensor;
displaying the second interface, wherein the second interface comprises a third image frame, and the third image frame is an image obtained by the camera sensor in the third image mode;
receiving a third operation, wherein the third operation is used for instructing display of a third interface, and the third interface is a second video recording interface; the video recording parameters corresponding to the first video recording interface and the second video recording interface are different;
in response to the third operation, sending an identifier of a fourth image mode to the camera sensor;
displaying the third interface, wherein the third interface comprises a fourth image frame, and the fourth image frame is an image obtained by the camera sensor in the fourth image mode.
3. The method according to claim 1 or 2,
before the sending the configuration parameters corresponding to the first image mode to the camera sensor, the method further comprises: sending a first data packet to the camera sensor, wherein the first data packet comprises the configuration parameters corresponding to the first image mode and configuration parameters corresponding to the second image mode;
after the sending the configuration parameters corresponding to the first image mode to the camera sensor, the method further comprises: writing a first value in a first flag bit;
before the sending the identifier of the second image mode to the camera sensor, the method further comprises: determining that a value of the first flag bit is the first value.
4. The method of claim 3,
before the sending the configuration parameters corresponding to the third image mode to the camera sensor, the method further comprises: determining that the configuration parameters corresponding to the third image mode are contained in a second data packet; sending the second data packet to the camera sensor if the second data packet has not been written into the camera sensor; writing a second value in the first flag bit, the second value being different from the first value; determining that the value of the first flag bit is the second value;
after the sending the configuration parameters corresponding to the third image mode to the camera sensor, the method further comprises: writing the first value again in the first flag bit;
before the sending the identifier of the fourth image mode to the camera sensor, the method further comprises: determining that the second data packet comprises the configuration parameters corresponding to the fourth image mode, and that the value of the first flag bit is the first value.
5. The method of claim 2, wherein the video recording parameters comprise a video resolution and a video frame rate, and the second video recording interface is a recording interface for 4k dynamic frame rate video.
6. The method according to any one of claims 1-5, wherein after the sending the configuration parameters corresponding to the first image mode to the camera sensor, the method further comprises:
configuring the camera sensor of the electronic device according to the configuration parameters corresponding to the first image mode;
restarting streaming of the camera sensor of the electronic device and starting to output images in the first image mode.
7. The method of any of claims 1-6, wherein after the sending the identifier of the second image mode to the camera sensor, the method further comprises:
accessing, by the camera sensor of the electronic device in response to the identifier of the second image mode, a first address, wherein the first address is an address at which the camera sensor stores the configuration parameters corresponding to the second image mode;
loading, by the camera sensor of the electronic device, the configuration parameters corresponding to the second image mode;
starting, by the camera sensor of the electronic device, to output images in the second image mode.
8. The method according to any of claims 1-7, wherein, in a case where the first condition is that the electronic device is in an HDR scene, the second image mode is an Idcg image mode suitable for taking a photograph.
9. The method according to any one of claims 1-7, wherein, in a case where the first condition is that the zoom parameter of the camera is greater than the first multiple, the second image mode is a Quadra image mode suitable for taking a photograph.
10. The method of claim 1, wherein after said displaying a second image frame, the method further comprises:
the electronic device sends the identifier of the first image mode to the camera sensor under a second condition; the second condition comprises: the electronic device is not in an HDR scene, and the zoom parameter of the camera is not greater than the first multiple;
displaying the first image frame in the first interface.
11. An electronic device, characterized in that the electronic device comprises one or more processors and memory; the memory is coupled to the processor, the memory for storing computer program code, the computer program code comprising computer instructions, which when executed by the one or more processors, cause the one or more processors to perform the method of any of claims 1-10.
12. A computer storage medium comprising computer instructions that, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-10.
13. A computer program product, characterized in that the computer program product comprises a computer program which, when run on a computer, causes the computer to carry out the method according to any one of claims 1-10.
CN202210428351.7A 2022-04-22 2022-04-22 Camera parameter configuration method and electronic equipment Active CN115550541B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210428351.7A CN115550541B (en) 2022-04-22 2022-04-22 Camera parameter configuration method and electronic equipment


Publications (2)

Publication Number Publication Date
CN115550541A true CN115550541A (en) 2022-12-30
CN115550541B CN115550541B (en) 2024-04-09

Family

ID=84724177

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210428351.7A Active CN115550541B (en) 2022-04-22 2022-04-22 Camera parameter configuration method and electronic equipment

Country Status (1)

Country Link
CN (1) CN115550541B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116567407A (en) * 2023-05-04 2023-08-08 荣耀终端有限公司 Camera parameter configuration method and electronic equipment
CN117082340A (en) * 2023-10-16 2023-11-17 荣耀终端有限公司 High dynamic range mode selection method, electronic equipment and storage medium
CN117119291A (en) * 2023-02-06 2023-11-24 荣耀终端有限公司 Picture mode switching method and electronic equipment
CN117692790A (en) * 2023-07-20 2024-03-12 荣耀终端有限公司 Image data processing method and related device
CN117714837A (en) * 2023-08-31 2024-03-15 荣耀终端有限公司 Camera parameter configuration method and electronic equipment
WO2024093518A1 (en) * 2022-11-01 2024-05-10 荣耀终端有限公司 Image readout mode switching method and related device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110944160A (en) * 2019-11-06 2020-03-31 Vivo Mobile Communication Co., Ltd. Image processing method and electronic equipment
CN112351193A (en) * 2020-09-17 2021-02-09 GalaxyCore (Shanghai) Co., Ltd. Zooming method based on time sequence control, image acquisition equipment and storage medium
CN113038019A (en) * 2021-03-24 2021-06-25 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Camera adjusting method and device, electronic equipment and readable storage medium
KR20210090476A (en) * 2020-01-10 2021-07-20 Samsung Electronics Co., Ltd. The method to improve image quality in zoom scenarios with a single camera and electronics that contains it
CN113347334A (en) * 2021-05-31 2021-09-03 Huizhou TCL Mobile Communication Co., Ltd. Nondestructive photographing method and device for mobile terminal, terminal equipment and storage medium
US20210360157A1 (en) * 2018-09-19 2021-11-18 Gopro, Inc. Automated Camera Mode Selection
CN114007011A (en) * 2021-10-26 2022-02-01 SIMCom Information Technology (Shanghai) Co., Ltd. Output mode switching method and system based on camera

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024093518A1 (en) * 2022-11-01 2024-05-10 Honor Device Co., Ltd. Image readout mode switching method and related device
CN117119291A (en) * 2023-02-06 2023-11-24 Honor Device Co., Ltd. Picture mode switching method and electronic equipment
CN116567407A (en) * 2023-05-04 2023-08-08 Honor Device Co., Ltd. Camera parameter configuration method and electronic equipment
CN116567407B (en) * 2023-05-04 2024-05-03 Honor Device Co., Ltd. Camera parameter configuration method and electronic equipment
CN117692790A (en) * 2023-07-20 2024-03-12 Honor Device Co., Ltd. Image data processing method and related device
CN117714837A (en) * 2023-08-31 2024-03-15 Honor Device Co., Ltd. Camera parameter configuration method and electronic equipment
CN117082340A (en) * 2023-10-16 2023-11-17 Honor Device Co., Ltd. High dynamic range mode selection method, electronic equipment and storage medium
CN117082340B (en) * 2023-10-16 2024-05-24 Honor Device Co., Ltd. High dynamic range mode selection method, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN115550541B (en) 2024-04-09

Similar Documents

Publication Publication Date Title
CN115550541B (en) Camera parameter configuration method and electronic equipment
CN111212235B (en) Long-focus shooting method and electronic equipment
WO2021213477A1 (en) Viewfinding method for multichannel video recording, graphic user interface, and electronic device
WO2023015981A1 (en) Image processing method and related device therefor
US20220417416A1 (en) Photographing method in telephoto scenario and mobile terminal
CN111491102B (en) Detection method and system for photographing scene, mobile terminal and storage medium
CN116567407B (en) Camera parameter configuration method and electronic equipment
US20150271388A1 (en) Photographing Starting Apparatus And Method For Built-In Camera Of Portable Data Processing Device
CN113630558B (en) Camera exposure method and electronic equipment
CN111866392A (en) Shooting prompting method and device, storage medium and electronic equipment
WO2023231687A1 (en) Camera switching method and electronic device
CN114845035A (en) Distributed shooting method, electronic equipment and medium
CN115604572A (en) Image acquisition method and device
CN115529413A (en) Shooting method and related device
WO2023160230A1 (en) Photographing method and related device
KR102028125B1 (en) Method and Apparatus for executing an application using IP address
US11622078B2 (en) Method and apparatus for image formation using preview images
WO2024093518A1 (en) Image readout mode switching method and related device
CN117560574B (en) Shooting method, electronic equipment and readable storage medium
CN116347212B (en) Automatic photographing method and electronic equipment
CN117714837A (en) Camera parameter configuration method and electronic equipment
WO2022161058A1 (en) Photographing method for panoramic image, and electronic device
WO2023160223A1 (en) Multi-camera strategy scheduling method and related device thereof
CN117082340B (en) High dynamic range mode selection method, electronic equipment and storage medium
WO2023160269A1 (en) Electronic device and photographing method thereof, and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant