CN116659659A - Calibration method of ambient light sensor, electronic equipment and chip system


Info

Publication number
CN116659659A
CN116659659A (application CN202211422682.6A; granted publication CN116659659B)
Authority
CN
China
Prior art keywords
ambient light
processor
light data
inter
light sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211422682.6A
Other languages
Chinese (zh)
Other versions
CN116659659B (en)
Inventor
王思文
张文礼
王浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202211422682.6A
Publication of CN116659659A
Application granted
Publication of CN116659659B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01J: MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 1/00: Photometry, e.g. photographic exposure meter

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Telephone Function (AREA)

Abstract

Embodiments of the present application provide a calibration method for an ambient light sensor, an electronic device, and a chip system, relating to the technical field of under-screen ambient light. During the first cycle test, inter-core communication is established between the first processor and the second processor, and it does not need to be destroyed when that test ends; in subsequent cycle tests, the inter-core communication neither needs to be re-established nor destroyed, which saves test time. In addition, during the first cycle test the first processor sends a report request to the second processor, after which the second processor reports the ambient light data periodically collected by the ambient light sensor to the first processor in real time, reducing the number of acquisition requests that must be sent and further shortening the test duration. This method improves the calibration efficiency of the ambient light sensor.

Description

Calibration method of ambient light sensor, electronic equipment and chip system
Technical Field
The embodiment of the application relates to the field of under-screen ambient light, in particular to a calibration method of an ambient light sensor, electronic equipment and a chip system.
Background
With the development of electronic devices, the screen-to-body ratio of electronic devices has become higher and higher. To pursue an extreme screen-to-body ratio, the ambient light sensor of an electronic device may be disposed under its organic light-emitting diode (OLED) screen. Because the OLED screen itself emits light, the ambient light collected by a sensor placed below it contains noise. This noise is typically related to the relative position between the ambient light sensor and the display screen. Therefore, the ambient light sensor must be calibrated before the electronic device leaves the factory and after the electronic device is maintained (e.g., replacement of the display screen, disassembly and reassembly, etc.). The calibration procedure of the ambient light sensor comprises many (even hundreds of) cycles of a test procedure, which makes the calibration procedure time-consuming and its efficiency low.
Disclosure of Invention
Embodiments of the present application provide a calibration method for an ambient light sensor, an electronic device, and a chip system, which address the long duration and low efficiency of ambient light sensor calibration.
To achieve the above objective, the present application adopts the following technical solutions:
in a first aspect, an embodiment of the present application provides a calibration method for an ambient light sensor, which is applied to an electronic device, where the electronic device includes a display screen, the ambient light sensor located below the display screen, a first processor, and a second processor, and the method includes:
During the 1st cycle test, the display screen displays a first image;
the first processor establishing inter-core communication between the first processor and the second processor;
the first processor sends a report request of the ambient light data to the second processor based on the inter-core communication, wherein the report request is used for indicating the second processor to report the received ambient light data to the first processor after receiving the ambient light data reported by the ambient light sensor;
the first processor receives the ambient light data reported by the second processor based on the inter-core communication, wherein the ambient light data comprises first ambient light data collected by the ambient light sensor when the display screen displays the first image;
in the ith cycle test, the display screen displays a second image, where 1 < i ≤ n and n is the preset total number of cycle tests;
the first processor receives ambient light data reported by the second processor based on the inter-core communication, wherein the ambient light data comprises second ambient light data collected by the ambient light sensor when the display screen displays the second image, and the first ambient light data and the second ambient light data are used for calibrating the ambient light sensor;
After the nth cycle test is completed, the first processor destroys inter-core communications between the first processor and the second processor.
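The cycle structure above can be sketched as a small simulation (a minimal Python sketch; `InterCoreChannel`, the display object, and `read_sample` are hypothetical stand-ins for illustration, not the patent's actual interfaces):

```python
class InterCoreChannel:
    """Hypothetical stand-in for the first <-> second processor link."""
    def __init__(self):
        self.established = False
        self.reporting = False

    def establish(self):
        self.established = True

    def request_reporting(self):
        # Report request: from now on the second processor pushes every
        # new sensor sample to the first processor without being asked.
        self.reporting = True

    def destroy(self):
        self.established = False
        self.reporting = False


def run_calibration(channel, display, read_sample, n):
    """Collect one ambient-light sample per cycle, reusing one channel."""
    samples = []
    for i in range(1, n + 1):
        display.show_image(i)          # first image in cycle 1, second image afterwards
        if i == 1:
            # Establish inter-core communication and request continuous
            # reporting only once, in the first cycle.
            channel.establish()
            channel.request_reporting()
        samples.append(read_sample())  # data pushed by the second processor
    # Destroy inter-core communication only after the nth (last) cycle.
    channel.destroy()
    return samples
```

The point of the sketch is the placement of `establish`/`request_reporting` inside the `i == 1` branch and of `destroy` after the loop, rather than inside each iteration.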
In the present application, during the first cycle test, inter-core communication is established between the first processor and the second processor, and it is not destroyed when the first test ends; in non-first cycle tests, the inter-core communication neither needs to be established nor destroyed, and it is only destroyed after the last cycle test is finished. This method reduces the number of times inter-core communication is established and destroyed, thereby shortening the calibration time of the ambient light sensor and improving efficiency.
In addition, during the first cycle test the first processor sends a report request for ambient light data to the second processor, and after receiving the request the second processor actively reports every received ambient light sample to the first processor. In subsequent, non-first cycle tests the first processor therefore does not need to send acquisition requests for ambient light data to the second processor, which further reduces the calibration time of the ambient light sensor and improves efficiency.
In one implementation manner of the first aspect, the first processor includes: a production service process and a HAL interface, the first processor establishing inter-core communication between the first processor and the second processor comprising:
The production service process sends a first acquisition request of ambient light data to the HAL interface;
in response to the first acquisition request, the HAL interface establishes inter-core communication between the HAL interface and the second processor;
the first processor sending a report request for ambient light data to the second processor based on the inter-core communication includes:
the HAL interface sends a report request of ambient light data to the second processor based on the inter-core communication.
In another implementation manner of the first aspect, the first ambient light data is the first sample of ambient light data that the HAL interface receives from the second processor, based on the inter-core communication, after the HAL interface sends the report request for ambient light data to the second processor.
In the embodiment of the application, the HAL interface establishes inter-core communication with the second processor after the display screen displays the first image; therefore, to save time, the first ambient light data reported by the second processor can be used directly as the ambient light data for calibrating the ambient light sensor.
In another implementation manner of the first aspect, after the HAL interface receives the first ambient light data reported by the second processor based on the inter-core communication, the method further includes:
The HAL interface sends the first ambient light data to the production service process.
In another implementation manner of the first aspect, after the HAL interface sends the report request for ambient light data to the second processor based on the inter-core communication, the method further includes:
if the HAL interface does not receive ambient light data reported by the second processor based on the inter-core communication within a first duration, the HAL interface destroys the inter-core communication between the HAL interface and the second processor.
In the embodiment of the application, when the inter-core communication fails or the ambient light sensor on the second processor side is faulty, the first processor side cannot receive the ambient light data sent by the second processor side. To prevent the first processor from waiting indefinitely for ambient light data during the calibration of the ambient light sensor, a first duration can be set: if no ambient light data reported by the second processor side is received within the first duration, the current calibration flow is considered abnormal and the inter-core communication should be destroyed.
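One way to realize the timeout guard described above is a bounded wait on the reporting channel (a sketch only; the queue-based transport, the function name, and the one-second default are illustrative assumptions):

```python
import queue


def receive_with_timeout(report_queue, channel, first_duration_s=1.0):
    """Wait up to first_duration_s for a reported sample.

    On timeout, treat the calibration flow as abnormal and destroy
    the inter-core communication, as the text above describes.
    """
    try:
        return report_queue.get(timeout=first_duration_s)
    except queue.Empty:
        # No ambient light data arrived within the first duration.
        channel.destroy()
        return None
```

A caller would check the return value for `None` to detect the abnormal case instead of blocking forever.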
In another implementation manner of the first aspect, the first processor includes: a production service process and a HAL interface, the display screen displaying the second image, the method further comprising:
The production service process sends a second acquisition request of the ambient light data to the HAL interface;
and in response to the second acquisition request, the HAL interface sends second ambient light data reported by the second processor based on the inter-core communication to the production service process.
In another implementation manner of the first aspect, the second ambient light data is last ambient light data received before the HAL interface receives the second acquisition request, or is first ambient light data received after the HAL interface receives the second acquisition request.
In the present application, the ambient light sensor collects ambient light over fixed periods, and each time it collects a sample it reports that sample to the ambient light sensor driver on the second processor side; the driver in turn reports each received sample to the first processor side. The HAL interface therefore receives ambient light data continuously. After the display screen displays the second image, the production service process sends the acquisition request to the HAL, so either the last ambient light sample received from the SCP processor before the HAL interface receives the acquisition request, or the first sample received from the SCP processor after the HAL interface receives the acquisition request, can be used as the ambient light data collected by the ambient light sensor while the display screen displays the second image.
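The two options described above (the last sample received before the acquisition request, or the first sample received after it) can be sketched with a small cache on the HAL side (the class and method names are hypothetical, chosen only to illustrate the selection logic):

```python
import threading


class AmbientLightCache:
    """HAL-side cache of samples continuously pushed by the SCP side."""

    def __init__(self):
        self._latest = None
        self._cond = threading.Condition()

    def on_report(self, sample):
        # Called every time the SCP processor reports a new sample.
        with self._cond:
            self._latest = sample
            self._cond.notify_all()

    def acquire(self, wait_for_next=False, timeout=1.0):
        """Return either the cached last sample or the next fresh one."""
        with self._cond:
            if not wait_for_next and self._latest is not None:
                return self._latest          # last sample before the request
            marker = self._latest
            self._cond.wait_for(lambda: self._latest is not marker,
                                timeout=timeout)
            return self._latest              # first sample after the request
</antml>```

Both policies are valid under the continuous-reporting scheme; the cached-sample policy avoids waiting one extra sensor period.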
In another implementation manner of the first aspect, the first processor includes: a production service process and a HAL interface, the first processor destroying inter-core communications between the first processor and the second processor comprising:
the production service process sends a destruction instruction of the inter-core communication to the HAL interface;
in response to the destroy instruction, the HAL interface destroys inter-core communications between the HAL interface and the second processor.
In another implementation manner of the first aspect, the method further includes:
and responding to the first acquisition request, registering a callback receiving thread by the HAL interface, and running the callback receiving thread, wherein the callback receiving thread is used for receiving the ambient light data reported by the second processor based on the inter-core communication.
In another implementation manner of the first aspect, the method further includes:
and after the nth cycle test is finished, the HAL interface destroys the callback receiving thread.
In the application, a callback receiving thread can be registered, and the callback receiving thread can receive the ambient light data reported by the SCP processor side.
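A minimal sketch of registering and later destroying such a callback receiving thread (the queue transport, polling interval, and function names are assumptions for illustration, not the patent's actual mechanism):

```python
import queue
import threading


def register_callback_thread(report_queue, on_sample):
    """Start a thread that delivers SCP-reported samples to on_sample.

    Returns a destroy() function that stops and joins the thread,
    mirroring the destruction step after the nth cycle test.
    """
    stop = threading.Event()

    def loop():
        while not stop.is_set():
            try:
                sample = report_queue.get(timeout=0.1)
            except queue.Empty:
                continue            # poll again until asked to stop
            on_sample(sample)       # forward the reported ambient light data

    t = threading.Thread(target=loop, daemon=True)
    t.start()

    def destroy():
        stop.set()
        t.join()

    return destroy
```

The returned `destroy` callable plays the role of the callback-thread teardown performed once all cycle tests are complete.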
In a second aspect, there is provided an electronic device comprising a processor for executing a computer program stored in a memory, implementing the method of any one of the first aspects of the application.
In a third aspect, there is provided a system on a chip comprising a processor coupled to a memory, the processor executing a computer program stored in the memory to implement the method of any of the first aspects of the application.
In a fourth aspect, there is provided a computer readable storage medium storing a computer program which when executed by one or more processors performs the method of any of the first aspects of the application.
In a fifth aspect, embodiments of the present application provide a computer program product for, when run on a device, causing the device to perform the method of any one of the first aspects of the present application.
It will be appreciated that the advantages of the second to fifth aspects may be found in the relevant description of the first aspect, and are not described here again.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of an electronic device to which a data processing method according to an embodiment of the present application is applied;
fig. 2 is a diagram of a positional relationship between an ambient light sensor and a display screen in an electronic device according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a target area of an ambient light sensor according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a process of measuring a projection position of a center point of an ambient light sensor on a display screen according to an embodiment of the present application;
FIG. 5 is a schematic diagram of another process for measuring a projection position of a center point of an ambient light sensor on a display screen according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an annular image displayed when calibrating a target area according to an embodiment of the present application;
FIG. 7 is a schematic diagram of calibrating interference coefficients of sub-regions according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a technical architecture to which an ambient light sensor calibration method according to an embodiment of the present application is applied;
FIG. 9 is a schematic flow chart of calibrating an ambient light sensor based on the technical architecture shown in FIG. 8 according to an embodiment of the present application;
FIG. 10 is a timing diagram illustrating an implementation procedure of step S27 in FIG. 9 according to an embodiment of the present application;
fig. 11 is a timing chart illustrating another implementation procedure of step S27 in fig. 9 according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that in embodiments of the present application, "one or more" means one, two, or more than two; "and/or" describes an association relationship between associated objects and indicates that three relationships may exist. For example, "A and/or B" may represent: A alone, both A and B, and B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
Furthermore, in the description of the present specification and the appended claims, the terms "first," "second," "third," "fourth," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The calibration method of the ambient light sensor provided by the embodiment of the application can be applied to electronic equipment provided with an OLED screen. The electronic device may be a tablet computer, a mobile phone, a wearable device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (personal digital assistant, PDA), or the like, provided with an under-screen ambient light sensor. The embodiment of the application does not limit the specific type of the electronic equipment.
Fig. 1 shows a schematic structural diagram of an electronic device. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include, among other things, a pressure sensor 180A, a touch sensor 180K, an ambient light sensor 180L, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a coprocessor (sensor coprocessor, SCP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), or the like. Wherein the different processing units may be separate devices or may be integrated in one or more processors. For example, the processor 110 is configured to perform a calibration method for an ambient light sensor in an embodiment of the application.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store application programs (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system.
In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio signals to analog audio signal outputs and also to convert analog audio inputs to digital audio signals. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as an "earpiece", is used to convert the audio electrical signal into a sound signal. When the electronic device 100 is answering a telephone call or a voice message, voice may be received by placing the receiver 170B in close proximity to the human ear.
Microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can speak near the microphone 170C, inputting a sound signal to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, and may implement a noise reduction function in addition to collecting voice information. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to enable collection of sound signals, noise reduction, identification of sound sources, directional recording functions, etc. For example, microphone 170C may be used to collect voice information related to embodiments of the present application.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensors 180A, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. A capacitive pressure sensor may comprise at least two parallel plates with conductive material. The capacitance between the electrodes changes when a force is applied to the pressure sensor 180A. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on a surface of the electronic device 100 at a location different from that of the display screen 194.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195 or removed from the SIM card interface 195 to make contact with or be separated from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface 195 simultaneously. The types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The embodiment of the present application does not particularly limit the specific structure of the execution subject of the calibration method of the ambient light sensor, as long as the execution subject can run code recording the calibration method of the ambient light sensor of the embodiment of the present application. For example, the execution subject of the calibration method of the ambient light sensor provided by the embodiment of the present application may be a functional module in the electronic device that can call and execute a program, or a communication apparatus, such as a chip, applied to the electronic device.
The calibration method of the ambient light sensor provided by the embodiment of the application is used for calibrating the ambient light sensor of the electronic equipment provided with the under-screen ambient light sensor. The electronic device provided with the under-screen ambient light sensor can be a mobile phone, a tablet computer, an electronic watch and the like.
The calibration method of the ambient light sensor provided by the embodiment of the application can be also applied to the calibration of the ambient light sensor of the electronic equipment provided with the non-under-screen ambient light sensor. The electronic device provided with the non-under-screen ambient light sensor can also be a mobile phone, a tablet computer, an electronic watch and the like.
The following embodiments of the present application take a mobile phone provided with an under-screen ambient light sensor as an example of the electronic device whose ambient light sensor is to be calibrated; this does not indicate that the embodiments of the present application are only applicable to a mobile phone provided with an under-screen ambient light sensor.
Fig. 2 is a schematic structural diagram of a mobile phone with an ambient light sensor to be calibrated according to an embodiment of the present application. The display screen of the mobile phone is an OLED screen, and the ambient light sensor of the mobile phone is disposed below the OLED screen. Therefore, the display module in the display screen does not need to be notched to place the ambient light sensor, and the area of the display screen directly above the ambient light sensor can still display images.
The OLED screen is a self-luminous display screen. When the OLED screen displays an image, a user can see the image from above the display screen, and the ambient light sensor below the OLED screen also collects light corresponding to the image displayed by the OLED screen. Thus, the ambient light collected by the ambient light sensor includes both the light emitted by the display screen and the real external ambient light.
In order to accurately obtain the real ambient light, in addition to the ambient light collected by the ambient light sensor, the noise corresponding to the light emitted by the display screen needs to be obtained. The real external ambient light is obtained by subtracting the noise corresponding to the light emitted by the display screen from the ambient light collected by the ambient light sensor.
In practical applications, the projected area of the ambient light sensor on the display screen is much smaller than the area of the display screen itself. Therefore, it is not the light emitted by the entire display screen that interferes with the ambient light collected by the ambient light sensor, but rather the light emitted by the display area directly above the ambient light sensor and by the display area within a certain range around it.
Referring to fig. 3, the region of the display screen that disturbs the ambient light collected by the ambient light sensor is referred to as the target area. That is, the noise that interferes with the ambient light collected by the ambient light sensor can be obtained from the image displayed in the target area of the display screen and the brightness of the display screen.
When the embodiment of the present application is used to calibrate the ambient light sensor of the mobile phone, the position of the target area in the display screen of the mobile phone can be obtained through a calibration test.
When determining the target area on the display screen above the ambient light sensor, the projection position of the center point of the ambient light sensor on the display screen is first obtained through a calibration test, and the target area is then determined in the display screen of the mobile phone based on the projection position together with a length and a width.
In practical applications, when each pixel point in the target area of the display screen displays the same content, a sub-area of the target area that is far from the center point of the ambient light sensor interferes less with the initial ambient light collected by the ambient light sensor than a sub-area that is near the center point. Therefore, the target area needs to be divided into a plurality of sub-areas, and the interference coefficient of each sub-area in the target area is then calibrated to obtain the interference coefficients of the sub-areas.
In addition, the noise generated by each pixel point in the target area differs with the color (RGB) displayed and with the brightness. Therefore, it is also necessary to obtain brightness fitting curves for the different colors through a calibration test. The brightness fitting curves are used to determine the noise generated by different images displayed in the target area under different brightnesses.
Each of the calibration items listed above requires the mobile phone to be in a dark environment. Therefore, the dark environment in which the mobile phone is located needs to be calibrated before the calibrations begin and again after they end, to ensure that the dark environment meets the standard throughout the calibration test.
In view of the above analysis, when calibrating an ambient light sensor below a display screen of a mobile phone before the mobile phone leaves the factory, at least the following are calibrated: dark environment calibration, center point calibration of an ambient light sensor, target area calibration, interference coefficient calibration of each sub-area, brightness fitting curve calibration, dark environment calibration and the like.
Of course, in practical applications, some of the above calibration procedures may be selected for the calibration process of the ambient light sensor according to the actual situation, and other calibration procedures not listed above may also be added to the calibration process of the ambient light sensor.
Each of the calibration procedures listed above may require multiple measurements.
As one example, when dark environment calibration is performed, the mobile phone is placed in production line test equipment, which can provide a dark environment during the calibration test. The ambient light sensor in a mobile phone is typically disposed in an upper region of the phone, so a dark environment calibration image (e.g., an 80×80-pixel image, where 80 denotes the number of pixel points) can be displayed on the display screen of the mobile phone in a region far from the ambient light sensor. The dark environment calibration image may be a highlighted white image block (e.g., at the maximum brightness value the display screen can be set to). After the dark environment calibration image is displayed stably, ambient light data can be collected multiple times through the ambient light sensor of the mobile phone and the average value of the ambient light data calculated; if the average value is smaller than a threshold value, the environment in which the ambient light sensor of the mobile phone is located belongs to a dark environment. Of course, the variance of the multiple pieces of ambient light data may also be calculated to determine whether the ambient light data is stable. The embodiment of the present application does not limit the condition for determining whether the environment belongs to a dark environment in dark environment calibration.
As can be appreciated from this example, the dark environment calibration includes a process of measuring ambient light data multiple times.
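As a rough sketch, the dark-environment decision described above (average below a threshold, with an optional variance check for stability) could look as follows. The function name, the sample values, and the two thresholds are illustrative assumptions, not values taken from the embodiment:

```python
# Hypothetical sketch of the dark-environment check: the mean of the
# readings must be below a dimness threshold, and the variance below a
# stability threshold. Threshold values here are illustrative only.

def is_dark_environment(samples, mean_threshold=5.0, var_threshold=1.0):
    """Return True if the ambient light readings are both dim and stable.

    samples: ambient light data collected multiple times after the
    dark-environment calibration image is displayed stably.
    """
    mean = sum(samples) / len(samples)
    # Population variance, used to judge whether the readings are stable.
    var = sum((s - mean) ** 2 for s in samples) / len(samples)
    return mean < mean_threshold and var < var_threshold

# A dim, stable reading sequence passes; a bright one fails.
print(is_dark_environment([1.2, 1.1, 1.3, 1.2]))   # True
print(is_dark_environment([120.0, 118.5, 121.2]))  # False
```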
As another example, when calibrating the center point of the ambient light sensor, a center point calibration image may be provided, which may be a smaller-sized image, for example, a 3×3-pixel image, a 5×5-pixel image, a 7×7-pixel image, or a 10×10-pixel image, where the numerals 3, 5, 7, and 10 denote the number of pixel points. Of course, the center point calibration image, which is also a white image block, may also be of the same size as the ambient light sensor. The display screen of the mobile phone may be set to display the center point calibration image at an initial position (e.g., the predicted center position of the ambient light sensor, or the upper left corner of that predicted center position, etc.); the display position of the center point calibration image is then moved laterally (x direction), and afterwards vertically (y direction), with each movement covering only one pixel position.
Referring to fig. 4, the center point calibration image is displayed at each of the positions A1, A2, A3, A4, and A5 in order, and after the center point calibration image is displayed stably at each position, the ambient light sensor tests the ambient light data while the image is displayed at that position. When the ambient light data first increases and then decreases, the position corresponding to the highest (brightest) ambient light data is the center point in the lateral direction. Taking A3 as the position corresponding to the highest ambient light data, the vertical center point is then searched for with A3 as the starting point.
Referring to fig. 5, the center point calibration image is displayed at each of the positions B1 (i.e., A3), B2, B3, B4, and B5 in order, and after the center point calibration image is displayed stably at each position, the ambient light sensor tests the ambient light data while the image is displayed at that position. When the ambient light data first increases and then decreases, the position corresponding to the highest (brightest) ambient light data is the center point in the vertical direction.
Of course, in practical applications, if the collected ambient light data keeps becoming smaller (darker) as the position changes, the search needs to move in the opposite direction according to the same principle.
The center point of the ambient light sensor can be determined in this way because, when the center point calibration image is displayed on the display screen, the ambient light data collected by the ambient light sensor of the mobile phone is largest (brightest) when the center point of the center point calibration image coincides with the projection of the center point of the ambient light sensor on the display screen.
It can be appreciated from this calibration procedure that testing the projection position of the center point of the ambient light sensor on the display screen also requires measuring the ambient light data multiple times.
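The lateral-then-vertical search above can be sketched as follows. For simplicity the sketch scans a whole candidate range and takes the maximum, rather than stopping as soon as the readings start to fall; `read_ambient_light` is a hypothetical stand-in for displaying the calibration image at a position and reading the sensor, simulated here with a brightness peak:

```python
# Illustrative sketch of the centre-point search: scan x at a fixed y,
# take the brightest position, then scan y through that lateral peak.

def find_peak(positions, read_ambient_light):
    """Return the position with the highest (brightest) reading, i.e.
    where the calibration image best overlaps the sensor projection."""
    best_pos, best_val = None, float("-inf")
    for pos in positions:
        val = read_ambient_light(pos)
        if val > best_val:
            best_pos, best_val = pos, val
    return best_pos

def find_center(x_range, y_range, read_ambient_light):
    # First scan laterally (x direction) at a fixed starting y ...
    y0 = y_range[0]
    cx = find_peak([(x, y0) for x in x_range], read_ambient_light)[0]
    # ... then scan vertically (y direction) through the lateral peak.
    cy = find_peak([(cx, y) for y in y_range], read_ambient_light)[1]
    return cx, cy

# Simulated sensor: brightest when the image is centred at (3, 7).
sim = lambda p: -((p[0] - 3) ** 2 + (p[1] - 7) ** 2)
print(find_center(range(0, 6), range(5, 10), sim))  # (3, 7)
```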
In the embodiment of the present application, the target area may be calibrated by setting or by measurement. For example, the center point of the target area may be set as the center point of the ambient light sensor, and the area determined by a fixed length and a fixed width may be set as the target area. Alternatively, an annular calibration image may be displayed with the center point of the ambient light sensor as its center point.
Referring to fig. 6, the center point of the annular calibration image is the center point of the ambient light sensor. The inner diameter of the annular calibration image displayed on the display screen is enlarged step by step, and the ambient light data collected by the ambient light sensor is measured each time. If the collected ambient light data is smaller than a preset threshold value, the image at that position no longer interferes with the ambient light collected by the ambient light sensor; in this case, the area corresponding to the inner diameter of the annular calibration image is the target area.
It can be appreciated from this calibration procedure that calibrating the target area also requires measuring the ambient light data multiple times.
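A minimal sketch of this annular search, under the assumption that the ring's inner radius is enlarged one step at a time; `measure_with_ring` is a hypothetical stand-in for displaying the ring at a given inner radius and reading the sensor:

```python
# Hedged sketch of the annular target-area calibration: grow the inner
# radius until the ring's light no longer disturbs the sensor reading.

def find_target_radius(radii, measure_with_ring, threshold):
    """Return the smallest inner radius at which the ring no longer
    interferes with the ambient light sensor; the area of that inner
    diameter around the sensor's centre point is the target area."""
    for r in radii:
        if measure_with_ring(r) < threshold:
            return r
    return None  # every tested ring still interfered

# Simulated interference that falls off with the ring's inner radius.
sim = lambda r: 100.0 / (r * r)
print(find_target_radius(range(1, 20), sim, threshold=1.0))  # 11
```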
As another example, when calibrating the interference coefficient of each sub-area in the target area, the target area may be divided into a plurality of sub-areas. Referring to fig. 7, when measuring the interference coefficients during calibration, each sub-area in the target area may be lit up in turn: the lit sub-area displays a white image while the other sub-areas display black. After the image displayed by each sub-area is stable, the ambient light data collected by the ambient light sensor is obtained. The interference coefficient of each sub-area is then obtained as the proportion of the ambient light data corresponding to that sub-area to the sum of the ambient light data corresponding to all sub-areas.
It can be appreciated from this example that calibrating the interference coefficient of each sub-area also requires measuring the ambient light data multiple times.
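The proportional computation above can be sketched directly; the reading values are made-up illustrations:

```python
# Sketch of the per-sub-area interference-coefficient calibration: each
# sub-area is lit in turn (white on black), the sensor reading is
# recorded, and each coefficient is that reading's share of the sum.

def interference_coefficients(readings):
    """readings: ambient light data measured while each sub-area in
    turn displays a white image; returns one coefficient per sub-area."""
    total = sum(readings)
    return [r / total for r in readings]

coeffs = interference_coefficients([40.0, 30.0, 20.0, 10.0])
print(coeffs)  # [0.4, 0.3, 0.2, 0.1]
```

By construction the coefficients sum to 1, so sub-areas nearer the sensor's center point (which produce larger readings) receive proportionally larger weights.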
As another example, when calibrating the brightness fitting curves, the embodiment of the present application needs to obtain the data collected by the ambient light sensor for different colors and different brightnesses displayed in a brightness curve calibration area of the display screen (the brightness curve calibration area may be the target area, or may contain the target area); the data collected by the ambient light sensor can be converted into the noise corresponding to the different colors and different brightnesses. Therefore, the ambient light data needs to be measured for each combination of color and brightness. If 6 brightness values are set and matched with R, G, and B, the 18 entries of data corresponding to different colors and different brightnesses shown in Table 1 can be obtained.
Table 1 Entry data corresponding to different colors and different brightnesses

Color | Brightness | Data collected by the ambient light sensor
R | Brightness 1 | Data 1
R | Brightness 2 | Data 2
R | Brightness 3 | Data 3
R | Brightness 4 | Data 4
R | Brightness 5 | Data 5
R | Brightness 6 | Data 6
G | Brightness 1 | Data 7
G | Brightness 2 | Data 8
G | Brightness 3 | Data 9
G | Brightness 4 | Data 10
G | Brightness 5 | Data 11
G | Brightness 6 | Data 12
B | Brightness 1 | Data 13
B | Brightness 2 | Data 14
B | Brightness 3 | Data 15
B | Brightness 4 | Data 16
B | Brightness 5 | Data 17
B | Brightness 6 | Data 18
It can be appreciated from this example that this calibration procedure also requires multiple measurements.
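One way the six per-channel entries of Table 1 could be turned into a usable brightness fitting curve is sketched below. Linear interpolation between calibration points is an assumption of this sketch (the embodiment does not specify the fitting model), and all sample values are made up:

```python
# Illustrative sketch: build a brightness -> noise lookup for one colour
# channel from its six (brightness, sensor data) calibration entries.

def make_brightness_curve(samples):
    """samples: sorted (brightness, sensor_data) pairs for one colour
    channel; returns a function mapping brightness -> estimated noise."""
    def curve(b):
        if b <= samples[0][0]:
            return samples[0][1]
        for (b0, d0), (b1, d1) in zip(samples, samples[1:]):
            if b0 <= b <= b1:
                t = (b - b0) / (b1 - b0)
                return d0 + t * (d1 - d0)  # linear interpolation
        return samples[-1][1]
    return curve

# Six brightness levels for the R channel (values are made up).
r_curve = make_brightness_curve(
    [(10, 1.0), (50, 4.0), (90, 8.0), (130, 13.0), (170, 19.0), (210, 26.0)])
print(r_curve(70))   # 6.0 (halfway between brightness 50 and 90)
print(r_curve(210))  # 26.0
```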
After the calibration data of the ambient light sensor is obtained through the above calibration processes, the noise of the target area can be calculated using the calibration data whenever the ambient light sensor collects ambient light.
As an example, a target image that interferes with the ambient light data is first cropped from the image displayed on the display screen according to the center point, length, and width of the target area. The noise generated by each pixel point in the target area is then obtained based on the color of the pixel point in the target image, the brightness of the display screen when the image is displayed, and the interference coefficient of the sub-area in which the pixel point is located; finally, the noise generated by all the pixel points in the target area is obtained.
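A hedged sketch of that per-pixel accumulation follows. Scaling the per-channel brightness curves by the pixel's normalized channel intensities is a modelling assumption of this sketch, not the embodiment's exact formula, and all names and values are illustrative:

```python
# Sketch: for each pixel of the target image, estimate the noise implied
# by its colour and the current screen brightness, weight it by the
# interference coefficient of the sub-area containing the pixel, and sum.

def target_area_noise(pixels, channel_curves, brightness, coeff_of):
    """pixels: iterable of (x, y, r, g, b); channel_curves: per-channel
    brightness -> noise functions; coeff_of(x, y): interference
    coefficient of the sub-area containing (x, y)."""
    total = 0.0
    for x, y, r, g, b in pixels:
        # Channel intensities scale the per-channel calibration curves
        # (an assumption for illustration).
        pixel_noise = (r / 255 * channel_curves["R"](brightness)
                       + g / 255 * channel_curves["G"](brightness)
                       + b / 255 * channel_curves["B"](brightness))
        total += coeff_of(x, y) * pixel_noise
    return total

curves = {"R": lambda b: 0.01 * b, "G": lambda b: 0.02 * b,
          "B": lambda b: 0.005 * b}
# Two white pixels, each in a sub-area with coefficient 0.5.
pixels = [(0, 0, 255, 255, 255), (1, 0, 255, 255, 255)]
noise = target_area_noise(pixels, curves, brightness=100,
                          coeff_of=lambda x, y: 0.5)
print(noise)  # 3.5
```

The real external ambient light would then be the sensor reading minus this noise estimate, as described earlier.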
As can be appreciated from the above examples of calibrating the ambient light sensor of a mobile phone, whichever parameter is measured during calibration, the display screen of the mobile phone needs to be controlled to display a specific image (e.g., the dark environment calibration image, the center point calibration image, the annular image, the sub-area images, or the brightness curve calibration area images) in a specific area, and the ambient light data acquired by the ambient light sensor needs to be obtained after the specific image is displayed stably. Of course, the calibration data (e.g., the coordinates of the projection of the center point of the ambient light sensor on the display screen, the length and width of the target area, the interference coefficient of each sub-area, the entry data corresponding to the brightness fitting curves, etc.) also need to be computed from the ambient light data collected by the ambient light sensor, and this calibration data also needs to be written into a specific data structure for storage, so that it can be conveniently used later when actually calculating the noise that interferes with the ambient light collected by the ambient light sensor.
After understanding the calibration process of the ambient light sensor, a technical architecture diagram upon which the ambient light sensor is calibrated will be described below.
Referring to fig. 8, a technical architecture diagram of a calibration scheme provided by an embodiment of the present application is shown. In the technical architecture diagram, a mobile phone side comprises an AP processor and an SCP processor. The AP processor is an application processor on which an operating system, a user interface, and application programs all run. The SCP processor is a co-processor that can assist the AP processor in performing transactions related to sensors (e.g., ambient light sensors) and the like.
Wherein the application layer of the AP processor (denoted as the first processor) has a calibration package; the calibration package is used to control the calibration test of the ambient light sensor.
The hardware abstraction layer of the AP processor has a hardware abstraction (Hardware Abstraction Layer, HAL) interface; the HAL interface includes a first HAL interface and a second HAL interface. The first HAL interface is used for communication between the calibration package and a production service process provided by the chip platform. The second HAL interface is used for communication between the production service process and the ambient light sensor driver in the SCP processor. The first and second are for distinction only and are not limiting.
A normalized hidl interface is provided between the application layer and the hardware abstraction layer; hidl (HAL interface definition language) is the interface description language for the interface between the hardware abstraction layer and the application layer.
The normalized hidl interface may enable application layer calibration packages to be used across chip platforms, for example, in electronic devices that use AP processors of different vendors.
The hardware abstraction layer also has a production service process provided by the chip platform, for example, the test_diag process of the Qualcomm platform or the attcmd server process of the MediaTek platform. Both can be used for developing and debugging the ambient light sensor, and both provide the interfaces related to the production process. The calibration test in the production process can be accomplished by communicating with the interfaces provided by the production service process.
The hardware abstraction layer also contains a calibration algorithm library, which is a .so (shared object) library used on the mobile phone side.
An ambient light sensor driver is arranged in an SCP processor (marked as a second processor) of the mobile phone, and data acquired by the ambient light sensor is reported to the ambient light sensor driver.
After determining that the display screen of the mobile phone displays the test image, the calibration package in the AP processor calls the hidl interface to establish communication with the first HAL interface in the hardware abstraction layer of the AP processor. The first HAL interface establishes communication with the production service process provided by the chip platform through socket communication. The production service process in the hardware abstraction layer of the AP processor may call the calibration algorithm library. The production service process also establishes communication with the second HAL interface in the hardware abstraction layer of the AP processor, and the second HAL interface establishes communication with the ambient light sensor driver in the SCP processor based on inter-core communication.
How to implement the calibration process of the ambient light sensor based on this technical architecture diagram will be described below.
Referring to fig. 9, a flowchart of a calibration method of an ambient light sensor according to an embodiment of the present application is shown.
S21, the calibration package parses the calibration flow.
In the embodiment of the present application, how to calibrate the ambient light sensor of the mobile phone is set in the calibration flow. The calibration flow can be understood as a control flow over the following: the dark environment calibration process, the calibration process of the projection position of the center point of the ambient light sensor on the display screen shown in fig. 4 and fig. 5, the calibration process of the target area shown in fig. 6, the calibration process of the interference coefficients of the sub-areas shown in fig. 7, the calibration process of the brightness fitting curves shown in Table 1, and the dark environment calibration process. Of course, depending on the actual setting, the control flow may also include other calibration procedures, which are not limited in the embodiment of the present application.
After the calibration package parses the calibration flow, the steps in the calibration flow may be performed in sequence. The following embodiment takes as an example one cycle of testing in the procedure for calibrating the projection position of the center point of the ambient light sensor on the display screen.
S22, the calibration package checks whether the last step in the calibration flow has been completed.

If the last step has not been completed, the calibration package performs the step currently to be performed (here, calibrating the projection position of the center point of the ambient light sensor on the display screen).
S23, the calibration program package displays images based on the current calibration flow.
Referring to the embodiment shown in fig. 4, the calibration package controls the display screen of the mobile phone to display the center point calibration image at the position A1 in the manner specified in the calibration flow.
In the embodiment of the present application, the image displayed in the first cycle of testing may be recorded as the first image, and an image displayed in a cycle other than the first may be recorded as the second image.
S24, the calibration package delays for a period of time based on the calibration flow.
As described above, the data collected by the ambient light sensor of the mobile phone is obtained only after the center point calibration image is displayed stably. Therefore, a delay is required.
S25, after the delay is finished, the calibration program package calls the hidl interface according to the calibration flow so as to transmit a calibration command word to the first HAL interface.
Wherein the calibration command word includes the identifier of the current step.
S26, the first HAL interface transmits a calibration command word to the production service process based on socket communication.
S27, after the production service process receives the calibration command word, it acquires the ambient light data (for example, RGBC data) collected by the ambient light sensor at the position A1 from the ambient light sensor driver in the SCP processor, based on the second HAL interface and the inter-core communication between the second HAL interface and the ambient light sensor driver of the SCP processor.
It should be noted that the data collected by the ambient light sensor is accurate only after the image display is stable.
In the present application, the ambient light data used to calibrate the ambient light sensor in the first cycle of testing may be recorded as the first ambient light data, and the ambient light data used to calibrate the ambient light sensor in a cycle other than the first as the second ambient light data.
S28, after the production service process receives the ambient light data collected by the ambient light sensor, it transmits the calibration command word and the ambient light data to the calibration algorithm library.
S29, the calibration algorithm library judges, according to the received information, whether all calibration steps have been completed.

If not, the calibration algorithm library performs S210, passing the response of the step (e.g., the identifier of the currently performed step and an identifier of success or failure) to the production service process.

S210, the calibration algorithm library transmits the response of the step to the production service process.
S211, after receiving the response, the production service process transmits the response to the first HAL interface of the HAL layer through socket communication.
S212, the first HAL interface returns the response to the hidl interface, and the hidl interface transmits the response to the calibration package of the application layer.
S213, the calibration program package receives the response.
After the calibration package receives the response, it checks the step identifier in the response to determine whether the last step of the calibration flow has been completed; if not, it determines which step the step just ended belongs to and which step in the calibration flow should be executed next.
If the step just ended is not the last step in the calibration flow, the next step in the calibration flow is continued according to S22 to S213 of the embodiment shown in fig. 9.
In a specific implementation, for example, after the dark ambient light calibration is completed, the next cycle, displaying the center point calibration image at the position A1 of the display screen, is continued; after the cycle of displaying the center point calibration image at the position A1 is completed, the next cycle of displaying it at the position A2 is continued; and so on, until the last cycle, acquiring the ambient light data in the final dark environment calibration, is completed.
In the embodiment shown in fig. 9, steps S22 to S213 are executed in a loop until, in some iteration of step S29, the calibration algorithm library determines from the received information that all steps have been completed (the identification of the step carried in the calibration command is that of the last step), after which the following steps are executed:
S214, the calibration algorithm library passes the response of the last step and all the calibration data to the production service process.
After the calibration algorithm library passes the response and the calibration data to the production service process, it may clear the cached calibration data.
All the calibration data are data calculated from the ambient light data, for example, the coordinates of the center point, the length and width of the target area, the interference coefficients of the sub-areas, data corresponding to the brightness fitting curve, and the like.
S215, the production service process writes the calibration data returned by the calibration algorithm library into the NV partition.
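A minimal sketch of what the persisted calibration record might look like; the field names are assumptions drawn from the examples in the text (center point coordinates, target-area size, per-sub-area interference coefficients, brightness fitting curve), and JSON stands in for whatever binary layout the NV partition actually uses:

```python
# Hypothetical calibration record: field names and serialization format
# are illustrative assumptions, not the patent's actual NV layout.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class CalibrationData:
    center_x: int = 0
    center_y: int = 0
    target_width: int = 0
    target_height: int = 0
    interference_coeffs: list = field(default_factory=list)  # one per sub-area
    brightness_curve: list = field(default_factory=list)     # fitted curve samples

def serialize_for_nv(data: CalibrationData) -> bytes:
    """Serialize the record into bytes suitable for writing to storage."""
    return json.dumps(asdict(data)).encode()
```
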
Of course, after step S215, the production service process still needs to transmit the response of the last step to the calibration package.
For example, after step S215, steps S211 to S213 are continuously performed to notify the calibration package that all steps are completed through the HAL interface and the hidl interface.
After the calibration package receives the response indicating that all the steps have been completed, when the calibration package executes step S22, it may be determined that the last step of the calibration procedure has been completed; the calibration package continues with S216.
S216, the calibration program package controls the display screen of the mobile phone to display an interface with successful calibration.
It can also be appreciated from this example that step S27 is an interaction between the AP processor and the SCP processor: however many cycles the calibration test requires, that many interactions between the AP processor and the SCP processor must be performed.
The flow between the AP processor and the SCP processor at each communication may be as described with reference to the embodiment shown in fig. 10.
Referring to fig. 10, it shows the process by which the production service process in the AP processor obtains ambient light data from the ambient light sensor driver in the SCP processor, i.e., step S27 in fig. 9.
S1001, after receiving the calibration command word sent by the first HAL interface, the production service process sends an acquisition request of ambient light data to the second HAL interface.
S1002, after receiving the request for obtaining the ambient light data, the second HAL interface establishes inter-core communication with the ambient light sensor driver in the SCP processor.
S1003, after the second HAL interface successfully establishes the inter-core communication with the ambient light sensor driver, registers and runs the receive callback thread.
The receiving callback thread is used for receiving the ambient light data reported by the ambient light sensor driver.
S1004, after running the receive callback thread, the second HAL interface sends an acquisition request for ambient light data to the ambient light sensor driver.
If the receive callback thread does not receive the ambient light data returned by the ambient light sensor driver within a preset time, the second HAL interface destroys the inter-core communication and destroys the receive callback thread.
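The timeout behavior can be sketched as follows, with a queue simulating the inter-core channel; the timeout value and the tear-down hook are illustrative assumptions:

```python
# Illustrative sketch: a queue stands in for the inter-core channel, and
# on_timeout stands in for destroying the channel and the callback thread.
import queue

TIMEOUT_S = 0.05  # stand-in for the "preset time" in the text

def wait_for_ambient_light(channel: "queue.Queue", on_timeout=None):
    """Wait for one ambient light sample; on timeout, invoke the tear-down
    hook and return None."""
    try:
        return channel.get(timeout=TIMEOUT_S)
    except queue.Empty:
        if on_timeout is not None:
            on_timeout()  # destroy inter-core comm + receive callback thread
        return None
```
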
In the embodiment of the application, in the calibration stage of the ambient light sensor, the ambient light sensor collects ambient light data in a preset time period and reports the ambient light data to the ambient light sensor driver, so that the ambient light sensor driver receives the ambient light data reported by the ambient light sensor in the preset time period.
S1005, after the ambient light sensor driver receives the acquisition request, the ambient light sensor driver waits for receiving ambient light data reported by the ambient light sensor at preset time intervals.
In the embodiment of the present application, because step S24 introduces a delay, after the ambient light sensor driver receives the acquisition request it may, depending on the length of that delay, either send the last ambient light data received before the acquisition request to the second HAL interface, or wait for the first ambient light data reported by the ambient light sensor after the request and send that data to the second HAL interface. Whichever way is used, it must ultimately be ensured that the ambient light data reported to the second HAL interface is the data collected by the ambient light sensor after the display screen stably displays the image in the current cycle of the test flow.
S1006, the ambient light sensor driver sends ambient light data reported by the ambient light sensor to the second HAL interface through inter-core communication.
S1007, after the receive callback thread in the second HAL interface receives the ambient light data, the second HAL interface destroys the inter-core communication with the ambient light sensor driver in the SCP processor.
S1008, the second HAL interface destroys the receive callback thread.
S1009, the second HAL interface transmits the ambient light data received from the ambient light sensor driver to the production service process.
As can be seen from fig. 10, the calibration flow of the ambient light sensor includes a plurality of (e.g., n) cyclic test processes (shown in fig. 9), and each cyclic test process executes step S27 of fig. 9 once.
Each time step S27 is executed, it is necessary to establish inter-core communication between the AP processor and the SCP processor (step S1002), register and run the receive callback thread (step S1003), have the AP processor send an acquisition request for ambient light data to the SCP processor through the inter-core communication (step S1004), and destroy the inter-core communication (step S1007).
The establishment of inter-core communication between the AP processor and the SCP processor (step S1002) consumes about 8 ms; sending the acquisition request for ambient light data and waiting to receive the data takes approximately 27 ms; and destroying the inter-core communication (step S1007) consumes about 2 ms. The calibration process for one ambient light sensor executes step S27 n times, and each execution requires at least 37 ms, so a great deal of test time is consumed and the test efficiency is low.
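A quick arithmetic check of the figures above (about 8 ms to establish, about 27 ms to request and wait, about 2 ms to destroy):

```python
# Per-cycle overhead of the unoptimized flow, using the approximate
# timings quoted in the text.
SETUP_MS = 8          # step S1002: establish inter-core communication
REQUEST_WAIT_MS = 27  # step S1004: send request and wait for data
DESTROY_MS = 2        # step S1007: destroy inter-core communication

def per_cycle_overhead_ms() -> int:
    return SETUP_MS + REQUEST_WAIT_MS + DESTROY_MS

def total_overhead_ms(n: int) -> int:
    """Total overhead across the n cycles of one sensor's calibration."""
    return n * per_cycle_overhead_ms()
```

For example, a calibration flow with ten cycles spends roughly 370 ms in this per-cycle bookkeeping alone.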
In the embodiment of the present application, the establishment of inter-core communication between the AP processor and the SCP processor (step S1002), the registration and running of the receive callback thread (step S1003), and the sending of the acquisition request for ambient light data to the SCP processor through the inter-core communication (step S1004) may be executed only when the calibration process of the ambient light sensor is carried out for the first time, and the destruction of the inter-core communication (step S1007) is no longer executed after each cycle. Thereafter, once the ambient light sensor driver in the SCP processor has received the report request sent by the AP processor, it reports each piece of ambient light data to the AP processor as soon as it is received from the ambient light sensor.
In the non-first cycles of the test, the establishment of inter-core communication (step S1002), the registration and running of the receive callback thread (step S1003), and the sending of the acquisition request through inter-core communication (step S1004) are not performed, nor is the destruction of the inter-core communication (step S1007). Because the SCP processor continuously reports ambient light data to the AP processor, after receiving the acquisition request sent by the production service process, the second HAL in the AP processor may directly send the last ambient light data received before the request to the production service process, without waiting for the ambient light sensor driver to report the latest data, so that a great deal of time can be saved.
Of course, in practical applications, after receiving the acquisition request sent by the production service process, the second HAL in the AP processor may instead wait for a period of time and send the first ambient light data received after the request to the production service process. In this way, the time spent repeatedly creating and destroying inter-core communication is still saved.
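The optimized flow described above can be sketched as follows; the class and method names are illustrative assumptions, with a plain attribute standing in for the inter-core channel and the driver's continuous reports delivered via `on_report`:

```python
# Illustrative sketch of the optimization: establish inter-core
# communication once on the first cycle, cache the driver's continuous
# reports, and answer later acquisition requests from the cache.
class SecondHal:
    def __init__(self):
        self.channel_open = False
        self.latest_sample = None

    def on_report(self, sample):
        # Continuously invoked by the (simulated) SCP-side driver.
        self.latest_sample = sample

    def handle_acquisition(self, first_cycle: bool):
        if first_cycle and not self.channel_open:
            # One-time establish + register/run the receive callback.
            self.channel_open = True
        # Non-first cycles answer directly from the cached report.
        return self.latest_sample

    def destroy(self):
        # Executed once, after the n-th cycle completes.
        self.channel_open = False
        self.latest_sample = None
```
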
After all the cyclic tests are over, for example, when the calibration package determines in step S22 that the identification of the step in the received response is that of the last step in the calibration flow, steps S217 to S210 may be performed.
Step S217, the calibration program package calls the hidl interface to send a destroy instruction to the first HAL interface.
In step S218, the first HAL interface transmits a destruction instruction to the production service process.
In step S219, the production service process transmits a destruction instruction to the second HAL interface.
In step S210, the second HAL interface destroys the inter-core communication with the ambient light sensor driver in the SCP processor, and destroys the receive callback thread.
For a clearer understanding of the internal implementation of step S27 in the first cycle of the test, in the non-first cycles, and after the cyclic tests end, reference may be made to the timing chart shown in fig. 11.
In the 1st cycle of the test, after receiving the calibration command word sent by the first HAL, the production service process executes step S10011.
S10011, the production service process sends an acquisition request of the ambient light data to the second HAL interface.
In a specific implementation, the production service process may determine, based on the received calibration command word, whether the current cycle is the first cycle of the test, and may add a special identifier to the acquisition request as a flag indicating whether it is the first cycle.
S1101, the second HAL interface may determine that the current cycle test procedure is the first cycle test procedure according to the information carried in the received acquisition request.
S1002, the second HAL interface establishes inter-core communication with the ambient light sensor driver in the SCP processor if it is determined to be the first cycle.
S1003, after the second HAL interface successfully establishes the inter-core communication with the ambient light sensor driver, registers and runs the receive callback thread.
S10041, after running the receive callback thread, the second HAL interface sends a report request for ambient light data to the ambient light sensor driver.
S10051, after the ambient light sensor driver receives the report request, it waits to receive the ambient light data reported by the ambient light sensor according to the preset time interval.
S10061, through the inter-core communication, the ambient light sensor driver sends to the second HAL interface the first ambient light data reported by the ambient light sensor after the report request is received (or the last ambient light data received before the report request).
It should be noted that, in practical applications, after receiving the report request, the ambient light sensor driver may report every piece of ambient light data it subsequently receives to the second HAL interface; in this embodiment, see steps S10052, S10062, S10053, S10063, S10054, S10064, and the like. Of course, only some of these steps are shown in this embodiment. In practical applications, the period at which the SCP processor reports ambient light data to the AP processor side is basically consistent with the period at which the ambient light sensor collects it.
S10091, if the reception callback thread in the second HAL interface receives the ambient light data within the preset time (which may be recorded as the first time period), the second HAL interface sends the received ambient light data to the production service process.
It can be seen that the second HAL interface adds a step of determining, based on the information carried in the acquisition request, whether the current cycle is the first cycle, so that inter-core communication with the ambient light sensor driver is established only during the first cycle.
After the second HAL interface receives the ambient light data reported by the ambient light sensor driver, it destroys neither the inter-core communication nor the receive callback thread.
As time goes on, the first cycle ends, the second cycle starts, and when the second cycle is executed to step S27, the following steps are executed.
Of course, the same steps apply when step S27 is reached in the i-th cycle (i from 2 to n), where n is the total number of cyclic test processes in the calibration flow.
Of course, if the receive callback thread in the second HAL interface does not receive the ambient light data within a preset time (which may be recorded as the first time period), the second HAL interface may destroy the receive callback thread.
S10052, the ambient light sensor reports ambient light data to the ambient light sensor driver.
S10062, the ambient light sensor driver sends ambient light data to the second HAL interface through the non-destroyed inter-core communication.
The execution timing of steps S10052 and S10062 is related to the period at which the ambient light sensor collects ambient light data, and does not necessarily fall between step S26 and step S10012.
S10012, after receiving the calibration command word sent by the first HAL, the production service process sends an acquisition request of ambient light data to the second HAL interface.
S1102, the second HAL interface can determine that the current cycle test process is not the first cycle test process according to the information carried in the received acquisition request.
S10053, the ambient light sensor reports ambient light data to the ambient light sensor driver.
S10063, the ambient light sensor driver reports ambient light data to the second HAL interface.
S10092, if the second HAL interface determines that the current cycle is not the first cycle, it may send to the production service process the last ambient light data received before the acquisition request of step S10012 (the data reported in step S10062), or it may send the first ambient light data received after that acquisition request (the data reported in step S10063).
Specifically, which manner is used may be determined based on the delay in step S24.
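The selection policy of step S10092 can be sketched as follows; the boolean summarizing the step S24 delay is an assumption for illustration:

```python
def select_sample(last_before, first_after, display_settled_before_request: bool):
    """Pick the sample that reflects the stably displayed image: the cached
    pre-request sample if the display had already settled when the request
    arrived, otherwise the first post-request sample."""
    if display_settled_before_request:
        return last_before
    return first_after
```
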
As described above, after all the cyclic tests are finished, when the calibration package determines in step S22 that the identification of the step in the received response is that of the last step in the calibration flow, steps S217 to S210 may be performed, where steps S219 and S210 may refer to the following description.
S10013, the production service process sends an instruction to the second HAL interface to destroy inter-core communications.
S10054, the ambient light sensor continuously reports ambient light data to the ambient light sensor driver.
S10064, the ambient light sensor driver reports ambient light data to the second HAL interface.
Similarly, the execution timing of steps S10054 and S10064 is related to the period at which the ambient light sensor collects ambient light data, and does not necessarily fall between step S26 and step S10013.
S1007, after the second HAL interface receives the destroying instruction of the inter-core communication sent by the production service process, the inter-core communication between the second HAL interface and the ambient light sensor driver is destroyed.
S1008, the second HAL interface destroys the receive callback thread.
S10055, the ambient light sensor reports ambient light data to the ambient light sensor driver.
Of course, after the inter-core communication between the second HAL interface and the ambient light sensor driver is destroyed, the ambient light sensor still continues to report ambient light data to the ambient light sensor driver; it is only that, with the inter-core communication between the SCP processor and the AP processor destroyed, the ambient light sensor driver in the SCP processor can no longer report ambient light data to the second HAL interface in the AP processor.
Through the above process, establishing and destroying inter-core communication can be skipped n-1 times; in addition, having the SCP processor side report ambient light data directly also saves time. The time consumed by the calibration method of the ambient light sensor is therefore greatly reduced, and the calibration efficiency is improved.
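A rough estimate of the claimed savings, again using the approximate timings quoted earlier (about 8 ms establish, about 27 ms request-and-wait, about 2 ms destroy); treating a cached read as effectively free is an illustrative assumption:

```python
# Overhead avoided on the n-1 non-first cycles: setup and destroy are
# skipped, and the request-and-wait becomes a cached read.
SETUP_MS, WAIT_MS, DESTROY_MS = 8, 27, 2

def saved_ms(n: int, cached_read_ms: int = 0) -> int:
    return (n - 1) * (SETUP_MS + DESTROY_MS + (WAIT_MS - cached_read_ms))
```
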
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
The embodiments of the present application also provide a computer readable storage medium storing a computer program, which when executed by a processor, implements the steps of the above-described method embodiments.
Embodiments of the present application also provide a computer program product which, when run on a first device, enables the first device to carry out the steps of the method embodiments described above.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program, which may be stored in a computer readable storage medium; when executed by a processor, the computer program may implement the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer readable medium may include at least: any entity or device capable of carrying the computer program code to a first device, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, in accordance with legislation and patent practice, computer readable media may not include electrical carrier signals and telecommunication signals.
The embodiment of the application also provides a chip system, which comprises a processor coupled to a memory; the processor executes a computer program stored in the memory to implement the steps of any of the method embodiments of the present application. The chip system may be a single chip or a chip module composed of a plurality of chips.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or illustrated in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the elements and method steps of the examples described in connection with the embodiments disclosed herein can be implemented as electronic hardware, or as a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (12)

1. A method of calibrating an ambient light sensor, for application to an electronic device including a display screen, an ambient light sensor positioned below the display screen, a first processor, and a second processor, the method comprising:
during the 1 st cycle test, the display screen displays a first image;
the first processor establishing inter-core communication between the first processor and the second processor;
the first processor sends a report request of the ambient light data to the second processor based on the inter-core communication, wherein the report request is used for indicating the second processor to report the received ambient light data to the first processor after receiving the ambient light data reported by the ambient light sensor;
the first processor receives the ambient light data reported by the second processor based on the inter-core communication, wherein the ambient light data comprises first ambient light data collected by the ambient light sensor when the display screen displays the first image;
in the ith cyclic test, the display screen displays a second image, i is less than or equal to n, and n is the preset total number of cyclic tests;
the first processor receives ambient light data reported by the second processor based on the inter-core communication, wherein the ambient light data comprises second ambient light data collected by the ambient light sensor when the display screen displays the second image, and the first ambient light data and the second ambient light data are used for calibrating the ambient light sensor;
After the nth cycle test is completed, the first processor destroys inter-core communications between the first processor and the second processor.
2. The method of claim 1, wherein the first processor comprises: a production service process and a HAL interface, the first processor establishing inter-core communication between the first processor and the second processor comprising:
the production service process sends a first acquisition request of ambient light data to the HAL interface;
in response to the first acquisition request, the HAL interface establishes inter-core communication between the HAL interface and the second processor;
the first processor sending a report request for ambient light data to the second processor based on the inter-core communication includes:
the HAL interface sends a report request of ambient light data to the second processor based on the inter-core communication.
3. The method of claim 2, wherein the first ambient light data is first ambient light data reported by the second processor based on the inter-core communication received by the HAL interface after the HAL interface sends a request to report ambient light data to the second processor based on the inter-core communication.
4. The method of claim 3, wherein after the HAL interface receives the first ambient light data reported by the second processor based on the inter-core communication, the method further comprises:
the HAL interface sends the first ambient light data to the production service process.
5. The method of claim 2, wherein the method further comprises:
after the HAL interface sends a report request for ambient light data to the second processor based on the inter-core communication, the method further comprises:
if the HAL interface does not receive the ambient light data reported by the second processor based on the inter-core communication within the first time period, the HAL interface destroys the inter-core communication between the HAL interface and the second processor.
6. The method of any one of claims 1 to 5, wherein the first processor comprises: a production service process and a HAL interface, the display screen displaying the second image, the method further comprising:
the production service process sends a second acquisition request of the ambient light data to the HAL interface;
and in response to the second acquisition request, the HAL interface sends second ambient light data reported by the second processor based on the inter-core communication to the production service process.
7. The method of claim 6, wherein the second ambient light data is the last ambient light data received before the HAL interface received the second acquisition request or is the first ambient light data received after the HAL interface received the second acquisition request.
8. The method of any one of claims 1 to 7, wherein the first processor comprises: a production service process and a HAL interface, the first processor destroying inter-core communications between the first processor and the second processor comprising:
the production service process sends a destruction instruction of the inter-core communication to the HAL interface;
in response to the destroy instruction, the HAL interface destroys inter-core communications between the HAL interface and the second processor.
9. The method of claim 2, wherein the method further comprises:
and responding to the first acquisition request, registering a callback receiving thread by the HAL interface, and running the callback receiving thread, wherein the callback receiving thread is used for receiving the ambient light data reported by the second processor based on the inter-core communication.
10. The method of claim 9, wherein the method further comprises:
And after the nth cycle test is finished, the HAL interface destroys the callback receiving thread.
11. An electronic device comprising a first processor and a second processor for running a computer program stored in a memory, such that the electronic device implements the method of any one of claims 1 to 10.
12. A system on a chip comprising a first processor and a second processor, the first processor being coupled to a memory, the first processor executing a computer program stored in the memory to implement the method of any one of claims 1 to 10.
CN202211422682.6A 2022-11-14 2022-11-14 Calibration method of ambient light sensor, electronic equipment and chip system Active CN116659659B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211422682.6A CN116659659B (en) 2022-11-14 2022-11-14 Calibration method of ambient light sensor, electronic equipment and chip system


Publications (2)

Publication Number Publication Date
CN116659659A true CN116659659A (en) 2023-08-29
CN116659659B CN116659659B (en) 2024-03-29

Family

ID=87717700

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211422682.6A Active CN116659659B (en) 2022-11-14 2022-11-14 Calibration method of ambient light sensor, electronic equipment and chip system

Country Status (1)

Country Link
CN (1) CN116659659B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120019494A1 (en) * 2010-07-26 2012-01-26 Apple Inc. Alignment factor for ambient lighting calibration
CN113806103A (en) * 2021-07-08 2021-12-17 荣耀终端有限公司 Data processing method, electronic equipment, chip system and storage medium
WO2022156555A1 (en) * 2021-01-20 2022-07-28 华为技术有限公司 Screen brightness adjustment method, apparatus, and terminal device




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant