CN113778641B - Method for controlling camera, electronic device and computer readable storage medium - Google Patents

Method for controlling camera, electronic device and computer readable storage medium

Info

Publication number: CN113778641B
Application number: CN202110911919.6A
Authority: CN (China)
Prior art keywords: camera, service, priority, information, module
Legal status: Active
Other languages: Chinese (zh)
Other versions: CN113778641A
Inventors: 武文斌 (Wu Wenbin), 吴洋 (Wu Yang)
Current/Original Assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd
Priority to CN202110911919.6A
Publication of CN113778641A
Priority to PCT/CN2022/089557 (WO2023015956A1)
Application granted
Publication of CN113778641B

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 — Arrangements for program control, e.g. control units
    • G06F 9/06 — Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 — Multiprogramming arrangements
    • G06F 9/48 — Program initiating; Program switching, e.g. by interrupt
    • G06F 9/4806 — Task transfer initiation or dispatching
    • G06F 9/4843 — Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
    • G06F 9/4881 — Scheduling strategies for dispatcher, e.g. round robin, multi-level priority queues
    • G06F 9/54 — Interprogram communication
    • G06F 9/542 — Event management; Broadcasting; Multicasting; Notifications

Landscapes

  • Engineering & Computer Science
  • Software Systems
  • Theoretical Computer Science
  • Physics & Mathematics
  • General Engineering & Computer Science
  • General Physics & Mathematics
  • Multimedia
  • Studio Devices

Abstract

The application provides a method for controlling a camera, an electronic device, and a computer-readable storage medium, and relates to the field of terminal technologies. The method includes: opening a target camera according to a first instruction of a first process; the second process requesting to open the target camera; determining a system prioritization of the first process and the second process, in which the system priority of the second process is higher than that of the first process; adjusting the system prioritization to obtain a first prioritization of the first process and the second process; and notifying the second process, based on the first prioritization, that opening the target camera failed. This resolves the camera-usage conflict between the two processes.

Description

Method for controlling camera, electronic device and computer readable storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a method for controlling a camera, an electronic device, and a computer-readable storage medium.
Background
Electronic devices often run multiple processes, and different processes may implement different services. Illustratively, the processes include application processes, an Always On (AO) service process, and the like. In the AO service (also called the intelligent perception service) implemented by the AO service process, when the electronic device detects an occasion that requires human eye detection, it opens the front-facing camera to perform the detection; when the detection result indicates that the user is gazing at the screen, the device executes the corresponding operation for the scenario associated with the AO service. Illustratively, where the AO service includes a gaze-keeps-screen-on scenario, if no user operation on the electronic device is detected for a period of time, the front-facing camera is automatically opened to perform face detection and run a human-eye gaze algorithm to determine whether the user is gazing at the screen; when it is determined that the user is gazing at the screen, the screen is kept continuously lit.
However, in an electronic device, one process may have already opened a camera when another process also needs it; for example, the camera application process has opened the front-facing camera, and the AO service process then requests the same camera. This causes a camera-usage conflict between the two processes.
Disclosure of Invention
The application provides a method for controlling a camera, an electronic device, and a computer-readable storage medium, which solve the camera-usage conflict between two processes in the prior art.
To achieve the above purpose, the present application adopts the following technical solutions:
in a first aspect, a method for controlling a camera is provided, where the method includes:
opening a target camera according to a first instruction of a first process;
the second process requesting to open the target camera;
determining a system prioritization of the first process and the second process, where the system priority of the second process is higher than the system priority of the first process;
adjusting the system prioritization to obtain a first prioritization of the first process and the second process;
and notifying the second process, based on the first prioritization, that opening the target camera failed.
In this way, the camera-usage conflict between the two processes is resolved. In addition, because the system prioritization of the first and second processes is obtained first and then adjusted, the resulting first prioritization can meet the user's actual needs without modifying the original system framework, which improves user experience.
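To make the flow concrete, here is a minimal sketch of how such an arbitration step could look inside a camera service. This is not the patent's implementation: the class, the tags, and the priority values are all invented for illustration, and the custom table it calls (PriorityTable) is sketched after the next passage.

```java
import java.util.Arrays;
import java.util.List;

public class CameraService {
    public static class Client {
        final String tag;          // identification information of the process (hypothetical)
        final int systemPriority;  // priority assigned by the system framework (hypothetical)
        public Client(String tag, int systemPriority) {
            this.tag = tag;
            this.systemPriority = systemPriority;
        }
    }

    private Client holder;  // process currently holding the target camera

    public void onOpenRequest(Client requester) {
        if (holder == null) {
            openFor(requester);
            return;
        }
        // System prioritization: higher systemPriority ranks first.
        List<Client> order = systemOrder(holder, requester);
        // Adjusted ("first") prioritization: see the PriorityTable sketch below.
        order = PriorityTable.adjust(order);
        if (order.get(0) == holder) {
            notifyOpenFailed(requester);  // e.g. the AO service process is refused
        } else {
            closeFor(holder);             // the requester preempts the holder
            openFor(requester);
        }
    }

    private List<Client> systemOrder(Client a, Client b) {
        return a.systemPriority >= b.systemPriority ? Arrays.asList(a, b) : Arrays.asList(b, a);
    }

    private void openFor(Client c)          { holder = c;    /* open the HAL device ... */ }
    private void closeFor(Client c)         { holder = null; /* close the HAL device ... */ }
    private void notifyOpenFailed(Client c) { /* deliver an open-failure callback ... */ }
}
```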
As an example of the present application, the adjusting the system prioritization includes:
arranging the priority of the first process before the priority of the second process according to first identification information of the first process, second identification information of the second process, and process priority information, where the process priority information includes a self-defined prioritization among different processes.
In this way, the system prioritization of the first and second processes is adjusted according to the self-defined prioritization, so that the final prioritization meets the user's actual needs, improving user experience.
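A minimal sketch of the self-defined adjustment referenced above, completing the arbitration sketch from the previous passage; the tag names and their ordering are assumptions, not values from the patent.

```java
import java.util.Comparator;
import java.util.List;

public class PriorityTable {
    // Self-defined prioritization keyed by identification tags;
    // lower index = higher priority. Tags and ordering are invented.
    private static final List<String> CUSTOM_ORDER =
            List.of("camera_app", "ao_service");

    /** Re-sorts the system ordering according to the custom table. */
    public static List<CameraService.Client> adjust(List<CameraService.Client> systemOrder) {
        systemOrder.sort(Comparator.comparingInt(c -> rankOf(c.tag)));
        return systemOrder;
    }

    private static int rankOf(String tag) {
        int rank = CUSTOM_ORDER.indexOf(tag);
        return rank >= 0 ? rank : Integer.MAX_VALUE;  // unknown tags keep the lowest rank
    }
}
```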
As an example of the present application, the second process requesting to open the target camera includes:
the second process acquiring camera information of a camera that can support the second process in implementing its corresponding service;
the second process issuing a second instruction requesting to open the camera corresponding to the camera information, where the second instruction carries the camera information;
and, if the camera information carried in the second instruction corresponds to the target camera, determining that the second process requests to open the target camera.
In this way, the second process issues a second instruction carrying the determined camera information, so that the electronic device can determine from the second instruction that the second process requests to open the target camera.
As an example of the application, the second instruction further carries the second identification information of the second process; illustratively, the second identification information is a tag of the second process.
As an example of the present application, the second process acquiring the camera information of a camera that can support the second process in implementing the corresponding service includes:
the second process querying a camera list, where the camera list includes multiple pieces of camera information, and each piece of camera information corresponds to one type of camera capability;
and determining, from the multiple pieces of camera information and according to the camera capability corresponding to each piece, the camera information whose camera capability can support the second process in implementing the corresponding service.
That is, the second process determines, by querying the camera capabilities, the camera information that can support the corresponding service, and then generates the second instruction from the determined camera information.
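The patent does not name concrete APIs. As an illustration only, in stock Android the "camera list" query could look like the following sketch, which uses a front-facing lens as a stand-in for "a camera capability that supports the service":

```java
import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;

public class CameraPicker {
    /** Walks the camera list and returns the first ID whose capability fits the service. */
    public static String pickFrontCameraId(Context context) throws CameraAccessException {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        for (String id : manager.getCameraIdList()) {              // the "camera list"
            CameraCharacteristics chars = manager.getCameraCharacteristics(id);
            Integer facing = chars.get(CameraCharacteristics.LENS_FACING);
            if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
                return id;  // this ID is the "camera information" carried in the open request
            }
        }
        return null;  // no suitable camera found
    }
}
```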
As an example of the present application, the method further includes:
the second process starting a monitoring module, where the monitoring module is used to monitor the state of the camera;
after the second process is notified, based on the first prioritization, that opening the target camera failed, the method further includes:
determining, when the second process monitors through the monitoring module a notification that the target camera has been closed, whether to re-request opening the target camera according to service requirements.
In this way, by starting the monitoring module to monitor the camera state, the state of the camera can be perceived in real time, so the second process can judge according to actual service requirements whether it needs to, or can, open the camera, which improves the effectiveness of camera usage.
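The patent leaves the monitoring module abstract. For illustration, stock Android offers an equivalent mechanism, CameraManager.AvailabilityCallback, which notifies a process when any camera is opened or closed; a minimal registration sketch:

```java
import android.content.Context;
import android.hardware.camera2.CameraManager;
import android.os.Handler;
import android.os.Looper;

public class CameraStateMonitor {
    /** Registers a callback so the calling process perceives camera open/close in real time. */
    public static void start(Context context, CameraManager.AvailabilityCallback callback) {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        // onCameraAvailable fires when a camera is closed by its holder,
        // onCameraUnavailable when some process opens it.
        manager.registerAvailabilityCallback(callback, new Handler(Looper.getMainLooper()));
    }
}
```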
As an example of the present application, after the first process has closed the target camera and the second process has opened the target camera, the method further includes:
a third process requesting to open the target camera;
determining a system prioritization of the second process and the third process, where the system priority of the second process is higher than the system priority of the third process;
adjusting the system prioritization of the second process and the third process to obtain a second prioritization of the second process and the third process;
closing the target camera opened by the second process based on the second prioritization;
and opening the target camera according to the request of the third process.
Illustratively, the second process is the AO service process and the third process is an application process. When the AO service process has opened the target camera and the application process also requests to open it, the system prioritization of the two processes is adjusted to obtain a second prioritization in which the priority of the AO service process is lower than that of the application process; accordingly, the camera opened by the AO service process is closed and the camera is opened for the application process. This avoids the situation where the camera cannot be opened when the user needs to use it.
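Continuing the hypothetical arbitration sketch from the first aspect, the preemption described above corresponds to the following sequence (all tags and priority values invented for illustration):

```java
// Reuses the CameraService and PriorityTable sketches from the first aspect.
public class PreemptionDemo {
    public static void main(String[] args) {
        CameraService service = new CameraService();
        CameraService.Client ao  = new CameraService.Client("ao_service", 10); // higher system priority
        CameraService.Client app = new CameraService.Client("camera_app", 5);

        service.onOpenRequest(ao);   // the camera is idle, so the AO service process opens it
        service.onOpenRequest(app);  // the adjusted (second) prioritization ranks camera_app
                                     // above ao_service: the AO holder is closed and the
                                     // camera is opened for the application process
    }
}
```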
As an example of the present application, after the target camera is opened according to the request of the third process, the method further includes:
sending a notification of the camera state change to the second process;
and the second process monitoring the notification of the camera state change through the monitoring module.
In one embodiment, after the camera service module opens the camera requested by the application process, it notifies the AO service process of the change in the camera state. The AO service process can then monitor the camera state notified by the camera service module through its monitoring function, and subsequently decide according to the camera state whether to execute the AO service. For example, in the gaze-keeps-screen-on scenario, after receiving the notification of the camera state change, the AO service process determines that the application process is operating the target camera and therefore does not perform human eye detection, even if no user operation on the electronic device has been detected for a while.
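As a hedged illustration of how the AO service process could consume this notification in the gaze-keeps-screen-on scenario (all names hypothetical), building on the AvailabilityCallback sketch above:

```java
import android.hardware.camera2.CameraManager;

public class AoGazeController extends CameraManager.AvailabilityCallback {
    private final String targetCameraId;
    private volatile boolean targetCameraBusy;

    public AoGazeController(String targetCameraId) { this.targetCameraId = targetCameraId; }

    @Override public void onCameraUnavailable(String cameraId) {
        if (cameraId.equals(targetCameraId)) targetCameraBusy = true;   // e.g. the camera app opened it
    }

    @Override public void onCameraAvailable(String cameraId) {
        if (cameraId.equals(targetCameraId)) targetCameraBusy = false;  // holder closed the camera
    }

    /** Called when no user operation has been detected for a period of time. */
    public void onIdleTimerFired() {
        if (!targetCameraBusy) {
            startEyeDetection();  // open the front camera and run the gaze algorithm
        }                         // otherwise skip this round entirely
    }

    private void startEyeDetection() { /* open camera, run gaze detection ... */ }
}
```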
As an example of the present application, the first process is an application process, the second process is the AO service process, and the third process is an application process, where the AO service process is configured to execute a specified operation when, at a human eye detection occasion, it detects that the user is gazing at the screen.
As an example of the present application, the target camera corresponds to multiple pieces of camera information, and each piece corresponds to one camera capability.
That is, multiple pieces of camera information are set for one physical camera, so that the physical camera can output images in multiple formats to support different services.
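For illustration only: stock Android similarly exposes per-ID characteristics, and several IDs can map to one physical sensor; listing each ID's capability set is one plausible analogue of "multiple pieces of camera information for one target camera":

```java
import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import java.util.Arrays;

public class CameraInfoDump {
    /** Prints every camera ID together with the capability set it advertises. */
    public static void dump(Context context) throws CameraAccessException {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        for (String id : manager.getCameraIdList()) {
            CameraCharacteristics chars = manager.getCameraCharacteristics(id);
            int[] caps = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
            // Each ID (one piece of "camera information") carries its own capabilities.
            System.out.println("camera " + id + " -> " + Arrays.toString(caps));
        }
    }
}
```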
In a second aspect, an apparatus for controlling a camera is provided, where the apparatus includes a camera service module and a first process module, and the first process module is configured to run a second process:
the camera service module is used for opening a target camera according to a first instruction of a first process;
the first process module is used for requesting to open the target camera;
the camera service module is configured to determine a system prioritization of the first process and the second process, where a system priority of the second process is higher than a system priority of the first process in the system prioritization;
the camera service module is used for adjusting the system prioritization to obtain a first prioritization of the first process and the second process;
the camera service module is configured to notify the second process, based on the first prioritization, that opening the target camera failed.
As an example of the present application, the camera service module is configured to:
arranging the priority of the first process before the priority of the second process according to the first identification information of the first process, the second identification information of the second process, and the process priority information, where the process priority information includes a self-defined prioritization among different processes.
As an example of the present application:
the first process module is used for acquiring the camera information of a camera that can support the second process in implementing the corresponding service;
the first process module is used for issuing, to the camera service module, a second instruction requesting to open the camera corresponding to the camera information, where the second instruction carries the camera information;
and, if the camera information carried in the second instruction corresponds to the target camera, the camera service module is used for determining that the second process requests to open the target camera.
As an example of the present application, the first process module is configured to:
querying a camera list, where the camera list includes multiple pieces of camera information, and each piece corresponds to one camera capability;
and determining, from the multiple pieces of camera information and according to the camera capability corresponding to each piece, the camera information whose camera capability can support the second process in implementing the corresponding service.
As an example of the present application:
the first process module is used for starting a monitoring module, where the monitoring module is used for monitoring the state of the camera;
after the camera service module notifies the second process, based on the first prioritization, that opening the target camera failed, the first process module is used for determining, according to service requirements, whether to re-request opening the target camera when the monitoring module monitors a notification that the target camera has been closed.
As an example of the application, the apparatus further includes a second process module configured to run a third process; after the first process has closed the target camera and the second process has opened the target camera:
the second process module is used for requesting to open the target camera;
the camera service module is used for determining a system prioritization of the second process and the third process, where the system priority of the second process is higher than that of the third process;
the camera service module is used for adjusting the system prioritization of the second process and the third process to obtain a second prioritization of the second process and the third process;
the camera service module is used for closing the target camera opened by the second process based on the second prioritization;
the camera service module is used for opening the target camera according to the request of the third process.
As an example of the application, after the camera service module opens the target camera according to the request of the third process, the camera service module sends a notification of the camera state change to the second process;
the first process module is used for monitoring the notification of the camera state change through the monitoring module.
As an example of the present application, the first process is an application process, the second process is the AO service process, and the third process is an application process, where the AO service process is configured to execute a specified operation when, at a human eye detection occasion, it detects that the user is gazing at the screen.
As an example of the present application, the target camera corresponds to multiple pieces of camera information, and each piece corresponds to one camera capability.
In a third aspect, an electronic device is provided that includes a memory and a processor;
the memory is used for storing a program supporting the electronic device to execute the method of the first aspect, and storing data used for implementing the method of the first aspect; the processor is configured to execute programs stored in the memory.
In a fourth aspect, there is provided a computer readable storage medium having stored therein instructions which, when run on a computer, cause the computer to perform the method of any of the first aspects above.
In a fifth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of the first aspect described above.
The technical effects of the second, third, fourth, and fifth aspects are similar to those of the corresponding technical means in the first aspect and are not described again here.
The technical solutions provided by the application can bring at least the following beneficial effects:
The target camera is opened according to the first instruction of the first process. The second process requests to open the target camera, and a system prioritization of the first and second processes is determined, in which the system priority of the second process is higher than that of the first process. The system prioritization is adjusted to obtain a first prioritization, in which the priority of the first process is higher than that of the second process. The second process is therefore notified that opening the target camera failed, resolving the camera-usage conflict between the two processes.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a block diagram of a software structure of an electronic device according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of an application scenario provided in an embodiment of the present application;
fig. 4 is a schematic diagram of another application scenario provided in the embodiment of the present application;
fig. 5 is a schematic flowchart of a method for controlling a camera according to an embodiment of the present disclosure;
fig. 6 is a schematic flowchart of another method for controlling a camera according to an embodiment of the present disclosure;
fig. 7 is a schematic flowchart of another method for controlling a camera according to an embodiment of the present disclosure;
fig. 8 is a schematic flowchart of another method for controlling a camera according to an embodiment of the present disclosure;
fig. 9 is a schematic flowchart of another method for controlling a camera according to an embodiment of the present disclosure;
fig. 10 is a schematic flowchart of another method for controlling a camera according to an embodiment of the present disclosure;
fig. 11 is a schematic flowchart of another method for controlling a camera according to an embodiment of the present disclosure;
fig. 12 is a schematic flowchart of another method for controlling a camera according to an embodiment of the present disclosure;
fig. 13 is a schematic structural diagram of an apparatus for controlling a camera according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the embodiments of the present application are described in detail below with reference to the accompanying drawings.
It should be understood that "a plurality" in this application means two or more. In the description of the present application, "/" indicates an "or" relationship; for example, A/B may indicate A or B. "And/or" merely describes an association between objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, for clarity in describing the technical solutions of the present application, the words "first", "second", and the like are used to distinguish between identical or similar items having substantially the same functions and effects. Those skilled in the art will appreciate that these words do not denote any order, quantity, or importance.
Before describing the method provided by the embodiments of the present application in detail, the execution subject involved is briefly introduced. The method provided by the embodiments of the present application can be executed by an electronic device configured with one or more front-facing cameras. In one embodiment, the electronic device is further configured with one or more rear-facing cameras, which is not limited in this application. In addition, one or more applications may be installed in the electronic device, and each of these applications can open a camera configured in the electronic device to capture images through it. Illustratively, the one or more applications may include a camera application, payment application software, and the like. For example, with the camera application, the user can open the front-facing camera to take a selfie. In addition, the electronic device supports the AO service.
As an example, the electronic device may include, but is not limited to, a mobile phone, a tablet computer, an Augmented Reality (AR)/Virtual Reality (VR) device, an ultra-mobile personal computer (UMPC), a notebook computer, a netbook, a Personal Digital Assistant (PDA), and the like, which are not limited in this embodiment.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, charger, flash, camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, audio module 170 and wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It may also be used to connect earphones and play audio through them, and to connect other electronic devices such as AR devices.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative and is not limited to the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device 100 through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In other embodiments, the power management module 141 may be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160, so that electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
The electronic device 100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, and the application processor, etc.
The ISP is used to process the data fed back by the camera 193. For example, when a user takes a picture, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, an optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and converting into an image visible to the naked eye. The ISP can also carry out algorithm optimization on noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into a sound signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a voice signal into the microphone 170C by uttering a voice signal by the mouth of the user near the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The earphone interface 170D is used to connect a wired earphone. The headset interface 170D may be the USB interface 130, or may be an Open Mobile Terminal Platform (OMTP) standard interface of 3.5mm, or a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but have different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D, and then set features such as automatic unlocking upon flip-open according to the detected open or closed state of the holster or the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The method can also be used for recognizing the posture of the electronic equipment 100, and is applied to horizontal and vertical screen switching, pedometers and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, taking a picture of a scene, the electronic device 100 may utilize the distance sensor 180F to range to achieve fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light to the outside through the light-emitting diode and uses the photodiode to detect infrared light reflected from a nearby object. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear during a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint characteristics to unlock a fingerprint, access an application lock, photograph a fingerprint, answer an incoming call with a fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, electronic device 100 implements a temperature processing strategy using the temperature detected by temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold to avoid the low temperature causing the electronic device 100 to shut down abnormally. In other embodiments, when the temperature is lower than a further threshold, the electronic device 100 performs boosting on the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting thereon or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided via the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so that the heart rate detection function is realized.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration prompts as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects, and touch operations applied to different areas of the display screen 194 may likewise correspond to different vibration feedback effects. Different application scenarios (such as time reminders, receiving messages, alarm clocks, games, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to or detached from the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a standard SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time, and the cards may be of the same or different types. The SIM card interface 195 is also compatible with different types of SIM cards and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example to exemplarily describe the software structure of the electronic device 100.
Fig. 2 is a block diagram of a software structure of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into five layers, which are, from top to bottom, an application layer, an application framework layer, a system layer, an extension layer, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications such as AO awareness, camera, gallery, calendar, call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
AO awareness is used to perceive, according to the service, the timing for human eye detection, and to notify the AO module in the extension layer.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.

The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and answered, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions for the electronic device 100, such as management of call status (including connected, hung up, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction, such as notifications of download completion or message alerts. The notification manager may also present notifications that appear in the top status bar of the system in the form of a chart or scroll-bar text, such as notifications of background-running applications, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone sounds, the electronic device vibrates, or an indicator light flashes.

The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and managing the Android system.

The core library comprises two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android.

The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system layer comprises a camera service module, a system library and an Android runtime (Android runtime).
In this embodiment of the application, the camera service module is configured to detect whether a camera usage conflict occurs between the gaze-keeps-screen-on service and another service (e.g., a service implemented by an application program process) while the AO service is taking effect, and to resolve the conflict when one is determined to occur; for the specific implementation, refer to the following embodiments. The application program process is used to run an application of the application layer.

In addition, the camera service module can perceive the usage status of all cameras in the electronic device, such as which cameras are open and which processes opened them.
The system library may include a plurality of functional modules, for example: a surface manager, media libraries, a three-dimensional graphics processing library (e.g., OpenGL ES), a 2D graphics engine (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The extension layer includes content customized by a developer according to its own requirements. In some embodiments, the extension layer is also referred to as a custom layer, a vendor layer, or a chip extension layer.

As an example of the present application, the extension layer includes an AO module for running the AO service process. Specifically, the AO module is responsible for turning the human eye detection service on or off. When the human eye detection service is on, the AO module requests the camera service module to open a camera and receive frames, and then performs human eye detection to implement the AO service.

Further, the extension layer includes a plurality of modules belonging to the hardware abstraction layer (HAL). The HAL layer serves as an intermediary that separates hardware and software from each other; its primary purpose is to allow software to run on hardware it was not originally intended to run on. In one embodiment, the plurality of modules in the HAL layer includes, but is not limited to, at least one of: Bluetooth, camera, and sensors.

The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
The following describes exemplary workflow of the software and hardware of the electronic device 100 in connection with capturing a photo scene.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including touch coordinates, a timestamp of the touch operation, and the like) and stores the raw input event at the kernel layer. The application framework layer acquires the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation being a touch click operation whose corresponding control is the camera application icon as an example: the camera application calls an interface of the application framework layer to start the camera application, which in turn starts the camera driver by calling the kernel layer, and a still image or video is captured through the camera 193.
After the execution subject related to the embodiment of the present application has been described, the following briefly describes an application scenario related to the embodiment of the present application, taking a mobile phone as the electronic device.

In one scenario, if the user does not operate the mobile phone and the mobile phone is not performing another service (such as playing a video), the mobile phone may automatically turn off the screen after a certain period of time, which may affect the user experience. In another scenario, if the mobile phone has the incoming call ringing mode enabled and the ring volume is high, an incoming call may disturb other users nearby.

To address the problems in these two scenarios, the AO service is provided. While the AO service is taking effect, it can detect whether the user is looking at the screen of the mobile phone and execute a specified operation according to the detection result. For example, in the first scenario described above, if it is determined that the user is looking at the screen, the screen is kept lit. For another example, in the second scenario described above, if it is determined that the user is looking at the screen, the incoming call volume is automatically lowered.
In one embodiment, the AO service in the mobile phone may be turned on manually by the user. Referring to FIG. 3 by way of example, when the user wants to turn on the AO service in the mobile phone, the user may select the settings option in the mobile phone, as shown in (a) of FIG. 3. In response to the user's trigger operation on the settings option, the mobile phone opens a settings page, for example as shown in (b) of FIG. 3. An "accessibility" option is provided in the settings page; the user clicks the "accessibility" option, and in response to this trigger operation, the mobile phone opens the accessibility page, as shown in (c) of FIG. 3. The accessibility page includes a "smart perception" option, shown as 31 in (c) of FIG. 3. The user may select the "smart perception" option, and in response to this trigger operation, the mobile phone opens the smart perception page, as shown in (d) of FIG. 3. The smart perception page includes a "gaze keeps screen on" item, which is provided with a switch for turning the gaze-keeps-screen-on service on or off, and a "gaze lowers incoming call volume" item, which is provided with a switch for turning the gaze-lowers-incoming-call-volume service on or off.

For example, taking the "gaze keeps screen on" item as an example, when the user wants to turn on the gaze-keeps-screen-on service, the user may manually slide the switch corresponding to the item to the right, as shown in (d) of FIG. 3. In response to this operation, the mobile phone turns on the gaze-keeps-screen-on service.

Likewise, taking the "gaze lowers incoming call volume" item as an example, when the user wants to turn on the gaze-lowers-incoming-call-volume service, the user may manually slide the switch corresponding to the item to the right. In response to this operation, the mobile phone turns on the gaze-lowers-incoming-call-volume service.

The AO service is on when at least one of the gaze-keeps-screen-on service and the gaze-lowers-incoming-call-volume service is on, and the AO service uses the front-facing camera to collect images while taking effect. As an example of the present application, even though the AO service is in the on state, if another service in the mobile phone requests the front-facing camera first, that service can still open the front-facing camera normally. For example, referring to FIG. 4, when the user wants to take a picture with the front-facing camera, the user may trigger the camera icon in the mobile phone, as shown in (a) of FIG. 4. In response to the trigger operation on the camera icon, the mobile phone opens the camera, as shown in (b) of FIG. 4. In one example, the mobile phone opens the front-facing camera by default after detecting the trigger operation on the camera icon. In another example, the mobile phone opens the rear camera by default; referring to (b) in FIG. 4, the currently opened interface includes a front camera option 41, and when the user wants to take a picture through the front-facing camera, the user may trigger the front camera option 41, upon which the mobile phone opens the front-facing camera. The user can then take a picture through the opened front-facing camera.
As an example of the present application, if another service in the mobile phone is already using the front-facing camera, for example the mobile phone has opened the front-facing camera to take pictures, then if the AO service requests to open the front-facing camera at this time, the request will fail. That is, the AO service taking effect does not prevent other services from opening the front-facing camera.

In one embodiment, when the user no longer needs the AO service, the AO service may be turned off in the settings page shown in (d) of FIG. 3. Illustratively, when the user wants to turn off the gaze-keeps-screen-on service, the user manually slides the switch corresponding to the "gaze keeps screen on" item to the left; in response to this operation, the mobile phone turns off the gaze-keeps-screen-on service. Similarly, when the user wants to turn off the gaze-lowers-incoming-call-volume service, the user manually slides the switch corresponding to the "gaze lowers incoming call volume" item to the left; in response to this operation, the mobile phone turns off the gaze-lowers-incoming-call-volume service. When both services are off, the AO service is determined to be off.

It should be noted that the above description only takes the example of the user manually turning on the AO service. In another embodiment, the mobile phone may also automatically turn on the AO service upon detecting that a preset condition is currently satisfied. The preset condition may be set according to actual requirements. For example, after detecting that the user unlocks the screen at power-on, the mobile phone automatically turns on the gaze-keeps-screen-on service and the gaze-lowers-incoming-call-volume service, thereby automatically turning on the AO service. Further, when the AO service is turned on automatically, the mobile phone sets the switches corresponding to the "gaze keeps screen on" item and the "gaze lowers incoming call volume" item to the on state. Thus, when the user does not want the AO service on, the user may enter the smart perception page through the setting path described above (i.e., Settings - Accessibility - Smart Perception) and manually turn off the gaze-keeps-screen-on service and the gaze-lowers-incoming-call-volume service there, thereby manually turning off the AO service.
After the execution subject and the application scenario related to the embodiment of the present application are introduced, a method for controlling a camera provided in the embodiment of the present application is described in detail below with reference to the accompanying drawings.
Referring to FIG. 5, FIG. 5 is a schematic diagram of a module framework according to an exemplary embodiment, which mainly includes an AO awareness module 51, an AO module 52, a camera service module 53, and an application module 54. The AO module 52 is used to run the AO service process, and the application module 54 is used to run other application program processes. Refer to FIG. 2 for the positions of these modules in the software architecture.

The camera usage conflicts involved in the embodiments of the present application cover two scenarios. In one scenario, the application module 54 has already opened the front-facing camera and the AO module 52 then requests to open it; in the other, the AO module 52 has already opened the front-facing camera and the application module 54 then requests to open it. Next, the method for controlling a camera provided in the embodiments of the present application is described separately for these two scenarios.
First, the method for controlling a camera in the first scenario is introduced. Here, the interaction flow of the modules includes:

(1) The AO awareness module 51 perceives the human eye detection timing.

In one example, when the AO service is in the on state, the electronic device perceives the human eye detection timing through the AO awareness module 51. When the AO awareness module 51 perceives that it is time for human eye detection, it notifies the AO module 52 to execute the AO service, and the flow proceeds to the following step (2). Otherwise, if the human eye detection timing has not arrived, the AO awareness module 51 continues perceiving.

(2) The AO module 52 obtains the camera information of the camera to be opened, illustratively camera information 6.
As an example of the present application, after receiving the notification from the AO awareness module 51, the AO module 52 queries a camera list, where the camera list includes a plurality of pieces of camera information and each piece of camera information corresponds to one camera capability. The AO module 52 determines, according to the camera capability corresponding to each piece of camera information, the camera information of the camera capable of supporting the AO service, which is illustratively 6.

It should be noted that the camera information in the embodiments of the present application corresponds to a camera in the logical sense. Specifically, one physical camera may correspond to a plurality of camera functions, and different camera functions may output images in different formats. For example, a certain front-facing camera may capture both a 2D RGB (red-green-blue) image used to implement the AO service and an image in a common format. Therefore, for one physical camera, a plurality of pieces of camera information may be set at the bottom layer, each piece corresponding to one camera capability of that physical camera, and different camera capabilities indicating the camera characteristics used in the corresponding service scenarios; it can also be understood that each piece of camera information corresponds to one logical camera. For example, the front-facing camera is correspondingly provided with 3 pieces of camera information, namely 1, 3, and 6, where the camera capability corresponding to camera information 6 can capture the images required by the AO service and the camera capability corresponding to camera information 1 can capture images in a common format. By way of example and not limitation, each camera capability may be configured through one camera configuration file; for example, each camera configuration file may be configured in a camera module in the HAL layer.
As an example of the present application, the camera information set at the bottom layer may be recorded in the form of a list, with each piece of camera information corresponding to one camera capability. That is, the bottom layer of the electronic device may be provided with a camera list in which a plurality of pieces of camera information and the camera capability corresponding to each piece are recorded.

In one example, the camera information of one physical camera corresponds to one camera list; it can be understood that when the electronic device is configured with a plurality of physical cameras, the electronic device correspondingly has a plurality of camera lists. Illustratively, assuming the electronic device includes a front-facing camera and a rear camera, the front-facing camera corresponds to one camera list and the rear camera to another. For example, the camera list corresponding to the front-facing camera includes 3 pieces of camera information, and the camera list corresponding to the rear camera includes 5 pieces of camera information.

In another example, the electronic device may record the camera information of different physical cameras in a single camera list; that is, only one camera list may exist in the electronic device. In this case, the physical camera corresponding to each piece of camera information can be determined according to a preset correspondence rule, which indicates the correspondence between camera information and physical cameras. For example, if the camera information is any one of 1, 3, and 6, the corresponding physical camera is the front-facing camera; if the camera information is any one of 0, 2, 4, 5, and 7, the corresponding physical camera is the rear camera.
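By way of illustration only, the following self-contained C++ sketch (which is not part of the embodiment) shows how such a camera list might associate each piece of camera information with a camera capability and with a physical camera under the preset correspondence rule; the capability names, and the capability assigned to camera information 3, are assumptions of the sketch.

```cpp
#include <iostream>
#include <string>
#include <vector>

// Hypothetical capability tags; the embodiment only states that each piece of
// camera information corresponds to one camera capability.
enum class CameraCapability { NormalFormat, AoDetection2dRgb, DepthSensing };

struct CameraInfoEntry {
    int cameraInfo;               // logical camera id, e.g. 1, 3, 6
    CameraCapability capability;  // capability exposed by this entry
    std::string physicalCamera;   // "front" or "rear" per the preset rule
};

int main() {
    // Illustrative values from the text: the front camera exposes entries 1, 3, 6;
    // entry 6 supports the AO service and entry 1 captures the common format.
    std::vector<CameraInfoEntry> cameraList = {
        {1, CameraCapability::NormalFormat,     "front"},
        {3, CameraCapability::DepthSensing,     "front"},
        {6, CameraCapability::AoDetection2dRgb, "front"},
        {0, CameraCapability::NormalFormat,     "rear"},
    };

    // The AO module scans the list for an entry whose capability supports AO.
    for (const auto& entry : cameraList) {
        if (entry.capability == CameraCapability::AoDetection2dRgb) {
            std::cout << "AO service should request camera info "
                      << entry.cameraInfo << " (" << entry.physicalCamera
                      << " camera)\n";
        }
    }
    return 0;
}
```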
It should be noted that the above description is only given by taking an example in which the camera information exists in a list form. In another embodiment, the camera information may also be recorded in other manners, for example, the camera information may also be recorded in a set manner, which is not limited in this application.
As another example of the present application, the AO module 52 stores only the camera information of the camera capable of implementing the AO service, for example only camera information 6. In that case, the AO module 52 does not need to query the camera list; it directly obtains the stored camera information and uses it as the camera information of the camera to be opened.
(3) The AO module 52 sends a second instruction to the camera service module 53, where the second instruction is used to request the camera service module 53 to open a camera corresponding to the determined camera information.
As an example of the present application, the second instruction may carry a tag and the determined camera information.
The tag is used to uniquely identify the AO service process running in the AO module 52. In one embodiment, the tag of a process is negotiated between the process and the camera service module, so that each side can identify the other's service. The tag does not change when the electronic device is restarted or replaced.

On the one hand, the tag enables the camera service module 53 to determine who sent the second instruction, so that after the prioritization is determined, a notification that opening the camera failed can be sent to the AO service process corresponding to the tag, or a notification that the camera opened by the AO module 52 has been closed can be sent subsequently. On the other hand, the tag is also used by the camera service module 53 to subsequently query the priority of the AO service process from the customized process priority information.
By way of example and not limitation, the format of the second instruction may be a preset format, and the preset format may be set according to actual requirements.
Illustratively, the second instruction is openCamera(1_aoservice), where openCamera indicates that a camera is to be opened, the 1 in the parentheses indicates the camera information of the camera to be opened, and aoservice is the tag. The meaning of the second instruction is that the AO service process requests to open the camera corresponding to camera information 1.
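By way of illustration only, the following C++ sketch parses an instruction of the example shape openCamera(<camera information>_<tag>) into its camera information and tag; the exact wire format is an assumption based on the example above, not a format recited by the embodiment.

```cpp
#include <iostream>
#include <optional>
#include <string>

struct OpenRequest {
    int cameraInfo;   // logical camera to open
    std::string tag;  // uniquely identifies the requester, e.g. "aoservice"
};

// Parses "openCamera(<cameraInfo>_<tag>)"; returns std::nullopt on format error.
std::optional<OpenRequest> parseOpenCamera(const std::string& instruction) {
    const std::string prefix = "openCamera(";
    if (instruction.size() <= prefix.size() + 1 ||
        instruction.compare(0, prefix.size(), prefix) != 0 ||
        instruction.back() != ')') {
        return std::nullopt;  // not the preset format
    }
    // Strip "openCamera(" and the trailing ')'.
    std::string body = instruction.substr(prefix.size(),
                                          instruction.size() - prefix.size() - 1);
    auto sep = body.find('_');
    if (sep == std::string::npos) return std::nullopt;
    try {
        return OpenRequest{std::stoi(body.substr(0, sep)), body.substr(sep + 1)};
    } catch (...) {
        return std::nullopt;  // camera information was not a number
    }
}

int main() {
    if (auto req = parseOpenCamera("openCamera(1_aoservice)")) {
        std::cout << "process '" << req->tag << "' requests camera info "
                  << req->cameraInfo << '\n';
    }
    return 0;
}
```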
As an example of the present application, the AO module 52 sends the second instruction by calling the VNDK interface.
(4) The camera service module 53 queries the application process that has opened the camera.
After receiving the second instruction, the camera service module 53 parses the second instruction to determine its meaning, and also parses out the carried camera information and tag. Further, to determine whether the camera that the AO module 52 requests to open conflicts with a camera that has already been opened, the camera service module 53 determines which camera is open and which application program process is using it. For example, the camera information of the camera that the application program process has opened is 6; for another example, it is 1.
It should be noted that every process in the electronic device must go through the camera service module 53 to open or close a camera. For example, when another application program process wants to open a certain camera, it sends a request to open that camera to the camera service module 53, and the camera service module 53 calls the camera module in the HAL layer to open that camera for it; when that application program process wants to close the camera, it sends a request to close that camera to the camera service module 53, and the camera service module 53 calls the camera module in the HAL layer to close that camera for it. The camera service module 53 is therefore able to perceive which camera is open and which process is using it.
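By way of illustration only, the following C++ sketch shows one simple way the camera service module could track which camera is open and which process opened it, given that every open and close request passes through it; the type and identifier names are assumptions.

```cpp
#include <iostream>
#include <map>
#include <string>

// Every open/close request passes through the camera service module, so a
// map from camera information to the owning process is enough for it to
// perceive which camera is open and who is using it.
class CameraRegistry {
public:
    void onOpen(int cameraInfo, const std::string& owner) {
        openCameras_[cameraInfo] = owner;  // record the owner on open
    }
    void onClose(int cameraInfo) {
        openCameras_.erase(cameraInfo);    // forget the owner on close
    }
    void dump() const {
        for (const auto& [info, owner] : openCameras_) {
            std::cout << "camera info " << info << " opened by " << owner << '\n';
        }
    }

private:
    std::map<int, std::string> openCameras_;  // camera information -> process
};

int main() {
    CameraRegistry registry;
    registry.onOpen(1, "camera_app");  // e.g. the camera application opens info 1
    registry.dump();
    registry.onClose(1);
    return 0;
}
```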
(5) The camera service module 53 queries the system priority scores of the processes.

The processes described here include the application program process and the AO service process.

As an example, the camera service module 53 queries the system priority score of the AO service process based on its process number, and the system priority score of the application program process based on its process number. The application program process described here refers to the process that has opened the camera, that is, the process determined in step (4) above.

The system priority score of a process varies dynamically with the running state of the process. Typically, the system priority score of a process running in the background is higher than that of a process running in the foreground. In addition, the system priority score of a process running at the bottom layer (also called a resident process) is higher than that of a process running at the application layer.
(6) The camera service module 53 determines whether there is a camera usage conflict problem between the AO service process and the application process.
In implementation, the camera service module 53 determines whether a camera usage conflict exists between the AO service process and the application program process according to the camera information that the AO service process requests to open and the camera information that the application program process has already opened.

In one example, if the camera information that the AO service process requests to open is the same as the camera information that the application program process has opened, it is determined that a camera usage conflict exists between the two processes.

In another example, if the camera information that the AO service process requests to open and the camera information that the application program process has opened correspond to the same physical camera (for example, the same front-facing camera), it is determined that a camera usage conflict exists between the two processes.
If a conflict exists, the flow proceeds to the following step (7). If no conflict exists, the camera service module opens the requested camera for the AO module 52; illustratively, the camera service module calls the corresponding camera configuration file in the HAL layer according to the camera information carried in the second instruction sent by the AO module 52, so as to open the corresponding camera.
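By way of illustration only, the two conflict rules above can be expressed as in the following C++ sketch, using the preset correspondence rule of the earlier example (camera information 1, 3, and 6 belong to the front-facing camera, and 0, 2, 4, 5, and 7 to the rear camera).

```cpp
#include <iostream>
#include <string>

// Preset correspondence rule from the earlier example.
std::string physicalCameraOf(int cameraInfo) {
    switch (cameraInfo) {
        case 1: case 3: case 6: return "front";
        default:                return "rear";
    }
}

// Rule 1: identical camera information conflicts. Rule 2: different camera
// information mapping to the same physical camera also conflicts.
bool hasUsageConflict(int requestedInfo, int openedInfo) {
    if (requestedInfo == openedInfo) return true;
    return physicalCameraOf(requestedInfo) == physicalCameraOf(openedInfo);
}

int main() {
    // The AO service process requests camera information 6 while the application
    // program process holds camera information 1: same front-facing camera.
    std::cout << std::boolalpha << hasUsageConflict(6, 1) << '\n';  // true
    return 0;
}
```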
(7) The camera service module 53 determines the system prioritization based on the system priority scores of the processes.

That is, the system prioritization is determined based on the system priority score of the AO service process and the system priority score of the application program process.

Since the AO service process is a bottom-layer process and the application program process is an upper-layer process, the system priority of the AO service process is higher than that of the application program process; therefore, in the system prioritization, the AO service process ranks higher than the application program process.

It should be noted that steps (6) and (7) have no strict execution order. That is, in an embodiment, the system prioritization may first be determined according to the queried system priority scores of the processes, and the conflict determination performed afterwards. In that case, if a conflict is determined to exist, the following step (8) is performed; otherwise, if no conflict exists, the camera service module opens the requested camera for the AO module 52.
(8) The camera service module 53 adjusts the system prioritization according to the customized process priority information.
As noted above, since the AO service process is a bottom-layer process and the application program process is an upper-layer process, the AO service process ranks higher in the system prioritization. Left unadjusted, this would mean that in the first scenario the front-facing camera in use by the application module 54 is closed and handed over to the AO service, affecting the user experience, and that in the second scenario, when the user wants to take a selfie, the electronic device cannot open the front-facing camera because it is busy implementing the AO service.

Because the system prioritization does not match the user's experience requirements, in the embodiment of the present application the system prioritization needs to be adjusted according to the customized process priority information of the electronic device, so that the adjusted prioritization meets the user's actual requirements.
The customized process priority information may be preset by the user according to actual requirements and stored in a designated storage area of the electronic device. The customized process priority information includes priority information of one or more processes, each piece of which indicates the priority of the corresponding process. Illustratively, the priority information may be a sequence number, and the smaller the sequence number, the higher the corresponding priority.

When the customized process priority information includes priority information of a plurality of processes, the priority information of different processes is distinguished by identification information of the processes, where the identification information includes a tag or an application identifier that can uniquely identify one application program. Illustratively, the application identifier may be an application package name, an application ID, or the like.

As an example of the application, the customized process priority information exists in the form of a list; that is, a customized priority list may be stored in the electronic device in advance. The customized priority list includes identification information of a plurality of processes, and the identification information of each process corresponds to one piece of priority information. Illustratively, the customized priority list is shown in Table 1.
TABLE 1

Identification information        Priority information
netstat-anpgrep-xj                1
systemserver2d_old                2
aoservice                         3
......                            ......
As an example of the present application, the priority of the AO service process is usually set to the lowest, to avoid the AO service affecting other services that use the front-facing camera. Illustratively, in Table 1 above, aoservice is the identification information of the AO service process, systemserver2d_old is the identification information of a secure payment process, and netstat-anpgrep-xj is the identification information of the camera process.
In addition, the above description only takes the priority information being a sequence number as an example; in another embodiment, the priority information may also be another identifier such as a character, which is not limited in the embodiments of the present application.

It should also be noted that the above only describes recording the customized process priority information in the form of a list; in another embodiment, it may be recorded in other manners, for example as a set, which is not limited in the embodiments of the present application.
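By way of illustration only, the following C++ sketch re-sorts a prioritization using the sequence numbers of Table 1, a smaller sequence number indicating a higher priority; the container types are assumptions.

```cpp
#include <algorithm>
#include <iostream>
#include <map>
#include <string>
#include <vector>

int main() {
    // Customized process priority information keyed by identification information
    // (values from Table 1; a smaller sequence number means a higher priority).
    std::map<std::string, int> customPriority = {
        {"netstat-anpgrep-xj", 1},   // camera process: highest priority
        {"systemserver2d_old", 2},   // secure payment process
        {"aoservice",          3},   // AO service process: lowest priority
    };

    // The system prioritization ranks the resident AO service process first;
    // re-sort by the customized sequence numbers so the camera process wins.
    std::vector<std::string> ranking = {"aoservice", "netstat-anpgrep-xj"};
    std::sort(ranking.begin(), ranking.end(),
              [&customPriority](const std::string& a, const std::string& b) {
                  return customPriority.at(a) < customPriority.at(b);
              });

    for (const auto& id : ranking) std::cout << id << '\n';
    // prints: netstat-anpgrep-xj, then aoservice
    return 0;
}
```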
As an example of the present application, the customized process priority information includes only the identification information of low-priority processes, which may be understood as a white list. For example, it includes the tag of the AO service process but not the application identifier of the application program process; in this case, the AO service process is determined to have a lower priority than the application program process, that is, the order in the system prioritization is readjusted.

Optionally, the customized process priority information further includes identification information of other processes. In this case, the identification information of the AO service process and of the other processes is ordered, the order indicating the prioritization among the AO service process and the other processes. Illustratively, the AO service process has the lowest priority.

As another example of the present application, the customized process priority information includes both the tag of the AO service process and the application identifier of the application program process, and the ordering relationship between the two indicates the prioritization of the AO service process and the application program process. In this case, the prioritization of the AO service process and the application program process is determined according to the order in which the tag of the AO service process and the application identifier of the application program process are arranged.

As an example of the present application, the priority of the AO service process is lower than that of the application program process, in which case the following step (9) is performed.
(9) The camera service module 53 notifies the AO service process that the camera opening has failed.
In the embodiment of the present application, when the AO service process and an application program process conflict in opening the front-facing camera, the camera service module determines the prioritization of the two according to the tag of the AO service process, the application identifier of the application program process, and the customized process priority information; the process with the higher priority in the prioritization is the one allowed to open the front-facing camera. The camera usage conflict between the application program process and the AO service is thereby resolved.

It should be noted that the above embodiment only takes the example of the AO module 52 carrying the tag of the AO service process in the second instruction. In another embodiment, the camera service module 53 may obtain the tag of the AO service process in other ways; for example, it may query the tag according to the process number of the AO service process, which is not limited in the embodiments of the present application.
Also as an example of the present application, the AO module 52 starts a monitoring function after determining the camera information of the camera to be opened (i.e., after step (2) above), so as to monitor change notifications of the states of all cameras through the monitoring function. After subsequently determining that the application program process has finished using the camera, the camera service module 53 may notify the AO module 52. After the AO module 52 receives this notification through the monitoring function, it can determine whether to request to open the camera again according to the service requirement. For example, taking the AO service including the gaze-keeps-screen-on service as an example, if no user operation is detected within a period of time, the AO module determines to re-request to open the camera.

It is worth mentioning that, because the monitoring function is started right after the camera information of the camera to be opened is determined, the state of the camera can be perceived in real time, which makes it convenient for the AO module 52 to judge, according to the actual service requirement, whether the front-facing camera needs to be or can be opened, improving the efficiency of using the front-facing camera.
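By way of illustration only, the following C++ sketch shows the shape such a monitoring function might take: the AO module reacts to a camera-closed notification and re-requests the camera only when the service requirement is satisfied (the 8 s idle threshold is the example used in step 702 below); the callback signature and the source of the idle time are assumptions.

```cpp
#include <functional>
#include <iostream>
#include <utility>

class AoModule {
public:
    // idleSeconds is a callable reporting how long the user has been idle;
    // where that value comes from is outside the scope of this sketch.
    explicit AoModule(std::function<double()> idleSeconds)
        : idleSeconds_(std::move(idleSeconds)) {}

    // Invoked by the monitoring function when a camera state change is reported.
    void onCameraStateChanged(int cameraInfo, bool nowOpen) {
        if (nowOpen) return;  // only camera-closed notifications matter here
        if (idleSeconds_() >= 8.0) {
            // Service requirement satisfied (e.g. gaze-keeps-screen-on): reopen.
            std::cout << "re-requesting camera info " << cameraInfo << '\n';
        } else {
            std::cout << "camera info " << cameraInfo << " closed; not needed yet\n";
        }
    }

private:
    std::function<double()> idleSeconds_;
};

int main() {
    AoModule ao([] { return 9.0; });                // pretend the user is 9 s idle
    ao.onCameraStateChanged(6, /*nowOpen=*/false);  // camera info 6 was closed
    return 0;
}
```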
As an example of the present application, the camera service module 53 includes a plurality of sub-modules and implements the above operations through the interaction of these sub-modules. Illustratively, the sub-modules include a camera service sub-module and a process information service sub-module. Next, the interaction process of these sub-modules is described. Referring to FIG. 6, FIG. 6 is a schematic diagram of the interaction flow of the sub-modules according to an exemplary embodiment. By way of example and not limitation, the interaction flow includes:
601: the camera service sub-module receives a second instruction.
As previously described, the second instruction is from the AO module 52.
In one example, the camera service module provides an ACameraManager sub-module at the HAL layer, and the ACameraManager sub-module provides a calling interface for bottom-layer processes. A bottom-layer process calls this interface when it needs to open a camera, so that an instruction for opening the camera is sent to the ACameraManager sub-module, which forwards the instruction to the camera service sub-module. For example, in this embodiment, the AO module 52 sends the second instruction to the ACameraManager sub-module.

In one embodiment, as shown in FIG. 6, the ACameraManager sub-module checks the second instruction before forwarding it. The check specifically includes: determining whether the format of the second instruction is the preset format, and determining whether the camera information carried in the second instruction is legal. When the format is the preset format and the carried camera information is legal, the check is determined to have passed; otherwise, if the format is not the preset format and/or the carried camera information is illegal, the check is determined to have failed.

Determining whether the camera information carried in the second instruction is legal means determining whether that camera information belongs to the camera list: if it belongs to the camera list, it is determined to be legal; if not, it is determined to be illegal.
If the ACameraManager sub-module's check of the second instruction passes, the ACameraManager sub-module forwards the second instruction to the camera service sub-module, and correspondingly the camera service sub-module receives it. Otherwise, if the check fails, a notification that opening the camera failed is returned to the AO module 52.
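By way of illustration only, the following C++ sketch combines the two checks described above, namely the format check against the preset openCamera(...) shape and the legality check that the carried camera information belongs to the camera list; the list contents are illustrative.

```cpp
#include <iostream>
#include <set>
#include <string>

// Format check: the instruction must have the preset openCamera(...) shape.
bool formatOk(const std::string& s) {
    return !s.empty() && s.rfind("openCamera(", 0) == 0 && s.back() == ')';
}

// Legality check: the carried camera information must belong to the camera list.
bool cameraInfoLegal(int cameraInfo, const std::set<int>& cameraList) {
    return cameraList.count(cameraInfo) > 0;
}

int main() {
    std::set<int> cameraList = {0, 1, 2, 3, 4, 5, 6, 7};  // illustrative list
    std::string instruction = "openCamera(6_aoservice)";
    int carriedInfo = 6;  // assume already parsed out of the instruction

    if (formatOk(instruction) && cameraInfoLegal(carriedInfo, cameraList)) {
        std::cout << "check passed: forward to the camera service sub-module\n";
    } else {
        std::cout << "check failed: return an open-failure notification\n";
    }
    return 0;
}
```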
In one embodiment, the camera service sub-module is a CameraService module.

It should be noted that the ACameraManager sub-module provided by the camera service module at the HAL layer has been described here for calls from bottom-layer processes. In addition, the camera service module also provides a CameraManager sub-module at the application framework layer for calls made when an upper-layer application program process wants to open a camera; that is, an application program process forwards its request to open or close a camera to the camera service module for processing through the CameraManager sub-module.

602: The camera service sub-module checks the second instruction and parses out the tag.

To avoid processing malicious requests, the camera service sub-module checks the second instruction again after receiving it. In one example, this check includes judging whether the format of the second instruction is the preset format: if so, the check passes; otherwise, the check fails.

As an example, after the second instruction passes the check, the camera service sub-module parses the second instruction to obtain the carried camera information and tag. In addition, the camera service sub-module calls the handleEvictionLocked code to execute the subsequent conflict handling, which specifically includes the following contents. Otherwise, if the check fails, the AO module 52 is notified that opening the camera failed.
603: the camera service sub-module requests the process information service sub-module to query the system priority scores of the processes.
The processes described here include the AO service process and the application program process that has opened the camera.

604: The process information service sub-module queries the system priority scores of the processes.

In one possible implementation, the process information service sub-module may query the system priority score of the AO service process according to the process number of the AO service process, and query the system priority score of the application program process according to the process number of the application program process.

Illustratively, the process information service sub-module queries the system priority scores of the two processes through the getProcessStatesScoresFromPids code.
605: the process information service submodule returns the system priority score of the process.
That is, the process information service sub-module sends the queried system priority value to the camera service sub-module.
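By way of illustration only, the following C++ sketch queries a per-process score by process number. On a real device this would go through the process information service (the getProcessStatesScoresFromPids code above); reading the Linux /proc/<pid>/oom_score_adj file here is merely a simplified stand-in assumed for the sketch, not the mechanism recited in the embodiment.

```cpp
#include <fstream>
#include <iostream>
#include <optional>
#include <string>

// Returns a score for the process with the given process number, or nullopt if
// the process does not exist or the file cannot be read. Reading oom_score_adj
// is a stand-in assumed for this sketch only.
std::optional<int> systemPriorityScore(int pid) {
    std::ifstream f("/proc/" + std::to_string(pid) + "/oom_score_adj");
    int score = 0;
    if (f >> score) return score;
    return std::nullopt;
}

int main() {
    int aoPid = 1234, appPid = 5678;  // hypothetical process numbers
    auto aoScore = systemPriorityScore(aoPid);
    auto appScore = systemPriorityScore(appPid);
    if (aoScore && appScore) {
        std::cout << "AO score = " << *aoScore
                  << ", application score = " << *appScore << '\n';
    }
    return 0;
}
```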
606: the camera service sub-module determines whether a conflict exists and proceeds to step 607.
For example, the determination method may refer to step (6) in the above embodiment, and details are not repeated here.
In one example, the conflict determination may be implemented by calling the wouldEvict code.

If no conflict exists, the camera service sub-module calls the camera configuration file configured at the bottom layer, so as to open the corresponding camera.
607: and the camera service submodule determines a rejection request list according to the system priority score of the process.
Wherein the reject request list includes information about processes with low priority. Illustratively, the related information includes, but is not limited to, identification information (such as an application identification or a tag) of a corresponding process, camera information of a pre-opened or opened camera. In one embodiment, the deny request list may be defined as an evictList.
In the embodiment of the application, the camera service submodule determines system priority ordering according to the system priority score of the process, and then determines a rejection request list according to the system priority ordering. Illustratively, if the process with the lower priority in the system priority order is an application process, the rejection request list includes an application tag of the application process and camera information of a camera that has been opened by the application process.
It should be noted that, if the camera service sub-module determines that there is no conflict, the rejection request list is empty.
608: and the camera service submodule updates the rejection request list according to the self-defined process priority information.
In one example, the camera service sub-module adjusts the system priority ranking according to the custom process priority information, and in the adjusted priority ranking, the priority of the AO service process is lower than the priority of the application process. In this case, the camera service sub-module replaces the relevant information of the application process included in the rejection request list with the relevant information of the AO service process, for example, the replaced rejection request list includes a tag of the AO service process and camera information of a camera that the AO pre-requests to open.
As an example of the present application, the rejection request list may be updated by calling the updateEvictedList code.
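By way of illustration only, the following C++ sketch walks through steps 607 and 608 together: the lower-priority process of the system prioritization is first placed in the rejection request list, which is then updated according to the customized process priority information so that it holds the AO service process instead; the entry structure and the identification value camera_app are assumptions.

```cpp
#include <iostream>
#include <string>
#include <vector>

struct EvictEntry {
    std::string identification;  // tag or application identifier
    int cameraInfo;              // camera it requests to open or has opened
};

int main() {
    // Step 607: per the system prioritization the resident AO service process
    // ranks above the application program process, so the application program
    // process initially lands in the rejection request list.
    std::vector<EvictEntry> evictList = {{"camera_app", 1}};

    // Step 608: the customized process priority information says aoservice has
    // the lowest priority, so the list is updated to hold the AO service
    // process instead (cf. the updateEvictedList call above).
    evictList.clear();
    evictList.push_back({"aoservice", 6});  // camera information 6 was requested

    for (const auto& e : evictList) {
        std::cout << e.identification << " -> camera info " << e.cameraInfo << '\n';
    }
    return 0;
}
```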
609: the camera service sub-module handles the conflict.
In implementation, if the rejection request list includes the related information of the process that is currently requesting to open the camera, a notification indicating that opening the camera failed is returned to that process. Otherwise, if the rejection request list includes the related information of the process that has already opened the camera, the opened camera is closed so as to disconnect the camera connection.

Illustratively, the rejection request list includes the related information of the AO service process, so the camera service sub-module feeds back a notification indicating that opening the camera failed to the AO module 52.
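By way of illustration only, step 609 can be sketched in C++ as follows: if the rejected process is the one requesting to open the camera, an open-failure notification is returned to it; if it is the one already holding the camera, that camera is closed and a state-change notification is sent.

```cpp
#include <iostream>
#include <string>

enum class Role { Requester, Holder };

// Handles the conflict for the single process in the rejection request list.
void handleConflict(const std::string& rejectedTag, Role role) {
    if (role == Role::Requester) {
        // The process requesting to open the camera loses: report the failure.
        std::cout << "notify " << rejectedTag << ": opening the camera failed\n";
    } else {
        // The process already holding the camera loses: close the camera,
        // disconnect it, and send a state-change notification.
        std::cout << "close the camera held by " << rejectedTag
                  << " and send a state-change notification\n";
    }
}

int main() {
    handleConflict("aoservice", Role::Requester);  // first scenario
    handleConflict("aoservice", Role::Holder);     // second scenario
    return 0;
}
```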
For ease of understanding, the method for opening the camera in the above embodiment is described below with the flowchart shown in FIG. 7 as an example. The method is applied to the electronic device and may include the following steps:
701: the application process has opened the camera.
Illustratively, the camera information of the camera that the application program process has opened is 1. For example, the application program process is the camera process, which has opened the camera corresponding to camera information 1 to take pictures.

702: The AO service process requests to open the camera.

In one example, the AO service includes the gaze-keeps-screen-on scenario. When the user has not operated the electronic device for a period of time, for example when it is detected that the user has not operated the electronic device within 8 s, the AO service process starts executing the AO service and requests to open the camera.

703: The AO service process queries the camera list and determines the first camera information.
In implementation, the AO service process determines, according to the camera capabilities recorded in the camera list, the camera capability capable of supporting the AO service, and acquires the camera information corresponding to the determined camera capability from the camera list. Illustratively, the camera information obtained from the camera list is 6; that is, the first camera information is 6.

It should be noted that, in the embodiments of the present application, the first camera information is the camera information determined by the process that requests to open a camera, and the second camera information is the camera information corresponding to the camera that is already open.
704: the AO service process starts a monitoring function.
The AO service process starts a monitoring module to execute the monitoring function through the monitoring module; the monitoring module is responsible for monitoring change notifications of the states of all cameras.

705: The AO service process issues the second instruction for opening the camera.

The second instruction carries the tag of the AO service process; further, the second instruction may also carry the first camera information determined in step 703.

Illustratively, the AO service process sends the second instruction to the camera service module. After receiving the second instruction, the camera service module can determine who issued the second instruction according to the tag it carries, which makes it convenient to subsequently query the priority information of the AO service process from the customized process priority information according to the tag.
706: the camera service module determines whether a conflict exists.
The conflict described here refers, of course, to a camera usage conflict. For example, the camera service module obtains the camera information of the currently open camera as the second camera information. The camera service module then determines which physical camera the first camera information corresponds to and which physical camera the second camera information corresponds to, so as to determine whether a conflict exists. Illustratively, if the first camera information and the second camera information correspond to the same front-facing camera, that is, the camera that the AO service process requests to open and the camera that the application program process has opened are the same front-facing camera, it is determined that a conflict exists.

If a conflict is determined to exist, the flow proceeds to the following step 707. Otherwise, if no conflict exists, the flow proceeds to the following step 710.
707: the camera service module obtains a system prioritization.
The camera service module ranks the AO service process and the application program process according to the system priority of the AO service process and the system priority of the application program process, obtaining the system prioritization. In the system prioritization, the AO service process has a higher priority than the application program process.

708: The camera service module adjusts the system prioritization according to the customized process priority information.

As an example of the present application, the customized process priority information includes only the identification information of the low-priority process, for example only the tag of the AO service process, and does not include the application identifier of the application program process. In this case it can be determined that the priority of the AO service process is lower than that of the application program process, so in the adjusted system prioritization the priority of the AO service process is lower than that of the application program process.
709: and the application program process works normally, and the camera service module informs the AO service process of failure in opening the camera.
That is, since the priority of the application process is higher than that of the AO service process, and the electronic device allows the process with the higher priority to open the camera, the application process continues to use the currently opened camera, and the AO service process fails to open the front camera.
In one embodiment, after determining that the pre-camera is failed to be opened by the AO service process, the camera service module sends a failure notification to the AO service process. And after the AO service receives the failure notice, determining that the application program process uses the front-facing camera, thereby determining that the front-facing camera cannot be opened at present to realize the AO service.
The problem of camera use conflicts has been solved so far. In one embodiment, the method further includes steps 710-712 as follows.
710: the application process requests the camera service module to close the camera.
711: and the camera service module closes the camera and sends a notice of camera state change to the AO business process.
The notification of the camera state change is used to notify the AO service process application process that the camera has been closed.
It should be understood that the camera turned off by the camera service module described herein refers to the camera that the application process requests to be turned off.
712: after the AO service process monitors the notice of the state change of the camera through the monitoring function, whether the camera needs to be turned on again is judged according to the service requirement.
713: and the AO service process opens the camera corresponding to the first camera information.
Illustratively, the AO service process opens a front camera corresponding to the camera information 6 to normally make a picture through the front camera. In addition, after normal plotting, the AO service process closes the front-facing camera.
Referring to FIG. 8, the following describes the method for controlling a camera in the second scenario (i.e., the AO module 52 has opened the camera and the application module 54 requests to open it). Here, the interaction flow of the modules in FIG. 5 includes:

801: The application module 54 detects a request to open the camera.

In one example, when a trigger operation by the user on the application icon corresponding to an application program process is detected, it is determined that the application program process requests to open the camera. Illustratively, the application program process is the camera process: when the user's trigger operation on the camera icon in the electronic device is detected, it indicates that the user wants to take pictures through the camera, and thus that the camera process requests to open the camera.

802: The application module 54 determines the camera information of the camera to be opened.

In implementation, the application program process determines, according to the camera capabilities recorded in the camera list, the camera capability capable of supporting the corresponding service of the application program process, and acquires the camera information corresponding to the determined camera capability from the camera list.
803: the application module 54 sends a first instruction to the camera service module 53, where the first instruction is used to request to turn on the camera.
That is, the first instruction is used to request the camera service module 53 to turn on the camera corresponding to the determined camera information.
804: the camera service module 53 inquires about a process of opening a camera and camera information of the opened camera.
Exemplarily, it is assumed that the application process having opened the camera is an AO service process, and the camera information of the opened camera is 6.
805: the camera service module 53 queries the system priority score for the process.
The processes described herein include AO business processes and application processes.
806: the camera service module 53 determines whether there is a camera usage conflict problem between the AO service process and the application process.
807: the camera service module 53 determines a system prioritization based on the system priority score for the process.
808: the camera service module 53 adjusts the system prioritization according to the custom process priority information.
For the specific implementation of steps 805 to 808, refer to (5) to (8) in the embodiment shown in fig. 5.
809: The camera service module 53 closes the camera that the AO service process has opened, and opens the camera that the application module 54 requested.
As an example of the present application, in the adjusted prioritization the priority of the AO service process is lower than that of the application process. The camera service module therefore closes the camera that the AO service process had opened and opens the camera that the application process requested, ensuring that the application process can open the front-facing camera normally.
As an example of the present application, after closing the camera opened by the AO service process and opening the camera requested by the application process, the camera service module notifies the AO service process of the camera state.
Illustratively, the camera service module sends a change notification to the AO service process to inform it that the state of the front-facing camera has changed. Correspondingly, after receiving the change notification, the AO service process cleans up and releases the camera resources it was using to implement the AO service.
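The arbitration of steps 805-809 can be pictured with the following Java sketch. It is illustrative only, written with invented names, and is not the patent's code: PriorityResolver stands in for the ranking logic of steps 805-808 (sketches later in this document show how such a ranking might be produced), and the private methods stand in for the camera service module's open, close, and notification paths.

import java.util.List;

// Illustrative sketch only; names are invented for this example.
public final class CameraArbiter {

    /** Stand-in for steps 805-808: rank the two pids, highest priority first. */
    public interface PriorityResolver {
        List<Integer> adjustedRanking(int ownerPid, int requesterPid);
    }

    private final PriorityResolver resolver;

    public CameraArbiter(PriorityResolver resolver) {
        this.resolver = resolver;
    }

    /** Decide who keeps the contested physical camera (step 809). */
    public void arbitrate(int ownerPid, int requesterPid, int cameraInfoId) {
        List<Integer> ranking = resolver.adjustedRanking(ownerPid, requesterPid);
        if (ranking.get(0) == requesterPid) {
            closeCameraFor(ownerPid);                  // evict the current owner
            openCameraFor(requesterPid, cameraInfoId); // grant the requester
            notifyStateChange(ownerPid);               // owner releases resources
        } else {
            notifyOpenFailed(requesterPid);            // requester is refused
        }
    }

    private void closeCameraFor(int pid) { /* release the camera session */ }
    private void openCameraFor(int pid, int cameraInfoId) { /* open a session */ }
    private void notifyStateChange(int pid) { /* send state-change notification */ }
    private void notifyOpenFailed(int pid) { /* send open-failure notification */ }
}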
810: The application module 54 uses the camera.
That is, after the application process opens the front-facing camera, it captures frames through it normally.
811: The application module 54 requests the camera service module 53 to close the camera.
In one embodiment, after the application process has finished using the front-facing camera, it requests the camera service module to close it; correspondingly, the camera service module closes the front-facing camera that the application process had opened.
812: the camera service module 53 turns off the camera.
813: the camera service module 53 notifies the AO module 52 that the camera has been turned off.
Since the AO module 52 has started the monitoring function, it monitors the camera state through that function; when it receives the notification that the state of the front-facing camera has changed, it can determine that the application process has closed the front-facing camera.
814: The AO module 52 determines whether the camera needs to be turned on again according to the service requirements.
For example, if the AO service includes a keep-screen-on-while-gazing scenario, the AO module determines to turn on the front-facing camera again when no user operation on the electronic device has been detected for a period of time.
In the above embodiment, in the case where the AO module 52 has already opened the front-facing camera, if the application module 54 also requests to open the front-facing camera, the prioritization of the AO service process and the application process is determined according to the custom process priority information. When the priority of the AO service process is lower than that of the application process, the camera opened by the AO module 52 is closed and the camera requested by the application module 54 is opened.
It should be noted that, in the above scenario, the internal implementation flow of the camera service module 53 may refer to the embodiment shown in fig. 6, and details are not described here.
For ease of understanding, the method for opening the camera in the embodiment of fig. 8 described above is described next using the flowchart shown in fig. 9. The method is applied to the electronic device and may include the following steps:
901: the AO service process has opened the camera.
Illustratively, the camera information of the camera that the AO business process has opened is 6.
It should be noted that, as described above, the AO service starts a monitoring function when it takes effect; the monitoring function is used to monitor notifications of camera-state changes in the electronic device.
902: the application process pre-requests to turn on the camera.
903: The application process queries the camera list and determines the first camera information.
Illustratively, the camera information acquired from the camera list is 1.
904: The application process issues a first instruction for opening the camera.
The first instruction carries the camera information determined in step 903.
Illustratively, the application process sends the first instruction to the camera service module. After receiving the first instruction, the camera service module obtains the application identifier of the application process that issued it, so that the priority information of the application process can later be determined from the application identifier and the custom process priority information.
905: the camera service module determines whether a conflict exists.
The camera service module determines which physical camera corresponds to the camera information carried in the first instruction and which physical camera the AO service process currently has open, and then judges whether a conflict exists: if the camera the application process requests and the camera the AO service process has opened are the same front-facing camera, a conflict is determined to exist.
If a conflict is determined to exist, proceed to step 906 below; otherwise proceed to step 911 below.
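The conflict check of step 905 amounts to resolving both pieces of camera information to physical cameras and comparing the results. A minimal Java sketch follows; the mapping is an invented assumption for illustration, not taken from the patent.

import java.util.Map;

// Illustrative sketch of the step-905 conflict check; the mapping is invented.
public final class ConflictChecker {

    // Several camera information IDs may resolve to one physical camera,
    // e.g. camera information 1 and 6 both mapping to the front camera (id 0).
    private final Map<Integer, Integer> infoToPhysicalCamera;

    public ConflictChecker(Map<Integer, Integer> infoToPhysicalCamera) {
        this.infoToPhysicalCamera = infoToPhysicalCamera;
    }

    /** True when both camera information IDs resolve to the same physical camera. */
    public boolean conflicts(int requestedInfoId, int openedInfoId) {
        return infoToPhysicalCamera.get(requestedInfoId)
                .equals(infoToPhysicalCamera.get(openedInfoId));
    }
}

For instance, with a map in which camera information 1 and camera information 6 both resolve to the front camera, conflicts(1, 6) returns true and the flow enters step 906; if the two pieces of camera information resolve to different physical cameras, it returns false and the flow jumps to step 911.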
906: the camera service module obtains a system prioritization.
The camera service module ranks the AO service process and the application process according to their system priorities to obtain the system prioritization. In the system prioritization, the priority of the AO service process is higher than that of the application process.
907: The camera service module adjusts the system prioritization according to the custom process priority information.
As an example of the present application, in the custom process priority information the priority of the AO service process is lower than that of the application process, so in the adjusted prioritization the priority of the AO service process is ranked below that of the application process.
908: The camera service module closes the camera opened by the AO service process and opens the camera that the application process requested.
In one embodiment, after the camera service module opens the camera requested by the application process, step 912 below is entered. Because the AO service process has started the monitoring function, it receives the camera-state notification from the camera service module through that function and can thereby determine that the application process has opened the camera.
909: The application process captures frames through the camera.
Illustratively, after the application process opens the front-facing camera, it captures frames through it normally.
910: the application process requests the camera service module to close the camera.
In one embodiment, after the application process has finished using the front-facing camera, it requests the camera service module to close it. Correspondingly, the camera service module closes the front-facing camera that the application process had opened.
The camera service module then notifies the AO service process of the camera state, i.e., step 912 below is entered. Because the AO service process has enabled the monitoring function, it receives this notification through that function and can determine that the application process has closed the front-facing camera.
In the above embodiment, when the front-facing camera has already been opened by the AO service process and the application process also requests to open it, the prioritization of the two processes is determined according to the custom process priority information. When the priority of the AO service process is lower than that of the application process, the camera opened by the AO service process is closed and the camera requested by the application process is opened.
911: The camera service module opens, for the application process, the camera it requested.
For example, the camera service module opens the camera corresponding to camera information 1 for the application process, so that the application process can capture frames through it normally.
In one embodiment, after the camera service module opens the camera requested by the application process, it notifies the AO service process of the camera-state change, i.e., step 912 below is performed. The AO service process receives this notification through the monitoring function and can subsequently decide, according to the camera state, whether to execute the AO service. For example, when the AO service includes a keep-screen-on-while-gazing scenario, after receiving the notification the AO service process does not perform eye detection even if no user operation is detected for a period of time, because it has determined that the application process is using the camera corresponding to camera information 1.
After the application process has finished capturing frames, it requests the camera service module to close the camera corresponding to camera information 1. In one embodiment, after the camera service module closes the camera opened by the application process, it notifies the AO service process of the camera-state change, so that the AO service process can subsequently decide, according to the service requirement, whether to perform eye detection.
912: The camera service module notifies the AO service process of the camera state.
913: The AO service process receives the camera-state notification through the monitoring function.
914: The AO service process judges, according to the service requirement, whether the camera needs to be opened again.
Illustratively, after determining that the application process has closed the front-facing camera, the AO service process decides according to the service requirement whether to open it again; for example, if the AO service includes a keep-screen-on-while-gazing scenario, it determines to open the front-facing camera again when no user operation on the electronic device has been detected for a period of time.
Referring to fig. 10, fig. 10 is a schematic flowchart of a method for controlling a camera according to an embodiment of the present disclosure. By way of example and not limitation, the method may be applied to the electronic device and may include the following steps:
Step 1001: Open the target camera according to the first instruction of the first process.
As an example of the present application, the first process is an application process, and illustratively, the first process is a camera process.
The first instruction carries camera information, and a camera corresponding to the camera information is a target camera. The first instruction is used for requesting to open the target camera. In one example of the present application, the target camera is a front-facing camera.
As an example of the present application, this step is performed by the camera service module described above.
As an example of the present application, the target camera corresponds to a plurality of pieces of camera information, each of which corresponds to one camera capability. That is, one physical camera may correspond to one or more pieces of camera information, each corresponding to a camera capability.
Step 1002: the second process requests to open the target camera.
As an example of the present application, the second process is an AO service process, which performs a specified operation when, at an eye-detection opportunity, it detects that the user is gazing at the screen. The specified operation is determined by the scenarios the AO service includes; for example, when the AO service includes a keep-screen-on-while-gazing scenario, the specified operation is keeping the screen lit.
As an example of the present application, a specific implementation of the second process requesting to open the target camera may include: the second process acquires the camera information of a camera that can support the second process in implementing the corresponding service; the second process issues a second instruction requesting to open the camera corresponding to that camera information, the second instruction carrying the camera information; and if the camera information carried in the second instruction corresponds to the target camera, it is determined that the second process requests to open the target camera.
As an example of the present application, a specific implementation of the second process acquiring the camera information of a camera that can support it in implementing the corresponding service may include: the second process queries a camera list, where the camera list includes a plurality of pieces of camera information and each piece corresponds to one camera capability; and, according to the camera capability corresponding to each piece of camera information, the camera information whose capability can support the second process in implementing the corresponding service is determined from the plurality of pieces.
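A minimal Java sketch of this camera-list query follows; CameraList, CameraInfo, and the capability names are invented for illustration and are not the patent's data structures.

import java.util.List;
import java.util.Optional;

// Illustrative sketch of the camera list; names and capabilities are invented.
public final class CameraList {

    public enum Capability { PHOTO, VIDEO, EYE_DETECTION }

    /** One entry of "camera information", each tied to one camera capability. */
    public record CameraInfo(int id, Capability capability) {}

    private final List<CameraInfo> entries;

    public CameraList(List<CameraInfo> entries) {
        this.entries = entries;
    }

    /** Returns the camera information whose capability supports the service. */
    public Optional<CameraInfo> findByCapability(Capability needed) {
        return entries.stream()
                .filter(info -> info.capability() == needed)
                .findFirst();
    }
}

Under this sketch, a second process implementing eye detection would call findByCapability(Capability.EYE_DETECTION) and might, for instance, get back the entry whose id is 6.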
As an example of the present application, the second process issues the second instruction to the camera service module. Correspondingly, the camera service module judges whether the camera information carried in the second instruction corresponds to the target camera; if so, it determines that the second process requests to open the target camera.
It should be understood that a camera use conflict may arise when the second process requests to open the target camera. As an example of the present application, the camera service module determines whether such a conflict exists: if the camera information carried in the second instruction issued by the second process is the same as the camera information opened by the first process, a conflict exists; likewise, if the camera information carried in the second instruction corresponds to the same physical camera as the camera information opened by the first process, a conflict exists. If there is a conflict, the process proceeds to step 1003.
In addition, as an example of the present application, before the second process sends the second instruction, the second process starts a monitoring module, and the monitoring module is configured to monitor a state of the camera.
Step 1003: Determine the system prioritization of the first process and the second process; in the system prioritization, the system priority of the second process is higher than that of the first process.
As an example of the present application, a specific implementation of determining the system prioritization of the first process and the second process may include: acquiring the system priority score of the first process according to the process number of the first process, and the system priority score of the second process according to the process number of the second process; and determining the system prioritization of the first process and the second process from the two scores.
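This determination can be sketched in Java as below, assuming a hypothetical score lookup keyed by process number; the patent does not specify how the score itself is obtained, so ScoreLookup is an invented placeholder.

import java.util.Comparator;
import java.util.List;
import java.util.stream.Stream;

// Illustrative sketch only; ScoreLookup is a hypothetical helper.
public final class SystemRanker {

    /** Hypothetical lookup of a system priority score by process number. */
    @FunctionalInterface
    public interface ScoreLookup {
        int scoreForPid(int pid);
    }

    private final ScoreLookup lookup;

    public SystemRanker(ScoreLookup lookup) {
        this.lookup = lookup;
    }

    /** Orders the two process numbers by system priority score, highest first. */
    public List<Integer> systemRanking(int firstPid, int secondPid) {
        return Stream.of(firstPid, secondPid)
                .sorted(Comparator.comparingInt(
                        (Integer pid) -> lookup.scoreForPid(pid)).reversed())
                .toList();
    }
}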
Step 1004: Adjust the system prioritization to obtain a first prioritization of the first process and the second process.
As an example of the present application, a specific implementation of adjusting the system prioritization may include: arranging the priority of the first process before the priority of the second process according to the first identification information of the first process, the second identification information of the second process, and the process priority information, where the process priority information includes custom prioritization among different processes.
Illustratively, the first identification information of the first process is an application identifier, and the second identification information of the second process is a tag. For example, the second identification information is "aoservice", and the first identification information is an appid.
Illustratively, the process priority information is shown in table 1.
As an example of the present application, the process priority information includes the second identification information but not the first identification information. In this case, the priority of the first process is determined to be higher than that of the second process; that is, in the first prioritization the priority of the first process is ranked before that of the second process.
As another example of the present application, the process priority information includes both the first identification information and the second identification information. In this case, according to the priority information corresponding to each, the priority of the first process is determined to be higher than that of the second process, so in the first prioritization the priority of the first process is ranked before that of the second process.
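The two examples above suggest simple adjustment rules: an identifier absent from the custom table outranks a listed one, and two listed identifiers are compared by their table values. The Java sketch below illustrates this with invented table contents; it is not the patent's Table 1. The case where neither identifier is listed is handled by the fig. 12 embodiment later in this document, which falls back to the system prioritization.

import java.util.List;
import java.util.Map;

// Illustrative sketch of the custom priority adjustment; the table is invented.
public final class CustomRanker {

    // Smaller value = higher custom priority; the AO tag is deliberately low.
    private final Map<String, Integer> customPriority =
            Map.of("aoservice", 100, "com.example.someapp", 1);

    /** True when the custom table contains priority information for this id. */
    public boolean covers(String id) {
        return customPriority.containsKey(id);
    }

    /**
     * Returns the two ids ordered highest priority first. An id absent from
     * the table outranks a listed one, matching the first example above; when
     * both are listed, their table values are compared, matching the second.
     */
    public List<String> adjust(String firstId, String secondId) {
        int first = customPriority.getOrDefault(firstId, Integer.MIN_VALUE);
        int second = customPriority.getOrDefault(secondId, Integer.MIN_VALUE);
        return first <= second ? List.of(firstId, secondId)
                               : List.of(secondId, firstId);
    }
}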
Step 1005: Notify the second process, based on the first prioritization, that opening the target camera has failed.
Since the priority of the first process is higher than that of the second process in the first prioritization, the electronic device notifies the second process that opening the target camera has failed. The first process then continues to use the target camera.
It will be understood that, in one embodiment, when the first process and the second process open different cameras there is no conflict. For example, if the electronic device is configured with two front-facing cameras and the AO service process and the application process each open a different one, both processes open their front-facing cameras normally.
At this point the camera use conflict is resolved. As an example of the present application, the method further comprises the following step.
When the second process receives, through the monitoring module, the notification that the target camera has been closed, it determines according to the service requirement whether to re-request to open the target camera.
As described above, the second process has started the monitoring module; receiving the camera-closed notification through it indicates that the first process has closed the target camera. The second process then decides according to the service requirement whether to reopen the camera, for example determining to reopen the front-facing camera if no user operation is detected within a period of time.
In the embodiment of the application, the target camera is opened according to the first instruction of the first process. The second process requests to open the target camera, and the system prioritization of the first process and the second process is determined; in the system prioritization, the system priority of the second process is higher than that of the first process. The system prioritization is then adjusted to obtain the first prioritization, in which the priority of the first process is higher than that of the second process. The second process is therefore notified that opening the target camera has failed. This resolves the camera use conflict between the two processes.
As an example of the present application, after the first process closes the target camera and the second process has opened the target camera, the electronic device further performs the following operations; please refer to fig. 11.
Step 1101: the third process requests the target camera to be turned on.
As an example of the present application, the third process is an application process, and exemplarily, the third process is a camera process.
As an example of the application, the third process sends a third instruction to the camera service module, where the third instruction carries camera information of a camera to be opened.
Step 1102: Determine the system prioritization of the second process and the third process, in which the system priority of the second process is higher than that of the third process.
For the specific implementation, refer to the implementation for determining the system prioritization of the first process and the second process, which is not repeated here.
Step 1103: Adjust the system prioritization of the second process and the third process to obtain a second prioritization of the second process and the third process.
For the specific implementation, refer to the implementation for adjusting the system prioritization of the first process and the second process, which is not repeated here.
Step 1104: Close the target camera that the second process has opened, based on the second prioritization.
As an example of the application, in the second prioritization the priority of the third process is higher than that of the second process, so the electronic device closes the target camera that the second process had opened. Illustratively, the camera service module closes the target camera opened by the second process.
Step 1105: Open the target camera according to the request of the third process.
As an example of the present application, the camera service module opens the target camera according to the camera information carried in the third instruction.
The problem of camera use conflicts has been solved so far. As an example of the present application, the electronic device further performs the following operation after opening the target camera according to a request of the third process.
Step 1106: Send a notification of the camera-state change to the second process.
As an example of the present application, after opening the target camera according to the request of the third process, the camera service module may notify the second process, for example by sending a notification of the camera-state change.
Step 1107: The second process receives the notification of the camera-state change through the monitoring module.
As an example of the present application, after the second process receives the camera-state-change notification through the monitoring module, it can determine that the third process currently has the target camera open; therefore, even when an eye-detection opportunity arrives, the second process may refrain from executing the AO service.
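The notification path of steps 1106-1107 is, in effect, an observer pattern: the camera service keeps the registered monitoring modules and posts state changes to them. A minimal Java sketch with invented names follows.

import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

// Illustrative sketch of the camera-state notification; names are invented.
public final class CameraStateNotifier {

    /** Stand-in for the monitoring module registered by the second process. */
    public interface Listener {
        void onCameraStateChanged(int cameraInfoId, boolean opened);
    }

    private final List<Listener> listeners = new CopyOnWriteArrayList<>();

    public void register(Listener listener) {
        listeners.add(listener); // the second process registers its monitor
    }

    public void unregister(Listener listener) {
        listeners.remove(listener);
    }

    /** Step 1106: called after the camera service opens or closes a camera. */
    public void publish(int cameraInfoId, boolean opened) {
        for (Listener listener : listeners) {
            // Step 1107: each registered monitor receives the state change.
            listener.onCameraStateChanged(cameraInfoId, opened);
        }
    }
}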
It should be noted that the above description takes as an example the case in which the custom process priority information includes the priority information of the first process and/or the second process. Referring next to fig. 12, fig. 12 is a flowchart of a method of controlling a camera according to another exemplary embodiment. The method may include the following steps:
1201-1202: Refer to steps 1001 and 1002 in the embodiment shown in fig. 10; the description is not repeated here.
1203: Query, according to the first identification information and the second identification information, whether the custom process priority information includes the priority information of the first process and/or the priority information of the second process.
As an example of the present application, the first process is an application process and the second process is an AO service process.
If the custom process priority information includes the tag of the AO service process, it can be determined to include the priority information of the AO service process. Similarly, if it includes the application identifier, it can be determined to include the priority information of the application process.
In one embodiment, when the custom process priority information includes the priority information of the AO service process and/or the priority information of the application process, step 1204 below is performed. Otherwise, if it includes neither, step 1205 below is performed.
1204: Determine the first prioritization according to the first identification information, the second identification information, and the custom process priority information.
That is, before determining the prioritization of the AO service process and the application process from the tag, the application identifier, and the custom process priority information, the electronic device first queries whether the custom process priority information includes the tag of the AO service process and the application identifier, so as to determine whether it contains the priority information of each process. When the custom process priority information is determined to include the tag and/or the application identifier, the prioritization of the two processes can be determined from the custom process priority information.
1205: Determine the first prioritization according to the system priority of the first process and the system priority of the second process.
That is, if the custom process priority information includes neither the tag of the AO service process nor the application identifier, the prioritization of the AO service process and the application process is determined from their system priorities.
In the embodiment of the application, before the AO service process and the application process are ranked according to the custom process priority information, it is first queried whether that information includes the priority information of either process. If it does, the two processes are ranked according to the custom process priority information; otherwise they are ranked according to their system priorities. This ensures that, even when no custom priorities have been defined for the two processes, they can still be ranked by system priority, so that the process that may open the camera is determined and system disorder caused by the conflict is avoided.
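Putting the pieces together, the fig. 12 branch can be sketched as below, reusing the CustomRanker and SystemRanker types from the earlier sketches; all names remain invented illustrations rather than the patent's implementation.

import java.util.List;

// Illustrative sketch of steps 1203-1205, reusing earlier sketch types.
public final class PriorityDecider {

    private final CustomRanker custom = new CustomRanker();
    private final SystemRanker system;

    public PriorityDecider(SystemRanker system) {
        this.system = system;
    }

    /** Picks the source of the first prioritization for the two processes. */
    public List<String> firstRanking(String appId, int appPid,
                                     String aoTag, int aoPid) {
        // Step 1203: does the custom table mention either process?
        if (custom.covers(appId) || custom.covers(aoTag)) {
            return custom.adjust(appId, aoTag);               // step 1204
        }
        // Step 1205: fall back to the system prioritization.
        return system.systemRanking(appPid, aoPid).stream()
                .map(String::valueOf)
                .toList();
    }
}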
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by functions and internal logic of the process, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 13 is a structural block diagram of an apparatus for controlling a camera provided in an embodiment of the present application, corresponding to the method for controlling a camera described in the foregoing embodiment, and only the portions related to the embodiment of the present application are shown for convenience of description.
Referring to fig. 13, the apparatus includes a camera service module 1310 and a first process module 1320; the first process module 1320 is configured to run the second process. As an example of this application, the first process module 1320 is the AO module 52 described above.
A camera service module 1310 for opening a target camera according to a first instruction of a first process;
a first process module 1320, configured to request to open a target camera;
a camera service module 1310 configured to determine a system prioritization of the first process and the second process, wherein the system priority of the second process is higher than the system priority of the first process in the system prioritization;
a camera service module 1310, configured to adjust the system priority order to obtain a first priority order of the first process and the second process;
a camera service module 1310 configured to notify the second process that the opening of the target camera fails based on the first priority ranking.
As an example of the present application, the camera service module 1310 is configured to:
and arranging the priority of the first process before the priority of the second process according to the first identification information of the first process, the second identification information of the second process and the process priority information, wherein the process priority information comprises self-defined priority sequencing among different processes.
As an example of the present application:
the first process module 1320 is configured to obtain camera information of a camera that can support a second process to implement a corresponding service;
the first process module 1320 is configured to issue a second instruction for requesting to open a camera corresponding to the camera information to the camera service module 1310, where the second instruction carries the camera information;
if the camera information carried in the second instruction corresponds to the target camera, the camera service module 1310 is configured to determine that the second process requests to open the target camera.
As an example of the present application, the first process module 1320 is for:
inquiring a camera list, wherein the camera list comprises a plurality of pieces of camera information, and each piece of camera information in the plurality of pieces of camera information corresponds to one type of camera capability;
and determining the camera information capable of supporting the second process to realize the camera capability of the corresponding service from the plurality of camera information according to the camera capability corresponding to each camera information.
As an example of the present application:
a first process module 1320, configured to start a monitoring module, where the monitoring module is configured to monitor a state of a camera;
after the camera service module 1310 notifies the second process, based on the first prioritization, that opening the target camera has failed, the first process module 1320 is configured to determine, when the monitoring module receives the notification that the target camera has been closed, whether to re-request to open the target camera according to the service requirement.
As an example of the present application, the apparatus further includes a second process module 1330 configured to run the third process; illustratively, the second process module 1330 is the application module 54. After the first process closes the target camera and the second process has opened the target camera:
the second process module 1330 is configured to request to open the target camera;
the camera service module 1310 is configured to determine a system priority ranking of the second process and the third process, where the system priority of the second process is higher than the system priority of the third process;
the camera service module 1310 is configured to adjust the system prioritization of the second process and the third process to obtain a second prioritization of the second process and the third process;
the camera service module 1310 is configured to close the target camera that has been opened by the second process based on the second priority ranking;
the camera service module 1310 is configured to open the target camera according to a request of the third process.
As an example of the present application, after opening the target camera according to the request of the third process, the camera service module 1310 sends a notification of the camera-state change to the second process;
the first process module 1320 is configured to monitor a notification of a camera status change through the monitoring module.
As an example of the present application, the first process is an application process, the second process is an AO service process, and the third process is an application process; the AO service process is configured to perform a specified operation when, at an eye-detection opportunity, it detects that the user is gazing at the screen.
As an example of the present application, the target camera corresponds to a plurality of pieces of camera information, each of which corresponds to one camera capability.
In the embodiment of the application, the target camera is opened according to the first instruction of the first process. The second process requests to open the target camera, and the system prioritization of the first process and the second process is determined; in the system prioritization, the system priority of the second process is higher than that of the first process. The system prioritization is then adjusted to obtain the first prioritization, in which the priority of the first process is higher than that of the second process. The second process is therefore notified that opening the target camera has failed. This resolves the camera use conflict between the two processes.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. For the specific working processes of the units and modules in the system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described herein again.
In the above embodiments, the description of each embodiment has its own emphasis, and reference may be made to the related description of other embodiments for parts that are not described or recited in any embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described system embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above may be implemented by a computer program instructing relevant hardware; the computer program may be stored in a computer-readable storage medium, and when it is executed by a processor, the steps of the method embodiments described above may be implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to an electronic device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, or a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disc. In certain jurisdictions, in accordance with legislation and patent practice, computer-readable media may not include electrical carrier signals or telecommunications signals.
Finally, it should be noted that: the above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. A method of controlling a camera, the method comprising:
opening a target camera according to a first instruction of a first process, wherein the first process is an application program process;
a second process requests to open the target camera, wherein the second process is an AO service process, the AO service process is a system process, and the AO service process is configured to instruct other processes to execute a specified operation when detecting that a user gazes at a screen;
determining a system prioritization of the first process and the second process, the system priority of the second process being higher than the system priority of the first process in the system prioritization;
arranging the system priority of the first process before the system priority of the second process according to an application identifier of the first process, a tag of the second process, and custom process priority information, to obtain a first prioritization of the first process and the second process, wherein the process priority information comprises custom prioritization among different processes;
and notifying the second process that the target camera is failed to be opened based on the first priority sequence.
2. The method of claim 1, wherein the second process requesting the target camera to be opened comprises:
the second process acquires camera information of a camera which can support the second process to realize corresponding services;
the second process issues a second instruction for requesting to open a camera corresponding to the camera information, wherein the second instruction carries the camera information;
and if the camera information carried in the second instruction corresponds to the target camera, determining that a second process requests to open the target camera.
3. The method according to claim 2, wherein the acquiring, by the second process, the camera information of a camera capable of supporting the second process to implement the corresponding service comprises:
the second process inquires a camera list, wherein the camera list comprises a plurality of pieces of camera information, and each piece of camera information in the plurality of pieces of camera information corresponds to one type of camera capability;
and determining the camera information capable of supporting the second process to realize the camera capability of the corresponding service from the plurality of pieces of camera information according to the camera capability corresponding to each piece of camera information.
4. The method according to any one of claims 1-3, further comprising:
the second process starts a monitoring module, and the monitoring module is used for monitoring the state of the camera;
after notifying the second process that the target camera is failed to be opened based on the first priority ranking, the method further comprises:
and determining whether to re-request to open the target camera according to business requirements under the condition that the second process monitors the notification that the target camera is closed through the monitoring module.
5. The method of claim 1, wherein after the first process turns off the target camera and the second process has turned on the target camera, the method further comprises:
a third process requests to open the target camera, wherein the third process is an application program process;
determining a system priority ordering of the second process and the third process, the system priority of the second process being higher than the system priority of the third process;
ranking the system priority of the second process after the system priority of the third process to obtain a second prioritization of the second process and the third process;
closing the target camera on which the second process is opened based on the second priority ranking;
and opening the target camera according to the request of the third process.
6. The method of claim 5, wherein after the opening the target camera according to the request of the third process, further comprising:
sending a notification of camera state change to the second process;
and the second process monitors the notification of the change of the state of the camera through a monitoring module, and the monitoring module is used for monitoring the state of the camera.
7. The method of any of claims 1-2, 5-6, wherein the target camera has a plurality of camera information corresponding thereto, each camera information of the plurality of camera information corresponding to a camera capability.
8. An electronic device, comprising a memory and a processor;
the memory is configured to store a program that supports the electronic device in performing the method of any one of claims 1-7, and to store data involved in implementing the method of any one of claims 1-7; the processor is configured to execute the program stored in the memory.
9. A computer-readable storage medium having instructions stored thereon, which when run on a computer, cause the computer to perform the method of any one of claims 1-7.
CN202110911919.6A 2021-08-09 2021-08-09 Method for controlling camera, electronic device and computer readable storage medium Active CN113778641B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110911919.6A CN113778641B (en) 2021-08-09 2021-08-09 Method for controlling camera, electronic device and computer readable storage medium
PCT/CN2022/089557 WO2023015956A1 (en) 2021-08-09 2022-04-27 Method for controlling camera, and electronic device and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110911919.6A CN113778641B (en) 2021-08-09 2021-08-09 Method for controlling camera, electronic device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN113778641A CN113778641A (en) 2021-12-10
CN113778641B true CN113778641B (en) 2022-08-19

Family

ID=78837281

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110911919.6A Active CN113778641B (en) 2021-08-09 2021-08-09 Method for controlling camera, electronic device and computer readable storage medium

Country Status (2)

Country Link
CN (1) CN113778641B (en)
WO (1) WO2023015956A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113778641B (en) * 2021-08-09 2022-08-19 荣耀终端有限公司 Method for controlling camera, electronic device and computer readable storage medium
CN116700815B (en) * 2022-10-21 2024-04-26 荣耀终端有限公司 Hardware resource control method, electronic device and readable storage medium
CN117725941A (en) * 2024-02-02 2024-03-19 荣耀终端有限公司 Code scanning method and electronic equipment
CN117971305A (en) * 2024-03-28 2024-05-03 荣耀终端有限公司 Upgrading method of operating system, server and electronic equipment

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108429879A (en) * 2018-02-13 2018-08-21 广东欧珀移动通信有限公司 Electronic equipment, camera control method and Related product

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100785104B1 (en) * 2005-10-19 2007-12-12 엘지전자 주식회사 Apparatus and method for resource management of mobile telecommunication terminal
EP2504997A4 (en) * 2009-11-27 2013-07-17 Sentry Technology Corp Enterprise management system and auditing method employed thereby
US9400677B2 (en) * 2013-01-02 2016-07-26 Apple Inc. Adaptive handling of priority inversions using transactions
CN103677848B (en) * 2013-12-27 2018-11-16 厦门雅迅网络股份有限公司 A kind of camera control method based on Android
CN103793246B (en) * 2014-01-22 2017-09-05 深圳Tcl新技术有限公司 Coordinate the method and system of camera resource
CN106339268B (en) * 2016-09-12 2018-01-16 广东欧珀移动通信有限公司 application control method and intelligent terminal
US10212326B2 (en) * 2016-11-18 2019-02-19 Microsoft Technology Licensing, Llc Notifications for control sharing of camera resources
CN107589994B (en) * 2017-08-16 2020-12-01 深圳市爱培科技术股份有限公司 Method, device, system and storage medium for managing application process priority
CN109462726B (en) * 2017-09-06 2021-01-19 比亚迪股份有限公司 Camera control method and device
CN112399232A (en) * 2019-08-18 2021-02-23 海信视像科技股份有限公司 Display equipment, camera priority use control method and device
CN112363836A (en) * 2020-11-12 2021-02-12 四川长虹电器股份有限公司 Android system camera resource control method
CN113778641B (en) * 2021-08-09 2022-08-19 荣耀终端有限公司 Method for controlling camera, electronic device and computer readable storage medium

Also Published As

Publication number Publication date
WO2023015956A1 (en) 2023-02-16
CN113778641A (en) 2021-12-10

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant