CN111432156A - Image processing method and device, computer readable medium and terminal equipment - Google Patents


Info

Publication number
CN111432156A
Authority
CN
China
Prior art keywords
terminal
rotation
sensor
screen
video data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010266485.4A
Other languages
Chinese (zh)
Inventor
刘宇宁
白政英
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Oppo Communication Technology Co., Ltd.
Original Assignee
Chengdu Oppo Communication Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Oppo Communication Technology Co., Ltd.
Priority to CN202010266485.4A
Publication of CN111432156A
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/141Systems for two-way working between two video terminals, e.g. videophone
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to the field of electronic device technologies, and in particular, to an image processing method, an image processing apparatus, a computer-readable medium, and a terminal device. The method includes the following steps: the first terminal, in response to a triggered target event, collects video data and acquires a screen direction parameter; when it is identified according to the screen direction parameter that the screen of the first terminal has flipped, sensor parameters of a target sensor are acquired to determine state information of the first terminal; when the first terminal is determined to be in a stable state, a rotation selection menu is displayed in an interactive interface of the first terminal, wherein the rotation selection menu includes a confirm-rotation option; and in response to the confirm-rotation option being selected, the rotated video data is displayed at a second terminal in a video call with the first terminal. The method enables the user to autonomously choose whether to correct the video viewing angle and performs the correction automatically, so that video-call users can conveniently watch the video at a normal viewing angle.

Description

Image processing method and device, computer readable medium and terminal equipment
Technical Field
The present disclosure relates to the field of electronic device technologies, and in particular, to an image processing method, an image processing apparatus, a computer-readable medium, and a terminal device.
Background
The functions of intelligent mobile terminal devices such as mobile phones are increasingly rich; besides the basic communication function, video calls can be made through application programs. In the prior art, during a video call, if one terminal of the call shoots in an abnormal video direction, for example when the terminal is turned by 90°, 180°, or 270°, the other terminal of the call receives a video image whose viewing angle is flipped, which is not conducive to the user watching the video image normally.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure provides an image processing method, an image processing apparatus, a computer-readable medium, and a terminal device, which can automatically correct the angle of the video picture during a video call.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided an image processing method including:
the first terminal responds to the triggered target event, collects video data and acquires a screen direction parameter;
when the first terminal screen is identified to turn over according to the screen direction parameters, acquiring sensor parameters of a target sensor to determine state information of the first terminal;
when the first terminal is determined to be in a stable state, displaying a rotation selection menu in an interactive interface of the first terminal; wherein the rotation selection menu includes a confirm rotation option;
and in response to the confirm-rotation option being selected, displaying the rotated video data at a second terminal in a video call with the first terminal.
According to a second aspect of the present disclosure, there is provided an image processing apparatus comprising:
the screen direction parameter acquisition module is used for the first terminal to respond to the triggered target event, acquire video data and acquire a screen direction parameter;
the state information acquisition module is used for acquiring sensor parameters of a target sensor to determine state information of the first terminal when the first terminal screen is identified to turn over according to the screen direction parameters;
the menu display control module is used for displaying a rotary selection menu in an interactive interface of the first terminal when the first terminal is determined to be in a stable state; wherein the rotation selection menu includes a confirm rotation option;
and the rotation execution module is used for displaying the rotated video data on a second terminal in a video call with the first terminal in response to the confirm-rotation option being selected.
According to a third aspect of the present disclosure, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements the image processing method described above.
According to a fourth aspect of the present disclosure, there is provided a terminal device comprising:
one or more processors;
a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the image processing method described above.
According to the image processing method provided by the embodiments of the present disclosure, video data is collected when the triggered target event is identified; meanwhile, the screen direction parameter of the terminal at the current moment is obtained, and whether the terminal is in a stable state is determined according to the sensor parameters. When the current screen of the terminal has flipped and the terminal is in a stable state, a rotation selection menu can be provided, so that the user can choose whether to rotate the current video viewing angle according to actual requirements, realizing autonomous selection by the user. Then, when the user chooses to rotate, the viewing angle of the current video data can be automatically restored to the normal viewing angle, so that the second terminal on the receiving side receives the video at the normal viewing angle, which is convenient for the user on the second terminal side to watch.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 schematically illustrates a flow diagram of an image processing method in an exemplary embodiment of the disclosure;
FIG. 2 schematically illustrates a system architecture diagram in an exemplary embodiment of the disclosure;
FIG. 3 schematically illustrates a flow diagram of an image processing method in an exemplary embodiment of the disclosure;
FIG. 4 schematically illustrates a diagram showing a rotation selection menu in an exemplary embodiment of the present disclosure;
fig. 5 is a diagram schematically illustrating a video screen of a screen flip when a first terminal initiates a video call in an exemplary embodiment of the present disclosure;
fig. 6 is a schematic diagram illustrating a video screen after performing view flipping in a video call initiated by a first terminal in an exemplary embodiment of the disclosure;
fig. 7 is a diagram schematically illustrating a video picture without performing view flipping received by a second terminal in an exemplary embodiment of the present disclosure;
fig. 8 is a diagram schematically illustrating a video picture received by a second terminal and performing view flipping in an exemplary embodiment of the present disclosure;
fig. 9 schematically illustrates a composition diagram of an image processing apparatus in an exemplary embodiment of the present disclosure;
fig. 10 schematically illustrates an electronic device structure diagram of a terminal device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
In existing intelligent terminal devices, such as mobile phones and tablet computers, when a video call is performed, if the initiator terminal of the video call shoots in an abnormal video direction (rotated by 90°, 180°, or 270°), the receiver of the video call will receive video data whose picture viewing angle is flipped. This makes the video call function very unfriendly, because a user may subjectively or objectively choose such an orientation of the mobile terminal device before the video, for example when the charging port of the mobile phone is occupied and the phone cannot be placed vertically; the receiving-side user then receives a rotated picture, which is not conducive to normal viewing.
in view of the above-described drawbacks and deficiencies of the prior art, an image processing method is provided in the present exemplary embodiment. Referring to fig. 1, the image processing method described above may include the steps of:
s11, the first terminal responds to the triggered target event, collects video data and obtains screen direction parameters;
s12, acquiring sensor parameters of a target sensor to determine state information of the first terminal when the first terminal screen is identified to turn over according to the screen direction parameters;
s13, when the first terminal is determined to be in a stable state, displaying a rotation selection menu in an interactive interface of the first terminal; wherein the rotation selection menu includes a confirm rotation option;
and S14, in response to the confirm-rotation option being selected, displaying the rotated video data on the second terminal in a video call with the first terminal.
In the image processing method provided by this exemplary embodiment, on one hand, video data starts to be acquired when the triggered target event is identified; meanwhile, the screen direction parameter of the terminal at the current moment is acquired, and whether the terminal is in a stable state is determined according to the sensor parameters. When the current screen of the terminal has flipped and the terminal is in a stable state, a rotation selection menu can be provided, so that the user can choose whether to rotate the current video viewing angle according to actual requirements, realizing autonomous selection by the user. On the other hand, when the user chooses to perform the rotation, the viewing angle of the current video data can be automatically restored to the normal viewing angle, so that the second terminal on the receiving side receives the video at the normal viewing angle, which is convenient for the user on the second terminal side to watch.
Hereinafter, each step of the image processing method in the present exemplary embodiment will be described in more detail with reference to the drawings and examples.
In step S11, the first terminal collects video data in response to the triggered target event, and acquires a screen direction parameter.
For example, the target event may be an activation operation of a video call in an application program, such as a selection operation on a video call control in WeChat or QQ, or an activation of the video call function in a system tool of the terminal device, such as a selection operation on a VoLTE video call, and the like.
For example, referring to fig. 2, at the initiating end of a video call, a first terminal 101 may conduct a video call with a second terminal 102 through a network 103. Network 103 may include, among other things, various types of connections such as wired communication links, wireless communication links, and so forth. The first terminal 101 and the second terminal 102 may be smart phones or tablet computers equipped with camera components.
For the initiating end of the video call, when the first terminal 101 starts the video call function, it can begin to collect video data through the front camera or the rear camera; meanwhile, the screen direction parameter can be acquired at the first terminal. Specifically, when the target event is triggered, a rotation-angle acquisition process may be created, and the screen rotation angle may be read to acquire the rotation value of the current screen. For example, when video data starts to be collected, the rotation value of the screen may be obtained by the getRotation() method of the Display class, and it can then be determined whether the orientation of the first terminal has changed. When the return value is 180, the screen is turned upside down; when the return value is 90, the screen has been rotated by 90°. For example, referring to fig. 5, when the first terminal initiates a video call with its screen turned by 180°, the small window in the current picture is the video with a flipped viewing angle collected by the first terminal, and the large window is the video picture with a normal viewing angle collected by the second terminal. Correspondingly, for the second terminal, referring to fig. 7, the large window displays the video picture received from the first terminal at the abnormal viewing angle, and the small window displays the video picture of its own user at the normal viewing angle collected by the second terminal.
Or, if the read rotation value is 0, it indicates that the screen is not rotated. At this time, the periodic monitoring of the screen direction parameters can be executed, so that whether the screen direction is changed or not can be monitored in real time in the video call process until the video call is finished. For example, the monitoring period may be configured according to user requirements, for example, the monitoring period is configured to be 1s, 3s, and the like.
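As a minimal sketch of this step (assuming the reading happens inside an Activity), the rotation value can be obtained through the Display class and mapped to degrees; Display.getRotation() and the Surface.ROTATION_* constants are real Android APIs, while the wrapper class below is illustrative only:

```java
// Minimal sketch (assumption: running inside an Activity). The mapping from
// Surface.ROTATION_* constants to degrees and the class itself are illustrative.
import android.app.Activity;
import android.view.Display;
import android.view.Surface;

public final class ScreenRotationReader {

    /** Returns the current screen rotation in degrees: 0, 90, 180, or 270. */
    public static int readRotationDegrees(Activity activity) {
        Display display = activity.getWindowManager().getDefaultDisplay();
        switch (display.getRotation()) {
            case Surface.ROTATION_90:  return 90;   // screen rotated by 90°
            case Surface.ROTATION_180: return 180;  // screen upside down
            case Surface.ROTATION_270: return 270;
            case Surface.ROTATION_0:
            default:                   return 0;    // no rotation
        }
    }
}
```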
Step S12, when the first terminal screen is identified to turn over according to the screen direction parameter, acquiring the sensor parameter of a target sensor to determine the state information of the first terminal;
in this exemplary embodiment, when the first terminal determines that the screen rotation occurs, the sensor parameters of the plurality of sensors at the current time may be acquired, and the state information of the first terminal at the current time may be determined. Specifically, the output parameters of the gravity sensor and the output parameters of the acceleration sensor may be collected. Specifically, the step S12 may include:
step S121, calling a sensor management service to obtain an output parameter of the gravity sensor and an output parameter of the acceleration sensor;
step S122, when the output parameters of the gravity sensor and the acceleration sensor are consistent, judging that the first terminal is in a stable state; or
And S123, judging that the first terminal is in an unstable state when the output parameters of the gravity sensor and the acceleration sensor are not consistent.
For example, the output of the gravity sensor may be obtained through Sensor.TYPE_GRAVITY in the sensor management service (SensorManager), and the output of the acceleration sensor may be obtained through Sensor.TYPE_LINEAR_ACCELERATION in the sensor management service. When the output results of the two sensors are the same or similar, or the difference between their values is within a certain range, it is determined that the first terminal is in a stable state; for example, when the device is in a stationary, stable state, the two outputs are the same. When the output results of the two sensors differ, it is determined that the first terminal is in an unstable state, and the process may roll back to step S11 to restart the recognition of the screen direction in the next monitoring period.
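A minimal sketch of this stability check follows, under an assumption: the "acceleration sensor" is taken to be the raw accelerometer (Sensor.TYPE_ACCELEROMETER), whose output coincides with the gravity sensor's output when the device is stationary (i.e., the linear acceleration is near zero). The tolerance value and the class layout are illustrative, not the patent's implementation:

```java
// Hedged sketch: the device is judged "stable" when gravity and accelerometer
// readings agree on every axis within a tolerance. Threshold is an assumption.
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class StabilityDetector implements SensorEventListener {
    private static final float TOLERANCE = 0.5f; // m/s^2, assumed threshold
    private final float[] gravity = new float[3];
    private final float[] accel = new float[3];

    public StabilityDetector(Context context) {
        SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_GRAVITY),
                SensorManager.SENSOR_DELAY_NORMAL);
        sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_GRAVITY) {
            System.arraycopy(event.values, 0, gravity, 0, 3);
        } else if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            System.arraycopy(event.values, 0, accel, 0, 3);
        }
    }

    /** Stable when the two outputs agree on every axis within the tolerance. */
    public boolean isStable() {
        for (int i = 0; i < 3; i++) {
            if (Math.abs(gravity[i] - accel[i]) > TOLERANCE) {
                return false; // outputs inconsistent: unstable state
            }
        }
        return true;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this check.
    }
}
```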
Step S13, when the first terminal is determined to be in a stable state, displaying a rotation selection menu in an interactive interface of the first terminal; wherein the rotation selection menu includes a confirm rotation option;
in this exemplary embodiment, for example, as shown in the flowchart of fig. 3, when the first terminal is judged to be in the stable state currently according to the parameter of the sensor, a popup window including a rotation selection menu may be generated, and the popup window is displayed in the interactive interface of the first terminal for the user to select. The rotation selection menu may include a plurality of options; for example, referring to FIG. 4, the rotation selection menu 40 may include a confirm rotation option, a do not rotate option, and a do not rotate and no longer alert option.
Step S14, in response to the confirm-rotation option being selected, displaying the rotated video data on the second terminal in a video call with the first terminal.
In this example embodiment, after the user selects the execution rotation option in the interactive interface of the first terminal, a corresponding rotation control instruction may be generated. The first terminal executes the rotation control instruction.
For example, the rotation may be performed on the first terminal side, and specifically may include:
step S211, determining a rotation angle to be executed according to the screen direction parameter;
step S212, the first terminal invokes a media recording interface to perform a fixed rotation on the video data according to the rotation angle, and sends the rotated video data to the second terminal so that the second terminal displays the rotated video data.
For example, after it is determined to perform rotation, a corresponding rotation angle may be obtained according to the screen direction parameter obtained in the preceding step, and the view angle of the video data may be restored to the normal view angle through the rotation angle. For example, when the return value of the screen direction parameter is 90, which indicates that the first terminal screen is rotated by 90 °, the corresponding to-be-executed rotation angle is 270 °; or, when the return value of the screen direction parameter is 180 degrees, which indicates that the first terminal screen is rotated by 180 degrees, the corresponding rotation angle to be executed is 180 degrees.
Alternatively, a screen flip angle may be determined according to the screen direction parameter, the angle range corresponding to that flip angle may be determined, and the rotation angle to be executed may then be determined according to the angle range. For example, a fuzzy judgment can be made on the rotation angle: if the deviation from a standard angle is less than 20°, the video rotation is performed at that standard angle. For example, when the recognized screen rotation is between 70° and 100°, it is determined to be a 90° rotation.
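The two angle rules above can be sketched as follows, assuming the 20° figure is a tolerance around each standard angle (so a reading between 70° and 100° snaps to 90°); the helper names are hypothetical:

```java
// Small sketch of the angle logic: snap a measured flip angle to the nearest
// standard angle, and compute the complementary correction to apply.
public final class RotationAngles {
    private static final int TOLERANCE = 20; // degrees, per the fuzzy judgment

    /** Snap a measured flip angle to 0/90/180/270, or -1 if none is close enough. */
    public static int snapToStandard(int measuredDegrees) {
        int[] standards = {0, 90, 180, 270};
        for (int s : standards) {
            int diff = ((measuredDegrees - s) % 360 + 360) % 360; // 0..359
            diff = Math.min(diff, 360 - diff);                    // shortest distance
            if (diff < TOLERANCE) {
                return s;
            }
        }
        return -1; // ambiguous reading: do not rotate
    }

    /** Correction to apply so the video returns to the normal viewing angle. */
    public static int correctionFor(int screenRotationDegrees) {
        // 90° flip -> rotate 270°; 180° -> 180°; 270° -> 90°; 0° -> 0°.
        return (360 - screenRotationDegrees) % 360;
    }
}
```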
In this exemplary embodiment, after the rotation angle is determined, the first terminal may utilize the MediaRecorder setOrientationHint() interface to perform a fixed rotation on the video data at the first terminal side, and then transmit the rotated video to the second terminal, so that the video with the normal viewing angle after the viewing-angle correction is displayed at the second terminal side. For example, referring to fig. 8, the second terminal displays the video at the normal viewing angle after the correction; in addition, the first terminal may display the corrected video in a small window, as shown in fig. 6. If the viewing angle is not corrected, the second terminal displays the video with the flipped viewing angle, as shown in fig. 7.
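The sender-side call itself is brief; the fragment below shows it in isolation, with the rest of the MediaRecorder configuration omitted:

```java
// Hedged fragment: setOrientationHint() is a real MediaRecorder API that writes
// a rotation hint into the recorded stream. It must be called before prepare()
// and accepts only 0, 90, 180, or 270. The surrounding recorder setup (sources,
// encoders, output) is omitted here.
import android.media.MediaRecorder;

public final class SenderSideRotation {

    public static void applyCorrection(MediaRecorder recorder, int correctionDegrees) {
        recorder.setOrientationHint(correctionDegrees); // e.g. 270 for a 90° screen flip
    }
}
```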
Alternatively, in this exemplary embodiment, for example, the rotation may be performed on the second terminal side, and specifically, the method may include:
step S221, determining a rotation angle to be executed according to the screen direction parameter;
step S222, sending the video data and the rotation angle to the second terminal, so that the second terminal rotates the video data according to the rotation angle and displays the rotated video data.
Specifically, after the first terminal determines the rotation angle to be executed, the collected video data and the rotation angle may be packaged and sent to the second terminal, so that after receiving the video data the second terminal rotates the video on its side according to the rotation angle. In addition, the data packet may further include a rotation tag for marking that the video needs to be rotated, and the tag may be written into the header of the data packet.
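The patent does not specify how the video data, rotation angle, and rotation tag are packaged, so the following sketch shows one hypothetical header layout purely for illustration:

```java
// Purely illustrative wire format: a 9-byte header (magic, rotation flag,
// angle, payload length) prepended to a video frame. All values are assumptions.
import java.nio.ByteBuffer;

public final class RotatedVideoPacket {
    private static final short MAGIC = 0x5256;          // "RV", hypothetical marker
    private static final byte FLAG_NEEDS_ROTATION = 1;  // the rotation tag

    /** Prepend a header carrying the rotation tag and angle to one video frame. */
    public static byte[] pack(byte[] videoData, int rotationDegrees) {
        ByteBuffer buf = ByteBuffer.allocate(9 + videoData.length);
        buf.putShort(MAGIC);
        buf.put(FLAG_NEEDS_ROTATION);            // marks that the video must be rotated
        buf.putShort((short) rotationDegrees);   // angle the receiver should apply
        buf.putInt(videoData.length);            // payload length
        buf.put(videoData);
        return buf.array();
    }
}
```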
In addition, when the second terminal receives the video data including the rotation angle, a selection menu of the angle of view rotation may be generated in an interactive interface of the second terminal, and a user of the second terminal may determine whether to rotate the angle of view of the received video data.
Alternatively, in another exemplary embodiment of the present disclosure, if the user selects not to execute the rotation option, the method may further include:
step S31, in response to the selected non-execution rotation option, the first terminal sends the collected video data to the second terminal; and
step S32, acquiring a screen direction parameter of the first terminal at the current moment according to a preset time period, and acquiring a sensor parameter of a target sensor to determine the state information of the first terminal when recognizing screen turnover according to the screen direction parameter; when the first terminal is determined to be in a stable state, displaying a rotation selection menu in an interactive interface of the first terminal; or
And step S33, when the screen is not turned over according to the screen direction parameter, sending the video data to the second terminal.
For example, if the user chooses not to perform the flipping, the captured video data may be directly sent to the second terminal. At this time, the video displayed by the second terminal is a video with an abnormal viewing angle with an inverted viewing angle, as shown in fig. 7.
In addition, during the video call, the screen direction and the sensor parameters can be obtained again at the time nodes of the preset period, and based on them it is judged whether the viewing angle of the video data at the first terminal side needs to be flipped in the current video call, until the current video call ends.
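A minimal sketch of such periodic monitoring, assuming a Handler-driven loop on the main thread and an illustrative 1 s period (the period is configurable, as noted above), might look as follows:

```java
// Minimal sketch: re-check the screen direction every monitoring period until
// the call ends. The period and class structure are assumptions.
import android.os.Handler;
import android.os.Looper;

public class ScreenDirectionMonitor {
    private static final long PERIOD_MS = 1000; // e.g. 1 s or 3 s
    private final Handler handler = new Handler(Looper.getMainLooper());
    private boolean running;

    /** checkScreenDirection would re-read getRotation() and re-run steps S12-S13. */
    public void start(final Runnable checkScreenDirection) {
        running = true;
        handler.post(new Runnable() {
            @Override
            public void run() {
                if (!running) {
                    return;
                }
                checkScreenDirection.run();
                handler.postDelayed(this, PERIOD_MS); // schedule the next check
            }
        });
    }

    /** Call when the video call ends. */
    public void stop() {
        running = false;
        handler.removeCallbacksAndMessages(null);
    }
}
```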
Alternatively, in this exemplary embodiment, when the user selects the do-not-rotate-and-do-not-remind-again option, the screen direction parameter is no longer collected during the current video call, until the video call ends.
According to the method provided by the embodiments of the present disclosure, when a user starts the video call function, recognition of the screen direction parameter is triggered; when it is recognized that the current screen of the terminal has rotated, the current state is determined through the plurality of sensor parameters, so that the video data can be flipped in a stable state. The transmitted video is then video data with a normal viewing angle, which prevents the other end of the video call from receiving video data with a flipped viewing angle and ensures that normal video data can be displayed at the other end. In addition, during the video call, the first terminal can periodically acquire the screen direction parameter, so that the viewing angle of the video data can be corrected in time when the user turns the phone over during the call.
It is to be noted that the above-mentioned figures are only schematic illustrations of the processes involved in the method according to an exemplary embodiment of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Further, referring to fig. 9, the present exemplary embodiment also provides an image processing apparatus 90, applied to a terminal device, including: a screen direction parameter obtaining module 901, a state information obtaining module 902, a menu display control module 903, and a rotation performing module 904. The screen direction parameter obtaining module 901 may be configured for the first terminal to collect video data and obtain a screen direction parameter in response to a triggered target event.
The state information obtaining module 902 may be configured to obtain a sensor parameter of a target sensor to determine the state information of the first terminal when it is identified according to the screen direction parameter that the screen of the first terminal has flipped.
The menu display control module 903 may be configured to display a rotation selection menu in an interactive interface of the first terminal when it is determined that the first terminal is in a stable state; wherein the rotation selection menu includes a confirm rotation option.
The rotation performing module 904 may be configured to display the rotated video data at a second terminal in a video call with the first terminal in response to the confirm-rotation option being selected.
In an example of the present disclosure, the screen direction parameter obtaining module 901 may be configured to create a rotation-angle acquisition process in response to the triggered target event, so as to acquire the rotation value of the first terminal screen.
In one example of the present disclosure, the sensor parameters include: the output parameters of the gravity sensor and the output parameters of the acceleration sensor.
The status information obtaining module 902 may include: a sensor parameter acquisition unit, a first state judgment unit, and a second state judgment unit (not shown in the figure).
The sensor parameter acquiring unit may be configured to invoke a sensor management service to acquire an output parameter of the gravity sensor and an output parameter of the acceleration sensor.
The first state determination unit may be configured to determine that the first terminal is in a stable state when the output parameter of the gravity sensor is consistent with the output parameter of the acceleration sensor.
The second state determination unit may be configured to determine that the first terminal is in an unstable state when the output parameter of the gravity sensor and the output parameter of the acceleration sensor are not identical.
In an example of the present disclosure, the rotation performing module 904 may include: a first rotation angle obtaining unit and a first rotation executing unit (not shown).
The first rotation angle obtaining unit may be configured to determine a rotation angle to be executed according to the screen direction parameter.
The first rotation executing unit may be configured to invoke a media recording port by the first terminal to perform fixed rotation on the video data according to the rotation angle, and send the rotated video data to the second terminal so that the second terminal displays the rotated video data.
In an example of the present disclosure, the rotation performing module 904 may include: a second rotation angle obtaining unit and a second rotation executing unit (not shown in the figure).
the second rotation angle obtaining unit may be configured to determine a rotation angle to be executed according to the screen direction parameter.
The second rotation executing unit may be configured to send the video data and the rotation angle to the second terminal, so that the second terminal rotates the video data according to the rotation angle and displays the rotated video data.
In one example of the present disclosure, the rotation selection menu further includes a do-not-rotate option. The apparatus 90 may further comprise: a second execution unit (not shown in the figure).
The second execution unit may be configured to, in response to the selected non-execution rotation option, send the captured video data to the second terminal by the first terminal; acquiring a screen direction parameter of the first terminal at the current moment according to a preset time period, and acquiring a sensor parameter of a target sensor to determine state information of the first terminal when the screen is identified to turn over according to the screen direction parameter; when the first terminal is determined to be in a stable state, displaying a rotation selection menu in an interactive interface of the first terminal; or when the screen is identified not to be turned over according to the screen direction parameter, the video data is sent to the second terminal.
In an example of the present disclosure, the first rotation angle obtaining unit may be configured to determine a screen turning angle according to the screen direction parameter, and determine an angle range corresponding to the screen turning angle; and determining the corresponding rotation angle to be executed according to the angle range.
The details of each module in the image processing apparatus are already described in detail in the corresponding image processing method, and therefore, the details are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Fig. 10 shows a schematic structural diagram of an electronic device suitable for implementing an embodiment of the present disclosure.
It should be noted that the electronic device 600 shown in fig. 10 is only an example, and should not bring any limitation to the functions and the scope of the embodiments of the present disclosure.
As shown in fig. 10, the electronic device 600 may specifically include: a processor 610, an internal memory 621, an external memory interface 622, a Universal Serial Bus (USB) interface 630, a charging management module 640, a power management module 641, a battery 642, an antenna 1, an antenna 2, a mobile communication module 650, a wireless communication module 660, an audio module 670, a speaker 671, a receiver 672, a microphone 673, an earphone interface 674, a sensor module 680, a display screen 690, a camera module 691, an indicator 692, a motor 693, buttons 694, and a Subscriber Identity Module (SIM) card interface 695. The sensor module 680 may include a depth sensor 6801, a pressure sensor 6802, a gyroscope sensor 6803, an air pressure sensor 6804, a magnetic sensor 6805, an acceleration sensor 6806, a distance sensor 6807, a proximity light sensor 6808, a fingerprint sensor 6809, a temperature sensor 6810, a touch sensor 6811, an ambient light sensor 6812, and a bone conduction sensor 6813.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the electronic device 600. In other embodiments of the present application, the electronic device 600 may include more or fewer components than illustrated, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 610 may include one or more processing units, such as: the Processor 610 may include an Application Processor (AP), a modem Processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband Processor, and/or a Neural Network Processor (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 610 for storing instructions and data. The memory may store instructions for implementing six modular functions: detection instructions, connection instructions, information management instructions, analysis instructions, data transmission instructions, and notification instructions, and execution is controlled by the processor 610. In some embodiments, the memory in the processor 610 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 610. If the processor 610 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 610, thereby increasing the efficiency of the system.
In some embodiments, processor 610 may include one or more interfaces. The Interface may include an Integrated Circuit (I2C) Interface, an Inter-Integrated Circuit built-in audio (I2S) Interface, a Pulse Code Modulation (PCM) Interface, a Universal Asynchronous Receiver Transmitter (UART) Interface, a Mobile Industry Processor Interface (MIPI), a General-purpose input/Output (GPIO) Interface, a Subscriber Identity Module (SIM) Interface, and/or a Universal Serial Bus (USB) Interface, etc.
The I2C interface is a bidirectional synchronous serial bus that includes a Serial Data line (SDA) and a Serial Clock line (SCL). In some embodiments, the processor 610 may include multiple sets of I2C buses. The processor 610 may be coupled to the touch sensor 6811, the charger, the flash, the camera module 691, etc. via different I2C bus interfaces. For example, the processor 610 may be coupled to the touch sensor 6811 via an I2C interface, so that the processor 610 and the touch sensor 6811 communicate via the I2C bus interface, thereby implementing the touch function of the electronic device 600.
The I2S interface may be used for audio communication. In some embodiments, processor 610 may include multiple sets of I2S buses. The processor 610 may be coupled to the audio module 670 via an I2S bus to enable communication between the processor 610 and the audio module 670. In some embodiments, the audio module 670 may communicate audio signals to the wireless communication module 660 via an I2S interface to enable answering a call via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 670 and the wireless communication module 660 may be coupled by a PCM bus interface. In some embodiments, the audio module 670 may also transmit audio signals to the wireless communication module 660 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 610 and the wireless communication module 660. For example: the processor 610 communicates with the bluetooth module in the wireless communication module 660 through the UART interface to implement the bluetooth function. In some embodiments, the audio module 670 may transmit the audio signal to the wireless communication module 660 through the UART interface, so as to realize the function of playing music through the bluetooth headset.
The MIPI interface may be used to connect the processor 610 with the display screen 690, the camera module 691, and other peripheral devices. The MIPI Interface includes a Camera Serial Interface (CSI), a display screen Serial Interface (DSI), and the like. In some embodiments, the processor 610 and the camera module 691 communicate via a CSI interface to implement the camera function of the electronic device 600. The processor 610 and the display screen 690 communicate via the DSI interface to implement the display function of the electronic device 600.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 610 with the camera module 691, the display screen 690, the wireless communication module 660, the audio module 670, the sensor module 680, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 630 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 630 may be used to connect a charger to charge the electronic device 600, and may also be used to transmit data between the electronic device 600 and peripheral devices. It can also be used to connect an earphone and play audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and is not limited to the structure of the electronic device 600. In other embodiments of the present application, the electronic device 600 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 640 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 640 may receive charging input from a wired charger via the USB interface 630. In some wireless charging embodiments, the charging management module 640 may receive a wireless charging input through a wireless charging coil of the electronic device 600. The charging management module 640 may also supply power to the electronic device through the power management module 641 while charging the battery 642.
The power management module 641 is configured to connect the battery 642, the charging management module 640 and the processor 610. The power management module 641 receives the input from the battery 642 and/or the charging management module 640, and supplies power to the processor 610, the internal memory 621, the display screen 690, the camera module 691, the wireless communication module 660, and the like. The power management module 641 may also be configured to monitor battery capacity, battery cycle count, battery state of health (leakage, impedance), and other parameters. In some other embodiments, the power management module 641 may be disposed in the processor 610. In other embodiments, the power management module 641 and the charging management module 640 may be disposed in the same device.
The wireless communication function of the electronic device 600 may be implemented by the antenna 1, the antenna 2, the mobile communication module 650, the wireless communication module 660, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 600 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 650 may provide a solution for wireless communication including 2G/3G/4G/5G applied to the electronic device 600. The mobile communication module 650 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), etc. The mobile communication module 650 may receive electromagnetic waves from the antenna 1, perform filtering, amplification, and other processing on the received electromagnetic waves, and transmit the processed signals to the modem processor for demodulation.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 671, the receiver 672, etc.) or displays an image or video through the display screen 690. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 610, and may be located in the same device as the mobile communication module 650 or other functional modules.
The wireless communication module 660 may provide solutions for wireless communication applied to the electronic device 600, including Wireless Local Area Networks (WLAN) (such as Wireless Fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 660 may be one or more devices integrating at least one communication processing module. The wireless communication module 660 may receive electromagnetic waves via the antenna 2, filter and frequency-modulate the electromagnetic wave signals, and transmit the processed signals to the processor 610. The wireless communication module 660 may also receive signals to be transmitted from the processor 610, frequency-modulate and amplify them, and convert them into electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 of the electronic device 600 is coupled to the mobile communication module 650, and the antenna 2 is coupled to the wireless communication module 660, so that the electronic device 600 may communicate with the network and other devices via wireless communication technologies. The wireless communication technologies may include Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Time-Division Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a Global Navigation Satellite System (GLONASS), a BeiDou Navigation Satellite System (BDS), a Quasi-Zenith Satellite System (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 600 implements display functions via the GPU, the display screen 690, the application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 690 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 610 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 690 may be a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, an Active-Matrix Organic Light-Emitting Diode (AMOLED) display, a Flexible Light-Emitting Diode (FLED) display, a MiniLED, a MicroLED, a Micro-OLED, a Quantum dot Light-Emitting Diode (QLED) display, etc. In some embodiments, the electronic device 600 may include 1 or N display screens 690, N being a positive integer greater than 1.
The electronic device 600 may implement a shooting function through the ISP, the camera module 691, the video codec, the GPU, the display screen 690, the application processor, and the like.
The ISP is used to process the data fed back by the camera module 691. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera module 691.
The camera module 691 is for capturing still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a Complementary Metal-Oxide-Semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 600 may include 1 or N camera modules 691, where N is a positive integer greater than 1, and if the electronic device 600 includes N cameras, one of the N cameras is the main camera.
The digital signal processor is used for processing digital signals, and can process digital image signals as well as other digital signals. For example, when the electronic device 600 performs frequency point selection, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 600 may support one or more video codecs. In this way, the electronic device 600 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a Neural-Network (NN) computing processor, which processes input information quickly by using a biological Neural Network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 600 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 622 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 600. The external memory card communicates with the processor 610 through the external memory interface 622 to implement data storage functions. For example, files such as music, video, etc. are saved in an external memory card.
Internal memory 621 may be used to store computer-executable program code, including instructions. The internal memory 621 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (e.g., audio data, phone book, etc.) created during use of the electronic device 600, and the like. In addition, the internal memory 621 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk Storage device, a Flash memory device, a Universal Flash Storage (UFS), and the like. The processor 610 executes various functional applications of the electronic device 600 and data processing by executing instructions stored in the internal memory 621 and/or instructions stored in a memory provided in the processor.
The electronic device 600 may implement audio functions through the audio module 670, the speaker 671, the receiver 672, the microphone 673, the headset interface 674, an application processor, and the like. Such as music playing, recording, etc.
The audio module 670 is used to convert digital audio information into an analog audio signal output and also used to convert an analog audio input into a digital audio signal. The audio module 670 may also be used to encode and decode audio signals. In some embodiments, the audio module 670 may be disposed in the processor 610, or some functional modules of the audio module 670 may be disposed in the processor 610.
The speaker 671, also called "horn", is used to convert the electrical audio signals into sound signals. The electronic apparatus 600 can listen to music through the speaker 671 or listen to a hands-free call.
A receiver 672, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic device 600 receives a call or voice information, it can receive voice by placing the receiver 672 close to the ear.
A microphone 673, also known as a "microphone", is used to convert acoustic signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal into the microphone 673 by making a sound near the microphone 673 through the mouth of the user. The electronic device 600 may be provided with at least one microphone 673. In other embodiments, the electronic device 600 may be provided with two microphones 673 to implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 600 may further include three, four, or more microphones 673 to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headset interface 674 is used to connect wired headsets. The headset interface 674 may be the USB interface 630, or may be a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The depth sensor 6801 is used to obtain depth information of the scene. In some embodiments, the depth sensor may be disposed in the camera module 691.
The pressure sensor 6802 is used for sensing the pressure signal and converting the pressure signal into an electrical signal. In some embodiments, pressure sensor 6802 may be disposed on display 690. The pressure sensor 6802 can be of a wide variety of types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 6802, the capacitance between the electrodes changes. The electronic device 600 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 690, the electronic apparatus 600 detects the intensity of the touch operation according to the pressure sensor 6802. The electronic apparatus 600 can also calculate the position of the touch from the detection signal of the pressure sensor 6802. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyro sensor 6803 may be used to determine the motion attitude of the electronic device 600. In some embodiments, the angular velocity of the electronic device 600 about three axes (i.e., the x, y, and z axes) can be determined through the gyro sensor 6803. The gyro sensor 6803 can be used for image stabilization during photographing. For example, when the shutter is pressed, the gyro sensor 6803 detects the shake angle of the electronic device 600, calculates the distance the lens module needs to compensate according to the shake angle, and lets the lens counteract the shake of the electronic device 600 through reverse movement, thereby achieving anti-shake. The gyro sensor 6803 can also be used for navigation and motion-sensing game scenarios.
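One common way to turn the detected shake angle into a lens compensation distance is a small-angle pinhole approximation. This sketch shows that idea only; the formula and focal-length model are an assumption, not something stated in this disclosure.

import kotlin.math.tan

// Approximate lens displacement (mm) needed to offset a shake of
// shakeAngleRad (radians) for a lens of focal length focalLengthMm,
// assuming a simple pinhole camera model.
fun compensationDistanceMm(shakeAngleRad: Double, focalLengthMm: Double): Double =
    focalLengthMm * tan(shakeAngleRad)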
The air pressure sensor 6804 is used to measure air pressure. In some embodiments, the electronic device 600 calculates altitude from the barometric pressure value measured by the air pressure sensor 6804 to assist positioning and navigation.
The magnetic sensor 6805 includes a Hall sensor. The electronic device 600 may use the magnetic sensor 6805 to detect the opening and closing of a flip holster. In some embodiments, when the electronic device 600 is a flip phone, it can detect the opening and closing of the flip cover through the magnetic sensor 6805, and then set features such as automatic unlocking upon flipping open according to the detected open or closed state of the holster or cover.
The acceleration sensor 6806 can detect the magnitude of acceleration of the electronic device 600 in various directions (typically along three axes). When the electronic device 600 is stationary, the magnitude and direction of gravity can be detected. The sensor can also be used to recognize the attitude of the electronic device, and is applied in scenarios such as landscape/portrait switching and pedometers.
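A minimal sketch of reading the three-axis acceleration described above through Android's SensorManager. The registration pattern and sensor type are standard Android API; how the values are interpreted for landscape/portrait switching is simplified here.

import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

class AccelerationReader(private val sensorManager: SensorManager) : SensorEventListener {

    fun start() {
        // TYPE_ACCELEROMETER reports acceleration, including gravity, on x/y/z.
        val accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER) ?: return
        sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_NORMAL)
    }

    override fun onSensorChanged(event: SensorEvent) {
        val (x, y, z) = event.values  // m/s^2 along each axis
        // When the device is stationary these values reflect gravity only;
        // e.g. |y| >> |x| suggests portrait, |x| >> |y| suggests landscape.
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}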
The distance sensor 6807 is used to measure distance. The electronic device 600 may measure distance by infrared or laser. In some embodiments, in a photographing scenario, the electronic device 600 may use the distance sensor 6807 to measure distance to achieve fast focusing.
The proximity light sensor 6808 may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared LED. The electronic device 600 emits infrared light outward through the LED and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 600; when insufficient reflected light is detected, the electronic device 600 can determine that there is no object nearby. The electronic device 600 can use the proximity light sensor 6808 to detect that the user is holding the device close to the ear during a call, so as to automatically turn off the screen and save power.
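A sketch of how the ear-proximity screen-off behaviour described above is typically wired up on Android. PROXIMITY_SCREEN_OFF_WAKE_LOCK is a standard PowerManager flag; the wake-lock tag below is a hypothetical name.

import android.content.Context
import android.os.PowerManager

fun acquireProximityScreenOff(context: Context): PowerManager.WakeLock {
    val pm = context.getSystemService(Context.POWER_SERVICE) as PowerManager
    // While this lock is held (e.g. during a call), the platform turns the
    // screen off whenever the proximity sensor reports an object close to
    // the device, such as the user's ear.
    val lock = pm.newWakeLock(
        PowerManager.PROXIMITY_SCREEN_OFF_WAKE_LOCK,
        "example:proximityScreenOff" // hypothetical tag
    )
    lock.acquire()
    return lock
}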
The fingerprint sensor 6809 is used to collect fingerprints. The electronic device 600 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint-based photographing, fingerprint-based call answering, and the like.
The temperature sensor 6810 is used to detect temperature. In some embodiments, the electronic device 600 executes a temperature processing strategy based on the temperature detected by the temperature sensor 6810. For example, when the reported temperature exceeds a threshold, the electronic device 600 reduces the performance of a processor located near the temperature sensor 6810 in order to lower power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 600 heats the battery 642 to avoid an abnormal shutdown caused by low temperature. In still other embodiments, when the temperature is below a further threshold, the electronic device 600 boosts the output voltage of the battery 642 to avoid an abnormal shutdown caused by low temperature.
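A rough sketch of the tiered policy described above. The three thresholds and the three action stubs are illustrative placeholders; the disclosure only speaks of "a threshold", "another threshold", and "a further threshold".

// Hypothetical thresholds, in degrees Celsius.
const val THROTTLE_ABOVE_C = 45f
const val HEAT_BATTERY_BELOW_C = 0f
const val BOOST_VOLTAGE_BELOW_C = -10f

fun applyThermalPolicy(reportedTempC: Float) {
    if (reportedTempC > THROTTLE_ABOVE_C) reduceNearbyProcessorPerformance() // thermal protection
    if (reportedTempC < HEAT_BATTERY_BELOW_C) heatBattery()                  // avoid cold shutdown
    if (reportedTempC < BOOST_VOLTAGE_BELOW_C) boostBatteryOutputVoltage()   // keep output voltage up
}

// Illustrative stubs for the actions named above.
fun reduceNearbyProcessorPerformance() { /* ... */ }
fun heatBattery() { /* ... */ }
fun boostBatteryOutputVoltage() { /* ... */ }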
The touch sensor 6811 is also referred to as a "touch device". The touch sensor 6811 may be disposed on the display screen 690; together they form a touch screen. The touch sensor 6811 is used to detect touch operations applied on or near it, and can pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 690. In other embodiments, the touch sensor 6811 may be disposed on the surface of the electronic device 600 at a position different from that of the display screen 690.
The ambient light sensor 6812 is used to sense the ambient light level. The electronic device 600 may adaptively adjust the brightness of the display screen 690 based on the perceived ambient light level. The ambient light sensor 6812 can also be used to automatically adjust the white balance when taking a picture, and can cooperate with the proximity light sensor 6808 to detect whether the electronic device 600 is in a pocket, so as to prevent accidental touches.
The bone conduction sensor 6813 can acquire vibration signals. In some embodiments, the bone conduction sensor 6813 can acquire the vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor 6813 can also contact the human pulse and receive the blood pressure beating signal. In some embodiments, the bone conduction sensor 6813 may also be disposed in a headset, integrated into a bone conduction headset. The audio module 670 can parse out a voice signal based on the vibration signal of the vocal-part bone mass acquired by the bone conduction sensor 6813, so as to implement a voice function. The application processor can parse heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 6813, so as to implement a heart rate detection function.
The keys 694 include a power key, a volume key, and the like. The keys 694 may be mechanical keys or touch keys. The electronic device 600 may receive key input and generate key signal input related to user settings and function control of the electronic device 600.
The motor 693 may generate vibration prompts. The motor 693 can be used for incoming-call vibration prompts as well as touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects, and touch operations applied to different areas of the display screen 690 may also correspond to different vibration feedback effects. Different application scenarios (e.g., time reminders, message receipt, alarm clocks, games) can likewise correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 692 may be an indicator light, and may be used to indicate the charging status and battery level changes, or to indicate messages, missed calls, notifications, and the like.
The SIM card interface 695 is used to connect a SIM card. A SIM card can be attached to or detached from the electronic device 600 by inserting it into or pulling it out of the SIM card interface 695. The electronic device 600 may support one or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 695 can support a Nano SIM card, a Micro SIM card, a standard SIM card, and the like. Multiple cards, of the same or different types, can be inserted into the same SIM card interface 695 at the same time. The SIM card interface 695 may also be compatible with different types of SIM cards, as well as with an external memory card. The electronic device 600 interacts with the network through the SIM card to implement functions such as calling and data communication. In some embodiments, the electronic device 600 employs an eSIM, i.e., an embedded SIM card, which can be embedded in the electronic device 600 and cannot be separated from it.
In particular, according to an embodiment of the present invention, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, an embodiment of the invention includes a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for performing the method illustrated in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section, and/or installed from a removable medium. When the computer program is executed by a Central Processing Unit (CPU), it performs the various functions defined in the system of the present application.
It should be noted that the computer readable medium shown in the embodiment of the present invention may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present invention may be implemented by software or by hardware, and the described units may also be disposed in a processor. The names of these units do not, in some cases, constitute a limitation on the units themselves.
It should be noted that, as another aspect, the present application also provides a computer-readable medium, which may be included in the electronic device described in the above embodiments, or may exist separately without being assembled into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to implement the method described in the above embodiments. For example, the electronic device may implement the steps shown in fig. 1.
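To make the claimed flow concrete, the following sketch strings the main steps together on Android. The stability test (comparing gravity-sensor and accelerometer outputs within a tolerance), the menu helper, and all names below are illustrative assumptions; MediaRecorder.setOrientationHint is a standard Android call used here as one plausible reading of the "media recording port" of claim 4, not necessarily the implementation intended by the disclosure.

import android.media.MediaRecorder
import kotlin.math.abs

// Hypothetical tolerance for treating the gravity-sensor and accelerometer
// outputs as "consistent" (claim 3); units are m/s^2.
const val STABLE_TOLERANCE = 0.5f

// The terminal is stable when no significant linear acceleration is present,
// i.e. the accelerometer reading matches the gravity vector axis by axis.
fun isStable(gravity: FloatArray, accel: FloatArray): Boolean =
    gravity.indices.all { abs(gravity[it] - accel[it]) < STABLE_TOLERANCE }

// Claim 4 path: rotate on the sending side. setOrientationHint() writes the
// rotation into the recorded stream so the far end plays it back rotated;
// it must be called before prepare().
fun configureRotation(recorder: MediaRecorder, rotationDegrees: Int) {
    recorder.setOrientationHint(rotationDegrees) // 0, 90, 180 or 270 only
}

// Overall flow once a screen flip has been detected from the screen
// direction parameter; showRotationMenu and the two send functions are
// hypothetical helpers.
fun onScreenFlipDetected(rotationDegrees: Int, gravity: FloatArray, accel: FloatArray) {
    if (!isStable(gravity, accel)) return  // only prompt in a stable state
    showRotationMenu(
        onConfirmRotation = { sendRotatedVideo(rotationDegrees) },
        onDeclineRotation = { sendVideoUnchanged() }
    )
}

fun showRotationMenu(onConfirmRotation: () -> Unit, onDeclineRotation: () -> Unit) { /* ... */ }
fun sendRotatedVideo(rotationDegrees: Int) { /* ... */ }
fun sendVideoUnchanged() { /* ... */ }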
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure as disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (10)

1. An image processing method, comprising:
the first terminal responds to the triggered target event, collects video data and acquires a screen direction parameter;
when the first terminal screen is identified to turn over according to the screen direction parameters, acquiring sensor parameters of a target sensor to determine state information of the first terminal;
when the first terminal is determined to be in a stable state, displaying a rotation selection menu in an interactive interface of the first terminal; wherein the rotation selection menu includes a confirm rotation option;
and in response to the execution rotation option being selected, displaying the rotated video data at a second terminal in a video call with the first terminal.
2. The image processing method according to claim 1, wherein the acquiring, by the first terminal, the screen direction parameter in response to the triggered target event comprises:
creating a rotation angle acquisition process in response to the triggered target event, so as to acquire a rotation value of the first terminal screen.
3. The image processing method of claim 1, wherein the sensor parameters comprise: the output parameters of the gravity sensor and the output parameters of the acceleration sensor;
the acquiring sensor parameters of a target sensor to determine state information of the first terminal includes:
calling a sensor management service to acquire output parameters of a gravity sensor and output parameters of an acceleration sensor;
when the output parameters of the gravity sensor and the acceleration sensor are consistent, judging that the first terminal is in a stable state; or
when the output parameters of the gravity sensor and the acceleration sensor are not consistent, judging that the first terminal is in an unstable state.
4. The image processing method according to claim 1, wherein the displaying, in response to the execution rotation option being selected, the rotated video data at the second terminal in a video call with the first terminal comprises:
determining a rotation angle to be executed according to the screen direction parameter;
and calling, by the first terminal, a media recording port to rotate the video data by the rotation angle, and sending the rotated video data to the second terminal so that the second terminal displays the rotated video data.
5. The image processing method according to claim 1, wherein the displaying, in response to the execution rotation option being selected, the rotated video data at the second terminal in a video call with the first terminal comprises:
determining a rotation angle to be executed according to the screen direction parameter;
and sending the video data and the rotation angle to the second terminal so that the second terminal rotates the video data according to the rotation angle and displays the rotated video data.
6. The image processing method according to claim 1, wherein the rotation selection menu further comprises a do-not-execute-rotation option, and the method further comprises:
in response to the do-not-execute-rotation option being selected, sending, by the first terminal, the collected video data to the second terminal; and
acquiring a screen direction parameter of the first terminal at the current moment according to a preset time period, and acquiring a sensor parameter of a target sensor to determine state information of the first terminal when the screen is identified to turn over according to the screen direction parameter; when the first terminal is determined to be in a stable state, displaying a rotation selection menu in an interactive interface of the first terminal; or
when the screen is identified as not turned over according to the screen direction parameter, sending the video data to the second terminal.
7. The image processing method according to claim 4 or 5, wherein the determining the rotation angle to be executed according to the screen direction parameter comprises:
determining a screen turning angle according to the screen direction parameters, and determining an angle range corresponding to the screen turning angle;
and determining the corresponding rotation angle to be executed according to the angle range.
8. An image processing apparatus characterized by comprising:
a screen direction parameter acquisition module, configured to enable the first terminal to collect video data and acquire a screen direction parameter in response to a triggered target event;
a state information acquisition module, configured to acquire a sensor parameter of a target sensor to determine state information of the first terminal when the first terminal screen is identified as turned over according to the screen direction parameter;
a menu display control module, configured to display a rotation selection menu in an interactive interface of the first terminal when the first terminal is determined to be in a stable state, wherein the rotation selection menu includes a confirm rotation option; and
a rotation execution module, configured to display, in response to the execution rotation option being selected, the rotated video data on a second terminal in a video call with the first terminal.
9. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the image processing method of any one of claims 1 to 7.
10. A terminal device, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the image processing method according to any one of claims 1 to 7.
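Claim 7 determines the rotation to execute by bucketing the measured screen flip angle into an angle range. Below is a minimal sketch of one such bucketing, assuming four 90°-wide ranges centred on 0°, 90°, 180° and 270°; the range boundaries are an assumption, since the claims only say "an angle range".

// Snap an arbitrary flip angle (degrees) to the nearest multiple of 90°,
// which is then the rotation angle to execute.
fun rotationToExecute(flipAngleDegrees: Int): Int {
    val normalized = ((flipAngleDegrees % 360) + 360) % 360  // into [0, 360)
    return (normalized + 45) / 90 % 4 * 90
}

For example, rotationToExecute(265) returns 270: a measured flip of 265° falls in the range centred on 270° and is executed as a 270° rotation.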
CN202010266485.4A 2020-04-07 2020-04-07 Image processing method and device, computer readable medium and terminal equipment Pending CN111432156A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010266485.4A CN111432156A (en) 2020-04-07 2020-04-07 Image processing method and device, computer readable medium and terminal equipment

Publications (1)

Publication Number Publication Date
CN111432156A true CN111432156A (en) 2020-07-17

Family

ID=71552360

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010266485.4A Pending CN111432156A (en) 2020-04-07 2020-04-07 Image processing method and device, computer readable medium and terminal equipment

Country Status (1)

Country Link
CN (1) CN111432156A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115484412A (en) * 2022-09-21 2022-12-16 高创(苏州)电子有限公司 Image processing method and device, video call method, medium and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102695034A (en) * 2012-05-30 2012-09-26 青岛海信移动通信技术股份有限公司 Method and device for regulating end display of video image during video call
CN103118242A (en) * 2012-11-16 2013-05-22 佳都新太科技股份有限公司 Video call image rectification method
CN104243830A (en) * 2014-09-29 2014-12-24 广东欧珀移动通信有限公司 Method and device for controlling camera to rotate
US20180227542A1 (en) * 2016-04-22 2018-08-09 Huizhou Tcl Mobile Communication Co., Ltd. Method and system for automatically correcting frame angle in mobile terminal video communication
CN109922204A (en) * 2017-12-13 2019-06-21 中兴通讯股份有限公司 Image processing method and terminal

Similar Documents

Publication Publication Date Title
CN111182140B (en) Motor control method and device, computer readable medium and terminal equipment
CN113810601B (en) Terminal image processing method and device and terminal equipment
CN111132234A (en) Data transmission method and corresponding terminal
CN111552451A (en) Display control method and device, computer readable medium and terminal equipment
CN114422340A (en) Log reporting method, electronic device and storage medium
CN111865646A (en) Terminal upgrading method and related device
CN114257920B (en) Audio playing method and system and electronic equipment
CN114880251A (en) Access method and access device of storage unit and terminal equipment
CN111061410B (en) Screen freezing processing method and terminal
CN112188094B (en) Image processing method and device, computer readable medium and terminal equipment
CN115514844A (en) Volume adjusting method, electronic equipment and system
CN113467735A (en) Image adjusting method, electronic device and storage medium
CN111935705A (en) Data service management method and device, computer readable medium and terminal equipment
CN111930335A (en) Sound adjusting method and device, computer readable medium and terminal equipment
CN111432156A (en) Image processing method and device, computer readable medium and terminal equipment
CN113674258B (en) Image processing method and related equipment
US20240183754A1 (en) Method and System for Measuring Motor Damping
CN113467747B (en) Volume adjusting method, electronic device and storage medium
CN115393676A (en) Gesture control optimization method and device, terminal and storage medium
CN114915747A (en) Video call method and related equipment
CN111586236A (en) Electronic equipment marking method and device, computer readable medium and electronic equipment
CN116782023A (en) Shooting method and electronic equipment
CN113672454B (en) Screen freezing monitoring method, electronic equipment and computer readable storage medium
CN114115513B (en) Key control method and key device
CN113364067B (en) Charging precision calibration method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200717)