CN117546479A - Image processing method, device and storage medium


Info

Publication number
CN117546479A
Authority
CN
China
Prior art keywords
image
original image
display
terminal
original
Prior art date
Legal status
Pending
Application number
CN202280004332.XA
Other languages
Chinese (zh)
Inventor
刘晓林
宋伟
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Publication of CN117546479A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661 Transmitting camera control signals through networks, e.g. control via the Internet
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/71 Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors
    • H04N25/72 Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors using frame transfer [FT]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The disclosure relates to an image processing method, an image processing device and a storage medium. The image processing method comprises the following steps: acquiring an original image acquired by an image acquisition sensor in a second terminal, and determining an image display instruction; obtaining a display image of the original image based on the image display instruction and the original image; and displaying the display image of the original image. With the image processing method of the disclosure, images are processed in a fast and smooth manner.

Description

Image processing method, device and storage medium
Technical Field
The disclosure relates to the technical field of terminals, and in particular relates to an image processing method, an image processing device and a storage medium.
Background
In the related art, image acquisition sensors with small pixels include a processing mechanism for the original images they capture, whereas image processing for image acquisition sensors with large pixels relies on a tightly coupled pipeline between the image front-end engine and the image processing engine of the terminal device. As a result, the captured original image cannot be processed directly by the image processing engine, and the image acquisition sensor must be used on the same device as the image processor.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an image processing method, apparatus, and storage medium.
According to a first aspect of embodiments of the present disclosure, there is provided an image processing method, applied to a first terminal, including:
acquiring an original image acquired by an image acquisition sensor in a second terminal, and determining an image display instruction;
obtaining a display image of the original image based on the image display instruction and the original image;
and displaying the display image of the original image.
In one embodiment, the obtaining a display image of the original image based on the image display instruction and the original image includes:
acquiring source cache data, wherein the source cache data is the source data of the original image cached in a memory;
based on the source cache data, a display image cache queue matched with the image display instruction is obtained;
and taking the display images in the display image cache queue as the display images of the original image.
In one embodiment, the image display instruction includes a preview image display instruction, or a preview image display instruction and a photographed image display instruction of a preview image;
The obtaining a display image buffer queue matched with the image display instruction based on the source buffer data comprises the following steps:
acquiring the image size and the image format of the source cache data based on a session capture creation request included in the source cache data;
constructing a basic image processing target based on the image size and the image format of the source cache data;
configuring a target output stream of the basic image processing target, wherein the target output stream comprises an output stream for displaying a preview image of the original image, or an output stream for displaying the preview image of the original image and an output stream for displaying a photographing image of the preview image;
and generating a display image cache queue of the original image based on the display image corresponding to the target output stream.
In one embodiment, the session capture creation request included in the source cache data is created in advance in the following manner:
creating a capture request function, wherein parameters of the capture request function comprise a reprocessing instruction for the original image;
creating an input configuration object according to the display image format of the original image, wherein the input configuration object comprises the original image size and the image format;
based on the created capture request function, the input configuration object is transmitted to the system bottom layer, and a session capture creation request for capturing the input configuration object is created by the system bottom layer.
In one embodiment, the acquiring the image size and the image format of the source cache data based on the session capture creation request included in the source cache data includes:
acquiring source cache data of the original image based on capture operations corresponding to multiple session capture creation requests;
and caching source cache data of the original image.
In one embodiment, the method further comprises:
setting a listening mechanism of the input configuration object in response to completion of creating the input configuration object;
based on the monitoring mechanism, the input configuration object transmitted to the bottom layer of the system is monitored.
In one embodiment, the method further comprises:
if the input configuration object transmitted to the bottom layer of the system is not monitored, displaying prompt information;
the prompt information is used for prompting that the input configuration object corresponding to the original image cannot be acquired.
According to a second aspect of embodiments of the present disclosure, there is provided an image processing method, applied to a second terminal, the image processing method including:
acquiring an original image based on an image acquisition sensor;
and sending the original image to a first terminal, wherein a display image of the original image is obtained based on the image display instruction of the first terminal and the original image.
According to a third aspect of embodiments of the present disclosure, there is provided an image processing apparatus applied to a first terminal, the image processing apparatus including:
the determining unit is used for acquiring the original image acquired by the image acquisition sensor in the second terminal and determining an image display instruction;
the processing unit is used for obtaining a display image of the original image based on the image display instruction and the original image;
and a display unit for displaying the display image of the original image.
In one embodiment, the processing unit obtains the display image of the original image based on the image display instruction and the original image in the following manner:
acquiring source cache data, wherein the source cache data is the source data of the original image cached in a memory;
based on the source cache data, a display image cache queue matched with the image display instruction is obtained;
and taking the display images in the display image cache queue as the display images of the original image.
In one embodiment, the image display instruction in the processing unit includes a preview image display instruction, or a preview image display instruction and a photographed image display instruction of a preview image;
the obtaining a display image buffer queue matched with the image display instruction based on the source buffer data comprises the following steps:
acquiring the image size and the image format of the source cache data based on a session capture creation request included in the source cache data;
constructing a basic image processing target based on the image size and the image format of the source cache data;
configuring a target output stream of the basic image processing target, wherein the target output stream comprises an output stream for displaying a preview image of the original image, or an output stream for displaying the preview image of the original image and an output stream for displaying a photographing image of the preview image;
and generating a display image cache queue of the original image based on the display image corresponding to the target output stream.
In one embodiment, the session capture creation request included in the source cache data is created in advance by the processing unit in the following manner:
creating a capture request function, wherein parameters of the capture request function comprise a reprocessing instruction for the original image;
creating an input configuration object according to the display image format of the original image, wherein the input configuration object comprises the original image size and the image format;
based on the created capture request function, the input configuration object is transmitted to the system bottom layer, and a session capture creation request for capturing the input configuration object is created by the system bottom layer.
In one embodiment, the processing unit obtains an image size and an image format of the source cache data based on a session capture creation request included in the source cache data in the following manner:
acquiring source cache data of the original image based on capture operations corresponding to multiple session capture creation requests;
and caching source cache data of the original image.
In one embodiment, the processing unit is further configured to:
setting a listening mechanism of the input configuration object in response to completion of creating the input configuration object;
based on the monitoring mechanism, the input configuration object transmitted to the bottom layer of the system is monitored.
In one embodiment, the processing unit is further configured to:
if the input configuration object transmitted to the bottom layer of the system is not monitored, displaying prompt information;
the prompt information is used for prompting that the input configuration object corresponding to the original image cannot be acquired.
According to a fourth aspect of embodiments of the present disclosure, there is provided an image processing apparatus, applied to a second terminal, comprising:
the acquisition unit is used for acquiring an original image based on the image acquisition sensor, an image display instruction being determined by the first terminal;
and the sending unit is used for sending the original image to a first terminal, wherein a display image of the original image is obtained based on the image display instruction of the first terminal and the original image.
According to a fifth aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
a processor; a memory for storing processor-executable instructions;
wherein the processor is configured to: the image processing method of the first aspect or any implementation manner of the first aspect is performed.
According to a sixth aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
a processor; a memory for storing processor-executable instructions;
wherein the processor is configured to: the image processing method of the second aspect or any one of the embodiments of the second aspect is performed.
According to a seventh aspect of the disclosed embodiments, there is provided a storage medium having stored therein instructions which, when executed by a processor of a terminal, enable the terminal to perform the image processing method of the first aspect or any one of the implementation manners of the first aspect.
According to an eighth aspect of the disclosed embodiments, there is provided a storage medium having stored therein instructions which, when executed by a processor of a terminal, enable the terminal to perform the image processing method of the second aspect or any one of the embodiments of the second aspect.
The technical scheme provided by the embodiments of the disclosure can have the following beneficial effects: an original image collected by an image acquisition sensor in a second terminal is acquired, an image display instruction is determined, a display image of the original image is obtained based on the image display instruction and the original image, and the display image of the original image is displayed. The image processing method provided by the embodiments of the disclosure enables the image processing engine to process the original image acquired by the image acquisition sensor in the second terminal, so that even though the first terminal and the second terminal are different terminals, the image acquired by the second terminal can be previewed on the first terminal in real time, in a fast and smooth manner.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating a method for processing an image applied to a first terminal according to an exemplary embodiment.
Fig. 2 is a flow chart illustrating a process for obtaining a display image of an original image according to an exemplary embodiment.
FIG. 3 is a flow chart illustrating obtaining a display image cache queue matching the image display instruction according to an exemplary embodiment.
Fig. 4 is a flow diagram illustrating a pre-created session capture request according to an example embodiment.
FIG. 5 is a schematic diagram illustrating a system underlying process, according to an example embodiment.
FIG. 6 is a flowchart illustrating an example of obtaining an image size and an image format of source cache data, according to an example embodiment.
Fig. 7 is a flowchart illustrating a listening mechanism applied to a first terminal image processing method according to an exemplary embodiment.
Fig. 8 is a flowchart illustrating a listening mechanism applied to a first terminal image processing method according to an exemplary embodiment.
Fig. 9 is a flowchart illustrating an image processing method applied to a second terminal according to an exemplary embodiment.
FIG. 10 is a schematic diagram of a system data flow according to an exemplary embodiment.
Fig. 11 is a block diagram illustrating an image processing apparatus applied to a first terminal according to an exemplary embodiment.
Fig. 12 is a block diagram showing an image processing apparatus applied to a second terminal according to an exemplary embodiment.
Fig. 13 is a block diagram illustrating an apparatus for image processing according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure.
The image processing method, device and storage medium provided by the embodiments of the disclosure can be applied to scenarios in which the first terminal and the second terminal are located at different positions and the image acquired by the second terminal is previewed on the first terminal in real time. For example, the image processing method provided by the embodiments of the disclosure may be applied to a scenario in which an image acquired by a terminal such as an image acquisition sensor is transmitted over a network to a terminal such as a mobile phone, where it is processed and displayed as a preview.
In the related art, a terminal can only process images from the terminal itself and cannot process images from other terminals in real time.
The image acquisition referred to in this disclosure is the process in which a camera and a terminal are linked through an image acquisition card: the image acquisition card receives the analog or digital signal of the camera and converts it into information suitable for the terminal.
The image processor (ISP) mainly comprises hardware modules such as IFE, BPS, IPE and JPEG, which take on different image processing tasks. IFE (Image Front End) performs color correction, downsampling, demosaicing and 3A data processing for previews and videos; BPS (Bayer Processing Segment) is mainly used for dead-pixel removal, phase-detection focusing, demosaicing, downsampling, HDR processing and Bayer mixed noise reduction of photographed image data; IPE (Image Processing Engine) mainly performs image processing such as hardware noise reduction, image cropping, color processing and detail enhancement; the JPEG module stores the photographed data by carrying out JPEG encoding in hardware.
The image processor referred to in this disclosure mainly uses the BPS (Bayer Processing Segment) and IPE (Image Processing Engine) modules to encode images.
Currently, there is room for improvement in image processing technology. Taking a mobile phone as an example of a terminal, the camera of a mobile phone operated by a user should be able to process an original picture passed through memory and display a preview of it. The present method achieves the purpose that the first terminal and the second terminal are different terminals while the image collected by the second terminal can be previewed on the first terminal in real time, so that previewing the image from the second terminal is fast and smooth.
The embodiment of the disclosure provides an image processing method in which an original image acquired by a second terminal is obtained and reprocessed to obtain a display image of the original image, and the display image of the original image is then displayed. The image processing method provided by the embodiment of the disclosure enables the image processing engine to process the original image acquired by the second terminal even though the first terminal and the second terminal are different terminals, so that the image acquired by the second terminal can be previewed on the first terminal in real time, in a fast and smooth manner.
Fig. 1 is a flowchart illustrating a method for processing an image applied to a first terminal according to an exemplary embodiment. As shown in fig. 1, the following steps are included.
In step S11, an original image acquired by an image acquisition sensor in the second terminal is obtained, and an image display instruction is determined.
In step S12, a display image of the original image is obtained based on the image display instruction and the original image.
In step S13, a display image of the original image is displayed.
In the embodiment of the disclosure, after the image acquisition sensor in the second terminal acquires the original image, the original image can be transmitted to the first terminal through the network. The first terminal receives the original image and determines an image display instruction, and according to the image display instruction and the original image, the first terminal can preview and display the original image.
For example, the first terminal may be a cell phone and the second terminal may be an image acquisition sensor. The original image acquired by the camera of the image acquisition sensor can be transmitted to the mobile phone through a network, an image engine module in an image processor of the mobile phone is utilized to determine an image display instruction for the original image, and according to the image display instruction and the original image, the original image can be previewed on a mobile phone screen, and it can be understood that the image acquired by the image acquisition sensor is displayed on the mobile phone screen.
In the embodiment of the disclosure, the first terminal and the second terminal in the image processing method provided by the disclosure may be different terminals, so that images in different positions can be viewed.
Fig. 2 is a flow chart illustrating a process for obtaining a display image of an original image according to an exemplary embodiment. As shown in fig. 2, a display image of an original image is obtained based on an image display instruction and the original image, including the following steps.
In step S21, source cache data, which is the source data of the original image cached in the memory, is acquired.
In step S22, a display image buffer queue matching the image display instruction is obtained based on the source buffer data.
In step S23, the display image in the display image buffer queue is taken as the display image of the original image.
In the embodiment of the disclosure, taking the mobile phone as an example, after the mobile phone acquires the original image, the original image is cached in the memory of the mobile phone. The bottom layer of the mobile phone system obtains the source cache data from the original image cached in the memory, matches the source cache data of the original image against the image display instruction to obtain a display image cache queue of the original image, and takes the display images in the display image cache queue as the display images of the original image.
According to the image display method provided by the embodiment of the disclosure, an original image can be input to the bottom layer of the system for image processing.
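As a rough illustration of steps S21 to S23, the two buffers could be modelled with plain Java collections. This is a minimal sketch under that assumption; the class and field names (FrameBuffers, sourceCache, displayQueue) are illustrative rather than taken from this disclosure.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class FrameBuffers {
    // Source cache data: raw frames received from the second terminal, held in memory.
    private final BlockingQueue<byte[]> sourceCache = new ArrayBlockingQueue<>(16);

    // Display image cache queue: processed frames waiting to be shown, in display order.
    private final BlockingQueue<byte[]> displayQueue = new ArrayBlockingQueue<>(16);

    public void onRawFrameReceived(byte[] rawFrame) throws InterruptedException {
        sourceCache.put(rawFrame);      // step S21: cache the original image in memory
    }

    public byte[] nextDisplayImage() throws InterruptedException {
        return displayQueue.take();     // step S23: the head of the queue is displayed next
    }

    // Step S22 (turning source data into display images) is performed by the
    // system bottom layer; see the reprocessing sketches further below.
}
```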
FIG. 3 is a flow chart illustrating a display image cache queue that obtains matching image display instructions according to an exemplary embodiment. As shown in fig. 3, the image display instruction includes a preview image display instruction, or a preview image display instruction and a photographing image display instruction of a preview image, and a display image buffer queue matching the image display instruction is obtained based on the source buffer data, including the following steps.
In step S31, the image size and the image format of the source cache data are acquired based on the session capture creation request included in the source cache data.
In the embodiment of the disclosure, a creation instruction is sent according to the session capture request of the source cache data, and the image size and image format of the source cache data are carried in that session capture request. For example, if the image size of the source cache data is 1280×720 and the image format is RAW, then the image size obtained through the session is 1280×720 and the image format is RAW.
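A minimal sketch of carrying the image size and format of the source cache data in such a request, assuming an Android Camera2-style InputConfiguration; the 1280×720 size comes from the example above, while the RAW_PRIVATE constant and the class name are illustrative assumptions.

```java
import android.graphics.ImageFormat;
import android.hardware.camera2.params.InputConfiguration;

final class SessionCaptureInfo {
    static InputConfiguration describeSource() {
        // Image size and image format of the source cache data.
        InputConfiguration input = new InputConfiguration(1280, 720, ImageFormat.RAW_PRIVATE);
        int width = input.getWidth();    // 1280
        int height = input.getHeight();  // 720
        int format = input.getFormat();  // the RAW format constant used above
        return input;
    }
}
```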
In step S32, a base image processing target is constructed based on the image size and the image format of the source cache data.
In the embodiment of the disclosure, a basic image processing target is constructed according to the image size and the image format of the source cache data. The basic image processing target refers to performing basic image processing on an image of the source cache data.
In step S33, a target output stream of the base image processing target is configured, the target output stream including an output stream of a preview image displaying an original image, or an output stream of a preview image displaying an original image and an output stream of a photographed image displaying a preview image.
In the embodiment of the disclosure, the process of the image display instruction needs to configure a target output stream of the basic image processing target, and the image displayed by the terminal is determined according to data in the target output stream. Wherein the target output stream includes an output stream of a preview image displaying the original image, or an output stream of a preview image displaying the original image and an output stream of a photographed image displaying the preview image.
It will be appreciated that the image size and format output by the target output stream may be determined by data preset in the first terminal.
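A minimal sketch of configuring the target output streams, assuming Android Camera2-style Surfaces; the previewSurface parameter, the 1280×720 size and the JPEG still format are illustrative assumptions rather than values fixed by this disclosure.

```java
import android.graphics.ImageFormat;
import android.media.ImageReader;
import android.view.Surface;
import java.util.Arrays;
import java.util.List;

final class OutputStreams {
    static List<Surface> build(Surface previewSurface) {
        // Output stream for the photographed (still) image of the preview.
        ImageReader stillReader =
                ImageReader.newInstance(1280, 720, ImageFormat.JPEG, /*maxImages=*/ 2);
        // The preview Surface (e.g. from a SurfaceView or TextureView) plus the
        // still-capture Surface together form the configured target output streams.
        return Arrays.asList(previewSurface, stillReader.getSurface());
    }
}
```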
In step S34, a display image buffer queue of the original image is generated based on the display image corresponding to the target output stream.
In the embodiment of the disclosure, a display image cache queue of the original image may be generated from the display images corresponding to the target output stream. The display image cache queue of the original image is the path through which display images are output; it may hold a plurality of images to be displayed, and the order of the images to be displayed in the queue determines the order in which the display images are shown.
For example, if the order of the images to be displayed in the display image cache queue is image a, image b, ..., image n, then the images are displayed on the first terminal in the order image a, image b, ..., image n.
Fig. 4 is a flow diagram illustrating a pre-created session capture request according to an example embodiment. As shown in fig. 4, the session capture creation request included in the source cache data is created in advance through the following steps.
In step S41, a capture request function is created, and a parameter of the capture request function includes a reprocessing instruction for an original image.
In step S42, an input configuration object is created from the display image format of the original image, the input configuration object including the original image size and the image format.
In step S43, the input configuration object is transferred to the system bottom layer based on the created capture request function, and a session capture creation request for capturing the input configuration object is created by the system bottom layer.
In the embodiment of the disclosure, a session capturing request of source cache data needs to be established, and data transmission between original image information received by a first terminal and a system bottom layer is established through the session capturing request.
Creating the session capture request of the source cache data includes creating a capture request function, wherein the parameters of the capture request function include a reprocessing instruction for the original image. An input configuration object is created according to the display image format of the original image, the input configuration object including the image size and the image format. Based on the created capture request function, the input configuration object is transmitted to the system bottom layer, and the system bottom layer creates a session capture creation request for capturing the input configuration object.
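The following sketch shows how such a pre-created session capture request could look, assuming the Android Camera2 reprocessing API (InputConfiguration, createReprocessableCaptureSession and createReprocessCaptureRequest); the class name, the 1280×720 size and the RAW_PRIVATE format are assumptions made for illustration, not details taken from this disclosure.

```java
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.TotalCaptureResult;
import android.hardware.camera2.params.InputConfiguration;
import android.os.Handler;
import android.view.Surface;
import java.util.List;

final class ReprocessSessionFactory {

    static void create(CameraDevice device,
                       List<Surface> targetOutputs,
                       CameraCaptureSession.StateCallback callback,
                       Handler handler) throws CameraAccessException {
        // Input configuration object: original image size and image format.
        InputConfiguration inputConfig =
                new InputConfiguration(1280, 720, ImageFormat.RAW_PRIVATE);

        // Pass the input configuration object to the system bottom layer, which
        // creates the session that will capture (reprocess) it.
        device.createReprocessableCaptureSession(
                inputConfig, targetOutputs, callback, handler);
    }

    // Capture request function whose parameter carries the reprocessing
    // instruction for the original image (here, the TotalCaptureResult that
    // accompanied the frame).
    static CaptureRequest.Builder createCaptureRequest(
            CameraDevice device, TotalCaptureResult result) throws CameraAccessException {
        return device.createReprocessCaptureRequest(result);
    }
}
```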
FIG. 5 is a schematic diagram illustrating a system underlying process, according to an example embodiment. Referring to fig. 5, a first terminal is illustrated as a mobile phone. After receiving the reprocessing instruction, the mobile phone caches the obtained original image in a memory, creates a session capturing request, establishes an input configuration object according to the display image format of the original image, transmits the input configuration object to the bottom layer of the mobile phone system through the session capturing request, and processes the original image by the bottom layer of the mobile phone system.
The media stream of the mobile phone system bottom layer comprises the input configuration object and the target output stream, and the mobile phone system bottom layer carries out basic image processing on the original image according to the input configuration object and the target output stream. The basic image processing, performed by the Bayer processing segment, includes operations such as image parsing, image statistics, automatic white balance adjustment, dead-pixel removal of the image data by the image processor, demosaicing and Bayer mixed noise reduction.
In the above example, after the mobile phone receives the original image, an input configuration object is formed according to the image size and the image format. After the bottom layer of the mobile phone system receives the input configuration object through the media stream, the original image goes through parsing, image statistics, automatic white balance adjustment and the processing stages of the image processor to form the target output stream, and the image processed by the bottom layer of the system is output to the session through the target output stream.
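Continuing the same assumption of an Android Camera2-style bottom layer, a minimal sketch of queueing the cached original image into the input stream and requesting its reprocessed output could look as follows; the parameter names (cachedRawImage, pendingResult, previewSurface) are illustrative, and in practice the ImageWriter would be created once rather than per frame.

```java
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.TotalCaptureResult;
import android.media.Image;
import android.media.ImageWriter;
import android.os.Handler;
import android.view.Surface;

final class ReprocessDispatcher {
    static void reprocess(CameraCaptureSession session,
                          CameraDevice device,
                          Image cachedRawImage,
                          TotalCaptureResult pendingResult,
                          Surface previewSurface,
                          Handler handler) throws CameraAccessException {
        // Queue the cached original image into the session's input stream.
        ImageWriter writer = ImageWriter.newInstance(session.getInputSurface(), /*maxImages=*/ 2);
        writer.queueInputImage(cachedRawImage);

        // Build the reprocess request and direct its result to the preview output stream.
        CaptureRequest.Builder builder = device.createReprocessCaptureRequest(pendingResult);
        builder.addTarget(previewSurface);
        session.capture(builder.build(), /*listener=*/ null, handler);
    }
}
```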
In the embodiment of the disclosure, in order to keep the displayed images continuous, the session may send the request multiple times.
FIG. 6 is a flowchart illustrating an example of obtaining an image size and an image format of source cache data, according to an example embodiment. As shown in fig. 6, the following steps are included.
In step S51, source buffer data of an original image is acquired based on a capturing operation corresponding to the multiple session capturing creation request.
In step S52, the source buffer data of the original image is buffered.
In the embodiment of the disclosure, multiple capture operations may be performed according to the session capture request, and the obtained original images are cached as source cache data. Taking a mobile phone as an example, if a session sends 12 requests every second, then 12 groups of original image data are added to the source cache data every second; the original image data is processed through the bottom layer of the mobile phone system, and the processing result is output to and displayed on the mobile phone screen. It can be appreciated that because the mobile phone can send the session request and perform the capture operation multiple times, the display of the original image is continuous, and the user can watch the original image on the mobile phone screen in the form of video.
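A minimal sketch of issuing the request repeatedly so that the preview stays continuous, using a plain Java scheduler; the 12 requests per second figure is the one used in the example above, and sendOneReprocessRequest is an assumed placeholder for the capture call shown in the earlier sketch.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

final class RepeatedCapture {
    static ScheduledExecutorService start(Runnable sendOneReprocessRequest) {
        ScheduledExecutorService executor = Executors.newSingleThreadScheduledExecutor();
        // Roughly 12 requests per second: one request about every 83 ms.
        executor.scheduleAtFixedRate(sendOneReprocessRequest, 0, 83, TimeUnit.MILLISECONDS);
        return executor;
    }
}
```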
In the embodiment of the disclosure, the first terminal can monitor the transmission process to confirm that the source cache data is successfully transmitted to the bottom layer of the system.
Fig. 7 is a flowchart illustrating a listening mechanism applied to a first terminal image processing method according to an exemplary embodiment. As shown in fig. 7, the following steps are included.
In step S61, in response to completion of creating the input configuration object, a listening mechanism of the input configuration object is set.
In step S62, the input configuration object transmitted to the system bottom layer is listened to based on the listening mechanism.
In the embodiment of the disclosure, a listening mechanism for the input configuration object is set in the image processing method of the first terminal. The object listened to is the created input configuration object, and the input configuration object transmitted to the bottom layer of the system is monitored through this mechanism.
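A minimal sketch of such a listening mechanism, assuming the Android Camera2 CameraCaptureSession.StateCallback: onConfigured is treated as the input configuration object having reached the system bottom layer, and onConfigureFailed as it not having been monitored; the OnResult interface is an illustrative assumption.

```java
import android.hardware.camera2.CameraCaptureSession;

final class InputConfigListener extends CameraCaptureSession.StateCallback {
    interface OnResult {
        void onInputConfigDelivered(CameraCaptureSession session);
        void onInputConfigMissing();
    }

    private final OnResult onResult;

    InputConfigListener(OnResult onResult) {
        this.onResult = onResult;
    }

    @Override
    public void onConfigured(CameraCaptureSession session) {
        onResult.onInputConfigDelivered(session);   // input configuration accepted by the bottom layer
    }

    @Override
    public void onConfigureFailed(CameraCaptureSession session) {
        onResult.onInputConfigMissing();            // triggers the prompt described with fig. 8
    }
}
```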
fig. 8 is a flowchart illustrating a listening mechanism applied to a first terminal image processing method according to an exemplary embodiment. As shown in fig. 8, the following steps are included.
In step S71, if the input configuration object transmitted to the bottom layer of the system is not monitored, a prompt message is displayed.
In step S72, the prompt information is used to prompt that the input configuration object corresponding to the original image cannot be acquired.
In the embodiment of the disclosure, if the monitoring mechanism does not monitor that the input configuration object is transmitted to the bottom layer of the system, the monitoring mechanism displays prompt information, wherein the content of the prompt information is that the input configuration object corresponding to the original image cannot be acquired.
For example, if an input configuration object is not successfully transmitted to the system bottom layer, the listening mechanism detects this, and a prompt message is displayed on the first terminal, the content of the prompt message being that the input configuration object cannot be obtained.
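The prompt itself could be as simple as the following sketch, assuming an Android Toast; the wording of the message is illustrative.

```java
import android.content.Context;
import android.widget.Toast;

final class InputConfigPrompt {
    static void show(Context context) {
        // Prompt that the input configuration object for the original image cannot be acquired.
        Toast.makeText(context,
                "Cannot acquire the input configuration object for the original image",
                Toast.LENGTH_SHORT).show();
    }
}
```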
In the embodiment of the disclosure, the monitoring mechanism in the image processing method provided by the disclosure can monitor the result of the transmission of the input configuration object to the bottom layer of the system in real time, and if the transmission is unsuccessful, prompt is performed.
Fig. 9 is a flowchart illustrating an image processing method applied to a second terminal according to an exemplary embodiment. As shown in fig. 9, the image processing method includes the following steps.
In step S81, an original image is acquired based on the image acquisition sensor.
In step S82, the original image is transmitted to the first terminal, and a display image of the original image is obtained based on the image display instruction of the first terminal and the original image.
In the embodiment of the disclosure, the second terminal may be an image acquisition sensor, the first terminal may be a mobile phone, the image acquisition sensor acquires an original image, the original image is sent to the mobile phone, the mobile phone reprocesses the original image, and finally a display image of the original image is obtained on the mobile phone.
In the above example, the original image may be of the sky: after the image acquisition sensor acquires the original image of the sky, the original image is sent to the mobile phone, the mobile phone reprocesses it, and finally the sky is displayed on the mobile phone screen.
FIG. 10 is a schematic diagram of a system data flow according to an exemplary embodiment. Referring to fig. 10, in the embodiment of the present disclosure, the system data flow is divided into two flows: a data flow and a command flow.
One implementation of the disclosed embodiments carries out the image processing through Java. Taking the Java implementation, with the first terminal being a mobile phone and the second terminal being an image collector, as an example, the data flow of the image processing method is divided into two parts. In one part, the application sends a session instruction and the corresponding image data to the camera service; after receiving the session instruction, the camera service transmits the corresponding image data to the HAL layer, and the HAL layer transmits the session instruction and the corresponding image data to the image processor.
In the other part of the data flow of the image processing method, the hardware network receives the original image data transmitted from the image collector; the original image data is transmitted to the mobile information network of the camera, then to the Java native interface, and finally to the application, which achieves the process of controlling the image collector to collect images and previewing them on the mobile phone.
The command flow in the system data flow diagram shows the direction of command transmission: commands are transmitted between the application and the Java native interface, between the Java native interface and the mobile information network of the camera, and between the mobile information network of the camera and the hardware network.
The Java native interface, the mobile information network of the camera, the camera service and the HAL layer are classified as the user layer, that is, their part of the image processing does not need to go through the system bottom layer, while the hardware network and the image processor are classified as the kernel layer, that is, their part of the image processing is carried out through the system bottom layer.
In summary, the image processing method provided by the embodiment of the present disclosure is an image processing scheme in which the image acquisition device and the device that displays the original image are located on different terminals and images can still be displayed, and it is implemented in a fast and smooth manner.
It should be understood by those skilled in the art that the various implementations/embodiments of the present disclosure may be used in combination with the foregoing embodiments or may be used independently. Whether used alone or in combination with the previous embodiments, the principles of implementation are similar. In the practice of the present disclosure, some of the examples are described in terms of implementations that are used together. Of course, those skilled in the art will appreciate that such illustration is not limiting of the disclosed embodiments.
Based on the same conception, the embodiment of the disclosure also provides an image processing device applied to the first terminal.
It may be understood that, in order to implement the above-mentioned functions, the image processing apparatus applied to the first terminal provided in the embodiments of the present disclosure includes a hardware structure and/or a software module that perform respective functions. The disclosed embodiments may be implemented in hardware or a combination of hardware and computer software, in combination with the various example elements and algorithm steps disclosed in the embodiments of the disclosure. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application, but such implementation is not to be considered as beyond the scope of the embodiments of the present disclosure.
Fig. 11 is a block diagram illustrating an image processing apparatus applied to a first terminal according to an exemplary embodiment. Referring to fig. 11, the image processing apparatus 100 applied to the first terminal includes a determination unit 101, a processing unit 102, and a display unit 103.
A determining unit 101, configured to acquire an original image acquired by the image acquisition sensor in the second terminal, and determine an image display instruction.
The processing unit 102 is configured to obtain a display image of the original image based on the image display instruction and the original image.
A display unit 103 for displaying a display image of the original image.
In one embodiment, the processing unit 102 is configured to: acquire source cache data, the source cache data being the source data of the original image cached in the memory; obtain, based on the source cache data, a display image cache queue matching the image display instruction; and take the display images in the display image cache queue as the display images of the original image.
In one embodiment, the image display instructions in the processing unit 102 include preview image display instructions, or preview image display instructions and photographic image display instructions of the preview image; based on the source cache data, obtaining a display image cache queue matching the image display instruction, including: acquiring the image size and the image format of the source cache data based on a session capture creation request included in the source cache data; constructing a basic image processing target based on the image size and the image format of the source cache data; configuring a target output stream of the basic image processing target, wherein the target output stream comprises an output stream of a preview image of an original image or an output stream of the preview image of the original image and an output stream of a shooting image of the preview image; and generating a display image cache queue of the original image based on the display image corresponding to the target output stream.
In one embodiment, the session capture creation request included in the source cache data is created in advance by the processing unit 102 in the following manner: creating a capture request function, wherein parameters of the capture request function comprise a reprocessing instruction for the original image; creating an input configuration object according to the display image format of the original image, wherein the input configuration object comprises the original image size and the image format; based on the created capture request function, the input configuration object is transferred to the system bottom layer, and a session capture creation request for capturing the input configuration object is created by the system bottom layer.
In one embodiment, the processing unit 102 obtains the image size and the image format of the source cache data based on the session capture creation request included in the source cache data in the following manner: acquiring source cache data of the original image based on capture operations corresponding to multiple session capture creation requests; and caching the source cache data of the original image.
In one embodiment, the processing unit 102 is further configured to: set a listening mechanism for the input configuration object in response to completion of creating the input configuration object; and listen, based on the listening mechanism, to the input configuration object transmitted to the system bottom layer.
In one embodiment, the processing unit 102 is further configured to: display prompt information if the input configuration object transmitted to the bottom layer of the system is not monitored; the prompt information is used for prompting that the input configuration object corresponding to the original image cannot be acquired.
Based on the same conception, the embodiment of the disclosure also provides an image processing device applied to the second terminal.
Fig. 12 is a block diagram showing an image processing apparatus applied to a second terminal according to an exemplary embodiment. Referring to fig. 12, the image processing apparatus 200 applied to the second terminal includes an acquisition unit 201 and a transmitting unit 202.
An acquisition unit 201, configured to acquire an original image based on an image acquisition sensor, an image display instruction being determined by the first terminal;
a transmitting unit 202, configured to transmit the original image to the first terminal, wherein a display image of the original image is obtained based on the image display instruction of the first terminal and the original image.
The specific manner in which the various modules perform operations in the apparatus of the above embodiments has been described in detail in connection with the embodiments of the method, and will not be described in detail here.
Fig. 13 is a block diagram illustrating an apparatus 300 for image processing according to an exemplary embodiment. For example, apparatus 300 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, exercise device, personal digital assistant, or the like.
Referring to fig. 13, the apparatus 300 may include one or more of the following components: a processing component 302, a memory 304, a power component 306, a multimedia component 308, an audio component 310, an input/output (I/O) interface 312, a sensor component 314, and a communication component 316.
The processing component 302 generally controls overall operation of the apparatus 300, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 302 may include one or more processors 320 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 302 can include one or more modules that facilitate interactions between the processing component 302 and other components. For example, the processing component 302 may include a multimedia module to facilitate interaction between the multimedia component 308 and the processing component 302.
Memory 304 is configured to store various types of data to support operations at apparatus 300. Examples of such data include instructions for any application or method operating on the device 300, contact data, phonebook data, messages, pictures, videos, and the like. The memory 304 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power component 306 provides power to the various components of the device 300. The power components 306 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 300.
The multimedia component 308 includes a screen between the device 300 and the user that provides an output interface. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 308 includes a front-facing camera and/or a rear-facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the apparatus 300 is in an operational mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 310 is configured to output and/or input audio signals. For example, the audio component 310 includes a Microphone (MIC) configured to receive external audio signals when the device 300 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 304 or transmitted via the communication component 316. In some embodiments, audio component 310 further comprises a speaker for outputting audio signals.
The I/O interface 312 provides an interface between the processing component 302 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 314 includes one or more sensors for providing status assessment of various aspects of the apparatus 300. For example, the sensor assembly 314 may detect the on/off state of the device 300 and the relative positioning of components, such as the display and keypad of the device 300; the sensor assembly 314 may also detect a change in position of the device 300 or a component of the device 300, the presence or absence of user contact with the device 300, the orientation or acceleration/deceleration of the device 300, and a change in temperature of the device 300. The sensor assembly 314 may include a proximity sensor configured to detect the presence of nearby objects in the absence of any physical contact. The sensor assembly 314 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 314 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 316 is configured to facilitate communication between the apparatus 300 and other devices, either wired or wireless. The device 300 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one exemplary embodiment, the communication component 316 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 316 further includes a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 300 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for executing the methods described above.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 304, including instructions executable by processor 320 of apparatus 300 to perform the above-described method. For example, the non-transitory computer readable storage medium may be ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
It is understood that the term "plurality" in this disclosure means two or more, and other quantifiers are similar thereto. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate that A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the associated objects are in an "or" relationship. The singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It is further understood that the terms "first," "second," and the like are used to describe various information, but such information should not be limited to these terms. These terms are only used to distinguish one type of information from another and do not denote a particular order or importance. Indeed, the expressions "first", "second", etc. may be used entirely interchangeably. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure.
It will be further understood that "connected" includes both direct connection where no other member is present and indirect connection where other element is present, unless specifically stated otherwise.
It will be further understood that although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the scope of the appended claims.

Claims (12)

  1. An image processing method, applied to a first terminal, comprising:
    acquiring an original image acquired by an image acquisition sensor in a second terminal, and determining an image display instruction;
    obtaining a display image of the original image based on the image display instruction and the original image;
    and displaying the display image of the original image.
  2. The image processing method according to claim 1, wherein the obtaining a display image of the original image based on the image display instruction and the original image includes:
    acquiring source cache data, wherein the source cache data is the source data of the original image cached in a memory;
    based on the source cache data, a display image cache queue matched with the image display instruction is obtained;
    and taking the display images in the display image cache queue as the display images of the original image.
  3. The image processing method according to claim 2, wherein the image display instruction includes a preview image display instruction, or a preview image display instruction and a photographed image display instruction of a preview image;
    the obtaining a display image buffer queue matched with the image display instruction based on the source buffer data comprises the following steps:
    acquiring the image size and the image format of the source cache data based on a session capture creation request included in the source cache data;
    constructing a basic image processing target based on the image size and the image format of the source cache data;
    configuring a target output stream of the basic image processing target, wherein the target output stream comprises an output stream for displaying a preview image of the original image, or an output stream for displaying the preview image of the original image and an output stream for displaying a photographing image of the preview image;
    and generating a display image cache queue of the original image based on the display image corresponding to the target output stream.
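For illustration only and not part of the claims: the flow of claim 3 resembles output-stream configuration in an Android Camera2-style pipeline, in which one ImageReader per target output stream holds the cached display images. The sketch below is written under that assumption; the class OutputStreamConfig and its method names are hypothetical, while ImageReader, ImageFormat and Surface are taken from the public Android API.

    import android.graphics.ImageFormat;
    import android.media.ImageReader;
    import android.view.Surface;
    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical helper: builds the target output streams of claim 3 from the
    // image size and format carried in the source cache data.
    public class OutputStreamConfig {

        private final List<ImageReader> readers = new ArrayList<>();

        // previewOnly == true  -> only an output stream for the preview image
        // previewOnly == false -> preview stream plus a photographing (still) stream
        public List<Surface> configure(int width, int height, boolean previewOnly) {
            List<Surface> outputs = new ArrayList<>();

            // Preview display images; the ImageReader's internal queue serves as
            // the display image cache queue of the original image in this reading.
            ImageReader previewReader =
                    ImageReader.newInstance(width, height, ImageFormat.YUV_420_888, 4);
            readers.add(previewReader);
            outputs.add(previewReader.getSurface());

            if (!previewOnly) {
                // Photographing images of the preview image (still capture).
                ImageReader photoReader =
                        ImageReader.newInstance(width, height, ImageFormat.JPEG, 2);
                readers.add(photoReader);
                outputs.add(photoReader.getSurface());
            }
            return outputs;
        }
    }

Under this assumption, the surfaces returned by configure() are the output streams of the basic image processing target, and the images queued inside each ImageReader form the display image cache queue.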
  4. The image processing method according to claim 3, wherein the session capture creation request included in the source cache data is created in advance in the following manner:
    creating a capture request function, wherein parameters of the capture request function comprise a reprocessing instruction for the original image;
    creating an input configuration object according to the display image format of the original image, wherein the input configuration object comprises the original image size and the image format;
    and transmitting, based on the created capture request function, the input configuration object to the underlying system layer, wherein the underlying system layer creates a session capture creation request for capturing the input configuration object.
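For illustration only and not part of the claims: the steps of claim 4 map naturally onto the reprocessing entry points of an Android Camera2-style framework, where the input configuration object corresponds to InputConfiguration, the underlying system layer is the camera framework, and the session capture creation request is issued when a reprocessable session is created. The sketch below assumes that reading; the helper class ReprocessSetup, the choice of ImageFormat.PRIVATE, and the pre-existing camera device, output surfaces, state callback and handler are assumptions.

    import android.graphics.ImageFormat;
    import android.hardware.camera2.CameraAccessException;
    import android.hardware.camera2.CameraCaptureSession;
    import android.hardware.camera2.CameraDevice;
    import android.hardware.camera2.CaptureRequest;
    import android.hardware.camera2.TotalCaptureResult;
    import android.hardware.cam2.params.InputConfiguration;
    import android.os.Handler;
    import android.view.Surface;
    import java.util.List;

    // Hypothetical helper tying together the steps of claim 4.
    class ReprocessSetup {

        // Create the input configuration object (original image size and format)
        // and hand it to the underlying system layer, which creates the
        // reprocessable capture session on our behalf.
        static void createReprocessSession(CameraDevice cameraDevice,
                                           List<Surface> outputSurfaces,
                                           CameraCaptureSession.StateCallback stateCallback,
                                           Handler handler,
                                           int width, int height) throws CameraAccessException {
            InputConfiguration inputConfig =
                    new InputConfiguration(width, height, ImageFormat.PRIVATE);
            cameraDevice.createReprocessableCaptureSession(
                    inputConfig, outputSurfaces, stateCallback, handler);
        }

        // "Capture request function" whose parameter carries the reprocessing
        // instruction for the original image: the capture result of the frame
        // that is to be fed back in for reprocessing.
        static CaptureRequest.Builder createReprocessRequest(CameraDevice cameraDevice,
                                                             TotalCaptureResult originalResult)
                throws CameraAccessException {
            return cameraDevice.createReprocessCaptureRequest(originalResult);
        }
    }

Note: the import of InputConfiguration should read android.hardware.camera2.params.InputConfiguration; the cross-terminal transport of the original image is deliberately left out of this sketch.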
  5. The image processing method according to claim 3 or 4, wherein the acquiring the image size and the image format of the source cache data based on the session capture creation request included in the source cache data includes:
    acquiring source cache data of the original image based on capturing operation corresponding to the multi-session capturing creation request;
    and caching source cache data of the original image.
  6. The image processing method according to claim 4, characterized in that the method further comprises:
    setting a listening mechanism of the input configuration object in response to completion of creating the input configuration object;
    and monitoring, based on the listening mechanism, the input configuration object transmitted to the underlying system layer.
  7. The image processing method according to claim 6, characterized in that the method further comprises:
    if the input configuration object transmitted to the underlying system layer is not detected, displaying prompt information;
    wherein the prompt information is used to indicate that the input configuration object corresponding to the original image cannot be acquired.
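For illustration only and not part of the claims: in the same Android Camera2-style reading, the closest analogue of the listening mechanism of claims 6 and 7 is the session state callback, whose failure branch can surface the prompt information of claim 7. In the sketch below the class name, the prompt wording and the use of a Toast (which assumes the callback handler runs on the main thread) are all assumptions.

    import android.content.Context;
    import android.hardware.camera2.CameraCaptureSession;
    import android.widget.Toast;

    // Listening mechanism (claims 6/7): observe whether the input configuration
    // object handed to the underlying system layer was accepted.
    class SessionStateListener extends CameraCaptureSession.StateCallback {

        private final Context context;

        SessionStateListener(Context context) {
            this.context = context;
        }

        @Override
        public void onConfigured(CameraCaptureSession session) {
            // The input configuration object was accepted; reprocess requests
            // may now be submitted on this session.
        }

        @Override
        public void onConfigureFailed(CameraCaptureSession session) {
            // The input configuration object was not accepted: display prompt
            // information (claim 7). The wording is illustrative only.
            Toast.makeText(context,
                    "Input configuration for the original image could not be acquired",
                    Toast.LENGTH_SHORT).show();
        }
    }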
  8. An image processing method, applied to a second terminal, comprising:
    acquiring an original image based on an image acquisition sensor;
    and sending the original image to a first terminal, wherein a display image of the original image is obtained based on an image display instruction of the first terminal and the original image.
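For illustration only and not part of the claims: claim 8 only requires the second terminal to capture the original image from its sensor and forward it to the first terminal. The sketch below shows one way such forwarding could look, assuming the frame arrives through an Android ImageReader and that a connected byte stream (for example a socket) to the first terminal already exists; the FrameForwarder class and its length-prefixed framing are assumptions, not part of the disclosure.

    import android.media.Image;
    import android.media.ImageReader;
    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;

    // Second-terminal side (claim 8): forward each original frame to the first
    // terminal over an already-connected byte stream.
    class FrameForwarder implements ImageReader.OnImageAvailableListener {

        private final DataOutputStream out; // e.g. new DataOutputStream(socket.getOutputStream())

        FrameForwarder(DataOutputStream out) {
            this.out = out;
        }

        @Override
        public void onImageAvailable(ImageReader reader) {
            try (Image image = reader.acquireNextImage()) {
                if (image == null) {
                    return;
                }
                // Assumption: a single-plane format; multi-plane formats would
                // need per-plane handling.
                ByteBuffer buffer = image.getPlanes()[0].getBuffer();
                byte[] bytes = new byte[buffer.remaining()];
                buffer.get(bytes);

                // Hypothetical framing: width, height, format, payload length, payload.
                out.writeInt(image.getWidth());
                out.writeInt(image.getHeight());
                out.writeInt(image.getFormat());
                out.writeInt(bytes.length);
                out.write(bytes);
                out.flush();
            } catch (IOException e) {
                // In a real implementation the error would be reported to the caller.
            }
        }
    }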
  9. An image processing apparatus, applied to a first terminal, comprising:
    the determining unit is used for acquiring the original image acquired by the image acquisition sensor in the second terminal and determining an image display instruction;
    the processing unit is used for obtaining a display image of the original image based on the image display instruction and the original image;
    and the display unit is used for displaying the display image of the original image.
  10. An image processing apparatus, characterized by being applied to a second terminal, comprising:
    the acquisition unit is used for acquiring an original image based on the image acquisition sensor, wherein an image display instruction is determined by the first terminal;
    and the sending unit is used for sending the original image to the first terminal, wherein a display image of the original image is obtained based on the image display instruction of the first terminal and the original image.
  11. An image processing apparatus, comprising:
    a processor;
    a memory for storing processor-executable instructions;
    wherein the processor is configured to perform the image processing method according to any one of claims 1 to 7, or the image processing method according to claim 8.
  12. A storage medium having instructions stored therein which, when executed by a processor of a terminal, enable the terminal to perform the image processing method of any one of claims 1 to 7, or to perform the image processing method of claim 8.
CN202280004332.XA 2022-06-08 2022-06-08 Image processing method, device and storage medium Pending CN117546479A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/097695 WO2023236115A1 (en) 2022-06-08 2022-06-08 Image processing method and apparatus, and storage medium

Publications (1)

Publication Number Publication Date
CN117546479A true CN117546479A (en) 2024-02-09

Family

ID=89117243

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280004332.XA Pending CN117546479A (en) 2022-06-08 2022-06-08 Image processing method, device and storage medium

Country Status (2)

Country Link
CN (1) CN117546479A (en)
WO (1) WO2023236115A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080104546A (en) * 2007-05-28 2008-12-03 삼성전자주식회사 Real-size preview system and control method in terminal that have digitial camera function
CN105263134A (en) * 2015-10-08 2016-01-20 惠州Tcl移动通信有限公司 Image transmission method and mobile equipment
CN106251279A (en) * 2016-08-18 2016-12-21 深圳市金立通信设备有限公司 A kind of image processing method and terminal
CN109857882A (en) * 2018-12-20 2019-06-07 惠州Tcl移动通信有限公司 Image processing method, device and storage medium
CN110602412B (en) * 2019-08-30 2022-04-29 北京迈格威科技有限公司 IPC, image processing device, image processing system and method
CN113810593B (en) * 2020-06-15 2023-08-01 Oppo广东移动通信有限公司 Image processing method, device, storage medium and electronic equipment

Also Published As

Publication number Publication date
WO2023236115A1 (en) 2023-12-14

Similar Documents

Publication Publication Date Title
US11368632B2 (en) Method and apparatus for processing video, and storage medium
CN105100829B (en) Video content intercept method and device
CN108419016B (en) Shooting method and device and terminal
US11539888B2 (en) Method and apparatus for processing video data
CN111953904B (en) Shooting method, shooting device, electronic equipment and storage medium
CN114009003A (en) Image acquisition method, device, equipment and storage medium
CN112261453A (en) Method, device and storage medium for transmitting subtitle splicing map
CN111726516A (en) Image processing method and device
CN110868561A (en) Video call method, video call device and computer readable storage medium
CN111586296B (en) Image capturing method, image capturing apparatus, and storage medium
CN111461950B (en) Image processing method and device
CN117546479A (en) Image processing method, device and storage medium
CN114397990A (en) Image distribution method and device, electronic equipment and computer readable storage medium
CN107682623B (en) Photographing method and device
CN113766115B (en) Image acquisition method, mobile terminal, device and storage medium
CN114500819B (en) Shooting method, shooting device and computer readable storage medium
CN114339015B (en) Photographing processing method, photographing processing device and storage medium
CN117412181A (en) Camera system and terminal device
CN118102095A (en) Camera control method, device, camera, readable storage medium and chip
CN118350346A (en) Document preview method and device, electronic equipment and storage medium
CN117956268A (en) Preview frame rate control method and device thereof
CN115733913A (en) Continuous photographing method and device and storage medium
CN115914757A (en) Multimedia data processing method, device and system
CN116029909A (en) Monitoring video processing method and device, chip and electronic equipment
CN118175420A (en) Image acquisition method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination