WO2020057199A1 - Imaging method, device and electronic device - Google Patents

Imaging method, device and electronic device

Info

Publication number
WO2020057199A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
frame
exposure
original
original image
Prior art date
Application number
PCT/CN2019/091580
Other languages
English (en)
French (fr)
Inventor
***
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2020057199A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Definitions

  • the present application relates to the technical field of mobile terminals, and in particular, to an imaging method, device, and electronic device.
  • This application is intended to solve at least one of the technical problems in the related art.
  • To this end, this application proposes an imaging method, which is implemented by an application program.
  • By setting different exposure parameters for the original images, the dynamic range of the acquired original images is increased, and the multiple frames of original images obtained by exposure are synthesized, improving the quality of the imaging image in night scene mode.
  • the present application proposes an imaging device.
  • the present application proposes an electronic device.
  • the present application proposes a computer-readable storage medium.
  • An embodiment of one aspect of the present application proposes an imaging method, which is implemented by an application program and includes: performing metering to determine exposure parameters corresponding to multiple frames of original images; controlling, through a hardware abstraction layer HAL, an image sensor to perform exposure using the exposure parameters to obtain multiple frames of original images; synthesizing the multiple frames of original images to obtain an imaging image; and displaying the imaging image.
  • An embodiment of another aspect of the present application provides an imaging device, including:
  • a determining module configured to perform metering to determine exposure parameters corresponding to multiple frames of images;
  • a control module configured to control, through a hardware abstraction layer HAL, an image sensor to perform exposure using the exposure parameters to obtain multiple frames of original images;
  • a synthesis module configured to synthesize the multiple frames of images to obtain an imaging image; and
  • a display module configured to display the imaging image.
  • An embodiment of another aspect of the present application provides an electronic device, including: a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the imaging method according to the foregoing aspect is implemented.
  • Another embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the imaging method according to the foregoing aspect is implemented.
  • Metering is performed to determine exposure parameters corresponding to multiple frames of original images; the image sensor is controlled through the hardware abstraction layer HAL to perform exposure using the exposure parameters to obtain multiple frames of original images; the multiple frames of original images are synthesized to obtain the imaging image; and the imaging image is displayed.
  • FIG. 1 is a schematic flowchart of an imaging method according to an embodiment of the present application.
  • FIG. 2 is a schematic flowchart of another imaging method according to an embodiment of the present application.
  • FIG. 3 is a schematic flowchart of still another imaging method according to an embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of an imaging device according to an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of another imaging device according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of the internal structure of the electronic device 200 in one embodiment.
  • FIG. 7 is a schematic diagram of an image processing circuit 90 in one embodiment.
  • FIG. 1 is a schematic flowchart of an imaging method according to an embodiment of the present application.
  • the method includes the following steps:
  • Step 101: Metering is performed to determine exposure parameters corresponding to multiple frames of original images.
  • The execution subject of this embodiment of the present application is an application program.
  • The exposure parameters include one or more combinations of exposure duration, sensitivity, and exposure compensation value.
  • Specifically, a user shooting operation is detected. When a user shooting operation is detected, a reference exposure amount is determined according to the brightness information of the current preview image; the target exposure amount of each frame of original image is determined according to the reference exposure amount and the exposure compensation value preset for each frame of original image; and the exposure duration of each frame of original image is determined according to the target exposure amount of each frame of original image and the preset sensitivity of each frame of original image, thereby determining the exposure parameters of the multiple frames of original images.
  • Step 102: The image sensor is controlled through the hardware abstraction layer HAL to perform exposure using the exposure parameters, obtaining multiple frames of original images.
  • An original image refers to an almost unprocessed image obtained directly from the CCD or CMOS image sensor, for example an original image in RAW format. The original image contains more image detail; acquiring and processing original images to obtain the imaging image can make the details of the bright and dark parts of the imaging image clearer and improve imaging quality.
  • Specifically, the application program sends a control instruction to the hardware abstraction layer HAL, so that the HAL calls an interface function according to the control instruction and controls the image sensor to perform exposure using the determined exposure parameters, obtaining multiple frames of original images corresponding to different exposure parameters.
  • Step 103: The multiple frames of original images are synthesized to obtain an imaging image.
  • After the last frame of original image is obtained, the brightness information Y of the imaging image is determined according to the luminance components superimposed over the frames of original images, and color component synthesis is performed on the original images stored in the queue to obtain the chrominance information U and density information V of the imaging image; the imaging image is then obtained according to the determined brightness information Y, chrominance information U, and density information V.
  • Step 104: Display the imaging image.
  • Specifically, the synthesized imaging image is displayed.
  • For example, it can be played back on the display screen of the electronic device, or transmitted to a computer via USB or a wireless transmission device for display.
  • Metering is performed to determine exposure parameters corresponding to multiple frames of original images; the image sensor is controlled through the hardware abstraction layer HAL to perform exposure using the exposure parameters to obtain multiple frames of original images; the multiple frames of original images are synthesized to obtain an imaging image; and the imaging image is displayed. By setting different exposure parameters for the original images, the dynamic range of the acquired original images is increased, and by synthesizing the multiple frames of original images obtained by exposure, the quality of the imaging image in night scene mode is improved.
  • Based on the foregoing embodiment, an embodiment of the present application further proposes an imaging method that further explains how the exposure parameters of each frame of original image to be acquired are determined through metering.
  • FIG. 2 is a schematic flowchart of another imaging method provided by an embodiment of the present application. As shown in FIG. 2, step 101 may further include the following sub-steps:
  • Step 1011 Detect a user shooting operation.
  • Specifically, on the application side, the user's shooting operation is detected, where the shooting operation may be a tap-to-shoot operation or a slide-to-shoot operation.
  • Step 1012 When a user shooting operation is detected, a reference exposure amount is determined according to the brightness information of the current preview image.
  • Specifically, when the user's shooting operation is detected, the reference exposure amount is determined according to the brightness information of the preview image.
  • As a possible implementation, a metering module in the electronic device measures the brightness information of the preview image corresponding to the current shooting scene, converts the measured brightness information at a set lower sensitivity, and determines the reference exposure amount, denoted EV0.
  • For example, if the sensitivity measured by the metering module is 500 ISO with an exposure duration of 50 milliseconds (ms) and the target sensitivity is 100 ISO, the conversion yields a sensitivity of 100 ISO and an exposure duration of 250 ms; the combination of 100 ISO and 250 ms is taken as the reference exposure amount EV0.
  • EV0 is not a fixed value but varies with the brightness information of the preview image.
  • When the ambient brightness changes, the brightness information of the preview image changes, and the reference exposure amount EV0 changes accordingly.
  • Step 1013: Determine the target exposure amount of each frame of original image according to the reference exposure amount and the exposure compensation value preset for each frame of original image.
  • The preset exposure compensation values of the frames of original images range from EV(-24) to EV(+12). According to the scenario, the number of frames of original images to be captured is set, and the interval of the exposure compensation values, or the exposure compensation value corresponding to each frame, is set. Here, "+" means increasing the exposure on the basis of the reference exposure amount determined by metering, "-" means decreasing the exposure, and the corresponding number is the number of steps of exposure compensation. The target exposure amount of each frame of original image to be acquired is determined according to the preset exposure compensation value and the reference exposure amount.
  • For example, if the exposure compensation value of a frame of original image is EV(-6), where -6 is the number of steps of exposure compensation, and the reference exposure amount is EV0, the target exposure amount of that frame is EV0 × 2^(-6), that is, EV0/64, which lowers the brightness at which that frame is acquired; if the exposure compensation value of a frame of original image is EV(+2) and the reference exposure amount is EV0, the target exposure amount of that frame is EV0 × 2^2, that is, 4 times EV0, which raises the brightness at which that frame is acquired. The target exposure amounts of the other frames to be acquired are determined in the same way and are not listed here one by one.
  • Step 1014: Determine the exposure duration of each frame of original image according to the target exposure amount of each frame of original image and the preset sensitivity of each frame of original image.
  • The preset sensitivity of each frame of original image is the same, and the sensitivity ranges from 100 ISO to 200 ISO.
  • When the frames of original images are acquired, the aperture value is fixed; the target exposure amount is jointly determined by the sensitivity and the exposure duration, so once the sensitivity is determined, the corresponding exposure duration can be determined.
  • For example, if the sensitivity ISO value and exposure duration corresponding to the reference exposure amount are 100 ISO and 250 ms respectively, the preset sensitivity of a frame of original image is 100 ISO, and the preset exposure compensation value is EV(-3), then the target exposure duration of that frame is 250 ms × 2^(-3) = 31.25 ms ≈ 32 ms, which means the exposure duration is reduced; similarly, when the exposure compensation value is EV(+1), the resulting exposure duration is 500 ms, which means the exposure duration is increased. In the same way, the exposure duration of each frame of original image can be determined.
  • By determining the exposure parameters corresponding to the multiple frames of original images, the dynamic range of original-image imaging is expanded; the image sensor is controlled through the hardware abstraction layer HAL to perform exposure using the exposure parameters to obtain multiple frames of original images; the multiple frames of original images are synthesized to obtain an imaging image; and the imaging image is displayed.
  • By setting different exposure parameters for the original images, the dynamic range of the acquired original images is increased, and by synthesizing the multiple frames of original images obtained by exposure, the image quality in night scene mode is improved.
  • FIG. 3 is a schematic flowchart of another imaging method provided by an embodiment of the present application. As shown in FIG. 3, the method includes the following steps:
  • Step 301 Metering to determine exposure parameters corresponding to multiple frames of original images.
  • For details, refer to steps 1011-1014 in the embodiment corresponding to FIG. 2.
  • The principles are the same and are not repeated here.
  • Step 302: Set the format and image size of each frame of original image collected by the image sensor by calling the interface between the application and the HAL.
  • As a possible implementation, in the Android system, the interface function mCameraDevice.createReprocessableCaptureSession between the application and the HAL can be called to set the format and image size of each frame of original image collected by the image sensor.
  • For example, the original image can be set to RAW format with a bit depth of 12 bits or 14 bits.
  • Step 303: The image sensor is controlled through the hardware abstraction layer HAL to perform exposure using the exposure parameters to obtain original images; each time one frame of original image is obtained, it is stored using a queue created in advance by the application program.
  • Specifically, the image sensor is controlled through the hardware abstraction layer HAL to perform exposure sequentially with the corresponding exposure parameters; whenever the image sensor is controlled to perform one exposure and one frame of original image is obtained, the frame is stored using the queue created in advance by the application.
  • Step 304: Whenever a subsequent frame of original image is stored in the queue, perform luminance-component superposition between the subsequent frame of original image and the original images already stored in the queue, and use the image obtained by the superposition as an echo image.
  • In one scenario, when only one frame of original image is stored in the queue and a subsequent frame is stored, the luminance components of the subsequent frame and the stored frame are superimposed; specifically, the luminance components of the corresponding pixels of the two frames are superimposed, and the image obtained by the superposition is used as the echo image. Step 305 is then performed to carry out image processing on the echo image.
  • Whenever a further frame of original image is stored in the queue, its luminance component is superimposed with the echo image already obtained, the result is used as the new echo image, and step 305 is executed to perform image processing on the obtained echo image.
  • Each time a subsequent frame of original image is obtained, its luminance component is superimposed with the original images already stored in the queue, and the superimposed luminance component is used as the luminance component of the currently obtained frame, which serves as the echo image; that is, the luminance component of the echo image is the superimposed luminance component.
  • Step 305: The interface between the application program and the HAL is called, image processing is performed on the obtained frame of echo image, and the processed echo image is displayed.
  • The image processing includes one or more combinations of color space conversion, noise reduction, sharpness adjustment, and color correction.
  • Specifically, the interface between the application program and the HAL is called, and a processing request is sent to the HAL through the interface, where the processing request indicates the storage location of the echo image before image processing and the corresponding image information, the image information including the image size and/or shooting parameters.
  • The HAL reads the echo image before image processing from the storage location and sends the echo image and the corresponding image information to the image processor for image processing. Specifically, color space conversion is performed on the echo image, converting it from the RGB space corresponding to the original image to the YUV space so that the image processor ISP can process it.
  • After the processing, a hardware encoder encodes the echo image, converting it from the YUV space back to the RGB space, so that the processed echo image can be displayed on the display interface of the application program.
  • Through this processing and display of the echo image, the user can see in time the display effect of the superimposed images obtained so far and thus follow the shooting progress; since acquiring and processing multiple frames of original images takes a certain amount of time, this prevents the user from mistakenly believing that shooting is stuck or unresponsive, improving the user experience.
  • Step 307: The acquired multiple frames of original images are synthesized to obtain an imaging image.
  • According to the weight of each original image, a weighted calculation of the chrominance information U is performed over each pixel corresponding to the multiple frames of original images to obtain the chrominance information U of the imaging image;
  • the density information V is likewise obtained by weighted calculation.
  • The brightness information Y of the imaging image is determined according to the luminance components superimposed over the frames of original images. The imaging image is then obtained according to the determined brightness information Y, chrominance information U, and density information V.
  • Step 308: Display the imaging image.
  • Specifically, the obtained imaging image is subjected to image processing: noise reduction, sharpness adjustment, and color correction are performed on the imaging image, and the imaging image is converted from YUV space to RGB space.
  • For example, a JPEG-format imaging image is obtained through hardware encoding, so that a display device supporting RGB display can display the imaging image.
  • The exposure parameters of the original images to be acquired are determined by detecting the light of the current shooting environment, so that the dynamic range is large.
  • Each acquired frame of original image is stored in a queue created in advance by the application, and the luminance component of each subsequently collected frame is superimposed with the stored original image or the echo image already obtained, so that the subsequent frame, with the superimposed luminance component as its brightness information, is displayed; the user can thus see the effect of the superimposed original images in time and follow the shooting progress.
  • After all the frames of original images to be acquired are obtained, the original images are synthesized to obtain the imaging image, so that the brighter parts are suppressed while the brightness and detail of the darker parts are improved, which enlarges the dynamic range in night scene mode and improves imaging quality.
  • the present application also proposes an imaging device.
  • FIG. 4 is a schematic structural diagram of an imaging device according to an embodiment of the present application.
  • the device includes a determination module 41, a control module 42, a synthesis module 43, and a display module 44.
  • the determining module 41 is configured to perform photometry to determine exposure parameters corresponding to multiple frames of images.
  • the control module 42 is configured to control the image sensor to use an exposure parameter to perform exposure through a hardware abstraction layer HAL to obtain multiple frames of original images.
  • a synthesizing module 43 is configured to synthesize multiple frames of images to obtain an imaging image.
  • The display module 44 is configured to display the imaging image.
  • In the imaging device of this embodiment, metering is performed to determine exposure parameters corresponding to multiple frames of original images;
  • the image sensor is controlled through the hardware abstraction layer HAL to perform exposure using the exposure parameters to obtain multiple frames of original images;
  • the multiple frames of original images are synthesized to obtain an imaging image, and the imaging image is displayed. By setting different exposure parameters for the original images, the dynamic range of the acquired original images is increased, and by synthesizing the multiple frames of original images obtained by exposure, the quality of the imaging image in night scene mode is improved.
  • FIG. 5 is a schematic structural diagram of another imaging device provided by an embodiment of the present application. On the basis of the previous embodiment, the device further includes: a setting module 51, a storage module 52, and a processing module 53.
  • a setting module 51 is used to set the format and image size of each frame of the original image collected by the image sensor by calling the interface between the application and the HAL.
  • the storage module 52 is configured to store a frame of the original image by using a queue created in advance by the application program each time the image sensor is controlled to perform an exposure to obtain a frame of the original image.
  • The processing module 53 is configured to superimpose the luminance components of the original images stored in the queue to obtain an echo image and display the echo image; whenever a subsequent frame of original image is stored in the queue, the luminance component of the subsequent frame is superimposed with the echo image again, and the image obtained by the renewed superposition is displayed as the echo image.
  • The processing module 53 is further configured to perform image processing on each obtained frame of echo image by calling the interface between the application program and the HAL; the image processing includes one or more combinations of color space conversion, noise reduction, sharpness adjustment, and color correction.
  • The processing module 53 is further specifically configured to:
  • send a processing request to the HAL by calling the interface, the processing request indicating the storage location of the echo image before image processing and the corresponding image information, where the image information includes the image size and/or shooting parameters and is used by the HAL to perform image processing after reading the echo image from the storage location.
  • the foregoing synthesis module 43 is specifically configured to:
  • Color component synthesis is performed on the original images stored in the queue to obtain the chrominance information U and density information V of the imaging image; the brightness information Y of the imaging image is determined according to the luminance components superimposed over the frames of original images.
  • the foregoing determining module 41 is specifically configured to:
  • Detect a user shooting operation; when a user shooting operation is detected, determine a reference exposure amount according to the brightness information of the current preview image; determine the target exposure amount of each frame of original image according to the reference exposure amount and the exposure compensation value preset for each frame of original image; and determine the exposure duration of each frame of original image according to the target exposure amount of each frame of original image and the preset sensitivity of each frame of original image.
  • The exposure parameters include one or more combinations of exposure duration, sensitivity, and exposure compensation value.
  • The preset sensitivity of each frame of original image is the same, with the sensitivity ranging from 100 ISO to 200 ISO; the preset exposure compensation values of the frames of original images range from EV(-24) to EV(+12).
  • In the imaging device of this embodiment of the present application, the exposure parameters of the original images to be acquired are determined by detecting the light of the current shooting environment, so that the dynamic range is large.
  • Each acquired frame of original image is stored in a queue created in advance by the application, and the luminance component of each subsequently collected frame is superimposed with the stored original image and the echo image already obtained, so that the subsequent frame, with the superimposed luminance component as its brightness information, is displayed; the user can thus see the effect of the superimposed original images in time and follow the shooting progress.
  • After all the frames of original images to be acquired are obtained, the original images are synthesized to obtain the imaging image, and the imaging image is displayed. In night scene shooting mode, the original images are superimposed and synthesized on the basis of the large dynamic range, so that the brighter parts are suppressed while the brightness and detail of the darker parts are improved, improving the shooting quality in night scene mode.
  • An embodiment of the present application further provides an electronic device including a memory, a processor, and a computer program stored in the memory and executable on the processor.
  • When the processor executes the program, the imaging method described in the foregoing method embodiments is implemented.
  • FIG. 6 is a schematic diagram of the internal structure of the electronic device 200 in one embodiment.
  • the electronic device 200 includes a processor 60, a memory 50 (for example, a non-volatile storage medium), an internal memory 82, a display screen 83, and an input device 84 connected through a system bus 81.
  • the memory 50 of the electronic device 200 stores an operating system and computer-readable instructions.
  • the computer-readable instructions can be executed by the processor 60 to implement the control method in the embodiment of the present application.
  • the processor 60 is used to provide computing and control capabilities to support the operation of the entire electronic device 200.
  • The internal memory 82 of the electronic device 200 provides an environment for the execution of the computer-readable instructions in the memory 50.
  • The display screen 83 of the electronic device 200 may be a liquid crystal display or an electronic ink display, and the input device 84 may be a touch layer covering the display screen 83, a button, trackball, or touchpad provided on the housing of the electronic device 200, or an external keyboard, trackpad, or mouse.
  • the electronic device 200 may be a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, or a wearable device (for example, a smart bracelet, a smart watch, a smart helmet, or smart glasses).
  • FIG. 6 is only a schematic diagram of the part of the structure related to the solution of the present application and does not constitute a limitation on the electronic device 200 to which the solution of the present application is applied.
  • The specific electronic device 200 may include more or fewer components than shown in the figure, combine some components, or have a different arrangement of components.
  • the electronic device 200 includes an image processing circuit 90.
  • The image processing circuit 90 may be implemented using hardware and/or software components and includes various processing units that define an ISP (Image Signal Processing) pipeline.
  • FIG. 7 is a schematic diagram of an image processing circuit 90 in one embodiment. As shown in FIG. 7, for ease of description, only aspects of the image processing technology related to the embodiments of the present application are shown.
  • the image processing circuit 90 includes an ISP processor 91 (the ISP processor 91 may be the processor 60) and a control logic 92.
  • the image data captured by the camera 93 is first processed by the ISP processor 91.
  • the ISP processor 91 analyzes the image data to capture image statistical information that can be used to determine one or more control parameters of the camera 93.
  • the camera 93 may include one or more lenses 932 and an image sensor 934.
  • the image sensor 934 may include a color filter array (such as a Bayer filter). The image sensor 934 may obtain light intensity and wavelength information captured by each imaging pixel, and provide a set of raw image data that can be processed by the ISP processor 91.
  • The sensor 94 (such as a gyroscope) may provide acquired image processing parameters (such as image stabilization parameters) to the ISP processor 91 based on the interface type of the sensor 94.
  • the sensor 94 interface may be a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the foregoing interfaces.
  • the image sensor 934 may also send the original image data to the sensor 94.
  • the sensor 94 may provide the original image data to the ISP processor 91 based on the interface type of the sensor 94, or the sensor 94 stores the original image data into the image memory 95.
  • the ISP processor 91 processes the original image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 91 may perform one or more image processing operations on the original image data and collect statistical information about the image data. The image processing operations may be performed with the same or different bit depth accuracy.
  • the ISP processor 91 may also receive image data from the image memory 95.
  • the sensor 94 interface sends the original image data to the image memory 95, and the original image data in the image memory 95 is then provided to the ISP processor 91 for processing.
  • The image memory 95 may be the memory 50, a part of the memory 50, a storage device, or an independent dedicated memory within the electronic device, and may include a DMA (Direct Memory Access) feature.
  • Upon receiving original image data from the image sensor 934 interface, from the sensor 94 interface, or from the image memory 95, the ISP processor 91 may perform one or more image processing operations, such as time-domain filtering.
  • the processed image data may be sent to the image memory 95 for further processing before being displayed.
  • The ISP processor 91 receives the processed data from the image memory 95 and performs image data processing on it in the raw domain and in the RGB and YCbCr color spaces.
  • the image data processed by the ISP processor 91 may be output to a display 97 (the display 97 may include a display 83) for viewing by a user and / or further processing by a graphics engine or GPU (Graphics Processing Unit).
  • the output of the ISP processor 91 can also be sent to the image memory 95, and the display 97 can read image data from the image memory 95.
  • the image memory 95 may be configured to implement one or more frame buffers.
  • the output of the ISP processor 91 may be sent to an encoder / decoder 96 to encode / decode image data.
  • the encoded image data can be saved and decompressed before being displayed on the display 97 device.
  • the encoder / decoder 96 may be implemented by a CPU or a GPU or a coprocessor.
  • the statistical data determined by the ISP processor 91 may be sent to the control logic unit 92.
  • the statistical data may include image sensor 934 statistical information such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, and lens 932 shading correction.
  • The control logic unit 92 may include a processing element and/or a microcontroller that executes one or more routines (such as firmware); the one or more routines may determine the control parameters of the camera 93 and the control parameters of the ISP processor 91 according to the received statistical data.
  • The control parameters of the camera 93 may include sensor 94 control parameters (such as gain, integration time for exposure control, and anti-shake parameters), camera flash control parameters, lens 932 control parameters (such as focal length for focusing or zooming), or a combination of these parameters.
  • the ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (eg, during RGB processing), and lens 932 shading correction parameters.
  • For example, the following are the steps of implementing the imaging method using the processor 60 in FIG. 6 or the image processing circuit 90 (specifically, the ISP processor 91) in FIG. 7:
  • 01: performing metering to determine exposure parameters corresponding to multiple frames of original images; 02: controlling, through the hardware abstraction layer HAL, the image sensor to perform exposure using the exposure parameters to obtain multiple frames of original images; 03: synthesizing the multiple frames of original images to obtain an imaging image; and 04: displaying the imaging image.
  • An embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored.
  • When the instructions in the storage medium are executed by a processor, the imaging method described in the foregoing method embodiments is implemented.
  • first and second are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, the features defined as “first” and “second” may explicitly or implicitly include at least one of the features. In the description of the present application, the meaning of "a plurality” is at least two, for example, two, three, etc., unless it is specifically and specifically defined otherwise.
  • Any process or method description in a flowchart or otherwise described herein can be understood as representing a module, fragment, or portion of code that includes one or more executable instructions for implementing the steps of a custom logic function or process,
  • and the scope of the preferred embodiments of the present application includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present application belong.
  • The logic and/or steps represented in a flowchart or otherwise described herein, for example an ordered list of executable instructions that may be considered to implement logical functions, may be embodied in any computer-readable medium for use by, or in combination with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from the instruction execution system, apparatus, or device).
  • A "computer-readable medium" may be any device that can contain, store, communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • More specific examples (a non-exhaustive list) of computer-readable media include the following: an electrical connection (electronic device) with one or more wirings, a portable computer disk cartridge (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CDROM).
  • In addition, the computer-readable medium may even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise suitably processing it if necessary, and then stored in a computer memory.
  • each part of the application may be implemented by hardware, software, firmware, or a combination thereof.
  • Multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system.
  • For example, if implemented in hardware, as in another embodiment, any one of the following technologies known in the art, or a combination thereof, may be used: a discrete logic circuit having logic gate circuits for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gate circuits, a programmable gate array (PGA), a field programmable gate array (FPGA), and the like.
  • A person of ordinary skill in the art can understand that all or part of the steps carried by the methods in the foregoing embodiments can be implemented by a program instructing related hardware.
  • The program can be stored in a computer-readable storage medium.
  • When executed, the program includes one of the steps of the method embodiments or a combination thereof.
  • each functional unit in each embodiment of the present application may be integrated into one processing module, or each unit may exist separately physically, or two or more units may be integrated into one module.
  • the above integrated modules may be implemented in the form of hardware or software functional modules. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
  • the aforementioned storage medium may be a read-only memory, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

This application proposes an imaging method, device, and electronic device, relating to the technical field of mobile terminals. The method is implemented by an application program and includes: performing metering to determine exposure parameters corresponding to multiple frames of original images; controlling, through a hardware abstraction layer HAL, an image sensor to perform exposure using the exposure parameters to obtain multiple frames of original images; synthesizing the multiple frames of original images to obtain an imaging image; and displaying the imaging image. By setting different exposure parameters for the original images, the dynamic range of the acquired original images is increased; by synthesizing the multiple frames of original images obtained by exposure, the quality of the imaging image in night scene mode is improved, solving the technical problem of poor imaging quality in night scene shooting mode in the prior art.

Description

Imaging method, device and electronic device
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to Chinese Patent Application No. 201811117586.4, entitled "Imaging method, device and electronic device", filed by OPPO广东移动通信有限公司 on September 20, 2018.
TECHNICAL FIELD
This application relates to the technical field of mobile terminals, and in particular, to an imaging method, an imaging device, and an electronic device.
BACKGROUND
With the development of mobile terminal technology and image processing technology, users have increasingly high requirements for the effect of photographs, especially for night scene shooting, hoping that pictures taken of night scenes are clear and noise-free.
However, in the current night scene mode, highlighted parts are blurred due to overexposure, while poorly lit areas cannot be imaged clearly because of their low brightness; forcibly raising the brightness of the dark parts introduces considerable noise into the image, resulting in poor imaging quality.
SUMMARY
This application is intended to solve at least one of the technical problems in the related art to some extent.
To this end, this application proposes an imaging method, implemented by an application program, which increases the dynamic range of the acquired original images by setting different exposure parameters for the original images, and improves the quality of the imaging image in night scene mode by synthesizing the multiple frames of original images obtained by exposure.
This application proposes an imaging device.
This application proposes an electronic device.
This application proposes a computer-readable storage medium.
An embodiment of one aspect of this application proposes an imaging method, implemented by an application program, including:
performing metering to determine exposure parameters corresponding to multiple frames of original images;
controlling, through a hardware abstraction layer HAL, an image sensor to perform exposure using the exposure parameters to obtain multiple frames of original images;
synthesizing the multiple frames of original images to obtain an imaging image; and
displaying the imaging image.
An embodiment of another aspect of this application proposes an imaging device, including:
a determining module configured to perform metering to determine exposure parameters corresponding to multiple frames of images;
a control module configured to control, through a hardware abstraction layer HAL, an image sensor to perform exposure using the exposure parameters to obtain multiple frames of original images;
a synthesis module configured to synthesize the multiple frames of images to obtain an imaging image; and
a display module configured to display the imaging image.
An embodiment of another aspect of this application proposes an electronic device, including: a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the imaging method according to the foregoing aspect is implemented.
An embodiment of another aspect of this application proposes a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the imaging method according to the foregoing aspect is implemented.
The technical solutions provided by the embodiments of this application may include the following beneficial effects:
Metering is performed to determine exposure parameters corresponding to multiple frames of original images; the image sensor is controlled through the hardware abstraction layer HAL to perform exposure using the exposure parameters to obtain multiple frames of original images; the multiple frames of original images are synthesized to obtain an imaging image; and the imaging image is displayed. By setting different exposure parameters for the original images, the dynamic range of the acquired original images is increased, and by synthesizing the multiple frames of original images obtained by exposure, the quality of the imaging image in night scene mode is improved.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and/or additional aspects and advantages of this application will become apparent and easy to understand from the following description of the embodiments in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic flowchart of an imaging method according to an embodiment of this application;
FIG. 2 is a schematic flowchart of another imaging method according to an embodiment of this application;
FIG. 3 is a schematic flowchart of still another imaging method according to an embodiment of this application;
FIG. 4 is a schematic structural diagram of an imaging device according to an embodiment of this application;
FIG. 5 is a schematic structural diagram of another imaging device according to an embodiment of this application;
FIG. 6 is a schematic diagram of the internal structure of an electronic device 200 in one embodiment; and
FIG. 7 is a schematic diagram of an image processing circuit 90 in one embodiment.
DETAILED DESCRIPTION
The embodiments of this application are described in detail below. Examples of the embodiments are shown in the accompanying drawings, in which identical or similar reference numerals throughout denote identical or similar elements or elements having identical or similar functions. The embodiments described below with reference to the drawings are exemplary, are intended to explain this application, and should not be construed as limiting this application.
The imaging method, device, and electronic device of the embodiments of this application are described below with reference to the accompanying drawings.
FIG. 1 is a schematic flowchart of an imaging method according to an embodiment of this application.
As shown in FIG. 1, the method includes the following steps:
Step 101: Perform metering to determine exposure parameters corresponding to multiple frames of original images.
The execution subject of this embodiment of this application is an application program.
The exposure parameters include one or more combinations of exposure duration, sensitivity, and exposure compensation value.
Specifically, a user shooting operation is detected. When a user shooting operation is detected, a reference exposure amount is determined according to the brightness information of the current preview image; the target exposure amount of each frame of original image is determined according to the reference exposure amount and the exposure compensation value preset for each frame of original image; and the exposure duration of each frame of original image is determined according to the target exposure amount of each frame of original image and the preset sensitivity of each frame of original image, thereby determining the exposure parameters of the multiple frames of original images.
Step 102: Control, through the hardware abstraction layer HAL, the image sensor to perform exposure using the exposure parameters to obtain multiple frames of original images.
Here, an original image refers to an almost unprocessed image obtained directly from the CCD or CMOS image sensor, for example an original image in RAW format. The original image contains more image detail; acquiring and processing original images to obtain the imaging image can make the details of the bright and dark parts of the imaging image clearer and improve imaging quality.
Specifically, the application program sends a control instruction to the hardware abstraction layer HAL, so that the HAL calls an interface function according to the control instruction and controls the image sensor to perform exposure using the determined exposure parameters, obtaining multiple frames of original images corresponding to different exposure parameters.
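As an illustration of the control path just described, the following is a minimal sketch using Android's Camera2 API, one plausible way for an application to hand per-frame exposure parameters to the HAL. The class name, the exposure values, and the surrounding objects (device, session, RAW surface, callback, handler) are illustrative assumptions, not the patent's implementation.

```java
import android.hardware.camera2.*;
import android.os.Handler;
import android.view.Surface;
import java.util.ArrayList;
import java.util.List;

final class ExposureBracket {
    // Build one manual-exposure request per frame and submit them as a burst;
    // the HAL then drives the image sensor through the bracketed exposures.
    static void capture(CameraDevice device, CameraCaptureSession session,
                        Surface rawSurface, CameraCaptureSession.CaptureCallback cb,
                        Handler handler) throws CameraAccessException {
        long[] exposureNs = {32_000_000L, 250_000_000L, 500_000_000L}; // 32/250/500 ms
        int iso = 100;                                                 // preset sensitivity
        List<CaptureRequest> burst = new ArrayList<>();
        for (long t : exposureNs) {
            CaptureRequest.Builder b =
                    device.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
            b.addTarget(rawSurface);                                   // RAW output surface
            b.set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_OFF);
            b.set(CaptureRequest.SENSOR_EXPOSURE_TIME, t);             // nanoseconds
            b.set(CaptureRequest.SENSOR_SENSITIVITY, iso);
            burst.add(b.build());
        }
        session.captureBurst(burst, cb, handler);                      // one frame per request
    }
}
```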
Step 103: Synthesize the multiple frames of original images to obtain an imaging image.
As a possible implementation, before the multiple frames of original images are synthesized, each acquired frame needs to be stored and its luminance component superimposed. Specifically, whenever the image sensor is controlled to perform one exposure and one frame of original image is obtained, the frame is stored using a queue created in advance by the application program, and luminance-component superposition is performed on the original images already stored in the queue to obtain an echo image; whenever a subsequent frame of original image is stored in the queue, its luminance component is superimposed with the echo image again to obtain a new echo image. After the last frame of original image is obtained, the brightness information Y of the imaging image is determined according to the luminance components superimposed over the frames of original images, and color component synthesis is performed on the original images stored in the queue to obtain the chrominance information U and density information V of the imaging image; the imaging image is then obtained according to the determined brightness information Y, chrominance information U, and density information V.
Step 104: Display the imaging image.
Specifically, the synthesized imaging image is displayed; for example, it may be played back on the display screen of the electronic device, or transmitted to a computer via USB or a wireless transmission device for display.
In the imaging method of this embodiment, metering is performed to determine exposure parameters corresponding to multiple frames of original images; the image sensor is controlled through the hardware abstraction layer HAL to perform exposure using the exposure parameters to obtain multiple frames of original images; the multiple frames of original images are synthesized to obtain an imaging image; and the imaging image is displayed. By setting different exposure parameters for the original images, the dynamic range of the acquired original images is increased, and by synthesizing the multiple frames of original images obtained by exposure, the quality of the imaging image in night scene mode is improved.
Based on the foregoing embodiment, an embodiment of this application further proposes an imaging method that further explains how the exposure parameters of each frame of original image to be acquired are determined through metering. FIG. 2 is a schematic flowchart of another imaging method provided by an embodiment of this application. As shown in FIG. 2, step 101 may further include the following sub-steps:
Step 1011: Detect a user shooting operation.
Specifically, on the application side, the user's shooting operation is detected, where the shooting operation may be a tap-to-shoot operation or a slide-to-shoot operation.
Step 1012: When a user shooting operation is detected, determine a reference exposure amount according to the brightness information of the current preview image.
Specifically, when the user's confirmed shooting operation is detected, the reference exposure amount is determined according to the brightness information of the preview image. As a possible implementation, a metering module in the electronic device measures the brightness information of the preview image corresponding to the current shooting scene, converts the measured brightness information at a set lower sensitivity, and determines the reference exposure amount, denoted EV0. For example, if the sensitivity measured by the metering module is 500 ISO with an exposure duration of 50 milliseconds (ms) and the target sensitivity is 100 ISO, the conversion yields a sensitivity of 100 ISO and an exposure duration of 250 ms; the combination of 100 ISO and 250 ms is taken as the reference exposure amount EV0.
It should be noted that EV0 is not a fixed value but varies with the brightness information of the preview image: when the ambient brightness changes, the brightness information of the preview image changes, and the reference exposure amount EV0 changes accordingly.
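The conversion above keeps the total exposure, i.e. the product of sensitivity and exposure duration, unchanged while moving to the lower target sensitivity, which is why 500 ISO at 50 ms becomes 100 ISO at 250 ms. A minimal sketch of that arithmetic, with illustrative names:

```java
// Rescale a metered exposure to a target ISO while keeping the product
// ISO x duration (and hence the reference exposure amount EV0) constant.
final class Exposure {
    final int iso;
    final long durationMs;

    Exposure(int iso, long durationMs) {
        this.iso = iso;
        this.durationMs = durationMs;
    }

    Exposure atIso(int targetIso) {
        return new Exposure(targetIso, durationMs * iso / targetIso);
    }
}
// Worked example from the text:
// new Exposure(500, 50).atIso(100) -> iso = 100, durationMs = 250 (taken as EV0)
```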
Step 1013: Determine the target exposure amount of each frame of original image according to the reference exposure amount and the exposure compensation value preset for each frame of original image.
In one scenario, where the jitter of the shooting device is small and more frames of original images can be captured, the preset exposure compensation values of the frames of original images range from EV(-24) to EV(+12). According to the scenario, the number of frames of original images to be captured is set, and the interval of the exposure compensation values, or the exposure compensation value corresponding to each frame, is set. Here, "+" means increasing the exposure on the basis of the reference exposure amount determined by metering, "-" means decreasing the exposure, and the corresponding number is the number of steps of exposure compensation. The target exposure amount of each frame of original image to be acquired is determined according to the preset exposure compensation value of each frame and the reference exposure amount. For example, if the exposure compensation value of a frame of original image is EV(-6), where -6 is the number of compensation steps, and the reference exposure amount is EV0, the target exposure amount of that frame is EV0 × 2^(-6), that is, EV0/64, which lowers the brightness at which that frame is acquired; if the exposure compensation value of a frame of original image is EV(+2) and the reference exposure amount is EV0, the target exposure amount of that frame is EV0 × 2^2, that is, 4 times EV0, which raises the brightness at which that frame is acquired. The target exposure amounts of the other frames to be acquired are determined in the same way and are not listed here one by one.
Step 1014: Determine the exposure duration of each frame of original image according to the target exposure amount of each frame of original image and the preset sensitivity of each frame of original image.
The preset sensitivity of each frame of original image is the same, and the sensitivity ranges from 100 ISO to 200 ISO.
In this embodiment of this application, the aperture value is fixed when the frames of original images are acquired. For each frame of original image to be acquired, the target exposure amount is jointly determined by the sensitivity and the exposure duration; once the sensitivity is determined, the corresponding exposure duration can be determined.
For example, if the sensitivity ISO value and exposure duration corresponding to the reference exposure amount are 100 ISO and 250 ms respectively, the preset sensitivity of a frame of original image is 100 ISO, and the preset exposure compensation value is EV(-3), then the target exposure duration of that frame is 250 ms × 2^(-3) = 31.25 ms ≈ 32 ms, that is, the exposure duration is reduced. Similarly, when the exposure compensation value is EV(+1), the resulting exposure duration is 500 ms, that is, the exposure duration is increased. In the same way, the exposure duration of each frame of original image can be determined. With the wide dynamic range thus set, the frames of original images to be acquired are captured with different exposure durations, so that the details in every part of the image can be clearly imaged under the control of a different exposure duration, improving the imaging effect.
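Since each compensation step at fixed sensitivity and aperture halves or doubles the exposure, the per-frame duration follows from the reference duration by a power of two. A minimal sketch of this reading of step 1014 (the names are illustrative; note the text rounds 31.25 ms up to 32 ms):

```java
final class ExposurePlan {
    // With aperture and per-frame ISO fixed, EV compensation scales the
    // reference exposure duration by 2^steps.
    static long exposureDurationMs(long referenceDurationMs, int evSteps) {
        return Math.round(referenceDurationMs * Math.pow(2, evSteps));
    }
    // exposureDurationMs(250, -3) -> 31 ms (31.25 ms; the text rounds to ~32 ms)
    // exposureDurationMs(250,  0) -> 250 ms
    // exposureDurationMs(250, +1) -> 500 ms
}
```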
In the imaging method of this embodiment of this application, determining the exposure parameters corresponding to the multiple frames of original images expands the dynamic range of original-image imaging; the image sensor is controlled through the hardware abstraction layer HAL to perform exposure using the exposure parameters to obtain multiple frames of original images; the multiple frames of original images are synthesized to obtain an imaging image; and the imaging image is displayed. By setting different exposure parameters for the original images, the dynamic range of the acquired original images is increased, and by synthesizing the multiple frames of original images obtained by exposure, the quality of the imaging image in night scene mode is improved.
Based on the foregoing embodiments, this application further proposes a possible implementation of the imaging method that further explains the generation of the imaging image; during generation, the echo images of different brightness obtained by superposition are also displayed so that the user can view the captured image in time. FIG. 3 is a schematic flowchart of still another imaging method provided by an embodiment of this application. As shown in FIG. 3, the method includes the following steps:
Step 301: Perform metering to determine exposure parameters corresponding to multiple frames of original images.
For details, refer to steps 1011-1014 in the embodiment corresponding to FIG. 2; the principles are the same and are not repeated here.
Step 302: Set the format and image size of each frame of original image collected by the image sensor by calling the interface between the application and the HAL.
As a possible implementation, in the Android system, the interface function mCameraDevice.createReprocessableCaptureSession between the application and the HAL can be called to set the format and image size of each frame of original image collected by the image sensor; for example, the original image can be set to RAW format with a bit depth of 12 bits or 14 bits.
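A minimal sketch of this configuration step, assuming the named interface is Android Camera2's CameraDevice.createReprocessableCaptureSession. The dimensions, buffer count, and the choice of YUV_420_888 as the reprocessing input format are illustrative assumptions (RAW_SENSOR output carries the sensor's native 12- or 14-bit samples); whether a particular device accepts this combination depends on its reprocessing capabilities.

```java
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.params.InputConfiguration;
import android.media.ImageReader;
import android.os.Handler;
import java.util.Arrays;

final class RawSessionConfig {
    // Set the format and size of the frames the sensor will deliver.
    static ImageReader configure(CameraDevice mCameraDevice, int width, int height,
                                 CameraCaptureSession.StateCallback cb, Handler handler)
            throws CameraAccessException {
        ImageReader rawReader = ImageReader.newInstance(
                width, height, ImageFormat.RAW_SENSOR, /* maxImages = */ 8);
        InputConfiguration input =
                new InputConfiguration(width, height, ImageFormat.YUV_420_888);
        mCameraDevice.createReprocessableCaptureSession(
                input, Arrays.asList(rawReader.getSurface()), cb, handler);
        return rawReader;
    }
}
```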
Step 303: Control, through the hardware abstraction layer HAL, the image sensor to perform exposure using the exposure parameters to obtain original images; each time one frame of original image is obtained, store it using a queue created in advance by the application.
Specifically, according to the exposure parameters corresponding to the multiple frames of original images determined by metering, the image sensor is controlled through the hardware abstraction layer HAL to perform exposure sequentially with the corresponding exposure parameters; whenever the image sensor is controlled to perform one exposure and one frame of original image is obtained, the frame is stored using the queue created in advance by the application.
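A minimal sketch of the application-side queue, assuming the frames arrive through a Camera2 ImageReader; the capacity and the drop-on-full policy are illustrative choices not specified in the text.

```java
import android.media.Image;
import android.media.ImageReader;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

final class FrameQueue {
    // Queue created in advance by the application; the capacity is illustrative.
    final BlockingQueue<Image> frames = new ArrayBlockingQueue<>(8);

    // One exposure -> one original frame; park it until superposition/synthesis.
    final ImageReader.OnImageAvailableListener onFrame = reader -> {
        Image frame = reader.acquireNextImage();
        if (frame != null && !frames.offer(frame)) {
            frame.close(); // queue full: release the buffer rather than stall the HAL
        }
    };
    // Registration (illustrative): rawReader.setOnImageAvailableListener(onFrame, handler);
}
```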
Step 304: Whenever a subsequent frame of original image is stored in the queue, perform luminance-component superposition between the subsequent frame of original image and the original images already stored in the queue, and use the image obtained by the superposition as an echo image.
In one scenario, when only one frame of original image is stored in the queue and a subsequent frame is stored, the luminance components of the subsequent frame and the stored frame are superimposed; specifically, the luminance components of the corresponding pixels of the two frames are superimposed, the image obtained by the superposition is used as the echo image, and step 305 is then performed to carry out image processing on the echo image.
Based on the foregoing scenario, whenever a further frame of original image is stored in the queue, its luminance component is superimposed with the echo image already obtained, the image obtained by the renewed superposition is used as the new echo image, and step 305 is then performed to carry out image processing on the obtained echo image.
It should be noted that whenever a subsequent frame of original image is obtained, luminance-component superposition is performed with the original images already stored in the queue; the superimposed luminance component is used as the luminance component of the currently obtained frame, which serves as the echo image. In other words, the luminance component of the echo image is the superimposed luminance component.
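A minimal sketch of the running luminance superposition: an accumulator over the Y plane is updated with each newly stored frame and becomes the luminance of the next echo image. Clamping to the 8-bit display range is an illustrative choice the text does not spell out.

```java
final class EchoLuma {
    private final int[] accum;   // running sum of Y, one slot per pixel
    private final byte[] echoY;  // luminance plane of the current echo image

    EchoLuma(int pixelCount) {
        accum = new int[pixelCount];
        echoY = new byte[pixelCount];
    }

    // Superimpose the Y plane of one newly stored original frame.
    byte[] addFrame(byte[] frameY) {
        for (int i = 0; i < frameY.length; i++) {
            accum[i] += frameY[i] & 0xFF;              // unsigned luma value
            echoY[i] = (byte) Math.min(255, accum[i]); // clamp for display
        }
        return echoY; // hand to step 305 for processing and display
    }
}
```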
Step 305: Call the interface between the application and the HAL, perform image processing on the obtained frame of echo image, and display the processed echo image.
The image processing includes one or more combinations of color space conversion, noise reduction, sharpness adjustment, and color correction.
Specifically, the interface between the application and the HAL is called, and a processing request is sent to the HAL through the interface, where the processing request indicates the storage location of the echo image before image processing and the corresponding image information, the image information including the image size and/or shooting parameters. The HAL reads the echo image from the storage location according to the request and sends the echo image and the corresponding image information to the image processor for image processing. Specifically, color space conversion is performed on the echo image, converting it from the RGB space corresponding to the original image to the YUV space so that the image processor ISP can process it; after the processing, a hardware encoder encodes the echo image, converting it from the YUV space back to the RGB space so that the processed echo image can be displayed on the display interface of the application. Through this processing and display of the echo image, the user can see in time the display effect of the superimposed images obtained so far and thus follow the shooting progress; since acquiring and processing multiple frames of original images takes a certain amount of time, this prevents the user from mistakenly believing that shooting is stuck or unresponsive, and improves the user experience.
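The text does not give the conversion equations numerically; the sketch below uses the common BT.601 full-range matrices as an illustrative assumption for the RGB-to-YUV and YUV-to-RGB steps.

```java
final class ColorSpace {
    static int[] rgbToYuv(int r, int g, int b) {
        int y = (int) Math.round( 0.299 * r + 0.587 * g + 0.114 * b);
        int u = (int) Math.round(-0.169 * r - 0.331 * g + 0.500 * b) + 128;
        int v = (int) Math.round( 0.500 * r - 0.419 * g - 0.081 * b) + 128;
        return new int[] {y, u, v};
    }

    static int[] yuvToRgb(int y, int u, int v) {
        int r = clamp(Math.round(y + 1.402 * (v - 128)));
        int g = clamp(Math.round(y - 0.344 * (u - 128) - 0.714 * (v - 128)));
        int b = clamp(Math.round(y + 1.772 * (u - 128)));
        return new int[] {r, g, b};
    }

    private static int clamp(long c) {
        return (int) Math.max(0, Math.min(255, c));
    }
}
```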
Step 307: Synthesize the acquired multiple frames of original images to obtain an imaging image.
Specifically, after all frames of original images are obtained, color space conversion is performed on the multiple frames of original images stored in the queue, converting them from RGB space to YUV space, and color component synthesis is performed on the converted frames. As a possible implementation, according to the weight of each original image, a weighted calculation of the chrominance information U is performed over each pixel corresponding to the multiple frames of original images to obtain the chrominance information U of the imaging image; the density information V is likewise obtained by weighted calculation. The brightness information Y of the imaging image is determined according to the luminance components superimposed over the frames of original images. The imaging image is then obtained according to the determined brightness information Y, chrominance information U, and density information V.
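A minimal sketch of this synthesis: per-pixel weighted averages over the queued frames for U and V, with Y taken from the luminance accumulated in step 304. The weights are left to the caller; equal weights are one plausible choice, and all names are illustrative.

```java
final class Synthesizer {
    // framesU/framesV: one U and one V plane per queued original frame.
    // accumulatedY: the superimposed luminance from step 304.
    static byte[][] synthesize(byte[][] framesU, byte[][] framesV,
                               byte[] accumulatedY, double[] weights) {
        int n = framesU.length, pixels = framesU[0].length;
        byte[] outU = new byte[pixels], outV = new byte[pixels];
        for (int p = 0; p < pixels; p++) {
            double u = 0, v = 0;
            for (int f = 0; f < n; f++) {
                u += weights[f] * (framesU[f][p] & 0xFF); // weighted chrominance U
                v += weights[f] * (framesV[f][p] & 0xFF); // weighted density V
            }
            outU[p] = (byte) Math.min(255, Math.round(u));
            outV[p] = (byte) Math.min(255, Math.round(v));
        }
        return new byte[][] {accumulatedY, outU, outV}; // Y, U, V planes
    }
}
```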
Step 308: Display the imaging image.
Specifically, image processing is performed on the obtained imaging image: noise reduction, sharpness adjustment, and color correction are applied, and the imaging image is converted from YUV space to RGB space; for example, a JPEG-format imaging image is obtained through hardware encoding, so that a display device supporting RGB display can display the imaging image.
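For the final encoding, Android's YuvImage can compress an NV21 buffer straight to JPEG; packing the synthesized Y, U, and V planes into NV21 (full-resolution Y followed by interleaved, subsampled VU) beforehand is assumed here, and the quality setting is illustrative.

```java
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import java.io.ByteArrayOutputStream;

final class JpegEncoder {
    static byte[] encode(byte[] nv21, int width, int height) {
        YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        yuv.compressToJpeg(new Rect(0, 0, width, height), /* quality = */ 95, out);
        return out.toByteArray(); // RGB-displayable JPEG, ready to show or save
    }
}
```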
In the imaging method of this embodiment of this application, the exposure parameters of the original images to be acquired are determined by detecting the light of the current shooting environment, so that the dynamic range is large. During acquisition with these exposure parameters, each acquired frame of original image is stored in a queue created in advance by the application, and the luminance component of each subsequently acquired frame is superimposed with the already stored original image or the echo image already obtained, so that the subsequent frame, with the superimposed luminance component as its brightness information, is displayed; the user can thus see the effect of the superimposed original images in time and follow the shooting progress. After all the frames of original images to be acquired are obtained, the original images are synthesized to obtain the imaging image, so that the brighter parts are suppressed while the brightness and detail of the darker parts are improved, which enlarges the dynamic range in night scene mode and improves imaging quality.
To implement the foregoing embodiments, this application further proposes an imaging device.
FIG. 4 is a schematic structural diagram of an imaging device according to an embodiment of this application.
As shown in FIG. 4, the device includes: a determining module 41, a control module 42, a synthesis module 43, and a display module 44.
The determining module 41 is configured to perform metering to determine exposure parameters corresponding to multiple frames of images.
The control module 42 is configured to control, through the hardware abstraction layer HAL, the image sensor to perform exposure using the exposure parameters to obtain multiple frames of original images.
The synthesis module 43 is configured to synthesize the multiple frames of images to obtain an imaging image.
The display module 44 is configured to display the imaging image.
It should be noted that the foregoing explanations of the method embodiments also apply to the device of this embodiment and are not repeated here.
In the imaging device of this embodiment, metering is performed to determine exposure parameters corresponding to multiple frames of original images; the image sensor is controlled through the hardware abstraction layer HAL to perform exposure using the exposure parameters to obtain multiple frames of original images; the multiple frames of original images are synthesized to obtain an imaging image; and the imaging image is displayed. By setting different exposure parameters for the original images, the dynamic range of the acquired original images is increased, and by synthesizing the multiple frames of original images obtained by exposure, the quality of the imaging image in night scene mode is improved.
Based on the foregoing embodiments, an embodiment of this application further provides a possible implementation of an imaging device. FIG. 5 is a schematic structural diagram of another imaging device provided by an embodiment of this application. On the basis of the previous embodiment, the device further includes: a setting module 51, a storage module 52, and a processing module 53.
The setting module 51 is configured to set the format and image size of each frame of original image collected by the image sensor by calling the interface between the application and the HAL.
The storage module 52 is configured to store one frame of original image using a queue created in advance by the application whenever the image sensor is controlled to perform one exposure and one frame of original image is obtained.
The processing module 53 is configured to perform luminance-component superposition on the original images stored in the queue to obtain an echo image and display the echo image; and, whenever a subsequent frame of original image is stored in the queue, superimpose the luminance component of the subsequent frame with the echo image again and display the image obtained by the renewed superposition as the echo image.
As a possible implementation, the processing module 53 is further configured to, whenever one frame of echo image is obtained, perform image processing on the obtained frame of echo image by calling the interface between the application and the HAL, the image processing including one or more combinations of color space conversion, noise reduction, sharpness adjustment, and color correction.
As a possible implementation, the processing module 53 is further specifically configured to:
send a processing request to the HAL by calling the interface, the processing request indicating the storage location of the echo image before image processing and the corresponding image information, where the image information includes the image size and/or shooting parameters and is used by the HAL to perform image processing after reading the echo image from the storage location.
As a possible implementation, the synthesis module 43 is specifically configured to:
perform color component synthesis on the original images stored in the queue to obtain the chrominance information U and density information V of the imaging image; and determine the brightness information Y of the imaging image according to the luminance components superimposed over the frames of original images.
As a possible implementation, the determining module 41 is specifically configured to:
detect a user shooting operation; when a user shooting operation is detected, determine a reference exposure amount according to the brightness information of the current preview image; determine the target exposure amount of each frame of original image according to the reference exposure amount and the exposure compensation value preset for each frame of original image; and determine the exposure duration of each frame of original image according to the target exposure amount of each frame of original image and the preset sensitivity of each frame of original image.
The exposure parameters include one or more combinations of exposure duration, sensitivity, and exposure compensation value.
As a possible implementation, the preset sensitivity of each frame of original image is the same, with the sensitivity ranging from 100 ISO to 200 ISO, and the preset exposure compensation values of the frames of original images range from EV(-24) to EV(+12).
It should be noted that the foregoing explanations of the method embodiments also apply to the device of this embodiment and are not repeated here.
In the imaging device of this embodiment of this application, the exposure parameters of the original images to be acquired are determined by detecting the light of the current shooting environment, so that the dynamic range is large. During acquisition with these exposure parameters, each acquired frame of original image is stored in a queue created in advance by the application, and the luminance component of each subsequently acquired frame is superimposed with the already stored original image and the echo image already obtained, so that the subsequent frame, with the superimposed luminance component as its brightness information, is displayed; the user can thus see the effect of the superimposed original images in time and follow the shooting progress. After all the frames of original images to be acquired are obtained, the original images are synthesized to obtain the imaging image, and the imaging image is displayed. In night scene shooting mode, the original images are superimposed and synthesized on the basis of the large dynamic range, so that the brighter parts are suppressed while the brightness and detail of the darker parts are improved, improving the shooting quality in night scene mode.
To implement the foregoing embodiments, an embodiment of this application further proposes an electronic device, including: a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the imaging method described in the foregoing method embodiments is implemented.
FIG. 6 is a schematic diagram of the internal structure of an electronic device 200 in one embodiment. The electronic device 200 includes a processor 60, a memory 50 (for example, a non-volatile storage medium), an internal memory 82, a display screen 83, and an input device 84 connected through a system bus 81. The memory 50 of the electronic device 200 stores an operating system and computer-readable instructions. The computer-readable instructions can be executed by the processor 60 to implement the control method of the embodiments of this application. The processor 60 is used to provide computing and control capabilities to support the operation of the entire electronic device 200. The internal memory 82 of the electronic device 200 provides an environment for the execution of the computer-readable instructions in the memory 50. The display screen 83 of the electronic device 200 may be a liquid crystal display or an electronic ink display, and the input device 84 may be a touch layer covering the display screen 83, a button, trackball, or touchpad provided on the housing of the electronic device 200, or an external keyboard, trackpad, or mouse. The electronic device 200 may be a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, or a wearable device (for example, a smart bracelet, a smart watch, a smart helmet, or smart glasses). Those skilled in the art can understand that the structure shown in FIG. 6 is only a schematic diagram of the part of the structure related to the solution of this application and does not constitute a limitation on the electronic device 200 to which the solution of this application is applied; the specific electronic device 200 may include more or fewer components than shown in the figure, combine some components, or have a different arrangement of components.
Referring to FIG. 7, the electronic device 200 of this embodiment of this application includes an image processing circuit 90. The image processing circuit 90 may be implemented using hardware and/or software components and includes various processing units that define an ISP (Image Signal Processing) pipeline. FIG. 7 is a schematic diagram of the image processing circuit 90 in one embodiment. As shown in FIG. 7, for ease of description, only the aspects of the image processing technology related to the embodiments of this application are shown.
As shown in FIG. 7, the image processing circuit 90 includes an ISP processor 91 (the ISP processor 91 may be the processor 60) and a control logic unit 92. Image data captured by a camera 93 is first processed by the ISP processor 91, which analyzes the image data to capture image statistics that can be used to determine one or more control parameters of the camera 93. The camera 93 may include one or more lenses 932 and an image sensor 934. The image sensor 934 may include a color filter array (such as a Bayer filter); the image sensor 934 may obtain the light intensity and wavelength information captured by each imaging pixel and provide a set of raw image data that can be processed by the ISP processor 91. A sensor 94 (such as a gyroscope) may provide acquired image processing parameters (such as image stabilization parameters) to the ISP processor 91 based on the interface type of the sensor 94. The sensor 94 interface may be an SMIA (Standard Mobile Imaging Architecture) interface, another serial or parallel camera interface, or a combination of the foregoing interfaces.
In addition, the image sensor 934 may also send raw image data to the sensor 94; the sensor 94 may provide the raw image data to the ISP processor 91 based on the interface type of the sensor 94, or store the raw image data in an image memory 95.
The ISP processor 91 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 91 may perform one or more image processing operations on the raw image data and collect statistics about the image data. The image processing operations may be performed with the same or different bit depth precision.
The ISP processor 91 may also receive image data from the image memory 95. For example, the sensor 94 interface sends raw image data to the image memory 95, and the raw image data in the image memory 95 is then provided to the ISP processor 91 for processing. The image memory 95 may be the memory 50, a part of the memory 50, a storage device, or an independent dedicated memory within the electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving raw image data from the image sensor 934 interface, from the sensor 94 interface, or from the image memory 95, the ISP processor 91 may perform one or more image processing operations, such as time-domain filtering. The processed image data may be sent to the image memory 95 for further processing before being displayed. The ISP processor 91 receives the processed data from the image memory 95 and performs image data processing on it in the raw domain and in the RGB and YCbCr color spaces. The image data processed by the ISP processor 91 may be output to a display 97 (the display 97 may include the display screen 83) for viewing by the user and/or further processing by a graphics engine or GPU (Graphics Processing Unit). In addition, the output of the ISP processor 91 may also be sent to the image memory 95, and the display 97 may read image data from the image memory 95. In one embodiment, the image memory 95 may be configured to implement one or more frame buffers. In addition, the output of the ISP processor 91 may be sent to an encoder/decoder 96 to encode/decode the image data. The encoded image data may be saved and decompressed before being displayed on the display 97 device. The encoder/decoder 96 may be implemented by a CPU, a GPU, or a coprocessor.
The statistics determined by the ISP processor 91 may be sent to the control logic unit 92. For example, the statistics may include image sensor 934 statistics such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, and lens 932 shading correction. The control logic unit 92 may include a processing element and/or a microcontroller that executes one or more routines (such as firmware); the one or more routines may determine the control parameters of the camera 93 and the control parameters of the ISP processor 91 according to the received statistics. For example, the control parameters of the camera 93 may include sensor 94 control parameters (such as gain, integration time for exposure control, and anti-shake parameters), camera flash control parameters, lens 932 control parameters (such as focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (for example, during RGB processing), as well as lens 932 shading correction parameters.
For example, the following are the steps of implementing the imaging method using the processor 60 in FIG. 6 or the image processing circuit 90 (specifically, the ISP processor 91) in FIG. 7:
01: performing metering to determine exposure parameters corresponding to multiple frames of original images;
02: controlling, through the hardware abstraction layer HAL, the image sensor to perform exposure using the exposure parameters to obtain multiple frames of original images;
03: synthesizing the multiple frames of original images to obtain an imaging image; and
04: displaying the imaging image.
To implement the foregoing embodiments, an embodiment of this application further proposes a computer-readable storage medium on which a computer program is stored; when the instructions in the storage medium are executed by a processor, the imaging method described in the foregoing method embodiments is implemented.
In the description of this specification, descriptions referring to the terms "one embodiment", "some embodiments", "example", "specific example", or "some examples" mean that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of this application. In this specification, schematic expressions of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine the different embodiments or examples described in this specification, and the features of the different embodiments or examples, provided they do not contradict each other.
In addition, the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature. In the description of this application, "a plurality" means at least two, for example two or three, unless specifically and explicitly defined otherwise.
Any process or method description in a flowchart or otherwise described herein can be understood as representing a module, fragment, or portion of code including one or more executable instructions for implementing the steps of a custom logic function or process, and the scope of the preferred embodiments of this application includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of this application belong.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered list of executable instructions that may be considered to implement logical functions, may be embodied in any computer-readable medium for use by, or in combination with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from the instruction execution system, apparatus, or device). As far as this specification is concerned, a "computer-readable medium" may be any means that can contain, store, communicate, propagate, or transmit a program for use by, or in combination with, an instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of computer-readable media include the following: an electrical connection (electronic device) with one or more wirings, a portable computer disk cartridge (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CDROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise suitably processing it if necessary, and then stored in a computer memory.
It should be understood that each part of this application may be implemented by hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one of the following technologies known in the art, or a combination thereof, may be used: a discrete logic circuit having logic gate circuits for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gate circuits, a programmable gate array (PGA), a field programmable gate array (FPGA), and the like.
Those of ordinary skill in the art can understand that all or part of the steps carried by the methods of the foregoing embodiments can be implemented by a program instructing relevant hardware; the program can be stored in a computer-readable storage medium and, when executed, includes one of the steps of the method embodiments or a combination thereof.
In addition, the functional units in the embodiments of this application may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The above integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like. Although the embodiments of this application have been shown and described above, it can be understood that the above embodiments are exemplary and should not be construed as limiting this application; those of ordinary skill in the art may make changes, modifications, substitutions, and variations to the above embodiments within the scope of this application.

Claims (20)

  1. An imaging method, wherein the method is implemented by an application program and comprises the following steps:
    performing metering to determine exposure parameters corresponding to multiple frames of original images;
    controlling, through a hardware abstraction layer HAL, an image sensor to perform exposure using the exposure parameters to obtain multiple frames of original images;
    synthesizing the multiple frames of original images to obtain an imaging image; and
    displaying the imaging image.
  2. The imaging method according to claim 1, wherein the method further comprises:
    whenever the image sensor is controlled to perform one exposure and one frame of original image is obtained, storing the frame of original image using a queue created in advance by the application program.
  3. The imaging method according to claim 2, wherein after storing the frame of original image using the queue created in advance by the application program, the method further comprises:
    performing luminance-component superposition on the original images stored in the queue to obtain an echo image, and displaying the echo image; and
    whenever a subsequent frame of original image is stored using the queue, superimposing the luminance component of the subsequent frame of original image with the echo image again, and displaying the image obtained by the renewed superposition as the echo image.
  4. The imaging method according to claim 3, wherein synthesizing the multiple frames of original images to obtain the imaging image comprises:
    performing color component synthesis on the original images stored in the queue to obtain chrominance information U and density information V of the imaging image; and
    determining brightness information Y of the imaging image according to the luminance components superimposed over the frames of original images.
  5. The imaging method according to claim 3 or 4, wherein before the displaying, the method further comprises:
    whenever one frame of echo image is obtained, performing image processing on the obtained frame of echo image by calling an interface between the application program and the HAL, the image processing comprising one or more combinations of color space conversion, noise reduction, sharpness adjustment, and color correction.
  6. The imaging method according to claim 5, wherein performing image processing on the obtained frame of echo image by calling the interface between the application program and the HAL comprises:
    sending a processing request to the HAL by calling the interface, the processing request being used to indicate a storage location of the echo image before image processing and corresponding image information, wherein the image information comprises an image size and/or shooting parameters and is used by the HAL to perform image processing after reading the echo image before image processing from the storage location.
  7. The imaging method according to any one of claims 1-6, wherein the exposure parameters comprise one or more combinations of exposure duration, sensitivity, and exposure compensation value, and performing metering to determine the exposure parameters corresponding to the multiple frames of original images comprises:
    detecting a user shooting operation;
    when a user shooting operation is detected, determining a reference exposure amount according to brightness information of a current preview image;
    determining a target exposure amount of each frame of original image according to the reference exposure amount and an exposure compensation value preset for each frame of original image; and
    determining an exposure duration of each frame of original image according to the target exposure amount of each frame of original image and a sensitivity preset for each frame of original image.
  8. The imaging method according to claim 7, wherein
    the preset sensitivity of each frame of original image is the same, the sensitivity ranging from 100 ISO to 200 ISO; and
    the preset exposure compensation values of the frames of original images range from EV(-24) to EV(+12).
  9. The imaging method according to any one of claims 1-8, wherein before controlling, through the hardware abstraction layer HAL, the image sensor to perform exposure using the exposure parameters, the method further comprises:
    setting a format and an image size of each frame of original image collected by the image sensor by calling an interface between the application program and the HAL.
  10. An imaging device, wherein the device comprises:
    a determining module configured to perform metering to determine exposure parameters corresponding to multiple frames of images;
    a control module configured to control, through a hardware abstraction layer HAL, an image sensor to perform exposure using the exposure parameters to obtain multiple frames of original images;
    a synthesis module configured to synthesize the multiple frames of images to obtain an imaging image; and
    a display module configured to display the imaging image.
  11. The imaging device according to claim 10, wherein the device further comprises:
    a storage module configured to store one frame of original image using a queue created in advance by the application program whenever the image sensor is controlled to perform one exposure and one frame of original image is obtained.
  12. The imaging device according to claim 11, wherein the device further comprises:
    a processing module configured to perform luminance-component superposition on the original images stored in the queue to obtain an echo image and display the echo image, and, whenever a subsequent frame of original image is stored using the queue, superimpose the luminance component of the subsequent frame of original image with the echo image again and display the image obtained by the renewed superposition as the echo image.
  13. The imaging device according to claim 12, wherein the synthesis module is specifically configured to:
    perform color component synthesis on the original images stored in the queue to obtain chrominance information U and density information V of the imaging image; and
    determine brightness information Y of the imaging image according to the luminance components superimposed over the frames of original images.
  14. The imaging device according to claim 12 or 13, wherein the processing module is further specifically configured to:
    whenever one frame of echo image is obtained, perform image processing on the obtained frame of echo image by calling an interface between the application program and the HAL, the image processing comprising one or more combinations of color space conversion, noise reduction, sharpness adjustment, and color correction.
  15. The imaging device according to claim 14, wherein the processing module is further specifically configured to:
    send a processing request to the HAL by calling the interface, the processing request being used to indicate a storage location of the echo image before image processing and corresponding image information, wherein the image information comprises an image size and/or shooting parameters and is used by the HAL to perform image processing after reading the echo image before image processing from the storage location.
  16. The imaging device according to any one of claims 10-15, wherein the exposure parameters comprise one or more combinations of exposure duration, sensitivity, and exposure compensation value, and the determining module is specifically configured to:
    detect a user shooting operation;
    when a user shooting operation is detected, determine a reference exposure amount according to brightness information of a current preview image;
    determine a target exposure amount of each frame of original image according to the reference exposure amount and an exposure compensation value preset for each frame of original image; and
    determine an exposure duration of each frame of original image according to the target exposure amount of each frame of original image and a sensitivity preset for each frame of original image.
  17. The imaging device according to claim 16, wherein
    the preset sensitivity of each frame of original image is the same, the sensitivity ranging from 100 ISO to 200 ISO; and
    the preset exposure compensation values of the frames of original images range from EV(-24) to EV(+12).
  18. The imaging device according to any one of claims 10-17, wherein the device further comprises:
    a setting module configured to set a format and an image size of each frame of original image collected by the image sensor by calling an interface between the application program and the HAL.
  19. An electronic device, comprising: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein when the processor executes the program, the imaging method according to any one of claims 1-9 is implemented.
  20. A computer-readable storage medium on which a computer program is stored, wherein when the program is executed by a processor, the imaging method according to any one of claims 1-9 is implemented.
PCT/CN2019/091580 2018-09-20 2019-06-17 Imaging method, device and electronic device WO2020057199A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811117586.4 2018-09-20
CN201811117586.4A CN108833804A (zh) 2018-09-20 Imaging method, device and electronic device

Publications (1)

Publication Number Publication Date
WO2020057199A1 true WO2020057199A1 (zh) 2020-03-26

Family

ID=64149914

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/091580 WO2020057199A1 (zh) 2018-09-20 2019-06-17 成像方法、装置和电子设备

Country Status (2)

Country Link
CN (1) CN108833804A (zh)
WO (1) WO2020057199A1 (zh)


Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108833804A (zh) * 2018-09-20 2018-11-16 Oppo广东移动通信有限公司 Imaging method, device and electronic device
CN109348089B (zh) * 2018-11-22 2020-05-22 Oppo广东移动通信有限公司 Night scene image processing method and device, electronic device, and storage medium
CN109660743B (zh) * 2018-12-21 2021-06-08 南京理工大学 Method for realizing high-dynamic-range imaging of a cooled infrared thermal imager
CN109963083B (zh) * 2019-04-10 2021-09-24 Oppo广东移动通信有限公司 Image processor, image processing method, photographing device, and electronic device
CN110213484B (zh) * 2019-05-31 2021-03-12 维沃移动通信有限公司 Photographing method, terminal device, and computer-readable storage medium
CN110677557B (zh) * 2019-10-28 2022-04-22 Oppo广东移动通信有限公司 Image processing method and device, storage medium, and electronic device
CN110958399B (zh) * 2019-12-09 2021-06-29 Oppo广东移动通信有限公司 High-dynamic-range (HDR) image realization method and related product
CN111193869B (zh) * 2020-01-09 2021-08-06 Oppo广东移动通信有限公司 Image data processing method, image data processing device, and mobile terminal
CN111225153B (zh) * 2020-01-21 2021-08-06 Oppo广东移动通信有限公司 Image data processing method, image data processing device, and mobile terminal
CN114125316B (zh) * 2021-11-27 2024-04-05 深圳市天启时代科技有限公司 Remote image processing method and ***
CN116389898B (zh) * 2023-02-27 2024-03-19 荣耀终端有限公司 Image processing method, device, and storage medium
CN116260920B (zh) * 2023-05-09 2023-07-25 深圳市谨讯科技有限公司 Multi-data hybrid control method, apparatus, device, and storage medium


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI536830B (zh) * 2014-03-10 2016-06-01 Nat Univ Chung Cheng Measuring an exposure parameter art high dynamic range image generating method
CN106775902A (zh) * 2017-01-25 2017-05-31 北京奇虎科技有限公司 Image processing method and device, and mobile terminal
CN109040609B (zh) * 2018-08-22 2021-04-09 Oppo广东移动通信有限公司 Exposure control method and device, electronic device, and computer-readable storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102761695A (zh) * 2011-04-28 2012-10-31 佳能株式会社 Image capturing apparatus and control method thereof
US20170310903A1 (en) * 2015-05-22 2017-10-26 Samsung Electronics Co., Ltd. Image capturing apparatus and method of controlling the same
CN107241557A (zh) * 2017-06-16 2017-10-10 广东欧珀移动通信有限公司 Image exposure method and device, photographing apparatus, and storage medium
CN107358593A (zh) * 2017-06-16 2017-11-17 广东欧珀移动通信有限公司 Imaging method and device
CN107197169A (zh) * 2017-06-22 2017-09-22 维沃移动通信有限公司 High-dynamic-range image shooting method and mobile terminal
CN108833804A (zh) * 2018-09-20 2018-11-16 Oppo广东移动通信有限公司 Imaging method, device and electronic device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113191994A (zh) * 2021-04-26 2021-07-30 北京小米移动软件有限公司 Image processing method and device, and storage medium
CN113191994B (zh) * 2021-04-26 2023-11-21 北京小米移动软件有限公司 Image processing method and device, and storage medium
CN116723409A (zh) * 2022-02-28 2023-09-08 荣耀终端有限公司 Automatic exposure method and electronic device
CN116723409B (zh) * 2022-02-28 2024-05-24 荣耀终端有限公司 Automatic exposure method and electronic device

Also Published As

Publication number Publication date
CN108833804A (zh) 2018-11-16

Similar Documents

Publication Publication Date Title
WO2020057199A1 (zh) Imaging method, device and electronic device
WO2020038069A1 (zh) Exposure control method and device, and electronic device
CN108683862B (zh) Imaging control method and device, electronic device, and computer-readable storage medium
US11228720B2 (en) Method for imaging controlling, electronic device, and non-transitory computer-readable storage medium
US11582400B2 (en) Method of image processing based on plurality of frames of images, electronic device, and storage medium
WO2020038072A1 (zh) Exposure control method and device, and electronic device
CN109005364B (zh) Imaging control method and device, electronic device, and computer-readable storage medium
WO2020029732A1 (zh) Panoramic shooting method and device, and imaging device
CN108322669B (zh) Image acquisition method and device, imaging device, and readable storage medium
CN110072052B (zh) Image processing method and device based on multiple frames of images, and electronic device
CN109788207B (zh) Image synthesis method and device, electronic device, and readable storage medium
CN110248106B (zh) Image noise reduction method and device, electronic device, and storage medium
WO2020038087A1 (zh) Shooting control method and device in super night scene mode, and electronic device
US11490024B2 (en) Method for imaging controlling, electronic device, and non-transitory computer-readable storage medium
CN110213502B (zh) Image processing method and device, storage medium, and electronic device
WO2020207261A1 (zh) Image processing method and device based on multiple frames of images, and electronic device
CN109672819B (zh) Image processing method and device, electronic device, and computer-readable storage medium
CN108833802B (zh) Exposure control method and device, and electronic device
US11601600B2 (en) Control method and electronic device
WO2020034702A1 (zh) Control method and device, electronic device, and computer-readable storage medium
CN109167930A (zh) Image display method and device, electronic device, and computer-readable storage medium
WO2020029679A1 (zh) Control method and device, imaging apparatus, electronic device, and readable storage medium
CN109194855A (zh) Imaging method, device and electronic device
US11503223B2 (en) Method for image-processing and electronic device
CN109756680B (zh) Image synthesis method and device, electronic device, and readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19863814

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19863814

Country of ref document: EP

Kind code of ref document: A1