WO2021190099A1 - Imaging device, method and electronic equipment - Google Patents

Imaging device, method and electronic equipment

Info

Publication number
WO2021190099A1
Authority: WIPO (PCT)
Prior art keywords: emitting, target, regions, component, light
Application number: PCT/CN2021/071495
Other languages: English (en), French (fr)
Inventors: 成通, 林华鑫
Original Assignee: 维沃移动通信有限公司
Application filed by 维沃移动通信有限公司
Publication of WO2021190099A1 publication Critical patent/WO2021190099A1/zh


Classifications

    • H04N23/50: Cameras or camera modules comprising electronic image sensors; control thereof; constructional details
    • H04M1/0264: Portable telephone sets; details of the structure or mounting of a camera module assembly
    • H04N23/55: Optical parts specially adapted for electronic image sensors; mounting thereof
    • H04N23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N23/80: Camera processing pipelines; components thereof

Definitions

  • the present invention relates to the field of imaging, in particular to an imaging device, method and electronic equipment.
  • Time of Flight (TOF) technology can be used for physical ranging, 3D modeling, photographing, and the like.
  • the method of increasing the hardware resolution of the image chip at the receiving end is usually adopted.
  • the increase in chip hardware resolution will also lead to an increase in the power consumption and size of the image chip.
  • the increase in the power consumption of the image chip will lead to an increase in heat generation and affect the performance of the mobile phone.
  • the increase in the size of the image chip will take up more space.
  • the embodiments of the present invention provide an imaging device, a method, and an electronic device to solve the problem of low image resolution in the prior art.
  • the present invention is implemented as follows:
  • an imaging device which includes:
  • a body, the body having a containing cavity and an opening communicating with the containing cavity;
  • a cover plate, the cover plate being arranged to cover the opening;
  • a target emitting component, the target emitting component being arranged at the bottom of the body, the bottom facing the cover plate, and the target emitting component having a plurality of light-emitting regions arranged at intervals;
  • a diffractive optical component arranged on the side of the cover plate facing the target emitting component, the diffractive optical component being provided with a plurality of diffractive regions whose positions correspond one-to-one to the positions of the light-emitting regions;
  • a receiving component, used for receiving the image formed by the laser signal emitted by the target emitting component after passing through the diffractive optical component.
  • an imaging method which is applied to an electronic device, and the method includes:
  • a plurality of light-emitting regions arranged at intervals in the target emitting component are sequentially controlled to respectively emit laser signals;
  • Diffraction processing is performed on a plurality of the laser signals through a plurality of diffractive regions in the diffractive optical assembly, wherein the positions of the diffractive regions correspond to the positions of the light-emitting regions in a one-to-one correspondence;
  • the image received by the receiving component is superimposed to obtain a target image, where the image is an image obtained by irradiating a target object with a plurality of the laser signals after passing through the diffractive optical component.
  • in a third aspect, an electronic device is provided, which includes a processor, a memory, and a computer program stored on the memory and executable on the processor.
  • when the computer program is executed by the processor, the steps of the method described in the second aspect are implemented.
  • in a fourth aspect, a computer-readable storage medium is provided, on which a computer program is stored; when the computer program is executed by a processor, the steps of the method described in the second aspect are implemented.
  • the body and the cover plate can provide installation and protection for the target emitting component and the diffractive optical component.
  • the target emitting component is divided into multiple light-emitting regions, the diffractive optical component is divided into multiple diffractive regions whose positions correspond one-to-one to the positions of the light-emitting regions, and finally the receiving component receives the image formed after the laser signal sent by the target emitting component passes through the diffractive optical component.
  • the target emitting component and the diffractive optical component are partitioned, each region is controlled to work in turn to irradiate the target object and obtain multiple feature images containing feature points, and the feature images are then superimposed, so that the number of feature points in the superimposed image increases, thereby improving the image resolution.
  • FIG. 1 is a schematic structural diagram of an imaging device provided by an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of the partitioning of a target emitting component provided by an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of partitions of a diffractive optical component provided by an embodiment of the present invention.
  • FIG. 4 is a schematic flowchart of an imaging method provided by an embodiment of the present invention.
  • Fig. 5 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present invention.
  • 1 - target emitting component; 2 - diffractive optical component; 3 - collimating optical component; 4 - body; 5 - cover plate; 6 - photosensitive component.
  • the present invention provides an imaging device, a method, and an electronic device.
  • the imaging device can be applied to an electronic device to take an image of a target object.
  • the imaging device divides the target emitting component 1 and the diffractive optical component 2 into corresponding pairs of regions, so that the number of feature points in the collected images increases, and multiple feature images are then merged to improve the resolution of the final image.
  • the imaging device may include: a body 4, a cover plate 5, a target emitting component 1, a diffractive optical component 2, and a receiving component.
  • the body 4 has a containing cavity and an opening communicating with the containing cavity
  • the cover plate 5 is arranged to cover the opening;
  • the target emitting component 1 is arranged at the bottom of the body 4, the bottom faces the cover plate 5, and the target emitting component 1 has a plurality of light-emitting regions arranged at intervals.
  • the diffractive optical component 2 is arranged on the side of the cover plate 5 facing the target emitting component 1, and the diffractive optical component 2 is provided with a plurality of diffractive regions; the receiving component receives the image formed after the laser signal emitted by the target emitting component 1 passes through the diffractive optical component 2.
  • the target emitting component 1 and the diffractive optical component 2 are designed in zones, each zone is controlled in turn to irradiate the target object and obtain multiple feature images containing feature points, and the feature images are then superimposed, so that the number of feature points in the superimposed image increases, thereby improving the image resolution.
  • the target emitting component 1 may be a vertical-cavity surface-emitting laser (VCSEL) chip, or another device capable of emitting laser signals.
  • the diffractive optical component 2 may be a diffractive optical element (DOE) device, or another device with a diffraction function.
  • refer to FIG. 2, which is a schematic diagram of the partitioning of the target emitting component provided by an embodiment of the present invention.
  • the target emitting component 1 can be divided into four areas A, B, C, and D.
  • each light-emitting area is provided with a reference light-emitting point, and multiple reference light-emitting points form a reference light-emitting sub-area.
  • the reference light-emitting sub-region is used to independently emit a reference laser signal.
  • one light-emitting point is selected in each light-emitting area, and multiple light-emitting points are determined as the reference light-emitting sub-areas.
  • E in Fig. 2 is the reference light-emitting sub-area.
  • the reference light-emitting sub-regions all emit a reference laser signal.
  • the imaging position of the reference laser signal is used as a reference for the laser irradiation points when the images formed by the individual light-emitting regions are superimposed.
  • the selection of the light-emitting point in each light-emitting area can be any position, which is not specifically limited in the embodiment of the present invention.
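The partitioning and reference-point selection described above can be sketched as follows; the function name, grid model, and quadrant labels are illustrative assumptions, not part of the patent:

```python
def partition_with_references(width, height):
    """Split a width x height emitter grid into four quadrants
    (labels A-D are hypothetical, after Fig. 2) and pick one point
    per quadrant as the reference light-emitting sub-region E."""
    half_w, half_h = width // 2, height // 2
    regions = {"A": [], "B": [], "C": [], "D": []}
    for y in range(height):
        for x in range(width):
            if y < half_h:
                label = "A" if x < half_w else "B"
            else:
                label = "C" if x < half_w else "D"
            regions[label].append((x, y))
    # The reference point in each region may be any position; here we
    # arbitrarily take the first point appended to each quadrant.
    references = {label: pts[0] for label, pts in regions.items()}
    return regions, references
```

Since the reference point's position is arbitrary, any other per-quadrant choice would serve the same role during superposition.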
  • refer to FIG. 3, which is a schematic diagram of the partitioning of the diffractive optical component provided by an embodiment of the present invention.
  • the diffractive optical component can be divided into four regions A', B', C', and D', whose positions correspond to the positions of the four regions A, B, C, and D of the target emitting component 1.
  • each diffraction area includes a plurality of diffraction holes, and the order and/or shape of the diffraction holes in each diffraction area are different.
  • the order and/or shape of the multiple diffraction holes in each diffraction area can be designed in different forms, so that when the images are superimposed, multiple laser signals of each image can be clearly seen at the same time.
  • that is, the number of feature points in the superimposed image is the sum of the laser-signal imaging points of each image; no imaging points are lost through overlap, so the resolution is higher and the resulting image is clearer.
  • the diffraction holes in each diffraction area can be holes with the same shape but different orderings.
  • for example, each area uses round holes, but the first area is a multi-row, multi-column arrangement with columns aligned but rows staggered.
  • the second area is a multi-row, multi-column matrix arrangement; the third area is a multi-row, multi-column arrangement with rows aligned but columns staggered; and the fourth area is a multi-row, multi-column arrangement with both rows and columns staggered.
  • the diffraction holes in each diffraction area can also be holes with different shapes but the same order.
  • each area is arranged in a matrix with multiple rows and columns, but the first area uses circular holes, the second area square holes, the third area pentagonal holes, and the fourth area triangular holes.
  • the diffraction holes in each diffraction area can also be holes with different shapes and orderings.
  • for example, the first area consists of circular holes with columns aligned but rows staggered; the second area consists of square holes arranged in a matrix of rows and columns; the third area consists of pentagonal holes with rows aligned but columns staggered; and the fourth area consists of triangular holes with both rows and columns staggered.
  • the diffraction holes in each diffraction area may also be holes of any other shape and/or order, as long as the order and/or shape of the diffraction holes in each diffraction area are different, and the embodiments of the present invention will not describe them one by one.
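The hole orderings described above (matrix versus staggered rows and/or columns) can be modeled with a small generator; the function, its parameters, and the pitch value are hypothetical illustrations, not taken from the patent:

```python
def hole_layout(rows, cols, pitch=1.0, stagger_rows=False, stagger_cols=False):
    """Generate (x, y) centers for diffraction holes: a plain matrix,
    or every other row/column shifted by half a pitch (staggered)."""
    points = []
    for r in range(rows):
        for c in range(cols):
            x = c * pitch + (0.5 * pitch if stagger_rows and r % 2 else 0.0)
            y = r * pitch + (0.5 * pitch if stagger_cols and c % 2 else 0.0)
            points.append((x, y))
    return points

# Four distinct orderings, one per diffraction region:
matrix    = hole_layout(4, 4)                                        # second area
rows_stag = hole_layout(4, 4, stagger_rows=True)                     # first area
cols_stag = hole_layout(4, 4, stagger_cols=True)                     # third area
both_stag = hole_layout(4, 4, stagger_rows=True, stagger_cols=True)  # fourth area
```

Because each region's layout differs, the diffracted spots of different regions land at distinct positions, which is what lets the superimposed image keep every spot visible.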
  • the imaging device may further include: a collimating optical component 3, which is arranged between the target emitting component 1 and the diffractive optical component 2, and is used for collimating the laser signals emitted by the target emitting component 1.
  • the above-mentioned target emitting component 1, diffractive optical component 2, processing component, and collimating optical component 3 are all arranged in the containing cavity of the body 4. Specifically, the target emitting component 1 is fixed to the bottom of the containing cavity, the collimating optical component 3 is disposed in the middle of the containing cavity, and the diffractive optical component 2 is disposed on the side surface of the cover plate 5 facing the target emitting component 1.
  • the imaging device may further include: a photosensitive component 6, arranged at the bottom of the containing cavity, for detecting whether the laser signal emitted by the target emitting component 1 is uniform and free of fluctuation, so as to ensure the stability of the laser signal.
  • the photosensitive component 6 may be a photosensitive diode (Photo Diode, PD).
  • FIG. 4 is a schematic flowchart of an imaging method provided by an embodiment of the present invention. As shown in FIG. 4, the imaging method may include: the content shown in step S301 to step S303.
  • in step S301, a plurality of light-emitting regions arranged at intervals in the target emitting component are sequentially controlled to respectively emit laser signals.
  • the target emitting component is first divided into a plurality of light-emitting areas arranged at intervals, and then each light-emitting area is controlled to emit laser signals in turn.
  • in step S302, the plurality of laser signals are respectively subjected to diffraction processing through the plurality of diffraction regions in the diffractive optical component.
  • the position of the diffraction area corresponds to the position of the light-emitting area one-to-one.
  • the diffractive optical component is divided into a plurality of diffractive regions whose positions correspond to the positions of the light-emitting regions of the target emitting component one-to-one, and the above-mentioned laser signal is diffracted through the plurality of diffractive regions.
  • in step S303, superimposition processing is performed on the images received by the receiving component to obtain a target image.
  • the image received by the receiving component is an image obtained by using multiple laser signals to irradiate the target object after passing through the diffractive optical component.
  • the laser signal emitted by each light-emitting area is irradiated to the target object after the diffraction processing of a corresponding diffraction area, and then an image can be obtained.
  • the laser signal emitted by each light-emitting region in turn is diffracted by the corresponding diffraction region and then irradiated onto the target object, so that multiple images can be obtained; the multiple images are then superimposed to obtain the target image.
  • in the embodiment of the present invention, a plurality of spaced light-emitting regions in the target emitting component are first controlled to respectively emit laser signals; the laser signals are then diffracted by a plurality of diffractive regions in the diffractive optical component whose positions correspond to those of the light-emitting regions; finally, the images obtained by irradiating the target object with the laser signals after passing through the diffractive optical component are received by the receiving component and superimposed to obtain the target image.
  • the target emitting component and the diffractive optical component are designed in zones, and then each area is controlled to work in turn.
  • multiple images containing laser irradiation points can be obtained, and then multiple images are superimposed.
  • the number of laser irradiation points of the superimposed image is increased, that is, the number of feature points in the image is increased, thereby improving the image resolution.
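The S301-S303 flow above can be summarized in a short sketch; `emit`, `diffract_and_capture`, and `superimpose` are hypothetical stand-ins for the hardware and processing steps, not APIs from the patent:

```python
def capture_target_image(regions, emit, diffract_and_capture, superimpose):
    """Sketch of steps S301-S303: each light-emitting region fires in
    turn (S301), its laser signal is diffracted and the resulting image
    captured (S302), and all captured images are superimposed (S303)."""
    images = []
    for region in regions:
        emit(region)                                 # S301: region emits its laser signal
        images.append(diffract_and_capture(region))  # S302: diffract, irradiate, capture
    return superimpose(images)                       # S303: merge into the target image
```

Passing the hardware steps in as callables keeps the control flow (sequence the regions, collect one image per region, merge at the end) separate from how each step is actually realized.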
  • sequentially controlling a plurality of spaced light-emitting regions in the target emitting component to respectively emit laser signals may include: while sequentially controlling the plurality of spaced light-emitting regions to emit laser signals, controlling the reference light-emitting sub-region to emit the reference laser signal.
  • the reference light-emitting sub-region is located in multiple light-emitting regions.
  • one light-emitting point is selected in each light-emitting area, and multiple light-emitting points are determined as reference light-emitting sub-areas.
  • the reference light-emitting sub-areas all emit a reference laser signal, which is used as a reference feature position when multiple subsequent feature images are superimposed.
  • the selection of the light-emitting point in each light-emitting area can be any position, which is not specifically limited in the embodiment of the present invention.
  • performing superposition processing on the image received by the receiving component to obtain the target image may include the following steps.
  • the reference feature position is determined; the multiple images of the target object are aligned according to the reference feature position, and the aligned images are superimposed to obtain the target image.
  • the imaging position of the reference laser signal after irradiating the target object, obtained in the above embodiment, is determined as the reference feature position; the reference positions in the multiple feature images are then aligned, and the feature images are superimposed into one feature image.
  • each time a light-emitting region and the reference light-emitting sub-region emit laser signals, an image of the target object is obtained, so when the multiple light-emitting regions emit laser signals in sequence, multiple images of the target object are obtained. Since the position of the reference laser signal emitted by the reference light-emitting sub-region is the same in each image, the superimposed target image can be obtained by aligning the reference feature positions in each image.
  • each image includes multiple feature points, and the superimposed image includes the sum of the feature points of all the images. Since the feature points are multiplied, the resolution is also greatly improved, and the resulting image is clearer.
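The reference-feature alignment just described can be sketched minimally as follows; real frames would be pixel images, but representing each frame as a set of (x, y) laser irradiation points (an assumption of this sketch) keeps the idea visible:

```python
def superimpose_by_reference(frames, reference_points):
    """Align each sparse frame (a set of (x, y) irradiation points) by
    the offset of its reference point from the first frame's reference
    point, then take the union of all points. The merged set holds the
    sum of every frame's feature points."""
    base_x, base_y = reference_points[0]
    target = set()
    for frame, (rx, ry) in zip(frames, reference_points):
        dx, dy = base_x - rx, base_y - ry   # shift needed to align references
        target |= {(x + dx, y + dy) for x, y in frame}
    return target
```

Because every frame shares the same physical reference spot, translating each frame so its reference coincides with the first frame's reference brings all feature points into a common coordinate system before the union.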
  • performing diffraction processing on the multiple laser signals through the multiple diffractive regions in the diffractive optical component may include: respectively diffracting the multiple laser signals through the diffraction holes of the multiple diffractive regions in the diffractive optical component.
  • the diffraction area includes a plurality of diffraction holes, and the order and/or shape of the diffraction holes in each diffraction area are different.
  • the order and/or shape of the multiple diffraction holes in each diffraction area can be designed to be different, so that the number of feature points in the superimposed image is the sum of the laser-signal imaging points of each feature image; no imaging points are lost through overlap, so the resolution is higher and the resulting image is clearer.
  • the imaging method may further include: collimating the laser signal emitted by the target emitting component through a collimating optical component.
  • the collimating optical component is arranged between the target emitting component and the diffractive optical component.
  • a collimating optical component is arranged between the target emitting component and the diffractive optical component, so that the laser signal emitted by the target emitting component enters the diffractive optical component in parallel after passing through the collimating optical component, preventing the laser signal from falling outside the diffractive optical component.
  • Fig. 5 is a schematic diagram of the hardware structure of an electronic device implementing various embodiments of the present invention.
  • the electronic device 400 includes, but is not limited to: a radio frequency unit 401, a network module 402, an audio output unit 403, an input unit 404, a sensor 405, a display unit 406, a user input unit 407, an interface unit 408, a memory 409, a processor 410, a power supply 411, and other components.
  • Those skilled in the art can understand that the structure of the electronic device shown in FIG. 5 does not constitute a limitation on the electronic device.
  • the electronic device may include more or fewer components than those shown in the figure, a combination of certain components, or a different component layout.
  • electronic devices include, but are not limited to, mobile phones, tablet computers, and the like.
  • the processor 410 is used for:
  • sequentially controlling a plurality of light-emitting regions arranged at intervals in the target emitting component to respectively emit laser signals, and respectively performing diffraction processing on the plurality of laser signals through a plurality of diffractive regions in the diffractive optical component;
  • the image received by the receiving component is superimposed to obtain a target image, where the image is an image obtained by irradiating the target object with a plurality of laser signals after passing through the diffractive optical component.
  • the radio frequency unit 401 can be used to receive and send signals during information transmission or a call. Specifically, downlink data from the base station is received and forwarded to the processor 410 for processing; in addition, uplink data is sent to the base station.
  • the radio frequency unit 401 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 401 can also communicate with the network and other devices through a wireless communication system.
  • the electronic device provides users with wireless broadband Internet access through the network module 402, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 403 may convert the audio data received by the radio frequency unit 401 or the network module 402 or stored in the memory 409 into an audio signal and output it as sound. Moreover, the audio output unit 403 may also provide audio output related to a specific function performed by the electronic device 400 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 403 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 404 is used to receive audio or video signals.
  • the input unit 404 may include a graphics processing unit (GPU) 4041 and a microphone 4042.
  • the graphics processor 4041 is configured to process image data of still pictures or videos obtained by an image capture device (such as a camera) in video capture mode or image capture mode.
  • the processed image frame can be displayed on the display unit 406.
  • the image frame processed by the graphics processor 4041 may be stored in the memory 409 (or other storage medium) or sent via the radio frequency unit 401 or the network module 402.
  • the microphone 4042 can receive sound, and can process such sound into audio data.
  • in the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 401 for output.
  • the electronic device 400 also includes at least one sensor 405, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 4061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 4061 and/or the backlight when the electronic device 400 is moved to the ear.
  • as a kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes) and the magnitude and direction of gravity when stationary, and can be used to identify the posture of the electronic device (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer or percussion); the sensor 405 may also include a fingerprint sensor, pressure sensor, iris sensor, molecular sensor, gyroscope, barometer, hygrometer, thermometer, infrared sensor, etc., which will not be repeated here.
  • the display unit 406 is used to display information input by the user or information provided to the user.
  • the display unit 406 may include a display panel 4061, and the display panel 4061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 407 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the electronic device.
  • the user input unit 407 includes a touch panel 4071 and other input devices 4072.
  • the touch panel 4071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 4071 using a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 4071 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 410, and receives and executes the commands sent by the processor 410.
  • the touch panel 4071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 407 may also include other input devices 4072.
  • other input devices 4072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 4071 can cover the display panel 4061.
  • when the touch panel 4071 detects a touch operation on or near it, it transmits the operation to the processor 410 to determine the type of touch event, and the processor 410 then provides corresponding visual output on the display panel 4061 according to the type of touch event.
  • although in FIG. 5 the touch panel 4071 and the display panel 4061 are used as two independent components to implement the input and output functions of the electronic device, in some embodiments the touch panel 4071 and the display panel 4061 can be integrated to implement the input and output functions of the electronic device, which is not specifically limited here.
  • the interface unit 408 is an interface for connecting an external device and the electronic device 400.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 408 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the electronic device 400, or to transfer data between the electronic device 400 and an external device.
  • the memory 409 can be used to store software programs and various data.
  • the memory 409 may mainly include a program storage area and a data storage area; the program storage area may store an operating system and the application programs required by at least one function (such as a sound playback function or an image playback function), and the data storage area may store data created through the use of the phone (such as audio data and a phone book).
  • the memory 409 may include a high-speed random access memory and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device or another non-volatile solid-state storage device.
  • the processor 410 is the control centre of the electronic device; it connects the various parts of the whole device through various interfaces and lines, and performs the various functions of the device and processes data by running or executing the software programs and/or modules stored in the memory 409 and calling the data stored in the memory 409, thereby monitoring the device as a whole.
  • the processor 410 may include one or more processing units; preferably, it may integrate an application processor, which mainly handles the operating system, user interface and application programs, and a modem processor, which mainly handles wireless communication. It will be understood that the modem processor may also not be integrated into the processor 410.
  • the electronic device 400 may also include a power supply 411 (such as a battery) for supplying power to the various components; preferably, the power supply 411 is logically connected to the processor 410 through a power management system, which manages charging, discharging, power consumption and related functions.
  • the electronic device 400 includes some functional modules that are not shown, which are not described further here.
  • an embodiment of the present invention also provides an electronic device, including a processor 410, a memory 409 and a computer program stored on the memory 409 and executable on the processor 410; when executed by the processor 410, the computer program implements each process of the foregoing imaging method embodiment and achieves the same technical effect, which is not repeated here to avoid repetition.
  • an embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements each process of the above imaging method embodiment and achieves the same technical effect, which is not repeated here to avoid repetition. The computer-readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
  • the technical solution of the present invention, in essence or in the part that contributes to the prior art, can be embodied in the form of a software product; the computer software product is stored in a storage medium (such as a ROM/RAM, magnetic disk or optical disc) and includes several instructions that cause a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device or the like) to execute the methods described in the embodiments of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Studio Devices (AREA)

Abstract

The present invention discloses an imaging device, an imaging method and an electronic device. The imaging device comprises a body, a cover plate, a target emitting component, a diffractive optical component and a receiving component. The target emitting component carries a plurality of emitting regions arranged at intervals; the diffractive optical component carries a plurality of diffraction regions whose positions correspond one-to-one to the positions of the emitting regions; and the receiving component is configured to receive the image formed after the laser signals emitted by the target emitting component pass through the diffractive optical component.

Description

Imaging device, method and electronic device
Cross-reference
This application claims priority to Chinese patent application No. 202010208348.5, entitled "Imaging device, method and electronic device" and filed with the China National Intellectual Property Administration on 23 March 2020, the entire contents of which are incorporated herein by reference.
Technical field
The present invention relates to the field of imaging, and in particular to an imaging device, an imaging method and an electronic device.
Background
With the development of mobile smart terminals, 3D sensing technologies such as time-of-flight (TOF) ranging have gradually become standard hardware on smartphones.
In a smartphone, TOF can be used for distance measurement, 3D modelling, photography and so on. To raise the imaging resolution of a TOF module, the usual approach is to increase the hardware resolution of the image chip at the receiving end. However, a higher-resolution image chip consumes more power and occupies more space: the extra power draw generates more heat, which degrades the phone's performance, while the larger die takes up more room inside the device.
Summary
Embodiments of the present invention provide an imaging device, an imaging method and an electronic device, so as to solve the problem of low image resolution in the prior art.
To solve the above technical problem, the present invention is implemented as follows:
In a first aspect, an imaging device is provided, the device comprising:
a body having an accommodating cavity and an opening communicating with the accommodating cavity;
a cover plate covering the opening;
a target emitting component arranged at the bottom of the body, the bottom facing the cover plate, the target emitting component carrying a plurality of emitting regions arranged at intervals;
a diffractive optical component arranged on a side of the cover plate, the diffractive optical component carrying a plurality of diffraction regions whose positions correspond one-to-one to the positions of the emitting regions; and
a receiving component configured to receive the image formed after the laser signals emitted by the target emitting component pass through the diffractive optical component.
In a second aspect, an imaging method applied to an electronic device is provided, the method comprising:
sequentially controlling a plurality of emitting regions, arranged at intervals in a target emitting component, to emit laser signals respectively;
diffracting the plurality of laser signals respectively through a plurality of diffraction regions in a diffractive optical component, wherein the positions of the diffraction regions correspond one-to-one to the positions of the emitting regions; and
superimposing the images received by a receiving component to obtain a target image, wherein the images are acquired by illuminating a target object with the plurality of laser signals after they have passed through the diffractive optical component.
In a third aspect, an electronic device is provided, comprising a processor, a memory and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the method according to the second aspect.
In a fourth aspect, a computer-readable storage medium is provided, the computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method according to the second aspect.
In the embodiments of the present invention, the body and the cover plate provide mounting and protection for the target emitting component and the diffractive optical component. The target emitting component is divided into a plurality of emitting regions, the diffractive optical component is divided into diffraction regions whose positions correspond one-to-one to those of the emitting regions, and the receiving component receives the images formed after the laser signals emitted by the target emitting component pass through the diffractive optical component. By partitioning the target emitting component and the diffractive optical component and driving each region in turn, multiple feature images of the illuminated target object are acquired and then superimposed, so that the number of feature points in the superimposed image increases and the image resolution is improved accordingly.
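To make the resolution gain above concrete: if each emitting region projects a set of distinguishable spots, the superimposed image carries the sum of all regions' spots. The counts below are purely illustrative assumptions, not values from this disclosure.

```python
# Illustrative counts only; N_REGIONS and SPOTS_PER_REGION are assumptions.
N_REGIONS = 4          # emitting regions, e.g. A-D in Fig. 2
SPOTS_PER_REGION = 50  # distinguishable laser spots one region projects

single_exposure = SPOTS_PER_REGION           # feature points without partitioning
superimposed = N_REGIONS * SPOTS_PER_REGION  # after superposition, assuming the
                                             # distinct diffraction-hole layouts
                                             # keep the spots from overlapping
assert superimposed == N_REGIONS * single_exposure
```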
Brief description of the drawings
The accompanying drawings described here are provided for a further understanding of the present invention and constitute a part of it; the illustrative embodiments of the present invention and their description serve to explain the invention and do not unduly limit it. In the drawings:
Fig. 1 is a schematic structural diagram of an imaging device according to an embodiment of the present invention;
Fig. 2 is a schematic partition diagram of a target emitting component according to an embodiment of the present invention;
Fig. 3 is a schematic partition diagram of a diffractive optical component according to an embodiment of the present invention;
Fig. 4 is a schematic flowchart of an imaging method according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present invention.
Reference numerals: 1 - target emitting component; 2 - diffractive optical component; 3 - collimating optical component; 4 - body; 5 - cover plate; 6 - photosensitive component.
Detailed description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
The present invention provides an imaging device, an imaging method and an electronic device. The imaging device can be applied to an electronic device to capture images of a target object. By partitioning the target emitting component 1 and the diffractive optical component 2 into corresponding pairs of regions, the number of feature points in the captured feature images is increased; the multiple feature images are then merged, raising the resolution of the final image.
Fig. 1 is a schematic structural diagram of an imaging device according to an embodiment of the present invention. As shown, the imaging device may comprise a body 4, a cover plate 5, a target emitting component 1, a diffractive optical component 2 and a receiving component.
The body 4 has an accommodating cavity and an opening communicating with the cavity; the cover plate 5 covers the opening; the target emitting component 1 is arranged at the bottom of the body 4, with the bottom facing the cover plate 5, and carries a plurality of emitting regions arranged at intervals; the diffractive optical component 2 is arranged on the side of the cover plate 5 facing the target emitting component 1 and carries a plurality of diffraction regions whose positions correspond one-to-one to those of the emitting regions; and the receiving component is configured to receive the image formed after the laser signals emitted by the target emitting component 1 pass through the diffractive optical component 2.
In the embodiments of the present invention, the body and the cover plate provide mounting and protection for the target emitting component and the diffractive optical component. The target emitting component is divided into a plurality of emitting regions, the diffractive optical component is divided into diffraction regions whose positions correspond one-to-one to those of the emitting regions, and the receiving component receives the images formed after the laser signals emitted by the target emitting component pass through the diffractive optical component. By partitioning the target emitting component 1 and the diffractive optical component 2 and driving each region in turn, multiple feature images of the illuminated target object are acquired and then superimposed, so that the number of feature points in the superimposed image increases and the image resolution is improved accordingly.
In a possible implementation of the present invention, the target emitting component 1 may be a vertical-cavity surface-emitting laser (VCSEL) chip or any other device capable of emitting laser signals, and the diffractive optical component 2 may be a diffractive optical element (DOE) or any other device with a diffraction function.
In a possible implementation of the present invention, Fig. 2 shows a schematic partition diagram of a target emitting component according to an embodiment of the present invention. As shown in Fig. 2, the target emitting component 1 may be divided into four regions A, B, C and D.
Each emitting region is provided with a reference emitting point, and the plurality of reference emitting points form a reference emitting sub-region, which is configured to independently emit a reference laser signal.
Specifically, one emitting point is selected in each emitting region, and the selected points together are defined as the reference emitting sub-region; E in Fig. 2 denotes the reference emitting sub-region. Whenever any of the emitting regions A, B, C or D emits a laser signal, the reference emitting sub-region also emits the reference laser signal, whose imaged spot positions serve as the reference when the images formed by the individual emitting regions are superimposed.
The emitting point in each emitting region may be chosen at any position; this is not specifically limited in the embodiments of the present invention.
In a possible implementation of the present invention, Fig. 3 shows a schematic partition diagram of a diffractive optical component according to an embodiment of the present invention. As shown in Fig. 3, the diffractive optical component may be divided into four regions A', B', C' and D', whose positions correspond one-to-one to regions A, B, C and D of the target emitting component 1.
Each diffraction region contains a plurality of diffraction holes, and the arrangement and/or shape of the diffraction holes differs from region to region.
In the embodiments of the present invention, the arrangement and/or shape of the diffraction holes can be designed differently for each diffraction region, so that when the images are superimposed the spots produced by every laser signal remain clearly visible at the same time. In other words, the number of feature points in the superimposed image equals the sum of the laser spots of the individual images; no spots are lost to overlap, so the resolution is higher and the resulting image is sharper.
That is, the diffraction holes of the different regions may share the same shape but differ in arrangement: for example, all regions use circular holes, but the first region is a multi-row, multi-column layout with parallel columns and staggered rows, the second a regular matrix, the third a layout with parallel rows and staggered columns, and the fourth a layout staggered in both rows and columns. Alternatively, the holes may share the same arrangement but differ in shape: for example, all regions use a regular matrix, but the first region has circular holes, the second square holes, the third pentagonal holes and the fourth triangular holes. The holes may also differ in both shape and arrangement: for example, circular holes with staggered rows in the first region, square holes in a regular matrix in the second, pentagonal holes with staggered columns in the third and triangular holes staggered in both directions in the fourth. The diffraction holes may take any other shape and/or arrangement, provided that the arrangement and/or shape differs between regions; these variants are not enumerated one by one in the embodiments of the present invention.
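As a rough illustration of the staggered versus matrix arrangements described above, the sketch below generates four distinct hole-centre layouts. The grid size, pitch and the labels A'-D' are illustrative assumptions, not values from the disclosure, and hole shape is ignored since only the centre positions matter for distinguishing the layouts.

```python
import numpy as np

def hole_centers(rows, cols, pitch, stagger_rows=False, stagger_cols=False):
    """Centre coordinates for a rows x cols grid of diffraction holes.
    Staggering offsets alternate rows (or columns) by half a pitch, so
    each DOE region gets a visibly different layout."""
    pts = []
    for r in range(rows):
        for c in range(cols):
            x = c * pitch + (pitch / 2 if stagger_rows and r % 2 else 0)
            y = r * pitch + (pitch / 2 if stagger_cols and c % 2 else 0)
            pts.append((x, y))
    return np.array(pts)

# the four example layouts from the description (8x8 grid and pitch are illustrative)
layouts = {
    "A'": hole_centers(8, 8, 10.0, stagger_rows=True),   # staggered rows
    "B'": hole_centers(8, 8, 10.0),                      # plain matrix
    "C'": hole_centers(8, 8, 10.0, stagger_cols=True),   # staggered columns
    "D'": hole_centers(8, 8, 10.0, stagger_rows=True, stagger_cols=True),
}
# no two regions share the same layout, so superimposed spots stay distinguishable
assert len({arr.tobytes() for arr in layouts.values()}) == 4
```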
In a possible implementation, the imaging device may further comprise a collimating optical component 3 arranged between the target emitting component 1 and the diffractive optical component 2 and configured to collimate the laser signals emitted by the target emitting component 1.
In a possible implementation, the target emitting component 1, the diffractive optical component 2, the processing component and the collimating optical component 3 are all arranged inside the accommodating cavity. Specifically, the target emitting component 1 is fixed to the bottom of the cavity, the collimating optical component 3 is arranged in the middle of the cavity, and the diffractive optical component 2 is arranged on the side of the cover plate 5 facing the target emitting component 1.
In a possible implementation, the imaging device may further comprise a photosensitive component 6 arranged at the bottom of the accommodating cavity and configured to monitor the uniformity of the laser signals emitted by the target emitting component 1 and to detect any fluctuation, so as to ensure the stability of the laser signals.
In a specific implementation, the photosensitive component 6 may be a photodiode (PD).
An embodiment of the present invention further provides an imaging method. Fig. 4 is a schematic flowchart of an imaging method according to an embodiment of the present invention. As shown in Fig. 4, the imaging method may comprise steps S301 to S303.
In step S301, a plurality of emitting regions arranged at intervals in the target emitting component are sequentially controlled to emit laser signals respectively.
In this embodiment, the target emitting component is first divided into a plurality of emitting regions arranged at intervals, and each emitting region is then controlled in turn to emit a laser signal.
In step S302, the plurality of laser signals are respectively diffracted through a plurality of diffraction regions in the diffractive optical component.
The positions of the diffraction regions correspond one-to-one to the positions of the emitting regions.
In this embodiment, the diffractive optical component is divided into a plurality of diffraction regions whose positions correspond one-to-one to those of the emitting regions of the target emitting component, and the laser signals are diffracted through these regions.
In step S303, the images received by the receiving component are superimposed to obtain a target image.
The images received by the receiving component are acquired by illuminating the target object with the plurality of laser signals after they have passed through the diffractive optical component.
That is, the laser signal emitted by each emitting region, after being diffracted by its corresponding diffraction region, illuminates the target object and yields one image; as the emitting regions fire in turn, multiple images are obtained, and these images are then superimposed to obtain the target image.
In this embodiment of the present invention, the emitting regions arranged at intervals in the target emitting component are first controlled in turn to emit laser signals; the laser signals are then diffracted through the diffraction regions of the diffractive optical component, whose positions correspond one-to-one to those of the emitting regions; finally, the images received by the receiving component, acquired by illuminating the target object with the diffracted laser signals, are superimposed to obtain the target image. By partitioning the target emitting component and the diffractive optical component and driving each region in turn, multiple images containing laser spots are obtained after the target object is illuminated; superimposing these images increases the number of laser spots, that is, the number of feature points, in the resulting image and thus improves the image resolution.
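The flow of steps S301 to S303 can be sketched as follows. The region count, frame size and the `fire_region` stand-in for the emitter and receiver hardware are illustrative assumptions; a real implementation would drive the VCSEL regions and read the image sensor, and frame alignment via the reference spots is omitted here for brevity.

```python
import numpy as np

N_REGIONS = 4  # e.g. regions A-D of the emitter / A'-D' of the DOE

def fire_region(idx):
    """Hypothetical driver call: enable one emitting region (plus the
    reference sub-region) and return the image seen by the receiver.
    Here each region is simulated as a sparse speckle pattern."""
    rng = np.random.default_rng(idx)  # distinct pattern per region
    img = np.zeros((64, 64))
    ys = rng.integers(0, 64, 50)
    xs = rng.integers(0, 64, 50)
    img[ys, xs] = 1.0
    return img

def capture_target_image():
    # S301/S302: fire each region in turn and collect its frame
    frames = [fire_region(i) for i in range(N_REGIONS)]
    # S303: superimpose the per-region frames; the merged image carries the
    # union of all regions' speckle points, raising effective resolution
    return np.clip(sum(frames), 0.0, 1.0)

target = capture_target_image()
```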
In a possible implementation, sequentially controlling the plurality of emitting regions arranged at intervals in the target emitting component to emit laser signals respectively may comprise: controlling a reference emitting sub-region to emit a reference laser signal while the emitting regions are sequentially controlled to emit their laser signals.
The reference emitting sub-region is located within the plurality of emitting regions.
In this embodiment, one emitting point is selected in each emitting region, and the selected points are defined as the reference emitting sub-region. Whenever any emitting region emits a laser signal, the reference emitting sub-region also emits the reference laser signal, which serves as the reference feature position when the feature images are later superimposed.
The emitting point in each emitting region may be chosen at any position; this is not specifically limited in the embodiments of the present invention.
In a possible implementation, superimposing the images received by the receiving component to obtain the target image may comprise the following steps.
A reference feature position is determined from the imaged position of the reference laser signal in each image; the multiple images of the target object are aligned according to the reference feature position, and the aligned images are superimposed to obtain the target image.
In this embodiment, the imaged position of the reference laser signal, obtained as in the above embodiment after it illuminates the target object, is taken as the reference feature position; the reference positions in the multiple feature images are aligned, and the images are superimposed into a single feature image.
Each time an emitting region fires together with the reference emitting sub-region, one image of the target object is obtained; firing the emitting regions in turn therefore yields multiple images. Because the reference laser spots emitted by the reference emitting sub-region occupy the same position in every image, the images can be aligned by matching their reference feature positions and then superimposed into the target image. Each image contains multiple feature points, so the superimposed image contains the sum of the feature points of all images; since the number of feature points is multiplied, the resolution is greatly improved and the resulting image is sharper.
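A minimal sketch of this alignment-and-superposition step, assuming a pure translation between frames that is estimated from the imaged reference spots. The helper names and the integer-pixel `np.roll` shift are illustrative assumptions; a production pipeline would use sub-pixel registration.

```python
import numpy as np

def align_offset(ref_pts_frame, ref_pts_master):
    """Translation (dy, dx) that maps this frame's reference-spot
    positions onto the master frame's, as the mean displacement."""
    return np.mean(np.asarray(ref_pts_master) - np.asarray(ref_pts_frame), axis=0)

def superimpose(frames, ref_pts_per_frame):
    """Align every frame to the first one via its reference spots,
    then sum the aligned frames into the target image."""
    master_refs = ref_pts_per_frame[0]
    out = np.zeros_like(frames[0], dtype=float)
    for img, refs in zip(frames, ref_pts_per_frame):
        dy, dx = np.round(align_offset(refs, master_refs)).astype(int)
        out += np.roll(img, shift=(dy, dx), axis=(0, 1))  # integer-pixel shift
    return out
```

For example, a frame whose reference spot sits at (7, 8) instead of the master's (5, 5) is shifted by (-2, -3) before being added, so the two frames' feature points land on the same pixels.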
In a possible implementation, diffracting the plurality of laser signals respectively through the plurality of diffraction regions in the diffractive optical component may comprise: diffracting the plurality of laser signals respectively through the diffraction holes of the plurality of diffraction regions.
Each diffraction region comprises a plurality of diffraction holes, and the arrangement and/or shape of the diffraction holes differs between the diffraction regions.
In this embodiment, so that the spots of every laser signal remain clearly visible after the images are superimposed, the arrangement and/or shape of the diffraction holes can be designed differently for each diffraction region. In other words, the number of feature points in the superimposed image equals the sum of the laser spots of the individual feature images; no spots are lost to overlap, so the resolution is higher and the resulting image is sharper.
The arrangement and/or shape of the diffraction holes in each diffraction region has been described in detail in the device embodiments above and is not repeated here.
In a possible implementation, before the plurality of laser signals are respectively diffracted, the imaging method may further comprise: collimating the laser signals emitted by the target emitting component through a collimating optical component.
The collimating optical component is arranged between the target emitting component and the diffractive optical component.
In this embodiment, arranging a collimating optical component between the target emitting component and the diffractive optical component makes the laser signals emitted by the target emitting component travel in parallel into the diffractive optical component, so that they do not miss the diffractive optical component.
Fig. 5 is a schematic diagram of the hardware structure of an electronic device implementing the embodiments of the present invention.
The electronic device 400 includes, but is not limited to, a radio frequency unit 401, a network module 402, an audio output unit 403, an input unit 404, a sensor 405, a display unit 406, a user input unit 407, an interface unit 408, a memory 409, a processor 410 and a power supply 411. A person skilled in the art will understand that the structure shown in Fig. 5 does not limit the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or arrange the components differently. In the embodiments of the present invention, electronic devices include, but are not limited to, mobile phones, tablet computers, laptop computers, palmtop computers, in-vehicle terminals, wearable devices, pedometers and the like.
The processor 410 is configured to:
sequentially control a plurality of emitting regions, arranged at intervals in a target emitting component, to emit laser signals respectively;
diffract the plurality of laser signals respectively through a plurality of diffraction regions in a diffractive optical component, wherein the positions of the diffraction regions correspond one-to-one to the positions of the emitting regions; and
superimpose the images received by a receiving component to obtain a target image, wherein the images are acquired by illuminating a target object with the plurality of laser signals after they have passed through the diffractive optical component.
In this embodiment of the present invention, the emitting regions arranged at intervals in the target emitting component are first controlled in turn to emit laser signals; the laser signals are then diffracted through the diffraction regions of the diffractive optical component, whose positions correspond one-to-one to those of the emitting regions; finally, the images received by the receiving component, acquired by illuminating the target object with the diffracted laser signals, are superimposed to obtain the target image. By partitioning the target emitting component and the diffractive optical component and driving each region in turn, multiple images containing laser spots are obtained after the target object is illuminated; superimposing these images increases the number of laser spots, that is, the number of feature points, in the resulting image and thus improves the image resolution.
It should be understood that, in the embodiments of the present invention, the radio frequency unit 401 may be used to receive and transmit signals while sending and receiving information or during a call; specifically, it receives downlink data from a base station and passes it to the processor 410 for processing, and sends uplink data to the base station. Typically, the radio frequency unit 401 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer and the like. In addition, the radio frequency unit 401 may also communicate with the network and other devices through a wireless communication system.
Through the network module 402, the electronic device provides the user with wireless broadband Internet access, for example helping the user send and receive e-mail, browse web pages and access streaming media.
The audio output unit 403 can convert audio data received by the radio frequency unit 401 or the network module 402, or stored in the memory 409, into an audio signal and output it as sound. Moreover, the audio output unit 403 can also provide audio output related to a specific function performed by the electronic device 400 (for example, a call signal reception sound or a message reception sound). The audio output unit 403 includes a loudspeaker, a buzzer, a receiver and the like.
The input unit 404 is used to receive audio or video signals. The input unit 404 may include a graphics processing unit (GPU) 4041 and a microphone 4042. The graphics processor 4041 processes the image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The processed image frames can be displayed on the display unit 406, stored in the memory 409 (or another storage medium), or sent via the radio frequency unit 401 or the network module 402. The microphone 4042 can receive sound and process it into audio data; in a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 401 and output.
The electronic device 400 also includes at least one sensor 405, such as a light sensor, a motion sensor and other sensors. Specifically, the light sensor includes an ambient light sensor, which can adjust the brightness of the display panel 4061 according to the ambient light, and a proximity sensor, which can turn off the display panel 4061 and/or the backlight when the electronic device 400 is moved to the ear. As one kind of motion sensor, an accelerometer can detect the magnitude of acceleration in each direction (generally three axes) and, when stationary, the magnitude and direction of gravity; it can be used to recognise the attitude of the electronic device (for example portrait/landscape switching, related games and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tapping). The sensor 405 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor and the like, which are not described further here.
The display unit 406 is used to display information input by the user or provided to the user. The display unit 406 may include a display panel 4061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display or the like.
The user input unit 407 can be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 407 includes a touch panel 4071 and other input devices 4072. The touch panel 4071, also called a touch screen, can collect the user's touch operations on or near it (such as operations performed on or near the touch panel 4071 with a finger, stylus or any other suitable object or accessory). The touch panel 4071 may include a touch detection device and a touch controller. The touch detection device detects the position the user touches, picks up the signal produced by the touch operation and passes it to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 410, and receives and executes the commands sent by the processor 410. The touch panel 4071 can be implemented as a resistive, capacitive, infrared or surface-acoustic-wave panel, among other types. Besides the touch panel 4071, the user input unit 407 may also include other input devices 4072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and on/off keys), a trackball, a mouse and a joystick; these are not described further here.
Further, the touch panel 4071 can cover the display panel 4061. When the touch panel 4071 detects a touch operation on or near it, it passes the operation to the processor 410 to determine the type of touch event, and the processor 410 then provides the corresponding visual output on the display panel 4061 according to that type. Although in Fig. 5 the touch panel 4071 and the display panel 4061 are two independent components implementing the input and output functions of the electronic device, in some embodiments they can be integrated to implement those functions; this is not specifically limited here.
The interface unit 408 is the interface through which an external device is connected to the electronic device 400. For example, the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port and the like. The interface unit 408 can be used to receive input (for example, data or power) from an external device and transmit it to one or more elements within the electronic device 400, or to transfer data between the electronic device 400 and the external device.
The memory 409 can be used to store software programs and various data. The memory 409 may mainly include a program storage area and a data storage area; the program storage area may store an operating system and the application programs required by at least one function (such as a sound playback function or an image playback function), and the data storage area may store data created through the use of the phone (such as audio data and a phone book). In addition, the memory 409 may include a high-speed random access memory and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device or another non-volatile solid-state storage device.
The processor 410 is the control centre of the electronic device. It connects the various parts of the whole device through various interfaces and lines, and performs the various functions of the device and processes data by running or executing the software programs and/or modules stored in the memory 409 and calling the data stored in the memory 409, thereby monitoring the device as a whole. The processor 410 may include one or more processing units; preferably, it may integrate an application processor, which mainly handles the operating system, user interface and application programs, and a modem processor, which mainly handles wireless communication. It will be understood that the modem processor may also not be integrated into the processor 410.
The electronic device 400 may also include a power supply 411 (such as a battery) for supplying power to the various components; preferably, the power supply 411 is logically connected to the processor 410 through a power management system, which manages charging, discharging, power consumption and related functions.
In addition, the electronic device 400 includes some functional modules that are not shown, which are not described further here.
Preferably, an embodiment of the present invention also provides an electronic device, including a processor 410, a memory 409 and a computer program stored on the memory 409 and executable on the processor 410; when executed by the processor 410, the computer program implements each process of the above imaging method embodiment and achieves the same technical effect, which is not repeated here to avoid repetition.
An embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements each process of the above imaging method embodiment and achieves the same technical effect, which is not repeated here to avoid repetition. The computer-readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
It should be noted that, as used herein, the terms "comprise" and "include", or any other variant thereof, are intended to cover a non-exclusive inclusion, so that a process, method, article or apparatus that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or apparatus. Without further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article or apparatus that includes it.
From the description of the above embodiments, a person skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus the necessary general-purpose hardware platform, or of course by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part that contributes to the prior art, can be embodied in the form of a software product; the computer software product is stored in a storage medium (such as a ROM/RAM, magnetic disk or optical disc) and includes several instructions that cause a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device or the like) to execute the methods described in the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the specific embodiments described, which are illustrative rather than restrictive. Under the teaching of the present invention, a person of ordinary skill in the art can devise many further forms without departing from the spirit of the invention and the scope protected by the claims, all of which fall within the protection of the present invention.

Claims (11)

  1. An imaging device, the device comprising:
    a body having an accommodating cavity and an opening communicating with the accommodating cavity;
    a cover plate covering the opening;
    a target emitting component arranged at the bottom of the body, the bottom facing the cover plate, the target emitting component carrying a plurality of emitting regions arranged at intervals;
    a diffractive optical component arranged on a side of the cover plate, the diffractive optical component carrying a plurality of diffraction regions, the positions of the diffraction regions corresponding one-to-one to the positions of the emitting regions; and
    a receiving component configured to receive an image formed after laser signals emitted by the target emitting component pass through the diffractive optical component.
  2. The device according to claim 1, wherein each of the emitting regions is provided with a reference emitting point, the plurality of reference emitting points form a reference emitting sub-region, and the reference emitting sub-region is configured to independently emit a reference laser signal.
  3. The device according to claim 1, wherein each of the diffraction regions comprises a plurality of diffraction holes, and the arrangement and/or shape of the diffraction holes differs between the diffraction regions.
  4. The device according to claim 1, further comprising:
    a collimating optical component arranged between the target emitting component and the diffractive optical component, the collimating optical component being configured to collimate the laser signals emitted by the target emitting component.
  5. An imaging method, applied to an electronic device, the method comprising:
    sequentially controlling a plurality of emitting regions, arranged at intervals in a target emitting component, to emit laser signals respectively;
    diffracting the plurality of laser signals respectively through a plurality of diffraction regions in a diffractive optical component, wherein the positions of the diffraction regions correspond one-to-one to the positions of the emitting regions; and
    superimposing images received by a receiving component to obtain a target image, wherein the images are acquired by illuminating a target object with the plurality of laser signals after they have passed through the diffractive optical component.
  6. The method according to claim 5, wherein sequentially controlling the plurality of emitting regions arranged at intervals in the target emitting component to emit laser signals respectively comprises:
    controlling a reference emitting sub-region to emit a reference laser signal while the plurality of emitting regions arranged at intervals in the target emitting component are sequentially controlled to emit laser signals respectively, wherein the reference emitting sub-region is located within each of the emitting regions.
  7. The method according to claim 6, wherein superimposing the images received by the receiving component to obtain the target image comprises:
    determining a reference feature position according to the imaged position of the reference laser signal in each image; and
    aligning the plurality of images of the target object according to the reference feature position, and superimposing the aligned images to obtain the target image.
  8. The method according to claim 5, wherein diffracting the plurality of laser signals respectively through the plurality of diffraction regions in the diffractive optical component comprises:
    diffracting the plurality of laser signals respectively through diffraction holes of the plurality of diffraction regions in the diffractive optical component, wherein each diffraction region comprises a plurality of diffraction holes, and the arrangement and/or shape of the diffraction holes differs between the diffraction regions.
  9. The method according to claim 5, wherein before the plurality of laser signals are respectively diffracted, the method further comprises:
    collimating the laser signals emitted by the target emitting component through a collimating optical component, wherein the collimating optical component is arranged between the target emitting component and the diffractive optical component.
  10. An electronic device, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the method according to any one of claims 5 to 9.
  11. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method according to any one of claims 5 to 9.
PCT/CN2021/071495 2020-03-23 2021-01-13 成像装置、方法及电子设备 WO2021190099A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010208348.5 2020-03-23
CN202010208348.5A CN111246073B (zh) 2020-03-23 2020-03-23 成像装置、方法及电子设备

Publications (1)

Publication Number Publication Date
WO2021190099A1 true WO2021190099A1 (zh) 2021-09-30

Family

ID=70864423

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/071495 WO2021190099A1 (zh) 2020-03-23 2021-01-13 成像装置、方法及电子设备

Country Status (2)

Country Link
CN (1) CN111246073B (zh)
WO (1) WO2021190099A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111246073B (zh) * 2020-03-23 2022-03-25 维沃移动通信有限公司 成像装置、方法及电子设备
CN114185054A (zh) * 2020-08-25 2022-03-15 上海禾赛科技有限公司 用于激光雷达的激光单元以及激光雷达
CN111968516A (zh) * 2020-08-28 2020-11-20 云谷(固安)科技有限公司 一种显示面板及显示装置
CN114615397B (zh) * 2020-12-09 2023-06-30 华为技术有限公司 Tof装置及电子设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109597211A (zh) * 2018-12-25 2019-04-09 深圳奥比中光科技有限公司 一种投影模组、深度相机以及深度图像获取方法
US20190196215A1 (en) * 2017-12-21 2019-06-27 Lg Electronics Inc. Mobile terminal and method for controlling the same
CN110058424A (zh) * 2019-03-27 2019-07-26 努比亚技术有限公司 一种激光衍射装置、3d装置及终端
CN110275381A (zh) * 2019-06-26 2019-09-24 业成科技(成都)有限公司 结构光发射模组及应用其的深度感测设备
CN111246073A (zh) * 2020-03-23 2020-06-05 维沃移动通信有限公司 成像装置、方法及电子设备

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6737634B2 (en) * 2002-01-16 2004-05-18 The University Of Chicago Use of multiple optical vortices for pumping, mixing and sorting
CN100420934C (zh) * 2002-11-14 2008-09-24 南开大学 一种近场扫描光学显微镜定位扫描成像方法
US7704644B2 (en) * 2005-01-25 2010-04-27 University Of Delaware Zero-alignment method for tunable fabrication of three-dimensional photonic crystals by multiple-exposure laser interference using diffraction gratings patterned on a single mask
US20140307055A1 (en) * 2013-04-15 2014-10-16 Microsoft Corporation Intensity-modulated light pattern for active stereo
WO2016191717A1 (en) * 2015-05-28 2016-12-01 Vixar Vcsels and vcsel arrays designed for improved performance as illumination sources and sensors
KR101892013B1 (ko) * 2016-05-27 2018-08-27 엘지전자 주식회사 이동 단말기
CN106569382B (zh) * 2016-10-26 2018-07-06 深圳奥比中光科技有限公司 激光投影仪及其深度相机
DE102017215850B4 (de) * 2017-09-08 2019-12-24 Robert Bosch Gmbh Verfahren zur Herstellung eines diffraktiven optischen Elements, LIDAR-System mit einem diffraktiven optischen Element und Kraftfahrzeug mit einem LIDAR-System
CN207854012U (zh) * 2017-12-28 2018-09-11 宁波舜宇光电信息有限公司 基于结构光的深度相机
CN108490628B (zh) * 2018-03-12 2020-01-10 Oppo广东移动通信有限公司 结构光投射器、深度相机和电子设备
CN108490635B (zh) * 2018-03-23 2019-12-13 深圳奥比中光科技有限公司 一种结构光投影模组和深度相机
JP2019190910A (ja) * 2018-04-20 2019-10-31 スタンレー電気株式会社 画像データ生成装置
CN108828702A (zh) * 2018-06-06 2018-11-16 Oppo广东移动通信有限公司 衍射光学元件、光电模组、输入输出组件及电子设备
CN110891131A (zh) * 2018-09-10 2020-03-17 北京小米移动软件有限公司 摄像头模组、处理方法及装置、电子设备、存储介质
CN109788195B (zh) * 2019-01-04 2021-04-16 Oppo广东移动通信有限公司 电子设备和移动平台
CN109618085B (zh) * 2019-01-04 2021-05-14 Oppo广东移动通信有限公司 电子设备和移动平台
CN209657072U (zh) * 2019-01-15 2019-11-19 深圳市安思疆科技有限公司 一种不含准直透镜的结构光投射模组以及3d成像装置

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190196215A1 (en) * 2017-12-21 2019-06-27 Lg Electronics Inc. Mobile terminal and method for controlling the same
CN109597211A (zh) * 2018-12-25 2019-04-09 深圳奥比中光科技有限公司 一种投影模组、深度相机以及深度图像获取方法
CN110058424A (zh) * 2019-03-27 2019-07-26 努比亚技术有限公司 一种激光衍射装置、3d装置及终端
CN110275381A (zh) * 2019-06-26 2019-09-24 业成科技(成都)有限公司 结构光发射模组及应用其的深度感测设备
CN111246073A (zh) * 2020-03-23 2020-06-05 维沃移动通信有限公司 成像装置、方法及电子设备

Also Published As

Publication number Publication date
CN111246073A (zh) 2020-06-05
CN111246073B (zh) 2022-03-25

Similar Documents

Publication Publication Date Title
WO2021190099A1 (zh) 成像装置、方法及电子设备
US11769273B2 (en) Parameter obtaining method and terminal device
US20220004357A1 (en) Audio signal outputting method and terminal device
US11798504B2 (en) Ambient light detection method and electronic device
WO2021129776A1 (zh) 成像处理方法和电子设备
CN109407832B (zh) 一种终端设备的控制方法及终端设备
WO2020192324A1 (zh) 界面显示方法及终端设备
US20220053082A1 (en) Application interface display method and mobile terminal
WO2019184814A1 (zh) 指纹识别方法及移动终端
WO2021121265A1 (zh) 摄像头启动方法及电子设备
US20220321120A1 (en) Touch button, control method, and electronic device
WO2020220893A1 (zh) 截图方法及移动终端
WO2021057290A1 (zh) 信息控制方法及电子设备
WO2021190387A1 (zh) 检测结果输出的方法、电子设备及介质
WO2021129850A1 (zh) 语音消息播放方法及电子设备
CN110661949A (zh) 显示面板的控制方法及电子设备
CN110990172A (zh) 一种应用分享方法、第一电子设备及计算机可读存储介质
WO2021208890A1 (zh) 截屏方法及电子设备
WO2021143669A1 (zh) 一种获取配置信息的方法及电子设备
WO2020135175A1 (zh) 信息提醒方法及装置
WO2021104450A1 (zh) 电子设备及其音量调节方法
WO2020216181A1 (zh) 终端设备及其控制方法
WO2020151490A1 (zh) 应用的控制方法及终端设备
WO2021104232A1 (zh) 显示方法及电子设备
WO2021190370A1 (zh) 截图方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21775750

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21775750

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 27/02/2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21775750

Country of ref document: EP

Kind code of ref document: A1