CN107071277B - Optical drawing shooting device and method and mobile terminal

Info

Publication number
CN107071277B
Authority
CN
China
Prior art keywords
image
reference plane
light source
track
camera
Legal status
Active
Application number
CN201710209579.6A
Other languages
Chinese (zh)
Other versions
CN107071277A (en)
Inventor
陈小翔
张腾
Current Assignee
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Application filed by Nubia Technology Co Ltd
Priority to CN201710209579.6A
Publication of CN107071277A
Application granted
Publication of CN107071277B

Classifications

    • H04N23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04N23/76 Circuitry for compensating brightness variation in the scene by influencing the image signals
    • H04N23/80 Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science
  • Signal Processing
  • Multimedia
  • Human Computer Interaction
  • Computer Networks & Wireless Communication
  • Studio Devices

Abstract

The invention provides an optical drawing shooting device and method and a mobile terminal. In a long exposure mode, a first image is collected through a first camera module of a dual camera module and a second image is collected through a second camera module; while the first camera module collects the first image, the depth of field of the light source track in the first image is determined from the first image and the second image; the actual track of the part exceeding the reference plane is determined from the depth of field and from the projection of that part onto the reference plane; and the part of the light source track in the first image that does not exceed the reference plane is then synthesized with the actual track of the part that does. By correcting the light source track in real time during shooting, the invention effectively avoids the influence of deviated tracks on the optical drawing, improves optical drawing precision, and reduces the probability and duration of post-processing.

Description

Optical drawing shooting device and method and mobile terminal
Technical Field
The present invention relates to the field of photography, and more particularly, to an optical drawing shooting apparatus and method, and a mobile terminal.
Background
A light painting work is produced in the long exposure mode of a camera: the content of the current scene is acquired continuously, generally forming the track of one or more moving light sources, and that track is imaged in a single picture. Light can thus be used to draw all kinds of works, which is why light painting is also known as light art. In actual light painting creation, the creator first decides the subject and content of the work, for example what figure is to be drawn; it should be noted that the figure is planar, because the image is planar. A painter then carries the light source and, under the long exposure mode of the camera, moves it along the track the creator requires. The actual movement, however, takes place in three-dimensional space: the real track cannot lie entirely in one plane, and some part of the light source track inevitably strays outside the range of the reference plane. For that part, the camera records the projection of the track onto the reference plane rather than the real track, which inevitably causes the finished light painting to deviate from the design and affects its accuracy. In addition, conventional light painting works are produced with a single camera; even when a terminal with two cameras is used, the whole shot is completed by only one of them, so dual-camera production of light paintings has not been involved. Since light painting is generally performed in dim environments, the image-quality shortcomings of a single camera cannot be avoided.
Disclosure of Invention
The technical problem to be solved by the invention is how to reduce the deviation between the actual drawing and the intended design in prior-art optical drawing. To address this problem, an optical drawing shooting device is provided, which includes:
the shooting module is used for collecting, in a long exposure mode, a first image through a first camera module of the dual camera module and a second image through a second camera module of the dual camera module;
the determining module is used for determining, while the first camera module collects the first image, the depth of field of a light source track in the first image according to the first image and the second image;
the correction module is used for determining the actual track of the part of the light source track exceeding the reference plane according to the depth of field of the light source track and the projection of that part onto the reference plane;
and the synthesis module is used for synthesizing the part of the light source track in the first image that does not exceed the reference plane with the actual track of the part that exceeds it.
Optionally, the synthesis module is further configured to smoothly connect the part of the light source track that does not exceed the reference plane with the actual track of the part that exceeds it.
Optionally, the first image is a color image and the second image is a black-and-white image; the synthesis module is further configured to:
synthesize the part of the light source track in the second image that does not exceed the reference plane with the actual track of the part that exceeds it;
and superpose the synthesized first image and the synthesized second image.
Optionally, the reference plane comprises a preset plane together with the space within a preset distance on either side of that plane.
The invention also provides a mobile terminal, characterized by comprising the above optical drawing shooting device.
The invention also provides an optical drawing shooting method, which includes the following steps:
in a long exposure mode, collecting a first image through a first camera module of a dual camera module and a second image through a second camera module of the dual camera module;
while the first camera module collects the first image, determining the depth of field of a light source track in the first image according to the first image and the second image;
determining the actual track of the part of the light source track exceeding a reference plane according to the depth of field of the light source track and the projection of that part onto the reference plane;
and synthesizing the part of the light source track in the first image that does not exceed the reference plane with the actual track of the part that exceeds it.
Optionally, synthesizing the part of the light source track that does not exceed the reference plane with the actual track of the part that exceeds it includes: smoothly connecting the part of the light source track that does not exceed the reference plane with the actual track of the part that exceeds it.
Optionally, the first image is a color image and the second image is a black-and-white image; after synthesizing the part of the light source track that does not exceed the reference plane with the actual track of the part that exceeds it, the method further includes the following steps:
synthesizing the part of the light source track in the second image that does not exceed the reference plane with the actual track of the part that exceeds it;
and superposing the synthesized first image and the synthesized second image.
Optionally, the reference plane comprises a preset plane together with the space within a preset distance on either side of that plane.
Optionally, the reference plane is determined in one of the following ways: a preset focal plane is taken as the reference plane; or the plane in which the starting point of the light source track lies is taken as the reference plane; or the plane at the average distance of the actual light source track is taken as the reference plane.
Advantageous effects
The invention provides an optical drawing shooting device and method and a mobile terminal. In a long exposure mode, a first image is collected through a first camera module of a dual camera module and a second image through a second camera module; while the first camera module collects the first image, the depth of field of the light source track in the first image is determined from the first image and the second image; the actual track of the part exceeding the reference plane is determined from the depth of field and from the projection of that part onto the reference plane; and the part of the light source track in the first image that does not exceed the reference plane is then synthesized with the actual track of the part that does. By correcting the light source track in real time during shooting, the invention effectively avoids the influence of deviated tracks on the optical drawing, improves optical drawing precision, and reduces the probability and duration of post-processing.
Drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
fig. 1 is a schematic diagram of a hardware structure of an alternative mobile terminal for implementing various embodiments of the present invention;
FIG. 2 is an electrical schematic diagram of an alternative camera for implementing various embodiments of the invention;
FIG. 3 is a schematic composition diagram of an optical drawing shooting apparatus according to a first embodiment of the present invention;
fig. 4 is a schematic structural diagram of a terminal of a dual camera module according to a first embodiment of the present invention;
FIG. 5 is a schematic diagram of a first image according to a first embodiment of the present invention;
fig. 6 is a schematic diagram of a movement track of a light source perpendicular to a shooting direction according to a first embodiment of the present invention;
FIG. 7 is a schematic diagram of a synthesized first image according to a first embodiment of the present invention;
fig. 8 is a schematic diagram illustrating a mobile terminal according to a second embodiment of the present invention;
fig. 9 is a flowchart of an optical drawing shooting method according to a third embodiment of the present invention.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
A mobile terminal implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, the suffix "unit" used to denote elements is adopted only to facilitate the description of the present invention and has no specific meaning in itself.
The mobile terminal may be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and navigation devices, as well as stationary terminals such as digital TVs and desktop computers. In the following it is assumed that the terminal is a mobile terminal; however, those skilled in the art will understand that the configuration according to the embodiments of the present invention can also be applied to fixed-type terminals, except for any elements specifically intended for mobile purposes. The mobile terminal in this embodiment may implement the optical drawing shooting device of each embodiment of the present invention.
Fig. 1 is a schematic diagram of a hardware structure of an optional mobile terminal for implementing various embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. Fig. 1 illustrates a mobile terminal having various components, but it should be understood that not all illustrated components are required; more or fewer components may be implemented instead. The elements of the mobile terminal are described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a mobile communication unit 112, a wireless internet unit 113, a short range communication unit 114, and a location information unit 115.
The mobile communication unit 112 transmits and/or receives radio signals to and/or from at least one of a base station (e.g., access point, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet unit 113 supports wireless internet access of the mobile terminal. The unit may be internally or externally coupled to the terminal. The wireless internet access technology to which the unit relates may include WLAN (wireless LAN) (Wi-Fi), Wibro (wireless broadband), Wimax (worldwide interoperability for microwave access), HSDPA (high speed downlink packet access), and the like.
The short-range communication unit 114 is a unit for supporting short-range communication. Some examples of short-range communication technologies include bluetooth (TM), Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), zigbee (TM), and the like.
The location information unit 115 is a unit for checking or acquiring location information of the mobile terminal. A typical example of the location information unit is a GPS (global positioning system). According to the current technology, the GPS unit 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information, thereby accurately calculating three-dimensional current location information according to longitude, latitude, and altitude. Currently, a method for calculating position and time information uses three satellites and corrects an error of the calculated position and time information by using another satellite. In addition, the GPS unit 115 can calculate speed information by continuously calculating current position information in real time.
The A/V input unit 120 is used to receive an audio or video signal. The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capturing apparatus in a video capturing mode or an image capturing mode, and the processed image frames may be displayed on the display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110; two or more cameras 121 may be provided depending on the construction of the mobile terminal. The microphone 122 can receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the mobile communication unit 112 for output. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to a command input by a user to control various operations of the mobile terminal. The user input unit 130 allows a user to input various types of information, and may include a keyboard, dome sheet, touch pad (e.g., a touch-sensitive member that detects changes in resistance, pressure, capacitance, and the like due to being touched), scroll wheel, joystick, and the like. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen may be formed.
The sensing unit 140 detects a current state of the mobile terminal 100 (e.g., an open or closed state of the mobile terminal 100), a position of the mobile terminal 100, presence or absence of contact (i.e., touch input) by a user with the mobile terminal 100, an orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates a command or signal for controlling an operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 may sense whether the slide-type phone is opened or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a light sensor 141.
The interface unit 170 serves as an interface through which at least one external device can be connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification unit, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification unit may store various information for authenticating a user of the mobile terminal 100 and may include a user identity unit (UIM), a subscriber identity unit (SIM), a universal subscriber identity unit (USIM), and the like. In addition, a device having an identification unit (hereinafter referred to as an "identification device") may take the form of a smart card; accordingly, the identification device may be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100, or may be used to transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100 or may serve as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or power input from the cradle may be used as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner.
The output unit 150 may include a display unit 151, an audio output unit 152, and the like.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a User Interface (UI) or a Graphical User Interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or an image and related functions, and the like.
Meanwhile, when the display unit 151 and the touch pad are overlapped with each other in the form of a layer to form a touch screen, the display unit 151 may serve as an input device and an output device. The display unit 151 may include at least one of a Liquid Crystal Display (LCD), a thin film transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light emitting diode) display or the like. Depending on the particular desired implementation, the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect a touch input pressure as well as a touch input position and a touch input area.
The audio output unit 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output as sound when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 152 may include a speaker, a buzzer, and the like.
The memory 160 may store software programs and the like for processing and controlling operations performed by the controller 180, or may temporarily store data (e.g., a phonebook, messages, still images, videos, and the like) that has been or will be output. Also, the memory 160 may store data regarding various ways of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. Also, the mobile terminal 100 may cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communications, video calls, and the like. In addition, the controller 180 may include a multimedia unit 181 for reproducing (or playing back) multimedia data, and the multimedia unit 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
The power supply unit 190 receives external power or internal power and provides appropriate power required to operate various elements and components under the control of the controller 180.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180. For a software implementation, implementations such as procedures or functions may be realized with separate software units that each perform at least one function or operation. The software code may be implemented by a software application (or program) written in any suitable programming language, and may be stored in the memory 160 and executed by the controller 180.
Up to this point, mobile terminals have been described in terms of their functionality. Hereinafter, for the sake of brevity, a slide-type mobile terminal will be described as an example among the various types of mobile terminals, such as folder-type, bar-type, swing-type, and slide-type mobile terminals. However, the present invention can be applied to any type of mobile terminal and is not limited to the slide-type mobile terminal.
The mobile terminal 100 as shown in fig. 1 may be configured to operate with communication systems such as wired and wireless communication systems and satellite-based communication systems that transmit data via frames or packets.
Fig. 2 is an electrical block diagram of an alternative camera implementing various embodiments of the invention.
The photographing lens 1211 is composed of a plurality of optical lenses for forming an object image, and is a single focus lens or a zoom lens. The photographing lens 1211 is movable in the optical axis direction under the control of the lens driver 1221, and the lens driver 1221 controls the focal position of the photographing lens 1211 in accordance with a control signal from the lens driving control circuit 1222. The lens drive control circuit 1222 performs drive control of the lens driver 1221 in accordance with a control command from the microcomputer 1217.
An image pickup device 1212 is disposed on the optical axis of the photographing lens 1211 near the position of the object image formed by the photographing lens 1211. The image pickup device 1212 is used to pick up an image of an object and acquire picked-up image data. Photodiodes constituting each pixel are two-dimensionally arranged in a matrix on the image pickup device 1212. Each photodiode generates a photoelectric conversion current corresponding to the amount of received light, and the photoelectric conversion current is charged by a capacitor connected to each photodiode. A bayer RGB color filter is disposed on the front surface of each pixel.
The image pickup device 1212 is connected to an image pickup circuit 1213, and the image pickup circuit 1213 performs charge accumulation control and image signal reading control in the image pickup device 1212, performs waveform shaping after reducing reset noise for the read image signal (analog image signal), and further performs gain improvement or the like so as to obtain an appropriate signal level.
The imaging circuit 1213 is connected to an a/D converter 1214, and the a/D converter 1214 performs analog-to-digital conversion on the analog image signal and outputs a digital image signal (hereinafter referred to as image data) to the bus 1227.
The bus 1227 is a transfer path for transferring various data read out or generated inside the camera. The A/D converter 1214 described above is connected to the bus 1227, as are an image processor 1215, a JPEG processor 1216, a microcomputer 1217, an SDRAM (Synchronous Dynamic Random Access Memory) 1218, a memory interface (hereinafter referred to as memory I/F) 1219, and an LCD (Liquid Crystal Display) driver 1220.
The image processor 1215 performs various image processing such as OB subtraction processing, white balance adjustment, color matrix operation, gamma conversion, color difference signal processing, noise removal processing, synchronization processing, and edge processing on the image data output from the image pickup device 1212. The JPEG processor 1216 compresses image data read out from the SDRAM 1218 by the JPEG compression method when recording it on the recording medium 1225, and decompresses JPEG image data for image reproduction and display. When decompressing, a file recorded on the recording medium 1225 is read out, decompressed in the JPEG processor 1216, and the decompressed image data is temporarily stored in the SDRAM 1218 and displayed on the LCD 1226. In the present embodiment the JPEG system is used as the image compression/decompression system, but the compression/decompression system is not limited to this; other systems such as MPEG, TIFF, and H.264 may be used.
The microcomputer 1217 functions as a control unit of the entire camera, and collectively controls various processing sequences of the camera. The microcomputer 1217 is connected to an operation unit 1223 and a flash memory 1224.
The operation unit 1223 includes, but is not limited to, physical and virtual keys, such as a power button, a photographing key, an editing key, a moving image button, a reproduction button, a menu button, a cross key, an OK button, a delete button, an enlargement button, and other input controls, and detects the operation states of these controls.
The detection result is output to the microcomputer 1217. A touch panel is provided on the front surface of the LCD 1226 serving as the display; it detects the user's touch position and outputs it to the microcomputer 1217. The microcomputer 1217 executes various processing sequences corresponding to the user's operation according to the detection result of the operation position from the operation unit 1223.
The flash memory 1224 stores programs for executing various processing sequences of the microcomputer 1217. The microcomputer 1217 controls the entire camera according to the program. The flash memory 1224 stores various adjustment values of the camera, and the microcomputer 1217 reads the adjustment values and controls the camera in accordance with the adjustment values.
The SDRAM 1218 is an electrically rewritable volatile memory for temporarily storing image data and the like. The SDRAM 1218 temporarily stores the image data output from the A/D converter 1214 and the image data processed by the image processor 1215, the JPEG processor 1216, and the like.
The memory interface 1219 is connected to the recording medium 1225, and performs control for writing and reading image data and data such as a file header added to the image data to and from the recording medium 1225. The recording medium 1225 is, for example, a recording medium such as a memory card that can be attached to and detached from the camera body, but is not limited to this, and may be a hard disk or the like that is built in the camera body.
The LCD driver 1220 is connected to the LCD 1226. Image data processed by the image processor 1215 is stored in the SDRAM 1218 and, when display is required, read out and displayed on the LCD 1226; alternatively, image data compressed by the JPEG processor 1216 is stored in the SDRAM 1218 and, when display is required, the JPEG processor 1216 reads out the compressed data, decompresses it, and the decompressed image data is displayed on the LCD 1226.
The LCD 1226 is disposed on the back surface of the camera body and displays images. The display is not limited to an LCD; various other display panels, such as organic EL, may be used.
The following is a detailed description of specific examples.
First embodiment
Referring to fig. 3, fig. 3 is a schematic diagram illustrating the composition of an optical drawing shooting device according to the first embodiment of the invention.
The optical drawing shooting device in this embodiment includes:
the shooting module 301 is used for acquiring a first image through a first camera module of the double camera modules and acquiring a second image through a second camera module of the double camera modules in a long exposure mode;
the determining module 302 is configured to determine a depth of field of a light source track in the first image according to the first image and the second image when the first camera module collects the first image;
the correction module 303 is configured to determine an actual trajectory of a portion exceeding the reference plane according to the depth of field of the light source trajectory and a projection of the portion exceeding the reference plane in the light source trajectory on the reference plane;
and a synthesizing module 304, configured to synthesize a portion of the light source trajectory in the first image that does not exceed the reference plane with an actual trajectory that exceeds the reference plane.
More and more mobile terminals are equipped with a dual-camera function, that is, they carry a dual camera module consisting of a first camera module and a second camera module. When shooting, the two camera modules can shoot independently; their results can then be synthesized, or the image from one of the modules can be selected as the final output. Based on the dual-camera technology, functions that a single camera cannot realize, such as depth-of-field determination, 3D scanning, auxiliary focusing, and motion recognition, become possible; even with parameters equal or similar to those of a single camera, a dual camera can produce a clearer picture. Referring to fig. 4, fig. 4 is a schematic diagram of a terminal using the dual-camera technology, showing the positions of its two camera modules. At any given moment the images captured by the two modules differ slightly in position, because the two modules cannot coincide and are separated by a fixed distance; since they shoot simultaneously, however, both modules image the same scene at the instant of capture.
In this embodiment, the shooting module 301 collects a first image and a second image simultaneously through the first camera module and the second camera module, respectively, in the long exposure mode. The first image and the second image may be captured with identical or different parameters as required. For example, their pixel counts may differ, giving the two images different resolutions; their brightness may also differ, e.g., the first image may be collected through a large aperture and thus be brighter, while the second image is collected through a small aperture and is darker. When producing the final output, the first image and the second image may be synthesized into one image, or the image of better quality may be selected as the final image. Although the first image and the second image are distinguished in this embodiment, it is worth mentioning that their roles can be interchanged at will: there is no substantial difference between the first camera module and the second camera module, and either module's output may serve as the first image or the second image according to the specific output requirements.
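By way of illustration only, the following Python sketch shows what such per-module capture settings might look like; the concrete resolutions, f-numbers, and exposure time are assumptions, not values fixed by the patent:

```python
from dataclasses import dataclass

@dataclass
class CaptureConfig:
    resolution: tuple   # (width, height) in pixels
    f_number: float     # smaller f-number = larger aperture = brighter frame
    exposure_s: float   # long-exposure duration shared by both modules

# Hypothetical settings: a brighter color frame and a dimmer, higher-resolution
# monochrome frame captured simultaneously by the two modules.
first_cfg = CaptureConfig(resolution=(3264, 2448), f_number=1.8, exposure_s=20.0)
second_cfg = CaptureConfig(resolution=(4032, 3024), f_number=2.4, exposure_s=20.0)
```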
The long exposure mode is an exposure mode that selects a slow shutter and a long exposure time. By gathering light for longer, this mode can capture dimly lit scenes more clearly, and can also capture continuous scenes such as traffic flow and waterfalls. Light painting exploits this characteristic of long exposure to continuously record the movement track of a light source within a single frame. In the long exposure mode, the shooting process is continuous; in other words, a dynamic scene is condensed into a static image. Referring to fig. 5, fig. 5 shows a schematic diagram of a first image in which a line pattern is formed by a light source track. It can be seen that, because the movement of the light source is controlled by the light painter, the track is not strictly confined to one plane, and the resulting first image is irregular. With continuing reference to fig. 6, fig. 6 is a schematic diagram of the motion of the light source perpendicular to the shooting direction while the first image of fig. 5 is captured. The second image is taken simultaneously with the first image, so its rendering is similar to that of the first image.
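To make the accumulation idea concrete, a minimal Python sketch of how a long exposure turns a moving light into a persistent trail; the per-frame maximum used here is an assumption for illustration, not the sensor-level integration the camera actually performs:

```python
import numpy as np

def accumulate_light_painting(frames):
    """Keep, per pixel, the brightest value seen across the exposure, so the
    moving light source leaves a persistent trail on the canvas."""
    canvas = np.zeros_like(frames[0])
    for frame in frames:
        canvas = np.maximum(canvas, frame)
    return canvas

# Usage with a list of (H, W) grayscale frames sampled during the exposure:
# trail = accumulate_light_painting(frames)
```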
If the first image and the second image are to be synthesized, then for a better final result and higher image quality, in this embodiment one of them may be set as a color image and the other as a black-and-white image. At present, the color camera of a typical terminal almost always uses an RGBG-arranged Bayer array, or its RGBW variant, as the pixel layout. In such an array, light entering each pixel is partially filtered by a color filter, and only the intensity of the remaining color is retained. Since each pixel can thus record the information of only one color, the later imaging process must perform an inverse Bayer operation (demosaicing) on each group of four RGBG pixels: by referring to the color intensities of adjacent pixels, the algorithm reconstructs an estimate of the original color. Clearly, this reconstruction inevitably loses picture detail and affects image quality; that is, an image shot in this way is distorted to a certain degree. A black-and-white camera is different: it does not need to record and restore the color information of the subject, each pixel independently stores the gray level of the picture, the detail of the picture is preserved to the greatest extent, and sharpness is improved. A dual-camera scheme that pairs a color camera with a black-and-white camera can therefore synthesize the images shot by the two, retaining the color information of the color camera while preserving the definition of the black-and-white camera, and thereby improving image quality.
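A minimal sketch of the inverse Bayer (demosaicing) step described above, using a crude box-filter interpolation for an RGGB mosaic; real camera pipelines use more sophisticated interpolation, so this is illustration only:

```python
import numpy as np
from scipy.signal import convolve2d

def demosaic_rggb(mosaic):
    """Reconstruct an (H, W, 3) RGB image from an (H, W) RGGB Bayer mosaic.
    Each channel is known only at a quarter (R, B) or half (G) of the pixels;
    missing values are filled in from neighbors, which is where detail is lost."""
    h, w = mosaic.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1.0
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1.0
    g_mask = 1.0 - r_mask - b_mask
    out = np.zeros((h, w, 3))
    kernel = np.ones((3, 3))
    for ch, mask in enumerate((r_mask, g_mask, b_mask)):
        sparse = mosaic * mask
        num = convolve2d(sparse, kernel, mode="same")  # sum of nearby samples
        den = convolve2d(mask, kernel, mode="same")    # count of nearby samples
        out[..., ch] = num / np.maximum(den, 1e-9)     # local average estimate
    return out
```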
In this embodiment, the determining module 302 is configured to determine, while the first camera module collects the first image, the depth of field of the light source track in the first image according to the first image and the second image. During the long exposure, the first camera module and the second camera module acquire their respective images simultaneously; over the course of the exposure both images gradually build up, their content changing as exposure time elapses and the light source moves. At any point, the depth of field of the light source track in the currently captured content can be determined from the first image and the second image. The depth measurement follows the same principle as dual-camera ranging: the depth of field of the light source track in the first image can be determined from the fixed distance between the two cameras together with the imaging angle of the subject, or with reference to the shooting focal length. The depth of field here means the distance between the light source track and the reference plane, or between the light source track and the focal plane; since the reference plane and the focal plane are both determined planes, a depth of field determined by either method fixes the position of the light source track relative to the reference plane at the moment of shooting.
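A minimal sketch of the dual-camera ranging principle just mentioned: with a fixed baseline between the two modules, the depth of a light point follows from its disparity between the two images via Z = f·B/d. The numbers in the usage line are assumptions for illustration:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Two-view triangulation: Z = f * B / d, with the focal length f in
    pixels, the baseline B (fixed distance between the two camera modules)
    in metres, and d the horizontal shift of the same light point between
    the first and the second image, in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# e.g. f = 2800 px, B = 12 mm, d = 14 px  ->  Z = 2.4 m from the cameras
print(depth_from_disparity(2800.0, 0.012, 14.0))
```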
The reference plane in this embodiment may be set as follows: the reference plane comprises a preset plane together with the space within a preset distance on either side of it. Mathematically, a plane is strictly defined and has no extent or thickness, but in actual shooting the camera's framing range is limited, and the position at which the camera images sharply, namely the focal plane, also has a certain allowable range, namely the depth of field around the focal plane. The preset plane can thus be regarded as an ordinary plane flanked on both sides by a space of preset distance, and this preset distance may be set equal to the depth of field of the focal plane, or slightly larger, or slightly smaller. If slightly larger, the allowable shooting error is greater and a somewhat less sharp image may result; if slightly smaller, the allowable shooting error is smaller and a sharper image may be obtained.
Specifically, in this embodiment the reference plane may be: a preset focal plane; or the plane in which the starting point of the light source track lies; or the plane at the average distance of the actual light source track. The preset focal plane is the plane containing the focus point during shooting. Taking the plane of the track's starting point as the reference plane usually also amounts to taking the focal plane, since the starting point of the light source track is generally the point focused on. The plane at the average distance of the actual track is the plane through the middle position of the light painter's movement along the direction perpendicular to the shooting direction during the creation. These choices help ensure the accuracy of the correction and the efficiency of the synthesis process.
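A minimal sketch combining the two preceding paragraphs: picking a reference depth by one of the three options, and flagging track samples that fall outside the plane plus its tolerance band. The mode names and the preset focal-plane constant are assumptions:

```python
import numpy as np

FOCAL_PLANE_DEPTH_M = 2.0  # assumed preset focal-plane distance, in metres

def reference_depth(track_depths, mode="start"):
    """Reference-plane depth per the three options: preset focal plane,
    plane of the track's starting point, or plane at the track's average
    distance. Depths are metres from the camera."""
    if mode == "focal":
        return FOCAL_PLANE_DEPTH_M
    if mode == "start":
        return float(track_depths[0])
    return float(np.mean(track_depths))  # mode == "mean"

def beyond_reference(track_depths, z_ref, tolerance_m):
    """Boolean mask of samples outside the preset plane plus the space
    within the preset distance on either side of it."""
    return np.abs(np.asarray(track_depths) - z_ref) > tolerance_m

# depths = [2.0, 2.1, 2.6, 1.9]; beyond_reference(depths, 2.0, 0.3)
# -> [False, False, True, False]: only the 2.6 m sample is deviated.
```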
In this embodiment, the correction module 303 is configured to determine the actual track of the part exceeding the reference plane according to the depth of field of the light source track and the projection, onto the reference plane, of that part of the light source track. Because the imaging of the camera modules is referenced to the reference plane, the content in the first and second images collected by the first and second camera modules is the projection of the light source track onto the reference plane. If the current light source track lies on the reference plane, it can be considered accurate and consistent with the light painter's intention; if it lies off the reference plane, that part of the light source track is considered deviated and contrary to expectation. For those parts of the light source track that exceed the reference plane, the actual track can be determined from their projection onto the reference plane together with the depth of field derived earlier. Since the light painter's movement along the direction perpendicular to the shooting direction is not regular, the division of the light source track into the part within the reference plane and the part exceeding it is likewise irregular.
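A geometric sketch of this correction under an assumed pinhole model (the patent does not spell out the formula): a point at depth z images at coordinates proportional to 1/z, so moving it back to the reference depth z_ref rescales its image position about the principal point by z/z_ref:

```python
def correct_to_reference(u, v, z, z_ref, cx, cy):
    """Map a recorded projection (u, v) of a light point at depth z to the
    position it would occupy had the light stayed on the reference plane
    at depth z_ref, keeping the same lateral (X, Y) position. Pinhole
    model: (u, v) = (cx + f*X/z, cy + f*Y/z), so the rescaling factor
    about the principal point (cx, cy) is z / z_ref."""
    scale = z / z_ref
    return cx + (u - cx) * scale, cy + (v - cy) * scale

# A point recorded at (1200, 900) at depth 2.6 m, principal point (960, 540),
# reference plane at 2.0 m: the corrected position moves outward by 1.3x.
print(correct_to_reference(1200.0, 900.0, 2.6, 2.0, 960.0, 540.0))
```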
In this embodiment, the synthesizing module 304 is configured to synthesize the part of the light source track in the first image that does not exceed the reference plane with the actual track of the part that exceeds it. Once the correction module 303 has determined the actual track of the part exceeding the reference plane, that part of the track can be recorded in either of two ways: it can be stored, in an image file or a file of another format, in association with the first image; or, during the exposure, the projection recorded in the first image on the reference plane can be replaced in real time by the actual track. With the second way, the correction module 303 determines the actual track of the part of the light source track exceeding the reference plane and the synthesizing module 304 synthesizes the part not exceeding the reference plane with that actual track as shooting proceeds; with the first way, after shooting is completed, the part of the first image in which the light source track does not exceed the reference plane is combined with the recorded actual track of the part that does. Referring to fig. 7, fig. 7 shows a schematic diagram of the first image after track synthesis; it can be seen that the originally irregular pattern becomes more regular after synthesis and better matches the creator's intention.
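A minimal sketch of the replace-and-redraw synthesis, using the correct_to_reference helper sketched above; the (u, v, z, intensity) sample layout is an assumption for illustration:

```python
import numpy as np

def synthesize_track(image, samples, z_ref, tol, cx, cy):
    """Erase deviated track samples from the exposed image and redraw them
    at their corrected positions; in-plane samples stay where the exposure
    recorded them. `samples` is an iterable of (u, v, z, intensity)."""
    out = image.copy()
    h, w = out.shape[:2]
    for u, v, z, val in samples:
        if abs(z - z_ref) > tol:  # part exceeding the reference plane
            vi0 = int(np.clip(round(v), 0, h - 1))
            ui0 = int(np.clip(round(u), 0, w - 1))
            out[vi0, ui0] = 0     # remove the recorded projection
            uc, vc = correct_to_reference(u, v, z, z_ref, cx, cy)
            vi1 = int(np.clip(round(vc), 0, h - 1))
            ui1 = int(np.clip(round(uc), 0, w - 1))
            out[vi1, ui1] = val   # redraw on the actual track
    return out
```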
In addition, if during synthesis there is a break between the two tracks, which may stem from an error in the correction process or from the light painter lingering too long at one position, the synthesizing module 304 may further be used to smoothly connect the part of the light source track that does not exceed the reference plane with the actual track of the part that does, that is, to make the transition between the two tracks as natural and unabrupt as possible.
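A minimal stand-in for that smooth connection, bridging a break with linearly interpolated points; a real implementation might fit a spline through neighboring samples instead:

```python
import numpy as np

def bridge_gap(p0, p1, n=10):
    """Insert n points on the segment between two disjoint track endpoints
    p0 and p1 (each a (u, v) pair), excluding the endpoints themselves."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    t = np.linspace(0.0, 1.0, n + 2)[1:-1]   # interior parameter values only
    return p0 + t[:, None] * (p1 - p0)

# bridge_gap((10, 40), (16, 52), n=3) -> [[11.5, 43.], [13., 46.], [14.5, 49.]]
```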
In addition, when the first image is a color image and the second image is a black-and-white image, that is, when the first camera module is a color module and the second a black-and-white module, combining the first image with the second image markedly improves image quality. The synthesizing module 304 may then further be used to synthesize, in the second image as well, the part of the light source track that does not exceed the reference plane with the actual track of the part that does, and then superpose the synthesized first image and the synthesized second image. The track synthesis in the second image proceeds just as in the first image; after identical synthesis, the two results are the same except that one is color and the other black-and-white. Superposing them then compensates the color distortion of the color image and improves image quality.
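A minimal sketch of one common way to superpose a color frame and a mono frame: keep the chroma of the color image and transfer the luminance detail of the mono image. The patent does not fix the exact blend, so this heuristic is an assumption:

```python
import numpy as np

def fuse_color_mono(color_rgb, mono):
    """Superpose synthesized color and mono frames (floats in [0, 1]):
    compute the Rec.601 luma of the color frame and rescale its RGB so the
    result carries the mono frame's luminance detail."""
    y = (0.299 * color_rgb[..., 0]
         + 0.587 * color_rgb[..., 1]
         + 0.114 * color_rgb[..., 2])
    ratio = mono / np.maximum(y, 1e-6)            # per-pixel detail transfer
    return np.clip(color_rgb * ratio[..., None], 0.0, 1.0)
```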
This embodiment provides an optical drawing shooting device: in a long exposure mode, a first image is collected through a first camera module of a dual camera module and a second image through a second camera module; while the first camera module collects the first image, the depth of field of the light source track in the first image is determined from the first image and the second image; the actual track of the part exceeding the reference plane is determined from the depth of field and from the projection of that part onto the reference plane; and the part of the light source track in the first image that does not exceed the reference plane is then synthesized with the actual track of the part that does. By correcting the light source track in real time during shooting, this embodiment effectively avoids the influence of deviated tracks on the light painting, improves light painting precision, and reduces the probability and duration of post-processing.
Second embodiment
Referring to fig. 8, fig. 8 is a schematic composition diagram of a mobile terminal according to the present embodiment.
The mobile terminal in this embodiment comprises the above optical drawing shooting device, which includes a shooting module, a determining module, a correction module, and a synthesizing module. The shooting module includes a dual camera module, namely a first camera module and a second camera module, which can be implemented by the camera 121 in the A/V input unit 120 of the foregoing embodiment, here denoted a first camera 1211 and a second camera 1212; the determining module, the correction module, and the synthesizing module can be implemented by the controller 180 of the foregoing embodiments. A specific implementation is as follows:
in the long exposure mode, the first camera 1211 acquires a first image while the second camera 1212 acquires a second image;
when the first camera 1211 collects the first image, the controller 180 determines the depth of field of the light source track in the first image according to the first image and the second image; determines the actual track of the part exceeding the reference plane according to that depth of field and the projection of the part exceeding the reference plane onto the reference plane; and finally synthesizes the part of the light source track in the first image that does not exceed the reference plane with the actual track of the part that does.
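Tying the earlier sketches together, a hypothetical outline of the flow the controller 180 implements; extract_track_samples is an assumed helper that pairs track pixels with dual-camera depths, and cam bundles the assumed intrinsics (cx, cy):

```python
def extract_track_samples(first, second, cam):
    """Assumed helper: locate bright track pixels and pair them with depths
    from first/second-image disparities (see depth_from_disparity). Stubbed
    here; a real version would run detection and stereo matching."""
    raise NotImplementedError

def light_painting_pipeline(first_frames, second_frames, z_ref, tol, cam):
    """Capture -> depth -> correction -> synthesis -> fusion, stitched from
    the sketches above; all signatures are illustrative assumptions."""
    first = accumulate_light_painting(first_frames)     # color long exposure
    second = accumulate_light_painting(second_frames)   # mono long exposure
    samples = extract_track_samples(first, second, cam) # (u, v, z, value)
    first_fixed = synthesize_track(first, samples, z_ref, tol, cam.cx, cam.cy)
    second_fixed = synthesize_track(second, samples, z_ref, tol, cam.cx, cam.cy)
    return fuse_color_mono(first_fixed, second_fixed)
```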
In the present embodiment, in the long exposure mode, the first camera 1211 and the second camera 1212 simultaneously capture a first image and a second image, respectively; the first image and the second image can acquire images with consistent or inconsistent parameters according to requirements, for example, pixels of the first image and the second image can be different, so that the resolution of the first image and the resolution of the second image are different; the brightness of the first image and the second image may be different, for example, the first image may be collected through a large aperture, and the brightness of the first image is higher; the second image can be collected through a small aperture with low brightness. When the piece is selected, the first image and the second image can be synthesized, and the synthesized image is used as a final image; alternatively, the final image is selected from the images with better quality. In the embodiment, although the first image and the second image are distinguished, it should be noted that the positions of the first image and the second image may be arbitrarily interchanged, there is no essential difference between the first camera 1211 and the second camera 1212, and the first image or the second image captured by the first camera 1211 or the second camera 1212 may be regarded as the first image or the second image according to the specific slicing requirement.
The long exposure mode is an exposure mode which selects a slow shutter and has long exposure time, the mode can shoot scenes with dark light more clearly through light feeding for a longer time, and can also shoot continuous scenes such as traffic flow, waterfall and the like, and the optical drawing is to continuously collect the moving track of a light source on a piece of film by utilizing the characteristic of long exposure, so that the aim of optical drawing is fulfilled. In the long exposure mode, the shooting process is continuous, in other words, a dynamic scene is reflected in a static image. Referring to fig. 5, fig. 5 shows a schematic diagram of a first image, in which a line pattern is formed by a light source track, and it can be seen that the first image is not strictly in a plane due to the movement of the light source controlled by a renderer, and the formed first image is not regular. With continuing reference to fig. 6, fig. 6 is a schematic diagram of the motion trajectory of the light source perpendicular to the shooting direction when the first image as shown in fig. 5 is shot. The second image and the first image are taken simultaneously, so that the effect map of the second image is similar to that of the first image.
If the first image and the second image are to be synthesized, then for better sheeting effect and better image quality of sheeting, in this embodiment, the first image and the second image may be set to be a color image and the other a black-and-white image. At present, a common terminal color camera almost adopts a bayer array or a variant RGBW array arranged in RGBG as pixel distribution, and in such an array, light is partially filtered by an optical filter in a process of entering each pixel point, and only the intensity of the partial color is retained. In this arrangement, since each pixel can only record information of one color, in the later imaging process, it is necessary to perform inverse bayer operation with every four RGBG pixels as a group, specifically, by referring to the color intensity on the adjacent pixels, the algorithm is synthesized and reduced to the original color. Obviously, during the process of restoring, the picture inevitably has loss of picture details, and has certain influence on the picture quality, namely, the image shot in this way has a certain degree of distortion. The black-and-white cameras are different from the black-and-white cameras, the black-and-white cameras do not need to record and restore color information of a shot object, each pixel point can independently store the gray information of a picture, the detail information of the picture can be kept to the greatest extent, and the sharpness is improved. Therefore, the double-camera scheme adopts a color camera and a black-and-white camera, and can synthesize images shot by the color camera and the black-and-white camera, thereby not only keeping the color information of the color camera, but also ensuring the definition of the black-and-white camera, and further improving the image quality.
In this embodiment, while the first camera 1211 acquires the first image, the controller 180 determines the depth of field of the light source track in the first image according to the first image and the second image. During the long exposure, the first camera 1211 and the second camera 1212 acquire their respective images simultaneously; while the first image and the second image are being exposed, both are in a gradual process of formation, and the content captured in each changes as the exposure time elapses and the light source moves. At this time, the depth of field of the light source track in the currently captured content is determined from the first image and the second image. The depth measurement method follows the same principle as dual-camera ranging: the depth of field of the light source track in the first image can be determined from the fixed distance between the two cameras, the imaging angle of the photographed object, or with reference to the shooting focal length. The depth of field referred to here means the distance between the light source track and the reference plane, or the distance between the light source track and the focal plane; since the reference plane and the focal plane are both determined planes, the depth of field obtained by either method can determine the positional information of the light source track relative to the reference plane at the time of shooting.
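For illustration, dual-camera ranging reduces to the standard stereo relation Z = f * B / d, where f is the focal length in pixels, B is the fixed baseline between the two cameras, and d is the disparity of the same point between the two views. A minimal Python/NumPy sketch (function and variable names are hypothetical, not from the patent):

    import numpy as np

    def depth_from_disparity(u_left, u_right, focal_px, baseline_m):
        """Depth of a point seen at column u_left in one rectified view
        and u_right in the other: Z = f * B / (u_left - u_right)."""
        disparity = np.asarray(u_left, dtype=float) - np.asarray(u_right, dtype=float)
        disparity = np.where(np.abs(disparity) < 1e-6, np.nan, disparity)  # avoid /0
        return focal_px * baseline_m / disparity

    # Example: f = 1000 px, B = 2 cm, disparity 10 px -> Z = 2 m.
    print(depth_from_disparity(510.0, 500.0, 1000.0, 0.02))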
The setting of the reference plane in this embodiment may be as follows: the reference plane comprises a preset plane together with the space within a preset distance on either side of that plane. Mathematically, a plane is strictly defined and has no size or thickness; in actual shooting, however, the camera's field of view is limited, and the position at which the camera captures a sharp image, namely the focal plane, also has a certain tolerance range, namely the depth of field of the focal plane. That is, the preset plane may be regarded as a generalized plane that includes the space within a preset distance on both of its sides, and the preset distance may be set equal to the depth of field of the focal plane, or slightly larger, or slightly smaller. If it is slightly larger, a larger shooting error is tolerated and the captured image may be less sharp; if it is slightly smaller, a smaller shooting error is tolerated and a sharper image can be captured.
Specifically, in this embodiment the reference plane may be: a preset focal plane; or the plane in which the starting point of the light source track lies; or the plane at the average distance of the actual track of the light source. The preset focal plane is the plane in which the focus lies during shooting. Taking the plane of the track's starting point as the reference plane usually amounts to taking the focal plane, because the starting point of the light source track is generally the point focused on. Taking the plane at the average distance of the actual track as the reference plane means using the plane at the middle position of the light painter's movement, in the direction perpendicular to the shooting direction, during the light painting creation. This ensures both the accuracy of the correction and the efficiency of the synthesis process.
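The choice of reference plane and its tolerance band can be made concrete with a small sketch (Python/NumPy; a minimal illustration assuming per-point depths of the track are available, with hypothetical names — the patent prescribes no code):

    import numpy as np

    def reference_depth(track_depths, mode="start"):
        """Pick the reference-plane depth for a light-painting session.

        track_depths: per-point depths of the light source track so far;
        mode "start": plane of the track's starting point (usually the
        focal plane); mode "mean": plane at the average distance of the
        actual track.
        """
        d = np.asarray(track_depths, dtype=float)
        return float(d[0]) if mode == "start" else float(d.mean())

    def exceeds_reference(depth, ref_depth, tolerance):
        """True where a track point lies outside the tolerance band, i.e.
        beyond the preset distance on either side of the preset plane."""
        return np.abs(np.asarray(depth, dtype=float) - ref_depth) > tolerance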
In the present embodiment, the controller 180 determines the actual track of the portion exceeding the reference plane based on the depth of field of the light source track and the projection, onto the reference plane, of the portion of the light source track that exceeds the reference plane. Since the imaging of the cameras is based on the reference plane, the content of the first and second images acquired by the first camera 1211 and the second camera 1212 is in effect the projection of the light source track onto the reference plane. If the current light source track lies on the reference plane, the track can be considered accurate and consistent with the light painter's intention; if the current light source track is not on the reference plane but beyond it, that portion of the track is considered to have deviated and is not what was intended. For those portions of the light source track that do not meet the expectation, i.e. that exceed the reference plane, the actual track can be determined from their projection onto the reference plane and the previously obtained depth of field of the light source track. Since the light painter's movement in the direction perpendicular to the shooting direction is not regular, the distribution of the light source track between the portion that does not exceed the reference plane and the portion that does is likewise not regular.
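One plausible reading of this correction — an assumption made here for illustration, not spelled out in the patent — is that the recorded pixel, together with the measured depth, fixes the point's lateral position through the pinhole model, and the "actual track" is that position re-projected at the reference-plane depth. A minimal Python sketch with hypothetical names:

    def correct_track_point(u, v, z, z_ref, cx, cy):
        """Re-project a deviated track point onto the reference plane.

        (u, v): pixel where the light source was imaged; (cx, cy): the
        principal point; z: measured depth of the point; z_ref: depth of
        the reference plane. Recovering the lateral position via the
        pinhole model (X = (u - cx) * z / f) and re-projecting it at
        z_ref cancels f, leaving a scaling about the principal point.
        """
        s = z / z_ref
        return cx + (u - cx) * s, cy + (v - cy) * s

    # Example: a point imaged 100 px right of center, but 10% beyond the
    # reference plane (z = 1.1 * z_ref), is restored to 110 px from center.
    print(correct_track_point(740.0, 360.0, 1.1, 1.0, 640.0, 360.0))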
In this embodiment, the controller 180 synthesizes the portion of the light source track in the first image that does not exceed the reference plane with the actual track of the portion that does. When the controller 180 determines the actual track of the portion exceeding the reference plane, that portion of the track may be recorded, either by saving it in an image file or a file of another format associated with the first image, or by replacing, in real time during the exposure, the projection recorded on the reference plane in the first image with the actual track. If the second approach is adopted, the synthesis of the portion not exceeding the reference plane with the actual track of the portion exceeding it is effectively completed at the same time as the actual track is determined; in the first approach, after shooting is completed, the portion of the first image in which the light source track does not exceed the reference plane is synthesized with the recorded actual track of the portion that does.
In addition, if a break appears between the two tracks during synthesis, which may be caused by an error in the correction process or by the light painter staying in the same position for too long, the controller 180 may further smoothly connect the actual track of the portion of the light source track that does not exceed the reference plane with the actual track of the portion that does, that is, make the transition between the two tracks as natural as possible so that it does not appear abrupt.
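As an illustration of such smoothing — the patent names no particular method — a cubic Hermite bridge that matches position and tangent at both ends is one standard choice. A Python/NumPy sketch with hypothetical names:

    import numpy as np

    def bridge_tracks(p_end, t_end, p_start, t_start, n=20):
        """Smoothly connect two track segments with a cubic Hermite curve.

        p_end/t_end: last point and tangent of the first segment;
        p_start/t_start: first point and tangent of the second segment.
        Matching positions and tangents at both ends keeps the joint
        from looking abrupt. Returns n intermediate 2-D points.
        """
        p0, m0 = np.asarray(p_end, float), np.asarray(t_end, float)
        p1, m1 = np.asarray(p_start, float), np.asarray(t_start, float)
        t = np.linspace(0.0, 1.0, n)[:, None]
        h00 = 2 * t**3 - 3 * t**2 + 1
        h10 = t**3 - 2 * t**2 + t
        h01 = -2 * t**3 + 3 * t**2
        h11 = t**3 - t**2
        return h00 * p0 + h10 * m0 + h01 * p1 + h11 * m1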
In addition, when the first image is a color image and the second image is a black-and-white image, that is, when the first camera module is a color module and the second camera module is a black-and-white module, synthesizing the two can markedly improve the output quality. In that case the controller 180 may further: synthesize the portion of the light source track in the second image that does not exceed the reference plane with the actual track of the portion that does, and then superimpose the synthesized first image and the synthesized second image. The track synthesis in the second image is similar to that in the first image; after the same synthesis is performed, the results are identical except that one is in color and the other in black and white. Superimposing the two then compensates for the color distortion of the color image and improves the image quality.
This embodiment provides a mobile terminal. In the long exposure mode, a first image is acquired through a first camera module of a dual camera module and a second image is acquired through a second camera module; while the first camera module acquires the first image, the depth of field of the light source track in the first image is determined according to the first image and the second image; the actual track of the portion exceeding the reference plane is determined according to the depth of field and the projection of that portion of the light source track onto the reference plane; and the portion of the light source track in the first image that does not exceed the reference plane is then synthesized with the actual track of the portion that does. By correcting the light source track in real time during shooting, this embodiment effectively prevents deviated tracks from affecting the light painting, improves the precision of the light painting, and reduces the likelihood of and time required for post-processing.
Third embodiment
Referring to fig. 9, fig. 9 is a flowchart of the optical drawing shooting method provided in this embodiment, comprising:
S901: in the long exposure mode, acquiring a first image through a first camera module of the dual camera modules, and acquiring a second image through a second camera module of the dual camera modules;
S902: while the first camera module acquires the first image, determining the depth of field of the light source track in the first image according to the first image and the second image;
S903: determining the actual track of the portion exceeding the reference plane according to the depth of field of the light source track and the projection of that portion of the light source track onto the reference plane;
S904: synthesizing the portion of the light source track in the first image that does not exceed the reference plane with the actual track of the portion that does.
In S901, in the long exposure mode, a first image and a second image are simultaneously acquired through the first camera module and the second camera module, respectively. The first image and the second image may be acquired with identical or different parameters as required; for example, the pixel counts of the two images may differ, so that the first image and the second image have different resolutions; the brightness of the two images may also differ, for example, the first image may be captured through a large aperture and therefore be brighter, while the second image is captured through a small aperture and is darker. When the final image is produced, the first image and the second image may be synthesized and the synthesized result used as the final image; alternatively, whichever of the two images has the better quality may be selected as the final image. Although the first image and the second image are distinguished in this embodiment, it is worth mentioning that their roles are interchangeable: there is no essential difference between the first camera module and the second camera module, and the image captured by either module may be treated as the first image or the second image according to the specific output requirements.
The long exposure mode is an exposure mode that uses a slow shutter and a long exposure time. By gathering light for a longer period, this mode can capture dimly lit scenes more clearly, and can also capture continuous scenes such as traffic flow and waterfalls. Light painting exploits this characteristic of long exposure to continuously record the movement track of a light source on a single frame, thereby achieving the purpose of light painting. In the long exposure mode, the shooting process is continuous; in other words, a dynamic scene is recorded in a static image. Referring to fig. 5, fig. 5 shows a schematic diagram of a first image in which a line pattern is formed by the light source track. It can be seen that, because the movement of the light source is controlled by the light painter, the track is not strictly confined to one plane, and the resulting first image is irregular. With continuing reference to fig. 6, fig. 6 is a schematic diagram of the motion track of the light source in the direction perpendicular to the shooting direction when the first image shown in fig. 5 is captured. Since the second image is captured simultaneously with the first image, the appearance of the second image is similar to that of the first image.
If the first image and the second image are to be synthesized, then, for a better output and higher image quality, in this embodiment one of the first image and the second image may be set as a color image and the other as a black-and-white image. At present, nearly all common terminal color cameras use a Bayer array (arranged as RGBG) or its variant, the RGBW array, as the pixel distribution. In such an array, as light enters each pixel it is partially filtered by a color filter, and only the intensity of one color component is retained. Because each pixel can record the information of only one color, in the subsequent imaging process an inverse Bayer operation must be performed on each group of four RGBG pixels; specifically, by referring to the color intensities of adjacent pixels, an algorithm synthesizes and restores the original colors. Clearly, this restoration inevitably loses picture detail and affects image quality to some extent; that is, an image captured in this way is distorted to a certain degree. A black-and-white camera is different: it does not need to record and restore the color information of the photographed object, and each pixel independently stores the grayscale information of the picture, so the detail of the picture is preserved to the greatest extent and the sharpness is improved. Therefore, the dual-camera scheme uses one color camera and one black-and-white camera and synthesizes the images captured by the two, thereby retaining the color information of the color camera while preserving the sharpness of the black-and-white camera, and thus improving the image quality.
In S902, while the first camera module acquires the first image, the depth of field of the light source track in the first image is determined according to the first image and the second image. During the long exposure, the first camera module and the second camera module acquire their respective images simultaneously; while the first image and the second image are being exposed, both are in a gradual process of formation, and the content captured in each changes as the exposure time elapses and the light source moves. At this time, the depth of field of the light source track in the currently captured content is determined from the first image and the second image. The depth measurement method follows the same principle as dual-camera ranging: the depth of field of the light source track in the first image can be determined from the fixed distance between the two camera modules, the imaging angle of the photographed object, or with reference to the shooting focal length. The depth of field referred to here means the distance between the light source track and the reference plane, or the distance between the light source track and the focal plane; since the reference plane and the focal plane are both determined planes, the depth of field obtained by either method can determine the positional information of the light source track relative to the reference plane at the time of shooting.
The setting of the reference plane in this embodiment may be as follows: the reference plane comprises a preset plane together with the space within a preset distance on either side of that plane. Mathematically, a plane is strictly defined and has no size or thickness; in actual shooting, however, the camera's field of view is limited, and the position at which the camera captures a sharp image, namely the focal plane, also has a certain tolerance range, namely the depth of field of the focal plane. That is, the preset plane may be regarded as a generalized plane that includes the space within a preset distance on both of its sides, and the preset distance may be set equal to the depth of field of the focal plane, or slightly larger, or slightly smaller. If it is slightly larger, a larger shooting error is tolerated and the captured image may be less sharp; if it is slightly smaller, a smaller shooting error is tolerated and a sharper image can be captured.
Specifically, in this embodiment the reference plane may be: a preset focal plane; or the plane in which the starting point of the light source track lies; or the plane at the average distance of the actual track of the light source. The preset focal plane is the plane in which the focus lies during shooting. Taking the plane of the track's starting point as the reference plane usually amounts to taking the focal plane, because the starting point of the light source track is generally the point focused on. Taking the plane at the average distance of the actual track as the reference plane means using the plane at the middle position of the light painter's movement, in the direction perpendicular to the shooting direction, during the light painting creation. This ensures both the accuracy of the correction and the efficiency of the synthesis process.
In S903, the actual track of the portion exceeding the reference plane is determined according to the depth of field of the light source track and the projection, onto the reference plane, of the portion of the light source track that exceeds the reference plane. Since the imaging of the camera modules is based on the reference plane, the content of the first and second images acquired by the first and second camera modules is in effect the projection of the light source track onto the reference plane. If the current light source track lies on the reference plane, the track can be considered accurate and consistent with the light painter's intention; if the current light source track is not on the reference plane but beyond it, that portion of the track is considered to have deviated and is not what was intended. For those portions of the light source track that do not meet the expectation, i.e. that exceed the reference plane, the actual track can be determined from their projection onto the reference plane and the previously obtained depth of field of the light source track. Since the light painter's movement in the direction perpendicular to the shooting direction is not regular, the distribution of the light source track between the portion that does not exceed the reference plane and the portion that does is likewise not regular.
In S904, the portion of the light source track in the first image that does not exceed the reference plane is synthesized with the actual track of the portion that does. When the correction module determines the actual track of the portion exceeding the reference plane, that portion of the track may be recorded, either by saving it in an image file or a file of another format associated with the first image, or by replacing, in real time during the exposure, the projection recorded on the reference plane in the first image with the actual track. If the second approach is adopted, the synthesis module effectively synthesizes the portion not exceeding the reference plane with the actual track of the portion exceeding it at the same time as the correction module determines that actual track; in the first approach, after shooting is completed, the portion of the first image in which the light source track does not exceed the reference plane is synthesized with the recorded actual track of the portion that does. Referring to fig. 7, fig. 7 shows a schematic diagram of the first image after track synthesis; it can be seen that after synthesis the originally irregular pattern becomes more regular and better matches the creator's intention.
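Purely as an illustration of the second, real-time approach — a toy sketch under assumptions not stated in the patent, with all names and thresholds hypothetical — the following Python/NumPy loop accumulates corrected bright pixels frame by frame during the exposure, so the canvas always holds the corrected track:

    import numpy as np

    def accumulate_light_painting(frames, depths, z_ref, tol, cx, cy,
                                  threshold=0.8):
        """Toy long-exposure loop: bright (light-source) pixels are located
        in each frame; any pixel whose depth falls outside the reference
        plane's tolerance band is re-projected onto the reference plane
        before being written to the canvas.

        frames: iterable of HxW grayscale frames in [0, 1];
        depths: matching per-frame depth maps from the dual camera;
        z_ref, tol: reference-plane depth and tolerance (same units);
        cx, cy: principal point of the camera.
        """
        canvas = None
        for frame, depth in zip(frames, depths):
            if canvas is None:
                canvas = np.zeros_like(frame)
            ys, xs = np.nonzero(frame > threshold)
            for y0, x0 in zip(ys, xs):
                v, z = frame[y0, x0], depth[y0, x0]
                x, y = x0, y0
                if abs(z - z_ref) > tol:          # deviated: re-project
                    s = z / z_ref
                    x = int(round(cx + (x0 - cx) * s))
                    y = int(round(cy + (y0 - cy) * s))
                if 0 <= y < canvas.shape[0] and 0 <= x < canvas.shape[1]:
                    canvas[y, x] = max(canvas[y, x], v)
        return canvas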
In addition, if a break appears between the two tracks during synthesis, which may be caused by an error in the correction process or by the light painter staying in the same position for too long, the actual track of the portion of the light source track that does not exceed the reference plane and the actual track of the portion that does may be smoothly connected, that is, the transition between the two tracks is made as natural as possible so that it does not appear abrupt.
In addition, when the first image is a color image and the second image is a black-and-white image, that is, when the first camera module is a color module and the second camera module is a black-and-white module, synthesizing the two can markedly improve the output quality. In that case the method may further comprise: synthesizing the portion of the light source track in the second image that does not exceed the reference plane with the actual track of the portion that does, and then superimposing the synthesized first image and the synthesized second image. The track synthesis in the second image is similar to that in the first image; after the same synthesis is performed, the results are identical except that one is in color and the other in black and white. Superimposing the two then compensates for the color distortion of the color image and improves the image quality.
In the long exposure mode, a first image is acquired through a first camera module of the dual camera modules and a second image is acquired through a second camera module; while the first camera module acquires the first image, the depth of field of the light source track in the first image is determined according to the first image and the second image; the actual track of the portion exceeding the reference plane is determined according to the depth of field and the projection of that portion of the light source track onto the reference plane; and the portion of the light source track in the first image that does not exceed the reference plane is then synthesized with the actual track of the portion that does. By correcting the light source track in real time during shooting, this embodiment effectively prevents deviated tracks from affecting the light painting, improves the precision of the light painting, and reduces the likelihood of and time required for post-processing.
While the present invention has been described with reference to the embodiments shown in the drawings, these embodiments are illustrative rather than restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (8)

1. An optical drawing shooting device, comprising:
a shooting module, configured to acquire, in a long exposure mode, a first image through a first camera module of dual camera modules and a second image through a second camera module of the dual camera modules;
a determining module, configured to determine, while the first camera module acquires the first image, the depth of field of a light source track in the first image according to the first image and the second image;
a correction module, configured to determine the actual track of the portion exceeding a reference plane according to the depth of field of the light source track and the projection of that portion of the light source track onto the reference plane, wherein the reference plane comprises a preset plane and the space within a preset distance on either side of the preset plane; and
a synthesis module, configured to synthesize the portion of the light source track in the first image that does not exceed the reference plane with the actual track of the portion that exceeds it.
2. The optical drawing shooting device of claim 1, wherein the synthesis module is further configured to: smoothly connect the portion of the light source track that does not exceed the reference plane with the actual track of the portion that exceeds it.
3. The optical drawing shooting device of claim 1, wherein the first image is a color image and the second image is a black-and-white image, and the synthesis module is further configured to:
synthesize the portion of the light source track in the second image that does not exceed the reference plane with the actual track of the portion that exceeds it; and
superimpose the synthesized first image and the synthesized second image.
4. A mobile terminal, comprising the optical drawing shooting device of any one of claims 1-3.
5. An optical drawing shooting method, comprising:
in a long exposure mode, acquiring a first image through a first camera module of dual camera modules, and acquiring a second image through a second camera module of the dual camera modules;
while the first camera module acquires the first image, determining the depth of field of a light source track in the first image according to the first image and the second image;
determining the actual track of the portion exceeding a reference plane according to the depth of field of the light source track and the projection of that portion of the light source track onto the reference plane, wherein the reference plane comprises a preset plane and the space within a preset distance on either side of the preset plane; and
synthesizing the portion of the light source track in the first image that does not exceed the reference plane with the actual track of the portion that exceeds it.
6. The optical drawing shooting method of claim 5, wherein synthesizing the portion of the light source track in the first image that does not exceed the reference plane with the actual track of the portion that exceeds it comprises: smoothly connecting the portion of the light source track in the first image that does not exceed the reference plane with the actual track of the portion that exceeds it.
7. The optical drawing shooting method of claim 5, wherein the first image is a color image and the second image is a black-and-white image, and after synthesizing the portion of the light source track in the first image that does not exceed the reference plane with the actual track of the portion that exceeds it, the method further comprises:
synthesizing the portion of the light source track in the second image that does not exceed the reference plane with the actual track of the portion that exceeds it; and
superimposing the synthesized first image and the synthesized second image.
8. The optical drawing shooting method of any one of claims 5-7, wherein the reference plane comprises: a preset focal plane as the reference plane, or the plane in which the starting point of the light source track lies as the reference plane, or the plane at the average distance of the actual track of the light source track as the reference plane.