CN107483815B - Method and device for shooting moving object - Google Patents


Info

Publication number
CN107483815B
Authority
CN
China
Prior art keywords
distance
structured light
image
camera
shooting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710677576.5A
Other languages
Chinese (zh)
Other versions
CN107483815A (en)
Inventor
张学勇 (Zhang Xueyong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201710677576.5A priority Critical patent/CN107483815B/en
Publication of CN107483815A publication Critical patent/CN107483815A/en
Application granted granted Critical
Publication of CN107483815B publication Critical patent/CN107483815B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/67: Focus control based on electronic image sensor signals
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/64: Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02: Services making use of location information
    • H04W4/029: Location-based management or tracking services

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a method and a device for shooting a moving object. The method comprises: while the object is moving, projecting a structured light source onto the object at a plurality of time points, and capturing a plurality of structured light images of the structured light source as modulated by the object; demodulating the phase corresponding to the pixel at the deformed position in each structured light image, and generating corresponding depth-of-field information from the phase; determining a motion curve between the object and the camera from the depth-of-field information of the plurality of structured light images, and predicting from the motion curve the distance and direction of the object relative to the camera when a preset time point is reached; and adjusting the focusing stroke curve of the camera according to the distance and direction, then capturing an image when the preset time point is reached. Fast focusing is thereby achieved, the moving object is photographed clearly, and image sharpness is improved.

Description

Method and device for shooting moving object
Technical Field
The invention relates to the technical field of photographing with terminal devices, and in particular to a method and a device for photographing a moving object.
Background
In the related art, whether an object is far from or near the lens, focusing is required for it to be imaged sharply at a fixed position behind the lens.
However, the conventional approach of first judging the position after the object is in place and only then focusing creates a time lag between the shooting moment and the object's real-time position; when the object moves quickly, its trajectory cannot be tracked in time, so pictures of the moving object come out blurred.
Disclosure of Invention
The invention provides a method and a device for shooting a moving object, aiming to solve the technical problem in the prior art that a moving object cannot be focused quickly, so that the resulting shot is not clear.
An embodiment of the invention provides a method for shooting a moving object, comprising: while the object is moving, projecting a structured light source onto the object at a plurality of time points, and capturing a plurality of structured light images of the structured light source as modulated by the object; demodulating the phase corresponding to the pixel at the deformed position in each structured light image, and generating corresponding depth-of-field information from the phase; determining a motion curve between the object and the camera from the depth-of-field information of the plurality of structured light images, and predicting from the motion curve the distance and direction of the object relative to the camera when a preset time point is reached; and adjusting the focusing stroke curve of the camera according to the distance and direction, then shooting an image when the preset time point is reached.
Another embodiment of the present invention provides a photographing device for a moving object, comprising: a shooting module, configured to project the structured light source onto the object at a plurality of time points while the object is moving, and to capture a plurality of structured light images of the structured light source as modulated by the object; a generating module, configured to demodulate the phase corresponding to the pixel at the deformed position in each structured light image and to generate corresponding depth-of-field information from the phase; a first determining module, configured to determine a motion curve between the object and the camera from the depth-of-field information of the plurality of structured light images; a prediction module, configured to predict from the motion curve the distance and direction of the object relative to the camera when a preset time point is reached; and a focusing module, configured to adjust the focusing stroke curve of the camera according to the distance and direction, so that the shooting module shoots the image when the preset time point is reached.
Still another embodiment of the present invention provides a terminal device, including a memory and a processor, where the memory stores computer-readable instructions, and the instructions, when executed by the processor, cause the processor to execute the method for capturing a moving object according to the above embodiment of the present invention.
Yet another embodiment of the present invention provides a non-transitory computer-readable storage medium on which a computer program is stored, which, when executed by a processor, implements a photographing method of a moving object according to the above-described embodiment of the present invention.
The technical scheme provided by the embodiment of the invention has the following beneficial effects:
While the object is moving, a structured light source is projected onto the object at a plurality of time points and a plurality of structured light images modulated by the object are captured; the phase corresponding to the pixel at the deformed position in each structured light image is demodulated and corresponding depth-of-field information is generated from the phase; a motion curve between the object and the camera is determined from the depth-of-field information of the plurality of structured light images, the distance and direction of the object relative to the camera at a preset time point are predicted from the motion curve, the focusing stroke curve of the camera is adjusted according to the distance and direction, and the image is then shot when the preset time point is reached. Fast focusing is thereby achieved, the moving object is photographed clearly, and image sharpness is improved.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a flowchart of a photographing method of a moving object according to an embodiment of the present invention;
FIG. 2(a) is a schematic diagram of scene one of structured light measurement according to one embodiment of the present invention;
FIG. 2(b) is a schematic diagram of scene two of structured light measurement according to one embodiment of the present invention;
FIG. 2(c) is a schematic diagram of scene three of structured light measurement according to one embodiment of the present invention;
FIG. 2(d) is a schematic diagram of scene four of structured light measurement according to one embodiment of the present invention;
FIG. 2(e) is a schematic diagram of scene five of structured light measurement according to one embodiment of the present invention;
FIG. 3(a) is a schematic diagram of a partial diffractive structure of a collimating beam splitting element according to one embodiment of the present invention;
FIG. 3(b) is a schematic diagram of a partial diffractive structure of a collimating beam splitting element according to another embodiment of the present invention;
FIG. 4 is a block diagram of a configuration of a photographing device of a moving object according to an embodiment of the present invention;
FIG. 5 is a block diagram of a configuration of a photographing device of a moving object according to another embodiment of the present invention;
FIG. 6 is a block diagram of a configuration of a photographing device of a moving object according to still another embodiment of the present invention; and
fig. 7 is a schematic structural diagram of an image processing circuit in a terminal device according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
The photographing method and apparatus for a moving object according to embodiments of the present invention are described below with reference to the drawings.
It should be understood that, in the prior-art photographing method in which the camera waits for the object to arrive, determines its position and only then focuses, the trajectory of a moving object cannot be tracked quickly and a clear picture of it cannot be taken. To solve this technical problem, a photographing mode is provided in which the motion behavior of the object is predicted so that the camera is driven directly to the predicted focus position for shooting.
Specifically, fig. 1 is a flowchart of a photographing method of a moving object according to one embodiment of the present invention.
As shown in fig. 1, the method includes:
Step 101, in the process of the object moving, projecting a structured light source onto the object at a plurality of time points respectively, and capturing a plurality of structured light images of the structured light source modulated by the object.
Step 102, demodulating the phase corresponding to the pixel at the deformed position in each structured light image, and generating corresponding depth-of-field information according to the phase.
It should be noted that, according to different application scenarios, different manners may be adopted to obtain depth information of the object at the corresponding time point.
As a possible implementation, in order to improve the accuracy of the motion information acquired while the object moves, the depth-of-field information of the object is acquired based on structured light, where the structural feature of the structured light source includes laser stripes, Gray codes, sinusoidal stripes, uniform speckle or non-uniform speckle.
In this embodiment, to make clear to those skilled in the art how the depth-of-field information of an object is obtained from structured light, the principle is explained below using the widely applied grating projection (fringe projection) technique as an example; grating projection belongs to the broader class of surface structured light.
When surface structured light projection is used, as shown in fig. 2(a), sinusoidal stripes are generated by computer programming and projected onto the measured object through a projection device; a CCD camera photographs the stripes as bent by the object, and the bent stripes are demodulated to obtain the phase, which is then converted into height over the full field. The most important step is the calibration of the system, including the calibration of the system geometry and of the internal parameters of the CCD camera and the projection device; otherwise errors, or a coupling of errors, may result, because if the system's external parameters are not calibrated, the correct height information cannot be calculated from the phase.
Specifically, in the first step a sinusoidal fringe pattern is programmed. Because the phase will later be recovered from the deformed fringes, for example with a four-step phase-shifting method, four fringe patterns with a phase difference of pi/2 are generated and projected onto the object to be measured in a time-sharing manner; the pattern on the left of fig. 2(b) is then captured, together with the fringes on the reference plane shown on the right of fig. 2(b).
In the second step, phase recovery is performed: the modulated phase is calculated from the four captured deformed fringe patterns. Because the four-step phase-shifting algorithm computes the phase with the arctangent function, the result is limited to [-pi, pi] and wraps around whenever the true value exceeds this range, so the resulting phase map is a truncated (wrapped) phase map. The principal phase value obtained is shown in fig. 2(c).
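The four-step phase recovery described above can be sketched as follows. The fringe model I_k = A + B·cos(phi + k·pi/2) and the atan2 form of the estimator are the standard formulation of this technique, not taken from the patent, and the array names are illustrative.

```python
import numpy as np

def wrapped_phase(i0, i1, i2, i3):
    """Four-step phase shifting: recover the wrapped (truncated) phase.

    i0..i3 are images of the same scene under sinusoidal fringes shifted
    by 0, pi/2, pi and 3*pi/2.  With I_k = A + B*cos(phi + k*pi/2),
    the modulated phase is atan2(I3 - I1, I0 - I2), limited to [-pi, pi].
    """
    return np.arctan2(np.asarray(i3, float) - i1,
                      np.asarray(i0, float) - i2)

# Synthetic check: build four shifted fringe images from a known phase map.
phi_true = np.linspace(-1.0, 1.0, 5)          # a small 1-D "phase map"
shots = [2.0 + 0.8 * np.cos(phi_true + k * np.pi / 2) for k in range(4)]
phi_est = wrapped_phase(*shots)
```

Because phi_true stays inside (-pi, pi], the estimator recovers it exactly; on real images the same call runs per pixel over full 2-D arrays.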
Still in the second step, the phase jumps must be removed, i.e. the truncated phase must be unwrapped into a continuous phase. As shown in fig. 2(d), the modulated continuous phase is on the left and the reference continuous phase is on the right.
In the third step, the reference continuous phase is subtracted from the modulated continuous phase to obtain the phase difference, which represents the height information of the measured object relative to the reference plane. Substituting the phase difference into the phase-to-height conversion formula (whose parameters have been calibrated) yields the three-dimensional model of the object to be measured shown in fig. 2(e).
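The unwrapping and phase-to-height steps can be sketched as below, assuming the classic fringe-projection relation h = L·dphi / (dphi + 2·pi·f0·d); the patent does not give its conversion formula, so this geometry and the parameters L, d and f0 are illustrative stand-ins for the calibrated values.

```python
import numpy as np

def height_from_phase(phase_obj_wrapped, phase_ref_wrapped, L, d, f0):
    """Unwrap the modulated and reference phase maps, subtract them,
    and convert the phase difference into height above the reference plane.

    Assumed geometry (classic fringe projection, not from the patent):
        h = L * dphi / (dphi + 2*pi*f0*d)
    where L is the camera-to-reference-plane distance, d the
    camera-projector baseline and f0 the fringe frequency on the
    reference plane.  A real system calibrates all three.
    """
    dphi = np.unwrap(phase_obj_wrapped) - np.unwrap(phase_ref_wrapped)
    return L * dphi / (dphi + 2 * np.pi * f0 * d)

# Tiny 1-D example: zero phase difference means zero height.
heights = height_from_phase(np.array([0.0, 0.05, 0.1]), np.zeros(3),
                            L=1000.0, d=100.0, f0=0.1)
```

A larger phase difference maps monotonically to a larger height, which is the behavior the third step relies on.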
It should be understood that, in practical applications and depending on the application scenario, the structured light used in the embodiments of the present invention may be any pattern other than the grating.
In this embodiment, a substantially flat diffractive element may be used, carrying a relief diffraction structure with a specific phase distribution: a cross-section with a step relief structure of two or more levels. The thickness of the substrate is approximately 1 micrometer, and the heights of the steps are non-uniform, ranging from 0.7 micrometers to 0.9 micrometers. Fig. 3(a) shows a partial diffraction structure of the collimating beam-splitting element of this embodiment, and fig. 3(b) a cross-sectional side view along section A-A; the units of both the abscissa and the ordinate are micrometers.
By comparison, a general diffraction element splits a beam into multiple diffracted beams whose light intensities differ greatly, which poses a considerable risk of injury to human eyes.
The collimating beam-splitting element in this embodiment not only collimates the non-collimated beam but also splits it: after the non-collimated light reflected by the mirror passes through the element, multiple collimated beams exit at different angles, with approximately equal cross-sectional areas and approximately equal energy flux. As a result, the scattered points produced by diffracting this beam are better suited to image processing or projection, while the laser output is dispersed over the individual beams, further reducing the risk of injuring human eyes. Moreover, because speckle structured light is used, it consumes less power than uniformly arranged structured light while achieving the same acquisition effect.
Based on the above description, in this embodiment, while the object is moving, the structured light source is projected onto the object at a plurality of time points and a plurality of structured light images modulated by the object are captured; the phase corresponding to the pixel at the deformed position in each structured light image is demodulated, and the corresponding depth-of-field information is generated from the distortion of the phase. Because the object is locked onto on the basis of structured light, it can be accurately distinguished from the background environment and the locking accuracy is higher.
It should be emphasized that, in practical applications, attribute information of the object, such as differences in its shape, material, volume and moving speed, and environmental information, such as the ambient brightness, all influence the acquired depth-of-field information.
For example, if the object is small and moves quickly, then to acquire its depth-of-field information more accurately the structured light must be controlled to fall on the object, preventing the small object from falling outside the speckle pattern of the structured light source; for this purpose, the position between the structured light projector and the camera is adjusted, and the divergence area of the structured light is adjusted accordingly.
If the object is large and moves slowly, then to acquire its depth-of-field information completely the structured light must be controlled to cover the object fully, avoiding incomplete depth-of-field information for the large object caused by an undersized speckle coverage area.
For another example, in a scene with bright ambient light, the light intensity of the structured light source does not need to be large to collect the relevant information; if it is made large anyway, power is wasted and the battery endurance of the terminal device suffers.
Therefore, to improve the consistency between the acquired depth-of-field information of the object and its actual behavior, the attribute information and the environment information of the object can be acquired, and the position between the structured light projector and the camera and the brightness of the structured light source can be determined according to that attribute information and environment information.
Step 103, determining a motion curve between the object and the camera according to the depth-of-field information of the plurality of structured light images, and predicting, according to the motion curve, the distance and the direction of the object from the camera when the preset time point is reached.
Step 104, adjusting the focusing stroke curve of the camera according to the distance and the direction, and then shooting an image when the preset time point is reached.
Specifically, the motion of an object has a certain regularity: a running user, for example, follows the track at a roughly uniform speed, and a moving basketball follows an approximately parabolic trajectory with uniformly varying speed. The position of the object at the preset time point is therefore predicted on the basis of the regularity of its motion.
In the embodiment of the invention, a motion curve between the object and the camera is determined according to the depth information of the plurality of structured light images, and the distance and the direction of the object from the camera when the preset time point is reached are predicted according to the motion curve.
Then the focusing stroke curve of the camera is adjusted according to the distance and the direction, and the image is shot when the preset time point is reached. Because focusing is completed before the object actually arrives, the trajectory of the moving object can be tracked quickly and a clear picture of it is guaranteed.
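The prediction step above can be sketched as a curve fit over the per-frame depth samples. The quadratic model, the sign-of-derivative notion of "direction", and the NumPy implementation are assumptions of this sketch, since the patent does not specify its fitting method.

```python
import numpy as np

def predict_distance(timestamps, depths, t_target, degree=2):
    """Fit a motion curve to structured-light depth samples and
    extrapolate the object-to-camera distance at a preset time point.

    Returns (distance, direction): direction is +1 when the object is
    moving away from the camera and -1 when it is approaching.
    A quadratic fit is assumed here (reasonable for roughly uniform
    acceleration, e.g. a thrown ball); a real device would choose the
    model to match the observed motion.
    """
    motion = np.poly1d(np.polyfit(timestamps, depths, degree))
    distance = float(motion(t_target))
    direction = 1 if motion.deriv()(t_target) > 0 else -1
    return distance, direction

# Object receding at 0.5 m per unit time, starting 2.0 m away.
ts = np.array([0.0, 0.1, 0.2, 0.3])
zs = 2.0 + 0.5 * ts
dist, sign = predict_distance(ts, zs, 0.5)
```

With these samples the extrapolated distance at t = 0.5 is 2.25 m and the direction is +1 (receding), which is what the focus driver would then act on.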
It should be noted that the focus-driving devices differ between application scenes, so the focusing stroke curve of the camera can be adjusted in different ways according to the distance and the direction before the image is shot at the preset time point.
As one possible implementation, the distance and the direction are processed by a preset first algorithm to determine a first distance for the motor to move; the stroke curve of the camera's focusing motor is adjusted to the position matching the first distance, and the image is then shot when the preset time point is reached.
The preset first algorithm is related to the driving mode of the motor; calculating the first distance of the motor movement from the distance and the direction with this algorithm ensures correct focusing and sharp imaging when the object is photographed at the preset time.
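As a hedged illustration of what such a "preset first algorithm" might compute, the sketch below maps a predicted object distance to the lens travel required for focus using the thin-lens equation. The patent does not disclose its algorithm; a production driver would use the motor's calibrated current-versus-stroke table instead.

```python
def focus_stroke_mm(object_distance_mm, focal_length_mm):
    """Map a predicted object distance to the lens travel needed for
    focus, via the thin-lens equation 1/f = 1/u + 1/v.

    Stroke is measured from the infinity-focus position (v = f).
    Illustrative only: stands in for the patent's unspecified
    motor-drive algorithm.
    """
    u, f = object_distance_mm, focal_length_mm
    if u <= f:
        raise ValueError("object inside the focal length cannot be focused")
    v = f * u / (u - f)          # image distance behind the lens
    return v - f                 # lens travel from infinity focus

# An object predicted to be 2 m away, with a typical 4 mm phone lens:
stroke = focus_stroke_mm(2000.0, 4.0)
```

The resulting travel is on the order of 8 micrometers, which is why voice-coil motors with fine, pre-characterized stroke curves are used for this adjustment.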
As another possible implementation, the distance and the direction are processed by a preset second algorithm to determine a second distance for the micro-electro-mechanical system (MEMS) to move; the stroke curve of the camera's MEMS is adjusted to the position matching the second distance, and the image is then shot when the preset time point is reached.
The preset second algorithm is related to the driving mode of the MEMS; calculating the second distance of the MEMS movement from the distance and the direction with this algorithm ensures correct focusing and sharp imaging when the object is photographed at the preset time.
With the method for shooting a moving object described above, the depth-of-field information of the object can therefore be obtained with millimeter-level precision through the structured light technique, so that the object, for example a particular person, can be locked onto quickly. Real-time 3D scanning and ranging of the moving body with structured light yields its distance from the camera, and fast focusing can then be completed from that distance and the corresponding stroke curve of the camera's focus-driving device.
Meanwhile, during the object's motion, the speed and predicted moving direction of the object are calculated by comparing adjacent image frames, using the exposure time and the change in the structured light range measurement. From these, the change in distance to the camera is predicted, and the corresponding travel value of the camera's driving device is derived from the predicted distance, so that the camera is pushed directly to the matching focus position. This avoids the time lag of the conventional focusing method, in which the camera judges the position only after the object arrives; the trajectory of the moving object is tracked quickly, and a clear picture of it is guaranteed.
In summary, with the method for shooting a moving object according to the embodiment of the present invention, while the object is moving, the structured light source is projected onto the object at a plurality of time points and a plurality of structured light images modulated by the object are captured; the phase corresponding to the pixel at the deformed position in each structured light image is demodulated and corresponding depth-of-field information is generated from the phase; a motion curve between the object and the camera is determined from the depth-of-field information of the plurality of structured light images, the distance and direction of the object relative to the camera at the preset time point are predicted from the motion curve, the focusing stroke curve of the camera is adjusted according to the distance and direction, and the image is then shot when the preset time point is reached. Fast focusing is thereby achieved, the moving object is photographed clearly, and image sharpness is improved.
To implement the above embodiment, the invention further provides a photographing device for a moving object. Fig. 4 is a block diagram of a photographing device for a moving object according to an embodiment of the present invention; as shown in fig. 4, the device includes a shooting module 100, a generating module 200, a first determining module 300, a predicting module 400 and a focusing module 500.
The shooting module 100 is configured to respectively project the structured light source to the object at multiple time points during the motion of the object, and shoot multiple structured light images of the structured light source modulated by the object.
The generating module 200 is configured to demodulate a phase corresponding to the deformed pixel in each structured light image, and generate corresponding depth-of-field information according to the phase.
The first determining module 300 is configured to determine a motion curve between the object and the camera according to the depth information of the plurality of structured light images.
And the predicting module 400 is configured to predict a distance and a direction from the object to the camera when the preset time point is reached according to the motion curve.
And the focusing module 500 is used for adjusting a focusing stroke curve of the camera according to the distance and the direction, and further shooting an image when the shooting module reaches a preset time point.
In this embodiment, the focusing module 500 may adjust the focusing stroke curve of the camera in different ways depending on the application scenario. In one embodiment of the present invention, as shown in fig. 5, the focusing module 500 includes a determining unit 510 and an adjusting unit 520. The determining unit 510 is configured to calculate, according to the preset first algorithm, the first distance of the motor movement from the distance and the direction.
The adjusting unit 520 is configured to adjust the stroke curve of the camera's focusing motor to the position matching the first distance, so that the image is shot when the preset time point is reached.
Fig. 6 is a block diagram of a photographing device for a moving object according to still another embodiment of the present invention. As shown in fig. 6, on the basis of the structure shown in fig. 4, the device further includes an obtaining module 600 and a second determining module 700. The obtaining module 600 is configured to obtain attribute information and environment information of the moving object.
The second determining module 700 is configured to determine the position between the structured light projector and the camera and the brightness of the structured light source according to the attribute information and the environment information.
The division of the moving object photographing device into the above modules is only for illustration; in other embodiments, the device may be divided into different modules as needed to complete all or part of its functions.
It should be noted that the foregoing description of the method for shooting a moving object also applies to the photographing device of the embodiment of the present invention; the implementation principle is similar and is not repeated here.
In summary, with the device for shooting a moving object according to the embodiment of the present invention, while the object is moving, the structured light source is projected onto the object at a plurality of time points and a plurality of structured light images modulated by the object are captured; the phase corresponding to the pixel at the deformed position in each structured light image is demodulated and corresponding depth-of-field information is generated from the phase; a motion curve between the object and the camera is determined from the depth-of-field information of the plurality of structured light images, the distance and direction of the object relative to the camera at the preset time point are predicted from the motion curve, the focusing stroke curve of the camera is adjusted according to the distance and direction, and the image is then shot when the preset time point is reached. Fast focusing is thereby achieved, the moving object is photographed clearly, and image sharpness is improved.
To the same end, the invention further provides a terminal device. The terminal contains an image processing circuit, which may be implemented with hardware and/or software components and may include various processing units that define an ISP (image signal processing) pipeline. Fig. 7 is a schematic structural diagram of an image processing circuit in a terminal device according to an embodiment of the present invention. As shown in fig. 7, for ease of explanation only the aspects of the image processing technique related to embodiments of the present invention are shown.
As shown in Fig. 7, the image processing circuit 110 includes an imaging device 1110, an ISP processor 1130, and control logic 1140. The imaging device 1110 may include a camera with one or more lenses 1112, an image sensor 1114, and a structured light projector 1116. The structured light projector 1116 projects structured light onto an object to be measured. The structured light pattern may be a laser stripe, a Gray code, a sinusoidal stripe, or a randomly arranged speckle pattern. The image sensor 1114 captures a structured light image projected onto the object to be measured and transmits it to the ISP processor 1130, and the ISP processor 1130 demodulates the structured light image to obtain depth information of the object to be measured. At the same time, the image sensor 1114 can also capture color information of the object to be measured. Of course, the structured light image and the color information of the object to be measured may also be captured by two separate image sensors 1114. Taking speckle structured light as an example, the ISP processor 1130 demodulates the structured light image by acquiring a speckle image of the measured object from the structured light image, performing image data calculation on the speckle image of the measured object and a reference speckle image according to a predetermined algorithm, and obtaining the moving distance of each scattered spot of the speckle image on the measured object relative to the corresponding reference scattered spot in the reference speckle image. The depth value of each scattered spot of the speckle image is then obtained by triangulation, and the depth information of the measured object is obtained from these depth values.
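The triangulation step described above can be sketched as follows; a common reference-plane model is assumed (pinhole optics, disparity measured against a reference speckle image taken at a known depth), and the function name and numeric values are illustrative assumptions, not taken from the patent.

```python
def speckle_depth(disparity_px, z_ref_mm, focal_px, baseline_mm):
    """Depth of one scattered spot from its shift relative to the
    reference speckle image, using the reference-plane triangulation
    model 1/Z = 1/Z_ref + d/(f*b): d is the disparity in pixels, f the
    focal length in pixels, b the projector-camera baseline in mm."""
    return (focal_px * baseline_mm * z_ref_mm) / (
        focal_px * baseline_mm + disparity_px * z_ref_mm)

# A spot with zero shift lies on the reference plane.
print(speckle_depth(0.0, 1000.0, 580.0, 75.0))   # 1000.0
# A positive shift places the spot closer than the reference plane.
print(speckle_depth(2.0, 1000.0, 580.0, 75.0) < 1000.0)   # True
```

Applying this per scattered spot yields the depth value of each spot, from which the depth information of the measured object is assembled.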
Of course, the depth image information may also be acquired by a binocular vision method or a time-of-flight (TOF) based method; the approach is not limited thereto, as long as the depth information of the object to be measured can be acquired or obtained by calculation, and all such methods fall within the scope of the present embodiment.
After the ISP processor 1130 receives the color information of the object to be measured captured by the image sensor 1114, image data corresponding to the color information of the object to be measured may be processed. ISP processor 1130 analyzes the image data to obtain image statistics that may be used to determine one or more control parameters of imaging device 1110. The image sensor 1114 may include an array of color filters (e.g., Bayer filters), and the image sensor 1114 may acquire light intensity and wavelength information captured with each imaging pixel of the image sensor 1114 and provide a set of raw image data that may be processed by the ISP processor 1130.
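As one illustration of how such image statistics can feed back into a control parameter of the imaging device 1110, the sketch below runs a single auto-exposure iteration on a 10-bit raw frame; the mid-grey target and the per-step clamp are made-up values rather than anything specified in the patent.

```python
def exposure_update(pixels, current_exposure_us,
                    target_mean=0.18 * 1023, max_step=2.0):
    """One auto-exposure iteration: scale the exposure time so that the
    mean of the 10-bit raw frame moves toward a mid-grey target, with
    the per-step change clamped to [1/max_step, max_step]."""
    mean = sum(pixels) / len(pixels)
    ratio = target_mean / max(mean, 1.0)       # avoid division by zero
    ratio = min(max(ratio, 1.0 / max_step), max_step)
    return current_exposure_us * ratio

dark_frame = [92.0] * 16      # underexposed frame: mean 92 of 0..1023
print(exposure_update(dark_frame, 10000.0))   # 20000.0 (doubled, clamped)
```

In a real pipeline the statistic would come from the ISP's statistics block rather than a Python mean, but the control loop has this shape.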
ISP processor 1130 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and ISP processor 1130 may perform one or more image processing operations on the raw image data, collecting image statistics about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
ISP processor 1130 may also receive pixel data from image memory 1120. The image memory 1120 may be a portion of a memory device, a storage device, or a separate dedicated memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving the raw image data, ISP processor 1130 may perform one or more image processing operations.
After the ISP processor 1130 obtains the color information and the depth information of the object to be measured, they may be fused to obtain a three-dimensional image. Features of the object to be measured can be extracted by an appearance contour extraction method, a contour feature extraction method, or both. For example, the features of the object to be measured may be extracted by methods such as the active shape model (ASM), the active appearance model (AAM), principal component analysis (PCA), and the discrete cosine transform (DCT), which are not limited herein. The features of the measured object extracted from the depth information and those extracted from the color information are then subjected to registration and feature fusion processing. The fusion processing may directly combine the features extracted from the depth information and the color information, combine the same features in different images after weighting, or generate a three-dimensional image from the fused features in another fusion mode.
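The weight-based variant of the fusion processing might look like the following sketch; it assumes the two feature vectors are already registered (element i of each describes the same object point), and the function name and the equal weights are illustrative assumptions.

```python
def fuse_features(depth_feat, color_feat, w_depth=0.5, w_color=0.5):
    """Weighted combination of registered depth and color features.
    Registration must already have aligned the two vectors element-wise."""
    if len(depth_feat) != len(color_feat):
        raise ValueError("features must be registered to the same length")
    return [w_depth * d + w_color * c
            for d, c in zip(depth_feat, color_feat)]

print(fuse_features([1.0, 2.0, 3.0], [3.0, 2.0, 1.0]))   # [2.0, 2.0, 2.0]
```

The direct-combination mode mentioned in the text would instead concatenate the two vectors; either fused result can then seed the generation of the three-dimensional image.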
Image data of the three-dimensional image may be sent to the image memory 1120 for additional processing before being displayed. The ISP processor 1130 receives processed data from the image memory 1120 and performs image data processing on it in the raw domain and in the RGB and YCbCr color spaces. The image data of the three-dimensional image may be output to a display 1160 for viewing by a user and/or for further processing by a graphics processing unit (GPU). Further, the output of the ISP processor 1130 may also be sent to the image memory 1120, and the display 1160 may read image data from the image memory 1120. In one embodiment, the image memory 1120 may be configured to implement one or more frame buffers. In addition, the output of the ISP processor 1130 may be transmitted to an encoder/decoder 1150 for encoding/decoding the image data. The encoded image data may be saved and decompressed before being displayed on the display 1160. The encoder/decoder 1150 may be implemented by a CPU, a GPU, or a coprocessor.
The image statistics determined by the ISP processor 1130 may be sent to the control logic 1140. The control logic 1140 may include a processor and/or a microcontroller that executes one or more routines (e.g., firmware) that determine control parameters of the imaging device 1110 based on the received image statistics.
The following are the steps of implementing the method for photographing a moving object using the image processing technology of Fig. 7:
Step 101': during the motion of the object, the structured light source is projected onto the object at a plurality of time points, and a plurality of structured light images of the structured light source modulated by the object are captured.
Step 102': the phase corresponding to the deformed-position pixel in each structured light image is demodulated, and corresponding depth-of-field information is generated according to the phase.
Step 103': a motion curve between the object and the camera is determined according to the depth-of-field information of the plurality of structured light images, and the distance and direction of the object from the camera at the preset time point are predicted according to the motion curve.
Step 104': the focusing stroke curve of the camera is adjusted according to the distance and direction, and an image is captured when the preset time point is reached.
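Steps 101' to 104' can be sketched end to end as follows. The patent does not prescribe a model for the motion curve, so a least-squares line through the depth samples is assumed here, and the focus-travel mapping is only noted in a comment; all names and constants are illustrative.

```python
def fit_motion_line(times_s, depths_mm):
    """Least-squares line z(t) = a*t + b through (time, depth) samples:
    the simplest stand-in for the 'motion curve' of step 103'."""
    n = len(times_s)
    mt, mz = sum(times_s) / n, sum(depths_mm) / n
    a = (sum((t - mt) * (z - mz) for t, z in zip(times_s, depths_mm))
         / sum((t - mt) ** 2 for t in times_s))
    return a, mz - a * mt

def predict_distance_direction(times_s, depths_mm, t_preset_s):
    """Extrapolate the curve to the preset time point: predicted distance
    plus the direction of motion relative to the camera."""
    a, b = fit_motion_line(times_s, depths_mm)
    direction = "approaching" if a < 0 else "receding"
    return a * t_preset_s + b, direction

# Depth samples from four structured light images (steps 101'-102'):
# the object closes in at a steady 1000 mm/s.
t = [0.0, 0.1, 0.2, 0.3]
z = [1200.0, 1100.0, 1000.0, 900.0]
d, direction = predict_distance_direction(t, z, 0.5)
print(round(d), direction)   # 700 approaching
# Step 104' would map this predicted distance onto the focus motor's
# stroke curve so the lens is already in position at the preset time.
```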
It should be noted that the foregoing explanation of the embodiment of the method for photographing a moving object is also applicable to the terminal device of this embodiment; the implementation principle is similar and is not repeated here.
In summary, in the terminal device according to the embodiment of the present invention, while the object is moving, a structured light source is projected onto the object at a plurality of time points, and a plurality of structured light images of the structured light source modulated by the object are captured. The phase corresponding to the deformed-position pixel in each structured light image is demodulated, and corresponding depth-of-field information is generated according to the phase. A motion curve between the object and the camera is determined according to the depth-of-field information of the plurality of structured light images, and the distance and direction of the object from the camera at the preset time point are predicted according to the motion curve. The focusing stroke curve of the camera is adjusted according to the distance and direction, and the image is captured when the preset time point is reached. Fast focusing is thereby achieved, the moving object is photographed clearly, and the sharpness of the photograph is improved.
An embodiment of the present invention also proposes a non-transitory computer-readable storage medium on which a computer program is stored, which, when executed by a processor, is capable of implementing the method for capturing a moving object as described in the foregoing embodiment.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like.

Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention; variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (8)

1. A method of photographing a moving object, comprising:
in the moving process of an object, acquiring attribute information and environment information of the moving object, wherein the attribute information comprises the shape, the material, the volume and the moving speed of the moving object, and the environment information comprises environment brightness information;
determining the divergence area of a structured light source and the position between a structured light projector and a camera according to the volume and the moving speed, and determining the brightness of the structured light source according to the environment information;
projecting a structured light source to an object at a plurality of time points according to the position, the divergence area and the brightness, and shooting a plurality of structured light images of the structured light source modulated by the object;
demodulating a phase corresponding to a pixel at a deformation position in each structured light image, and generating corresponding depth-of-field information according to the phase;
determining a motion curve between the object and the camera according to the depth of field information of the plurality of structured light images, and predicting the distance and the direction of the object from the camera when a preset time point is reached according to the motion curve; and adjusting the focusing stroke curve of the camera according to the distance and the direction, and further shooting an image when the preset time point is reached.
2. The method of claim 1, wherein the adjusting the focusing travel curve of the camera according to the distance and the direction to capture the image at the preset time point comprises:
calculating the distance and the direction according to a preset first algorithm, and determining a first distance of the motor movement;
and adjusting the stroke curve of the camera focusing motor to reach a position matched with the first distance, and then shooting an image when the preset time point is reached.
3. The method of claim 1, wherein the adjusting the focusing travel curve of the camera according to the distance and the direction to capture the image at the preset time point comprises:
calculating the distance and the direction according to a preset second algorithm to determine a second distance of the movement of the micro-electro-mechanical system;
and adjusting the stroke curve of the camera micro-electromechanical system to reach a position matched with the second distance, and further shooting an image when the preset time point is reached.
4. The method of claim 1, wherein the structural features of the structured light source comprise:
laser stripes, Gray codes, sinusoidal stripes, uniform speckles, or non-uniform speckles.
5. A photographing apparatus of a moving object, comprising:
an acquisition module, configured to acquire attribute information and environment information of a moving object during the movement of the object, wherein the attribute information comprises the shape, the material, the volume and the moving speed of the moving object, and the environment information comprises environment brightness information;
the second determination module is used for determining the divergence area of the structured light source and the position between the structured light projector and the camera according to the volume and the motion speed, and determining the brightness of the structured light source according to the environment information;
the shooting module is used for respectively projecting a structured light source to an object at a plurality of time points according to the position, the divergence area and the brightness, and shooting a plurality of structured light images of the structured light source modulated by the object;
the generating module is used for demodulating the phase corresponding to the deformed position pixel in each structured light image and generating corresponding depth-of-field information according to the phase;
the first determining module is used for determining a motion curve between the object and the camera according to the depth information of the structured light images;
the prediction module is used for predicting the distance and the direction of the object from the camera when a preset time point is reached according to the motion curve; and the focusing module is used for adjusting a focusing stroke curve of the camera according to the distance and the direction so as to shoot the image when the shooting module reaches the preset time point.
6. The apparatus of claim 5, wherein the focusing module comprises:
the determining unit is used for calculating the distance and the direction according to a preset first algorithm and determining a first distance of the motor movement;
and the adjusting unit is used for adjusting the stroke curve of the camera focusing motor to reach a position matched with the first distance, and then shooting an image when the preset time point is reached.
7. A terminal device characterized by comprising a memory and a processor, the memory having stored therein computer-readable instructions which, when executed by the processor, cause the processor to execute the method of photographing a moving object according to any one of claims 1 to 4.
8. A non-transitory computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements a method of capturing a moving object according to any one of claims 1 to 4.
CN201710677576.5A 2017-08-09 2017-08-09 Method and device for shooting moving object Active CN107483815B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710677576.5A CN107483815B (en) 2017-08-09 2017-08-09 Method and device for shooting moving object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710677576.5A CN107483815B (en) 2017-08-09 2017-08-09 Method and device for shooting moving object

Publications (2)

Publication Number Publication Date
CN107483815A CN107483815A (en) 2017-12-15
CN107483815B true CN107483815B (en) 2020-08-07

Family

ID=60600054

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710677576.5A Active CN107483815B (en) 2017-08-09 2017-08-09 Method and device for shooting moving object

Country Status (1)

Country Link
CN (1) CN107483815B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108332718B (en) * 2018-02-01 2024-04-05 武汉尺子科技有限公司 Structured light information acquisition system
CN109686031B (en) * 2018-12-21 2020-10-27 北京智行者科技有限公司 Identification following method based on security
CN110174075B (en) * 2019-04-08 2020-11-03 深圳奥比中光科技有限公司 Single-zoom-structure optical depth camera and zooming method
CN111047836A (en) * 2019-12-17 2020-04-21 李启同 Intelligent early warning device for falling object
CN112150382B (en) * 2020-09-24 2022-09-09 中国科学技术大学 High space-time resolution ratio periodic motion three-dimensional measuring method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104902246A (en) * 2015-06-17 2015-09-09 浙江大华技术股份有限公司 Video monitoring method and device
CN105554367A (en) * 2015-09-30 2016-05-04 宇龙计算机通信科技(深圳)有限公司 Movement photographing method and mobile terminal
CN106454287A (en) * 2016-10-27 2017-02-22 深圳奥比中光科技有限公司 Combined camera shooting system, mobile terminal and image processing method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI530909B (en) * 2013-12-31 2016-04-21 財團法人工業技術研究院 System and method for image composition


Also Published As

Publication number Publication date
CN107483815A (en) 2017-12-15

Similar Documents

Publication Publication Date Title
CN107483815B (en) Method and device for shooting moving object
CN107368730B (en) Unlocking verification method and device
CN107480613B (en) Face recognition method and device, mobile terminal and computer readable storage medium
CN107464280B (en) Matching method and device for user 3D modeling
CN107590828B (en) Blurring processing method and device for shot image
CN107797664B (en) Content display method and device and electronic device
CN107517346B (en) Photographing method and device based on structured light and mobile device
CN107734267B (en) Image processing method and device
CN107564050B (en) Control method and device based on structured light and terminal equipment
US20130194390A1 (en) Distance measuring device
US11138740B2 (en) Image processing methods, image processing apparatuses, and computer-readable storage medium
CN107610080B (en) Image processing method and apparatus, electronic apparatus, and computer-readable storage medium
CN107610127B (en) Image processing method, image processing apparatus, electronic apparatus, and computer-readable storage medium
CN107610171B (en) Image processing method and device
CN107370951B (en) Image processing system and method
CN107392874B (en) Beauty treatment method and device and mobile equipment
CN107705278B (en) Dynamic effect adding method and terminal equipment
CN107734264B (en) Image processing method and device
CN107509043B (en) Image processing method, image processing apparatus, electronic apparatus, and computer-readable storage medium
CN107820019B (en) Blurred image acquisition method, blurred image acquisition device and blurred image acquisition equipment
CN107454336B (en) Image processing method and apparatus, electronic apparatus, and computer-readable storage medium
CN107592491B (en) Video communication background display method and device
CN107613239B (en) Video communication background display method and device
CN107493452B (en) Video picture processing method and device and terminal
CN107734266B (en) Image processing method and apparatus, electronic apparatus, and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Changan town in Guangdong province Dongguan 523860 usha Beach Road No. 18

Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

Address before: Changan town in Guangdong province Dongguan 523860 usha Beach Road No. 18

Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

GR01 Patent grant