CN113436349A - 3D background replacing method and device, storage medium and terminal equipment - Google Patents


Info

Publication number
CN113436349A
Authority
CN
China
Prior art keywords
rotation angle
images
image
data corresponding
clock offset
Prior art date
Legal status
Granted
Application number
CN202110721434.0A
Other languages
Chinese (zh)
Other versions
CN113436349B (en)
Inventor
常玉军
Current Assignee
Spreadtrum Communications Tianjin Co Ltd
Original Assignee
Spreadtrum Communications Tianjin Co Ltd
Priority date
Filing date
Publication date
Application filed by Spreadtrum Communications Tianjin Co Ltd filed Critical Spreadtrum Communications Tianjin Co Ltd
Priority to CN202110721434.0A priority Critical patent/CN113436349B/en
Publication of CN113436349A publication Critical patent/CN113436349A/en
Priority to PCT/CN2022/099532 priority patent/WO2023273923A1/en
Application granted granted Critical
Publication of CN113436349B publication Critical patent/CN113436349B/en
Legal status: Active


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T2200/00 - Indexing scheme for image data processing or generation, in general
    • G06T2200/04 - Indexing scheme for image data processing or generation, in general, involving 3D image data


Abstract

Embodiments of the invention provide a 3D background replacement method and apparatus, a storage medium, and a terminal device. The method includes: acquiring a set number of feature images and the inertial measurement unit (IMU) data corresponding to each feature image; performing offline calibration according to the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset; acquiring a plurality of captured images and the IMU data corresponding to each captured image; performing online calibration according to the plurality of captured images and the IMU data corresponding to each captured image to generate a second clock offset; and performing 3D background replacement on each captured image according to the first clock offset and the second clock offset to generate a plurality of background replacement images. With the first clock offset generated by offline calibration and the second clock offset generated by online calibration, the technical solution of the embodiments fully ensures the stability of 3D background replacement.

Description

3D background replacing method and device, storage medium and terminal equipment
[Technical Field]
The present invention relates to the field of image technologies, and in particular, to a 3D background replacement method, apparatus, storage medium, and terminal device.
[Background of the Invention]
3D background replacement consists of two parts. The first part constructs a virtual 3D world and controls the camera of that virtual world in real time according to the pose of the camera in the real world, so that an image resembling the one acquired in real time by the real camera is obtained by rendering. The second part segments the current frame into background and foreground regions with a scene segmentation algorithm, and then fuses the image rendered by the first part, as the background, with the foreground of the current image to form a new scene image.
In the related art, the rotation angle of the terminal device is calculated from adjacent pairs among the plurality of captured images. The computation is heavy and slow, and the result depends strongly on the captured scene: if the scene lacks feature points, the calculated pose deviates greatly from the actual pose, which reduces the stability of 3D background replacement.
In another related art, the pose of the terminal device is tracked with inertial measurement unit (IMU) data. Because the clock of the IMU system may not match the clock of the camera system of the terminal device, the pose cannot be synchronized with the images, which also reduces the stability of 3D background replacement.
[Summary of the Invention]
In view of this, embodiments of the present invention provide a 3D background replacement method, apparatus, storage medium, and terminal device, so as to improve stability of 3D background replacement.
In one aspect, an embodiment of the present invention provides a 3D background replacement method, including:
acquiring a set number of feature images and the inertial measurement unit (IMU) data corresponding to each feature image;
performing offline calibration according to the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset;
acquiring a plurality of captured images and the IMU data corresponding to each captured image;
performing online calibration according to the plurality of captured images and the IMU data corresponding to each captured image to generate a second clock offset;
and performing 3D background replacement on each captured image according to the first clock offset and the second clock offset to generate a plurality of background replacement images.
Optionally, the performing offline calibration according to the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset includes:
extracting two adjacent feature images from the set number of feature images to generate a first extracted image;
generating a first rotation angle set according to the first extracted image and the IMU data corresponding to the first extracted image;
and generating the first clock offset according to the first rotation angle set and an acquired second rotation angle set, where the second rotation angle set is the rotation angle set of the terminal device itself.
Optionally, the performing online calibration according to the plurality of captured images and the IMU data corresponding to each captured image to generate a second clock offset includes:
extracting a first feature point set from the plurality of captured images;
determining whether the number of elements in the first feature point set is greater than a set threshold;
if the number of elements in the first feature point set is greater than the set threshold, extracting two adjacent captured images from the plurality of captured images to generate a second extracted image;
generating a third rotation angle set according to the plurality of second extracted images and the IMU data corresponding to the second extracted images;
and generating the second clock offset according to the third rotation angle set and an acquired fourth rotation angle set, where the fourth rotation angle set is the rotation angle set of the terminal device itself.
Optionally, the generating a first rotation angle set according to the first extracted image and IMU data corresponding to the first extracted image includes:
calculating the first extracted image and IMU data corresponding to the first extracted image through an image processing technology function to generate a second feature point set;
and calculating the second feature point set through an optical flow pyramid function to generate the first rotation angle set.
Optionally, the generating the first clock offset according to the first rotation angle set and the acquired second rotation angle set includes:
generating a first image rotation angle curve according to the first rotation angle set and the acquired time stamp corresponding to the first rotation angle set;
generating a first IMU angle curve according to the second rotation angle set and the acquired time stamp corresponding to the second rotation angle set;
generating a plurality of first correlation distances according to the first image rotation angle curve and the first IMU angle curve;
and querying the minimum first correlation distance among the plurality of first correlation distances and the first clock offset corresponding to the minimum first correlation distance.
Optionally, the generating a third rotation angle set according to the plurality of second extracted images and IMU data corresponding to the second extracted images includes:
calculating the second extracted image and IMU data corresponding to the second extracted image through an image processing technology function to generate a third feature point set;
and calculating the third feature point set through an optical flow pyramid function to generate the third rotation angle set.
Optionally, the generating the second clock offset according to the third rotation angle set and the acquired fourth rotation angle set includes:
generating a second image rotation angle curve according to the third rotation angle set and the acquired timestamp corresponding to the third rotation angle set;
generating a second IMU angle curve according to the fourth rotation angle set and the acquired timestamp corresponding to the fourth rotation angle set;
generating a plurality of second correlation distances according to the second image rotation angle curve and the second IMU angle curve;
and querying the minimum second correlation distance among the plurality of second correlation distances and the second clock offset corresponding to the minimum second correlation distance.
In another aspect, an embodiment of the present invention provides a 3D background replacement apparatus, including:
a first acquisition module, configured to acquire a set number of feature images and the inertial measurement unit (IMU) data corresponding to each feature image;
a first generation module, configured to perform offline calibration according to the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset;
a second acquisition module, configured to acquire a plurality of captured images and the IMU data corresponding to each captured image;
a second generation module, configured to perform online calibration according to the plurality of captured images and the IMU data corresponding to each captured image to generate a second clock offset;
and a third generation module, configured to perform 3D background replacement on each captured image according to the first clock offset and the second clock offset to generate a plurality of background replacement images.
In another aspect, an embodiment of the present invention provides a storage medium storing a program, where a device on which the storage medium is located is controlled to execute the above 3D background replacement method when the program runs.
In another aspect, an embodiment of the present invention provides a terminal device, including a memory and a processor, where the memory is configured to store information including program instructions, and the processor is configured to control the execution of the program instructions, which, when loaded and executed by the processor, implement the steps of the above 3D background replacement method.
According to the technical solution of the 3D background replacement method, a set number of feature images and the inertial measurement unit (IMU) data corresponding to each feature image are acquired; offline calibration is performed according to the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset; a plurality of captured images and the IMU data corresponding to each captured image are acquired; online calibration is performed according to the plurality of captured images and the IMU data corresponding to each captured image to generate a second clock offset; and 3D background replacement is performed on each captured image according to the first clock offset and the second clock offset to generate a plurality of background replacement images. With the first clock offset generated by offline calibration and the second clock offset generated by online calibration, the technical solution provided by the embodiments of the invention fully ensures the stability of 3D background replacement.
[Description of the Drawings]
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a 3D background replacement method according to an embodiment of the present invention;
FIG. 2 is a schematic illustration of a feature image;
FIG. 3 is a flowchart of step 104 in FIG. 1, in which offline calibration is performed according to a set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset;
FIG. 4 is a flowchart of step 108 in FIG. 1, in which online calibration is performed according to a plurality of captured images and the IMU data corresponding to each captured image to generate a second clock offset;
fig. 5 is a schematic structural diagram of a 3D background replacing apparatus according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of the first generation module in FIG. 5;
FIG. 7 is a schematic structural diagram of the second generation module in FIG. 5;
fig. 8 is a schematic diagram of a terminal device according to an embodiment of the present invention.
[Detailed Description]
For better understanding of the technical solutions of the present invention, the following detailed descriptions of the embodiments of the present invention are provided with reference to the accompanying drawings.
It should be understood that the described embodiments are only some embodiments of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the examples of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the term "and/or" used herein merely describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates an "or" relationship between the preceding and following objects.
An embodiment of the present invention provides a 3D background replacement method. FIG. 1 is a flowchart of the 3D background replacement method provided in the embodiment of the present invention. As shown in FIG. 1, the method includes:
Step 102: acquire a set number of feature images and the inertial measurement unit (IMU) data corresponding to each feature image.
In the embodiment of the invention, each step is executed by a terminal device provided with an image sensor and an IMU sensor, such as a mobile phone or a tablet computer.
In the embodiment of the invention, an IMU sensor is a device that measures the three-axis attitude angles (or angular rates) and the acceleration of an object. The IMU sensor of the terminal device includes a gyroscope, an accelerometer, a gravity sensor, and a geomagnetic sensor.
In an embodiment of the present invention, the IMU data includes one of gyroscope data, accelerometer data, gravity sensor data, geomagnetic sensor data, or any combination thereof.
In the embodiment of the invention, a user captures a set number of feature images at different shooting angles with the terminal device, which acquires the inertial measurement unit IMU data corresponding to each feature image.
In the embodiment of the invention, the set number can be set according to the actual situation. For example, the set number is 100.
In an embodiment of the present invention, the characteristic image includes a timestamp, an exposure time, and a rolling shutter time.
In the embodiment of the present invention, FIG. 2 is a schematic diagram of a feature image. As shown in FIG. 2, a feature image is a scene image with distinct feature points, such as a checkerboard image.
Step 104: perform offline calibration according to the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset.
In an embodiment of the present invention, FIG. 3 is a flowchart of step 104 in FIG. 1, in which offline calibration is performed according to a set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset. As shown in FIG. 3, step 104 includes:
step 1042, two adjacent feature images in the set number of feature images are extracted, and a first extracted image is generated.
In this step, two adjacent frames of feature images in a set number of frames of feature images are extracted to generate a first extracted image.
Step 1044: generate a first rotation angle set according to the first extracted image and the IMU data corresponding to the first extracted image.
In the embodiment of the present invention, step 1044 includes:
Step A1: process the first extracted image and the IMU data corresponding to the first extracted image with the image processing technology function to generate a second feature point set.
In an embodiment of the present invention, the image processing technology function includes the goodFeaturesToTrack function of the OpenCV library.
Step A2: process the second feature point set with the optical flow pyramid function to generate the first rotation angle set.
In an embodiment of the invention, the optical flow pyramid function includes the OpenCV calcOpticalFlowPyrLK function.
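As a rough illustration of steps A1 and A2, the sketch below estimates a per-frame rotation angle from feature points that have already been tracked between two adjacent frames (the patent obtains such correspondences with goodFeaturesToTrack and calcOpticalFlowPyrLK). It assumes a pure rotation about the vertical axis and a pinhole camera model; the function name, the focal-length parameter, and the small-angle treatment are illustrative assumptions of this sketch, not details from the patent.

```python
import numpy as np

def rotation_angle_from_flow(pts_prev, pts_next, focal_px):
    """Estimate the camera yaw between two adjacent frames from tracked
    feature points, assuming a pure rotation and a pinhole model.

    pts_prev, pts_next: (N, 2) pixel coordinates of the same features in
    the earlier and the later frame; focal_px: focal length in pixels.
    """
    flow = pts_next - pts_prev            # per-feature pixel displacement
    mean_dx = float(np.mean(flow[:, 0]))  # average horizontal shift
    # Pinhole relation for a yaw rotation: dx ~ f * tan(theta),
    # so theta ~ atan(dx / f).
    return float(np.arctan2(mean_dx, focal_px))
```

In practice, each pair of adjacent images would contribute one such angle to the first rotation angle set.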
Step 1046: generate the first clock offset according to the first rotation angle set and an acquired second rotation angle set, where the second rotation angle set is the rotation angle set of the terminal device itself.
In this embodiment of the present invention, the second rotation angle set is the rotation angle set of the terminal device: while capturing the set number of feature images at different shooting angles, the terminal device records the rotation angles at which those images are taken.
In this embodiment of the present invention, step 1046 includes:
and step B1, generating a first image rotation angle curve according to the first rotation angle set and the acquired time stamps corresponding to the first rotation angle set.
In the embodiment of the present invention, the terminal device stores the correspondence between the first rotation angle set and the timestamp, and acquires the timestamp corresponding to the first rotation angle set from the terminal device according to the correspondence between the first rotation angle set and the timestamp.
In this step, a first image rotation angle curve is generated by taking the first rotation angle set as a vertical coordinate and taking the timestamp corresponding to the first rotation angle set as a horizontal coordinate.
Step B2: generate a first IMU angle curve according to the second rotation angle set and the acquired timestamps corresponding to the second rotation angle set.
In the embodiment of the present invention, the terminal device stores the correspondence between the second rotation angle set and the timestamps, and the timestamps corresponding to the second rotation angle set are acquired from the terminal device according to this correspondence.
In this step, the first IMU angle curve is generated with the rotation angles of the second rotation angle set on the vertical axis and their corresponding timestamps on the horizontal axis.
Step B3: generate a plurality of first correlation distances according to the first image rotation angle curve and the first IMU angle curve.
In this step, a distance between the first image rotation angle curve and the first IMU angle curve is a first correlation distance. The first correlation distance may represent a degree of matching between the first rotation angle set and the second rotation angle set, and the smaller the value of the first correlation distance, the higher the degree of matching between the first rotation angle set and the second rotation angle set.
In the embodiment of the present invention, if the clock offset between the first rotation angle set and the second rotation angle set is assumed to lie in [-50 ms, +50 ms], the timestamps are shifted within this range in steps of 0.5 ms, and the first correlation distance is calculated at each shift, yielding a plurality of first correlation distances.
Step B4: query the smallest first correlation distance among the plurality of first correlation distances and the first clock offset corresponding to the smallest first correlation distance.
In the embodiment of the present invention, the terminal device stores the corresponding relationship between the first correlation distance and the first clock offset, and can query the first clock offset corresponding to the minimum first correlation distance according to the corresponding relationship between the first correlation distance and the first clock offset.
In an embodiment of the present invention, the first clock offset is the clock offset between the image sensor and the IMU sensor.
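Steps B1 through B4 can be sketched as a one-dimensional search: slide the image-angle samples over the IMU-angle curve within the search window and keep the shift with the smallest distance. The patent does not specify the distance metric or how curve values between samples are obtained; the summed squared difference, the linear interpolation, and all names below are assumptions of this sketch.

```python
import numpy as np

def find_clock_offset(img_t, img_angle, imu_t, imu_angle,
                      search_ms=50.0, step_ms=0.5):
    """Search [-search_ms, +search_ms] in step_ms increments for the shift
    of the image timestamps that best aligns the image rotation angle
    curve (img_t, img_angle) with the IMU angle curve (imu_t, imu_angle).
    Returns the shift (ms) with the smallest "correlation distance"."""
    shifts = np.arange(-search_ms, search_ms + step_ms, step_ms)
    best_shift, best_dist = 0.0, np.inf
    for s in shifts:
        # Evaluate the IMU curve at the shifted image timestamps
        # (linear interpolation between IMU samples).
        imu_at = np.interp(img_t + s, imu_t, imu_angle)
        dist = float(np.sum((imu_at - img_angle) ** 2))
        if dist < best_dist:
            best_dist, best_shift = dist, float(s)
    return best_shift
```

With the 0.5 ms step of the text, the minimum-distance shift recovers the clock offset to within half a step.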
Step 106: acquire a plurality of captured images and the IMU data corresponding to each captured image.
In the embodiment of the invention, a user captures a plurality of images at different shooting angles with the terminal device, which acquires the IMU data corresponding to each captured image.
In the embodiment of the invention, each captured image carries a timestamp, an exposure time, and a rolling shutter time.
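The exposure time and rolling shutter time recorded with each image matter because a rolling-shutter frame is not exposed at a single instant. One common convention, an assumption of this sketch rather than a formula stated in the patent, is to align the IMU data against the moment the middle image row is half-way through its exposure:

```python
def effective_timestamp(t_start, exposure, rolling_shutter):
    """Hypothetical helper: return the mid-exposure, mid-frame time of a
    rolling-shutter image, given the start-of-frame timestamp, the
    exposure time, and the rolling shutter (frame readout) time.
    All arguments are in the same unit, e.g. milliseconds."""
    return t_start + exposure / 2.0 + rolling_shutter / 2.0
```

Using such a corrected timestamp for each image keeps any residual misalignment down to the genuine clock offset that the calibration then estimates.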
Step 108: perform online calibration according to the plurality of captured images and the IMU data corresponding to each captured image to generate a second clock offset.
In the embodiment of the invention, online calibration runs in parallel with 3D background replacement, on a separate thread so that it does not block the 3D background replacement. Running the two in parallel leaves the efficiency of 3D background replacement unaffected while keeping its accuracy up to date in real time, compensating for the limited timeliness of offline calibration.
In an embodiment of the present invention, FIG. 4 is a flowchart of step 108 in FIG. 1, in which online calibration is performed according to a plurality of captured images and the IMU data corresponding to each captured image to generate a second clock offset. As shown in FIG. 4, step 108 includes:
step 1082, extracting a first feature point set of the plurality of captured images.
In the embodiment of the present invention, after a new frame of captured image of the image sensor is acquired, step 1082 is executed.
Step 1084: determine whether the number of elements in the first feature point set is greater than a set threshold; if so, execute step 1086; if not, end the process.
In the embodiment of the invention, for the accuracy of the calculation, online calibration is performed only when enough feature points can be extracted from consecutive frames.
In the embodiment of the present invention, the set threshold can be chosen according to the actual conditions; for example, if the frame rate of the captured images is in the range [3, 10], the set threshold is set to 30.
In the embodiment of the invention, if the number of elements in the first feature point set is greater than the set threshold, enough feature points can be extracted from the consecutive frames; if it is less than or equal to the set threshold, enough feature points cannot be extracted from the consecutive frames.
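The gate in steps 1082 and 1084 can be sketched as a simple count over the feature points pooled from consecutive frames. The function and parameter names are illustrative, and the default threshold of 30 mirrors the example above for frame rates in [3, 10].

```python
def ready_for_online_calibration(feature_point_sets, threshold=30):
    """Return True when the feature points extracted from consecutive
    captured frames are numerous enough for online calibration.

    feature_point_sets: one sequence of feature points per frame.
    threshold: the set threshold from step 1084 (illustrative default).
    """
    total = sum(len(pts) for pts in feature_point_sets)
    return total > threshold
```

When this returns False, the process ends for the current frame and the offline (first) clock offset remains in use.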
Step 1086 extracts two adjacent captured images of the plurality of captured images, and generates a second extracted image.
In this step, two adjacent frames of the multiple frames of the shot images are extracted to generate a second extracted image.
Step 1088: generate a third rotation angle set according to the plurality of second extracted images and the IMU data corresponding to the second extracted images.
In the embodiment of the present invention, step 1088 includes:
and step C1, calculating the second extracted image and IMU data corresponding to the second extracted image through an image processing technology function to generate a third feature point set.
And step C2, calculating the third feature point set through the optical flow pyramid function to generate a third rotation angle set.
Step 1090: generate the second clock offset according to the third rotation angle set and an acquired fourth rotation angle set, where the fourth rotation angle set is the rotation angle set of the terminal device itself.
In this embodiment of the present invention, the fourth rotation angle set is the rotation angle set of the terminal device: while capturing the plurality of images at different shooting angles, the terminal device records the rotation angles at which those images are taken.
In this embodiment of the present invention, step 1090 includes:
and D1, generating a second image rotation angle curve according to the third rotation angle set and the acquired time stamps corresponding to the third rotation angle set.
In the embodiment of the present invention, the terminal device stores a correspondence between the third rotation angle set and the timestamp, and acquires the timestamp corresponding to the third rotation angle set from the terminal device according to the correspondence between the third rotation angle set and the timestamp.
In this step, a second image rotation angle curve is generated by using the third rotation angle set as a vertical coordinate and using a timestamp corresponding to the third rotation angle set as a horizontal coordinate.
Step D2: generate a second IMU angle curve according to the fourth rotation angle set and the acquired timestamps corresponding to the fourth rotation angle set.
In the embodiment of the present invention, the terminal device stores the correspondence between the fourth rotation angle set and the timestamps, and the timestamps corresponding to the fourth rotation angle set are acquired from the terminal device according to this correspondence.
In this step, the second IMU angle curve is generated with the rotation angles of the fourth rotation angle set on the vertical axis and their corresponding timestamps on the horizontal axis.
Step D3: generate a plurality of second correlation distances according to the second image rotation angle curve and the second IMU angle curve.
In this step, a distance between the second image rotation angle curve and the second IMU angle curve is a second correlation distance. The second correlation distance may represent a degree of matching of the third rotation angle set with the fourth rotation angle set, and the smaller the value of the second correlation distance, the higher the degree of matching of the third rotation angle set with the fourth rotation angle set.
In the embodiment of the present invention, if the clock offset between the third rotation angle set and the fourth rotation angle set is assumed to lie in [-50 ms, +50 ms], the timestamps are shifted within this range in steps of 0.5 ms, and the second correlation distance is calculated at each shift, yielding a plurality of second correlation distances.
Step D4: query the minimum second correlation distance among the plurality of second correlation distances and the second clock offset corresponding to the minimum second correlation distance.
In the embodiment of the present invention, the terminal device stores the corresponding relationship between the second correlation distance and the second clock offset, and can query the second clock offset corresponding to the minimum second correlation distance according to the corresponding relationship between the second correlation distance and the second clock offset.
In an embodiment of the present invention, the second clock offset is the clock offset between the image sensor and the IMU sensor.
Step 110: perform 3D background replacement on each captured image according to the first clock offset and the second clock offset to generate a plurality of background replacement images.
In the embodiment of the invention, when consecutive captured frames do not yield enough feature points, 3D background replacement is performed on each captured image according to the first clock offset to generate the plurality of background replacement images; when consecutive captured frames do yield enough feature points, 3D background replacement is performed on each captured image according to the second clock offset. The offline calibration thus fully compensates for the effect of the uncertainty of online calibration.
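The fallback described above can be sketched as a small selection helper. The names, and the use of None to mean "no online calibration has succeeded yet", are assumptions of this sketch rather than details from the patent.

```python
def select_clock_offset(first_offset, second_offset, enough_features):
    """Prefer the online (second) clock offset when consecutive frames
    yielded enough feature points and an online result exists; otherwise
    fall back to the offline (first) clock offset."""
    if enough_features and second_offset is not None:
        return second_offset
    return first_offset
```

The chosen offset is then applied to each captured image's timestamp before the pose lookup that drives the virtual camera.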
According to the technical solution provided by the embodiment of the present invention, a set number of feature images and inertial measurement unit (IMU) data corresponding to each feature image are acquired; offline calibration is performed according to the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset; a plurality of captured images and IMU data corresponding to each captured image are acquired; online calibration is performed according to the plurality of captured images and the IMU data corresponding to each captured image to generate a second clock offset; and 3D background replacement is performed on each captured image according to the first clock offset and the second clock offset to generate a plurality of background replacement images. With the first clock offset generated by offline calibration and the second clock offset generated by online calibration, the stability of 3D background replacement can be fully guaranteed.
The technical solution provided by the embodiment of the present invention solves the problem of poor fusion caused by inaccurate attitude calculation of the terminal device during 3D background fusion, and the problem of frame rate reduction caused by the excessive time consumed by that attitude calculation.
An embodiment of the present invention provides a 3D background replacement apparatus. Fig. 5 is a schematic structural diagram of a 3D background replacement apparatus according to an embodiment of the present invention. As shown in fig. 5, the apparatus includes: a first obtaining module 11, a first generating module 12, a second obtaining module 13, a second generating module 14 and a third generating module 15.
The first obtaining module 11 is configured to obtain a set number of feature images and inertial measurement unit IMU data corresponding to each feature image.
The first generating module 12 is configured to perform offline calibration according to a set number of feature images and IMU data corresponding to each feature image, and generate a first clock offset.
The second acquiring module 13 is configured to acquire a plurality of captured images and IMU data corresponding to each captured image.
The second generating module 14 is configured to perform online calibration according to the plurality of captured images and IMU data corresponding to each captured image, and generate a second clock offset.
The third generating module 15 is configured to perform 3D background replacement on each captured image according to the first clock offset and the second clock offset, and generate a plurality of background replacement images.
In an embodiment of the present invention, fig. 6 is a schematic structural diagram of the first generating module 12 in fig. 5, and as shown in fig. 6, the first generating module 12 includes: a first generation submodule 121, a second generation submodule 122 and a third generation submodule 123.
The first generation submodule 121 is configured to extract two adjacent feature images in the set number of feature images, and generate a first extracted image.
The second generation submodule 122 is configured to generate a first rotation angle set according to the first extracted image and IMU data corresponding to the first extracted image.
The third generation submodule 123 is configured to generate the first clock offset according to the first rotation angle set and the acquired second rotation angle set, where the second rotation angle set comprises the device's own rotation angle set obtained from the IMU data.
In an embodiment of the present invention, fig. 7 is a schematic structural diagram of the second generating module 14 in fig. 5, and as shown in fig. 7, the second generating module 14 includes: an extraction submodule 141, a judgment submodule 142, a fourth generation submodule 143, a fifth generation submodule 144, and a sixth generation submodule 145.
The extraction sub-module 141 is configured to extract a first feature point set of the plurality of captured images.
The determining submodule 142 is configured to determine whether the number of elements in the first feature point set is greater than a set threshold; if so, it triggers the fourth generation submodule 143 to extract two adjacent captured images from the plurality of captured images and generate a second extracted image.
The fifth generation sub-module 144 is configured to generate a third rotation angle set according to the plurality of second extracted images and IMU data corresponding to the second extracted images.
The sixth generation submodule 145 is configured to generate the second clock offset according to the third rotation angle set and the acquired fourth rotation angle set, where the fourth rotation angle set comprises the device's own rotation angle set obtained from the IMU data.
In this embodiment of the present invention, the second generation submodule 122 is specifically configured to process the first extracted image and the IMU data corresponding to the first extracted image through an image processing function to generate a second feature point set, and to process the second feature point set through the optical flow pyramid function to generate the first rotation angle set.
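For intuition on how an optical flow pyramid function can yield a rotation angle between two adjacent frames, the following sketch applies the small-angle relation angle ≈ mean horizontal displacement / focal length. The point pairs stand in for the output of a pyramidal Lucas-Kanade tracker (for example OpenCV's `cv2.calcOpticalFlowPyrLK`); the single-axis (yaw) simplification and the focal length value are assumptions for illustration, not the patented computation.

```python
import math

def rotation_angle_from_flow(prev_pts, next_pts, focal_length_px):
    """Estimate the yaw rotation (radians) between two adjacent frames from
    tracked feature point pairs: under a small pure rotation, points shift
    horizontally by roughly focal_length * angle pixels."""
    dx = [nxt[0] - prv[0] for prv, nxt in zip(prev_pts, next_pts)]
    mean_dx = sum(dx) / len(dx)
    return math.atan2(mean_dx, focal_length_px)
```

Repeating this for every adjacent frame pair and pairing each angle with its frame timestamp gives an image rotation angle curve of the kind described above.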
In this embodiment of the present invention, the third generation submodule 123 is specifically configured to: generate a first image rotation angle curve according to the first rotation angle set and the acquired timestamps corresponding to the first rotation angle set; generate a first IMU angle curve according to the second rotation angle set and the acquired timestamps corresponding to the second rotation angle set; generate a plurality of first correlation distances according to the first image rotation angle curve and the first IMU angle curve; and query, among the plurality of first correlation distances, the minimum first correlation distance and the first clock offset corresponding to it.
In this embodiment of the present invention, the fifth generation submodule 144 is specifically configured to process the second extracted image and the IMU data corresponding to the second extracted image through an image processing function to generate a third feature point set, and to process the third feature point set through the optical flow pyramid function to generate the third rotation angle set.
In this embodiment of the present invention, the sixth generation submodule 145 is specifically configured to: generate a second image rotation angle curve according to the third rotation angle set and the acquired timestamps corresponding to the third rotation angle set; generate a second IMU angle curve according to the fourth rotation angle set and the acquired timestamps corresponding to the fourth rotation angle set; generate a plurality of second correlation distances according to the second image rotation angle curve and the second IMU angle curve; and query, among the plurality of second correlation distances, the minimum second correlation distance and the second clock offset corresponding to it.
According to the technical solution provided by the embodiment of the present invention, a set number of feature images and inertial measurement unit (IMU) data corresponding to each feature image are acquired; offline calibration is performed according to the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset; a plurality of captured images and IMU data corresponding to each captured image are acquired; online calibration is performed according to the plurality of captured images and the IMU data corresponding to each captured image to generate a second clock offset; and 3D background replacement is performed on each captured image according to the first clock offset and the second clock offset to generate a plurality of background replacement images. With the first clock offset generated by offline calibration and the second clock offset generated by online calibration, the stability of 3D background replacement can be fully guaranteed.
The 3D background replacement apparatus provided in this embodiment may be used to implement the 3D background replacement method of Figs. 1 to 4; for a detailed description, reference may be made to the embodiments of the 3D background replacement method, which are not repeated here.
An embodiment of the present invention provides a storage medium. The storage medium includes a stored program; when the program runs, the device on which the storage medium is located is controlled to execute the steps of the 3D background replacement method embodiments. For a detailed description, reference may be made to the embodiments of the 3D background replacement method.
An embodiment of the present invention provides a terminal device including a memory and a processor. The memory is configured to store information including program instructions, and the processor is configured to control the execution of the program instructions, which are loaded and executed by the processor to implement the steps of the 3D background replacement method embodiments. For a detailed description, reference may be made to the embodiments of the 3D background replacement method.
Fig. 8 is a schematic diagram of a terminal device according to an embodiment of the present invention. As shown in fig. 8, the terminal device 20 of this embodiment includes a processor 21, a memory 22, and a computer program 23 stored in the memory 22 and executable on the processor 21. When executed by the processor 21, the computer program 23 implements the 3D background replacement method of the embodiments; to avoid repetition, the description is not repeated here. Alternatively, when executed by the processor 21, the computer program implements the functions of each module/unit of the 3D background replacement apparatus of the embodiments; to avoid repetition, the description is omitted here.
The terminal device 20 includes, but is not limited to, the processor 21 and the memory 22. Those skilled in the art will appreciate that fig. 8 is merely an example of the terminal device 20 and does not constitute a limitation on it; the terminal device may include more or fewer components than shown, combine certain components, or use different components. For example, the terminal device may also include input/output devices, network access devices, buses, and the like.
The Processor 21 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 22 may be an internal storage unit of the terminal device 20, such as a hard disk or memory of the terminal device 20. The memory 22 may also be an external storage device of the terminal device 20, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the terminal device 20. Further, the memory 22 may include both an internal storage unit and an external storage device of the terminal device 20. The memory 22 is used to store the computer program and other programs and data required by the terminal device, and may also be used to temporarily store data that has been or is to be output.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present invention, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a Processor (Processor) to execute some steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (10)

1. A 3D background replacement method, comprising:
acquiring a set number of feature images and inertial measurement unit (IMU) data corresponding to each feature image;
performing offline calibration according to the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset;
acquiring a plurality of captured images and IMU data corresponding to each captured image;
performing online calibration according to the plurality of captured images and the IMU data corresponding to each captured image to generate a second clock offset;
and performing 3D background replacement on each captured image according to the first clock offset and the second clock offset to generate a plurality of background replacement images.
2. The method of claim 1, wherein the performing offline calibration according to a set number of the feature images and IMU data corresponding to each of the feature images to generate a first clock offset comprises:
extracting two adjacent feature images from the set number of feature images to generate a first extracted image;
generating a first rotation angle set according to the first extracted image and IMU data corresponding to the first extracted image;
and generating the first clock offset according to the first rotation angle set and the acquired second rotation angle set, wherein the second rotation angle set comprises the device's own rotation angle set.
3. The method of claim 1, wherein the performing the online calibration based on the plurality of captured images and the IMU data corresponding to each of the captured images to generate the second clock offset comprises:
extracting a first feature point set of the plurality of captured images;
determining whether the number of elements in the first feature point set is greater than a set threshold;
if the number of elements in the first feature point set is greater than the set threshold, extracting two adjacent captured images from the plurality of captured images to generate a second extracted image;
generating a third rotation angle set according to the plurality of second extracted images and IMU data corresponding to the second extracted images;
and generating the second clock offset according to the third rotation angle set and an acquired fourth rotation angle set, wherein the fourth rotation angle set comprises the device's own rotation angle set.
4. The method of claim 2, wherein generating a first set of rotation angles from the first extracted image and IMU data corresponding to the first extracted image comprises:
processing the first extracted image and the IMU data corresponding to the first extracted image through an image processing function to generate a second feature point set;
and calculating the second feature point set through an optical flow pyramid function to generate the first rotation angle set.
5. The method of claim 2, wherein generating the first clock offset from the first set of rotation angles and the obtained second set of rotation angles comprises:
generating a first image rotation angle curve according to the first rotation angle set and the acquired time stamp corresponding to the first rotation angle set;
generating a first IMU angle curve according to the second rotation angle set and the acquired time stamp corresponding to the second rotation angle set;
generating a plurality of first correlation distances according to the first image rotation angle curve and the first IMU angle curve;
and querying, among the plurality of first correlation distances, the minimum first correlation distance and the first clock offset corresponding to the minimum first correlation distance.
6. The method of claim 3, wherein generating a third set of rotation angles from the plurality of second extracted images and IMU data corresponding to the second extracted images comprises:
processing the second extracted image and the IMU data corresponding to the second extracted image through an image processing function to generate a third feature point set;
and calculating the third feature point set through an optical flow pyramid function to generate the third rotation angle set.
7. The method of claim 3, wherein generating the second clock offset from the third set of rotation angles and the acquired fourth set of rotation angles comprises:
generating a second image rotation angle curve according to the third rotation angle set and the acquired timestamp corresponding to the third rotation angle set;
generating a second IMU angle curve according to the fourth rotation angle set and the acquired timestamp corresponding to the fourth rotation angle set;
generating a plurality of second correlation distances according to the second image rotation angle curve and the second IMU angle curve;
and querying, among the plurality of second correlation distances, the minimum second correlation distance and the second clock offset corresponding to the minimum second correlation distance.
8. A 3D background replacement apparatus, comprising:
a first acquisition module, configured to acquire a set number of feature images and inertial measurement unit (IMU) data corresponding to each feature image;
a first generation module, configured to perform offline calibration according to the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset;
a second acquisition module, configured to acquire a plurality of captured images and IMU data corresponding to each captured image;
a second generation module, configured to perform online calibration according to the plurality of captured images and the IMU data corresponding to each captured image to generate a second clock offset;
and a third generation module, configured to perform 3D background replacement on each captured image according to the first clock offset and the second clock offset to generate a plurality of background replacement images.
9. A storage medium comprising a stored program, wherein, when the program runs, a device on which the storage medium is located is controlled to execute the 3D background replacement method according to any one of claims 1 to 7.
10. A terminal device comprising a memory for storing information including program instructions and a processor for controlling the execution of the program instructions, characterized in that the program instructions are loaded and executed by the processor to implement the steps of a 3D background replacement method as claimed in any one of claims 1 to 7.
CN202110721434.0A 2021-06-28 2021-06-28 3D background replacement method and device, storage medium and terminal equipment Active CN113436349B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110721434.0A CN113436349B (en) 2021-06-28 2021-06-28 3D background replacement method and device, storage medium and terminal equipment
PCT/CN2022/099532 WO2023273923A1 (en) 2021-06-28 2022-06-17 3d background replacement method and apparatus, storage medium, and terminal device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110721434.0A CN113436349B (en) 2021-06-28 2021-06-28 3D background replacement method and device, storage medium and terminal equipment

Publications (2)

Publication Number Publication Date
CN113436349A true CN113436349A (en) 2021-09-24
CN113436349B CN113436349B (en) 2023-05-16

Family

ID=77755042

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110721434.0A Active CN113436349B (en) 2021-06-28 2021-06-28 3D background replacement method and device, storage medium and terminal equipment

Country Status (2)

Country Link
CN (1) CN113436349B (en)
WO (1) WO2023273923A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023273923A1 (en) * 2021-06-28 2023-01-05 展讯通信(天津)有限公司 3d background replacement method and apparatus, storage medium, and terminal device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107241544A (en) * 2016-03-28 2017-10-10 展讯通信(天津)有限公司 Video image stabilization method, device and camera shooting terminal
CN108988974A (en) * 2018-06-19 2018-12-11 远形时空科技(北京)有限公司 Measurement method, device and the system to electronic equipment time synchronization of time delays
CN109186592A (en) * 2018-08-31 2019-01-11 腾讯科技(深圳)有限公司 Method and apparatus and storage medium for the fusion of vision inertial navigation information
US20200250429A1 (en) * 2017-10-26 2020-08-06 SZ DJI Technology Co., Ltd. Attitude calibration method and device, and unmanned aerial vehicle
CN111798489A (en) * 2020-06-29 2020-10-20 北京三快在线科技有限公司 Feature point tracking method, device, medium and unmanned device
CN112396639A (en) * 2019-08-19 2021-02-23 虹软科技股份有限公司 Image alignment method
CN112907629A (en) * 2021-02-08 2021-06-04 浙江商汤科技开发有限公司 Image feature tracking method and device, computer equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9253416B2 (en) * 2008-06-19 2016-02-02 Motorola Solutions, Inc. Modulation of background substitution based on camera attitude and motion
CN108537845B (en) * 2018-04-27 2023-01-03 腾讯科技(深圳)有限公司 Pose determination method, pose determination device and storage medium
US10949978B2 (en) * 2019-01-22 2021-03-16 Fyusion, Inc. Automatic background replacement for single-image and multi-view captures
CN112752038B (en) * 2020-12-28 2024-04-19 广州虎牙科技有限公司 Background replacement method, device, electronic equipment and computer readable storage medium
CN113436349B (en) * 2021-06-28 2023-05-16 展讯通信(天津)有限公司 3D background replacement method and device, storage medium and terminal equipment


Also Published As

Publication number Publication date
WO2023273923A1 (en) 2023-01-05
CN113436349B (en) 2023-05-16

Similar Documents

Publication Publication Date Title
Tanskanen et al. Live metric 3D reconstruction on mobile phones
WO2018119889A1 (en) Three-dimensional scene positioning method and device
US20210342990A1 (en) Image coordinate system transformation method and apparatus, device, and storage medium
CN106033621B (en) A kind of method and device of three-dimensional modeling
CN111354042A (en) Method and device for extracting features of robot visual image, robot and medium
CN113029128B (en) Visual navigation method and related device, mobile terminal and storage medium
CN112017216A (en) Image processing method, image processing device, computer-readable storage medium and computer equipment
CN106537908A (en) Camera calibration
US20160210761A1 (en) 3d reconstruction
US11127156B2 (en) Method of device tracking, terminal device, and storage medium
CN114494388B (en) Three-dimensional image reconstruction method, device, equipment and medium in large-view-field environment
CN109040525B (en) Image processing method, image processing device, computer readable medium and electronic equipment
CN111609868A (en) Visual inertial odometer method based on improved optical flow method
CN113899364B (en) Positioning method and device, equipment and storage medium
CN109143205A (en) Integrated transducer external parameters calibration method, apparatus
CN108122280A (en) The method for reconstructing and device of a kind of three-dimensional point cloud
WO2023005457A1 (en) Pose calculation method and apparatus, electronic device, and readable storage medium
CN112991441A (en) Camera positioning method and device, electronic equipment and storage medium
CN111862150A (en) Image tracking method and device, AR device and computer device
CN113436349A (en) 3D background replacing method and device, storage medium and terminal equipment
CN108804161B (en) Application initialization method, device, terminal and storage medium
CN114882106A (en) Pose determination method and device, equipment and medium
KR101725166B1 (en) 3D image reconstitution method using 2D images and device for the same
CN115773759A (en) Indoor positioning method, device and equipment of autonomous mobile robot and storage medium
CN115311624A (en) Slope displacement monitoring method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant