WO2022063221A1 - Method for generating gyroscope rotation direction and computer device - Google Patents

Method for generating gyroscope rotation direction and computer device

Info

Publication number
WO2022063221A1
WO2022063221A1 PCT/CN2021/120272 CN2021120272W WO2022063221A1 WO 2022063221 A1 WO2022063221 A1 WO 2022063221A1 CN 2021120272 W CN2021120272 W CN 2021120272W WO 2022063221 A1 WO2022063221 A1 WO 2022063221A1
Authority
WO
WIPO (PCT)
Prior art keywords
acceleration
gyroscope
coordinate system
rotation direction
imu
Prior art date
Application number
PCT/CN2021/120272
Other languages
English (en)
French (fr)
Inventor
董鹏飞
陈聪
Original Assignee
影石创新科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 影石创新科技股份有限公司
Priority to EP21871590.2A priority Critical patent/EP4221181A1/en
Priority to US18/028,516 priority patent/US20230362317A1/en
Priority to JP2023519149A priority patent/JP2023543250A/ja
Publication of WO2022063221A1 publication Critical patent/WO2022063221A1/zh

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2621Cameras specially adapted for the electronic generation of special effects during image pickup, e.g. digital cameras, camcorders, video cameras having integrated special effects capability
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/11Complex mathematical operations for solving equations, e.g. nonlinear equations, general mathematical optimization problems
    • G06F17/13Differential equations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/2228Video assist systems used in motion picture production, e.g. video cameras connected to viewfinders of motion picture cameras or related video signal processing
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01PMEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P13/00Indicating or recording presence, absence, or direction, of movement
    • G01P13/02Indicating direction only, e.g. by weather vane
    • G01P13/04Indicating positive or negative direction of a linear movement or clockwise or anti-clockwise direction of a rotational movement
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/15Correlation function computation including computation of convolution operations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/16Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681Motion detection
    • H04N23/6812Motion detection based on additional sensors, e.g. acceleration sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682Vibration or motion blur correction
    • H04N23/683Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence

Definitions

  • the present application belongs to the field of video processing, and in particular, relates to a method for generating a rotation direction of a gyroscope, a method and apparatus for realizing the shooting effect of bullet time, a computer-readable storage medium, computer equipment and a camera.
  • Bullet time is a computer-aided photography technique used in movies, TV commercials or computer games to simulate variable-speed special effects, such as enhanced slow motion, time stills and other effects.
  • Bullet time is characterized by extreme changes not only in time, but also in space: the shooting angle (viewer's perspective) also revolves around the scene while in slow motion.
  • the visual effect of bullet time looks spectacular, but the finished result usually needs to be edited carefully by hand. If the viewing angle is not adjusted manually, the photographer's irregular rotation may cause the people or scenery in the generated video to fluctuate up and down and change unsteadily, which easily makes viewers feel dizzy.
  • the embodiments of the present application provide a method for generating a rotation direction of a gyroscope, a method, an apparatus, a computer-readable storage medium, a computer device, and a camera for realizing a bullet time shooting effect, aiming to solve at least one of the above problems.
  • the present application provides a method for generating a gyroscope rotation direction, the method comprising: acquiring the acceleration value and the angular velocity value of an inertial measurement unit IMU in real time, taking the acceleration value as a first acceleration value, and estimating the attitude from the IMU to the world coordinate system; converting the first acceleration value from the IMU coordinate system to the world coordinate system to obtain a second acceleration value; filtering, in the world coordinate system, the second acceleration to filter out the gravitational acceleration and obtain a third acceleration; converting the third acceleration into the IMU coordinate system to obtain the acceleration components of a fourth acceleration on the X, Y and Z axes; and determining the rotation direction of the gyroscope according to the acceleration components of the fourth acceleration on the X, Y and Z axes.
  • the present application provides a device for generating a rotation direction of a gyroscope, the device comprising:
  • an estimation module, used to obtain the acceleration value and the angular velocity value of the inertial measurement unit IMU in real time, take the acceleration value as the first acceleration value, and estimate the attitude from the IMU to the world coordinate system;
  • a conversion module, used to convert the first acceleration value from the IMU coordinate system to the world coordinate system to obtain the second acceleration value;
  • a filtering module, used to filter the second acceleration in the world coordinate system, filter out the gravitational acceleration, and obtain the third acceleration;
  • an acceleration component determination module, used to convert the third acceleration into the IMU coordinate system to obtain the acceleration components of the X, Y, and Z axes of the fourth acceleration;
  • a direction determination module, used for determining the rotation direction of the gyroscope according to the acceleration components of the X, Y and Z axes of the fourth acceleration.
  • the application provides a method for realizing a bullet time shooting effect, the method comprising: acquiring a panoramic video shot while the camera rotates around the shooting target; executing the steps of the method for generating a gyroscope rotation direction described above; and generating, according to the rotation direction of the gyroscope at the bullet time, a picture corresponding to the rotation direction of the gyroscope in the panoramic video.
  • the application provides a device for realizing the shooting effect of bullet time, the device comprising:
  • the acquisition module is used to acquire the panoramic video shot when the camera rotates around the shooting target;
  • the estimation module is used to obtain the acceleration value and the angular velocity value of the inertial measurement unit IMU in real time, take the acceleration value as the first acceleration value, and estimate the posture of the IMU to the world coordinate system;
  • a conversion module for converting the first acceleration value from the IMU coordinate system to the world coordinate system to obtain the second acceleration value
  • the filtering module is used to filter the second acceleration in the world coordinate system, filter out the gravitational acceleration, and obtain the third acceleration;
  • the acceleration component determination module is used to convert the third acceleration into the IMU coordinate system to obtain the acceleration components of the X, Y, and Z axes of the fourth acceleration;
  • a direction determination module used for determining the rotation direction of the gyroscope according to the acceleration components of the X, Y, and Z axes of the fourth acceleration
  • the generating module is used to generate a picture corresponding to the rotation direction of the gyroscope in the panoramic video according to the rotation direction of the gyroscope at the bullet time.
  • the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the steps of the above method for generating a gyroscope rotation direction or of the above method for realizing the bullet time shooting effect are implemented.
  • the application provides a computer device, comprising: one or more processors; a memory; and one or more computer programs, wherein the processor and the memory are connected by a bus, the one or more computer programs are stored in the memory and configured to be executed by the one or more processors, and when the processor executes the computer program, the steps of the method for generating the rotation direction of the gyroscope or of the method for realizing the shooting effect of bullet time are implemented.
  • the present application provides a camera, comprising: one or more processors; a memory; and one or more computer programs, wherein the processor and the memory are connected by a bus, the one or more computer programs are stored in the memory and configured to be executed by the one or more processors, and when the processor executes the computer program, the steps of the method for generating the rotation direction of the gyroscope or of the method for realizing the shooting effect of bullet time are implemented.
  • because of camera accessories, users may mount the camera in different ways, for example with the camera parallel to the rotation axis about which it rotates, or with the camera perpendicular to that rotation axis, so different orientation handling is required. In the embodiments of the present application, the acceleration value and the angular velocity value of the inertial measurement unit IMU are obtained, the acceleration value is used as the first acceleration value and processed to obtain the fourth acceleration, and the rotation direction of the gyroscope is then determined according to the acceleration components of the fourth acceleration on the X, Y and Z axes. Therefore, the present application can judge the rotation direction of the gyroscope in real time, and can generate a bullet time video with a stable visual effect without human-computer interaction or careful post-editing, and the method of the present application is simple, fast, and robust.
  • FIG. 1 is a schematic diagram of an application scenario of a method for generating a rotation direction of a gyroscope or a method for realizing a bullet time shooting effect provided by an embodiment of the present application.
  • FIG. 2 is a flowchart of a method for generating a rotation direction of a gyroscope provided by an embodiment of the present application.
  • FIG. 3 to FIG. 8 are schematic diagrams for analyzing the rotation direction of the gyroscope.
  • FIG. 9 is a schematic diagram of an apparatus for generating a rotation direction of a gyroscope provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of a device for realizing the shooting effect of bullet time provided by an embodiment of the present application.
  • FIG. 11 is a specific structural block diagram of a computer device provided by an embodiment of the present application.
  • FIG. 12 is a specific structural block diagram of a camera provided by an embodiment of the present application.
  • An application scenario of the method for generating a gyroscope rotation direction or the method for realizing a bullet time shooting effect provided by an embodiment of the present application may be a computer device or a camera. The computer device or camera executes the method for generating the rotation direction of the gyroscope provided by the embodiment of the present application to determine the rotation direction of the gyroscope; or, when the computer device or the camera executes the method for realizing the bullet time shooting effect provided by the embodiment of the present application, it first performs the method for generating the rotation direction of the gyroscope to obtain the rotation direction of the gyroscope, and then generates a picture corresponding to the gyroscope direction in the panoramic video according to the gyroscope rotation direction.
  • An application scenario of the method for generating a gyroscope rotation direction or the method for realizing a bullet time shooting effect provided by an embodiment of the present application may also include a connected computer device 100 and a camera 200 (as shown in FIG. 1 ). At least one application may be executed in the computer device 100 and the camera 200 .
  • the computer device 100 may be a server, a desktop computer, a mobile terminal, and the like, and the mobile terminal includes a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, and the like.
  • the camera 200 may be an ordinary camera or a panoramic camera or the like.
  • a common camera refers to a photographing device for taking flat images and flat videos.
  • the computer device 100 or the camera 200 executes the method for generating the rotation direction of the gyroscope provided by an embodiment of the present application to determine the rotation direction of the gyroscope; or, when the computer device or the camera executes the method for realizing the bullet time shooting effect provided by the embodiment of the present application, it first performs the method for generating the rotation direction of the gyroscope to obtain the rotation direction of the gyroscope, and then generates a picture corresponding to the gyroscope direction in the panoramic video according to the gyroscope rotation direction.
  • FIG. 2 is a flowchart of a method for generating a gyroscope rotation direction provided by an embodiment of the present application.
  • This embodiment mainly takes the method for generating a gyroscope rotation direction applied to a computer device or a camera as an example for illustration.
  • the method for generating a gyroscope rotation direction provided by an embodiment of the application includes the following steps:
  • S101: acquire the acceleration value and the angular velocity value of the IMU (Inertial Measurement Unit) in real time, take the acceleration value as the first acceleration value, and estimate the attitude from the IMU to the world coordinate system.
  • an IMU is a device that measures the three-axis attitude angle (or angular velocity value) and acceleration value of an object.
  • an IMU includes a three-axis accelerometer and a three-axis gyroscope.
  • the accelerometer detects the acceleration signals of the object on the three independent axes of the carrier coordinate system, while the gyroscope detects the angular velocity signals of the carrier relative to the navigation coordinate system; by measuring the angular velocity values and acceleration values of the object in three-dimensional space, the attitude of the object is calculated.
  • the real-time acquisition of the acceleration value and the angular velocity value of the IMU may specifically be: using a gravity sensor to read the acceleration value of the triaxial accelerometer, and using the angular velocity sensor to read the angular velocity value of the triaxial gyroscope.
  • the estimating of the attitude from the IMU to the world coordinate system may specifically be estimating the attitude from the IMU to the world coordinate system using a Kalman filter. Specifically, extended Kalman filtering may be used, combining the acceleration value and the angular velocity value, to estimate the rotation from the IMU to the world coordinate system, which specifically includes the following steps: computing an initial state rotation and an initial process covariance from the initially measured acceleration value d_0 and the gravity vector g of the world coordinate system; computing the state transition matrix Φ(ω_k) = exp(−[ω_k·Δt]_×) from the angular velocity value ω_k at time k, where Δt represents the sampling time interval of the gyroscope data; calculating the covariance matrix Q_k of the state noise according to the sampling time interval Δt of the gyroscope data and updating the prior estimate of the state rotation and the prior estimate matrix of the process covariance from the posterior estimate of the state rotation at time k−1 and the posterior estimate matrix of the process covariance at time k−1; updating the noise variance matrix R_k of the observation from the acceleration value d_k at time k, computing the observation Jacobian matrix H_k, and computing the error e_k between the current observation and the estimated observation; updating the optimal Kalman gain matrix K_k at time k; and updating, from K_k and e_k, the posterior estimate of the state rotation from the IMU to the world coordinate system at time k and the posterior estimate matrix of the process covariance, the updated posterior estimate of the state rotation being taken as the rotation from the IMU to the world coordinate system.
  • after the acceleration value is taken as the first acceleration value, the following step may be further included: using low-pass filtering to perform noise reduction on the first acceleration value to obtain a low-pass-filtered, denoised first acceleration value d'_i, specifically by the formula d'_i = α·d_i + (1−α)·R_i·d'_{i−1}, where d_i represents the first acceleration value at the i-th time, R_i = exp(−[ω_i·Δt]_×) is the relative rotation of the gyroscope for the i-th video frame, ω_i represents the angular velocity value at the i-th time, d'_{i−1} represents the first acceleration value after low-pass filtering and noise reduction at the (i−1)-th time, α represents the smoothing factor, f_c is the cutoff frequency of the low-pass filtering, Rc is the time constant, and Δt is the sampling interval of the gyroscope data.
  • Extended Kalman filtering linearizes a nonlinear system and then applies Kalman filtering; the Kalman filter is a highly efficient recursive filter that can estimate the state of a dynamic system from a series of incomplete, noisy measurements.
  • S102 may specifically be: at time k, converting the first acceleration value from the IMU coordinate system to the world coordinate system to obtain the second acceleration value, using the estimated rotation from the IMU to the world coordinate system and the low-pass-filtered, denoised first acceleration value d'_i.
  • S103 may specifically be: in the world coordinate system, at time k, filtering the second acceleration value to filter out the gravitational acceleration and generate the third acceleration.
  • the filtering out of the gravitational acceleration can be done with a high-pass filter, whose response H_h(w) is determined by the filter order n (obtainable from a preset value, for example n = 4), the frequency w and the cutoff frequency w_c; the cutoff frequency of the high-pass filter can be preset to a normalized cutoff frequency, i.e. the cutoff frequency divided by the sampling frequency, expressed as an angular frequency.
  • a high-pass filter, also known as a low-cut filter, allows frequencies above a certain cutoff frequency to pass through while greatly attenuating lower frequencies; it removes unnecessary low-frequency components, that is, low-frequency interference, from the signal.
  • before the filtering of the second acceleration, the method may further include the following step: in the world coordinate system, at time k, denoising the second acceleration value.
  • the denoising can be done with a Butterworth low-pass filter, whose response H_b(w) is determined by the filter order n (obtainable from a preset value, for example n = 4), the frequency w and the cutoff frequency w_c; the cutoff frequency of the Butterworth low-pass filter can be preset to a normalized cutoff frequency (the cutoff frequency divided by the sampling frequency, expressed as an angular frequency), thereby eliminating the influence of noise and making the result more stable.
  • a Butterworth filter is a type of electronic filter, also known as a maximally flat filter. The characteristic of the Butterworth filter is that the frequency response curve in the passband is as flat as possible, without ripple, and gradually drops to zero in the stopband.
  • S104 may specifically be: computing the fourth acceleration by applying, at time k, the rotation matrix from the world coordinate system to the IMU coordinate system to the third acceleration.
  • the determining of the rotation direction of the gyroscope according to the acceleration components of the X, Y and Z axes of the fourth acceleration may specifically be: comparing the magnitudes of the acceleration components of the fourth acceleration on the X, Y and Z axes to obtain the axis corresponding to the largest acceleration component, and determining the rotation direction of the gyroscope to be the direction in which the half-axis that corresponds to, and points in the same direction as, the largest acceleration component points toward the rotation axis about which the camera rotates.
  • the method may further include:
  • the rotation direction of the gyroscope is used as the rotation direction of the camera, and the positional relationship between the camera and the rotation axis is determined by the rotation direction of the gyroscope.
  • as shown in FIG. 3, if the Y-axis acceleration component of the fourth acceleration is larger than its X-axis and Z-axis components and points in the same direction as the positive half of the Y axis, the rotation direction of the gyroscope is the direction in which the positive Y half-axis points toward the rotation axis about which the camera rotates, that is, the rotation direction of the camera is the direction of the positive Y half-axis toward the rotation axis; it is determined from the rotation direction of the gyroscope that the camera is perpendicular to the rotation axis, with the top of the camera facing the rotation axis.
  • as shown in FIG. 4, if the Z-axis component is larger than the X-axis and Y-axis components and points in the same direction as the negative half of the Z axis, the rotation direction of the gyroscope is the direction of the negative Z half-axis toward the rotation axis, that is, the rotation direction of the camera is the direction of the negative Z half-axis toward the rotation axis; it is determined that the camera is parallel to the rotation axis, the positional relationship being that the first side face of the camera faces the rotation axis.
  • as shown in FIG. 5, if the Y-axis component is larger than the X-axis and Z-axis components and points in the same direction as the negative half of the Y axis, the rotation direction of the gyroscope is the direction of the negative Y half-axis toward the rotation axis; it is determined that the camera is perpendicular to the rotation axis, with the bottom of the camera facing the rotation axis.
  • as shown in FIG. 6, if the Z-axis component is larger than the X-axis and Y-axis components and points in the same direction as the positive half of the Z axis, the rotation direction of the gyroscope is the direction of the positive Z half-axis toward the rotation axis; it is determined that the camera is parallel to the rotation axis, with the second side face of the camera facing the rotation axis.
  • as shown in FIG. 7, if the X-axis component is larger than the Y-axis and Z-axis components and points in the same direction as the positive half of the X axis, the rotation direction of the gyroscope is the direction of the positive X half-axis toward the rotation axis; it is determined that the camera is parallel to the rotation axis, with the third side face of the camera facing the rotation axis.
  • as shown in FIG. 8, if the X-axis component is larger than the Y-axis and Z-axis components and points in the same direction as the negative half of the X axis, the rotation direction of the gyroscope is the direction of the negative X half-axis toward the rotation axis; it is determined that the camera is perpendicular to the rotation axis, with the fourth side face of the camera facing the rotation axis.
  • the device for generating the rotation direction of the gyroscope provided by an embodiment of the present application may be a computer program or a piece of program code running in a computer device or a camera; for example, the device for generating the rotation direction of the gyroscope is an application software. The device for generating the rotation direction of the gyroscope can be used to perform the corresponding steps in the method for generating the rotation direction of the gyroscope provided by the embodiments of the present application.
  • An apparatus for generating a gyroscope rotation direction provided by an embodiment of the present application includes:
  • the estimation module 11 is used to obtain the acceleration value and the angular velocity value of the inertial measurement unit IMU in real time, take the acceleration value as the first acceleration value, and estimate the posture of the IMU to the world coordinate system;
  • the conversion module 12 is used to convert the first acceleration value from the IMU coordinate system to the world coordinate system to obtain the second acceleration value;
  • the filtering module 13 is used for filtering the second acceleration in the world coordinate system, filtering out the gravitational acceleration, and obtaining the third acceleration;
  • the acceleration component determination module 14 is used to convert the third acceleration into the IMU coordinate system, and obtain the acceleration components of the X, Y, and Z axes of the fourth acceleration;
  • the direction determination module 15 is configured to determine the rotation direction of the camera according to the acceleration components of the X, Y, and Z axes of the fourth acceleration, and use the rotation direction of the camera as the rotation direction of the gyroscope.
  • the device for generating the rotational direction of the gyroscope provided by an embodiment of the present application and the method for generating the rotational direction of the gyroscope provided by an embodiment of the present application belong to the same concept.
  • An embodiment of the present application also provides a method for realizing the shooting effect of bullet time.
  • the method for realizing the shooting effect of bullet time is applied to a computer device or a camera as an example for illustration.
  • the difference between the method for realizing the shooting effect of bullet time and the method for generating the rotation direction of the gyroscope in an embodiment of the present application is that: before S101, the following step is further included: acquiring the panoramic video shot when the camera rotates around the shooting target; and after S105, the following step is further included: generating a picture corresponding to the rotation direction of the gyroscope in the panoramic video according to the rotation direction of the gyroscope at the bullet time.
  • the device for realizing the shooting effect of bullet time provided by an embodiment of the present application may be a computer program or a piece of program code running in a computer device or a camera, for example, the device for realizing shooting effect of bullet time is an application software
  • the device for realizing the shooting effect of bullet time can be used to execute the corresponding steps in the method for realizing shooting effect of bullet time provided by the embodiment of the present application.
  • the device for realizing the shooting effect of bullet time provided by an embodiment of the present application includes:
  • an acquisition module 21 configured to acquire a panoramic video shot when the camera rotates around the shooting target
  • the estimation module 22 is used to obtain the acceleration value and the angular velocity value of the inertial measurement unit IMU in real time, take the acceleration value as the first acceleration value, and estimate the attitude of the IMU to the world coordinate system;
  • the conversion module 23 is used to convert the first acceleration value from the IMU coordinate system to the world coordinate system to obtain the second acceleration value;
  • the filtering module 24 is used for filtering the second acceleration in the world coordinate system, filtering out the gravitational acceleration, and obtaining the third acceleration;
  • the acceleration component determination module 25 is used to convert the third acceleration into the IMU coordinate system, and obtain the acceleration components of the X, Y, and Z axes of the fourth acceleration;
  • the direction determination module 26 is used to determine the rotation direction of the camera according to the acceleration components of the X, Y, and Z axes of the fourth acceleration, and use the rotation direction of the camera as the rotation direction of the gyroscope;
  • the generation module 27 is configured to generate a picture corresponding to the rotation direction of the gyroscope in the panoramic video according to the rotation direction of the gyroscope at the bullet time.
  • the device for realizing the shooting effect of bullet time provided by an embodiment of the present application and the method for realizing the shooting effect of bullet time provided by an embodiment of the present application belong to the same concept.
  • An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method for generating the rotation direction of the gyroscope or of the method for realizing the bullet time shooting effect provided by an embodiment of the present application are implemented.
  • FIG. 11 shows a specific structural block diagram of a computer device provided by an embodiment of the present application.
  • the computer device may be the computer device shown in FIG. 1 .
  • a computer device 100 includes: one or more processors 101, a memory 102, and one or more computer programs, wherein the processor 101 and the memory 102 are connected by a bus, the one or more computer programs are stored in the memory 102 and are configured to be executed by the one or more processors 101, and when the processor 101 executes the computer program, the steps of the method for generating the rotation direction of the gyroscope or of the method for realizing the bullet time shooting effect provided by an embodiment of the present application are implemented.
  • the computer equipment may be a server, a desktop computer, a mobile terminal, etc.
  • the mobile terminal includes a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, and the like.
  • FIG. 12 shows a specific structural block diagram of a camera provided by an embodiment of the present application.
  • the camera may be the camera shown in FIG. 1 .
  • a camera 200 includes: one or more processors 201, a memory 202, and one or more computer programs, wherein the processor 201 and the memory 202 are connected by a bus, the one or more computer programs are stored in the memory 202 and are configured to be executed by the one or more processors 201, and when the processor 201 executes the computer program, the steps of the method for generating the rotation direction of the gyroscope or of the method for realizing the bullet time shooting effect provided by an embodiment of the present application are implemented.
  • because of camera accessories, users may mount the camera in different ways, for example with the camera parallel to the rotation axis about which it rotates, or with the camera perpendicular to that rotation axis, so different orientation handling is required.
  • in the embodiments of the present application, the acceleration value and the angular velocity value of the inertial measurement unit IMU are obtained, the acceleration value is used as the first acceleration value and processed to obtain the fourth acceleration, and the rotation direction of the gyroscope is then determined according to the acceleration components of the fourth acceleration on the X, Y and Z axes. Therefore, the present application can judge the rotation direction of the gyroscope in real time, and can generate a bullet time video with a stable visual effect without human-computer interaction or careful post-editing, and the method of the present application is simple, fast, and robust.
  • the steps in the embodiments of the present application are not necessarily executed sequentially in the order indicated by the step numbers. Unless explicitly stated herein, there is no strict order restriction on the execution of these steps, and they may be performed in other orders. Moreover, at least some of the steps in each embodiment may include multiple sub-steps or multiple stages; these sub-steps or stages are not necessarily executed and completed at the same moment, but may be executed at different moments, and their execution order is not necessarily sequential, as they may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
  • Nonvolatile memory may include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory may include random access memory (RAM) or external cache memory.
  • RAM is available in various forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Operations Research (AREA)
  • Gyroscopes (AREA)
  • Studio Devices (AREA)

Abstract

This application is applicable to the field of video processing, and provides a method for generating a gyroscope rotation direction, a method for realizing a bullet time shooting effect, a computer-readable storage medium, a computer device and a camera. The method comprises: acquiring an acceleration value and an angular velocity value of an IMU in real time, taking the acceleration value as a first acceleration value, and estimating the attitude from the IMU to the world coordinate system; converting the first acceleration value from the IMU coordinate system to the world coordinate system to obtain a second acceleration value; in the world coordinate system, filtering the second acceleration to filter out the gravitational acceleration and obtain a third acceleration; converting the third acceleration into the IMU coordinate system to obtain the acceleration components of a fourth acceleration on the X, Y and Z axes; and determining the gyroscope rotation direction according to the acceleration components. The method of the present application is simple, runs fast, and has good robustness.

Description

Method for generating a gyroscope rotation direction and computer device
Technical Field
The present application belongs to the field of video processing, and in particular relates to a method for generating a gyroscope rotation direction, a method and an apparatus for realizing a bullet time shooting effect, a computer-readable storage medium, a computer device and a camera.
Background
Bullet time is a computer-assisted photography technique used in films, television commercials or computer games to simulate variable-speed special effects, such as enhanced slow motion and time-freeze effects. Bullet time is characterized by extreme change not only in time but also in space: while the action is shown in slow motion, the shooting angle (the viewer's perspective) also revolves around the scene. However, although this visual effect looks spectacular, the finished result usually has to be edited carefully by hand; if the viewing angle is not adjusted manually, the photographer's irregular rotation may cause the people or scenery in the generated video to fluctuate up and down and change unsteadily, which easily makes viewers feel dizzy.
Technical Problem
The embodiments of the present application provide a method for generating a gyroscope rotation direction, a method and an apparatus for realizing a bullet time shooting effect, a computer-readable storage medium, a computer device and a camera, aiming to solve at least one of the above problems.
Technical Solution
In a first aspect, the present application provides a method for generating a gyroscope rotation direction, the method comprising:
acquiring an acceleration value and an angular velocity value of an inertial measurement unit IMU in real time, taking the acceleration value as a first acceleration value, and estimating the attitude from the IMU to the world coordinate system;
converting the first acceleration value from the IMU coordinate system to the world coordinate system to obtain a second acceleration value;
in the world coordinate system, filtering the second acceleration to filter out the gravitational acceleration and obtain a third acceleration;
converting the third acceleration into the IMU coordinate system to obtain the acceleration components of a fourth acceleration on the X, Y and Z axes;
determining the gyroscope rotation direction according to the acceleration components of the fourth acceleration on the X, Y and Z axes.
In a second aspect, the present application provides an apparatus for generating a gyroscope rotation direction, the apparatus comprising:
an estimation module, configured to acquire an acceleration value and an angular velocity value of an inertial measurement unit IMU in real time, take the acceleration value as a first acceleration value, and estimate the attitude from the IMU to the world coordinate system;
a conversion module, configured to convert the first acceleration value from the IMU coordinate system to the world coordinate system to obtain a second acceleration value;
a filtering module, configured to filter, in the world coordinate system, the second acceleration to filter out the gravitational acceleration and obtain a third acceleration;
an acceleration component determination module, configured to convert the third acceleration into the IMU coordinate system to obtain the acceleration components of a fourth acceleration on the X, Y and Z axes;
a direction determination module, configured to determine the gyroscope rotation direction according to the acceleration components of the fourth acceleration on the X, Y and Z axes.
In a third aspect, the present application provides a method for realizing a bullet time shooting effect, the method comprising:
acquiring a panoramic video shot while a camera rotates around a shooting target;
executing the steps of the method for generating a gyroscope rotation direction as described above;
generating, according to the gyroscope rotation direction at the bullet time, the picture in the panoramic video corresponding to the gyroscope rotation direction.
In a fourth aspect, the present application provides an apparatus for realizing a bullet time shooting effect, the apparatus comprising:
an acquisition module, configured to acquire a panoramic video shot while a camera rotates around a shooting target;
an estimation module, configured to acquire an acceleration value and an angular velocity value of an inertial measurement unit IMU in real time, take the acceleration value as a first acceleration value, and estimate the attitude from the IMU to the world coordinate system;
a conversion module, configured to convert the first acceleration value from the IMU coordinate system to the world coordinate system to obtain a second acceleration value;
a filtering module, configured to filter, in the world coordinate system, the second acceleration to filter out the gravitational acceleration and obtain a third acceleration;
an acceleration component determination module, configured to convert the third acceleration into the IMU coordinate system to obtain the acceleration components of a fourth acceleration on the X, Y and Z axes;
a direction determination module, configured to determine the gyroscope rotation direction according to the acceleration components of the fourth acceleration on the X, Y and Z axes;
a generation module, configured to generate, according to the gyroscope rotation direction at the bullet time, the picture in the panoramic video corresponding to the gyroscope rotation direction.
In a fifth aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method for generating a gyroscope rotation direction or of the method for realizing a bullet time shooting effect as described above.
In a sixth aspect, the present application provides a computer device, comprising:
one or more processors;
a memory; and
one or more computer programs, the processor and the memory being connected by a bus, wherein the one or more computer programs are stored in the memory and configured to be executed by the one or more processors, and when the processor executes the computer program, the steps of the method for generating a gyroscope rotation direction or of the method for realizing a bullet time shooting effect as described above are implemented.
In a seventh aspect, the present application provides a camera, comprising:
one or more processors;
a memory; and
one or more computer programs, the processor and the memory being connected by a bus, wherein the one or more computer programs are stored in the memory and configured to be executed by the one or more processors, and when the processor executes the computer program, the steps of the method for generating a gyroscope rotation direction or of the method for realizing a bullet time shooting effect as described above are implemented.
Beneficial Effects
Because of camera accessories, users may mount the camera in different ways, for example with the camera parallel to the rotation axis about which it rotates, or with the camera perpendicular to that rotation axis, so different orientation handling is required. In the embodiments of the present application, the acceleration value and the angular velocity value of the inertial measurement unit IMU are acquired, the acceleration value is taken as the first acceleration value and processed to obtain the fourth acceleration, and the gyroscope rotation direction is then determined according to the acceleration components of the fourth acceleration on the X, Y and Z axes. The present application can therefore judge the gyroscope rotation direction in real time and can generate a bullet time video with a stable visual effect without human-computer interaction or careful post-editing; moreover, the method of the present application is simple, fast and robust.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of an application scenario of the method for generating a gyroscope rotation direction or of the method for realizing a bullet time shooting effect provided by an embodiment of the present application.
FIG. 2 is a flowchart of the method for generating a gyroscope rotation direction provided by an embodiment of the present application.
FIG. 3 to FIG. 8 are schematic diagrams for analyzing the gyroscope rotation direction.
FIG. 9 is a schematic diagram of the apparatus for generating a gyroscope rotation direction provided by an embodiment of the present application.
FIG. 10 is a schematic diagram of the apparatus for realizing a bullet time shooting effect provided by an embodiment of the present application.
FIG. 11 is a structural block diagram of a computer device provided by an embodiment of the present application.
FIG. 12 is a structural block diagram of a camera provided by an embodiment of the present application.
Embodiments of the Invention
To make the objectives, technical solutions and beneficial effects of the present application clearer, the present application is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present application and are not intended to limit it.
To illustrate the technical solutions of the present application, specific embodiments are described below.
The application scenario of the method for generating a gyroscope rotation direction or of the method for realizing a bullet time shooting effect provided by an embodiment of the present application may be a computer device or a camera: the computer device or camera executes the method for generating a gyroscope rotation direction provided by an embodiment of the present application to determine the gyroscope rotation direction, or, when the computer device or camera executes the method for realizing a bullet time shooting effect provided by an embodiment of the present application, it first executes the method for generating a gyroscope rotation direction to obtain the gyroscope rotation direction and then generates, according to that rotation direction, the picture in the panoramic video corresponding to the gyroscope direction. The application scenario may also include a computer device 100 and a camera 200 connected to each other (as shown in FIG. 1), on which at least one application program may run. The computer device 100 may be a server, a desktop computer, a mobile terminal, etc.; mobile terminals include mobile phones, tablet computers, notebook computers, personal digital assistants, and the like. The camera 200 may be an ordinary camera, a panoramic camera, or the like; an ordinary camera refers to a shooting device for capturing flat images and flat videos. The computer device 100 or the camera 200 executes the method for generating a gyroscope rotation direction provided by an embodiment of the present application to determine the gyroscope rotation direction, or, when executing the method for realizing a bullet time shooting effect provided by an embodiment of the present application, first executes the method for generating a gyroscope rotation direction to obtain the gyroscope rotation direction and then generates, according to that rotation direction, the picture in the panoramic video corresponding to the gyroscope direction.
Referring to FIG. 2, which is a flowchart of the method for generating a gyroscope rotation direction provided by an embodiment of the present application, this embodiment is mainly illustrated with the method applied to a computer device or a camera. The method for generating a gyroscope rotation direction provided by an embodiment of the present application includes the following steps:
S101. Acquire the acceleration value and the angular velocity value of the IMU (Inertial Measurement Unit) in real time, take the acceleration value as the first acceleration value, and estimate the attitude from the IMU to the world coordinate system.
An IMU is a device that measures the three-axis attitude angles (or angular velocity values) and the acceleration values of an object. In general, an IMU contains a three-axis accelerometer and a three-axis gyroscope: the accelerometer detects the acceleration signals of the object on the three independent axes of the carrier coordinate system, and the gyroscope detects the angular velocity signals of the carrier relative to the navigation coordinate system; the angular velocity values and acceleration values of the object in three-dimensional space are measured, and the attitude of the object is solved from them.
In an embodiment of the present application, acquiring the acceleration value and the angular velocity value of the IMU in real time may specifically be: reading the acceleration value of the three-axis accelerometer with a gravity sensor, and reading the angular velocity value of the three-axis gyroscope with an angular velocity sensor.
The estimating of the attitude from the IMU to the world coordinate system may specifically be estimating the attitude from the IMU to the world coordinate system with a Kalman filter. Specifically, extended Kalman filtering may be used, combining the acceleration value and the angular velocity value, to estimate the rotation from the IMU to the world coordinate system, which specifically includes the following steps (the individual formulas are published as equation images in the original and are described here in words):
S1011. Compute the initial state rotation and the initial process covariance, where d_0 is the initially measured acceleration value and g is the gravity vector of the world coordinate system.
S1012. Compute the state transition matrix Φ(ω_k) at time k from the angular velocity value ω_k at time k, specifically Φ(ω_k) = exp(−[ω_k·Δt]_×), where Δt represents the sampling time interval of the gyroscope data.
S1013. Compute the covariance matrix Q_k of the state noise from the sampling time interval Δt of the gyroscope data, and update the prior estimate of the state rotation and the prior estimate matrix of the process covariance, using the posterior estimate of the state rotation at time k−1 and the posterior estimate matrix of the process covariance at time k−1.
S1014. Update the noise variance matrix R_k of the observation from the acceleration value d_k at time k, compute the observation transition Jacobian matrix H_k, and compute the error e_k between the current observation and the estimated observation, where α is the smoothing factor of the acceleration change, β is the influence factor of the acceleration magnitude, h is the observation function with h(q, v) = q·g + v_k, g is the gravity vector in the world coordinate system, q is the rotation from the world coordinate system to the gyroscope coordinate system, and v_k is the measurement noise.
S1015. Update the optimal Kalman gain matrix K_k at time k.
S1016. Update, according to the optimal Kalman gain matrix K_k and the estimated observation error e_k, the posterior estimate of the state rotation from the IMU to the world coordinate system at time k and the posterior estimate matrix of the process covariance, and take the updated posterior estimate of the state rotation as the rotation from the IMU to the world coordinate system.
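The sub-steps S1011 to S1016 follow the familiar predict/update structure of a quaternion-based extended Kalman filter with the accelerometer acting as a gravity reference. Since the patent's own formulas are published as images and are not reproduced above, the sketch below is only an illustrative error-state attitude EKF in Python/NumPy along the same lines: the error-state parameterization, the simplified constant-diagonal Q_k and R_k (rather than the patent's α/β-weighted observation noise), the Jacobian H = [Rᵀg]×, and all helper functions are assumptions of this sketch, not the patent's equations.

```python
import numpy as np

def skew(v):
    """Cross-product (skew-symmetric) matrix of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def quat_mult(a, b):
    """Hamilton product of quaternions given as [w, x, y, z]."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([aw*bw - ax*bx - ay*by - az*bz,
                     aw*bx + ax*bw + ay*bz - az*by,
                     aw*by - ax*bz + ay*bw + az*bx,
                     aw*bz + ax*by - ay*bx + az*bw])

def quat_from_rotvec(r):
    """Quaternion for a rotation vector r (axis * angle)."""
    angle = np.linalg.norm(r)
    if angle < 1e-12:
        return np.array([1.0, 0.0, 0.0, 0.0])
    axis = r / angle
    return np.concatenate(([np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis))

def quat_to_matrix(q):
    """Rotation matrix of a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

class AttitudeEKF:
    """Illustrative gravity-referenced error-state EKF (not the patent's exact filter)."""

    def __init__(self, accel0, g=np.array([0.0, 0.0, 9.81])):
        self.g = g
        # S1011 analogue: the initial rotation aligns the first accelerometer sample d_0
        # with the world gravity vector g (anti-parallel degenerate case ignored for brevity).
        a, gn = accel0 / np.linalg.norm(accel0), g / np.linalg.norm(g)
        axis = np.cross(a, gn)
        angle = np.arccos(np.clip(np.dot(a, gn), -1.0, 1.0))
        n = np.linalg.norm(axis)
        self.q = quat_from_rotvec(axis / n * angle) if n > 1e-9 else np.array([1.0, 0.0, 0.0, 0.0])
        self.P = np.eye(3) * 1e-2        # initial process covariance (assumed)
        self.q_noise = 1e-4              # gyroscope noise density (assumed)
        self.r_noise = 1e-1              # accelerometer noise (assumed)

    def step(self, gyro, accel, dt):
        # Predict (S1012/S1013 analogue): integrate the angular rate over dt,
        # then inflate the covariance with the state-noise term Q_k.
        self.q = quat_mult(self.q, quat_from_rotvec(gyro * dt))
        self.P = self.P + np.eye(3) * self.q_noise * dt

        # Update (S1014-S1016 analogue): the accelerometer observes gravity in the IMU frame.
        R = quat_to_matrix(self.q)                 # rotation IMU -> world
        g_body = R.T @ self.g                      # estimated observation
        H = skew(g_body)                           # observation Jacobian
        e = accel - g_body                         # current minus estimated observation
        S = H @ self.P @ H.T + np.eye(3) * self.r_noise
        K = self.P @ H.T @ np.linalg.inv(S)        # optimal Kalman gain K_k
        self.q = quat_mult(self.q, quat_from_rotvec(K @ e))   # posterior state rotation
        self.P = (np.eye(3) - K @ H) @ self.P                 # posterior process covariance
        return self.q                              # rotation from the IMU to the world frame
```

A caller would construct AttitudeEKF with the first accelerometer sample and then call step(gyro_k, accel_k, Δt) for each incoming IMU sample; the returned quaternion plays the role of the updated posterior state rotation of S1016.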
In an embodiment of the present application, after S101 the following step may further be included: applying low-pass filtering to the first acceleration value for noise reduction to obtain the low-pass-filtered, denoised first acceleration value d'_i. Specifically, the first acceleration value is low-pass filtered and denoised by the formula d'_i = α·d_i + (1−α)·R_i·d'_{i−1}, where d'_i denotes the first acceleration value after low-pass filtering and noise reduction at the i-th time, d_i denotes the first acceleration value at the i-th time, R_i = exp(−[ω_i·Δt]_×) is the relative rotation of the gyroscope for the i-th video frame, ω_i denotes the angular velocity value at the i-th time, d'_{i−1} denotes the first acceleration value after low-pass filtering and noise reduction at the (i−1)-th time, and α denotes the smoothing factor, determined by the cutoff frequency f_c of the low-pass filtering, the time constant Rc and the sampling time interval Δt of the gyroscope data.
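Written as code, this recursive denoising step is a one-line update per sample. The sketch below assumes the conventional first-order relations α = Δt/(Rc + Δt) and Rc = 1/(2πf_c); the patent defines α, Rc and f_c through formulas published as images, so those relations and the 5 Hz cutoff are assumptions of this sketch.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def lowpass_denoise(accels, gyros, dt, f_c=5.0):
    """Recursive low-pass denoising d'_i = alpha*d_i + (1 - alpha)*R_i*d'_{i-1}.

    accels, gyros: (N, 3) arrays of accelerometer and gyroscope samples.
    dt: gyroscope sampling interval in seconds; f_c: assumed cutoff frequency in Hz.
    """
    Rc = 1.0 / (2.0 * np.pi * f_c)      # assumed time constant
    alpha = dt / (Rc + dt)              # assumed smoothing factor
    out = np.empty_like(accels, dtype=float)
    out[0] = accels[0]
    for i in range(1, len(accels)):
        # R_i = exp(-[omega_i * dt]x): relative rotation over the i-th sample interval.
        R_i = Rotation.from_rotvec(-gyros[i] * dt).as_matrix()
        out[i] = alpha * accels[i] + (1.0 - alpha) * (R_i @ out[i - 1])
    return out
```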
Extended Kalman Filtering: extended Kalman filtering linearizes a nonlinear system and then applies Kalman filtering; the Kalman filter is a highly efficient recursive filter that can estimate the state of a dynamic system from a series of incomplete measurements containing noise.
S102. Convert the first acceleration value from the IMU coordinate system to the world coordinate system to obtain the second acceleration value.
In an embodiment of the present application, S102 may specifically be: at time k, converting the first acceleration value from the IMU coordinate system to the world coordinate system to obtain the second acceleration value, by applying the estimated rotation from the IMU to the world coordinate system to d'_i, the first acceleration value after low-pass filtering and noise reduction.
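S102 is a single change of frame: each denoised sample is rotated by the estimated IMU-to-world attitude. A minimal sketch, assuming the attitude is available as a unit quaternion in SciPy's [x, y, z, w] ordering:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def to_world_frame(accel_imu, q_imu_to_world):
    """Second acceleration: the denoised first acceleration expressed in the world frame.

    accel_imu: (3,) first acceleration d'_i in the IMU frame at time k.
    q_imu_to_world: unit quaternion [x, y, z, w] estimated by the attitude filter at time k.
    """
    return Rotation.from_quat(q_imu_to_world).apply(accel_imu)
```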
S103. In the world coordinate system, filter the second acceleration to filter out the gravitational acceleration and obtain the third acceleration.
In an embodiment of the present application, S103 may specifically be: in the world coordinate system, at time k, filtering the second acceleration value to filter out the gravitational acceleration and generate the third acceleration.
The filtering out of the gravitational acceleration can be done with a high-pass filter, whose response H_h(w) is determined by the filter order n (obtainable from a preset value, for example n = 4), the frequency w and the cutoff frequency w_c; the cutoff frequency of the high-pass filter is preset to a normalized cutoff frequency, i.e. the cutoff frequency divided by the sampling frequency, expressed as an angular frequency.
A high-pass filter, also known as a low-cut filter, is a filter that allows frequencies above a certain cutoff frequency to pass while greatly attenuating lower frequencies; it removes unnecessary low-frequency components, that is, low-frequency interference, from the signal.
In an embodiment of the present application, before the filtering of the second acceleration, the method may further include the following step: in the world coordinate system, at time k, denoising the second acceleration value. The denoising can be done with a Butterworth low-pass filter, whose response H_b(w) is determined by the filter order n (obtainable from a preset value, for example n = 4), the frequency w and the cutoff frequency w_c; the cutoff frequency of the Butterworth low-pass filter can be preset to a normalized cutoff frequency (the cutoff frequency divided by the sampling frequency, expressed as an angular frequency), thereby eliminating the influence of noise and making the result more stable.
A Butterworth filter is a type of electronic filter, also known as a maximally flat filter; its characteristic is that the frequency response curve in the passband is as flat as possible, without ripple, while in the stopband it gradually drops to zero.
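Both filters named above are standard designs available in SciPy. The sketch below runs an order-4 Butterworth low-pass for denoising and then an order-4 high-pass to strip the slowly varying gravity component, applied to a buffered sequence of world-frame samples with zero-phase filtering; the two normalized cutoff values are assumptions of this sketch, since the patent gives its preset cutoffs only as images (and note that SciPy normalizes cutoffs to the Nyquist frequency rather than the sampling frequency).

```python
import numpy as np
from scipy.signal import butter, filtfilt

def denoise_and_remove_gravity(accel_world, order=4, wn_denoise=0.2, wn_gravity=0.02):
    """Third acceleration: world-frame second accelerations, denoised and with gravity removed.

    accel_world: (N, 3) buffer of second-acceleration samples in the world frame.
    order: filter order (the patent mentions n = 4 as an example preset).
    wn_denoise, wn_gravity: assumed cutoffs, normalized to the Nyquist frequency.
    """
    # Butterworth low-pass: suppress high-frequency measurement noise.
    b_lo, a_lo = butter(order, wn_denoise, btype="low")
    denoised = filtfilt(b_lo, a_lo, accel_world, axis=0)
    # High-pass: remove the near-constant gravity component on each axis.
    b_hi, a_hi = butter(order, wn_gravity, btype="high")
    return filtfilt(b_hi, a_hi, denoised, axis=0)
```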
S104. Convert the third acceleration into the IMU coordinate system to obtain the acceleration components of the fourth acceleration on the X, Y and Z axes.
In an embodiment of the present application, S104 may specifically be: computing the fourth acceleration by applying, at time k, the rotation matrix from the world coordinate system to the IMU coordinate system to the third acceleration.
S105. Determine the gyroscope rotation direction according to the acceleration components of the fourth acceleration on the X, Y and Z axes.
In an embodiment of the present application, determining the gyroscope rotation direction according to the acceleration components of the fourth acceleration on the X, Y and Z axes may specifically be: comparing the magnitudes of the acceleration components of the fourth acceleration on the X, Y and Z axes to obtain the axis corresponding to the largest acceleration component, and determining the gyroscope rotation direction to be the direction in which the half-axis that corresponds to, and points in the same direction as, the largest acceleration component points toward the rotation axis about which the camera rotates.
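S104 and S105 can be combined into a few lines: rotate the gravity-free acceleration back into the IMU frame and take the axis with the largest-magnitude component, together with its sign, as the half-axis that points toward the rotation axis. A minimal sketch, using the same SciPy quaternion convention as above:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def gyroscope_rotation_direction(accel_world_filtered, q_imu_to_world):
    """Returns e.g. ('Y', 1): the positive Y half-axis points toward the rotation axis.

    accel_world_filtered: (3,) third acceleration (gravity removed, world frame) at time k.
    q_imu_to_world: unit quaternion [x, y, z, w] of the IMU attitude at time k.
    """
    # S104: rotate the third acceleration from the world frame back into the IMU frame.
    accel_imu = Rotation.from_quat(q_imu_to_world).inv().apply(accel_world_filtered)
    # S105: the dominant component identifies the half-axis facing the rotation axis
    # that the camera revolves around (the centripetal direction).
    axis = int(np.argmax(np.abs(accel_imu)))
    return "XYZ"[axis], int(np.sign(accel_imu[axis]))
```

In terms of the cases below, ('Y', +1) corresponds to FIG. 3, ('Z', −1) to FIG. 4, ('Y', −1) to FIG. 5, ('Z', +1) to FIG. 6, ('X', +1) to FIG. 7 and ('X', −1) to FIG. 8.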
In an embodiment of the present application, after the gyroscope rotation direction is determined according to the acceleration components of the fourth acceleration on the X, Y and Z axes, the method may further include: taking the gyroscope rotation direction as the rotation direction of the camera, and determining the positional relationship between the camera and the rotation axis from the gyroscope rotation direction.
As shown in FIG. 3, if the Y-axis acceleration component of the fourth acceleration is greater than its X-axis and Z-axis components and points in the same direction as the positive half of the Y axis, the gyroscope rotation direction is the direction in which the positive Y half-axis points toward the rotation axis about which the camera rotates, that is, the rotation direction of the camera is the direction of the positive Y half-axis toward the rotation axis. It is determined from the gyroscope rotation direction that the camera is perpendicular to the rotation axis, with the top of the camera facing the rotation axis.
As shown in FIG. 4, if the Z-axis acceleration component of the fourth acceleration is greater than its X-axis and Y-axis components and points in the same direction as the negative half of the Z axis, the gyroscope rotation direction is the direction of the negative Z half-axis toward the rotation axis, that is, the rotation direction of the camera is the direction of the negative Z half-axis toward the rotation axis. It is determined from the gyroscope rotation direction that the camera is parallel to the rotation axis, and the positional relationship between the camera and the rotation axis is that the first side face of the camera faces the rotation axis.
As shown in FIG. 5, if the Y-axis acceleration component of the fourth acceleration is greater than its X-axis and Z-axis components and points in the same direction as the negative half of the Y axis, the gyroscope rotation direction is the direction of the negative Y half-axis toward the rotation axis. It is determined that the camera is perpendicular to the rotation axis, and the positional relationship is that the bottom of the camera faces the rotation axis.
As shown in FIG. 6, if the Z-axis acceleration component of the fourth acceleration is greater than its X-axis and Y-axis components and points in the same direction as the positive half of the Z axis, the gyroscope rotation direction is the direction of the positive Z half-axis toward the rotation axis. It is determined that the camera is parallel to the rotation axis, and the positional relationship is that the second side face of the camera faces the rotation axis.
As shown in FIG. 7, if the X-axis acceleration component of the fourth acceleration is greater than its Y-axis and Z-axis components and points in the same direction as the positive half of the X axis, the gyroscope rotation direction is the direction of the positive X half-axis toward the rotation axis. It is determined that the camera is parallel to the rotation axis, and the positional relationship is that the third side face of the camera faces the rotation axis.
As shown in FIG. 8, if the X-axis acceleration component of the fourth acceleration is greater than its Y-axis and Z-axis components and points in the same direction as the negative half of the X axis, the gyroscope rotation direction is the direction of the negative X half-axis toward the rotation axis. It is determined that the camera is perpendicular to the rotation axis, and the positional relationship is that the fourth side face of the camera faces the rotation axis.
The above are only preferred embodiments of the present invention; when the camera rotates with other orientations toward the rotation axis, the gyroscope rotation direction is analyzed as in the examples of FIG. 3 to FIG. 8, and details are not repeated here.
Referring to FIG. 9, the apparatus for generating a gyroscope rotation direction provided by an embodiment of the present application may be a computer program or a piece of program code running in a computer device or a camera; for example, the apparatus for generating a gyroscope rotation direction is an application software. The apparatus can be used to execute the corresponding steps in the method for generating a gyroscope rotation direction provided by the embodiments of the present application. The apparatus for generating a gyroscope rotation direction provided by an embodiment of the present application includes:
an estimation module 11, configured to acquire the acceleration value and the angular velocity value of the inertial measurement unit IMU in real time, take the acceleration value as the first acceleration value, and estimate the attitude from the IMU to the world coordinate system;
a conversion module 12, configured to convert the first acceleration value from the IMU coordinate system to the world coordinate system to obtain the second acceleration value;
a filtering module 13, configured to filter, in the world coordinate system, the second acceleration to filter out the gravitational acceleration and obtain the third acceleration;
an acceleration component determination module 14, configured to convert the third acceleration into the IMU coordinate system to obtain the acceleration components of the fourth acceleration on the X, Y and Z axes;
a direction determination module 15, configured to determine the rotation direction of the camera according to the acceleration components of the fourth acceleration on the X, Y and Z axes, and take the rotation direction of the camera as the gyroscope rotation direction.
The apparatus for generating a gyroscope rotation direction provided by an embodiment of the present application and the method for generating a gyroscope rotation direction provided by an embodiment of the present application belong to the same concept; for the specific implementation process, reference is made to the full specification, and details are not repeated here.
An embodiment of the present application further provides a method for realizing a bullet time shooting effect. This embodiment is mainly illustrated with the method applied to a computer device or a camera. The difference between the method for realizing a bullet time shooting effect provided by an embodiment of the present application and the method for generating a gyroscope rotation direction of an embodiment of the present application is that:
before S101, the following step is further included: acquiring a panoramic video shot while the camera rotates around the shooting target;
after S105, the following step is further included: generating, according to the gyroscope rotation direction at the bullet time, the picture in the panoramic video corresponding to the gyroscope rotation direction.
Referring to FIG. 10, the apparatus for realizing a bullet time shooting effect provided by an embodiment of the present application may be a computer program or a piece of program code running in a computer device or a camera; for example, the apparatus for realizing a bullet time shooting effect is an application software. The apparatus can be used to execute the corresponding steps in the method for realizing a bullet time shooting effect provided by the embodiments of the present application. The apparatus for realizing a bullet time shooting effect provided by an embodiment of the present application includes:
an acquisition module 21, configured to acquire a panoramic video shot while the camera rotates around the shooting target;
an estimation module 22, configured to acquire the acceleration value and the angular velocity value of the inertial measurement unit IMU in real time, take the acceleration value as the first acceleration value, and estimate the attitude from the IMU to the world coordinate system;
a conversion module 23, configured to convert the first acceleration value from the IMU coordinate system to the world coordinate system to obtain the second acceleration value;
a filtering module 24, configured to filter, in the world coordinate system, the second acceleration to filter out the gravitational acceleration and obtain the third acceleration;
an acceleration component determination module 25, configured to convert the third acceleration into the IMU coordinate system to obtain the acceleration components of the fourth acceleration on the X, Y and Z axes;
a direction determination module 26, configured to determine the rotation direction of the camera according to the acceleration components of the fourth acceleration on the X, Y and Z axes, and take the rotation direction of the camera as the gyroscope rotation direction;
a generation module 27, configured to generate, according to the gyroscope rotation direction at the bullet time, the picture in the panoramic video corresponding to the gyroscope rotation direction.
The apparatus for realizing a bullet time shooting effect provided by an embodiment of the present application and the method for realizing a bullet time shooting effect provided by an embodiment of the present application belong to the same concept; for the specific implementation process, reference is made to the full specification, and details are not repeated here.
An embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method for generating a gyroscope rotation direction or of the method for realizing a bullet time shooting effect provided by an embodiment of the present application.
FIG. 11 shows a structural block diagram of the computer device provided by an embodiment of the present application; the computer device may be the computer device shown in FIG. 1. A computer device 100 includes one or more processors 101, a memory 102 and one or more computer programs, wherein the processor 101 and the memory 102 are connected by a bus, the one or more computer programs are stored in the memory 102 and configured to be executed by the one or more processors 101, and when the processor 101 executes the computer program, the steps of the method for generating a gyroscope rotation direction or of the method for realizing a bullet time shooting effect provided by an embodiment of the present application are implemented.
The computer device may be a server, a desktop computer, a mobile terminal, etc.; mobile terminals include mobile phones, tablet computers, notebook computers, personal digital assistants, and the like.
FIG. 12 shows a structural block diagram of the camera provided by an embodiment of the present application; the camera may be the camera shown in FIG. 1. A camera 200 includes one or more processors 201, a memory 202 and one or more computer programs, wherein the processor 201 and the memory 202 are connected by a bus, the one or more computer programs are stored in the memory 202 and configured to be executed by the one or more processors 201, and when the processor 201 executes the computer program, the steps of the method for generating a gyroscope rotation direction or of the method for realizing a bullet time shooting effect provided by an embodiment of the present application are implemented.
Because of camera accessories, users may mount the camera in different ways, for example with the camera parallel to the rotation axis about which it rotates, or with the camera perpendicular to that rotation axis, so different orientation handling is required. In the embodiments of the present application, the acceleration value and the angular velocity value of the inertial measurement unit IMU are acquired, the acceleration value is taken as the first acceleration value and processed to obtain the fourth acceleration, and the gyroscope rotation direction is then determined according to the acceleration components of the fourth acceleration on the X, Y and Z axes. The present application can therefore judge the gyroscope rotation direction in real time and can generate a bullet time video with a stable visual effect without human-computer interaction or careful post-editing; moreover, the method of the present application is simple, fast and robust.
It should be understood that the steps in the embodiments of the present application are not necessarily executed sequentially in the order indicated by the step numbers. Unless explicitly stated herein, there is no strict order restriction on the execution of these steps, and they may be executed in other orders. Moreover, at least some of the steps in each embodiment may include multiple sub-steps or multiple stages; these sub-steps or stages are not necessarily executed and completed at the same moment but may be executed at different moments, and their execution order is not necessarily sequential, as they may be executed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
A person of ordinary skill in the art can understand that all or part of the processes in the methods of the above embodiments can be implemented by instructing relevant hardware through a computer program, and the program may be stored in a non-volatile computer-readable storage medium; when executed, the program may include the processes of the embodiments of the above methods. Any reference to memory, storage, a database or other media used in the embodiments provided in the present application may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM) or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in various forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity of description, not all possible combinations of the technical features of the above embodiments are described; however, as long as there is no contradiction in the combination of these technical features, they should all be considered within the scope of this specification.
The above embodiments only express several implementations of the present invention, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be noted that a person of ordinary skill in the art can make several modifications and improvements without departing from the concept of the present invention, and these all fall within the protection scope of the present invention. Therefore, the protection scope of the present invention patent shall be subject to the appended claims.

Claims (17)

  1. A method for generating a gyroscope rotation direction, characterized in that the method comprises:
    acquiring an acceleration value and an angular velocity value of an inertial measurement unit IMU in real time, taking the acceleration value as a first acceleration value, and estimating the attitude from the IMU to the world coordinate system;
    converting the first acceleration value from the IMU coordinate system to the world coordinate system to obtain a second acceleration value;
    in the world coordinate system, filtering the second acceleration to filter out the gravitational acceleration and obtain a third acceleration;
    converting the third acceleration into the IMU coordinate system to obtain the acceleration components of a fourth acceleration on the X, Y and Z axes;
    determining the gyroscope rotation direction according to the acceleration components of the fourth acceleration on the X, Y and Z axes.
  2. The method for generating a gyroscope rotation direction according to claim 1, characterized in that the estimating of the attitude from the IMU to the world coordinate system is specifically: estimating the attitude from the IMU to the world coordinate system using a Kalman filter.
  3. The method for generating a gyroscope rotation direction according to claim 2, characterized in that
    the estimating of the attitude from the IMU to the world coordinate system using a Kalman filter is specifically: using extended Kalman filtering, in combination with the acceleration value and the angular velocity value, to estimate the rotation from the IMU to the world coordinate system.
  4. The method for generating a gyroscope rotation direction according to claim 1, characterized in that, after the acceleration value is taken as the first acceleration value, the method further comprises: applying low-pass filtering to the first acceleration value for noise reduction to obtain a low-pass-filtered, denoised first acceleration value d′_i.
  5. The method for generating a gyroscope rotation direction according to claim 4, characterized in that the applying of low-pass filtering to the first acceleration value for noise reduction to obtain the low-pass-filtered, denoised first acceleration value d′_i specifically comprises:
    performing low-pass filtering and noise reduction on the first acceleration value by the formula d′_i = α·d_i + (1−α)·R_i·d′_{i−1}, where d′_i denotes the first acceleration value after low-pass filtering and noise reduction at the i-th time, d_i denotes the first acceleration value at the i-th time, R_i is the relative rotation of the gyroscope for the i-th video frame, R_i = exp(−[ω_i·Δt]_×), ω_i denotes the angular velocity value at the i-th time, d′_{i−1} denotes the first acceleration value after low-pass filtering and noise reduction at the (i−1)-th time, and α denotes the smoothing factor,
    where f_c denotes the cutoff frequency of the low-pass filtering, Rc denotes the time constant, and Δt denotes the sampling time interval of the gyroscope data.
  6. The method for generating a gyroscope rotation direction according to claim 5, characterized in that the converting of the first acceleration value from the IMU coordinate system to the world coordinate system to obtain the second acceleration value is specifically:
    at time k, converting the first acceleration value from the IMU coordinate system to the world coordinate system to obtain the second acceleration value, using the estimated rotation from the IMU to the world coordinate system, where d′_i is the first acceleration value after low-pass filtering and noise reduction.
  7. The method for generating a gyroscope rotation direction according to claim 6, characterized in that the filtering, in the world coordinate system, of the second acceleration to filter out the gravitational acceleration and obtain the third acceleration is specifically:
    in the world coordinate system, at time k, filtering the second acceleration value to filter out the gravitational acceleration and generate the third acceleration.
  8. The method for generating a gyroscope rotation direction according to claim 7, characterized in that, before the filtering of the second acceleration, the method further comprises:
    in the world coordinate system, at time k, denoising the second acceleration value.
  9. The method for generating a gyroscope rotation direction according to any one of claims 1 to 8, characterized in that the converting of the third acceleration into the IMU coordinate system to obtain the acceleration components of the fourth acceleration on the X, Y and Z axes is specifically:
    computing the fourth acceleration by applying, at time k, the rotation matrix from the world coordinate system to the IMU coordinate system to the third acceleration.
  10. The method for generating a gyroscope rotation direction according to claim 9, characterized in that the determining of the gyroscope rotation direction according to the acceleration components of the fourth acceleration on the X, Y and Z axes is specifically:
    comparing the magnitudes of the acceleration components of the fourth acceleration on the X, Y and Z axes to obtain the axis corresponding to the largest acceleration component, and determining the gyroscope rotation direction to be the direction in which the half-axis that corresponds to, and points in the same direction as, the largest acceleration component points toward the rotation axis about which the camera rotates.
  11. The method for generating a gyroscope rotation direction according to claim 10, characterized in that, after the determining of the gyroscope rotation direction according to the acceleration components of the fourth acceleration on the X, Y and Z axes, the method further comprises:
    taking the gyroscope rotation direction as the rotation direction of the camera, and determining the positional relationship between the camera and the rotation axis from the gyroscope rotation direction.
  12. An apparatus for generating a gyroscope rotation direction, characterized in that the apparatus comprises:
    an estimation module, configured to acquire an acceleration value and an angular velocity value of an inertial measurement unit IMU in real time, take the acceleration value as a first acceleration value, and estimate the attitude from the IMU to the world coordinate system;
    a conversion module, configured to convert the first acceleration value from the IMU coordinate system to the world coordinate system to obtain a second acceleration value;
    a filtering module, configured to filter, in the world coordinate system, the second acceleration to filter out the gravitational acceleration and obtain a third acceleration;
    an acceleration component determination module, configured to convert the third acceleration into the IMU coordinate system to obtain the acceleration components of a fourth acceleration on the X, Y and Z axes;
    a direction determination module, configured to determine the gyroscope rotation direction according to the acceleration components of the fourth acceleration on the X, Y and Z axes.
  13. A method for realizing a bullet time shooting effect, characterized in that the method comprises:
    acquiring a panoramic video shot while a camera rotates around a shooting target;
    executing the steps of the method for generating a gyroscope rotation direction according to any one of claims 1 to 11;
    generating, according to the gyroscope rotation direction, the picture in the panoramic video corresponding to the gyroscope rotation direction.
  14. An apparatus for realizing a bullet time shooting effect, characterized in that the apparatus comprises:
    an acquisition module, configured to acquire a panoramic video shot while a camera rotates around a shooting target;
    an estimation module, configured to acquire an acceleration value and an angular velocity value of an inertial measurement unit IMU in real time, take the acceleration value as a first acceleration value, and estimate the attitude from the IMU to the world coordinate system;
    a conversion module, configured to convert the first acceleration value from the IMU coordinate system to the world coordinate system to obtain a second acceleration value;
    a filtering module, configured to filter, in the world coordinate system, the second acceleration to filter out the gravitational acceleration and obtain a third acceleration;
    an acceleration component determination module, configured to convert the third acceleration into the IMU coordinate system to obtain the acceleration components of a fourth acceleration on the X, Y and Z axes;
    a direction determination module, configured to determine the gyroscope rotation direction according to the acceleration components of the fourth acceleration on the X, Y and Z axes;
    a generation module, configured to generate, according to the gyroscope rotation direction, the picture in the panoramic video corresponding to the gyroscope rotation direction.
  15. A computer-readable storage medium storing a computer program, characterized in that, when the computer program is executed by a processor, the steps of the method for generating a gyroscope rotation direction according to any one of claims 1 to 11 or of the method for realizing a bullet time shooting effect according to claim 13 are implemented.
  16. A computer device, comprising:
    one or more processors;
    a memory; and
    one or more computer programs, the processor and the memory being connected by a bus, wherein the one or more computer programs are stored in the memory and configured to be executed by the one or more processors, characterized in that, when the processor executes the computer program, the steps of the method for generating a gyroscope rotation direction according to any one of claims 1 to 11 or of the method for realizing a bullet time shooting effect according to claim 13 are implemented.
  17. A camera, comprising:
    one or more processors;
    a memory; and
    one or more computer programs, the processor and the memory being connected by a bus, wherein the one or more computer programs are stored in the memory and configured to be executed by the one or more processors, characterized in that, when the processor executes the computer program, the steps of the method for generating a gyroscope rotation direction according to any one of claims 1 to 11 or of the method for realizing a bullet time shooting effect according to claim 13 are implemented.
PCT/CN2021/120272 2020-09-24 2021-09-24 Method for generating gyroscope rotation direction and computer device WO2022063221A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP21871590.2A EP4221181A1 (en) 2020-09-24 2021-09-24 Method for generating rotation direction of gyroscope and computer device
US18/028,516 US20230362317A1 (en) 2020-09-24 2021-09-24 Method for generating rotation direction of gyroscope and computer device
JP2023519149A JP2023543250A (ja) 2020-09-24 2021-09-24 ジャイロスコープ回転方向を生成する方法及びコンピュータ機器

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011018567.3A CN112188037B (zh) 2020-09-24 2020-09-24 生成陀螺仪旋转方向的方法及计算机设备
CN202011018567.3 2020-09-24

Publications (1)

Publication Number Publication Date
WO2022063221A1 true WO2022063221A1 (zh) 2022-03-31

Family

ID=73943426

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/120272 WO2022063221A1 (zh) 2020-09-24 2021-09-24 生成陀螺仪旋转方向的方法及计算机设备

Country Status (5)

Country Link
US (1) US20230362317A1 (zh)
EP (1) EP4221181A1 (zh)
JP (1) JP2023543250A (zh)
CN (1) CN112188037B (zh)
WO (1) WO2022063221A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112188037B (zh) * 2020-09-24 2023-03-24 影石创新科技股份有限公司 生成陀螺仪旋转方向的方法及计算机设备
CN117288187B (zh) * 2023-11-23 2024-02-23 北京小米机器人技术有限公司 机器人位姿确定方法、装置、电子设备及存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103278177A (zh) * 2013-04-27 2013-09-04 中国人民解放军国防科学技术大学 基于摄像组网测量的惯性测量组合标定方法
CN106959110A (zh) * 2017-04-06 2017-07-18 亿航智能设备(广州)有限公司 一种云台姿态检测方法及装置
CN108458714A (zh) * 2018-01-11 2018-08-28 山东大学 一种姿态检测***中不含重力加速度的欧拉角求解方法
CN109540126A (zh) * 2018-12-03 2019-03-29 哈尔滨工业大学 一种基于光流法的惯性视觉组合导航方法
CN109798891A (zh) * 2019-01-25 2019-05-24 上海交通大学 基于高精度动作捕捉***的惯性测量单元标定***
WO2019221763A1 (en) * 2018-05-18 2019-11-21 Ensco, Inc. Position and orientation tracking system, apparatus and method
CN112188037A (zh) * 2020-09-24 2021-01-05 影石创新科技股份有限公司 生成陀螺仪旋转方向的方法及计算机设备

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5794078A (en) * 1995-09-11 1998-08-11 Nikon Corporation Image movement correction of camera
CN100587641C (zh) * 2007-08-06 2010-02-03 北京航空航天大学 一种适用于任意运动微小型***的定姿***
JP5614527B2 (ja) * 2010-03-05 2014-10-29 セイコーエプソン株式会社 姿勢情報算出装置、姿勢情報算出システム、姿勢情報算出方法及び姿勢情報算出プログラム
DE102017203755A1 (de) * 2017-03-08 2018-09-13 Robert Bosch Gmbh Bestimmung einer räumlichen Ausrichtung
JP6407368B2 (ja) * 2017-07-05 2018-10-17 ヤフー株式会社 推定装置、移動方向推定方法及び移動方向推定プログラム
CN107801014B (zh) * 2017-10-25 2019-11-08 深圳岚锋创视网络科技有限公司 一种全景视频防抖的方法、装置及便携式终端
CN110887506A (zh) * 2019-10-14 2020-03-17 交通运输部水运科学研究所 一种受海浪影响的惯性传感器的运动幅度检测方法及***
CN110702107A (zh) * 2019-10-22 2020-01-17 北京维盛泰科科技有限公司 一种单目视觉惯性组合的定位导航方法
CN110972112B (zh) * 2019-12-10 2022-04-05 Oppo广东移动通信有限公司 地铁运行方向的确定方法、装置、终端及存储介质


Also Published As

Publication number Publication date
JP2023543250A (ja) 2023-10-13
CN112188037A (zh) 2021-01-05
US20230362317A1 (en) 2023-11-09
EP4221181A1 (en) 2023-08-02
CN112188037B (zh) 2023-03-24

Similar Documents

Publication Publication Date Title
JP6937809B2 (ja) 制約ベースの回転平滑化を介してデジタルビデオを安定化するためのシステムおよび方法
CN107801014B (zh) 一种全景视频防抖的方法、装置及便携式终端
WO2022063221A1 (zh) 生成陀螺仪旋转方向的方法及计算机设备
CN107241544B (zh) 视频稳像方法、装置及摄像终端
WO2018223381A1 (zh) 一种视频防抖方法及移动设备
CN111314604B (zh) 视频防抖方法和装置、电子设备、计算机可读存储介质
CN105635588B (zh) 一种稳像方法及装置
WO2016146001A1 (zh) 一种三维建模的方法及装置
TWI700000B (zh) 全景影片影像穩定方法及裝置與影像穩定演算法評估方法
CN108307118B (zh) 一种基于惯导参数流形优化的低延时视频稳像方法
CN113395454B (zh) 图像拍摄的防抖方法与装置、终端及可读存储介质
CN109561254B (zh) 一种全景视频防抖的方法、装置及便携式终端
CN109040525B (zh) 图像处理方法、装置、计算机可读介质及电子设备
WO2019000664A1 (zh) 一种信息处理方法及电子设备
WO2018121794A1 (zh) 一种控制方法、电子设备及存储介质
CN110602377B (zh) 一种视频稳像方法及装置
Dasari et al. A joint visual-inertial image registration for mobile HDR imaging
CN109389645B (zh) 相机自校准方法、***、相机、机器人及云端服务器
WO2020199198A1 (zh) 图像采集控制方法,图像采集控制设备和可移动平台
CN104796595B (zh) 图像处理方法和电子设备
CN111345023B (zh) 图像消抖方法、装置、终端及计算机可读存储介质
CN116170689A (zh) 视频生成方法、装置、计算机设备和存储介质
CN113014823A (zh) 摄像装置的防抖处理方法、***、设备及存储介质
CN117221725A (zh) 一种视频运动平滑处理方法、装置、设备和存储介质
TW201424366A (zh) 滾動快門的校正方法與影像處理裝置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21871590

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023519149

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2021871590

Country of ref document: EP

Effective date: 20230424