CN113160338A - AR/VR virtual reality fusion studio character space positioning - Google Patents

AR/VR virtual reality fusion studio character space positioning

Info

Publication number
CN113160338A
Authority
CN
China
Prior art keywords: module, unit, camera, studio, output end
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110541254.4A
Other languages
Chinese (zh)
Inventor
杨凯雪
张鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vision Technology Shenzhen Co ltd
Original Assignee
Vision Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vision Technology Shenzhen Co ltd
Priority to CN202110541254.4A
Publication of CN113160338A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30244: Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to the technical field of AR/VR studios, and in particular to character space positioning for an AR/VR virtual reality fusion studio. The system comprises a real camera (foreground), a digital video delay module, an image synthesis module (color key control), a direct broadcast or storage module, a camera position analysis and control module, a virtual camera module, a CCU system, a tracking unit module, a main control workstation unit, a graphic workstation unit, a color key unit, a switching panel unit, a time delay module and a mechanical sensor. The invention overcomes the defects of the prior art: with the graphic workstation unit and the mechanical sensor in place, the camera needs no modification and no lens calibration, which eases the camera operator's work, and the camera can move without a rail. One tracker can serve several cameras at once, giving the cameras a sufficient range of motion and solving the problem that a limited camera range of motion restricts the actors' range of motion against the blue background.

Description

AR/VR virtual reality fusion studio character space positioning
Technical Field
The invention relates to the technical field of AR/VR studios, in particular to a character space positioning method for an AR/VR virtual reality fusion studio.
Background
AR/VR is a human-computer interaction technology that uses cameras, sensors, and real-time computation and matching to superimpose the real environment and virtual objects onto the same picture or space, so that both exist there simultaneously. Through a virtual reality system the user can feel the "being there" realism of the objective physical world, and can also break through objective limits such as space and time to experience, first-hand, things that cannot be experienced in the real world.
A virtual studio system applies camera tracking to obtain the real camera's data and combines it with a computer-generated background. Because the background is rendered from the lens parameters of the real shot, it stays fully consistent with the three-dimensional perspective of the actors, avoiding an unreal or unnatural look. Since the background is mostly computer-generated and can change rapidly, rich and varied studio scenes become possible at very low cost, which is why the technique continues to attract growing attention from program producers and related personnel.
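The perspective-matching idea above can be sketched as follows: build a pinhole projection from the tracked camera's data, and render the virtual background through that same projection so it agrees with the live footage. The function names and numeric camera parameters below are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def projection_from_tracking(focal_px, cx, cy, R, t):
    """Build a 3x4 projection matrix K[R|t] from tracked camera data.

    focal_px, cx, cy come from the real camera's lens/sensor parameters;
    R (3x3 rotation) and t (3-vector) come from the camera tracker.
    """
    K = np.array([[focal_px, 0.0, cx],
                  [0.0, focal_px, cy],
                  [0.0, 0.0, 1.0]])
    Rt = np.hstack([R, t.reshape(3, 1)])
    return K @ Rt

def project(P, X):
    """Project a 3D world point X into pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Virtual background geometry rendered with the SAME projection as the
# real camera keeps its perspective consistent with the live actors.
R = np.eye(3)        # camera looking down +Z, no rotation (toy pose)
t = np.zeros(3)
P = projection_from_tracking(1000.0, 960.0, 540.0, R, t)
center = project(P, np.array([0.0, 0.0, 5.0]))  # point on the optical axis
```

A point on the optical axis lands at the principal point (960, 540), confirming the projection is consistent; any virtual geometry projected through `P` will share the real camera's perspective.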
Existing virtual reality fusion studios suffer from several problems: environmental reflections cast blue interference components onto foreground objects and actors, and interfere even more readily with transparent or translucent objects; and the camera's range of motion is insufficient, which in turn limits the actors' range of motion against the blue background.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a character space positioning method for an AR/VR virtual reality fusion studio. With a simple structural design, it overcomes those defects and effectively solves the problems that environmental reflections in such studios cast blue interference components onto foreground objects and actors (and more readily onto transparent or translucent objects), and that an insufficient camera range of motion limits the actors' range of motion against the blue background.
In order to solve the technical problems, the invention provides the following technical scheme:
an AR/VR virtual reality fusion studio character space positioning system comprises a real camera (foreground), a digital video time delay module, an image synthesis module (color key control), a direct broadcasting or storage module, a camera position analysis and control module, a virtual camera module, a CCU system, a tracking unit module, a main control workstation unit, a graphic workstation unit, a color key unit, a switching panel unit and a time delay module.
As a preferred embodiment of the present invention, the virtual camera module includes a graphic computer (background) and a material library (background).
As a preferred technical solution of the present invention, the output end of the real camera (foreground) is unidirectionally and electrically connected to the input end of the digital video delay module, the output end of the real camera (foreground) is unidirectionally and electrically connected to the input end of the camera position analyzing and controlling module, the output end of the digital video delay module is unidirectionally and electrically connected to the input end of the image synthesizing module (color key control), the output end of the image synthesizing module (color key control) is unidirectionally and electrically connected to the input end of the direct broadcasting or storing module, and the output end of the virtual camera module is unidirectionally and electrically connected to the input end of the image synthesizing module (color key control).
As a preferred technical solution of the present invention, the output end of the real camera (foreground) is unidirectionally and electrically connected to the input end of the CCU system, the output end of the real camera (foreground) is unidirectionally and electrically connected to the input end of the tracking unit, the output end of the CCU system is unidirectionally and electrically connected to the input end of the switching panel unit, the output end of the switching panel unit is unidirectionally and electrically connected to the input end of the time delay module, the output end of the time delay module is unidirectionally and electrically connected to the input end of the color key unit, the output end of the tracking unit is unidirectionally and electrically connected to the input end of the main control workstation unit, the output end of the main control workstation unit is unidirectionally and electrically connected to the input end of the graphic workstation unit, and the output end of the graphic workstation unit is unidirectionally and electrically connected to the input end of the color key unit.
As a preferred technical solution of the present invention, the graphic workstation unit uses a dark blue background plate carrying a light blue grid pattern. During shooting the system locates and tracks the grid: by computing the luminance signals of many pixels in each picture, the motion and scale changes of the picture are obtained, and these measurements, together with the motion parameters, are used to set up simultaneous equations over a subset of pixels, a pattern recognition technique.
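The idea of solving simultaneous equations over a subset of pixels can be sketched as a standard least-squares motion estimate from luminance gradients. This is a generic Lucas-Kanade-style formulation chosen for illustration; the motion model (pure pan plus isotropic zoom) and all names are assumptions, not the patent's exact equations.

```python
import numpy as np

def estimate_pan_zoom(Ix, Iy, It, xs, ys):
    """Least-squares estimate of image translation (tx, ty) and zoom s.

    Each sampled pixel contributes one brightness-constancy equation
    Ix*u + Iy*v + It = 0 under the parametric motion u = tx + s*x,
    v = ty + s*y, giving an overdetermined linear system in (tx, ty, s).

    Ix, Iy, It: spatial/temporal luminance gradients at the sampled pixels.
    xs, ys: pixel coordinates relative to the image centre.
    """
    A = np.stack([Ix, Iy, Ix * xs + Iy * ys], axis=1)  # one equation per pixel
    b = -It
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params  # (tx, ty, s)

# Synthetic check: generate gradients consistent with a known pan/zoom
# and confirm the least-squares solve recovers it exactly (no noise).
rng = np.random.default_rng(0)
xs = rng.uniform(-100, 100, 50)
ys = rng.uniform(-100, 100, 50)
Ix = rng.normal(size=50)
Iy = rng.normal(size=50)
true_tx, true_ty, true_s = 2.0, -1.0, 0.01
It = -(Ix * (true_tx + true_s * xs) + Iy * (true_ty + true_s * ys))
tx, ty, s = estimate_pan_zoom(Ix, Iy, It, xs, ys)
```

With 50 noise-free equations in 3 unknowns, the solve recovers the true pan/zoom; in practice the grid pattern supplies high-contrast gradients that keep the system well conditioned.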
As a preferred embodiment of the present invention, the color key unit assigns a corresponding Z-axis depth value to each pixel in the virtual scene using a pixel-level depth key technique.
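A minimal sketch of pixel-level depth keying, assuming per-pixel Z values are available for both the real and virtual layers: each output pixel shows whichever layer is nearer the camera. The API shown is hypothetical, since the patent names the technique but not an implementation.

```python
import numpy as np

def depth_key_composite(real_rgb, real_z, virt_rgb, virt_z):
    """Pixel-level depth keying: at every pixel, keep whichever layer
    is nearer to the camera (smaller Z), so actors can pass both in
    front of and behind virtual objects.
    """
    nearer = (real_z < virt_z)[..., None]   # broadcast over RGB channels
    return np.where(nearer, real_rgb, virt_rgb)

real_rgb = np.array([[[1.0, 0.0, 0.0], [1.0, 0.0, 0.0]]])  # red actor layer
virt_rgb = np.array([[[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]]])  # blue virtual layer
real_z = np.array([[1.0, 3.0]])   # actor depth per pixel
virt_z = np.array([[2.0, 2.0]])   # virtual-scene depth per pixel
out = depth_key_composite(real_rgb, real_z, virt_rgb, virt_z)
```

In this toy frame the actor occludes the virtual object at the first pixel (Z = 1 < 2) and is occluded by it at the second (Z = 3 > 2).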
The embodiment of the invention provides AR/VR virtual reality fusion studio character space positioning, which has the following beneficial effects:
1. By providing the graphic workstation unit and the mechanical sensor, the camera needs no modification and no lens calibration, which eases the camera operator's work, and the camera can move without a rail; one tracker can serve several cameras at once, so the cameras have a sufficient range of motion and the actors a large range of motion against the blue background, solving the problem that an insufficient camera range of motion limits the actors' movement.
2. By providing the color keyer, very complex blue-suppression control is applied to the foreground: weak blue components are suppressed without introducing color distortion into the foreground, solving the problem that environmental reflections in an AR/VR virtual reality fusion studio cast blue interference components onto foreground objects and the actors themselves.
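The blue-suppression behaviour described here, removing weak blue spill without distorting foreground colour, can be approximated by a common keyer heuristic: clamp the blue channel to max(R, G) per pixel. This stands in for the patent's unspecified "very complex" control and is only an illustrative sketch.

```python
import numpy as np

def suppress_blue_spill(rgb):
    """Clamp B to max(R, G) at every pixel: weak blue casts from set
    reflections are removed, while pixels whose blue already sits below
    that limit (i.e. non-spill colours) are left untouched.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    out = rgb.astype(float)
    out[..., 2] = np.minimum(b, np.maximum(r, g))
    return out

pixels = np.array([[0.8, 0.7, 0.9],    # blue spill: B exceeds max(R, G)
                   [0.2, 0.3, 0.1]])   # no spill: unchanged
cleaned = suppress_blue_spill(pixels)
```

The spill pixel's blue is pulled down to 0.8 (its red level) while the clean pixel passes through unmodified, matching the stated goal of suppressing weak blue components without colour distortion elsewhere.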
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic view of the overall structure of the present invention;
FIG. 2 is a schematic diagram of the connection structure of each unit of AR/VR according to the present invention.
Detailed Description
The preferred embodiments of the present invention will be described in conjunction with the accompanying drawings, and it will be understood that they are described herein for the purpose of illustration and explanation and not limitation.
Example: as shown in figs. 1-2, an AR/VR virtual reality fusion studio character space positioning system includes a real camera (foreground), a digital video delay module, an image synthesis module (color key control), a direct broadcast or storage module, a camera position analysis and control module, a virtual camera module, a CCU system, a tracking unit module, a main control workstation unit, a graphic workstation unit, a color key unit, a switching panel unit, a time delay module, and a mechanical sensor module.
Wherein the virtual camera module comprises a graphics computer (background) and a material library (background).
The output end of the real camera (foreground) is in one-way electric connection with the input end of the digital video delay module, the output end of the real camera (foreground) is in one-way electric connection with the input end of the camera position analysis and control module, the output end of the digital video delay module is in one-way electric connection with the input end of the image synthesis module (color key control), the output end of the image synthesis module (color key control) is in one-way electric connection with the input end of the direct broadcasting or storage module, and the output end of the virtual camera module is in one-way electric connection with the input end of the image synthesis module (color key control).
The output end of the real camera (foreground) is in one-way electric connection with the input end of the CCU system, the output end of the real camera (foreground) is in one-way electric connection with the input end of the tracking unit, the output end of the CCU system is in one-way electric connection with the input end of the switching panel unit, the output end of the switching panel unit is in one-way electric connection with the input end of the time delay module, the output end of the time delay module is in one-way electric connection with the input end of the color key unit, the output end of the tracking unit is in one-way electric connection with the input end of the main control workstation unit, the output end of the main control workstation unit is in one-way electric connection with the input end of the graphic workstation unit, and the output end of the graphic workstation unit is in one-way electric connection with the input end of the color key unit.
The graphic workstation unit uses a dark blue background plate carrying a light blue grid pattern. During shooting the system locates and tracks the grid: by computing the luminance signals of many pixels in each picture, the motion and scale changes of the picture are obtained, and these measurements, together with the motion parameters, are used to set up simultaneous equations over a subset of pixels, a pattern recognition technique.
The main control workstation controls the graphic workstation, the grid recognition equipment, the installed positioning software and so on; the workstation completes the positioning of the camera.
The color key unit assigns a corresponding Z-axis depth value to each pixel in the virtual scene using a pixel-level depth key technique.
The working principle is as follows: with the graphic workstation unit and the mechanical sensor, the camera needs no modification and no lens calibration, which eases the camera operator's work, and the camera can move without a rail. One tracker can serve several cameras at once, so the cameras have a sufficient range of motion and the actors a large range of motion against the blue background. The color keyer applies very complex blue-suppression control to the foreground, suppressing weak blue components without distorting the foreground's colors. The mechanical sensor added to the system compensates for the shortcomings of the pattern recognition approach and measures the camera's motion parameters very accurately; camera motion is therefore unrestricted and the actor has great freedom of movement within the blue set.
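How the mechanical sensor compensates the optical grid estimate is not specified in the text; a simple weighted fusion of the two readings of the same camera parameter illustrates the idea. The function name and the weight are assumptions for illustration only.

```python
import numpy as np

def fuse_estimates(optical, mechanical, w_optical=0.7):
    """Blend the grid-based optical estimate with the mechanical sensor
    reading of the same camera parameter (e.g. a pan angle in degrees).
    A plain weighted average stands in for whatever fusion the system
    actually performs; the 0.7 weight is an arbitrary assumption.
    """
    optical = np.asarray(optical, dtype=float)
    mechanical = np.asarray(mechanical, dtype=float)
    return w_optical * optical + (1.0 - w_optical) * mechanical

# When the optical track briefly degrades (e.g. grid occluded by the
# actor), the weight can be shifted toward the mechanical reading.
fused_pan = fuse_estimates(10.0, 20.0)
```

In practice such fusion would run per frame and per parameter (pan, tilt, zoom), with the weight driven by the optical tracker's confidence.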
Finally, it should be noted that: in the description of the present invention, it should be noted that the terms "vertical", "upper", "lower", "horizontal", and the like indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience of describing the present invention and simplifying the description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the present invention.
In the description of the present invention, it should also be noted that, unless otherwise explicitly specified or limited, the terms "disposed," "mounted," "connected," and "connected" are to be construed broadly and may, for example, be fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
Although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that changes may be made in the embodiments and/or equivalents thereof without departing from the spirit and scope of the invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (7)

1. An AR/VR virtual reality fusion studio character space positioning system comprises a real camera (foreground), a digital video time delay module, an image synthesis module (color key control), a direct broadcasting or storage module, a camera position analysis and control module, a virtual camera module, a CCU system, a tracking unit module, a main control workstation unit, a graphic workstation unit, a color key unit, a switching panel unit, a time delay module and a mechanical sensor module.
2. The AR/VR virtual reality fusion studio character space positioning of claim 1, wherein the virtual camera module comprises a graphics computer (background) and a material library (background).
3. The AR/VR virtual reality fusion studio character spatial orientation of claim 1, wherein the output of the real camera (foreground) is in one-way electrical connection with the input of the digital video delay module, the output of the real camera (foreground) is in one-way electrical connection with the input of the camera position analysis and control module, the output of the digital video delay module is in one-way electrical connection with the input of the image synthesis module (color key control), the output of the image synthesis module (color key control) is in one-way electrical connection with the input of the direct broadcast or storage module, and the output of the virtual camera module is in one-way electrical connection with the input of the image synthesis module (color key control).
4. The AR/VR virtual reality fusion studio character space positioning of claim 1, characterized in that the output end of the real camera (foreground) is in one-way electrical connection with the input end of the CCU system, the output end of the real camera (foreground) is in one-way electrical connection with the input end of the tracking unit, the output end of the CCU system is in one-way electrical connection with the input end of the switching panel unit, the output end of the switching panel unit is in one-way electrical connection with the input end of the time delay module, the output end of the time delay module is in one-way electrical connection with the input end of the color key unit, the output end of the tracking unit is in one-way electrical connection with the input end of the main control workstation unit, the output end of the main control workstation unit is in one-way electrical connection with the input end of the graphic workstation unit, and the output end of the graphic workstation unit is in one-way electrical connection with the input end of the color key unit.
5. The AR/VR virtual reality fusion studio character space positioning of claim 1, wherein the graphic workstation unit uses a dark blue background board with a light blue grid pattern; during shooting the system locates and tracks the grid, obtains the motion and scale changes of the picture by computing the luminance signal of many pixels in each picture, and uses these measurements and the motion parameters to set up simultaneous equations over the subset of pixels as a pattern recognition technique.
6. The AR/VR virtual reality fusion studio character space positioning of claim 1, wherein the main control workstation controls the graphics workstation, the grid recognition equipment, the installed positioning software and the like, and the workstation completes the positioning of the cameras.
7. The AR/VR virtual reality fusion studio character space positioning of claim 1, wherein the color key unit assigns a corresponding Z-axis depth value to each pixel in the virtual scene using a pixel-level depth key technique.
CN202110541254.4A 2021-05-18 2021-05-18 AR/VR virtual reality fusion studio character space positioning Pending CN113160338A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110541254.4A CN113160338A (en) 2021-05-18 2021-05-18 AR/VR virtual reality fusion studio character space positioning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110541254.4A CN113160338A (en) 2021-05-18 2021-05-18 AR/VR virtual reality fusion studio character space positioning

Publications (1)

Publication Number Publication Date
CN113160338A 2021-07-23

Family

ID=76876650

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110541254.4A Pending CN113160338A (en) 2021-05-18 2021-05-18 AR/VR virtual reality fusion studio character space positioning

Country Status (1)

Country Link
CN (1) CN113160338A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116600063A (en) * 2023-05-11 2023-08-15 深圳市天擎数字有限责任公司 Virtual reality fuses studio space positioning system and device system

Citations (8)

Publication number Priority date Publication date Assignee Title
CN1477856A (en) * 2002-08-21 2004-02-25 北京新奥特集团 True three-dimensional virtual studio system and its implement method
CN104836938A (en) * 2015-04-30 2015-08-12 江苏卡罗卡国际动漫城有限公司 Virtual studio system based on AR technology
CN105072314A (en) * 2015-08-13 2015-11-18 黄喜荣 Virtual studio implementation method capable of automatically tracking objects
CN105959513A (en) * 2016-06-06 2016-09-21 杭州同步科技有限公司 True three-dimensional virtual studio system and realization method thereof
CN106210453A (en) * 2016-08-09 2016-12-07 安徽喜悦信息科技有限公司 A kind of intelligent virtual studio system
CN106657719A (en) * 2017-01-04 2017-05-10 海南大学 Intelligent virtual studio system
US20190012801A1 (en) * 2017-07-07 2019-01-10 GameFace Labs Inc. Systems and methods for position and pose determination and tracking
CN109688343A (en) * 2017-10-18 2019-04-26 深圳市掌网科技股份有限公司 The implementation method and device of augmented reality studio


Non-Patent Citations (2)

Title
中国教育电视协会 (China Education Television Association), ed.: "Reform and Development of China's Educational Television", vol. 1, 31 October 2001, Soochow University Press (苏州大学出版社), pages 389-395 *
陈健森: "A Brief Discussion of Virtual Studio Technology and Construction Examples", Radio & TV Broadcast Engineering (广播与电视技术), no. 07 *


Similar Documents

Publication Publication Date Title
CN112040092B (en) Real-time virtual scene LED shooting system and method
CN111698390B (en) Virtual camera control method and device, and virtual studio implementation method and system
CN108525298B (en) Image processing method, image processing device, storage medium and electronic equipment
Matsuyama et al. 3D video and its applications
CN107341832B (en) Multi-view switching shooting system and method based on infrared positioning system
US20060165310A1 (en) Method and apparatus for a virtual scene previewing system
US11681834B2 (en) Test cell presence system and methods of visualizing a test environment
CN110691175B (en) Video processing method and device for simulating motion tracking of camera in studio
CN110866978A (en) Camera synchronization method in real-time mixed reality video shooting
CN114051129A (en) Film virtualization production system and method based on LED background wall
CN113240700B (en) Image processing method and device, computer readable storage medium and electronic equipment
CN112435558A (en) Holographic 3D intelligent interactive digital virtual sand table and interactive method thereof
CN112446939A (en) Three-dimensional model dynamic rendering method and device, electronic equipment and storage medium
EP4111677B1 (en) Multi-source image data synchronization
CN115118880A (en) XR virtual shooting system based on immersive video terminal is built
US11615755B1 (en) Increasing resolution and luminance of a display
CN113692734A (en) System and method for acquiring and projecting images, and use of such a system
CN113160338A (en) AR/VR virtual reality fusion studio character space positioning
CN208506731U (en) Image display systems
KR101273531B1 (en) Between Real image and CG Composed Animation authoring method and system by using motion controlled camera
CN112887514A (en) Real-time visualization method and system suitable for virtual shooting and production of film and television
CN102118573B (en) Virtual sports system with increased virtuality and reality combination degree
CN114885147B (en) Fusion production and broadcast system and method
CN117974796A (en) XR augmented reality camera calibration method, device and system
CN109389538A (en) A kind of Intelligent campus management system based on AR technology

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination