CN113516761A - Optical illusion type naked eye 3D content manufacturing method and device - Google Patents


Info

Publication number: CN113516761A
Authority: CN (China)
Prior art keywords: scene, screen, content, camera, shooting
Legal status: Granted; currently Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CN202110898708.3A
Other languages: Chinese (zh)
Other versions: CN113516761B
Inventors: 肖田, 吴懿, 魏宁, 陈硕宏, 宋振华, 于路, 盛文辉
Current Assignee: Four Open Garden Network Technology Guangzhou Co., Ltd. (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original Assignee: Four Open Garden Network Technology Guangzhou Co., Ltd.
Application filed by Four Open Garden Network Technology Guangzhou Co., Ltd.
Priority to CN202110898708.3A
Publication of CN113516761A
Application granted; publication of CN113516761B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the application provides an optical illusion type naked eye 3D content production method and device. The method comprises the following steps: acquiring basic image perspective information and elevation angle information of the screen display scene, and performing inverse camera-parameter calculation and scene simulation for the viewing site; performing content design based on the real scene structure, creatively associating it with the shape of the on-site LED screen, carrying out CG scene production and preliminary in-scene checks, and performing live-shoot camera simulation, shooting and compositing with the CG content; matching the resolution of the scene under the CG camera to the resolution required by the LED screen and rendering the output, unwrapping the output video sequence onto the LED screen model in the three-dimensional scene by baking in the CG software, and obtaining the final video of the 3D optical illusion screen in the file format required by the screen. The application can effectively ensure a uniform viewing effect and a correct perspective relation during multi-screen simulcasting.

Description

Optical illusion type naked eye 3D content manufacturing method and device
Technical Field
The application relates to the technical field of vision, and in particular to a method and a device for producing optical illusion type naked eye 3D content.
Background
The visual system is the most important sensory pathway for humans; approximately 80% of external information enters the human brain through it. Sometimes, however, the visual system deceives the brain, producing interesting phenomena known as "visual illusions". Visual illusion art refers to art in which artists create pictures by exploiting the principles of visual illusion. 3D street art has existed since the 16th century: a static 3D scene is created through 2D painting, and the intended effect can only be observed from a specific angle.
In recent years, thanks to the development of LED display terminals and ultra-high-definition content production technologies, a batch of highly realistic naked eye 3D content has been shown on outdoor ultra-high-definition large screens. This content belongs to the category of visual illusion art: the originally static illusion 3D picture becomes a dynamic naked eye 3D video with striking visual impact, and such screens have become phenomenon-level urban attractions. With the rollout of the "hundred cities, thousand screens" project of the central radio and television station and the demand from advertisers and audiences for high-quality content on offline outdoor large screens, optical illusion type naked eye 3D content suitable for showcasing popular content and advertised products will gradually increase.
Such optical illusion naked eye 3D content is either produced entirely in computer graphics (CG) or produced in CG and composited with live-shot footage, and its production currently faces three problems. First, there is no unified, standardized production workflow. Second, because different outdoor LED screens differ in shape and in on-site viewing angles, a uniform viewing effect cannot be guaranteed during multi-screen simulcasting. Third, for naked eye 3D images composited with live-shot content, a correct perspective relationship cannot be guaranteed when viewed on an outdoor LED screen.
Disclosure of Invention
Aiming at the problems in the prior art, the application provides an optical illusion type naked eye 3D content production method and device, which can effectively guarantee a uniform viewing effect and a correct perspective relation during multi-screen simulcasting.
In order to solve at least one of the above problems, the present application provides the following technical solutions:
In a first aspect, the present application provides a method for producing optical illusion type naked eye 3D content, including:
acquiring basic image perspective information and elevation angle information of the screen display scene, and performing inverse camera-parameter calculation and scene simulation for the viewing site;
performing content design based on the real scene structure and creatively associating it with the shape of the on-site LED screen, carrying out CG scene production and preliminary in-scene checks, and performing live-shoot camera simulation, shooting and compositing with the CG content;
matching the resolution of the scene under the CG camera to the resolution required by the LED screen and rendering the output, unwrapping the output video sequence onto the LED screen model in the three-dimensional scene by baking in the CG software, and obtaining the final video of the 3D optical illusion screen in the file format required by the screen.
Further, the performing of inverse camera-parameter calculation and scene simulation for the viewing site includes:
performing inverse calculation and simulation of the camera angle from the data information, image information and screen parameters collected at the real scene, and computing the camera parameters with a perspective formula;
building a simple model of the real scene, locating evenly distributed test points on the model screen, placing basic calibration geometry at each point, and simulating the fan-shaped viewing angle obtained from the field survey with cameras in a matrix arrangement, to obtain the perspective deformation effect of the screen content.
Further, the performing of CG scene production and preliminary in-scene checks includes:
completing model building, scene assembly, material and texture painting, animation, lighting and render output in the CG software according to the design scheme and production requirements, to obtain a CG scene that meets those requirements;
performing a preliminary check of the CG scene within the observation area and adjusting any structure that shows stretching deformation in that area.
Further, the live-shoot camera simulation, shooting and compositing with CG content includes:
simulating the performance of the real actor with a low-polygon three-dimensional character model, adding calibration geometry to the scene as background, dividing the character and background models into numbered regions, positioning the three-dimensional spatial relationship between the main element model and the virtual scene according to the optimal viewing point, and, taking into account the actual shooting location and the shooting equipment parameters, obtaining preliminary live-shoot camera position parameters by scaling the camera position in the CG scene, moving the CG scene or changing the shooting angle;
building the live-shoot stage according to the camera parameters and the on-site shooting conditions, and arranging lighting that matches the light sources of the three-dimensional scene;
keying out the actor performance, compositing it into the three-dimensional scene and correcting the colors.
In a second aspect, the present application provides an optical illusion type naked eye 3D content production device, comprising:
a restoration simulation module, used to acquire basic image perspective information and elevation angle information of the screen display scene and to perform inverse camera-parameter calculation and scene simulation for the viewing site;
a content detection module, used to perform content design based on the real scene structure and creatively associate it with the shape of the on-site LED screen, carry out CG scene production and preliminary in-scene checks, and perform live-shoot camera simulation, shooting and compositing with the CG content;
a content synthesis module, used to match the resolution of the scene under the CG camera to the resolution required by the LED screen and render the output, unwrap the output video sequence onto the LED screen model in the three-dimensional scene by baking in the CG software, and obtain the final video of the 3D optical illusion screen in the file format required by the screen.
Further, the restoration simulation module includes:
a camera parameter determining unit, used to perform inverse calculation and simulation of the camera angle from the data information, image information and screen parameters collected at the real scene, and to compute the camera parameters with a perspective formula;
a perspective deformation effect determining unit, used to build a simple model of the real scene, locate evenly distributed test points on the model screen, place basic calibration geometry at each point, and simulate the fan-shaped viewing angle obtained from the field survey with cameras in a matrix arrangement, to obtain the perspective deformation effect of the screen content.
Further, the content detection module includes:
a CG scene production unit, used to complete model building, scene assembly, material and texture painting, animation, lighting and render output in the CG software according to the design scheme and production requirements, to obtain a CG scene that meets those requirements;
a structure adjusting unit, used to perform a preliminary check of the CG scene within the observation area and adjust any structure that shows stretching deformation in that area.
Further, the content synthesis module includes:
a camera position parameter determining unit, used to simulate the performance of the real actor with a low-polygon three-dimensional character model, add calibration geometry to the scene as background, divide the character and background models into numbered regions, position the three-dimensional spatial relationship between the main element model and the virtual scene according to the optimal viewing point, and, taking into account the actual shooting location and the shooting equipment parameters, obtain preliminary live-shoot camera position parameters by scaling the camera position in the CG scene, moving the CG scene or changing the shooting angle;
a scene building unit, used to build the live-shoot stage according to the camera parameters and the on-site shooting conditions and to arrange lighting that matches the light sources of the three-dimensional scene;
a synthesis unit, used to key out the actor performance, composite it into the three-dimensional scene and correct the colors.
In a third aspect, the present application provides an electronic device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor executes the program to implement the steps of the optical illusion-like naked-eye 3D content making method.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the steps of the optical illusion-like naked-eye 3D content production method.
According to the above technical scheme, by establishing a unified, standardized production workflow, the optical illusion naked eye 3D content production method and device can effectively guarantee a uniform viewing effect and a correct perspective relation during multi-screen simulcasting.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of a method for making optical illusion-like naked-eye 3D content according to an embodiment of the present application;
fig. 2 is a second schematic flow chart of the optical illusion-like naked-eye 3D content manufacturing method in the embodiment of the present application;
fig. 3 is a third schematic flow chart of a method for making optical illusion-like naked-eye 3D content according to an embodiment of the present application;
fig. 4 is a fourth schematic flow chart of the optical illusion-like naked-eye 3D content manufacturing method in the embodiment of the present application;
fig. 5 is one of the structural diagrams of the optical illusion-like naked-eye 3D content creating apparatus in the embodiment of the present application;
fig. 6 is a second configuration diagram of the optical illusion-based naked-eye 3D content creation apparatus according to the embodiment of the present application;
fig. 7 is a third structural diagram of an optical illusion-like naked-eye 3D content creating device in the embodiment of the present application;
fig. 8 is a fourth configuration diagram of an optical illusion-like naked-eye 3D content creating apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Considering that such optical illusion naked eye 3D content is either produced entirely in computer graphics (CG) or produced in CG and composited with live-shot footage, its production currently faces three problems. First, there is no unified, standardized production workflow. Second, because different outdoor LED screens differ in shape and in on-site viewing angles, a uniform viewing effect cannot be guaranteed during multi-screen simulcasting. Third, for naked eye 3D pictures composited with live-shot content, a correct perspective relationship cannot be guaranteed when viewed on an outdoor LED screen.
In order to effectively ensure a uniform viewing effect and a correct perspective relation during multi-screen simulcasting, the application provides an embodiment of an optical illusion type naked eye 3D content production method. Referring to fig. 1, the method specifically includes the following contents:
step S101: acquiring basic image perspective information and elevation angle information of a screen display scene, and performing camera parameter inverse summation scene simulation on a viewing site;
specifically, the optical illusion naked eye 3D content needs to be observed at a specific angle to obtain the corresponding effect. Firstly, the condition of a screen display scene needs to be analyzed, and a sector-shaped watching interval with good visual effect and uniformity is obtained through a field investigation and a real-scene shooting mode. Factors to be considered in the actual survey include the main screen structure and the surrounding environment structure of the real-scene, the viewing time, the height between the viewing equipment and the people, the street direction, the direction of the people flow or the traffic flow, and the like. Based on these factors, the optimal viewing point is determined, and basic perspective information, elevation information and the like of the image are obtained in the process. For the multi-screen playing situation, the unified viewing angle of the multiple screens needs to be considered, the launched screen is planned, and the situation that the deviation of the multi-screen viewing angle is large cannot occur.
Step S102: performing content design based on the real scene structure and creatively associating it with the shape of the on-site LED screen, carrying out CG scene production and preliminary in-scene checks, and performing live-shoot camera simulation, shooting and compositing with the CG content;
specifically, content is created based on the real scene construction according to the conclusions obtained in the steps. The design content is creatively associated with the shape of the live-action LED, so that the picture display content can be linked with or even fused with the real environment, the correct visual perspective relation can be ensured, and the reality and the telepresence of the visual illusion 3D effect are improved. To achieve the "out-of-screen" optical illusion 3D effect, the picture "whiteout" is done appropriately based on the real scene. The "white space" may be a part of the real scene in the daytime or may be "empty black" in the night scene. The specific position and size of the three-dimensional interactive model are determined according to the motion trail and the perspective change requirement of the creation main body, and the interaction effect of the creation main body and the real environment is achieved.
Step S103: matching the resolution of the scene under the CG camera to the resolution required by the LED screen and rendering the output, unwrapping the output video sequence onto the LED screen model in the three-dimensional scene by baking in the CG software, and obtaining the final video of the 3D optical illusion screen in the file format required by the screen.
Specifically, after all post-production is completed, the resolution of the scene under the CG camera is matched to the resolution required by the LED screen, so that the content displayed on the screen inside the CG scene has sufficient resolution, and the render is then output. The output video sequence is unwrapped onto the LED screen model in the three-dimensional scene by baking in the CG software, and the final video of the 3D optical illusion screen is obtained in the file format required by the screen.
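The patent does not specify how the match is computed; as a rough sketch under the assumption that the screen's physical size and pixel pitch are known, the CG render only needs to give the virtual screen region at least as many pixels as the physical panel:

```python
import math

def required_render_width(screen_width_m, screen_height_m, pixel_pitch_mm,
                          screen_frac_of_frame=0.8):
    """Estimate the minimum CG render width (in pixels) so that the virtual LED
    screen region is not under-sampled. screen_frac_of_frame is the assumed
    fraction of the render width that the screen occupies in the CG camera view."""
    led_w = round(screen_width_m * 1000.0 / pixel_pitch_mm)   # physical panel pixels
    led_h = round(screen_height_m * 1000.0 / pixel_pitch_mm)
    render_w = math.ceil(led_w / screen_frac_of_frame)
    return {"led_resolution": (led_w, led_h), "min_render_width_px": render_w}

# Hypothetical example: a 40 m x 20 m screen with an 8 mm pixel pitch
# gives a 5000 x 2500 pixel panel.
print(required_render_width(40.0, 20.0, 8.0))
```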
Specifically, the finished content is played on site and inspected against the real scene to confirm whether the final viewing effect meets expectations, including whether the perspective proportions are correct within the optimal viewing range of each screen and whether obviously deformed or stretched elements appear. Any problems found are corrected until a satisfactory display effect is obtained.
From the above description, the optical illusion naked eye 3D content production method provided by the embodiment of the application, by establishing a unified, standardized production workflow, can effectively ensure a uniform viewing effect and a correct perspective relation during multi-screen simulcasting.
In an embodiment of the optical illusion type naked eye 3D content production method of the present application, referring to fig. 2, the method may further include the following:
Step S201: performing inverse calculation and simulation of the camera angle from the data information, image information and screen parameters collected at the real scene, and computing the camera parameters with a perspective formula;
Step S202: building a simple model of the real scene, locating evenly distributed test points on the model screen, placing basic calibration geometry at each point, and simulating the fan-shaped viewing angle obtained from the field survey with cameras in a matrix arrangement, to obtain the perspective deformation effect of the screen content and thus how the main content should be deployed within the screen.
(1) Inverse camera solution for the viewing scene
The three-dimensional space of the scene in which the LED is viewed can be restored by inversely solving for the camera of the device that photographed the site. The on-site LED shape, the camera lens parameters, the shooting angle and so on are restored in CG in order to align the real scene with the CG perspective space.
Specifically, the camera angle is inversely calculated and simulated in three-dimensional design software from the data information, image information and screen parameters collected on site; camera parameters close to reality are computed with a perspective formula and applied to obtain a visual output matched to the optimal viewing point in reality, so that the perspective relation of the video picture content stays consistent with the real scene.
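The perspective formula itself is not given in the patent; a common pinhole-camera relation that could serve the same purpose is sketched below, back-solving the reference camera's field of view and its distance from the screen using a single face-on photograph. All parameter names and example values are assumptions for illustration:

```python
import math

def backsolve_camera(screen_width_m, screen_px_in_photo, photo_width_px,
                     focal_length_mm, sensor_width_mm):
    """Pinhole-camera estimate of the horizontal field of view of the reference
    camera and its distance to the LED screen, from one photo in which the
    screen appears roughly face-on."""
    # Horizontal FOV from the lens focal length and sensor width.
    hfov = 2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm))
    # The screen's share of the photo width fixes the frame width at its depth:
    # screen_width_m / frame_width_at_depth == screen_px_in_photo / photo_width_px.
    frame_width_at_depth = screen_width_m * photo_width_px / screen_px_in_photo
    distance_m = (frame_width_at_depth / 2.0) / math.tan(hfov / 2.0)
    return {"hfov_deg": math.degrees(hfov), "distance_m": distance_m}

# Hypothetical example: a 40 m wide screen spanning 2400 of 6000 photo pixels,
# shot with a 35 mm lens on a 36 mm wide full-frame sensor.
print(backsolve_camera(40.0, 2400, 6000, 35.0, 36.0))
```

The recovered field of view and distance can then be applied to the CG camera so that its perspective matches the optimal viewing point.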
(2) Scene simulation
After the camera simulation is matched, a preliminary test is required. A simple model of the real scene is built, evenly distributed test points are located on the model screen, and basic calibration geometry (cubes and spheres) is placed at each point; each model is assigned a region and a numeric label. Based on cameras in a matrix arrangement that simulate the fan-shaped viewing angle obtained in the field survey, a single frame is rendered and baked back onto the three-dimensional screen of the CG scene, and the approximate perspective deformation of the screen content can then be obtained quickly and intuitively within the actual viewing-angle range established in the survey. This step is repeated for a multi-screen simulcast. Its purpose is to avoid problems in advance during the later design stage: it yields the deployment range of the key picture content and eliminates areas where the image information would be obviously deformed by strong perspective stretching.
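As a sketch of what such a camera matrix might look like (the layout values below are illustrative assumptions, not figures from the patent), viewpoints can be generated across the surveyed fan-shaped sector at several distances, each aimed at the screen centre:

```python
import math

def fan_camera_positions(screen_center, fan_half_angle_deg, distances, n_angles):
    """Generate viewpoints arranged in a matrix over the surveyed fan-shaped
    viewing sector: n_angles directions x len(distances) rows, all looking at
    the screen centre. screen_center is (x, y, z); the sector opens along +y."""
    cx, cy, cz = screen_center
    eye_height = 1.6  # assumed viewer eye height in metres
    cams = []
    for i in range(n_angles):
        # Spread directions evenly from -half_angle to +half_angle.
        a = math.radians(-fan_half_angle_deg
                         + 2.0 * fan_half_angle_deg * i / (n_angles - 1))
        for d in distances:
            pos = (cx + d * math.sin(a), cy + d * math.cos(a), eye_height)
            cams.append({"position": pos, "look_at": (cx, cy, cz)})
    return cams

# Hypothetical example: a 30-degree half-angle sector sampled at 3 directions
# and 3 distances, for a screen centred 20 m above the origin.
for cam in fan_camera_positions((0.0, 0.0, 20.0), 30.0, [30.0, 60.0, 90.0], 3):
    print(cam)
```

Rendering the calibration geometry from each of these viewpoints and baking a frame back onto the screen model gives the quick deformation preview described above.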
In an embodiment of the optical illusion type naked eye 3D content production method of the present application, referring to fig. 3, the method may further include the following:
Step S301: completing model building, scene assembly, material and texture painting, animation, lighting and render output in the CG software according to the design scheme and production requirements, to obtain a CG scene that meets those requirements;
Step S302: performing a preliminary check of the CG scene within the observation area and adjusting any structure that shows stretching deformation in that area.
(1) CG scene production
This step is the same as ordinary CG production.
According to the design scheme and production requirements, model building, scene assembly, material and texture painting, animation, lighting, render output and other steps are completed in the CG software, finally yielding a CG scene that meets the production requirements.
(2) Preliminary in-scene check
A preliminary check of the CG scene is performed within the observation area obtained in step one, and any structure with obvious stretching deformation in the observation range is adjusted.
In an embodiment of the optical illusion type naked eye 3D content production method of the present application, referring to fig. 4, the method may further include the following:
Step S401: simulating the performance of the real actor with a low-polygon three-dimensional character model, adding calibration geometry to the scene as background, dividing the character and background models into numbered regions, positioning the three-dimensional spatial relationship between the main element model and the virtual scene according to the optimal viewing point, and, taking into account the actual shooting location and the shooting equipment parameters, obtaining preliminary live-shoot camera position parameters by scaling the camera position in the CG scene, moving the CG scene or changing the shooting angle;
Step S402: building the live-shoot stage according to the camera parameters and the on-site shooting conditions, and arranging lighting that matches the light sources of the three-dimensional scene;
Step S403: keying out the actor performance, compositing it into the three-dimensional scene and correcting the colors.
(1) Live-shoot camera simulation
The performance of the real actor is simulated with a low-polygon three-dimensional character model, calibration geometry is added to the scene as background, the character and background models are divided into numbered regions, and the three-dimensional spatial relationship between the main element model and the virtual scene is positioned according to the optimal viewing point. Taking the actual shooting location and the shooting equipment parameters into account, preliminary live-shoot camera position parameters are obtained by scaling the camera position in the CG scene, moving the CG scene, changing the shooting angle and so on. Because changing the camera position may change the perspective relationship, the calibration geometry is reused in this process: within the observation range obtained in step one, it is continually confirmed whether the perspective causes image deformation in the three-dimensional scene, and the scene and the visual effect are optimised and tuned to obtain a more uniform, higher-quality viewing experience within the optimal viewing range. Accurate position parameters of the live-shoot camera are finally obtained.
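Purely as an illustration of the kind of conversion involved (none of the values or names below come from the patent), a sketch that maps the CG camera's horizontal field of view onto a real lens focal length for a given sensor, together with a stand-off distance for the performer:

```python
import math

def match_real_camera(cg_hfov_deg, real_sensor_width_mm, subject_width_m,
                      frame_fill=0.7):
    """Translate a CG camera set-up into real shooting parameters: the focal
    length that reproduces the CG horizontal field of view on the given sensor,
    and the camera-to-subject distance at which the performer spans the chosen
    fraction of the frame width (frame_fill is an illustrative assumption)."""
    hfov = math.radians(cg_hfov_deg)
    focal_length_mm = (real_sensor_width_mm / 2.0) / math.tan(hfov / 2.0)
    frame_width_at_subject = subject_width_m / frame_fill
    distance_m = (frame_width_at_subject / 2.0) / math.tan(hfov / 2.0)
    return {"focal_length_mm": focal_length_mm, "distance_m": distance_m}

# Hypothetical example: a CG camera with a 40-degree horizontal FOV, a full-frame
# sensor (36 mm wide) and a 1.8 m wide performance area filling 70% of the frame.
print(match_real_camera(40.0, 36.0, 1.8))
```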
(2) Live-action content shooting
The live-shoot stage is built according to the camera parameters and the on-site shooting conditions obtained in the previous steps, and lighting matching the light sources of the three-dimensional scene is arranged, so that the light-and-shadow changes, colour expression and perspective proportions of the real performers stay unified with the digital scene, achieving a more real and natural shooting result.
(3) Compositing the CG and live-shot content
This step is the same as ordinary compositing of CG and live-action footage.
After shooting is finished, the actor performance is keyed out and separated in post-production, composited into the three-dimensional scene, and colour-corrected to ensure that the tones match.
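The compositing operation itself is standard; as an indication only, a minimal per-pixel alpha-over step that places the keyed performance over the rendered CG background, using NumPy arrays as an assumed image representation and a single gain factor standing in for the colour correction:

```python
import numpy as np

def alpha_over(fg_rgb, fg_alpha, bg_rgb, gain=1.0):
    """Composite a keyed foreground over a CG background.
    fg_rgb, bg_rgb: float arrays of shape (H, W, 3) with values in 0..1.
    fg_alpha: matte of shape (H, W) from the keyer, 0 = background, 1 = foreground.
    gain: crude global colour-correction factor applied to the foreground."""
    a = fg_alpha[..., None]                        # broadcast the matte over RGB
    return np.clip(fg_rgb * gain, 0.0, 1.0) * a + bg_rgb * (1.0 - a)

# Tiny synthetic example: a 2x2 foreground half-blended into the background.
fg = np.full((2, 2, 3), 0.8)
bg = np.full((2, 2, 3), 0.2)
matte = np.array([[1.0, 0.5], [0.0, 1.0]])
print(alpha_over(fg, matte, bg))
```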
In order to effectively ensure a uniform viewing effect and a correct perspective relation during multi-screen simulcasting, the present application provides an embodiment of an optical illusion naked eye 3D content production device that implements all or part of the optical illusion naked eye 3D content production method. Referring to fig. 5, the device specifically includes the following contents:
the restoration simulation module 10, used to acquire basic image perspective information and elevation angle information of the screen display scene and to perform inverse camera-parameter calculation and scene simulation for the viewing site;
the content detection module 20, used to perform content design based on the real scene structure and creatively associate it with the shape of the on-site LED screen, carry out CG scene production and preliminary in-scene checks, and perform live-shoot camera simulation, shooting and compositing with the CG content;
the content synthesis module 30, used to match the resolution of the scene under the CG camera to the resolution required by the LED screen and render the output, unwrap the output video sequence onto the LED screen model in the three-dimensional scene by baking in the CG software, and obtain the final video of the 3D optical illusion screen in the file format required by the screen.
From the above description, the optical illusion naked eye 3D content production device provided in the embodiment of the application, by establishing a unified, standardized production workflow, can effectively ensure a uniform viewing effect and a correct perspective relation during multi-screen simulcasting.
In an embodiment of the optical illusion type naked eye 3D content production device of the present application, referring to fig. 6, the restoration simulation module 10 includes:
the camera parameter determining unit 11, used to perform inverse calculation and simulation of the camera angle from the data information, image information and screen parameters collected at the real scene, and to compute the camera parameters with a perspective formula;
the perspective deformation effect determining unit 12, used to build a simple model of the real scene, locate evenly distributed test points on the model screen, place basic calibration geometry at each point, and simulate the fan-shaped viewing angle obtained from the field survey with cameras in a matrix arrangement, to obtain the perspective deformation effect of the screen content.
Specifically, a simple model of the real scene is built, evenly distributed test points are located on the model screen, basic calibration geometry (cubes and spheres) is placed at each point, each model is assigned a region and a numeric label, a single frame is rendered and baked back onto the three-dimensional screen of the CG scene based on cameras in a matrix arrangement simulating the fan-shaped viewing angle of the field survey, and the approximate perspective deformation of the screen content is then obtained quickly and intuitively within the actual viewing-angle range established in the survey. This step is repeated for a multi-screen simulcast; its purpose is to avoid problems in advance during the later design stage, to obtain the deployment range of the key picture content during design, and to eliminate areas where the image information would be obviously deformed by strong perspective stretching.
In an embodiment of the optical illusion type naked eye 3D content production device of the present application, referring to fig. 7, the content detection module 20 includes:
the CG scene production unit 21, used to complete model building, scene assembly, material and texture painting, animation, lighting and render output in the CG software according to the design scheme and production requirements, to obtain a CG scene that meets those requirements;
the structure adjusting unit 22, used to perform a preliminary check of the CG scene within the observation area and adjust any structure that shows stretching deformation in that area.
In an embodiment of the optical illusion type naked eye 3D content production device of the present application, referring to fig. 8, the content synthesis module 30 includes:
the camera position parameter determining unit 31, used to simulate the performance of the real actor with a low-polygon three-dimensional character model, add calibration geometry to the scene as background, divide the character and background models into numbered regions, position the three-dimensional spatial relationship between the main element model and the virtual scene according to the optimal viewing point, and, taking into account the actual shooting location and the shooting equipment parameters, obtain preliminary live-shoot camera position parameters by scaling the camera position in the CG scene, moving the CG scene or changing the shooting angle;
the scene building unit 32, used to build the live-shoot stage according to the camera parameters and the on-site shooting conditions and to arrange lighting that matches the light sources of the three-dimensional scene;
the synthesis unit 33, used to key out the actor performance, composite it into the three-dimensional scene and correct the colors.
In order to effectively ensure a uniform viewing effect and a correct perspective relation during multi-screen simulcasting, the present application provides an embodiment of an electronic device that implements all or part of the contents of the optical illusion type naked eye 3D content production method. The electronic device specifically includes the following contents:
a processor, a memory, a communication interface and a bus; the processor, the memory and the communication interface communicate with one another through the bus; the communication interface is used to transmit information between the optical illusion naked eye 3D content production device and related equipment such as a core service system, user terminals and related databases. The electronic device may be a desktop computer, a tablet computer, a mobile terminal and the like, but the embodiment is not limited thereto. In this embodiment, the electronic device may be implemented with reference to the embodiments of the optical illusion type naked eye 3D content production method and of the corresponding production device; their contents are incorporated herein and are not repeated.
It is understood that the user terminal may include a smart phone, a tablet electronic device, a network set-top box, a portable computer, a desktop computer, a personal digital assistant (PDA), an in-vehicle device, a smart wearable device, and the like. The smart wearable device may include smart glasses, a smart watch, a smart bracelet, and the like.
In practical applications, part of the optical illusion type naked eye 3D content production method may be executed on the electronic device side as described above, or all operations may be completed in the client device. The choice can be made according to the processing capability of the client device, the restrictions of the user's usage scenario, and the like; this is not limited by the present application. If all operations are performed in the client device, the client device may further include a processor.
The client device may have a communication module (i.e., a communication unit), and may be communicatively connected to a remote server to implement data transmission with the server. The server may include a server on the task scheduling center side, and in other implementation scenarios, the server may also include a server on an intermediate platform, for example, a server on a third-party server platform that is communicatively linked to the task scheduling center server. The server may include a single computer device, or may include a server cluster formed by a plurality of servers, or a server structure of a distributed apparatus.
Fig. 9 is a schematic block diagram of a system configuration of an electronic device 9600 according to an embodiment of the present application. As shown in fig. 9, the electronic device 9600 can include a central processor 9100 and a memory 9140; the memory 9140 is coupled to the central processor 9100. Notably, this fig. 9 is exemplary; other types of structures may also be used in addition to or in place of the structure to implement telecommunications or other functions.
In one embodiment, the optical illusion-like naked eye 3D content production method function may be integrated into the central processor 9100. The central processor 9100 may be configured to control as follows:
Step S101: acquiring basic image perspective information and elevation angle information of the screen display scene, and performing inverse camera-parameter calculation and scene simulation for the viewing site;
Step S102: performing content design based on the real scene structure and creatively associating it with the shape of the on-site LED screen, carrying out CG scene production and preliminary in-scene checks, and performing live-shoot camera simulation, shooting and compositing with the CG content;
Step S103: matching the resolution of the scene under the CG camera to the resolution required by the LED screen and rendering the output, unwrapping the output video sequence onto the LED screen model in the three-dimensional scene by baking in the CG software, and obtaining the final video of the 3D optical illusion screen in the file format required by the screen.
As can be seen from the above description, the electronic device provided in the embodiment of the present application, by establishing a unified, standardized production workflow, can effectively ensure a uniform viewing effect and a correct perspective relation during multi-screen simulcasting.
In another embodiment, the optical illusion-like naked eye 3D content creation device may be configured separately from the central processor 9100, for example, the optical illusion-like naked eye 3D content creation device may be configured as a chip connected to the central processor 9100, and the optical illusion-like naked eye 3D content creation method function is realized by the control of the central processor.
As shown in fig. 9, the electronic device 9600 may further include: a communication module 9110, an input unit 9120, an audio processor 9130, a display 9160, and a power supply 9170. It is noted that the electronic device 9600 also does not necessarily include all of the components shown in fig. 9; in addition, the electronic device 9600 may further include components not shown in fig. 9, which may be referred to in the prior art.
As shown in fig. 9, a central processor 9100, sometimes referred to as a controller or operational control, can include a microprocessor or other processor device and/or logic device, which central processor 9100 receives input and controls the operation of the various components of the electronic device 9600.
The memory 9140 can be, for example, one or more of a buffer, a flash memory, a hard drive, removable media, volatile memory, non-volatile memory, or other suitable devices. It can store the relevant information as well as the programs that process it, and the central processor 9100 can execute the programs stored in the memory 9140 to store or process the information.
The input unit 9120 provides input to the central processor 9100. The input unit 9120 is, for example, a key or a touch input device. Power supply 9170 is used to provide power to electronic device 9600. The display 9160 is used for displaying display objects such as images and characters. The display may be, for example, an LCD display, but is not limited thereto.
The memory 9140 can be a solid-state memory, for example a read-only memory (ROM), a random access memory (RAM), a SIM card, or the like. It may also be a memory that retains information when powered off, can be selectively erased, and can be rewritten with more data, an example of which is sometimes called an EPROM. The memory 9140 may also be some other type of device. The memory 9140 includes a buffer memory 9141 (sometimes referred to as a buffer), and may include an application/function storage portion 9142 for storing application programs and function programs or the flow of operations of the electronic device 9600 executed by the central processor 9100.
The memory 9140 can also include a data store 9143, the data store 9143 being used to store data, such as contacts, digital data, pictures, sounds, and/or any other data used by an electronic device. The driver storage portion 9144 of the memory 9140 may include various drivers for the electronic device for communication functions and/or for performing other functions of the electronic device (e.g., messaging applications, contact book applications, etc.).
The communication module 9110 is a transmitter/receiver 9110 that transmits and receives signals via an antenna 9111. The communication module (transmitter/receiver) 9110 is coupled to the central processor 9100 to provide input signals and receive output signals, which may be the same as in the case of a conventional mobile communication terminal.
Based on different communication technologies, a plurality of communication modules 9110, such as a cellular network module, a bluetooth module, and/or a wireless local area network module, may be provided in the same electronic device. The communication module (transmitter/receiver) 9110 is also coupled to a speaker 9131 and a microphone 9132 via an audio processor 9130 to provide audio output via the speaker 9131 and receive audio input from the microphone 9132, thereby implementing ordinary telecommunications functions. The audio processor 9130 may include any suitable buffers, decoders, amplifiers and so forth. In addition, the audio processor 9130 is also coupled to the central processor 9100, thereby enabling recording locally through the microphone 9132 and enabling locally stored sounds to be played through the speaker 9131.
An embodiment of the present application further provides a computer-readable storage medium capable of implementing all the steps of the optical illusion type naked eye 3D content production method of the above embodiment, whether the execution subject is a server or a client. The computer-readable storage medium stores a computer program which, when executed by a processor, implements all the steps of that method; for example, the processor implements the following steps when executing the computer program:
Step S101: acquiring basic image perspective information and elevation angle information of the screen display scene, and performing inverse camera-parameter calculation and scene simulation for the viewing site;
Step S102: performing content design based on the real scene structure and creatively associating it with the shape of the on-site LED screen, carrying out CG scene production and preliminary in-scene checks, and performing live-shoot camera simulation, shooting and compositing with the CG content;
Step S103: matching the resolution of the scene under the CG camera to the resolution required by the LED screen and rendering the output, unwrapping the output video sequence onto the LED screen model in the three-dimensional scene by baking in the CG software, and obtaining the final video of the 3D optical illusion screen in the file format required by the screen.
As can be seen from the above description, the computer-readable storage medium provided in the embodiment of the present application, by establishing a unified, standardized production workflow, can effectively ensure a uniform viewing effect and a correct perspective relation during multi-screen simulcasting.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The principle and the implementation mode of the invention are explained by applying specific embodiments in the invention, and the description of the embodiments is only used for helping to understand the method and the core idea of the invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (10)

1. An optical illusion naked eye 3D content production method, characterized by comprising the following steps:
acquiring basic image perspective information and elevation angle information of the screen display scene, and performing inverse camera-parameter calculation and scene simulation for the viewing site;
performing content design based on the real scene structure and creatively associating it with the shape of the on-site LED screen, carrying out CG scene production and preliminary in-scene checks, and performing live-shoot camera simulation, shooting and compositing with the CG content;
matching the resolution of the scene under the CG camera to the resolution required by the LED screen and rendering the output, unwrapping the output video sequence onto the LED screen model in the three-dimensional scene by baking in the CG software, and obtaining the final video of the 3D optical illusion screen in the file format required by the screen.
2. The optical illusion naked eye 3D content production method according to claim 1, wherein the performing of inverse camera-parameter calculation and scene simulation for the viewing site comprises:
performing inverse calculation and simulation of the camera angle from the data information, image information and screen parameters collected at the real scene, and computing the camera parameters with a perspective formula;
building a simple model of the real scene, locating evenly distributed test points on the model screen, placing basic calibration geometry at each point, and simulating the fan-shaped viewing angle obtained from the field survey with cameras in a matrix arrangement, to obtain the perspective deformation effect of the screen content.
3. The optical illusion naked eye 3D content production method according to claim 1, wherein the performing of CG scene production and preliminary in-scene checks comprises:
completing model building, scene assembly, material and texture painting, animation, lighting and render output in the CG software according to the design scheme and production requirements, to obtain a CG scene that meets those requirements;
performing a preliminary check of the CG scene within the observation area and adjusting any structure that shows stretching deformation in that area.
4. The optical illusion naked eye 3D content production method according to claim 1, wherein the live-shoot camera simulation, shooting and compositing with CG content comprise:
simulating the performance of the real actor with a low-polygon three-dimensional character model, adding calibration geometry to the scene as background, dividing the character and background models into numbered regions, positioning the three-dimensional spatial relationship between the main element model and the virtual scene according to the optimal viewing point, and, taking into account the actual shooting location and the shooting equipment parameters, obtaining preliminary live-shoot camera position parameters by scaling the camera position in the CG scene, moving the CG scene or changing the shooting angle;
building the live-shoot stage according to the camera parameters and the on-site shooting conditions, and arranging lighting that matches the light sources of the three-dimensional scene;
keying out the actor performance, compositing it into the three-dimensional scene and correcting the colors.
5. An optical illusion naked eye 3D content production device, characterized by comprising:
a restoration simulation module, used to acquire basic image perspective information and elevation angle information of the screen display scene and to perform inverse camera-parameter calculation and scene simulation for the viewing site;
a content detection module, used to perform content design based on the real scene structure and creatively associate it with the shape of the on-site LED screen, carry out CG scene production and preliminary in-scene checks, and perform live-shoot camera simulation, shooting and compositing with the CG content;
a content synthesis module, used to match the resolution of the scene under the CG camera to the resolution required by the LED screen and render the output, unwrap the output video sequence onto the LED screen model in the three-dimensional scene by baking in the CG software, and obtain the final video of the 3D optical illusion screen in the file format required by the screen.
6. The optical illusion naked eye 3D content production device according to claim 5, wherein the restoration simulation module comprises:
a camera parameter determining unit, used to perform inverse calculation and simulation of the camera angle from the data information, image information and screen parameters collected at the real scene, and to compute the camera parameters with a perspective formula;
a perspective deformation effect determining unit, used to build a simple model of the real scene, locate evenly distributed test points on the model screen, place basic calibration geometry at each point, and simulate the fan-shaped viewing angle obtained from the field survey with cameras in a matrix arrangement, to obtain the perspective deformation effect of the screen content.
7. The optical illusion-like naked eye 3D content production device according to claim 5, wherein the content detection module comprises:
a CG scene production unit, configured to complete model building, scene construction, material and texture-map painting, animation production, lighting setup and render output in CG production software according to the design scheme and the production requirements, so as to obtain a CG scene that meets the production requirements;
and a structure adjustment unit, configured to perform a preliminary check of the produced CG scene from within the viewing area and to adjust any structure that shows stretching distortion when seen from the viewing area.
8. The optical illusion-like naked-eye 3D content producing device according to claim 5, wherein the content synthesizing module includes:
a camera position parameter determination unit, configured to simulate the performance of a real character with a low-polygon three-dimensional character model, add basic calibration geometry to the scene as a background, divide the character and background three-dimensional models into regions marked with numbers, position and design the three-dimensional spatial relationship between the main element models and the virtual scene according to the optimal viewing point, and, taking into account the conditions of the actual shooting location and the parameters of the shooting equipment, obtain preliminary live-action camera position parameters by scaling the camera position in the CG scene, moving the CG scene, or changing the shooting angle;
a scene construction unit, configured to build the live-action screen stage according to the camera parameters and the on-site shooting conditions, and to arrange lighting matched to the light sources of the three-dimensional scene;
and a synthesis unit, configured to key out (matte) the character performance content, composite it into the three-dimensional scene, and perform color correction.
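As an illustrative sketch outside the claim language, a preliminary camera placement can be derived from the optimal viewing point and the screen model: the look direction points at the screen center, and the vertical field of view is the angle subtended by the screen's top and bottom edges from that point (plus a framing margin). The coordinates in the Python/NumPy sketch below are illustrative assumptions.

import numpy as np

def angle_between(a, b):
    # Angle between two 3D vectors, in radians.
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Illustrative coordinates in meters: eye-level viewpoint and screen top/bottom edges.
optimal_view_point = np.array([0.0, 1.6, 40.0])
screen_top = np.array([0.0, 22.0, 0.0])
screen_bottom = np.array([0.0, 6.0, 0.0])
screen_center = (screen_top + screen_bottom) / 2.0

look_dir = screen_center - optimal_view_point
look_dir = look_dir / np.linalg.norm(look_dir)

# Vertical field of view that exactly frames the screen from this viewpoint.
fov_v = angle_between(screen_top - optimal_view_point,
                      screen_bottom - optimal_view_point)

print("camera position:", optimal_view_point)
print("look direction:", np.round(look_dir, 3))
print(f"required vertical FOV ~= {np.degrees(fov_v):.1f} degrees (plus framing margin)")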
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the optical illusion-like naked-eye 3D content production method of any of claims 1 to 4 when executing the program.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method for producing an optical illusion-like naked-eye 3D content of any of claims 1 to 4.
CN202110898708.3A 2021-08-05 2021-08-05 Method and device for manufacturing naked eye 3D content with optical illusion Active CN113516761B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110898708.3A CN113516761B (en) 2021-08-05 2021-08-05 Method and device for manufacturing naked eye 3D content with optical illusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110898708.3A CN113516761B (en) 2021-08-05 2021-08-05 Method and device for manufacturing naked eye 3D content with optical illusion

Publications (2)

Publication Number Publication Date
CN113516761A (en) 2021-10-19
CN113516761B CN113516761B (en) 2024-02-27

Family

ID=78068957

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110898708.3A Active CN113516761B (en) 2021-08-05 2021-08-05 Method and device for manufacturing naked eye 3D content with optical illusion

Country Status (1)

Country Link
CN (1) CN113516761B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114663558A (en) * 2022-03-14 2022-06-24 南京青臣创意数字科技有限公司 Construction method and device for realizing double-sided naked eye 3D animation effect, electronic equipment and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001155177A (en) * 1999-11-25 2001-06-08 Sony Computer Entertainment Inc Entertainement device, image generating method, and storage medium
KR20110117487A (en) * 2010-04-21 2011-10-27 동서대학교산학협력단 Between real image and cg composed animation authoring method and system by using motion controlled camera
CN103426195A (en) * 2013-09-09 2013-12-04 天津常青藤文化传播有限公司 Method for generating three-dimensional virtual animation scenes watched through naked eyes
US20180068488A1 (en) * 2015-03-30 2018-03-08 Capcom Co., Ltd. Method of generating virtual three-dimensional space, video system, method of controlling video system, and non-transitory computer readable storage medium
CN112714302A (en) * 2019-10-25 2021-04-27 苏州苏大维格科技集团股份有限公司 Naked eye 3D image manufacturing method and device
CN111327886A (en) * 2020-03-05 2020-06-23 胡嘉君 3D light field rendering method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZIYAO WANG et al., "Naked Eye Pseudo 3D Display Technology Outside the Screen", Journal of Physics: Conference Series, vol. 1098, pages 1-9 *
ZHANG Yan et al., "Multi-viewpoint image correction algorithm based on camera calibration", Automation & Instrumentation, no. 07, pages 21-24 *

Also Published As

Publication number Publication date
CN113516761B (en) 2024-02-27

Similar Documents

Publication Publication Date Title
JP4481166B2 (en) Method and system enabling real-time mixing of composite and video images by a user
US20220264068A1 (en) Telepresence system and method
CN106730815B (en) Somatosensory interaction method and system easy to realize
CN107358659B (en) Multi-picture fusion display method based on 3D technology and storage device
US11488348B1 (en) Computing virtual screen imagery based on a stage environment, camera position, and/or camera settings
CN107134000B (en) Reality-fused three-dimensional dynamic image generation method and system
CN105282536A (en) Naked-eye 3D picture-text interaction method based on Unity3D engine
CN105069827A (en) Method for processing video transitions through three-dimensional model
US8979281B2 (en) System and method for imagination park tree projections
CN105007477A (en) Method for realizing naked eye 3D display based on Unity3D engine
JP2020526125A (en) Method and system for fusing user-specific content into video production
CN114035682A (en) Naked eye 3D interactive immersive virtual reality CAVE system
CN105578172B (en) Bore hole 3D image display methods based on Unity3D engines
CN105263038A (en) Method and apparatus for dynamic display of virtual auditorium
CN114035681A (en) 3D active stereo interactive immersive virtual reality CAVE system
CN102508398B (en) Method for performing ball screen projection processing on planar picture to be displayed by using computer
CN113516761A (en) Optical illusion type naked eye 3D content manufacturing method and device
CN111047711A (en) Immersive interactive Box image manufacturing method
CN109996060B (en) Virtual reality cinema system and information processing method
US20180213196A1 (en) Method of projection mapping
KR100799304B1 (en) A system and method for projecting high-resolution moving pictures under non-planar display environments
CN105263015A (en) Method for controlling glasses-free 3D video display based on Android device
CN102737567A (en) Multimedia orthographic projection digital model interactive integration system
JP2020530218A (en) How to project immersive audiovisual content
KR20230022153A (en) Single-image 3D photo with soft layering and depth-aware restoration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant