CN103136793A - Live-action fusion method based on augmented reality and device using the same - Google Patents


Info

Publication number
CN103136793A
CN103136793A, CN2011103961666A, CN201110396166A
Authority
CN
China
Prior art keywords
augmented reality
data
image processor
video
fusion method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2011103961666A
Other languages
Chinese (zh)
Inventor
张心宇
佟新鑫
赵刚
宫俊玲
杨光宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenyang Institute of Automation of CAS
Original Assignee
Shenyang Institute of Automation of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenyang Institute of Automation of CAS filed Critical Shenyang Institute of Automation of CAS
Priority to CN2011103961666A priority Critical patent/CN103136793A/en
Publication of CN103136793A publication Critical patent/CN103136793A/en
Pending legal-status Critical Current

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a live-action fusion method based on augmented reality and a device using the method. The device comprises an image processor, a camera and a display, wherein the image data output of the camera is connected to the data input of the image processor, and the output of the image processor is connected to the display. The method includes the following steps: inputting real-scene video data collected by the camera into the image processor for processing; converting the video data into a sequence of single-frame images and converting each frame into graphic texture data; mapping the texture data, via a three-dimensional graphics engine, onto a rectangular surface placed beyond the distance of all virtual scenery; and rendering the rectangular surface together with the virtual scene generated by the image processor in the three-dimensional graphics engine, forming a video that combines the virtual with the real. The invention greatly improves the realism of simulated training for aiming-and-tracking devices and allows such training to be extended from indoor to outdoor use.

Description

Live-action fusion method and device based on augmented reality
Technical field
The present invention relates to augmented reality vision technology for images, and specifically to a live-action fusion method and device based on augmented reality.
Background technology
Most simulated training devices currently in wide use for aiming-and-tracking equipment are based on virtual reality technology: a computer generates a purely virtual two-dimensional or three-dimensional scene. The depth, stereoscopic quality, weather effects and scene changes of such a scene differ noticeably from real scenery, so scene fidelity is poor. In addition, because they use purely virtual scenes, the vast majority of such devices can only be used indoors and cannot be combined with other outdoor training, which limits their range of use.
Summary of the invention
To address the shortcomings of the visual simulation of prior-art training devices, namely poor scene fidelity and the inability to combine with other outdoor training, the technical problem to be solved by the present invention is to provide an augmented-reality live-action fusion method and device that are highly realistic, easy to use, usable outdoors and closer to real operational conditions.
To solve the above technical problem, the present invention adopts the following technical solution:
The live-action fusion device based on augmented reality of the present invention comprises an image processor, a camera and a display, wherein the image data output of the camera is connected to the data input of the image processor, and the output of the image processor is connected to the display.
The camera is an industrial color digital camera with progressive scanning, a sensor aspect ratio of 4:3 and an IEEE-1394b output interface, capable of outputting RGB video at a resolution of no less than 640×480 pixels and a refresh rate of no less than 25 fps, with shutter control, gain control and automatic exposure control.
The display is a binocular eyeglass-type display with a resolution of no less than 640×480 pixels, a refresh rate of no less than 60 Hz, a color depth of no less than 16 bits and a VGA signal input interface.
The live-action fusion method based on augmented reality of the present invention comprises the following steps:
Collect video data of real scenery with the camera and input it into the image processor for processing;
Convert the video data into a sequence of single-frame images, and convert each frame into graphic texture data;
Use a three-dimensional graphics engine to map the texture data onto a rectangular surface placed beyond the distance of all virtual scenery;
Render the rectangular surface and the virtual scene generated by the image processor together in the three-dimensional graphics engine, forming a video that combines the virtual with the real.
The process of mapping the texture data onto a rectangular surface beyond all virtual scene distances with the three-dimensional graphics engine is: use the engine to draw, in space beyond the distance of all virtual scenery, a rectangular surface whose size matches the cross-section of the engine's field of view (FOV); use the engine to apply texture filtering to the texture data; and bind the texture data onto the rectangular surface.
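The size of the backdrop rectangle follows from simple frustum geometry: a plane at distance d that exactly fills a field of view of angle fov has width 2·d·tan(fov/2). A minimal sketch of this calculation (the function and parameter names are illustrative, not from the patent):

```python
import math

def backdrop_size(distance_m, fov_h_deg, fov_v_deg):
    """Width and height of a rectangle that exactly fills the
    engine's view-frustum cross-section at the given distance."""
    w = 2.0 * distance_m * math.tan(math.radians(fov_h_deg) / 2.0)
    h = 2.0 * distance_m * math.tan(math.radians(fov_v_deg) / 2.0)
    return w, h

# At the 10,000 m backdrop distance used in the embodiment, with an
# assumed 60 deg x 45 deg FOV:
w, h = backdrop_size(10_000, 60, 45)   # w ~ 11547 m, h ~ 8284 m
```

Because the rectangle always matches the FOV cross-section, the live video fills the entire view regardless of the distance chosen, as long as that distance exceeds every virtual object.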
The camera video of real scenery is collected with a multithreaded acquisition method: for each video channel to be captured, two asynchronously executing acquisition threads are created.
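The two-thread acquisition scheme can be pictured as a pair of producers feeding a shared latest-frame slot, so the renderer never blocks on a capture stall. A hedged Python sketch (Python threading stands in for the Win32 DummyThreadProc threads of the embodiment; the camera driver is simulated, and all names are illustrative):

```python
import itertools
import threading
import time

class CaptureChannel:
    """Two asynchronous capture threads per channel keep a shared
    'latest frame' slot filled, mimicking the patent's dual
    acquisition threads."""

    def __init__(self, grab):
        self.grab = grab              # callable simulating the camera driver
        self.latest = None
        self.lock = threading.Lock()
        self.running = True
        # Two threads per channel, as the method specifies.
        self.threads = [threading.Thread(target=self._loop) for _ in range(2)]

    def _loop(self):
        while self.running:
            frame = self.grab()       # blocking driver call
            with self.lock:
                self.latest = frame   # overwrite the shared slot

    def start(self):
        for t in self.threads:
            t.start()

    def stop(self):
        self.running = False
        for t in self.threads:
            t.join()

    def video_get_image(self):
        """Roughly analogous to CVid's VideoGetImage(): newest frame."""
        with self.lock:
            return self.latest

# Simulated driver: frames are just increasing integers.
counter = itertools.count()
chan = CaptureChannel(lambda: next(counter))
chan.start()
time.sleep(0.05)                      # let the capture threads run briefly
frame = chan.video_get_image()
chan.stop()
```

The renderer only ever reads the newest frame; if one thread is stalled inside the driver, the other keeps the slot fresh, which is the stated purpose of opening two threads per channel.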
The invention has the following beneficial effects:
1. The realism of simulated training for aiming-and-tracking devices is significantly improved.
Most computer visual-simulation training devices currently in wide use for aiming-and-tracking equipment generate purely virtual two-dimensional or three-dimensional scenes, whose depth, stereoscopic quality, weather effects and scene changes differ noticeably from real scenery, giving poor scene fidelity. The present invention combines real outdoor scenery with a computer-generated virtual scene; the main body of the scene is the real outdoor scene, which significantly improves the realism of the training scene.
2. Simulated training for aiming-and-tracking devices is extended from indoor to outdoor use.
Because most current aiming-and-tracking simulators use purely virtual training scenes, the vast majority can only be used indoors. The device of the present invention can be used outdoors, which not only broadens the applicability of the simulator but also allows simulated training to be combined with other outdoor training subjects, helping to improve trainees' overall skills.
Description of drawings
Fig. 1 is a block diagram of the composition of the device of the present invention;
Fig. 2 is a flow chart of the live-action fusion method of the present invention;
Fig. 3 is a schematic diagram of the CVid class architecture in the method of the present invention;
Fig. 4 is a schematic diagram of the CVid class model.
Embodiment
The present invention is described in further detail below with reference to the accompanying drawings.
As shown in Fig. 1, the live-action fusion device based on augmented reality of the present invention comprises an image processor, a camera and a display, wherein the image data output of the camera is connected to the data input of the image processor, and the output of the image processor is connected to the display.
In this embodiment, the image processor is a high-performance computer that connects to the color digital camera through an IEEE-1394b interface and to the binocular eyeglass-type display through a VGA interface. The graphics engine is a cross-platform software environment based on a graphics library (such as OpenGL or DirectX) running on the computer. The camera is an industrial color digital camera with progressive scanning, a sensor aspect ratio of 4:3 and an IEEE-1394b output interface, capable of outputting RGB video at a resolution of no less than 640×480 pixels and a refresh rate of no less than 25 fps. Shutter and gain are controllable, with automatic exposure control (exposure can be controlled by an automatic shutter method, an automatic gain method or a maximum-gain method). This embodiment uses a Flea2 color digital camera produced by Point Grey Research. During installation, the optical axis of the camera is aligned with the boresight of the simulated aiming-and-tracking device. The binocular eyeglass-type display is a two-eye display with a resolution of no less than 640×480 pixels, a refresh rate of no less than 60 Hz, a color depth of no less than 16 bits and a VGA signal input interface. The display used in this embodiment has a resolution of 800×600, 32-bit color depth, a 60 Hz refresh rate and a VGA signal input interface.
The live-action fusion method based on augmented reality of the present invention comprises the following steps:
Collect video data of real scenery with the camera and input it into the image processor for processing;
Convert the video data into a sequence of single-frame images, and convert each frame into texture data;
Use the three-dimensional graphics engine to map the texture data onto a rectangular surface placed beyond the distance of all virtual scenery;
Render the rectangular surface and the virtual scene generated by the image processor together in the three-dimensional graphics engine, forming a video that combines the virtual with the real.
The process of mapping the texture data onto a rectangular surface beyond all virtual scene distances with the three-dimensional graphics engine is: use the engine to draw, in space beyond the distance of all virtual scenery, a rectangular surface whose size matches the cross-section of the engine's FOV; use the engine to apply texture filtering to the texture data; and bind the texture data onto the rectangular surface.
The camera video is collected with a multithreaded acquisition method: for each video channel to be captured, two asynchronously executing acquisition threads are created.
As shown in Fig. 2, the implementation of the live-action fusion method in this embodiment is as follows:
Video data from the digital camera is processed by a CVid class object into a sequence of texture data.
As shown in Fig. 3, a schematic of the CVid class architecture: the CVid class object is implemented on top of the ARToolkit and DSVL libraries and provides video acquisition and conversion of single video frames into texture data. The CVid class object mainly implements continuous acquisition of camera video and retrieval of single-frame images.
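The frame-to-texture step typically involves copying the packed camera frame into a power-of-two buffer, since the legacy OpenGL versions contemporary with ARToolkit require power-of-two texture dimensions. A hedged sketch of that copy, assuming a packed RGB frame (the padding scheme is a common convention, not detailed in the patent):

```python
def next_pow2(n):
    """Smallest power of two >= n."""
    p = 1
    while p < n:
        p *= 2
    return p

def frame_to_texture(frame, width, height):
    """Copy a packed RGB frame (bytes, row-major) into a power-of-two
    RGB texture buffer, as pre-NPOT OpenGL requires. The unused
    right/bottom margin stays black; the renderer maps only the
    width x height sub-rectangle onto the backdrop."""
    tw, th = next_pow2(width), next_pow2(height)
    tex = bytearray(tw * th * 3)
    for row in range(height):
        src = row * width * 3
        dst = row * tw * 3
        tex[dst:dst + width * 3] = frame[src:src + width * 3]
    return bytes(tex), tw, th

# A 640x480 frame, as produced by the camera in this embodiment,
# lands in a 1024x512 texture.
frame = bytes(640 * 480 * 3)
tex, tw, th = frame_to_texture(frame, 640, 480)
```

With NPOT-capable hardware the padding step disappears, but the per-frame upload of the camera image into a texture remains the core of the conversion.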
As shown in Figs. 2 and 4, the CVid class model is as follows. The constructor CVid() obtains the .xml file describing the camera settings. RunCapture() reads these settings, sets up the acquisition buffer, creates the acquisition threads and begins capturing video. The high-performance acquisition method is an asynchronously executing multithreaded acquisition method: two asynchronously executing acquisition threads are created for each video channel to guarantee uninterrupted capture. DummyThreadProc(LPVOID lpParameter) is the dedicated acquisition thread; to guarantee the performance of the acquisition process, two DummyThreadProc threads are created so that video can be captured without interruption. SceneBlend() is responsible for fusing the live-action texture data with the virtual scene. Its processing is as follows: call the VideoGetImage() method of the CVid class object to obtain the current frame of video and store it in a global variable; convert the obtained frame into a texture; use the three-dimensional graphics engine to draw, in space beyond the distance of all virtual scenery (in this embodiment, at a distance of ten thousand meters), a rectangular surface whose size matches the cross-section of the engine's FOV; use the engine to apply texture filtering to the texture data; bind the texture data onto the rectangular surface and blend it with the virtual scene; then call the VideoGetNext() method of the CVid object to obtain the next frame. After SceneBlend() has been called and executed, the drawn rectangle and the mapped texture enter the frame buffer and undergo depth testing, rendering and other subsequent operations inside the three-dimensional graphics engine, finally generating one frame of the combined virtual-real image. This process is executed cyclically in the main loop of the simulation software to form the combined virtual-real video, which is finally streamed to the binocular eyeglass-type display.

Claims (6)

1. A live-action fusion method based on augmented reality, characterized by comprising the following steps:
collecting video data of real scenery with a camera and inputting it into an image processor for processing;
converting the video data into a sequence of single-frame images, and converting each frame into graphic texture data;
using a three-dimensional graphics engine to map the texture data onto a rectangular surface placed beyond the distance of all virtual scenery;
rendering the rectangular surface and the virtual scene generated by the image processor together in the three-dimensional graphics engine, forming a video that combines the virtual with the real.
2. The live-action fusion method based on augmented reality according to claim 1, characterized in that the process of mapping the texture data onto a rectangular surface beyond all virtual scene distances with the three-dimensional graphics engine is: using the engine to draw, in space beyond the distance of all virtual scenery, a rectangular surface whose size matches the cross-section of the engine's FOV; using the engine to apply texture filtering to the texture data; and binding the texture data onto the rectangular surface.
3. The live-action fusion method based on augmented reality according to claim 1, characterized in that the camera video of real scenery is collected with a multithreaded acquisition method, wherein two asynchronously executing acquisition threads are created for each video channel to be captured.
4. A live-action fusion device based on augmented reality, characterized by comprising an image processor, a camera and a display, wherein the image data output of the camera is connected to the data input of the image processor, and the output of the image processor is connected to the display.
5. The live-action fusion device based on augmented reality according to claim 4, characterized in that the camera is an industrial color digital camera with progressive scanning, a sensor aspect ratio of 4:3 and an IEEE-1394b output interface, capable of outputting RGB video at a resolution of no less than 640×480 pixels and a refresh rate of no less than 25 fps, with shutter control, gain control and automatic exposure control.
6. The live-action fusion device based on augmented reality according to claim 4, characterized in that the display is a binocular eyeglass-type display with a resolution of no less than 640×480 pixels, a refresh rate of no less than 60 Hz, a color depth of no less than 16 bits and a VGA signal input interface.
CN2011103961666A 2011-12-02 2011-12-02 Live-action fusion method based on augmented reality and device using the same Pending CN103136793A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011103961666A CN103136793A (en) 2011-12-02 2011-12-02 Live-action fusion method based on augmented reality and device using the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2011103961666A CN103136793A (en) 2011-12-02 2011-12-02 Live-action fusion method based on augmented reality and device using the same

Publications (1)

Publication Number Publication Date
CN103136793A true CN103136793A (en) 2013-06-05

Family

ID=48496576

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011103961666A Pending CN103136793A (en) 2011-12-02 2011-12-02 Live-action fusion method based on augmented reality and device using the same

Country Status (1)

Country Link
CN (1) CN103136793A (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103761734A (en) * 2014-01-08 2014-04-30 北京航空航天大学 Binocular stereoscopic video scene fusion method for keeping time domain consistency
CN104539925A (en) * 2014-12-15 2015-04-22 北京邮电大学 3D scene reality augmentation method and system based on depth information
CN105005836A (en) * 2014-04-18 2015-10-28 北京睿蓝空信息技术有限公司 Site integrated three-dimensional system and site integrated three-dimensional management platform
CN105488840A (en) * 2015-11-26 2016-04-13 联想(北京)有限公司 Information processing method and electronic equipment
CN105787994A (en) * 2016-01-26 2016-07-20 王创 Entertainment method using 3D technology for simulating street scenery
CN106027855A (en) * 2016-05-16 2016-10-12 深圳迪乐普数码科技有限公司 Method and terminal for realizing virtual rocker arm
CN106130886A (en) * 2016-07-22 2016-11-16 聂迪 The methods of exhibiting of extension information and device
CN106713988A (en) * 2016-12-09 2017-05-24 福建星网视易信息***有限公司 Beautifying method and system for virtual scene live
CN107154197A (en) * 2017-05-18 2017-09-12 河北中科恒运软件科技股份有限公司 Immersion flight simulator
CN107222718A (en) * 2017-06-20 2017-09-29 中国人民解放军78092部队 Actual situation combination remote exhibition device and method based on augmented reality
CN107519640A (en) * 2016-12-07 2017-12-29 福建蓝帽子互动娱乐科技股份有限公司 One kind fishing toy, system and method
CN107682688A (en) * 2015-12-30 2018-02-09 视辰信息科技(上海)有限公司 Video real time recording method and recording arrangement based on augmented reality
CN108536286A (en) * 2018-03-22 2018-09-14 上海皮格猫信息科技有限公司 A kind of VR work auxiliary system, method and the VR equipment of fusion real-world object
WO2019041351A1 (en) * 2017-09-04 2019-03-07 艾迪普(北京)文化科技股份有限公司 Real-time aliasing rendering method for 3d vr video and virtual three-dimensional scene
CN114185320A (en) * 2020-09-15 2022-03-15 中国科学院软件研究所 Evaluation method, device and system for unmanned system cluster and storage medium
CN114419293A (en) * 2022-01-26 2022-04-29 广州鼎飞航空科技有限公司 Augmented reality data processing method, device and equipment
CN115661419A (en) * 2022-12-26 2023-01-31 广东新禾道信息科技有限公司 Live-action three-dimensional augmented reality visualization method and system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1609895A (en) * 2003-10-20 2005-04-27 上海科技馆 Method for generating animal image moved along with person
CN101021952A (en) * 2007-03-23 2007-08-22 北京中星微电子有限公司 Method and apparatus for realizing three-dimensional video special efficiency

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1609895A (en) * 2003-10-20 2005-04-27 上海科技馆 Method for generating animal image moved along with person
CN101021952A (en) * 2007-03-23 2007-08-22 北京中星微电子有限公司 Method and apparatus for realizing three-dimensional video special efficiency

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG Xuewei et al.: "Research on dynamic infrared scene generation based on augmented reality", Infrared and Laser Engineering, vol. 37, 30 June 2008 (2008-06-30), pages 358-361 *

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103761734B (en) * 2014-01-08 2016-09-28 北京航空航天大学 A kind of binocular stereoscopic video scene fusion method of time domain holding consistency
CN103761734A (en) * 2014-01-08 2014-04-30 北京航空航天大学 Binocular stereoscopic video scene fusion method for keeping time domain consistency
CN105005836A (en) * 2014-04-18 2015-10-28 北京睿蓝空信息技术有限公司 Site integrated three-dimensional system and site integrated three-dimensional management platform
CN104539925A (en) * 2014-12-15 2015-04-22 北京邮电大学 3D scene reality augmentation method and system based on depth information
CN105488840B (en) * 2015-11-26 2019-04-23 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN105488840A (en) * 2015-11-26 2016-04-13 联想(北京)有限公司 Information processing method and electronic equipment
CN107682688A (en) * 2015-12-30 2018-02-09 视辰信息科技(上海)有限公司 Video real time recording method and recording arrangement based on augmented reality
CN105787994A (en) * 2016-01-26 2016-07-20 王创 Entertainment method using 3D technology for simulating street scenery
CN106027855A (en) * 2016-05-16 2016-10-12 深圳迪乐普数码科技有限公司 Method and terminal for realizing virtual rocker arm
CN106027855B (en) * 2016-05-16 2019-06-25 深圳迪乐普数码科技有限公司 A kind of implementation method and terminal of virtual rocker arm
CN106130886A (en) * 2016-07-22 2016-11-16 聂迪 The methods of exhibiting of extension information and device
CN107519640A (en) * 2016-12-07 2017-12-29 福建蓝帽子互动娱乐科技股份有限公司 One kind fishing toy, system and method
CN106713988A (en) * 2016-12-09 2017-05-24 福建星网视易信息***有限公司 Beautifying method and system for virtual scene live
CN107154197A (en) * 2017-05-18 2017-09-12 河北中科恒运软件科技股份有限公司 Immersion flight simulator
CN107222718A (en) * 2017-06-20 2017-09-29 中国人民解放军78092部队 Actual situation combination remote exhibition device and method based on augmented reality
US11076142B2 (en) 2017-09-04 2021-07-27 Ideapool Culture & Technology Co., Ltd. Real-time aliasing rendering method for 3D VR video and virtual three-dimensional scene
WO2019041351A1 (en) * 2017-09-04 2019-03-07 艾迪普(北京)文化科技股份有限公司 Real-time aliasing rendering method for 3d vr video and virtual three-dimensional scene
CN108536286A (en) * 2018-03-22 2018-09-14 上海皮格猫信息科技有限公司 A kind of VR work auxiliary system, method and the VR equipment of fusion real-world object
CN114185320A (en) * 2020-09-15 2022-03-15 中国科学院软件研究所 Evaluation method, device and system for unmanned system cluster and storage medium
CN114185320B (en) * 2020-09-15 2023-10-24 中国科学院软件研究所 Evaluation method, device and system for unmanned system cluster and storage medium
CN114419293A (en) * 2022-01-26 2022-04-29 广州鼎飞航空科技有限公司 Augmented reality data processing method, device and equipment
CN115661419A (en) * 2022-12-26 2023-01-31 广东新禾道信息科技有限公司 Live-action three-dimensional augmented reality visualization method and system
CN115661419B (en) * 2022-12-26 2023-04-28 广东新禾道信息科技有限公司 Live-action three-dimensional augmented reality visualization method and system

Similar Documents

Publication Publication Date Title
CN103136793A (en) Live-action fusion method based on augmented reality and device using the same
CN103019507B (en) Method for changing view point angles and displaying three-dimensional figures based on human face tracking
US11436787B2 (en) Rendering method, computer product and display apparatus
CN101968890B (en) 360-degree full-view simulation system based on spherical display
CN106897976B GPU-based correction and fusion method for single-video-card three-channel stereoscopic scene projection software
CN109510975B (en) Video image extraction method, device and system
CN102572391B (en) Method and device for genius-based processing of video frame of camera
CN113891060B (en) Free viewpoint video reconstruction method, play processing method, device and storage medium
CN101631257A (en) Method and device for realizing three-dimensional playing of two-dimensional video code stream
CN112446939A (en) Three-dimensional model dynamic rendering method and device, electronic equipment and storage medium
JP2012253690A (en) Program, information storage medium, and image generation system
US11783445B2 (en) Image processing method, device and apparatus, image fitting method and device, display method and apparatus, and computer readable medium
CN103489219A (en) 3D hair style effect simulation system based on depth image analysis
CN113238472B (en) High-resolution light field display method and device based on frequency domain displacement
CN105578172A (en) Naked-eye 3D video displaying method based on Unity 3D engine
CN102647602B (en) System for converting 2D (two-dimensional) video into 3D (three-dimensional) video on basis of GPU (Graphics Processing Unit)
CN104217461A (en) A parallax mapping method based on a depth map to simulate a real-time bump effect
EP3057316B1 (en) Generation of three-dimensional imagery to supplement existing content
CN108564654B (en) Picture entering mode of three-dimensional large scene
RU2606875C2 (en) Method and system for displaying scaled scenes in real time
CN102521876A (en) Method and system for realizing three dimensional (3D) stereoscopic effect of user interface
CN109658488A Method for accelerating decoding of camera video streams in a virtual-real fusion system by a programmable GPU
CN202331865U (en) Real scene fusion device based on augmented reality
CN203250506U (en) Video generator for mixed reality
CN113093903B (en) Image display method and display equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20130605