CN105989623B - Method for implementing augmented reality applications based on a handheld mobile device - Google Patents


Info

Publication number: CN105989623B (application CN201510075241.7A)
Authority: CN (China)
Prior art keywords: scene, realization, augmented reality, application, key frame
Legal status: Active (granted)
Other versions: CN105989623A
Other languages: Chinese (zh)
Inventors: 曹阳, 杨旭波
Current assignee: Shanghai Jiaotong University
Original assignee: Shanghai Jiaotong University
Priority / filing date: 2015-02-12
Application filed by Shanghai Jiaotong University
Priority to CN201510075241.7A
Publication of CN105989623A: 2016-10-05
Application granted; publication of CN105989623B: 2019-01-11

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

A method for implementing augmented reality applications based on a handheld mobile device, in the field of virtual reality technology. Client software deployed on the handheld mobile device recognizes and tracks a pattern and creates an initially empty application whose scene is positioned by the pattern. Required materials are then imported from a material library and added to the scene, and the static scene of the application is realized through visual operations based on a graphical interface; dynamic scenes are likewise realized for the materials in the scene through visual operations based on the graphical interface. Finally, the completed application is saved as an XML file and uploaded to the server to publish the mobile augmented reality application. The invention simplifies the application authoring process and improves the user's authoring efficiency; it does not require the user to specify every key frame one by one, which greatly improves the efficiency and accuracy of authoring complex-trajectory animations.

Description

Method for implementing augmented reality applications based on a handheld mobile device
Technical field
The present invention relates to a technology in the field of virtual reality, and specifically to a method for implementing augmented reality applications based on a handheld mobile device.
Background art
Augmented reality "seamlessly" integrates real-world information with virtual-world information, which gives it great development potential in application fields such as manufacturing and education. With the popularization of handheld mobile devices (such as smartphones and tablet computers) and the development of other new techniques (such as pattern recognition and tracking based on two-dimensional codes or on natural features), augmented reality technology is also gradually being applied to mobile application fields such as tour guidance, showing huge market potential and application prospects. However, developing a mobile augmented reality application currently involves a large amount of graphics- and platform-related coding, so such applications are mostly custom-developed on demand by professionals. To simplify the authoring process, lower the difficulty, and even allow ordinary users who cannot program (hereinafter "ordinary users") to design applications, research on design tools and authoring methods for mobile augmented reality applications has become increasingly popular.
At present, there are two main problems with mobile augmented reality authoring methods aimed at ordinary users. First, the existing authoring process is overly complex. It generally follows a code-first, generate-later workflow: the user designs the virtual scene for a target pattern or another kind of marker, and can only check the final result by compiling or publishing the application again. This causes a gap between what the user intends and what is actually obtained, traps the user in adjusting scene details, and prevents concentration on designing the application content. Second, existing tools offer insufficient support for visually authoring the dynamic scenes of an application. They generally provide only a scripting environment, which is difficult for ordinary users to master and use.
A search of the prior art shows that the most mature mobile augmented reality application design tool at present is the Metaio Creator software developed by Metaio. The tool is deployed on a PC and provides visual static-scene design for target objects such as pictures, real objects, and real environments; materials in the scene can be translated, rotated, and scaled under three viewing angles, and the supported materials include patterns, three-dimensional models, videos, and so on. It also provides a script editor in which the dynamic scenes of an application are designed and realized by writing AREL (Augmented Reality Experience Language) scripts. However, this technology does not solve the two major problems described above.
Chinese patent document CN103150658A, published 2013-06-12, describes "an augmented reality customization system and method for terminal users". The method relates to computer augmented reality, image recognition, and the Internet, and in particular to an Internet-based augmented reality customization system and method for terminal users. It mainly comprises a service management unit, a user design and production unit, an AR processing unit, an installation-package packaging unit, an installation-package management unit, and a user notification unit. The method separates augmented reality content from its technical realization, provides an interactive interface through which terminal users flexibly choose and customize augmented reality content, automatically generates the augmented reality application installation package in the background, and flexibly, quickly, and accurately generates the customized augmented reality application system that the user needs. However, this technology still uses the traditional code-first, generate-later process: the user must repeatedly go through designing, packaging, and installing in order to modify and debug the augmented reality application, which is unsuitable for on-the-spot authoring on a handheld mobile device.
Eitsuka M. and Hirakawa M., in "Authoring Animations of Virtual Objects in Augmented Reality-Based 3D Space" (IIAI International Conference on Advanced Applied Informatics, 2013), propose a model animation authoring method for mobile augmented reality applications based on defining key frames, intended to solve the problem that ordinary users find it difficult to realize dynamic scenes with a script editor. The method targets rigid-motion animation of three-dimensional models: the user specifies the model's pose in the scene at several moments, i.e., a series of key frames, and at run time the system automatically interpolates to generate the frames between the key frames, thereby producing the dynamic scene of the application. However, when the model's motion trajectory or pose change is complex, and especially when it contains many curved motions, the user must specify a large number of key frames one by one, so the applicability of this method is insufficient.
Summary of the invention
In view of the above shortcomings of the prior art, the present invention proposes a method for implementing augmented reality applications based on a handheld mobile device. It simplifies the authoring process by having the user author at the application's run time: the user adds the required content to a scene that starts as an empty running application, and each operation immediately shows the final effect of the application, giving the user timely feedback and thus improving the user's authoring efficiency. Dynamic scenes in the application are realized by a method based on frame-by-frame animation: a scribble function lets the user rapidly draw the motion trajectory of a material in the scene, the system then automatically generates a series of key frames describing the animation, and the user can adjust the animation locally or as a whole on this basis. Compared with the prior art, the invention simplifies the user's application authoring process, enables users to quickly realize augmented reality applications, and allows users to realize the dynamic scenes of more complex augmented reality applications.
The present invention is achieved by the following technical solutions:
The present invention relates to a method for implementing augmented reality applications based on a handheld mobile device. Client software deployed on the handheld mobile device recognizes and tracks a pattern and creates an initially empty application whose scene is positioned by the pattern. Required materials are then imported from a material library and added to the scene, and the static scene of the application is realized through visual operations based on a graphical interface; dynamic scenes are then realized for the materials in the scene, likewise through visual operations based on the graphical interface. Finally, the completed application is saved as an Extensible Markup Language (XML) file and uploaded to the server, completing the publication of the mobile augmented reality application.
The present invention specifically includes the following steps:
Step 1: perform natural-feature-based recognition and tracking of a predefined pattern, and create an initial mobile augmented reality application whose scene is empty;
The pattern is used to position the scene. The tracking in this step determines the three-dimensional coordinate system of the virtual scene in the application relative to the target pattern; this coordinate system is used during the subsequent authoring process to specify the pose of each material relative to the target pattern, i.e., its position, angle, and size.
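Purely for illustration (not part of the original disclosure), the following sketch shows how such a pattern-anchored coordinate system is typically used: the tracker reports the pose of the target pattern in camera coordinates every frame, and a material's pose stored relative to the pattern is composed with it to place the material on screen. All names and values are assumptions.

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Pose of a material expressed in the target-pattern coordinate system
# (position, angle and size are all stored relative to the pattern, as in step 1).
material_in_pattern = make_pose(np.eye(3), np.array([0.10, 0.00, 0.05]))

# Pose of the pattern in camera coordinates, reported by the tracker every frame.
pattern_in_camera = make_pose(np.eye(3), np.array([0.0, 0.0, 0.5]))

# Composing the two gives where the material must be rendered in this frame.
material_in_camera = pattern_in_camera @ material_in_pattern
print(material_in_camera[:3, 3])  # material position in camera space
```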
Step 2: retrieve the material library, add the required materials to the scene, and realize the static scene through visual operations;
Static scene realization means that the user adjusts the pose of the materials in the scene through touch interaction assisted by a reference plane and projection lines.
The reference plane is a virtual gridded plane Π that passes through the center point p of the material chosen by the user and has a specified normal direction n. Two kinds of reference plane are provided: one parallel to the target pattern, and one perpendicular to the target pattern that always faces the device camera.
A projection line is the perpendicular L_i dropped from the center point p_i of each other, non-selected material in the scene onto the reference plane.
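As an illustration only (not part of the original disclosure), the foot of such a projection line can be computed by projecting each non-selected material's center p_i onto the plane Π defined by the point p and the unit normal n; the function name below is an assumption.

```python
import numpy as np

def project_onto_plane(p_i, p, n):
    """Foot of the perpendicular from point p_i onto the plane through p with unit normal n."""
    n = n / np.linalg.norm(n)
    d = np.dot(p_i - p, n)      # signed distance from p_i to the plane
    return p_i - d * n          # the segment (p_i, foot) is the projection line L_i

# Example: plane through the selected material's center, normal pointing up.
p = np.array([0.0, 0.0, 0.0])
n = np.array([0.0, 0.0, 1.0])
p_i = np.array([0.2, -0.1, 0.3])
print(project_onto_plane(p_i, p, n))  # -> [ 0.2 -0.1  0. ]
```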
Adjusting a material's pose specifically includes the following steps:
1) select the move operation and move the material to the designated position in space;
2) select the rotate operation and rotate the material about any axis to the required angle;
3) select the scale operation and scale the material to the required size.
The pose adjustment steps may be carried out in any order, and the number of operations is unlimited (a minimal sketch follows).
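A minimal sketch (illustrative assumptions, not the patent's implementation) of how the three operations can update a material pose, in any order and any number of times; the pose representation is assumed.

```python
import numpy as np

def move(pose, delta):
    pose["position"] = pose["position"] + np.asarray(delta, dtype=float)

def rotate(pose, axis, angle_rad):
    """Rotate the material about an arbitrary axis (Rodrigues' rotation formula)."""
    a = np.asarray(axis, dtype=float)
    a = a / np.linalg.norm(a)
    K = np.array([[0, -a[2], a[1]], [a[2], 0, -a[0]], [-a[1], a[0], 0]])
    R = np.eye(3) + np.sin(angle_rad) * K + (1 - np.cos(angle_rad)) * (K @ K)
    pose["rotation"] = R @ pose["rotation"]

def scale(pose, factor):
    pose["scale"] = pose["scale"] * factor

# The operations may be applied in any order, any number of times (steps 1-3 above).
pose = {"position": np.zeros(3), "rotation": np.eye(3), "scale": 1.0}
rotate(pose, axis=(0, 0, 1), angle_rad=np.pi / 4)
move(pose, (0.1, 0.0, 0.0))
scale(pose, 2.0)
```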
Step 3: realize dynamic scenes for the materials in the scene through visual operations;
Dynamic scene realization means that, for a material in the scene, a rigid-motion animation is designed in the form of a frame-by-frame animation.
Dynamic scene realization comprises the following steps:
1) choose a material and select the scribble function, then draw the material's motion trajectory in three-dimensional space with the assistance of the reference plane and projection lines;
2) the system records the user's touch points during the scribble, maps them to the corresponding three-dimensional coordinate points on the current reference plane, and generates a key frame timestamp for each of these three-dimensional points according to a preset movement velocity, thereby completing the generation of the key frames (an illustrative sketch is given after this list);
3) the user may adjust the existing key frames locally or as a whole.
The adjustment includes:
Editing key frames: inserting a key frame, deleting a key frame, or selecting a key frame and changing its corresponding material pose.
Changing the animation mode: setting the animation to single play or loop play.
Changing the animation duration: when the animation time is adjusted, the system recalculates the timestamp of each key frame after the adjustment according to the time intervals between key frames.
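An illustrative sketch of step 2) above (assumptions only, not the original code): each recorded touch point is mapped onto the current reference plane by intersecting its viewing ray with the plane, and timestamps are assigned from the preset velocity v so that t_0 = 0 and t_i = t_{i-1} + |p_i − p_{i-1}| / v, as in the embodiment below.

```python
import numpy as np

def touch_to_plane(ray_origin, ray_dir, p, n):
    """Intersect a touch point's viewing ray with the reference plane through p with normal n."""
    n = n / np.linalg.norm(n)
    s = np.dot(p - ray_origin, n) / np.dot(ray_dir, n)
    return ray_origin + s * ray_dir

def keyframes_from_scribble(points_3d, v):
    """Assign a timestamp to each scribbled 3D point using the preset movement velocity v."""
    times = [0.0]
    for prev, cur in zip(points_3d, points_3d[1:]):
        times.append(times[-1] + float(np.linalg.norm(cur - prev)) / v)
    return list(zip(times, points_3d))  # (timestamp, position) key frames

pts = [np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, -0.1]), np.array([0.1, 0.0, -0.1])]
for t, p_k in keyframes_from_scribble(pts, v=0.05):
    print(round(t, 2), p_k)  # timestamps 0.0, 2.0, 4.0 seconds
```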
Step 4: save the completed application as an XML file and upload it to the server.
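The patent does not specify the XML schema. The following is a hypothetical sketch of how the authored scene (materials, poses, key frames) might be serialized before upload; the element names and file layout are assumptions, not the disclosed format.

```python
import xml.etree.ElementTree as ET

def save_application(materials, path):
    """Serialize a hypothetical scene description to XML (schema is illustrative only)."""
    app = ET.Element("application")
    scene = ET.SubElement(app, "scene")
    for m in materials:
        node = ET.SubElement(scene, "material", name=m["name"])
        ET.SubElement(node, "pose", position=" ".join(map(str, m["position"])))
        anim = ET.SubElement(node, "animation", mode=m.get("mode", "single"))
        for t, pos in m.get("keyframes", []):
            ET.SubElement(anim, "keyframe", time=str(t),
                          position=" ".join(map(str, pos)))
    ET.ElementTree(app).write(path, encoding="utf-8", xml_declaration=True)

save_application(
    [{"name": "NewtonBall", "position": (0.0, 0.0, 0.0),
      "keyframes": [(0.0, (0.0, 0.1, 0.0)), (0.5, (0.0, 0.0, 0.0))]}],
    "application.xml")
# The resulting file would then be uploaded to the server, e.g. via an HTTP POST.
```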
Technical effect
Compared with the prior art, the authoring process used by the present invention, in which the user authors at the application's run time, is simpler than the code-first, generate-later process; it simplifies the application authoring flow and improves the user's authoring efficiency.
The dynamic scene authoring method based on frame-by-frame animation used by the present invention lets the user quickly create a motion trajectory by scribbling, as an animation prototype, and then add detail to the animation by editing key frames. Compared with the few existing tools that support visual dynamic-scene design, the user does not need to specify every key frame one by one, which greatly improves the efficiency and accuracy of authoring complex-trajectory animations.
Description of the drawings
Fig. 1 is the authoring flow chart proposed by the present invention.
Fig. 2 shows an example of authoring with the present invention.
Specific embodiment
The embodiments of the present invention are described in detail below. The present embodiment is implemented on the premise of the technical solution of the present invention, and a detailed implementation method and specific operation process are given, but the protection scope of the present invention is not limited to the following embodiment.
Embodiment 1
As shown in Fig. 1, this embodiment uses a Newton's cradle scene. In this scene, the rightmost ball of the Newton's cradle falls; after it collides with the adjacent ball, the leftmost ball bounces up. An augmented reality application containing this three-dimensional dynamic scene can vividly and intuitively show the physical phenomenon. The embodiment is explained below using the authoring steps described above.
As shown in Figure 1, the present embodiment includes the following steps:
Step 1: the user creates a new application with an empty scene for the target pattern.
Step 2: on the basis of the application created in step 1, the user searches the model library, adds the Newton's cradle model to the scene, and adjusts the reference pose of the rightmost ball with the assistance of the reference plane and projection lines, including the following specific steps:
First, select the search operation and enter the model name "NewtonBall"; all models whose names contain this keyword are returned. The required model is selected among the retrieved models, and the system automatically adds it to the scene. This step imports the material into the application scene;
Second, tap the rightmost ball in the scene; the system shows the reference plane, and the reference pose of the rightmost ball is adjusted through move and rotate operations.
Step 3: based on the static scene from step 2, realize the falling animation of the rightmost ball and the bouncing animation of the leftmost ball of the Newton's cradle. The specific steps are as follows:
First, select the rightmost ball and draw its falling trajectory by scribbling;
Second, for the trajectory points p_i (i = 0..n), where p_0 is the start point and p_n is the end point, the system computes the corresponding timestamps t_i (i = 0..n) from the preset speed v: t_0 = 0, t_i = t_{i-1} + |p_i − p_{i-1}| / v (i = 1..n). This step generates the key frames describing the animation;
Third, adjust the animation duration to the required duration t'; the system recalculates the key frame timestamps t_i' (i = 0..n): t_i' = t_0 + (t_i − t_0) / t_n × t' (i = 0..n). A worked numerical example of these two computations is given after this list;
Fourth, select the leftmost ball and insert a key frame at time t', so that the leftmost ball stays stationary until the falling rightmost ball collides with the adjacent ball; then scribble the bounce trajectory of the leftmost ball;
Fifth, the system generates key frames as in the second step;
Sixth, the duration of the bounce animation is adjusted as in the third step.
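A small worked example (values are assumptions, not from the patent) of the second and third sub-steps: timestamps are first assigned from the preset speed v, then rescaled to the required duration t' using t_i' = t_0 + (t_i − t_0) / t_n × t'.

```python
import numpy as np

# Hypothetical scribbled trajectory of the falling ball (metres) and preset speed.
points = [np.array(p) for p in [(0.00, 0.10, 0.0), (0.00, 0.05, 0.0), (0.00, 0.00, 0.0)]]
v = 0.1  # m/s

# Second sub-step: t_0 = 0, t_i = t_{i-1} + |p_i - p_{i-1}| / v
t = [0.0]
for prev, cur in zip(points, points[1:]):
    t.append(t[-1] + float(np.linalg.norm(cur - prev)) / v)
print(t)          # [0.0, 0.5, 1.0]

# Third sub-step: rescale to the required duration t' = 0.4 s
t_required = 0.4
t_scaled = [t[0] + (ti - t[0]) / t[-1] * t_required for ti in t]
print(t_scaled)   # [0.0, 0.2, 0.4]
```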
Step 4: save the application and upload it to the server; the system saves the application as an XML file for storage and access.
As shown in Fig. 2, the operating process of this embodiment is as follows: panel (a) is the initial empty scene, panel (b) shows the user retrieving a model in the model library, panel (c) shows the user adding the Newton's cradle model to the scene, panel (d) shows the user touching a ball of the model to adjust its pose, panel (e) shows the user authoring the falling animation of the rightmost ball, panel (f) shows the user authoring the bouncing animation of the leftmost ball, panel (g) is the static-scene authoring environment of Metaio Creator, and panel (h) is the dynamic-scene authoring environment of Metaio Creator. It can be seen from the figure that, compared with the prior art, this method is more intuitive and easier to learn, and is better suited to ordinary users who cannot program.
The present invention can be applied to middle-school teaching: when preparing lessons, a teacher can design an augmented reality application for an illustration in the textbook and upload the application to the server, and students can later browse the application through the client on a smartphone or tablet computer. Compared with traditional classroom presentation methods such as wall charts and slides, an augmented reality application containing a dynamic three-dimensional scene can present knowledge points more intuitively and vividly.

Claims (9)

1. A method for implementing augmented reality applications based on a handheld mobile device, characterized in that client software deployed on the handheld mobile device recognizes and tracks a pattern and creates an initially empty application whose scene is positioned by the pattern; required materials are then imported from a material library and added to the scene, and the static scene of the application is realized through visual operations based on a graphical interface; dynamic scenes are then realized for the materials in the scene through visual operations based on the graphical interface; finally, the completed application is saved as an XML file and uploaded to the server to complete the publication of the mobile augmented reality application;
the dynamic scene realization comprises the following steps:
1) choosing a material and selecting the scribble function, and drawing the material's motion trajectory in three-dimensional space with the assistance of the reference plane and projection lines;
2) the system recording the user's touch points during the scribble, mapping them to the corresponding three-dimensional coordinate points on the current reference plane, and generating key frame timestamps for these three-dimensional coordinate points according to a preset movement velocity, thereby completing the generation of the key frames;
3) adjusting the existing key frames locally or as a whole.
2. The method according to claim 1, characterized in that the method specifically includes the following steps:
Step 1: perform natural-feature-based recognition and tracking of a predefined pattern, and create an initial mobile augmented reality application whose scene is empty;
Step 2: retrieve the material library, add the required materials to the scene, and realize the static scene through visual operations;
Step 3: realize dynamic scenes for the materials in the scene through visual operations;
Step 4: save the completed application as an XML file and upload it to the server.
3. The method according to claim 2, characterized in that the predefined pattern is used to position the scene, and the tracking determines the three-dimensional coordinate system of the virtual scene in the application relative to the target pattern, the coordinate system being used to specify the pose of each material relative to the target pattern during subsequent authoring, i.e., its position, angle, and size.
4. The method according to claim 1 or 2, characterized in that the static scene realization means that the user adjusts the pose of the materials in the scene through touch interaction assisted by a reference plane and projection lines.
5. The method according to claim 4, characterized in that the reference plane is a virtual gridded plane Π passing through the center point p of the material chosen by the user, with a specified normal direction n; the reference plane is of two kinds: one parallel to the target pattern, and one perpendicular to the target pattern and always facing the device camera.
6. The method according to claim 4, characterized in that a projection line is the perpendicular L_i from the center point p_i of each other, non-selected material in the scene to the reference plane.
7. The method according to claim 4, characterized in that adjusting the pose of a material in the scene specifically includes the following steps:
1) selecting the move operation and moving the material to the designated position in space;
2) selecting the rotate operation and rotating the material about any axis to the required angle;
3) selecting the scale operation and scaling the material to the required size.
8. The method according to claim 1 or 2, characterized in that the dynamic scene realization means that, for a material in the scene, a rigid-motion animation is designed in the form of a frame-by-frame animation.
9. The method according to claim 1, characterized in that the local or global adjustment includes:
editing key frames: inserting a key frame, deleting a key frame, or selecting a key frame and changing its corresponding material pose;
changing the animation mode: setting the animation to single play or loop play;
changing the animation duration: when the animation time is adjusted, the system recalculates the timestamp of each key frame after the adjustment according to the time intervals between key frames.
CN201510075241.7A — Priority date: 2015-02-12 — Filing date: 2015-02-12 — Title: Method for implementing augmented reality applications based on a handheld mobile device — Status: Active — Granted publication: CN105989623B (en)

Priority Applications (1)

Application Number: CN201510075241.7A — Priority date: 2015-02-12 — Filing date: 2015-02-12 — Title: Method for implementing augmented reality applications based on a handheld mobile device

Publications (2)

Publication Number: CN105989623A (en) — Publication date: 2016-10-05
Publication Number: CN105989623B — Publication date: 2019-01-11

Family

ID: 57042007

Family Applications (1)

Application Number: CN201510075241.7A — Status: Active — Publication: CN105989623B (en) — Title: Method for implementing augmented reality applications based on a handheld mobile device

Country Status (1)

Country: CN — CN105989623B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party

CN109117034A * (卢俊谚, priority 2017-06-23, published 2019-01-01): On-line integrated augmented reality editing device and system
CN109727504A * (安徽建筑大学, priority 2018-12-19, published 2019-05-07): Animation interaction system based on art design
CN111475026B * (李斌, priority 2020-04-10, published 2023-08-22): Spatial positioning method based on augmented virtual reality technology applied on a mobile terminal
CN111899348A * (四川深瑞视科技有限公司, priority 2020-07-14, published 2020-11-06): Projection-based augmented reality experiment demonstration system and method
CN113867872A * (北京市商汤科技开发有限公司, priority 2021-09-30, published 2021-12-31): Project editing method, device, equipment and storage medium


Patent Citations (3)

* Cited by examiner, † Cited by third party

CN102867333A * (西北工业大学, priority 2012-07-18, published 2013-01-09): DI-GUY-based virtual character behavior visualization method
CN103870485A * (华为终端有限公司, priority 2012-12-13, published 2014-06-18): Method and device for realizing an augmented reality application
CN103679204A * (上海安琪艾可网络科技有限公司, priority 2013-12-23, published 2014-03-26): Image recognition and creation application system and method based on an intelligent mobile device platform

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party

Manabu Eitsuka et al., "Authoring Animations of Virtual Objects in Augmented Reality-based 3D Space", 2013 Second IIAI International Conference on Advanced Applied Informatics, 2013, pp. 256-261.
Jesús Gimeno et al., "An Occlusion-Aware AR Authoring Tool for Assembly and Repair Tasks", Computer Graphics Theory and Applications, 2012, pp. 377-386.
Tobias Langlotz et al., "Sketching up the world: in situ authoring for mobile Augmented Reality", Personal and Ubiquitous Computing, 2011, pp. 623-630.
邵兴旦, "易用型增强现实***开发工具的设计与应用", China Master's Theses Full-text Database, Information Science and Technology, No. 02, 2011-02-15, I138-271.

Also Published As

CN105989623A (en) — Publication date: 2016-10-05


Legal Events

C06: Publication
PB01: Publication
C10: Entry into substantive examination
SE01: Entry into force of request for substantive examination
GR01: Patent grant