CN103677211B - Device and method for realizing an augmented reality application - Google Patents

Device and method for realizing an augmented reality application

Info

Publication number
CN103677211B
Authority
CN
China
Prior art keywords
terminal
frequency
tracking module
image frame
reference information
Prior art date
Legal status
Active
Application number
CN201310664448.9A
Other languages
Chinese (zh)
Other versions
CN103677211A (en)
Inventor
刘峥
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN201310664448.9A
Publication of CN103677211A publication Critical patent/CN103677211A/en
Application granted granted Critical
Publication of CN103677211B publication Critical patent/CN103677211B/en


Landscapes

  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a device and method for realizing an augmented reality application, relating to the field of human-computer interaction, and solves the problem of how to reasonably set the processing interval when an augmented reality (AR) application processes the image frames captured by the camera. The device for realizing an augmented reality application provided by the invention includes: an acquiring unit, configured to obtain first reference information, the first reference information being the reference information on which an adjustment of the calling frequency of an identification module in the AR application is based; and a processing unit, configured to adjust the calling frequency of the identification module according to the first reference information obtained by the acquiring unit. The present invention is applicable to the field of human-computer interaction and is used for realizing augmented reality applications.

Description

Device and method for realizing an augmented reality application
Technical field
The present invention relates to the field of human-computer interaction, and in particular to a device and method for realizing an augmented reality application.
Background technology
AR (Augmented Reality) technology is a new human-computer interaction technology that uses intelligent terminals and visualization techniques to apply virtual information to the real world, so that virtual information and the real world are superimposed onto the same picture or space and presented to the user simultaneously. With the popularization of intelligent terminals, AR technology is used ever more widely and can be experienced by installing an AR application on an intelligent terminal. Specifically, the workflow of an AR application is as follows: the terminal captures image frames through its camera; the image frames are identified to determine AR target objects; the AR target objects in the image frames are tracked to determine their positions; the AR virtual information associated with the AR target objects is obtained, the image frames are rendered, and the AR virtual information is superimposed on the AR target objects and displayed, so that the AR target objects and the AR virtual content are shown together on the terminal screen for the user to interact with.
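For ease of understanding, the following simplified sketch (in Java) outlines this per-frame workflow; the class and interface names used here (Camera, Recognizer, Tracker, Renderer and so on) are illustrative assumptions only and do not correspond to any particular platform API.

```java
// Illustrative per-frame AR pipeline; all types below are hypothetical placeholders.
import java.util.List;

interface Camera { Frame captureFrame(); }
interface Recognizer { List<Target> identify(Frame f); }            // identification module
interface Tracker { void track(Frame f, List<Target> targets); }    // tracking module
interface Renderer { void render(Frame f, List<Target> targets); }  // rendering module
class Frame {}
class Target {}

final class ArPipeline {
    private final Camera camera;
    private final Recognizer recognizer;
    private final Tracker tracker;
    private final Renderer renderer;

    ArPipeline(Camera c, Recognizer rec, Tracker t, Renderer r) {
        camera = c; recognizer = rec; tracker = t; renderer = r;
    }

    /** Processes one captured frame: identify AR target objects, track their
     *  positions, then superimpose the associated AR virtual content while rendering. */
    void processFrame() {
        Frame frame = camera.captureFrame();
        List<Target> targets = recognizer.identify(frame); // determine AR target objects
        tracker.track(frame, targets);                     // determine their positions
        renderer.render(frame, targets);                   // superimpose AR virtual content
    }
}
```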
In theory, to give the user the best experience, an AR application should process (identify, track, render, and so on) every image frame captured by the camera, but this greatly increases the power consumption of the terminal. To reduce power consumption, a fixed processing interval can be set and the image frames captured by the camera processed periodically at that interval. When the processing interval is large, the terminal may miss AR target objects or locate them incorrectly, giving a poor user experience; when the processing interval is small, power consumption is not effectively reduced.
In summary, how to reasonably set the processing interval when an AR application processes the image frames captured by the camera is a problem that urgently needs to be solved.
Summary of the invention
Embodiments of the present invention provide a device and method for realizing an augmented reality application, which can solve the prior-art problem of how to reasonably set the processing interval when an AR application processes the image frames captured by the camera.
To achieve the above purpose, the embodiments of the present invention adopt the following technical solutions.
In a first aspect, an embodiment of the present invention provides a device for realizing an augmented reality application, the device including:
an acquiring unit, configured to obtain first reference information, the first reference information being the reference information on which an adjustment of the calling frequency of an identification module in an augmented reality (AR) application is based;
a processing unit, configured to adjust the calling frequency of the identification module according to the first reference information obtained by the acquiring unit.
With reference to the first aspect, in a first possible implementation, the acquiring unit is specifically configured to obtain, from a rendering module in the AR application, related information of the AR virtual content, the related information including the position and size of the AR virtual content on the screen; and the processing unit is specifically configured to reduce the calling frequency of the identification module when the area occupied by the AR virtual content on the screen is greater than or equal to a first threshold, and to maintain or increase the calling frequency of the identification module when the area occupied by the AR virtual content on the screen is less than the first threshold.
With reference to the first aspect, in a second possible implementation, the acquiring unit is specifically configured to obtain, from a rendering module in the AR application, related information of the AR virtual content, the related information including the position and size of the AR virtual content on the screen; and the processing unit is further configured to determine the remaining area of the screen according to the position and size of the AR virtual content on the screen, to reduce the calling frequency of the identification module when the area occupied by the remaining area on the screen is less than a second threshold, and to maintain or increase the calling frequency of the identification module when the area occupied by the remaining area on the screen is greater than or equal to the second threshold.
With reference to the first aspect, in a third possible implementation:
the acquiring unit is specifically configured to obtain the running state of the AR virtual content;
the processing unit is specifically configured to reduce the calling frequency of the identification module when the AR virtual content is running, and to maintain or increase the calling frequency of the identification module when the AR virtual content has finished running.
With reference to the first aspect, in a fourth possible implementation:
the acquiring unit is specifically configured to obtain acceleration data of the terminal from an acceleration sensor;
the processing unit is specifically configured to reduce the calling frequency of the identification module when the current acceleration of the terminal is greater than a third threshold, and to maintain or increase the calling frequency of the identification module when the current acceleration of the terminal is less than or equal to the third threshold.
With reference to the fourth possible implementation of the first aspect, in a fifth possible implementation, the processing unit is further configured to adjust the calling frequency of a tracking module in the AR application, including: reducing the calling frequency of the tracking module when the current acceleration of the terminal is greater than a fourth threshold; and maintaining or increasing the calling frequency of the tracking module when the current acceleration of the terminal is less than or equal to the fourth threshold.
With reference to the first aspect, in a sixth possible implementation:
the acquiring unit is specifically configured to obtain attitude information of the terminal from a gyroscope;
the processing unit is specifically configured to determine the current application scene of the AR application, obtain the available attitudes of the terminal in the current application scene, reduce the calling frequency of the identification module when the attitude of the terminal does not belong to the available attitudes, and maintain or increase the calling frequency of the identification module when the attitude of the terminal belongs to the available attitudes.
With reference to the sixth possible implementation of the first aspect, in a seventh possible implementation, the processing unit is further configured to adjust the calling frequency of a tracking module in the AR application, including: reducing the calling frequency of the tracking module when the attitude of the terminal does not belong to the available attitudes; and maintaining or increasing the calling frequency of the tracking module when the attitude of the terminal belongs to the available attitudes.
With reference to the first aspect, in an eighth possible implementation:
the acquiring unit is further configured to determine the quality of the image frame obtained by the terminal;
the processing unit is specifically configured to reduce the calling frequency of the identification module when the quality of the image frame obtained by the terminal is lower than a fifth threshold, and to maintain or increase the calling frequency of the identification module when the quality of the image frame obtained by the terminal reaches or exceeds the fifth threshold.
With reference to the eighth possible implementation, in a ninth possible implementation, the processing unit is further configured to adjust the calling frequency of a tracking module in the AR application, including: reducing the calling frequency of the tracking module when the quality of the image frame obtained by the terminal is lower than a sixth threshold; and maintaining or increasing the calling frequency of the tracking module when the quality of the image frame obtained by the terminal reaches or exceeds the sixth threshold.
With reference to the first aspect and any one of the first to ninth possible implementations of the first aspect, in an eleventh possible implementation, the processing unit is further configured to adjust the calling frequency of a rendering module in the AR application according to the calling frequency of the tracking module, so that the rendering module uses the same calling frequency as the tracking module and renders the image frames processed by the tracking module;
or, the processing unit is further configured to estimate the positions of the AR target objects on the image frames not processed by the tracking module according to the positions of the AR target objects on the image frames processed by the tracking module, to obtain an estimation result, so that the rendering module renders the original image frames according to the estimation result.
The device for realizing an augmented reality application provided by this embodiment obtains the first reference information and adjusts the calling frequency of the identification module in the AR application according to the first reference information, so that the power consumption of the terminal can be effectively reduced while user experience is guaranteed; the calling frequency of the identification module in the AR application is thus adjusted in a comparatively reasonable way.
In a second aspect, an embodiment of the present invention provides a device for realizing an augmented reality application, the device including:
an acquiring unit, configured to obtain second reference information, the second reference information being the reference information on which an adjustment of the calling frequency of a tracking module in an augmented reality (AR) application is based;
a processing unit, configured to adjust the calling frequency of the tracking module according to the second reference information.
With reference to the second aspect, in a first possible implementation:
the acquiring unit is specifically configured to obtain acceleration data of the terminal from an acceleration sensor;
the processing unit is specifically configured to reduce the calling frequency of the tracking module when the current acceleration of the terminal is greater than a fourth threshold, and to maintain or increase the calling frequency of the tracking module when the current acceleration of the terminal is less than or equal to the fourth threshold.
With reference to the second aspect, in a second possible implementation:
the acquiring unit is specifically configured to obtain attitude information of the terminal from a gyroscope;
the processing unit is specifically configured to determine the current application scene of the AR application, obtain the available attitudes of the terminal in the current application scene, reduce the calling frequency of the tracking module when the attitude of the terminal does not belong to the available attitudes, and maintain or increase the calling frequency of the tracking module when the attitude of the terminal belongs to the available attitudes.
With reference to the second aspect, in a third possible implementation:
the acquiring unit is specifically configured to determine the quality of the image frame obtained by the terminal;
the processing unit is specifically configured to reduce the calling frequency of the tracking module when the quality of the image frame obtained by the terminal is lower than a sixth threshold, and to maintain or increase the calling frequency of the tracking module when the quality of the image frame obtained by the terminal reaches or exceeds the sixth threshold.
With reference to the second aspect or the first, second or third possible implementation of the second aspect, in a fourth possible implementation, the processing unit is further configured to adjust the calling frequency of a rendering module in the AR application according to the calling frequency of the tracking module, so that the rendering module uses the same calling frequency as the tracking module and renders the image frames processed by the tracking module;
or, the processing unit is further configured to estimate the positions of the AR target objects on the image frames not processed by the tracking module according to the positions of the AR target objects on the image frames processed by the tracking module, to obtain an estimation result, so that the rendering module renders the original image frames according to the estimation result.
The device for realizing an augmented reality application provided by this embodiment obtains the second reference information and adjusts the calling frequency of the tracking module in the AR application according to the second reference information, so that the power consumption of the terminal can be effectively reduced while user experience is guaranteed; the calling frequency of the tracking module in the AR application is thus adjusted in a comparatively reasonable way.
In a third aspect, an embodiment of the present invention provides a method for realizing an augmented reality application, the method including:
obtaining first reference information, the first reference information being the reference information on which an adjustment of the calling frequency of an identification module in an augmented reality (AR) application is based;
adjusting the calling frequency of the identification module according to the first reference information.
With reference to the third aspect, in a first possible implementation, the first reference information includes related information of the AR virtual content;
obtaining the first reference information includes: obtaining, from a rendering module in the AR application, the related information of the AR virtual content, the related information including the position and size of the AR virtual content on the screen;
adjusting the calling frequency of the identification module according to the first reference information includes:
reducing the calling frequency of the identification module when the area occupied by the AR virtual content on the screen is greater than or equal to a first threshold;
maintaining or increasing the calling frequency of the identification module when the area occupied by the AR virtual content on the screen is less than the first threshold.
With reference to the third aspect, in a second possible implementation, the first reference information includes related information of the AR virtual content;
obtaining the first reference information includes: obtaining, from a rendering module in the AR application, the related information of the AR virtual content, the related information including the position and size of the AR virtual content on the screen;
adjusting the calling frequency of the identification module according to the first reference information includes:
determining the remaining area of the screen according to the position and size of the AR virtual content on the screen; reducing the calling frequency of the identification module when the area occupied by the remaining area on the screen is less than a second threshold; and maintaining or increasing the calling frequency of the identification module when the area occupied by the remaining area on the screen is greater than or equal to the second threshold.
With reference to the third aspect, in a third possible implementation, the first reference information includes the running state of the AR virtual content;
adjusting the calling frequency of the identification module according to the first reference information includes:
reducing the calling frequency of the identification module when the AR virtual content is running;
maintaining or increasing the calling frequency of the identification module when the AR virtual content has finished running.
With reference to the third aspect, in a fourth possible implementation, the first reference information includes acceleration data of the terminal;
obtaining the first reference information includes: obtaining the acceleration data of the terminal from an acceleration sensor;
adjusting the calling frequency of the identification module according to the first reference information includes:
reducing the calling frequency of the identification module when the current acceleration of the terminal is greater than a third threshold;
maintaining or increasing the calling frequency of the identification module when the current acceleration of the terminal is less than or equal to the third threshold.
With reference to the fourth possible implementation of the third aspect, in a fifth possible implementation, the method further includes:
adjusting the calling frequency of a tracking module in the AR application, including:
reducing the calling frequency of the tracking module when the current acceleration of the terminal is greater than a fourth threshold;
maintaining or increasing the calling frequency of the tracking module when the current acceleration of the terminal is less than or equal to the fourth threshold.
With reference to the third aspect, in a sixth possible implementation, the first reference information includes attitude information of the terminal;
obtaining the first reference information includes: obtaining the attitude information of the terminal from a gyroscope;
adjusting the calling frequency of the identification module according to the first reference information includes:
determining the current application scene of the AR application and obtaining the available attitudes of the terminal in the current application scene;
reducing the calling frequency of the identification module when the attitude of the terminal does not belong to the available attitudes;
maintaining or increasing the calling frequency of the identification module when the attitude of the terminal belongs to the available attitudes.
With reference to the sixth possible implementation of the third aspect, in a seventh possible implementation, the method further includes:
adjusting the calling frequency of a tracking module in the AR application, including:
reducing the calling frequency of the tracking module when the attitude of the terminal does not belong to the available attitudes;
maintaining or increasing the calling frequency of the tracking module when the attitude of the terminal belongs to the available attitudes.
With reference to the third aspect, in an eighth possible implementation, the first reference information includes the quality of the image frame obtained by the terminal;
adjusting the calling frequency of the identification module according to the first reference information includes:
reducing the calling frequency of the identification module when the quality of the image frame obtained by the terminal is lower than a fifth threshold;
maintaining or increasing the calling frequency of the identification module when the quality of the image frame obtained by the terminal reaches or exceeds the fifth threshold.
With reference to the eighth possible implementation, in a ninth possible implementation, the method further includes:
adjusting the calling frequency of a tracking module in the AR application, including:
reducing the calling frequency of the tracking module when the quality of the image frame obtained by the terminal is lower than a sixth threshold;
maintaining or increasing the calling frequency of the tracking module when the quality of the image frame obtained by the terminal reaches or exceeds the sixth threshold.
With reference to the third aspect and any one of the first to ninth possible implementations of the third aspect, in an eleventh possible implementation, the method further includes:
adjusting the calling frequency of a rendering module in the AR application according to the calling frequency of the tracking module, including:
the rendering module uses the same calling frequency as the tracking module and renders the image frames processed by the tracking module;
or, the positions of the AR target objects on the image frames not processed by the tracking module are estimated according to the positions of the AR target objects on the image frames processed by the tracking module to obtain an estimation result, and the rendering module renders the original image frames according to the estimation result.
The method for realizing an augmented reality application provided by this embodiment obtains the first reference information and adjusts the calling frequency of the identification module in the AR application according to the first reference information, so that the power consumption of the terminal can be effectively reduced while user experience is guaranteed; the calling frequency of the identification module in the AR application is thus adjusted in a comparatively reasonable way.
In a fourth aspect, an embodiment of the present invention provides a method for realizing an augmented reality application, the method including:
obtaining second reference information, the second reference information being the reference information on which an adjustment of the calling frequency of a tracking module in an augmented reality (AR) application is based;
adjusting the calling frequency of the tracking module according to the second reference information.
With reference to the fourth aspect, in a first possible implementation, the second reference information includes acceleration data of the terminal;
obtaining the second reference information includes: obtaining the acceleration data of the terminal from an acceleration sensor;
adjusting the calling frequency of the tracking module according to the second reference information includes:
reducing the calling frequency of the tracking module when the current acceleration of the terminal is greater than a fourth threshold;
maintaining or increasing the calling frequency of the tracking module when the current acceleration of the terminal is less than or equal to the fourth threshold.
With reference to the fourth aspect, in a second possible implementation, the second reference information includes attitude information of the terminal;
obtaining the second reference information includes: obtaining the attitude information of the terminal from a gyroscope;
adjusting the calling frequency of the tracking module according to the second reference information includes:
determining the current application scene of the AR application and obtaining the available attitudes of the terminal in the current application scene;
reducing the calling frequency of the tracking module when the attitude of the terminal does not belong to the available attitudes;
maintaining or increasing the calling frequency of the tracking module when the attitude of the terminal belongs to the available attitudes.
With reference to the fourth aspect, in a third possible implementation, the second reference information includes the quality of the image frame obtained by the terminal;
adjusting the calling frequency of the tracking module according to the second reference information includes:
reducing the calling frequency of the tracking module when the quality of the image frame obtained by the terminal is lower than a sixth threshold;
maintaining or increasing the calling frequency of the tracking module when the quality of the image frame obtained by the terminal reaches or exceeds the sixth threshold.
With reference to the fourth aspect or the first, second or third possible implementation of the fourth aspect, in a fourth possible implementation, the method further includes:
adjusting the calling frequency of a rendering module in the AR application according to the calling frequency of the tracking module, including:
the rendering module uses the same calling frequency as the tracking module and renders the image frames processed by the tracking module;
or, the positions of the AR target objects on the image frames not processed by the tracking module are estimated according to the positions of the AR target objects on the image frames processed by the tracking module to obtain an estimation result, and the rendering module renders the original image frames according to the estimation result.
The method for realizing an augmented reality application provided by this embodiment obtains the second reference information and adjusts the calling frequency of the tracking module in the AR application according to the second reference information, so that the power consumption of the terminal can be effectively reduced while user experience is guaranteed; the calling frequency of the tracking module in the AR application is thus adjusted in a comparatively reasonable way.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or of the prior art are briefly introduced below. Clearly, the accompanying drawings described below show only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of the method for realizing an AR application provided by Embodiment one of the present invention;
Fig. 2 is a schematic flowchart of the method for realizing an AR application provided by Embodiment two of the present invention;
Fig. 3 is a schematic diagram of the screen division manner provided by Embodiment two of the present invention;
Fig. 4 is a schematic flowchart of the method for realizing an AR application provided by Embodiment three of the present invention;
Fig. 5 is a schematic flowchart of the method for realizing an AR application provided by Embodiment four of the present invention;
Fig. 6 is a schematic flowchart of the method for realizing an AR application provided by Embodiment five of the present invention;
Fig. 7 is a schematic flowchart of the method for realizing an AR application provided by Embodiment six of the present invention;
Fig. 8 is a schematic flowchart of the method for realizing an AR application provided by Embodiment seven of the present invention;
Fig. 9 is a schematic flowchart of the method for realizing an AR application provided by Embodiment eight of the present invention;
Fig. 10 is a schematic flowchart of the method for realizing an AR application provided by Embodiment nine of the present invention;
Fig. 11 is a schematic flowchart of the method for realizing an AR application provided by Embodiment ten of the present invention;
Fig. 12 is a structural block diagram of the device for realizing an AR application provided by Embodiment eleven of the present invention;
Fig. 13 is a structural block diagram of the device for realizing an AR application provided by Embodiment twelve of the present invention.
Detailed description of the invention
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Embodiment one
An embodiment of the present invention provides a method for realizing an AR application. As shown in Fig. 1, the method provided by this embodiment includes:
101. Obtain first reference information, the first reference information being the reference information on which an adjustment of the calling frequency of an identification module in an augmented reality (AR) application is based.
It should be noted that, in this embodiment, the first reference information is used to adjust the calling frequency of the identification module in the AR application. Specifically, the first reference information includes at least one of the following types of information: the display position of the AR virtual content on the terminal screen, the running state of the AR virtual content, status information of the terminal (such as acceleration data and attitude information), and quality information of the image frames captured by the camera.
102. Adjust the calling frequency of the identification module according to the first reference information.
Specifically, the AR application can set a corresponding adjustment strategy for each kind of first reference information; after the first reference information is obtained in step 101, the calling frequency of the identification module is adjusted according to the corresponding strategy, as sketched below.
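As one possible illustration of such strategy-based adjustment, a minimal sketch is given below; the interface names and the initial 10 Hz calling frequency are assumptions made purely for illustration and are not limited by this embodiment.

```java
// Hypothetical sketch: each kind of first reference information maps to an
// adjustment strategy for the identification module's calling frequency.
import java.util.ArrayList;
import java.util.List;

interface ReferenceInfo {}                       // the "first reference information"
interface AdjustStrategy {
    boolean applies(ReferenceInfo info);
    double adjustedFrequencyHz(ReferenceInfo info, double currentHz);
}

final class IdentificationScheduler {
    private final List<AdjustStrategy> strategies = new ArrayList<>();
    private double callFrequencyHz = 10.0;       // assumed initial calling frequency

    void register(AdjustStrategy s) { strategies.add(s); }

    /** Step 102: adjust the calling frequency according to the obtained reference information. */
    void onReferenceInfo(ReferenceInfo info) {
        for (AdjustStrategy s : strategies) {
            if (s.applies(info)) {
                callFrequencyHz = s.adjustedFrequencyHz(info, callFrequencyHz);
            }
        }
    }

    double currentFrequencyHz() { return callFrequencyHz; }
}
```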
The method for realizing an augmented reality application provided by this embodiment obtains the first reference information and adjusts the calling frequency of the identification module in the AR application according to the first reference information, so that the power consumption of the terminal can be effectively reduced while user experience is guaranteed; the calling frequency of the identification module in the AR application is thus adjusted in a comparatively reasonable way.
Embodiment two
For ease of understanding, based on the method provided in Embodiment one, this embodiment further provides a method for realizing an augmented reality application, described with the example in which the first reference information is the related information of the AR virtual content.
As shown in Fig. 2, the method provided by this embodiment includes:
201. The terminal obtains, from the rendering module in the AR application, the related information of the AR virtual content currently displayed on the screen, the related information including the position and size of the AR virtual content on the screen.
202. The terminal compares the related information of the AR virtual content with a preset first threshold and judges whether the area occupied by the AR virtual content on the screen reaches the first threshold; if so, step 203 is performed; if not, step 204 is performed.
203. The terminal reduces the calling frequency of the identification module in the AR application.
That is, the AR application identifies the AR target objects in the image frames at a relatively low frequency.
Through step 203, when the AR virtual content occupies a large area of the screen, the remaining clear area of the screen is small and the number of AR target objects that can be displayed in the clear area is reduced; in this case, reducing the calling frequency of the identification module still preserves the user experience of the AR application, and because the calling frequency of the identification module is reduced, the power consumption of the terminal is effectively reduced.
204. The terminal maintains or increases the calling frequency of the identification module.
Through step 204, when the AR virtual content occupies a small area of the screen, the remaining clear area of the screen is large; in this case, to improve the utilization of the screen, the calling frequency of the identification module in the AR application is increased so that as many AR target objects as possible are identified in the image frames captured by the camera, improving the user experience.
Specifically, for step 202 above, this embodiment provides the following specific manners for reference:
Mode one:
S1. The terminal divides the screen according to the position and size of the AR target object on the screen.
For example, take the case in which the number of AR target objects currently displayed on the terminal screen is 1: after the AR application identifies the AR target object, it determines the minimum bounding rectangle (bounding box) of the AR target object and uses it to divide the screen.
For ease of understanding, this embodiment provides a specific screen division manner for reference. As shown in Fig. 3, rectangle S is the bounding box of the identified AR target object, and S11-S14 are the partition rectangles surrounding the AR target object.
S2. The terminal calculates the percentage of the screen area occupied by rectangle S and determines whether the area occupied by the AR virtual content on the screen reaches the first threshold.
For example, the first threshold can be a percentage; taking 50% as an example, when the percentage of the screen area occupied by rectangle S reaches 50%, it is judged that the area occupied by the AR virtual content on the screen reaches the first threshold; otherwise, it is judged that the area occupied by the AR virtual content on the screen does not reach the first threshold, as sketched below.
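A minimal sketch of this area-based judgment is given below; the rectangle dimensions, the 50% first threshold and the frequency values are illustrative assumptions and are not limited by this embodiment.

```java
// Hypothetical sketch of steps 202-204: compare the on-screen area occupied by the
// AR virtual content (its bounding box) against the first threshold.
final class AreaBasedAdjustment {
    private final double firstThreshold;   // e.g. 0.5 for 50% of the screen area

    AreaBasedAdjustment(double firstThreshold) { this.firstThreshold = firstThreshold; }

    /** Returns the new calling frequency of the identification module. */
    double adjust(double boxWidth, double boxHeight,
                  double screenWidth, double screenHeight, double currentHz) {
        double occupied = (boxWidth * boxHeight) / (screenWidth * screenHeight);
        if (occupied >= firstThreshold) {
            return currentHz * 0.5;        // step 203: reduce the calling frequency
        }
        return Math.max(currentHz, 10.0);  // step 204: maintain or raise it (10 Hz assumed)
    }
}
```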
It should be noted that, in addition to using the area occupied by the AR target object as the basis of judgment as above, the terminal can also use the remaining area of the terminal screen as the basis of judgment, as follows: the terminal determines the remaining area of the screen according to the position and size of the AR virtual content on the screen, and adjusts the calling frequency of the identification module according to the area occupied by the remaining area on the screen, specifically including: when the area occupied by the remaining area on the screen is less than a second threshold, reducing the calling frequency of the identification module; when the area occupied by the remaining area on the screen is greater than or equal to the second threshold, maintaining or increasing the calling frequency of the identification module.
Taking Fig. 3 as an example, after the terminal screen is divided, corresponding thresholds T1-T4 are set for the partition rectangles S11-S14 surrounding the AR target object, and it is judged in turn whether the areas occupied by S11-S14 on the screen reach T1-T4; when S11-S14 reach their respective thresholds, the calling frequency of the identification module in the AR application is increased; when S11-S14 do not reach their respective thresholds, the calling frequency of the identification module in the AR application is reduced.
It is worth noting that the screen division manner in this embodiment is not fixed; the specific division manner can be determined according to the screen size, display ratio, screen resolution and the AR application scene.
Embodiment three
For ease of understanding, based on the method provided in Embodiment one, this embodiment further provides a method for realizing an augmented reality application, described with the example in which the first reference information is the running state of the AR virtual content.
As shown in Fig. 4, the method provided by this embodiment includes:
401. The terminal obtains, through the rendering module in the AR application, the AR virtual content corresponding to the AR target object, and superimposes the AR virtual content on the image frame for display.
402. The terminal monitors a trigger operation of the user and runs the AR virtual content.
Specifically, depending on the attributes of the AR virtual content, running the AR virtual content includes at least the following two manners: playing the AR virtual content, or performing an interactive operation on the AR virtual content according to a user operation.
403. The terminal monitors the running state of the AR virtual content; when the AR virtual content is running, step 404 is performed; when the AR virtual content has finished running, step 405 is performed.
404. The terminal reduces the calling frequency of the identification module in the AR application.
405. The terminal maintains or increases the calling frequency of the identification module in the AR application.
With the method provided by this embodiment, on the one hand, through step 404, when the terminal is running AR virtual content, the user's attention is mainly focused on the AR virtual content that is running, so the user's demand for identifying new AR target objects is reduced; in this case, the terminal reduces the calling frequency of the identification module in the AR application, which ensures the user experience while also reducing the power consumption of the terminal.
On the other hand, through step 405, when the terminal finishes running the AR virtual content, the user's demand for identifying new AR target objects increases; in this case, increasing the calling frequency of the identification module in the AR application further satisfies the user's demand for new AR target objects, thereby effectively improving the user experience.
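A minimal sketch of this running-state-based adjustment (steps 403-405) is given below; the reduced and maintained frequencies (2 Hz and 10 Hz) are illustrative assumptions only.

```java
// Hypothetical sketch: adjust the identification module's calling frequency
// according to the running state of the AR virtual content.
final class RunningStateAdjustment {
    private static final double LOW_HZ = 2.0;     // assumed reduced frequency
    private static final double NORMAL_HZ = 10.0; // assumed maintained/raised frequency

    double adjust(boolean virtualContentRunning, double currentHz) {
        if (virtualContentRunning) {
            return Math.min(currentHz, LOW_HZ);   // step 404: reduce while content is running
        }
        return Math.max(currentHz, NORMAL_HZ);    // step 405: maintain or raise after it ends
    }
}
```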
Embodiment four
For ease of understanding, based on the method provided in Embodiment one, this embodiment further provides a method for realizing an augmented reality application, described with the example in which the first reference information is acceleration data of the terminal.
Acceleration can measure the motion state of the terminal. Generally, the identification module in an AR application requires that the image frame to be identified is relatively clear, that is, an image frame captured when the terminal is relatively stable. In this embodiment, the maximum acceleration that the identification module in the AR application can tolerate when identifying image frames needs to be determined. For example, the terminal can capture image frames containing an AR target object under a series of different accelerations, call the identification module to identify these image frames, determine the maximum terminal acceleration at which the identification module can still recognize the AR target object, and use this maximum acceleration as the third threshold for adjusting the calling frequency of the identification module.
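A simplified sketch of such a calibration procedure is given below; the Sample abstraction and the fallback value are assumptions made for illustration, and how the acceleration samples are collected is left to the specific implementation.

```java
// Hypothetical calibration sketch: find the highest acceleration at which the
// identification module still recognizes the AR target, and use it as the third threshold.
import java.util.List;

final class ThresholdCalibration {
    interface Sample { double acceleration(); boolean targetRecognized(); }

    /** Returns the largest acceleration among samples whose image frame was still
     *  recognized successfully, i.e. the proposed third threshold. */
    static double thirdThreshold(List<Sample> samples, double fallback) {
        double max = Double.NEGATIVE_INFINITY;
        for (Sample s : samples) {
            if (s.targetRecognized() && s.acceleration() > max) {
                max = s.acceleration();
            }
        }
        return max == Double.NEGATIVE_INFINITY ? fallback : max;
    }
}
```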
As shown in Fig. 5, the method provided by this embodiment includes:
501. The terminal obtains acceleration data of the terminal from the acceleration sensor.
502. The terminal determines the current acceleration, compares it with the preset third threshold, and judges whether the current acceleration of the terminal is greater than the third threshold; if so, step 503 is performed; if not, step 504 is performed.
503. The terminal reduces the calling frequency of the identification module in the AR application.
Through step 503, when the current acceleration of the terminal is greater than the maximum acceleration the terminal can tolerate while running the AR application, the identification module in the AR application cannot effectively identify the captured image frames; even if the terminal identifies the image frames at a high frequency, it will not effectively identify AR target objects. The calling frequency of the identification module can therefore be reduced, thereby reducing the power consumption of the terminal.
504. The terminal maintains or increases the calling frequency of the identification module in the AR application.
Through step 504, when the current acceleration of the terminal is less than or equal to the maximum acceleration the terminal can tolerate while running the AR application, the identification module in the AR application can effectively identify the captured image frames; in this case, the terminal can maintain or increase the calling frequency of the identification module in the AR application, so that AR target objects are identified more effectively from the captured image frames, providing the user with a better experience.
Preferably, in this embodiment, on the premise that the calling frequency of the identification module in the AR application is adjusted, the calling frequency of the tracking module in the AR application can also be adjusted based on the acceleration data of the terminal, as follows:
S1. When the current acceleration of the terminal is greater than a fourth threshold, the terminal reduces the calling frequency of the tracking module in the AR application.
In this embodiment, the fourth threshold characterizes acceleration data of the terminal; it is the maximum acceleration that the tracking module in the AR application can tolerate while normally tracking the AR target objects in the captured image frames. When the current acceleration of the terminal is greater than the fourth threshold, the tracking module in the AR application cannot effectively track the AR target objects in the image frames; in this case, the power consumption of the terminal can be saved by reducing the calling frequency of the tracking module in the AR application.
S2. When the current acceleration of the terminal is less than or equal to the fourth threshold, the terminal maintains or increases the calling frequency of the tracking module in the AR application.
When the current acceleration of the terminal is less than or equal to the maximum acceleration that the tracking module in the AR application can tolerate, the tracking module in the AR application can effectively track the AR target objects in the image frames; in this case, the calling frequency of the tracking module in the AR application can be maintained or increased, ensuring that newly added AR target objects are tracked while improving the accuracy of tracking AR target objects, providing the user with a better experience.
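Combining steps 502-504 with steps S1-S2 above, the acceleration-based adjustment might be sketched as follows; the threshold comparisons are taken from this embodiment, while the concrete frequency values are illustrative assumptions.

```java
// Hypothetical sketch: use the terminal's current acceleration to adjust the
// calling frequencies of both the identification module and the tracking module.
final class AccelerationBasedAdjustment {
    private final double thirdThreshold;   // max acceleration tolerated by identification
    private final double fourthThreshold;  // max acceleration tolerated by tracking

    AccelerationBasedAdjustment(double third, double fourth) {
        thirdThreshold = third;
        fourthThreshold = fourth;
    }

    double identificationHz(double acceleration, double currentHz) {
        return acceleration > thirdThreshold ? currentHz * 0.5          // steps 502-503
                                             : Math.max(currentHz, 10.0); // step 504 (10 Hz assumed)
    }

    double trackingHz(double acceleration, double currentHz) {
        return acceleration > fourthThreshold ? currentHz * 0.5          // S1
                                              : Math.max(currentHz, 30.0); // S2 (30 Hz assumed)
    }
}
```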
It should be noted that, after the calling frequency of the tracking module in the AR application is adjusted through steps S1-S2, the terminal can also adjust the calling frequency of the rendering module in the AR application according to the calling frequency of the tracking module, as follows:
The rendering module in the AR application uses the same calling frequency as the tracking module and renders the image frames processed by the tracking module.
With this manner of adjusting the calling frequency of the rendering module, the terminal does not need to render every image frame but only the image frames processed by the tracking module, displaying the AR virtual content on those image frames; this guarantees the user experience of the AR application while reducing the power consumption of the terminal.
Alternatively, the rendering module can render every image frame captured by the terminal, that is, the calling frequency of the rendering module is the same as the frequency at which the terminal captures image frames. Specifically, the positions of the AR target objects on the image frames not processed by the tracking module are estimated according to the positions of the AR target objects on the image frames processed by the tracking module to obtain an estimation result, and the rendering module renders the original image frames according to the estimation result. For example, the image frames captured by the terminal are cached (the cached image frames include both the image frames processed by the tracking module and those not processed by it), and based on the image frames processed by the tracking module an interpolation algorithm is called to determine the positions of the AR target objects on the image frames not processed by the tracking module, so that the rendering module can render those image frames.
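The position estimation described above could, for example, be realized by linear interpolation between the two nearest frames processed by the tracking module; a simplified sketch under that assumption is given below (the Position type and frame indices are illustrative).

```java
// Hypothetical sketch: estimate the AR target position on a frame the tracking module
// skipped, by linearly interpolating between two tracked (cached) frames.
final class PositionEstimator {
    static final class Position {
        final double x, y;
        Position(double x, double y) { this.x = x; this.y = y; }
    }

    /** Linearly interpolates the target position for frame index 'frame', where prevFrame
     *  and nextFrame are the nearest frames processed by the tracking module. */
    static Position estimate(int prevFrame, Position prevPos,
                             int nextFrame, Position nextPos, int frame) {
        if (nextFrame == prevFrame) return prevPos;
        double t = (double) (frame - prevFrame) / (nextFrame - prevFrame);
        return new Position(prevPos.x + t * (nextPos.x - prevPos.x),
                            prevPos.y + t * (nextPos.y - prevPos.y));
    }
}
```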
Embodiment five
For ease of understanding, based on the method provided in Embodiment one, this embodiment further provides a method for realizing an augmented reality application, described with the example in which the first reference information is attitude information of the terminal.
A gyroscope can monitor the current attitude of the terminal. Under different AR application scenes, the AR application has different requirements on the attitude of the terminal. For example, in an oil painting exhibition center, when an AR application is used for the oil painting exhibition, because the oil paintings hang vertically, the ideal attitude of the terminal should keep the viewfinder plane of the camera vertical. In general, the camera is fixed in the terminal, so whether the camera's framing suits the current AR application scene can be judged from the attitude of the terminal.
As shown in Fig. 6, the method provided by this embodiment includes:
601. The terminal obtains attitude information of the terminal from the gyroscope.
602. The terminal determines the current application scene of the AR application and obtains the available attitudes of the terminal in the current application scene.
Specifically, the terminal can determine the current application scene of the AR application in at least the following three manners:
Mode one:
The application scene of the AR application is determined by presetting. Specifically, the AR application can preset several scenes for the user to choose from; or the terminal is a customized device, for example, in an oil painting exhibition scene the terminal in this embodiment can be pre-customized smart glasses whose default working scene is the oil painting exhibition.
Mode two:
The terminal determines its current location by positioning and determines the application scene of the AR application in combination with the current time information. For example, suppose the opening hours of the oil painting exhibition center are 9:00-17:00; after the terminal starts the AR application, when it determines by positioning that its current location is the oil painting exhibition center and that the current time is within the opening hours of the center, the terminal determines that the application scene of the AR application is the oil painting exhibition AR.
Mode three:
The terminal identifies the image frames captured by the camera and determines the current application scene of the AR application according to the content of the image frames. For example, when the AR application identifies a large number of oil paintings in the image frames captured by the camera, it displays prompt information asking the user whether the AR application scene should be set to the oil painting exhibition scene.
Through the above three manners, the terminal can determine the current application scene of the AR application. It should be noted that the methods for determining the AR application scene in this embodiment are not limited to these.
Optionally, for different AR application scenes, the AR application can preset corresponding available attitudes; alternatively, the available attitudes under different AR application scenes can be set by the user.
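A minimal sketch of the scene-to-available-attitude lookup used in steps 602-603 is given below; representing an attitude by a pitch angle in degrees, and the 70-110 degree range for the oil painting exhibition scene, are illustrative assumptions only.

```java
// Hypothetical sketch: decide whether the terminal's current attitude belongs to the
// available attitudes of the current AR application scene.
import java.util.HashMap;
import java.util.Map;

final class AttitudeCheck {
    static final class Range {
        final double minPitchDeg, maxPitchDeg;
        Range(double min, double max) { minPitchDeg = min; maxPitchDeg = max; }
        boolean contains(double pitchDeg) { return pitchDeg >= minPitchDeg && pitchDeg <= maxPitchDeg; }
    }

    private final Map<String, Range> availableAttitudes = new HashMap<>();

    AttitudeCheck() {
        // e.g. oil painting exhibition: the camera's viewfinder plane should stay roughly vertical
        availableAttitudes.put("oil_painting_exhibition", new Range(70.0, 110.0));
    }

    /** Returns true when the current pitch (from the gyroscope/attitude data) is an available
     *  attitude for the scene, so the identification frequency can be maintained or increased. */
    boolean isAvailable(String scene, double pitchDeg) {
        Range r = availableAttitudes.get(scene);
        return r != null && r.contains(pitchDeg);
    }
}
```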
603. The terminal detects whether its current attitude belongs to the available attitudes in the current application scene; if so, step 604 is performed; if not, step 605 is performed.
604. The terminal maintains or increases the calling frequency of the identification module in the AR application.
Through step 604, when the current attitude of the terminal belongs to the available attitudes in the current application scene, the identification module in the AR application can effectively identify the captured image frames; in this case, the terminal can maintain or increase the calling frequency of the identification module in the AR application, so that AR target objects are identified more effectively from the captured image frames, providing the user with a better experience.
605. The terminal reduces the calling frequency of the identification module in the AR application.
Through step 605, when the current attitude of the terminal does not belong to the available attitudes in the current application scene, the identification module in the AR application cannot effectively identify the captured image frames; even if the terminal identifies the image frames at a high frequency, it will not effectively identify AR target objects. The calling frequency of the identification module can therefore be reduced, thereby reducing the power consumption of the terminal.
Preferably, on the premise that the calling frequency of the identification module in the AR application is adjusted, this embodiment can also adjust the calling frequency of the tracking module in the AR application based on the attitude information of the terminal. Specifically, after step 604, the method further includes the following step:
606. The terminal maintains or increases the calling frequency of the tracking module in the AR application.
When the current attitude of the terminal belongs to the available attitudes in the current application scene, the identification module in the AR application can effectively identify the captured image frames, that is, the identification module can effectively identify newly added AR target objects in the captured image frames; in this case, the calling frequency of the tracking module in the AR application can be maintained or increased to track the newly added AR target objects, providing the user with a better experience.
After step 605, the method further includes the following step:
607. The terminal reduces the calling frequency of the tracking module in the AR application.
When the current attitude of the terminal does not belong to the available attitudes in the current application scene, the identification module in the AR application cannot effectively identify the captured image frames; in this case, hardly any new AR target objects are identified, so the tracking module can track the AR target objects at a relatively low frequency, thereby reducing the power consumption of the terminal.
It should be noted that, after the tracking module in the AR application is adjusted through steps 606-607, the terminal can also adjust the calling frequency of the rendering module in the AR application according to the calling frequency of the tracking module, as follows: the rendering module in the AR application uses the same calling frequency as the tracking module and renders the image frames processed by the tracking module.
With this manner of adjusting the calling frequency of the rendering module, the terminal does not need to render every image frame but only the image frames processed by the tracking module, displaying the AR virtual content on those image frames; this guarantees the user experience of the AR application while reducing the power consumption of the terminal.
Alternatively, the rendering module can render every image frame captured by the terminal, that is, the calling frequency of the rendering module is the same as the frequency at which the terminal captures image frames. Specifically, the positions of the AR target objects on the image frames not processed by the tracking module are estimated according to the positions of the AR target objects on the image frames processed by the tracking module to obtain an estimation result, and the rendering module renders the original image frames according to the estimation result. For example, the image frames captured by the terminal are cached (the cached image frames include both the image frames processed by the tracking module and those not processed by it), and based on the image frames processed by the tracking module an interpolation algorithm is called to determine the positions of the AR target objects on the image frames not processed by the tracking module, so that the rendering module can render those image frames.
Embodiment six
For ease of understanding, based on the method provided in Embodiment one, this embodiment further provides a method for realizing an augmented reality application, described with the example in which the first reference information is the quality of the image frame obtained by the terminal.
As shown in Fig. 7, the method provided by this embodiment includes:
701. The terminal obtains the quality of the image frame captured by the camera.
Specifically, the terminal can use an independent image quality detection module to detect the image frames captured by the camera and determine their quality. Alternatively, on the Android platform, the terminal can obtain the quality of an image frame from the focus-success tag provided by the camera (for example, the ContinuousAutoFocus tag).
702. The terminal judges whether the quality of the image frame captured by the camera reaches the preset fifth threshold; if so, step 703 is performed; if not, step 704 is performed.
In this embodiment, the fifth threshold is set in advance and is a critical value characterizing image frame quality. That is, when the quality of an image frame reaches the fifth threshold, the identification module in the AR application can recognize the AR target objects in the image frame; when the quality of an image frame does not reach the fifth threshold, the identification module in the AR application cannot recognize the AR target objects in the image frame, or its success rate in recognizing them is relatively low.
703, terminal maintain or improve AR application in identification module call frequency.
By step 703, when the quality of the picture frame of photographic head shooting reaches described five threshold value, identification module effectively can identify AR target object from this picture frame;In this case, by maintain or improve identification module in AR application call frequency, it is possible to when picture frame is identified, effectively avoid the omission of AR target object in picture frame, promote Consumer's Experience.
704, terminal reduce AR application in identification module call frequency.
By step 704, when the quality of the picture frame that photographic head shoots is not up to described five threshold value, identification module effectively can not identify AR target object from this picture frame;In this case, improve identification module call frequency also cannot be effectively improved AR application Consumer's Experience, it is possible to reduce AR application in identification module call frequency, thus saving the power consumption of terminal.
Preferably, on the premise that the calling frequency of the identification module in the AR application is adjusted, this embodiment may also adjust the calling frequency of the tracking module in the AR application according to the image-frame quality, as follows:
S1. When the quality of the image frames obtained by the terminal reaches or exceeds a sixth threshold, maintain or increase the calling frequency of the tracking module.
S2. When the quality of the image frames obtained by the terminal is lower than the sixth threshold, reduce the calling frequency of the tracking module.
The sixth threshold characterizes the quality of the image frames captured by the terminal. When the quality of an image frame reaches the sixth threshold, the tracking module in the AR application can effectively track the AR target object in the frame; in this case, maintaining or increasing the calling frequency of the tracking module improves the accuracy of the position information of the AR target object in the frame and improves the user experience. Conversely, when the quality of an image frame is lower than the sixth threshold, the tracking module cannot effectively track the AR target object in the frame; in this case, reducing the calling frequency of the tracking module reduces the power consumption of the terminal.
It should be noted that, after the calling frequency of the tracking module in the AR application has been adjusted in steps S1-S2, the terminal may further adjust the calling frequency of the rendering module in the AR application according to the calling frequency of the tracking module. Specifically, the rendering module adopts the same calling frequency as the tracking module and renders only the image frames that the tracking module has processed.
With this adjustment, the terminal does not need to render every image frame; it renders only the frames processed by the tracking module and displays the AR virtual content on those frames, which preserves the user experience of the AR application while reducing the power consumption of the terminal.
Alternatively, the rendering module may render every image frame captured by the terminal, that is, the calling frequency of the rendering module equals the frame rate at which the terminal captures image frames. Specifically, based on the positions of the AR target object in the frames that the tracking module has processed, the terminal estimates the position of the AR target object in the unprocessed frames, and the rendering module renders those raw frames according to the estimation result. For example, the terminal buffers the captured frames (both processed and unprocessed), and a difference operation based on the processed frames determines the position of the AR target object in the unprocessed frames so that the rendering module can render them.
The methods provided in Embodiments 1 to 6 of this application are mainly used to adjust the calling frequency of the identification module in the AR application; Embodiments 2 to 6 describe different types of first reference information in detail. It should be noted that, in practice, several different types of first reference information (for example, the display position of the AR virtual content on the terminal screen, the running state of the AR virtual content, the acceleration information of the terminal, the attitude information of the terminal, and the image-frame quality information) may be considered together to adjust the calling frequency of the identification module in the AR application. For example, the AR application may set a weight for each type of first reference information and combine the multiple first reference information items provided in Embodiments 2 to 6 to adjust the calling frequency of the identification module. Optionally, the AR application may preset several different calling frequencies for the adjustment of the identification module.
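The patent does not specify a combination formula. Purely as a hedged illustration, the following Java sketch combines normalized per-reference scores with preset weights and maps the result onto one of several preset calling frequencies; all weights, score conventions, and frequency values are assumptions.

```java
import java.util.EnumMap;
import java.util.Map;

/** Illustrative weighted combination of several types of first reference information. */
final class RecognitionFrequencyPolicy {
    enum Reference { VIRTUAL_CONTENT_AREA, CONTENT_RUNNING, ACCELERATION, ATTITUDE, FRAME_QUALITY }

    // Assumed weights (sum to 1.0) and assumed preset calling frequencies in Hz.
    private final Map<Reference, Double> weights = new EnumMap<>(Reference.class);
    private final double[] presetFrequenciesHz = {1.0, 5.0, 15.0};

    RecognitionFrequencyPolicy() {
        weights.put(Reference.VIRTUAL_CONTENT_AREA, 0.25);
        weights.put(Reference.CONTENT_RUNNING, 0.15);
        weights.put(Reference.ACCELERATION, 0.20);
        weights.put(Reference.ATTITUDE, 0.20);
        weights.put(Reference.FRAME_QUALITY, 0.20);
    }

    /**
     * @param scores per-reference scores in [0, 1], where 1 means "recognition is
     *               worthwhile" (e.g. small occupied area, good frame quality)
     * @return the preset calling frequency to use for the identification module
     */
    double chooseFrequency(Map<Reference, Double> scores) {
        double combined = 0;
        for (Map.Entry<Reference, Double> e : weights.entrySet()) {
            combined += e.getValue() * scores.getOrDefault(e.getKey(), 0.5);
        }
        int level = (int) Math.min(presetFrequenciesHz.length - 1,
                                   Math.floor(combined * presetFrequenciesHz.length));
        return presetFrequenciesHz[level];
    }
}
```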
Embodiment seven
This embodiment provides a method for realizing an augmented reality application. As shown in Fig. 8, the method includes:
801. The terminal obtains second reference information, where the second reference information is the reference information on which the adjustment of the calling frequency of the tracking module in the augmented reality (AR) application is based.
It should be noted that, in this embodiment, the second reference information is used by the AR application to adjust the calling frequency of the tracking module. Specifically, the second reference information includes at least one of the following types of information: the acceleration information of the terminal, the attitude information of the terminal, and the image-frame quality information.
802. The terminal adjusts the calling frequency of the tracking module according to the second reference information.
Specifically, the AR application may set a corresponding adjustment policy for each type of second reference information; after the second reference information is obtained in step 801, the calling frequency of the tracking module is adjusted according to the corresponding policy.
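The patent leaves the mechanism behind a "calling frequency" open. As a hedged sketch under that assumption, one simple realization is to skip frames in the per-frame processing loop; the Java snippet below and its names are illustrative only.

```java
/** Illustrative frame-skipping scheduler: run the tracker on every N-th camera frame. */
final class TrackingScheduler {
    private volatile int framesPerTrackingCall = 1;   // 1 = track every frame
    private long frameCounter;

    /** Policy hook, e.g. set to 1 when conditions are good and to 4 to lower the frequency. */
    void setFramesPerTrackingCall(int n) {
        framesPerTrackingCall = Math.max(1, n);
    }

    /** Called once per captured camera frame; returns true if the tracking module should run. */
    boolean shouldTrack() {
        return (frameCounter++ % framesPerTrackingCall) == 0;
    }
}
```

A lower calling frequency (a larger skip) simply means the tracking module runs less often, which is where the power saving described in these embodiments comes from.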
With the method for realizing an augmented reality application provided in this embodiment, the terminal obtains the second reference information and adjusts the calling frequency of the tracking module in the AR application according to it. On the premise of preserving the user experience, the power consumption of the terminal can be effectively reduced, so the calling frequency of the tracking module in the AR application is adjusted in a comparatively reasonable way.
Embodiment eight
For ease of understanding, based on the method provided in Embodiment 7, this embodiment further provides a method for realizing an augmented reality application; the description here takes the case in which the second reference information is the acceleration information of the terminal.
As shown in Fig. 9, the method provided in this embodiment includes:
901. The terminal obtains its acceleration information from the acceleration sensor.
902. The terminal determines the current acceleration, compares it with a preset fourth threshold, and judges whether the current acceleration of the terminal is greater than the fourth threshold; if yes, step 903 is performed; if not, step 904 is performed.
903. The terminal reduces the calling frequency of the tracking module in the AR application.
904. The terminal maintains or increases the calling frequency of the tracking module in the AR application.
In this embodiment, the fourth threshold characterizes the acceleration of the terminal: it is the maximum acceleration under which the tracking module in the AR application can still normally track the AR target object in the captured image frames. When the current acceleration of the terminal is greater than the fourth threshold, the tracking module cannot effectively track the AR target object in the frames; in this case, the power consumption of the terminal can be saved by reducing the calling frequency of the tracking module in the AR application.
When the current acceleration of the terminal while running the AR application is less than or equal to the maximum acceleration that the tracking module can bear, the tracking module can effectively track the AR target object in the captured image frames, and the identification module can still effectively recognize newly appearing AR target objects. In this case, the calling frequency of the tracking module in the AR application can be maintained or increased, which ensures that newly appearing AR target objects are tracked and improves the tracking accuracy, providing the user with a better experience.
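As a hedged illustration (not prescribed by the patent), the following Android Java sketch reads the accelerometer and switches the tracking frequency around a fourth threshold; the threshold value, the skip counts, and the hypothetical TrackingScheduler from the earlier sketch are assumptions.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

/** Illustrative accelerometer-driven adjustment of the tracking call frequency. */
final class AccelerationPolicy implements SensorEventListener {
    private static final float FOURTH_THRESHOLD = 4.0f;   // m/s^2 above gravity, assumed value
    private final TrackingScheduler scheduler;             // hypothetical frame-skipping scheduler

    AccelerationPolicy(SensorManager sensorManager, TrackingScheduler scheduler) {
        this.scheduler = scheduler;
        Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float ax = event.values[0], ay = event.values[1], az = event.values[2];
        // Rough estimate of motion: acceleration magnitude with gravity removed.
        float magnitude = Math.abs((float) Math.sqrt(ax * ax + ay * ay + az * az)
                                   - SensorManager.GRAVITY_EARTH);
        if (magnitude > FOURTH_THRESHOLD) {
            scheduler.setFramesPerTrackingCall(4);   // track less often while the device moves fast
        } else {
            scheduler.setFramesPerTrackingCall(1);   // track every frame when motion is gentle
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { /* not used */ }
}
```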
It should be noted that, after the calling frequency of the tracking module in the AR application has been adjusted in steps 903-904, the terminal may further adjust the calling frequency of the rendering module in the AR application according to the calling frequency of the tracking module. Specifically, the rendering module adopts the same calling frequency as the tracking module and renders only the image frames that the tracking module has processed.
With this adjustment, the terminal does not need to render every image frame; it renders only the frames processed by the tracking module and displays the AR virtual content on those frames, which preserves the user experience of the AR application while reducing the power consumption of the terminal.
Alternatively, the rendering module may render every image frame captured by the terminal, that is, the calling frequency of the rendering module equals the frame rate at which the terminal captures image frames. Specifically, based on the positions of the AR target object in the frames that the tracking module has processed, the terminal estimates the position of the AR target object in the unprocessed frames, and the rendering module renders those raw frames according to the estimation result. For example, the terminal buffers the captured frames (both processed and unprocessed), and a difference operation based on the processed frames determines the position of the AR target object in the unprocessed frames so that the rendering module can render them.
Embodiment nine
For ease of understanding, based on the method provided in Embodiment 7, this embodiment further provides a method for realizing an augmented reality application; the description here takes the case in which the second reference information is the attitude information of the terminal.
As shown in Fig. 10, the method provided in this embodiment includes:
1001. The terminal obtains its attitude information from the gyroscope.
1002. The terminal determines the current application scenario of the AR application and obtains the available attitudes of the terminal under the current application scenario.
Specifically, the terminal can determine the current application scenario of the AR application in at least the following three ways:
Mode one: the application scenario is preset. For example, the AR application may offer several scenarios for the user to choose from; or the terminal is a customized device, for instance a pair of smart glasses customized for an oil-painting exhibition whose default working scenario is the exhibition.
Mode two: the terminal determines its current location by positioning and determines the application scenario together with the current time. For example, if the opening hours of an oil-painting exhibition center are 9:00-17:00, then after the AR application is started, when the terminal determines by positioning that it is currently at the exhibition center and the current time falls within the opening hours, the terminal determines that the current application scenario is the oil-painting exhibition.
Mode three: the terminal recognizes the image frames captured by the camera and determines the current application scenario according to their content. For example, when the AR application recognizes a large number of oil paintings in the captured frames, it displays a prompt asking the user whether the application scenario should be set to the oil-painting exhibition scenario.
By the above three modes the terminal can determine the current application scenario of the AR application; of course, the ways of determining the application scenario in this embodiment are not limited to these.
Optionally, for different AR application scenarios, the AR application may preset the corresponding available attitudes, or the available attitudes under different scenarios may be set by the user.
1003. The terminal detects whether its current pose belongs to the available attitudes under the current application scenario; if yes, step 1004 is performed; if not, step 1005 is performed.
1004. The terminal maintains or increases the calling frequency of the tracking module in the AR application.
When the current pose of the terminal belongs to the available attitudes under the current application scenario, the identification module in the AR application can effectively recognize the captured image frames, that is, it can recognize newly appearing AR target objects. In this case, the calling frequency of the tracking module in the AR application can be maintained or increased so that the newly appearing AR target objects are tracked, providing the user with a better experience.
1005. The terminal reduces the calling frequency of the tracking module in the AR application.
When the current pose of the terminal does not belong to the available attitudes under the current application scenario, the identification module in the AR application cannot effectively recognize the captured image frames. In this case, almost no new AR target objects are recognized, so the tracking module can track the AR target objects at a lower frequency, thereby reducing the power consumption of the terminal.
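As a hedged sketch of the "available attitude" check (assuming pitch and roll angles have already been derived from the terminal's orientation sensors), the following Java snippet keeps a per-scenario attitude range and tests the current pose against it; all names, angles, and the example scenario key are assumptions.

```java
import java.util.HashMap;
import java.util.Map;

/** Illustrative per-scenario "available attitude" check based on pitch/roll ranges. */
final class AttitudePolicy {

    /** Allowed attitude range for one scenario, in degrees. */
    static final class Range {
        final float minPitch, maxPitch, minRoll, maxRoll;
        Range(float minPitch, float maxPitch, float minRoll, float maxRoll) {
            this.minPitch = minPitch; this.maxPitch = maxPitch;
            this.minRoll = minRoll;   this.maxRoll = maxRoll;
        }
        boolean contains(float pitch, float roll) {
            return pitch >= minPitch && pitch <= maxPitch
                && roll  >= minRoll  && roll  <= maxRoll;
        }
    }

    private final Map<String, Range> availableAttitudes = new HashMap<>();

    AttitudePolicy() {
        // Assumed example: in an oil-painting exhibition the device is held roughly upright.
        availableAttitudes.put("oil_painting_exhibition", new Range(-30f, 30f, -45f, 45f));
    }

    /** Returns true if the current pose counts as an available attitude for the scenario. */
    boolean isAvailableAttitude(String scenario, float pitchDeg, float rollDeg) {
        Range range = availableAttitudes.get(scenario);
        return range != null && range.contains(pitchDeg, rollDeg);
    }
}
```

A caller would keep or raise the tracking frequency when isAvailableAttitude returns true (step 1004) and lower it otherwise (step 1005).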
It should be noted that, after the calling frequency of the tracking module in the AR application has been adjusted in steps 1004-1005, the terminal may further adjust the calling frequency of the rendering module in the AR application according to the calling frequency of the tracking module. Specifically, the rendering module adopts the same calling frequency as the tracking module and renders only the image frames that the tracking module has processed.
With this adjustment, the terminal does not need to render every image frame; it renders only the frames processed by the tracking module and displays the AR virtual content on those frames, which preserves the user experience of the AR application while reducing the power consumption of the terminal.
Alternatively, the rendering module may render every image frame captured by the terminal, that is, the calling frequency of the rendering module equals the frame rate at which the terminal captures image frames. Specifically, based on the positions of the AR target object in the frames that the tracking module has processed, the terminal estimates the position of the AR target object in the unprocessed frames, and the rendering module renders those raw frames according to the estimation result. For example, the terminal buffers the captured frames (both processed and unprocessed), and a difference operation based on the processed frames determines the position of the AR target object in the unprocessed frames so that the rendering module can render them.
Embodiment ten
For ease of understanding, based on the method provided in Embodiment 7, this embodiment further provides a method for realizing an augmented reality application; the description here takes the case in which the second reference information is the quality of the image frames obtained by the terminal.
As shown in Fig. 11, the method provided in this embodiment includes:
1101. The terminal obtains the quality of the image frames captured by the camera.
Specifically, the terminal may detect the image frames captured by the camera with an independent image-quality detection module to determine their quality, or, on the Android platform, obtain the image quality from the focus-success flag provided by the camera (for example, the ContinuousAutoFocus flag).
1102. The terminal judges whether the quality of the image frames captured by the camera reaches a preset sixth threshold; if yes, step 1103 is performed; if not, step 1104 is performed.
In this embodiment, the sixth threshold is set in advance and is a critical value characterizing image-frame quality. That is, when the quality of an image frame reaches the sixth threshold, the tracking module in the AR application can track the AR target object in the frame; when the quality does not reach the sixth threshold, the tracking module cannot track the AR target object in the frame.
1103. The terminal maintains or increases the calling frequency of the tracking module in the AR application.
By step 1103, when the quality of the captured image frames reaches the sixth threshold, the tracking module can track the AR target object in the frames; in this case, maintaining or increasing the calling frequency of the tracking module in the AR application improves the accuracy of the position information of the AR target object in the frames and improves the user experience.
1104. The terminal reduces the calling frequency of the tracking module in the AR application.
By step 1104, when the quality of the captured image frames does not reach the sixth threshold, the tracking module cannot track the AR target object in the frames; in this case, the power consumption of the terminal can be saved by reducing the calling frequency of the tracking module in the AR application.
It should be noted that, after the calling frequency of the tracking module in the AR application has been adjusted in steps 1103-1104, the terminal may further adjust the calling frequency of the rendering module in the AR application according to the calling frequency of the tracking module. Specifically, the rendering module adopts the same calling frequency as the tracking module and renders only the image frames that the tracking module has processed.
With this adjustment, the terminal does not need to render every image frame; it renders only the frames processed by the tracking module and displays the AR virtual content on those frames, which preserves the user experience of the AR application while reducing the power consumption of the terminal.
Alternatively, the rendering module may render every image frame captured by the terminal, that is, the calling frequency of the rendering module equals the frame rate at which the terminal captures image frames. Specifically, based on the positions of the AR target object in the frames that the tracking module has processed, the terminal estimates the position of the AR target object in the unprocessed frames, and the rendering module renders those raw frames according to the estimation result. For example, the terminal buffers the captured frames (both processed and unprocessed), and a difference operation based on the processed frames determines the position of the AR target object in the unprocessed frames so that the rendering module can render them.
The methods provided in Embodiments 7 to 10 of this application are mainly used to adjust the calling frequency of the tracking module in the AR application; Embodiments 8 to 10 describe different types of second reference information in detail. It should be noted that, in practice, several different types of second reference information (for example, the acceleration information of the terminal, the attitude information of the terminal, and the image-frame quality information) may be considered together to adjust the calling frequency of the tracking module in the AR application. For example, the AR application may set a weight for each type of second reference information and combine the multiple second reference information items provided in Embodiments 8 to 10 to adjust the calling frequency of the tracking module. In practice, the AR application may preset several different calling frequencies for the adjustment of the tracking module.
Embodiment 11
This embodiment provides a device for realizing an augmented reality application, capable of implementing the methods for realizing an augmented reality application provided in Embodiments 1 to 6.
As shown in Fig. 12, the device provided in this embodiment includes:
An acquiring unit 1201, configured to obtain first reference information, where the first reference information is the reference information on which the adjustment of the calling frequency of the identification module in the augmented reality (AR) application is based;
A processing unit 1202, configured to adjust the calling frequency of the identification module according to the first reference information obtained by the acquiring unit 1201.
It should be noted that the device for realizing an augmented reality application provided in this embodiment is applied to a terminal and can be implemented by software plus the necessary general-purpose hardware, or, of course, by hardware alone; in many cases the former is the preferred implementation.
Based on the device for realizing an augmented reality application shown in Fig. 12, in a first possible implementation:
The acquiring unit 1201 is specifically configured to obtain relevant information about the AR virtual content from the rendering module in the AR application, where the relevant information includes the position and size of the AR virtual content on the screen;
The processing unit 1202 is specifically configured to reduce the calling frequency of the identification module when the area occupied by the AR virtual content on the screen is greater than or equal to a first threshold, and to maintain or increase the calling frequency of the identification module when the area occupied by the AR virtual content on the screen is less than the first threshold.
Further, the processing unit 1202 is also configured to determine the remaining area on the screen according to the position and size of the AR virtual content on the screen; to reduce the calling frequency of the identification module when the remaining area occupied on the screen is less than a second threshold; and to maintain or increase the calling frequency of the identification module when the remaining area occupied on the screen is greater than or equal to the second threshold.
With this first possible implementation: on the one hand, when the AR virtual content occupies a large area on the screen, the remaining blank area is small and fewer AR target objects could be displayed there; in this case the AR application can reduce the calling frequency of the identification module while still preserving the user experience, and the reduced calling frequency effectively lowers the power consumption of the terminal. On the other hand, when the AR virtual content occupies a small area on the screen, the remaining blank area is large; in this case, to improve the utilization of the screen, the calling frequency of the identification module in the AR application is increased so that as many AR target objects as possible are recognized in the captured image frames, improving the user experience.
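As a hedged, purely illustrative sketch of this first implementation (the class name and the threshold value are assumptions), the following Java snippet computes the fraction of the screen covered by the rendered virtual content and compares it with the first threshold.

```java
import android.graphics.Rect;
import java.util.List;

/** Illustrative check of how much of the screen the AR virtual content occupies. */
final class ScreenAreaPolicy {
    private static final double FIRST_THRESHOLD = 0.5;   // assumed: half of the screen area

    /**
     * @param contentRects on-screen bounding boxes of the rendered AR virtual content
     * @return true if the calling frequency of the identification module should be reduced
     */
    static boolean shouldReduceRecognition(List<Rect> contentRects,
                                           int screenWidth, int screenHeight) {
        long covered = 0;
        for (Rect r : contentRects) {
            // Approximation: sum the individual areas (ignores overlaps between boxes).
            covered += (long) r.width() * r.height();
        }
        double fraction = covered / (double) ((long) screenWidth * screenHeight);
        return fraction >= FIRST_THRESHOLD;
    }
}
```

The second-threshold variant on the remaining blank area follows the same pattern, testing 1 minus this fraction against its own threshold.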
Based on the device for realizing an augmented reality application shown in Fig. 12, in a second possible implementation:
The acquiring unit 1201 is specifically configured to obtain the running state of the AR virtual content;
The processing unit 1202 is specifically configured to reduce the calling frequency of the identification module when the AR virtual content is running, and to maintain or increase the calling frequency of the identification module when the AR virtual content has finished running.
With this second possible implementation: on the one hand, when the terminal is running AR virtual content, the user's attention is mainly focused on that content and the demand for recognizing new AR target objects decreases; in this case the terminal can reduce the calling frequency of the identification module in the AR application, preserving the user experience while also reducing power consumption. On the other hand, when the terminal finishes running the AR virtual content, the user's demand for recognizing new AR target objects increases; in this case, increasing the calling frequency of the identification module in the AR application better satisfies that demand and effectively improves the user experience.
Based on the device for realizing an augmented reality application shown in Fig. 12, in a third possible implementation:
The acquiring unit 1201 is specifically configured to obtain the acceleration information of the terminal from the acceleration sensor;
The processing unit 1202 is specifically configured to reduce the calling frequency of the identification module when the current acceleration of the terminal is greater than a third threshold, and to maintain or increase the calling frequency of the identification module when the current acceleration of the terminal is less than or equal to the third threshold.
Further, the processing unit 1202 is also configured to adjust the calling frequency of the tracking module in the AR application, including: reducing the calling frequency of the tracking module when the current acceleration of the terminal is greater than a fourth threshold, and maintaining or increasing the calling frequency of the tracking module when the current acceleration of the terminal is less than or equal to the fourth threshold.
With this third possible implementation: on the one hand, when the current acceleration of the terminal while running the AR application exceeds the maximum acceleration the terminal can bear, the identification module in the AR application cannot effectively recognize the captured image frames, and even recognizing them at a higher frequency would not effectively identify AR target objects; therefore the calling frequency of the identification module can be reduced, lowering the power consumption of the terminal. On the other hand, when the current acceleration of the terminal while running the AR application is less than or equal to that maximum acceleration, the identification module can effectively recognize the captured image frames; in this case the terminal can maintain or increase the calling frequency of the identification module to recognize AR target objects from the captured frames more effectively, providing the user with a better experience.
Based on the device for realizing an augmented reality application shown in Fig. 12, in a fourth possible implementation:
The acquiring unit 1201 is specifically configured to obtain the attitude information of the terminal from the gyroscope;
The processing unit 1202 is specifically configured to determine the current application scenario of the AR application and obtain the available attitudes of the terminal under the current application scenario; to reduce the calling frequency of the identification module when the attitude of the terminal does not belong to the available attitudes; and to maintain or increase the calling frequency of the identification module when the attitude of the terminal belongs to the available attitudes.
Further, the processing unit 1202 is also configured to adjust the calling frequency of the tracking module in the AR application, including: reducing the calling frequency of the tracking module when the attitude of the terminal does not belong to the available attitudes, and maintaining or increasing the calling frequency of the tracking module when the attitude of the terminal belongs to the available attitudes.
With this fourth possible implementation: on the one hand, when the current pose of the terminal belongs to the available attitudes under the current application scenario, the identification module in the AR application can effectively recognize the captured image frames; in this case the terminal can maintain or increase the calling frequency of the identification module to recognize AR target objects from the captured frames more effectively, providing the user with a better experience. On the other hand, when the current pose of the terminal does not belong to the available attitudes, the identification module cannot effectively recognize the captured image frames, and even recognizing them at a higher frequency would not effectively identify AR target objects; therefore the calling frequency of the identification module can be reduced, lowering the power consumption of the terminal.
Based on the device for realizing an augmented reality application shown in Fig. 12, in a fifth possible implementation:
The acquiring unit 1201 is further configured to determine the quality of the image frames obtained by the terminal;
The processing unit 1202 is specifically configured to reduce the calling frequency of the identification module when the quality of the image frames obtained by the terminal is lower than a fifth threshold, and to maintain or increase the calling frequency of the identification module when the quality of the image frames reaches or exceeds the fifth threshold.
Further, the processing unit 1202 is also configured to adjust the calling frequency of the tracking module in the AR application, including: reducing the calling frequency of the tracking module when the quality of the image frames obtained by the terminal is lower than a sixth threshold, and maintaining or increasing the calling frequency of the tracking module when the quality of the image frames reaches or exceeds the sixth threshold.
With this fifth possible implementation: on the one hand, when the quality of the image frames captured by the camera reaches the fifth threshold, the identification module can effectively recognize AR target objects from the frames; in this case, maintaining or increasing the calling frequency of the identification module in the AR application avoids missing AR target objects during recognition and improves the user experience. On the other hand, when the quality of the captured frames does not reach the fifth threshold, the identification module cannot effectively recognize AR target objects from them; in this case, a higher calling frequency would not improve the user experience, so the calling frequency of the identification module in the AR application can be reduced to save the power consumption of the terminal.
Based on the above five possible implementations, further, the processing unit 1202 is also configured to adjust the calling frequency of the rendering module in the AR application according to the calling frequency of the tracking module, so that the rendering module adopts the same calling frequency as the tracking module and renders the image frames processed by the tracking module;
Alternatively, the processing unit 1202 is also configured to estimate, according to the position of the AR target object in the image frames processed by the tracking module, the position of the AR target object in the image frames not processed by the tracking module, and to obtain an estimation result, so that the rendering module renders those raw image frames according to the estimation result.
With the device for realizing an augmented reality application provided in this embodiment, the acquiring unit 1201 obtains the first reference information and the processing unit 1202 adjusts the calling frequency of the identification module in the AR application according to it; on the premise of preserving the user experience, the power consumption of the terminal can be effectively reduced, so the calling frequency of the identification module in the AR application is adjusted in a comparatively reasonable way.
Embodiment 12
This embodiment provides a device for realizing an augmented reality application, capable of implementing the methods for realizing an augmented reality application provided in Embodiments 7 to 10.
As shown in Fig. 13, the device provided in this embodiment includes:
An acquiring unit 1301, configured to obtain second reference information, where the second reference information is the reference information on which the adjustment of the calling frequency of the tracking module in the augmented reality (AR) application is based;
A processing unit 1302, configured to adjust the calling frequency of the tracking module according to the second reference information.
It should be noted that the device for realizing an augmented reality application provided in this embodiment is applied to a terminal and can be implemented by software plus the necessary general-purpose hardware, or, of course, by hardware alone; in many cases the former is the preferred implementation.
Based on the device for realizing an augmented reality application shown in Fig. 13, in a first possible implementation:
The acquiring unit 1301 is specifically configured to obtain the acceleration information of the terminal from the acceleration sensor;
The processing unit 1302 is specifically configured to reduce the calling frequency of the tracking module when the current acceleration of the terminal is greater than a fourth threshold, and to maintain or increase the calling frequency of the tracking module when the current acceleration of the terminal is less than or equal to the fourth threshold.
Based on the device for realizing an augmented reality application shown in Fig. 13, in a second possible implementation:
The acquiring unit 1301 is specifically configured to obtain the attitude information of the terminal from the gyroscope;
The processing unit 1302 is specifically configured to determine the current application scenario of the AR application and obtain the available attitudes of the terminal under the current application scenario; to reduce the calling frequency of the tracking module when the attitude of the terminal does not belong to the available attitudes; and to maintain or increase the calling frequency of the tracking module when the attitude of the terminal belongs to the available attitudes.
Based on the device for realizing an augmented reality application shown in Fig. 13, in a third possible implementation:
The acquiring unit 1301 is specifically configured to determine the quality of the image frames obtained by the terminal;
The processing unit 1302 is specifically configured to reduce the calling frequency of the tracking module when the quality of the image frames obtained by the terminal is lower than a sixth threshold, and to maintain or increase the calling frequency of the tracking module when the quality of the image frames reaches or exceeds the sixth threshold.
Based on the device for realizing an augmented reality application shown in Fig. 13 and the above three possible implementations, further, the processing unit 1302 is also configured to adjust the calling frequency of the rendering module in the AR application according to the calling frequency of the tracking module, so that the rendering module adopts the same calling frequency as the tracking module and renders the image frames processed by the tracking module;
Alternatively, the processing unit 1302 is also configured to estimate, according to the position of the AR target object in the image frames processed by the tracking module, the position of the AR target object in the image frames not processed by the tracking module, and to obtain an estimation result, so that the rendering module renders those raw image frames according to the estimation result.
With the device for realizing an augmented reality application provided in this embodiment, the second reference information is obtained and the calling frequency of the tracking module in the AR application is adjusted according to it; on the premise of preserving the user experience, the power consumption of the terminal can be effectively reduced, so the calling frequency of the tracking module in the AR application is adjusted in a comparatively reasonable way.
Through the above description of the embodiments, a person skilled in the art can clearly understand that the present invention can be implemented by software plus the necessary general-purpose hardware, or, of course, by hardware alone, but in many cases the former is the preferred implementation. Based on such understanding, the part of the technical solution of the present invention that contributes to the prior art can essentially be embodied in the form of a software product. The computer software product is stored in a readable storage medium, such as a floppy disk, a hard disk, or an optical disc of a computer, and includes several instructions for enabling a computer device (which may be a personal computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present invention.
The foregoing descriptions are merely specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement that can readily occur to a person familiar with the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (32)

1. A device for realizing an augmented reality application, characterised by comprising:
an acquiring unit, configured to obtain first reference information, wherein the first reference information is the reference information on which the adjustment of the calling frequency of an identification module in an augmented reality (AR) application is based, and the first reference information comprises at least one of the following types of information: a display position of AR virtual content on a terminal screen, a running state of the AR virtual content, state information of the terminal, and quality information of image frames captured by a camera;
a processing unit, configured to adjust the calling frequency of the identification module according to the first reference information obtained by the acquiring unit.
2. The device according to claim 1, characterised in that:
the acquiring unit is specifically configured to obtain relevant information about the AR virtual content from a rendering module in the AR application, the relevant information comprising the position and size of the AR virtual content on the screen;
the processing unit is specifically configured to reduce the calling frequency of the identification module when the area occupied by the AR virtual content on the screen is greater than or equal to a first threshold, and to maintain or increase the calling frequency of the identification module when the area occupied by the AR virtual content on the screen is less than the first threshold.
3. The device according to claim 1, characterised in that:
the acquiring unit is specifically configured to obtain relevant information about the AR virtual content from a rendering module in the AR application, the relevant information comprising the position and size of the AR virtual content on the screen;
the processing unit is further configured to determine the remaining area on the screen according to the position and size of the AR virtual content on the screen; to reduce the calling frequency of the identification module when the remaining area occupied on the screen is less than a second threshold; and to maintain or increase the calling frequency of the identification module when the remaining area occupied on the screen is greater than or equal to the second threshold.
4. The device according to claim 1, characterised in that:
the acquiring unit is specifically configured to obtain the running state of the AR virtual content;
the processing unit is specifically configured to reduce the calling frequency of the identification module when the AR virtual content is running, and to maintain or increase the calling frequency of the identification module when the AR virtual content has finished running.
5. The device according to claim 1, characterised in that:
the acquiring unit is specifically configured to obtain acceleration information of the terminal from an acceleration sensor;
the processing unit is specifically configured to reduce the calling frequency of the identification module when the current acceleration of the terminal is greater than a third threshold, and to maintain or increase the calling frequency of the identification module when the current acceleration of the terminal is less than or equal to the third threshold.
6. The device according to claim 5, characterised in that:
the processing unit is further configured to adjust the calling frequency of a tracking module in the AR application, comprising: reducing the calling frequency of the tracking module when the current acceleration of the terminal is greater than a fourth threshold, and maintaining or increasing the calling frequency of the tracking module when the current acceleration of the terminal is less than or equal to the fourth threshold.
7. The device according to claim 1, characterised in that:
the acquiring unit is specifically configured to obtain attitude information of the terminal from a gyroscope;
the processing unit is specifically configured to determine the current application scenario of the AR application, obtain the available attitudes of the terminal under the current application scenario, reduce the calling frequency of the identification module when the attitude of the terminal does not belong to the available attitudes, and maintain or increase the calling frequency of the identification module when the attitude of the terminal belongs to the available attitudes.
8. The device according to claim 7, characterised in that:
the processing unit is further configured to adjust the calling frequency of a tracking module in the AR application, comprising: reducing the calling frequency of the tracking module when the attitude of the terminal does not belong to the available attitudes, and maintaining or increasing the calling frequency of the tracking module when the attitude of the terminal belongs to the available attitudes.
9. The device according to claim 1, characterised in that:
the acquiring unit is further configured to determine the quality of the image frames obtained by the terminal;
the processing unit is specifically configured to reduce the calling frequency of the identification module when the quality of the image frames obtained by the terminal is lower than a fifth threshold, and to maintain or increase the calling frequency of the identification module when the quality of the image frames obtained by the terminal reaches or exceeds the fifth threshold.
10. The device according to claim 9, characterised in that:
the processing unit is further configured to adjust the calling frequency of a tracking module in the AR application, comprising: reducing the calling frequency of the tracking module when the quality of the image frames obtained by the terminal is lower than a sixth threshold, and maintaining or increasing the calling frequency of the tracking module when the quality of the image frames obtained by the terminal reaches or exceeds the sixth threshold.
11. The device according to any one of claims 1-10, characterised in that:
the processing unit is further configured to adjust the calling frequency of a rendering module in the AR application according to the calling frequency of the tracking module, so that the rendering module adopts the same calling frequency as the tracking module and renders the image frames processed by the tracking module;
or, the processing unit is further configured to estimate, according to the position of an AR target object in the image frames processed by the tracking module, the position of the AR target object in the image frames not processed by the tracking module, and to obtain an estimation result, so that the rendering module renders the raw image frames according to the estimation result.
12. A device for realizing an augmented reality application, characterised by comprising:
an acquiring unit, configured to obtain second reference information, wherein the second reference information is the reference information on which the adjustment of the calling frequency of a tracking module in an augmented reality (AR) application is based, and the second reference information comprises at least one of the following types of information: acceleration information of a terminal, attitude information of the terminal, and image-frame quality information;
a processing unit, configured to adjust the calling frequency of the tracking module according to the second reference information.
13. The device according to claim 12, characterised in that:
the acquiring unit is specifically configured to obtain the acceleration information of the terminal from an acceleration sensor;
the processing unit is specifically configured to reduce the calling frequency of the tracking module when the current acceleration of the terminal is greater than a fourth threshold, and to maintain or increase the calling frequency of the tracking module when the current acceleration of the terminal is less than or equal to the fourth threshold.
14. The device according to claim 12, characterised in that:
the acquiring unit is specifically configured to obtain the attitude information of the terminal from a gyroscope;
the processing unit is specifically configured to determine the current application scenario of the AR application, obtain the available attitudes of the terminal under the current application scenario, reduce the calling frequency of the tracking module when the attitude of the terminal does not belong to the available attitudes, and maintain or increase the calling frequency of the tracking module when the attitude of the terminal belongs to the available attitudes.
15. The device according to claim 12, characterised in that:
the acquiring unit is specifically configured to determine the quality of the image frames obtained by the terminal;
the processing unit is specifically configured to reduce the calling frequency of the tracking module when the quality of the image frames obtained by the terminal is lower than a sixth threshold, and to maintain or increase the calling frequency of the tracking module when the quality of the image frames obtained by the terminal reaches or exceeds the sixth threshold.
16. The device according to any one of claims 12-15, characterised in that:
the processing unit is further configured to adjust the calling frequency of a rendering module in the AR application according to the calling frequency of the tracking module, so that the rendering module adopts the same calling frequency as the tracking module and renders the image frames processed by the tracking module;
or, the processing unit is further configured to estimate, according to the position of an AR target object in the image frames processed by the tracking module, the position of the AR target object in the image frames not processed by the tracking module, and to obtain an estimation result, so that the rendering module renders the raw image frames according to the estimation result.
17. A method for realizing an augmented reality application, characterised by comprising:
obtaining first reference information, wherein the first reference information is the reference information on which the adjustment of the calling frequency of an identification module in an augmented reality (AR) application is based, and the first reference information comprises at least one of the following types of information: a display position of AR virtual content on a terminal screen, a running state of the AR virtual content, state information of the terminal, and quality information of image frames captured by a camera;
adjusting the calling frequency of the identification module according to the first reference information.
18. The method according to claim 17, characterised in that the first reference information comprises relevant information about the AR virtual content;
the obtaining first reference information comprises: obtaining the relevant information about the AR virtual content from a rendering module in the AR application, the relevant information comprising the position and size of the AR virtual content on the screen;
the adjusting the calling frequency of the identification module according to the first reference information comprises:
reducing the calling frequency of the identification module when the area occupied by the AR virtual content on the screen is greater than or equal to a first threshold;
maintaining or increasing the calling frequency of the identification module when the area occupied by the AR virtual content on the screen is less than the first threshold.
19. The method according to claim 17, characterised in that the first reference information comprises relevant information about the AR virtual content;
the obtaining first reference information comprises: obtaining the relevant information about the AR virtual content from a rendering module in the AR application, the relevant information comprising the position and size of the AR virtual content on the screen;
the adjusting the calling frequency of the identification module according to the first reference information comprises:
determining the remaining area on the screen according to the position and size of the AR virtual content on the screen; reducing the calling frequency of the identification module when the remaining area occupied on the screen is less than a second threshold; and maintaining or increasing the calling frequency of the identification module when the remaining area occupied on the screen is greater than or equal to the second threshold.
20. The method according to claim 17, characterised in that the first reference information comprises the running state of the AR virtual content;
the adjusting the calling frequency of the identification module according to the first reference information comprises:
reducing the calling frequency of the identification module when the AR virtual content is running;
maintaining or increasing the calling frequency of the identification module when the AR virtual content has finished running.
21. The method according to claim 17, characterised in that the first reference information comprises acceleration information of the terminal;
the obtaining first reference information comprises: obtaining the acceleration information of the terminal from an acceleration sensor;
the adjusting the calling frequency of the identification module according to the first reference information comprises:
reducing the calling frequency of the identification module when the current acceleration of the terminal is greater than a third threshold;
maintaining or increasing the calling frequency of the identification module when the current acceleration of the terminal is less than or equal to the third threshold.
22. The method according to claim 21, characterised in that the method further comprises:
adjusting the calling frequency of a tracking module in the AR application, comprising:
reducing the calling frequency of the tracking module when the current acceleration of the terminal is greater than a fourth threshold;
maintaining or increasing the calling frequency of the tracking module when the current acceleration of the terminal is less than or equal to the fourth threshold.
23. The method according to claim 17, wherein the first reference information comprises attitude information of the terminal;
the obtaining the first reference information comprises: obtaining the attitude information of the terminal from a gyroscope;
the adjusting the calling frequency of the identification module according to the first reference information comprises:
determining the current application scenario of the AR application, and obtaining the available attitudes of the terminal in the current application scenario;
when the attitude of the terminal does not belong to the available attitudes, reducing the calling frequency of the identification module;
when the attitude of the terminal belongs to the available attitudes, maintaining or increasing the calling frequency of the identification module.
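For illustration only: one way the "available attitude" check could look, assuming the attitude is reduced to a pitch angle in degrees and that each application scenario declares a usable pitch range; the scenario names and ranges below are invented.

    # Invented scenario table: pitch ranges (degrees) in which AR targets can
    # plausibly appear for each application scenario.
    AVAILABLE_PITCH = {
        "poster_scanning": (-30.0, 60.0),
        "tabletop_game": (20.0, 90.0),
    }

    def adjust_by_attitude(recognizer, scenario, pitch_deg):
        """Throttle recognition when the terminal is held in an attitude in which
        the current scenario cannot show an AR target (e.g. camera at the ceiling)."""
        lo, hi = AVAILABLE_PITCH.get(scenario, (-90.0, 90.0))
        if lo <= pitch_deg <= hi:
            recognizer.set_call_frequency(recognizer.normal_hz)   # available attitude
        else:
            recognizer.set_call_frequency(recognizer.low_hz)      # unavailable attitude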
24. The method according to claim 23, wherein the method further comprises:
adjusting the calling frequency of a tracking module in the AR application, comprising:
when the attitude of the terminal does not belong to the available attitudes, reducing the calling frequency of the tracking module;
when the attitude of the terminal belongs to the available attitudes, maintaining or increasing the calling frequency of the tracking module.
25. The method according to claim 17, wherein the first reference information comprises the quality of the image frames obtained by the terminal;
the adjusting the calling frequency of the identification module according to the first reference information comprises:
when the quality of the image frames obtained by the terminal is lower than a fifth threshold, reducing the calling frequency of the identification module;
when the quality of the image frames obtained by the terminal reaches or exceeds the fifth threshold, maintaining or increasing the calling frequency of the identification module.
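For illustration only: a sketch that uses mean gradient magnitude as a stand-in quality metric, since the claims do not prescribe how frame quality is measured; the grayscale frame is assumed to be a NumPy array and the threshold value is arbitrary.

    import numpy as np

    def frame_sharpness(gray):
        """Mean gradient magnitude of a grayscale frame; low values suggest blur."""
        gy, gx = np.gradient(gray.astype(np.float64))
        return float(np.mean(np.hypot(gx, gy)))

    def adjust_by_frame_quality(recognizer, gray_frame, fifth_threshold=5.0):
        """Skip expensive recognition work on frames that are too blurry or dark
        to produce a usable match."""
        if frame_sharpness(gray_frame) < fifth_threshold:
            recognizer.set_call_frequency(recognizer.low_hz)
        else:
            recognizer.set_call_frequency(recognizer.normal_hz)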
26. The method according to claim 25, wherein the method further comprises:
adjusting the calling frequency of a tracking module in the AR application, comprising:
when the quality of the image frames obtained by the terminal is lower than a sixth threshold, reducing the calling frequency of the tracking module;
when the quality of the image frames obtained by the terminal reaches or exceeds the sixth threshold, maintaining or increasing the calling frequency of the tracking module.
27. The method according to any one of claims 17-26, wherein the method further comprises:
adjusting the calling frequency of a rendering module in the AR application according to the calling frequency of the tracking module, comprising:
the rendering module adopting the same calling frequency as the tracking module and rendering the image frames that the tracking module has processed;
or, estimating the positions of the AR target object in the image frames not processed by the tracking module according to its positions in the image frames the tracking module has processed, so as to obtain an estimation result, and the rendering module rendering the raw image frames according to the estimation result.
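For illustration only: a sketch of the second alternative, where positions for frames the tracker skipped are estimated by constant-velocity extrapolation from the last two tracked results; the claim covers any estimation scheme, and the renderer interface here is invented.

    def extrapolate_position(tracked, frame_index):
        """Estimate the AR target position for a frame the tracker skipped.

        `tracked` is a list of (frame_index, (x, y)) results produced by the
        tracking module; a constant-velocity extrapolation from the last two
        results is used.
        """
        if len(tracked) < 2:
            return tracked[-1][1] if tracked else None
        (i0, (x0, y0)), (i1, (x1, y1)) = tracked[-2], tracked[-1]
        if i1 == i0:
            return (x1, y1)
        t = (frame_index - i1) / float(i1 - i0)
        return (x1 + t * (x1 - x0), y1 + t * (y1 - y0))

    def render_frame(renderer, frame, frame_index, tracked):
        """Render every camera frame: reuse real tracker output when available,
        otherwise fall back to the estimated position."""
        known = dict(tracked)
        position = known.get(frame_index)
        if position is None:
            position = extrapolate_position(tracked, frame_index)
        if position is not None:
            renderer.draw(frame, position)   # invented renderer interface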
28. A method for implementing an augmented reality application, comprising:
obtaining second reference information, the second reference information being the reference information on which an adjustment of the calling frequency of a tracking module in an augmented reality (AR) application is based, and the second reference information comprising at least one of the following types of information: acceleration information of the terminal, attitude information of the terminal, and image frame quality information;
adjusting the calling frequency of the tracking module according to the second reference information.
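For illustration only: the claims leave open how a "calling frequency" is enforced; a common pattern is to run a module only on every N-th camera frame, with N derived from the current target frequency, as in the sketch below (all names invented).

    class ModuleThrottle:
        """Invoke a processing module on a subset of camera frames so that its
        effective calling frequency approximates `target_hz` for a camera that
        delivers `camera_hz` frames per second (all names are illustrative)."""

        def __init__(self, module, camera_hz=30.0, target_hz=30.0):
            self.module = module
            self.camera_hz = camera_hz
            self._count = 0
            self.set_call_frequency(target_hz)

        def set_call_frequency(self, target_hz):
            # Clamp to a sane range and translate the frequency into a frame stride.
            self.target_hz = max(0.1, min(target_hz, self.camera_hz))
            self._stride = max(1, int(round(self.camera_hz / self.target_hz)))

        def on_frame(self, frame):
            """Run the wrapped module only on every `_stride`-th camera frame."""
            self._count += 1
            if self._count % self._stride == 0:
                return self.module.process(frame)
            return None

Raising or lowering the tracking module's frequency then amounts to calling set_call_frequency with a larger or smaller value, which is what claims 29-31 do based on acceleration, attitude and image frame quality respectively.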
29. The method according to claim 28, wherein the second reference information comprises acceleration information of the terminal;
the obtaining the second reference information comprises: obtaining the acceleration information of the terminal from an acceleration sensor;
the adjusting the calling frequency of the tracking module according to the second reference information comprises:
when the current acceleration of the terminal is greater than a fourth threshold, reducing the calling frequency of the tracking module;
when the current acceleration of the terminal is less than or equal to the fourth threshold, maintaining or increasing the calling frequency of the tracking module.
30. The method according to claim 28, wherein the second reference information comprises attitude information of the terminal;
the obtaining the second reference information comprises: obtaining the attitude information of the terminal from a gyroscope;
the adjusting the calling frequency of the tracking module according to the second reference information comprises:
determining the current application scenario of the AR application, and obtaining the available attitudes of the terminal in the current application scenario;
when the attitude of the terminal does not belong to the available attitudes, reducing the calling frequency of the tracking module;
when the attitude of the terminal belongs to the available attitudes, maintaining or increasing the calling frequency of the tracking module.
31. The method according to claim 28, wherein the second reference information comprises the quality of the image frames obtained by the terminal;
the adjusting the calling frequency of the tracking module according to the second reference information comprises:
when the quality of the image frames obtained by the terminal is lower than a sixth threshold, reducing the calling frequency of the tracking module;
when the quality of the image frames obtained by the terminal reaches or exceeds the sixth threshold, maintaining or increasing the calling frequency of the tracking module.
32. The method according to any one of claims 28-31, wherein the method further comprises:
adjusting the calling frequency of a rendering module in the AR application according to the calling frequency of the tracking module, comprising:
the rendering module adopting the same calling frequency as the tracking module and rendering the image frames that the tracking module has processed;
or, estimating the positions of the AR target object in the image frames not processed by the tracking module according to its positions in the image frames the tracking module has processed, so as to obtain an estimation result, and the rendering module rendering the raw image frames according to the estimation result.
CN201310664448.9A 2013-12-09 2013-12-09 Realize the device and method of augmented reality application Active CN103677211B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310664448.9A CN103677211B (en) 2013-12-09 2013-12-09 Realize the device and method of augmented reality application

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310664448.9A CN103677211B (en) 2013-12-09 2013-12-09 Realize the device and method of augmented reality application

Publications (2)

Publication Number Publication Date
CN103677211A CN103677211A (en) 2014-03-26
CN103677211B true CN103677211B (en) 2016-07-06

Family

ID=50315036

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310664448.9A Active CN103677211B (en) 2013-12-09 2013-12-09 Realize the device and method of augmented reality application

Country Status (1)

Country Link
CN (1) CN103677211B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106445088B (en) * 2015-08-04 2020-05-22 上海宜维计算机科技有限公司 Method and system for reality augmentation
CN105491365A (en) * 2015-11-25 2016-04-13 罗军 Image processing method, device and system based on mobile terminal
CN105635712B (en) * 2015-12-30 2018-01-19 视辰信息科技(上海)有限公司 Video real time recording method and recording arrangement based on augmented reality
CN105786561B (en) * 2016-01-29 2020-06-02 北京小米移动软件有限公司 Method and device for calling process
CN105892619A (en) * 2016-03-31 2016-08-24 宇龙计算机通信科技(深圳)有限公司 Low-power-consumption human-face or iris identification method and device and terminal
CN106201712B (en) * 2016-06-28 2019-05-21 Oppo广东移动通信有限公司 The method of adjustment of target identification frequency, device and mobile terminal in augmented reality
CN106027802B (en) * 2016-07-06 2020-01-03 捷开通讯(深圳)有限公司 Mobile terminal and parameter setting method thereof
CN106681696B (en) * 2016-12-12 2019-05-14 中国航空工业集团公司西安航空计算技术研究所 A kind of large-scale parallel program optimization arrangement method
CN107945719A (en) * 2017-12-08 2018-04-20 快创科技(大连)有限公司 A kind of product introduction control system based on AR augmented realities
CN108875538B (en) * 2018-03-05 2022-07-08 北京旷视科技有限公司 Attribute detection method, device and system and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100470452C (en) * 2006-07-07 2009-03-18 华为技术有限公司 Method and system for implementing three-dimensional enhanced reality
US9606612B2 (en) * 2010-07-20 2017-03-28 Empire Technology Development Llc Augmented reality proximity sensing
US8711206B2 (en) * 2011-01-31 2014-04-29 Microsoft Corporation Mobile camera localization using depth maps

Also Published As

Publication number Publication date
CN103677211A (en) 2014-03-26

Similar Documents

Publication Publication Date Title
CN103677211B (en) Realize the device and method of augmented reality application
US8295683B2 (en) Temporal occlusion costing applied to video editing
CN109558008B (en) Control method, control device, computer equipment and storage medium
CN110853076A (en) Target tracking method, device, equipment and storage medium
CN112308095A (en) Picture preprocessing and model training method and device, server and storage medium
WO2017092332A1 (en) Method and device for image rendering processing
CN106937090A (en) The method and device of a kind of video storage
CN105190685B (en) Method and apparatus for providing the self-adapting data path for being used for computer vision application program
CN104618656A (en) Information processing method and electronic equipment
CN110473227A (en) Method for tracking target, device, equipment and storage medium
CN108259877A (en) A kind of white balancing treatment method and device
CN111160187A (en) Method, device and system for detecting left-behind object
CN111145215A (en) Target tracking method and device
CN104282013B (en) A kind of image processing method and device for foreground target detection
CN111738085B (en) System construction method and device for realizing automatic driving simultaneous positioning and mapping
CN107085845A (en) Image blurring detection method and device
CN112449115A (en) Shooting method and device and electronic equipment
CN104104878A (en) Reminding mthod and system for shooting time length in shooting process
CN108664847A (en) A kind of object identifying method, equipment and system
CN116051478A (en) Method and device for detecting quality of coating material on surface of battery diaphragm
CN113438468B (en) Dynamic control method and device for video quality, storage medium and electronic equipment
CN104112266A (en) Image edge blurring detecting method and device
CN114973344A (en) Face detection method, face detection device, terminal equipment and computer readable storage medium
CN112085002A (en) Portrait segmentation method, portrait segmentation device, storage medium and electronic equipment
US20200143518A1 (en) Computer device and method for generating dynamic images

Legal Events

Date Code Title Description
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20200213

Address after: 518129 Bantian HUAWEI headquarters office building, Longgang District, Guangdong, Shenzhen

Patentee after: HUAWEI TECHNOLOGIES Co.,Ltd.

Address before: 210012 HUAWEI Nanjing base, 101 software Avenue, Yuhuatai District, Jiangsu, Nanjing

Patentee before: Huawei Technologies Co.,Ltd.