CN108114471B - AR service processing method and device, server and mobile terminal - Google Patents


Info

Publication number
CN108114471B
CN108114471B (application CN201711261596.0A)
Authority
CN
China
Prior art keywords
service
mobile terminal
data
server
virtual object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711261596.0A
Other languages
Chinese (zh)
Other versions
CN108114471A (en)
Inventor
王晓振
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba China Co Ltd
Original Assignee
Guangzhou UCWeb Computer Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou UCWeb Computer Technology Co Ltd
Priority to CN201711261596.0A
Publication of CN108114471A
Priority to PCT/CN2018/117486 (WO2019109828A1)
Application granted
Publication of CN108114471B
Legal status: Active
Anticipated expiration

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/65: Generating or modifying game content before or while executing the game program automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • A63F13/655: Generating or modifying game content automatically from real world data by importing photos, e.g. of the player
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60: Methods for processing data by generating or executing the game program
    • A63F2300/69: Involving elements of the real world in the game world, e.g. measurement in live races, real video
    • A63F2300/695: Imported photos, e.g. of the player

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)
  • Telephonic Communication Services (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

Embodiments of the invention provide an AR service processing method, an AR service processing apparatus, a server and a mobile terminal. The AR service processing method comprises: receiving geographic position information of a mobile terminal and, if it is determined that AR service position information matching the geographic position information is locally stored, sending an image acquisition instruction to the mobile terminal; receiving image data returned by the mobile terminal according to the image acquisition instruction; and, if the returned image data matches the image data corresponding to the AR service position information, sending AR service data to the mobile terminal. By associating the geographic position, the AR service scene image and the AR service, the embodiments of the invention enable the AR technology to be effectively applied to location-related game applications and improve the user experience.

Description

AR service processing method and device, server and mobile terminal
Technical Field
The embodiment of the invention relates to the technical field of computers, in particular to an AR service processing method, an AR service processing device, a server and a mobile terminal.
Background
AR (Augmented Reality) is a technology that "seamlessly" integrates real-world information with virtual-world information: information that would otherwise be hard to experience within a certain time-space range of the real world (such as visual and sound information) is simulated and superimposed onto real information, so that the real environment and virtual objects are overlaid on the same picture or in the same space in real time.
With the development of AR technology, applying it to various application scenarios is becoming a trend. Among these applications, games account for a large proportion. How to effectively apply AR technology to certain game applications, for example location-related games, by processing AR-related data so as to enhance the user's gaming experience is therefore becoming a research focus of AR applications.
Disclosure of Invention
In view of this, embodiments of the present invention provide an AR service processing method, an AR service processing apparatus, a server and a mobile terminal, which process AR-related data so that the AR technology can be effectively applied to location-related game applications.
According to a first aspect of the embodiments of the present invention, an AR service processing method is provided, comprising: receiving the geographic position information of a mobile terminal and, if it is determined that AR service position information matching the geographic position information is locally stored, sending an image acquisition instruction to the mobile terminal; receiving image data returned by the mobile terminal according to the image acquisition instruction; and, if the returned image data matches the image data corresponding to the AR service position information, sending AR service data to the mobile terminal.
According to a second aspect of the embodiments of the present invention, another AR service processing method is provided, comprising: acquiring the geographic position information of a mobile terminal and sending it to a server; receiving an image acquisition instruction sent by the server after the geographic position information is confirmed; collecting image data according to the image acquisition instruction and sending it to the server; receiving AR service data sent by the server after the image data is confirmed; and performing AR service display according to the AR service data.
According to a third aspect of the embodiments of the present invention, an AR service processing apparatus is provided, comprising: a first receiving module, configured to receive the geographic position information of a mobile terminal and, if it is determined that AR service position information matching the geographic position information is locally stored, send an image acquisition instruction to the mobile terminal; a second receiving module, configured to receive image data returned by the mobile terminal according to the image acquisition instruction; and a first sending module, configured to send AR service data to the mobile terminal if the returned image data matches the image data corresponding to the AR service position information.
According to a fourth aspect of the embodiments of the present invention, another AR service processing apparatus is provided, comprising: a first acquisition module, configured to acquire the geographic position information of the mobile terminal and send it to a server; a third receiving module, configured to receive an image acquisition instruction sent by the server after the geographic position information is confirmed; a second sending module, configured to collect image data according to the image acquisition instruction and send the image data to the server; a fourth receiving module, configured to receive the AR service data sent by the server after the image data is confirmed; and a display module, configured to perform AR service display according to the AR service data.
According to a fifth aspect of the embodiments of the present invention, there is provided a server, comprising: a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface communicate with one another through the communication bus; the memory is configured to store at least one executable instruction, and the executable instruction causes the processor to perform the operations corresponding to the AR service processing method according to the first aspect.
According to a sixth aspect of the embodiments of the present invention, there is provided a mobile terminal, comprising: a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface communicate with one another through the communication bus; the memory is configured to store at least one executable instruction, and the executable instruction causes the processor to perform the operations corresponding to the AR service processing method according to the second aspect.
As can be seen from the above technical solutions, implementing the AR service requires combining the geographic position information with image data, where the image data should be an image of the location indicated by the geographic position information, such as an image of the front desk of an office, and this image describes the implementation scene of the AR service. If the geographic position indicated by the geographic position information is not a position where the AR service can be set, or if it is such a position but there is no implementation scene of the AR service (no matching image data), the corresponding AR service cannot be implemented. By associating the geographic position, the AR service scene image and the AR service, the embodiments of the invention therefore allow the AR technology to be effectively applied to location-related game applications and improve the user experience.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below cover only some of the embodiments of the present invention, and a person skilled in the art can derive other drawings from them.
Fig. 1 is a flowchart illustrating steps of an AR service processing method according to a first embodiment of the present invention;
fig. 2 is a flowchart illustrating steps of an AR service processing method according to a second embodiment of the present invention;
fig. 3 is a flowchart illustrating steps of an AR service processing method according to a third embodiment of the present invention;
fig. 4 is a flowchart illustrating steps of an AR service processing method according to a fourth embodiment of the present invention;
fig. 5 is a flowchart illustrating steps of an AR service processing method according to a fifth embodiment of the present invention;
fig. 6 is a block diagram of an AR service processing apparatus according to a sixth embodiment of the present invention;
fig. 7 is a block diagram of an AR service processing apparatus according to a seventh embodiment of the present invention;
fig. 8 is a schematic structural diagram of a server according to an eighth embodiment of the present invention;
fig. 9 is a schematic structural diagram of a mobile terminal according to a ninth embodiment of the present invention.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the embodiments of the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention shall fall within the scope of the protection of the embodiments of the present invention.
The following further describes specific implementation of the embodiments of the present invention with reference to the drawings.
Example one
Referring to fig. 1, a flowchart illustrating steps of an AR service processing method according to a first embodiment of the present invention is shown.
This embodiment describes the AR service processing method provided in the embodiments of the present invention from the perspective of the server. The AR service processing method of this embodiment comprises the following steps:
step S102: receiving the geographic position information of the mobile terminal and, if it is determined that AR service position information matching the geographic position information is locally stored, sending an image acquisition instruction to the mobile terminal.
In this embodiment, if the mobile terminal wants to trigger the AR service, it first needs to report its geographic position information to the server, and the server determines from this information whether the mobile terminal is currently located in a usage scene of the AR service. The higher the accuracy of the geographic position information, the better the effect; preferably, the accuracy is within 10 meters, and more preferably within 3 meters.
The AR service may be any suitable service that can display AR effects and/or interact with AR virtual objects.
The AR service position information locally stored by the server and matching the geographic position information may be information of the same type as the geographic position information, such as latitude and longitude coordinates, or specific location information generated from the geographic position information, such as "the front desk of company XX" (in the latter case, when the server receives the geographic position information of the mobile terminal, it needs to convert it into the corresponding specific location).
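The source does not specify how the position match is computed; as one hedged illustration, the server might treat a reported coordinate as matching a stored AR service location when the great-circle distance falls within the preferred accuracy. The location store, the place name, and the 10-meter threshold below are assumptions for illustration, not details from the patent.

```python
import math

# Hypothetical store of AR service locations: (name, latitude, longitude).
AR_SERVICE_LOCATIONS = [
    ("XX company front desk", 23.1291, 113.2644),
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def match_ar_location(lat, lon, threshold_m=10.0):
    """Return the name of a stored AR service location within threshold_m
    of the reported position, or None if no stored location matches."""
    for name, slat, slon in AR_SERVICE_LOCATIONS:
        if haversine_m(lat, lon, slat, slon) <= threshold_m:
            return name
    return None
```

When `match_ar_location` returns a name, the server would proceed to send the image acquisition instruction; when it returns `None`, the prompt flow described later would apply.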
And if the server judges that the mobile terminal is currently located at the position of the use scene of the AR service, sending an image acquisition instruction to the mobile terminal to instruct the mobile terminal to acquire the image of the current position and carry out subsequent scene comparison.
Step S104: and receiving image data returned by the mobile terminal according to the image acquisition instruction.
After receiving the image acquisition instruction, the mobile terminal controls a camera in the mobile terminal to acquire an image at the current position and uploads acquired image data to the server. And the server receives the image data uploaded by the mobile terminal and performs subsequent comparison and judgment on the use scene.
Step S106: sending the AR service data to the mobile terminal if the returned image data matches the image data corresponding to the AR service position information.
The server stores not only the AR service position information but also the image data corresponding to it, i.e. image data of the usage scene of the AR service, which may include one or more images of that scene. In addition, the server stores the logic code for processing the AR service, implementing the functions of receiving, processing and transmitting information and data.
And if the current geographic position and the use scene of the mobile terminal are determined to be matched with the AR service according to the geographic position information and the image data sent by the mobile terminal, sending the AR service data to the mobile terminal. The AR service data includes but is not limited to: the system comprises an AR virtual object, an AR virtual item and operable data aiming at the AR virtual object.
With this embodiment, implementing the AR service requires combining the geographic position information with image data, where the image data should be an image of the location indicated by the geographic position information, such as an image of the front desk of an office, and this image describes the implementation scene of the AR service. If the geographic position indicated by the geographic position information is not a position where the AR service can be set, or if it is such a position but there is no implementation scene of the AR service (no matching image data), the corresponding AR service cannot be implemented. By associating the geographic position, the AR service scene image and the AR service, this embodiment therefore allows the AR technology to be effectively applied to location-related game applications and improves the user experience.
Example two
Referring to fig. 2, a flowchart illustrating steps of an AR service processing method according to a second embodiment of the present invention is shown.
The embodiment of the present invention further describes an AR service processing method from the perspective of a server.
The AR service processing method of the embodiment comprises the following steps:
step S202: the server receives the geographic position information of the mobile terminal and, if it determines that AR service position information matching the geographic position information is locally stored, sends an image acquisition instruction to the mobile terminal.
The server locally stores data required for the AR service, including but not limited to: AR service location information, image data corresponding to the AR service location information, AR service data, user information, AR service progress information, and the like.
It should be noted that when a mobile terminal user uses the AR service for the first time, this data may not yet exist. In this case, the user may be prompted to set up the AR service: the mobile terminal sets the position and the usage scene image of the AR service and uploads the set information and data to the server for storage. It is suggested that several scene images be collected from different angles, for example as a video stream, so that the user has more flexibility of angle during subsequent image comparison.
In this step, after receiving the geographic position information of the mobile terminal, the server compares it with the locally stored AR service position information to determine whether the current position of the mobile terminal is an AR service position. If it is, an image acquisition instruction is sent to the mobile terminal, instructing it to collect an image of the usage scene of the AR service; if it is not, prompt information is returned to the mobile terminal to report the mismatch or to prompt the user to set up the AR service. In another feasible approach, if the position is not an AR service position, i.e. the server determines that no AR service position information matching the geographic position information is stored locally, the server searches the stored AR service position information for the entry closest to the geographic position information and sends prompt information including this closest AR service position information to the mobile terminal. Although there is no AR service at the mobile terminal's current position, there may well be one within a certain distance, and the user may choose to participate in it; sending the corresponding AR service position information therefore guides the user of the mobile terminal to that position. Optionally, the mobile terminal side may also invoke a navigation application to guide the user according to the terminal's current geographic position information and the closest AR service position information.
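The "closest AR service position" lookup in the alternative flow could be sketched as follows. The equirectangular distance approximation and the data layout are illustrative choices, not taken from the patent.

```python
import math

def nearest_ar_location(lat, lon, stored):
    """Among stored (name, lat, lon) entries, return the entry closest to
    the reported position, for use in a hint sent back to the terminal."""
    def dist(entry):
        _, slat, slon = entry
        # Equirectangular approximation: adequate for ranking a small
        # set of candidate locations by distance.
        x = math.radians(slon - lon) * math.cos(math.radians((slat + lat) / 2))
        y = math.radians(slat - lat)
        return math.hypot(x, y)
    return min(stored, key=dist)
```

The server would embed the returned entry's name and coordinates in the prompt information, which the terminal could in turn hand to a navigation application.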
Step S204: and the server receives image data returned by the mobile terminal according to the image acquisition instruction.
The mobile terminal collects the image at the position according to the image collecting instruction and uploads the corresponding image data, and the server receives the image data uploaded by the mobile terminal.
Step S206: the server judges whether the image data returned by the mobile terminal matches the locally stored image data corresponding to the AR service position information; if so, step S208 is executed; if not, step S214 is executed.
The server locally stores image data of the use scene of the AR service, namely image data corresponding to the position information of the AR service, and can judge whether the mobile terminal is currently in the use scene of the AR service or not by comparing the image data returned by the mobile terminal.
In one feasible approach, the server performs image description processing on the image data returned by the mobile terminal to obtain a text description of the image data. The image data corresponding to the AR service position information stored locally by the server is likewise stored as a text description, i.e. an image description. The server then compares the text description obtained from the returned image data with the stored image description. Performing image description processing on the image data reduces the storage burden on the server side and improves the efficiency of image comparison.
The image description process is as follows: the image is first segmented into a number of regions, including objects and backgrounds with different characteristics; these regions may have certain shapes, such as rectangles, circles, curves or arbitrary shapes. After segmentation, the regions with different characteristics are represented by data, symbols or a formal language, generating a text description of the image. The description of image regions can be divided into descriptions of the regions themselves and descriptions of the relationships and structure between regions, covering lines, curves, regions, geometric features and other forms. In the embodiments of the present invention, a person skilled in the art may perform image description processing on the image represented by the returned image data using any suitable image description model and algorithm, which is not limited herein.
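Since the patent leaves the description model and the matching criterion open, the following toy sketch only illustrates the idea of comparing text descriptions derived from segmented regions. The token format and the Jaccard-similarity threshold are invented for illustration and are not part of the patent.

```python
def describe_regions(regions):
    """Toy image-description step: turn segmented regions (shape, label)
    into a set of text tokens, e.g. {"rectangle:desk", "circle:logo"}."""
    return {f"{shape}:{label}" for shape, label in regions}

def descriptions_match(desc_a, desc_b, threshold=0.6):
    """Jaccard similarity between two token sets, used here as a stand-in
    for the patent's unspecified description-matching criterion."""
    if not desc_a or not desc_b:
        return False
    overlap = len(desc_a & desc_b) / len(desc_a | desc_b)
    return overlap >= threshold
```

A real deployment would replace `describe_regions` with the output of an image description model and tune the matching criterion accordingly.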
Step S208: and if the server judges that the returned image data is matched with the image data corresponding to the AR service position information, the server sends the AR service data to the mobile terminal.
The AR service data comprises AR service characteristic data, which includes at least one of: an AR virtual object, an AR virtual item, and operable data for the AR virtual object. The AR virtual object may be any suitable two-dimensional or three-dimensional image and/or special effect displayable on the mobile terminal, or a video, for example a three-dimensional virtual plant or virtual animal. The AR virtual item may be any suitable item for operating on an AR virtual object, such as a virtual kettle for watering a virtual plant, a virtual shovel for digging it up, or a virtual insecticide for killing insects on it. The operable data for the AR virtual object relates to the operations performed on it and its current state; for example, if watering a virtual plant is limited to five times a day, the operable data for watering starts at 5 each day, is updated to 4 when the user waters the plant once, and so on. Through the AR virtual object, the AR virtual item and the operable data, functions such as effective interaction between the user and the virtual object and updating and viewing the virtual object's state can be realized in the AR service.
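The watering example can be made concrete with a minimal sketch of server-side operable data. The class name, field names and daily limit below are assumptions drawn only from the example in the text.

```python
from dataclasses import dataclass

@dataclass
class ARVirtualObject:
    """Server-side state for one AR virtual object (e.g. a virtual plant)."""
    name: str
    daily_water_limit: int = 5
    water_remaining: int = 5  # operable data for the "water" action

    def apply_watering(self):
        """Consume one watering operation; return False when the daily
        limit is exhausted, True when the operation was applied."""
        if self.water_remaining <= 0:
            return False
        self.water_remaining -= 1
        return True

    def reset_daily(self):
        """Called once per day to restore the operable data."""
        self.water_remaining = self.daily_water_limit
```

After each successful operation the server would synchronize `water_remaining` back to the terminal, matching the 5, 4, ... progression described above.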
However, the AR service characteristic data is not limited to the above; those skilled in the art may set it according to actual requirements. In addition to the AR service characteristic data, the AR service data may also include, but is not limited to, AR service progress data, AR service user data, and so on.
After receiving the corresponding AR service data, the mobile terminal can render and display the AR virtual object based on the image acquired by the image acquisition equipment in real time. Of course, the AR virtual item may also be rendered and displayed synchronously, and may perform operations such as corresponding information prompt according to the operational data for the AR virtual object.
Step S210: and the server receives the operation of the mobile terminal on the AR virtual object through the AR virtual prop in the AR service data, and updates the operable data of the AR virtual object according to the operation.
After the mobile terminal performs the corresponding AR service display according to the AR service data, its user can operate the AR virtual object through the AR virtual item and feed the operation information back to the server in real time. After receiving the operation information, the server updates the operable data of the AR virtual object according to the operation. Optionally, the AR virtual object and/or the AR virtual item may also be updated according to the updated operable data, and the updated AR virtual object and/or AR virtual item sent to the mobile terminal so that it can update its display.
It should be noted that in a social game application, multiple users participate in the same application through their respective mobile terminals, and they may operate the same AR virtual object simultaneously or at different times, for example both watering the same virtual plant with virtual kettles at the same time, or one user watering the virtual plant with a virtual kettle while another digs at the same plant with a virtual shovel. The operation information is reported to the server, which updates the state of the virtual plant in real time according to the operation information and feeds it back to the mobile terminals.
That is, when the mobile terminal includes a plurality of mobile terminals, the server receives at least one operation performed on the AR virtual object by at least one of the mobile terminals through the AR virtual item, and updates the operable data of the corresponding AR virtual object according to the at least one operation. Optionally, the AR virtual object and/or the AR virtual item may also be updated at the same time.
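A hedged sketch of the multi-terminal update just described: several terminals report operations against one shared object, and the server serializes them and updates the operable data. The lock and the data shapes are illustrative; the patent does not specify a concurrency mechanism.

```python
import threading

class ARServiceState:
    """Shared state of one AR virtual object, updated by operations
    arriving from multiple mobile terminals; a lock stands in for
    whatever concurrency control the real server would use."""

    def __init__(self):
        self._lock = threading.Lock()
        self.operations = []          # (user, item, action) log for display
        self.operable = {"water": 5}  # remaining uses per action

    def apply(self, user, item, action):
        """Apply one operation; return the updated operable data to be
        synchronized to all terminals, or None if the action is exhausted
        or unknown."""
        with self._lock:
            if self.operable.get(action, 0) <= 0:
                return None
            self.operable[action] -= 1
            self.operations.append((user, item, action))
            return dict(self.operable)
```

The `operations` log corresponds to the per-user operation information that the terminals display (e.g. "user XX watered plant XX once").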
Step S212: the server synchronizes the updated operational data to the mobile terminal, and returns to step S210 to continue the execution.
As described above, for a given mobile terminal, the server updates the operable data for the AR virtual object according to the operation information uploaded by that terminal and then synchronizes the updated operable data to it. If the AR virtual object and/or the AR virtual item have also been updated, the updated AR virtual object and/or AR virtual item are synchronized to the mobile terminal together with the updated operable data.
When there are multiple mobile terminals, the updated operable data, and any updated AR virtual object and/or AR virtual item, are synchronized to all of them. The mobile terminals display the operation information of the operations performed on the AR virtual object together with the user information of the terminal performing each operation, for example "user XX watered plant XX once" and "user YY fertilized plant XX once".
Therefore, the synchronization among the AR services participated by multiple persons is realized.
Step S214: and if the server judges that the image data returned by the mobile terminal is not matched with the image data corresponding to the AR service position information, sending the image data corresponding to the AR service position information to the mobile terminal, and prompting the mobile terminal to carry out image acquisition according to the sent image data.
When the image data returned by the mobile terminal does not match the image data stored locally on the server, the user may have photographed the scene from the wrong angle. The server can therefore send its stored image data to the mobile terminal so that the user can take the picture again from the correct angle after consulting it. Sending reference image data in this way improves the accuracy of the user's image acquisition and the user experience.
However, those skilled in the art will appreciate that, in actual use, if the server determines that the image data returned by the mobile terminal does not match the image data corresponding to the AR service location information, other appropriate processing may be performed instead, such as prompting an error.
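The two fallback behaviors just described (sending reference frames, or a plain error) can be sketched as a single dispatch function. All names here (`handle_image_mismatch`, the message dictionaries, the frame count of 3) are hypothetical choices for illustration.

```python
def handle_image_mismatch(stored_image_frames, send_to_terminal, strategy="reference"):
    """Illustrative fallback when uploaded image data does not match the
    stored scene image. Names and message formats are assumptions."""
    if strategy == "reference" and stored_image_frames:
        # Send a few stored frames so the user can retake the photo
        # from the correct angle.
        sample = stored_image_frames[:3]
        send_to_terminal({"type": "retake_prompt", "reference_frames": sample})
        return "prompted_with_reference"
    # Other appropriate processing, e.g. a plain error prompt.
    send_to_terminal({"type": "error", "message": "scene not recognized"})
    return "error_prompt"

sent = []
result = handle_image_mismatch(["frame0", "frame1", "frame2", "frame3"], sent.append)
```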
With this embodiment, implementing the AR service requires combining geographic location information with image data, where the image data should be an image of the location indicated by the geographic location information, such as an image of the reception desk in an office, and thus describes the implementation scene of the AR service. If the geographic location indicated by the information does not belong to a location where the AR service can be set, or if it does but no implementation scene of the AR service exists there (there is no matching image data), the corresponding AR service cannot be implemented. By associating the geographic location, the AR service scene image, and the AR service, this embodiment allows AR technology to be applied effectively to location-based game applications and improves the user experience.
Example Three
Referring to fig. 3, a flowchart illustrating steps of an AR service processing method according to a third embodiment of the present invention is shown.
The embodiment of the present invention describes an AR service processing method from the perspective of a mobile terminal.
The AR service processing method of the embodiment comprises the following steps:
Step S302: acquire the geographical location information of the mobile terminal and send it to the server.
When the mobile terminal wants to trigger the AR service, it first sends its own geographical location information to the server so that the server can determine whether the mobile terminal is currently in a usage scene of the AR service. The higher the accuracy of the geographical location information, the better.
Step S304: receive an image acquisition instruction sent by the server after it has confirmed the geographical location information.
The server stores information related to the AR service, including but not limited to: AR service location information, image data corresponding to the AR service location information, AR service data, user information, AR service progress information, and the like.
After receiving the geographical location information sent by the mobile terminal and determining that it matches AR service location information stored on the server, the server sends an image acquisition instruction to the mobile terminal.
A user who uses the AR service for the first time can set the location and scene image of the AR service's usage scene through the mobile terminal and send the resulting location information and image data to the server for storage. The server determines from each message sent by a mobile terminal whether it is a setup message or an application message, and proceeds accordingly: for a setup message, it saves the corresponding information or data uploaded by the mobile terminal; for an application message, it executes the scheme described in this embodiment.
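The setup-versus-application branching described above can be sketched as a small dispatcher. The message schema (`type`, `gps`, `scene_images` keys) is an assumption made for the sketch, not the patent's wire format.

```python
def dispatch(message, store, run_ar_flow):
    """Route a terminal message: save setup data, or run the AR matching flow."""
    if message["type"] == "setup":
        # First use: save the location/image data uploaded by the terminal.
        store[tuple(message["gps"])] = message["scene_images"]
        return "saved"
    elif message["type"] == "apply":
        # Subsequent use: run the matching flow of this embodiment.
        return run_ar_flow(message)
    return "unknown"

store = {}
r1 = dispatch({"type": "setup", "gps": (23.1, 113.3), "scene_images": ["img"]},
              store, lambda m: "flow")
r2 = dispatch({"type": "apply", "gps": (23.1, 113.3)}, store, lambda m: "flow")
```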
Step S306: collect image data according to the image acquisition instruction and send the collected image data to the server.
After receiving the image acquisition instruction, the mobile terminal controls an image acquisition device in the mobile terminal, such as a camera, to shoot a scene at the current position, and uploads the shot image to the server for the server to identify and judge the use scene of the AR service.
Step S308: receive the AR service data sent by the server after it has confirmed the image data.
If the server confirms both the geographical location information and the image data, the current location and scene of the mobile terminal match the corresponding AR service, and the server sends the AR service data to the mobile terminal so that the terminal can render, display, and interact with it on top of the images collected in real time by its local image acquisition device.
The AR service data received by the mobile terminal from the server includes, but is not limited to, AR service feature data, which comprises at least one of: an AR virtual object, an AR virtual item, and operable data for the AR virtual object. Those skilled in the art may also set the AR service feature data according to actual requirements. In addition to the feature data, the AR service data may also include, but is not limited to, AR service progress data, AR service user data, and so on.
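The enumeration above can be mirrored by a simple container type. This is a hypothetical structure for illustration; the field names (`virtual_objects`, `operable_data`, etc.) are assumptions, not identifiers from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class ARServiceData:
    """Illustrative container for the AR service data listed in the text."""
    virtual_objects: list = field(default_factory=list)  # e.g. plant 3D models
    virtual_items: list = field(default_factory=list)    # e.g. watering kettle
    operable_data: dict = field(default_factory=dict)    # e.g. water/nutrient counters
    progress: dict = field(default_factory=dict)         # optional AR service progress
    users: list = field(default_factory=list)            # optional participating users

data = ARServiceData(
    virtual_objects=["plant_3d_model"],
    virtual_items=["watering_kettle"],
    operable_data={"water_needed": 5, "pests": 2},
)
```

Keeping progress and user data optional matches the text's "may also include, but is not limited to" wording.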
Step S310: perform AR service display according to the AR service data.
After receiving the AR service data, the mobile terminal renders the corresponding AR service data, such as the AR virtual object and AR virtual items, on top of the real-time images it collects, and displays the result.
With this embodiment, implementing the AR service requires combining geographic location information with image data, where the image data should be an image of the location indicated by the geographic location information, such as an image of the reception desk in an office, and thus describes the implementation scene of the AR service. If the geographic location indicated by the information does not belong to a location where the AR service can be set, or if it does but no implementation scene of the AR service exists there (there is no matching image data), the corresponding AR service cannot be implemented. By associating the geographic location, the AR service scene image, and the AR service, this embodiment allows AR technology to be applied effectively to location-based game applications and improves the user experience.
Example Four
Referring to fig. 4, a flowchart illustrating steps of an AR service processing method according to a fourth embodiment of the present invention is shown.
This embodiment explains the AR service processing method provided by the embodiments of the present invention from the perspective of a mobile terminal.
The AR service processing method of the embodiment comprises the following steps:
Step S402: the mobile terminal receives a trigger instruction for the AR service.
The trigger instruction may be generated when an AR application in the mobile terminal is started, when an AR service option in a conventional application is triggered, or in any other appropriate manner.
Step S404: the mobile terminal acquires its geographical location information according to the trigger instruction and sends it to the server.
In this embodiment, the accuracy of the geographical location information of the mobile terminal is set to be within 10 meters, preferably within 3 meters.
Step S406: the mobile terminal receives an image acquisition instruction sent by the server after it has confirmed the geographical location information.
Optionally, if the server determines that it has no stored AR service location information matching the geographical location information sent by the mobile terminal, it may search the stored AR service location information for the entry closest to that geographical location and send the mobile terminal a prompt containing this closest AR service location information, guiding the user to travel there to participate in the AR service. In this case, the mobile terminal may invoke a navigation application to guide the user from its current location to the closest AR service location. Alternatively, the server may prompt an error, or prompt the user to set up the AR service, and so on.
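The "closest stored location" lookup can be sketched as below. The patent does not specify a distance metric, so the haversine great-circle distance is an assumption; a production system might instead use a spatial index.

```python
import math

def haversine_m(a, b):
    """Great-circle distance in metres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))

def closest_service_location(user_pos, service_locations):
    """Return the stored AR service location nearest to the terminal."""
    return min(service_locations, key=lambda loc: haversine_m(user_pos, loc))

locations = [(23.10, 113.30), (23.50, 113.90)]
nearest = closest_service_location((23.11, 113.31), locations)
```

The returned coordinates could then be handed to a navigation application, as the text suggests.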
Step S408: the mobile terminal collects image data according to the image collection instruction and sends the collected image data to the server.
Step S410: the mobile terminal receives the AR service data sent by the server after it has confirmed the image data.
The AR service data comprises AR service feature data, which includes at least one of: an AR virtual object, an AR virtual item, and operable data for the AR virtual object.
Step S412: and the mobile terminal displays the AR service according to the AR service data.
For example, after receiving the AR service data sent by the server, the mobile terminal renders the AR virtual object and AR virtual items onto the images it collects in real time.
Step S414: the mobile terminal receives the update data sent by the server after the AR service data has been updated, updates its local AR service data accordingly, and refreshes the AR service display.
On the one hand, the current mobile terminal can operate the AR virtual object with the AR virtual item according to the operable data and send the operation information to the server; it then receives the data obtained after the server updates its stored operable data according to that operation information, and refreshes the AR service display accordingly.
Optionally, the server may further update the AR virtual object and/or the AR virtual item according to the updated operational data. Based on this, the mobile terminal can receive the updated AR virtual object and/or AR virtual prop in addition to the updated operable data, and perform AR service display according to the updated data.
On the other hand, when a plurality of mobile terminals participate, the user of the current mobile terminal may not operate the AR virtual object while a user of another mobile terminal does, or both may operate it. In either case, for an AR service with multiple participants, the server needs to send corresponding update data to all participating mobile terminals so that they can update their AR service data and displays. The current mobile terminal therefore receives update data from the server, which may contain only the updated operable data, or additionally an updated AR virtual object and/or an updated AR virtual item. This update data is generated by the server from the operation information, describing operations performed on the AR virtual object with the AR virtual item, uploaded by the plurality of mobile terminals including the current one. The terminal then performs AR service display according to the update data.
In addition, each mobile terminal may display operation information of operations performed on the AR virtual object by the plurality of mobile terminals participating in the AR service, and user information of the mobile terminal corresponding to the operations. For example, user A fertilizes the XX plant once, user B waters the XX plant once, user C waters the XX plant once, and so on.
In this way, updating and displaying an AR service in which multiple users participate is achieved.
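The per-operation display lines mentioned above ("user A fertilizes the XX plant once", etc.) can be produced by a small formatter. The update schema and the verb table are assumptions for the sketch.

```python
def render_update(update):
    """Format the per-operation lines displayed on each terminal.
    The update schema (operation_log of (user, action, target)) is assumed."""
    lines = []
    for user, action, target in update["operation_log"]:
        verb = {"water": "waters", "fertilize": "fertilizes"}.get(action, action)
        lines.append(f"user {user} {verb} the {target} once")
    return lines

update = {"operation_log": [("A", "fertilize", "XX plant"),
                            ("B", "water", "XX plant"),
                            ("C", "water", "XX plant")]}
display = render_update(update)
```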
With this embodiment, implementing the AR service requires combining geographic location information with image data, where the image data should be an image of the location indicated by the geographic location information, such as an image of the reception desk in an office, and thus describes the implementation scene of the AR service. If the geographic location indicated by the information does not belong to a location where the AR service can be set, or if it does but no implementation scene of the AR service exists there (there is no matching image data), the corresponding AR service cannot be implemented. By associating the geographic location, the AR service scene image, and the AR service, this embodiment allows AR technology to be applied effectively to location-based game applications and improves the user experience.
Example Five
Referring to fig. 5, a flowchart illustrating steps of an AR service processing method according to a fifth embodiment of the present invention is shown.
In this embodiment, the AR service processing method provided by the embodiment of the present invention is described by taking an AR game for cultivating virtual plants as an example from the perspective of interaction between a mobile terminal and a server.
The AR service processing method of the embodiment comprises the following steps:
step S502: the mobile terminal A sets the use position and the use scene of the AR game for cultivating the virtual plants, and uploads the corresponding geographic position information and scene image data to the server.
The mobile terminal A starts or enters the AR game through a corresponding account. On first use of an AR game, such as the virtual-plant cultivation game, the server holds no data related to that game; the mobile terminal therefore performs the setup and uploads the resulting data to the server.
For example, after the mobile terminal A is started, it sends its GPS data to the server, then starts the camera according to the image acquisition instruction sent by the server and sends the camera video stream to the server.
Step S504: the server receives and stores the geographical location information and image data uploaded by the mobile terminal A, generates initialization data for the AR game, and sends it to the mobile terminal A.
For example, after receiving the GPS data of the mobile terminal A, the server first checks whether that GPS data is already stored. On first use it is not, so the server stores it and sends an image acquisition instruction to the mobile terminal A instructing it to upload image data. Once the mobile terminal A uploads a video stream, the server associates the data of the video stream with the GPS data; after acquiring enough image data from the stream, it stores the image data together with the GPS data, generates initialization data for the virtual-plant cultivation game, and sends it to the mobile terminal A. The saved image data may be original images, extracted feature information, or a textual description of the images.
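The first-use handshake just described (store GPS, request images, collect frames until there is enough scene coverage, then emit initialization data) can be sketched as follows. The class name, the frame threshold of 3, and the initialization payload are all assumptions for the sketch.

```python
class InitServer:
    """Illustrative first-use initialization flow of step S504."""
    MIN_FRAMES = 3  # assumed threshold for "enough image data"

    def __init__(self):
        self.scenes = {}  # gps -> list of stored frames

    def receive_gps(self, gps):
        if gps in self.scenes:
            return "known_location"
        self.scenes[gps] = []
        return "request_images"  # stands in for the image acquisition instruction

    def receive_frame(self, gps, frame):
        self.scenes[gps].append(frame)
        if len(self.scenes[gps]) >= self.MIN_FRAMES:
            # Enough scene coverage: generate initialization data for the AR game.
            return {"virtual_object": "plant_3d_model_v0",
                    "virtual_items": ["watering_kettle"]}
        return None  # keep collecting frames from the video stream

srv = InitServer()
status = srv.receive_gps((23.1, 113.3))
init = None
for i in range(3):
    init = srv.receive_frame((23.1, 113.3), f"frame{i}")
```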
In general, since the images in the video stream are used to generate the entrance of the AR game, enough data about the game's usage scene must be acquired from them. For example, when a company's reception desk serves as the AR game entrance, continuous images of the desk from left to right are required so that users who shoot it from different angles can all enter the game normally. The initialization data of the AR game includes at least an AR virtual object and an AR virtual item; in this embodiment it may include an initial 3D model of the plant and interactive props such as a watering kettle.
In addition, if the server has no record of the GPS data sent by the mobile terminal A, it may search its stored GPS data for the location closest to the terminal's current position and send the mobile terminal A a prompt message containing that location's GPS data, informing the user of the nearest place where the AR game can be joined.
The server also stores AR game data corresponding to each GPS record, including the user id, AR game progress, AR game feature data, and so on. The feature data may cover AR game virtual objects, virtual props, and operable game data, such as how many pests are on a virtual plant, how many nutrients it needs, or how much water it needs.
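A per-location record matching this description might look like the following. The concrete keys and the watering rule are hypothetical; the patent only enumerates the kinds of data stored.

```python
# Hypothetical per-GPS-record game data mirroring the text above.
game_record = {
    "gps": (23.1, 113.3),
    "users": ["user_A", "user_B"],
    "progress": {"stage": "seedling"},
    "feature_data": {
        "pests": 3,            # how many pests are on the virtual plant
        "nutrients_needed": 2, # how many nutrients it needs
        "water_needed": 5,     # how much water it needs
    },
}

def apply_water(record):
    """Watering decrements the plant's water shortage, floored at zero."""
    fd = record["feature_data"]
    fd["water_needed"] = max(0, fd["water_needed"] - 1)
    return fd["water_needed"]

remaining = apply_water(game_record)
```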
Step S506: the mobile terminal A receives the initialization data of the AR game and displays an AR game interface according to it.
After the virtual-plant cultivation game on the mobile terminal A receives the initialization data returned by the server, it draws that data onto the image currently captured by the camera to display the AR game interface to the user.
After the initialization process of the AR game is completed, a plurality of mobile terminals including the mobile terminal A can directly enter the AR game for cultivating the virtual plants through the GPS data and the image data of the using scene.
Step S508: the mobile terminal B triggers the AR game, acquires its geographical location information, and sends it to the server.
For example, the mobile terminal B transmits GPS data to the server.
It should be noted that this embodiment uses the mobile terminal B as an example of terminals other than the mobile terminal A, which performed the initialization above, to describe the AR game flow with multiple participants. Those skilled in the art will understand that, in actual use, any mobile terminal participating in the AR game, including the terminals A and B, may carry out the AR game operations of this embodiment by following the operations described for the mobile terminal B.
Step S510: the server receives the geographical location information of the mobile terminal B and matches it against the geographical location information it has stored. If the match succeeds, step S512 is executed; if it fails, the server sends a prompt message to the mobile terminal B indicating an error.
Step S512: the server sends an image acquisition instruction to the mobile terminal B.
After receiving the GPS data of the mobile terminal B, the server checks whether that GPS data exists in its storage and, if so, whether it has corresponding image data.
Step S514: the mobile terminal B captures an image of the location according to the image acquisition instruction and sends it to the server.
For example, after receiving the image acquisition instruction, the mobile terminal B starts a camera and sends a camera video stream to the server.
Step S516: the server receives the image data sent by the mobile terminal B and matches it against the stored image data corresponding to the geographical location information. If the match succeeds, step S518 is executed; if it fails, the server sends a prompt to the mobile terminal B asking whether an AR game usage scene exists there.
After receiving the video stream returned by the mobile terminal B, the server compares the image data in the stream with the stored image data.
If the comparison succeeds, the operation of step S518 follows; if it fails, the server sends a scene prompt to the mobile terminal B. For example, suppose the company reception desk is the game entrance and the user of the mobile terminal B shoots it, but the server does not recognize the image even though the terminal's GPS data matches. The server then prompts the user to move closer and shoot again, or extracts several frames from the stored reception-desk image data, sends them to the mobile terminal B, and prompts the user to shoot the reception desk from the angle shown in those frames.
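The comparison step can be illustrated with a deliberately simplified matcher. The pixel-histogram "signature" below is a toy stand-in chosen only so the sketch is self-contained; a real system would compare learned or keypoint image features, which the patent does not specify.

```python
def frame_signature(frame):
    """Toy stand-in for real image features: counts of each pixel value."""
    sig = {}
    for px in frame:
        sig[px] = sig.get(px, 0) + 1
    return sig

def matches(frame, stored_frames, threshold=0.8):
    """Succeed if any stored frame shares at least `threshold` of the
    uploaded frame's pixel mass (illustrative criterion only)."""
    sig = frame_signature(frame)
    total = sum(sig.values())
    for stored in stored_frames:
        ssig = frame_signature(stored)
        overlap = sum(min(sig.get(k, 0), ssig.get(k, 0)) for k in ssig)
        if overlap / total >= threshold:
            return True
    return False

stored = [[1, 1, 2, 3, 3, 3]]
ok = matches([1, 1, 2, 3, 3, 2], stored)   # similar scene
bad = matches([9, 9, 9, 9, 9, 9], stored)  # unrelated scene
```

On failure, the server would fall back to the retake prompt described in the text.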
Step S518: the server sends the AR game data to the mobile terminal B.
If the image data in the video stream is successfully matched against the stored image data, the server returns its stored game data to the mobile terminal B.
Step S520: the mobile terminal B displays an AR game interface according to the AR game data.
After the server returns the game data, the mobile terminal B displays the game interface to the user according to it.
The AR game interface is an interface combining an AR virtual object and a real scene, wherein the AR virtual object is from AR game data sent by the server, and the real scene is from the view of a camera of the mobile terminal B.
Step S522: the mobile terminal B receives the AR game operation performed by the user through the AR game interface and sends the operation information to the server.
Step S524: the server updates the AR game data according to the operation information and sends the updated data to the mobile terminals A and B.
Step S526: the mobile terminals A and B receive the updated AR game data and refresh their game interfaces.
The AR virtual object responds to the operation of the user of the mobile terminal B. When the mobile terminal B receives a user operation that is a valid AR game operation, it uploads the corresponding information to the server. After receiving it, the server updates the AR game data and sends the update to the mobile terminal A at the same time as it returns it to the mobile terminal B; both terminals then update their local AR game data accordingly.
For example, in the virtual-plant cultivation game of this embodiment, the user clicks the kettle icon to water the virtual plant. The AR game registers the click and sends the corresponding event, such as a water event, to the server. If the plant's water shortage is 5 at that moment, the server decrements it to 4 upon receiving the event, then sends the 3D model of the plant corresponding to a water shortage of 4 to the mobile terminals A and B. The AR game on each terminal receives the model-change notification, fetches the new 3D model data, and draws it in the respective game interfaces.
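The watering round-trip of steps S522 through S526 can be condensed into a sketch. The class, the model naming scheme, and the outbox structure are assumptions made for illustration, not the patent's protocol.

```python
class GameServer:
    """Illustrative server side of the watering round-trip (S522-S526)."""

    def __init__(self):
        self.water_needed = 5
        self.outbox = {"A": [], "B": []}  # per-terminal message queues

    def on_event(self, event):
        if event == "water":
            self.water_needed -= 1
        # Send the 3D model matching the new state to every terminal.
        msg = {"model": f"plant_3d_water_{self.water_needed}"}
        for terminal in self.outbox.values():
            terminal.append(msg)
        return msg

srv = GameServer()
# Terminal B: the user clicks the kettle icon, so a "water" event is uploaded.
msg = srv.on_event("water")
```

Both terminals receive the same model-change message, matching the text's simultaneous update of A and B.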
If other users participate in the AR game, the server sends the updated AR game data to a plurality of mobile terminals accessed simultaneously.
With this embodiment, an AR game for growing plants in an office is constructed. First, one user uploads the game's location data from a mobile terminal and starts the camera to scan the specific place corresponding to that data, such as the company reception desk, thereby initializing the whole AR game. Afterwards, other users upload their location data from their own terminals and scan the same place with their cameras to enter the game. Because the AR game supports multi-user cooperation and social interaction, it can effectively meet users' social needs and improve the user experience.
It should be noted that the AR service processing scheme provided in the embodiment of the present invention is applicable to not only game application scenarios but also other AR service scenarios related to geographic location information.
Example Six
Referring to fig. 6, a block diagram of an AR service processing apparatus according to a sixth embodiment of the present invention is shown.
The AR service processing apparatus of this embodiment may be disposed in a server, and the AR service processing apparatus includes: a first receiving module 602, configured to receive geographic location information of a mobile terminal, and send an image acquisition instruction to the mobile terminal if it is determined that AR service location information matching the geographic location information is locally stored; a second receiving module 604, configured to receive image data returned by the mobile terminal according to the image acquisition instruction; a first sending module 606, configured to send the AR service data to the mobile terminal if the returned image data matches the image data corresponding to the AR service location information.
Optionally, the first sending module 606 is configured to perform image description processing on image data returned by the mobile terminal, and obtain a text description of the image data; and if the text description is matched with the image description corresponding to the AR service position information, sending AR service data to the mobile terminal.
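The image-description matching performed by the first sending module can be illustrated with a toy text matcher. Joining recognized tags is a stand-in for real image captioning, which the patent leaves unspecified; all names here are hypothetical.

```python
def describe(image_tags):
    """Toy image-to-text description: joins recognized tags in sorted order.
    A real system would use an image description / captioning model."""
    return " ".join(sorted(image_tags))

def text_matches(uploaded_tags, stored_description):
    """Compare the description of uploaded image data with the stored one."""
    return describe(uploaded_tags) == stored_description

stored = describe({"desk", "logo", "reception"})
ok = text_matches({"reception", "desk", "logo"}, stored)
```

If the descriptions match, the module would proceed to send the AR service data to the terminal.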
Optionally, the AR service data includes: AR service characteristic data, the AR service characteristic data including at least one of: an AR virtual object, an AR virtual item, operational data for the AR virtual object.
Optionally, the AR service processing apparatus of this embodiment further includes: a first updating module 608, configured to receive an operation performed on the AR virtual object by the mobile terminal through the AR virtual item after the first sending module 606 sends the AR service data to the mobile terminal, and update the operable data of the AR virtual object according to the operation; a synchronization module 610 for synchronizing the updated operational data to the mobile terminal.
Optionally, the first updating module 608 is further configured to update the AR virtual object and/or the AR virtual item according to the updated operational data; the synchronization module 610 is further configured to synchronize the updated AR virtual object and/or the updated AR virtual item to the mobile terminal while synchronizing the updated operational data to the mobile terminal.
Optionally, the mobile terminal comprises a plurality of terminals; the first updating module 608 is configured to receive at least one operation performed on an AR virtual object by at least one mobile terminal of the plurality of mobile terminals through an AR virtual item, and update operable data of the corresponding AR virtual object according to the at least one operation; the synchronization module 610 is configured to synchronize the updated AR virtual object and/or the updated AR virtual item to the plurality of mobile terminals while synchronizing the updated operational data to the plurality of mobile terminals.
Optionally, the first sending module 606 is further configured to send the image data corresponding to the AR service location information to the mobile terminal if the image data returned by the mobile terminal is not matched with the image data corresponding to the AR service location information, and prompt the mobile terminal to perform image acquisition according to the sent image data.
Optionally, the first sending module 606 is further configured to, if it is determined that the AR service location information matched with the geographic location information is not locally stored, search, from the stored AR service location information, for AR service location information closest to the geographic location information; and sending prompt information comprising the closest AR service position information to the mobile terminal.
The AR service processing apparatus of this embodiment is configured to implement the corresponding AR service processing method at the server side in the foregoing multiple method embodiments, and has the beneficial effects of the corresponding method embodiments, which are not described herein again.
Example Seven
Referring to fig. 7, a block diagram of an AR service processing apparatus according to a seventh embodiment of the present invention is shown.
The AR service processing apparatus of this embodiment may be disposed in any suitable mobile terminal, and includes: a first obtaining module 702, configured to obtain geographic location information of a mobile terminal, and send the geographic location information to a server; a third receiving module 704, configured to receive an image acquisition instruction sent by the server after the geographic location information is confirmed; a second sending module 706, configured to collect image data according to the image collection instruction, and send the image data to the server; a fourth receiving module 708, configured to receive the AR service data sent by the server after the image data is confirmed; and the display module 710 is configured to perform AR service display according to the AR service data.
Optionally, the AR service data includes: AR service characteristic data, the AR service characteristic data including at least one of: an AR virtual object, an AR virtual item, operational data for the AR virtual object.
Optionally, the AR service processing apparatus of this embodiment further includes: a third sending module 712, configured to, after the presentation module 710 presents the AR service according to the AR service data, operate the AR virtual object using the AR virtual item according to the operational data, and send operation information of the operation to the server.
Optionally, the AR data processing apparatus of this embodiment further includes: a fifth receiving module 714, configured to receive, after the third sending module 712 sends the operation information of the operation to the server, update data sent by the server, where the update data includes, in addition to the updated operable data, an updated AR virtual object and/or an updated AR virtual item, and the update data is generated by the server according to the operation information of the multiple mobile terminals including the current mobile terminal, where the operation information uses the AR virtual item to operate the AR virtual object; and a second updating module 716, configured to perform AR service display according to the update data.
Optionally, the display module 710 is further configured to display, on the mobile terminal, operation information of operations performed on the AR virtual object by the multiple mobile terminals and user information of the mobile terminal corresponding to the operations.
The AR service processing apparatus of this embodiment is configured to implement the corresponding AR service processing method of the mobile terminal in the foregoing multiple method embodiments, and has the beneficial effects of the corresponding method embodiments, which are not described herein again.
Example eight
Referring to fig. 8, a schematic structural diagram of a server according to an eighth embodiment of the present invention is shown; the specific implementation of the server is not limited by this embodiment.
As shown in fig. 8, the server may include: a processor 802, a communication interface 804, a memory 806, and a communication bus 808.
Wherein:
the processor 802, communication interface 804, and memory 806 communicate with one another via a communication bus 808.
A communication interface 804 for communicating with other servers or mobile terminals.
The processor 802 is configured to execute the program 810, and may specifically execute relevant steps in the above-described server-side AR service processing method embodiment.
In particular, the program 810 may include program code comprising computer operating instructions.
The processor 802 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement an embodiment of the invention. The one or more processors included in the server may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs and one or more ASICs.
The memory 806 stores a program 810. The memory 806 may comprise high-speed RAM memory and may also include non-volatile memory, such as at least one disk memory.
The program 810 may be specifically configured to cause the processor 802 to perform the following operations: receiving geographic position information of the mobile terminal, and if it is determined that AR service position information matching the geographic position information is locally stored, sending an image acquisition instruction to the mobile terminal; receiving image data returned by the mobile terminal according to the image acquisition instruction; and if the returned image data matches the image data corresponding to the AR service position information, sending the AR service data to the mobile terminal.
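For illustration only, the server-side flow just described — matching the reported geographic position, instructing image capture, matching the returned image, and then releasing the AR service data — may be sketched as follows. This is a hypothetical Python sketch, not the patented implementation; the storage layout, matching tolerance, and all names are assumptions.

```python
# Illustrative sketch of the server-side AR service flow (program 810).
# The data store, the location-matching rule, and the message format are
# assumptions for illustration, not the patent's actual implementation.

AR_SERVICES = {
    # AR service position info -> (expected image fingerprint, AR service data)
    (22.54, 114.06): ("office_front_desk", {"virtual_object": "treasure_chest"}),
}

LOCATION_TOLERANCE = 0.01  # degrees; an assumed matching threshold


def match_location(geo):
    """Return the stored AR service position matching the terminal's position, if any."""
    lat, lng = geo
    for stored in AR_SERVICES:
        if abs(stored[0] - lat) <= LOCATION_TOLERANCE and abs(stored[1] - lng) <= LOCATION_TOLERANCE:
            return stored
    return None


def handle_geo_report(geo):
    """Step 1: if a matching AR service position is locally stored, instruct image capture."""
    loc = match_location(geo)
    if loc is None:
        return {"type": "no_service"}
    return {"type": "capture_image", "service_location": loc}


def handle_image(service_location, image_fingerprint):
    """Steps 2-3: if the returned image matches, send the AR service data;
    otherwise prompt the terminal to capture again (see the later embodiment)."""
    expected, service_data = AR_SERVICES[service_location]
    if image_fingerprint == expected:
        return {"type": "ar_service_data", "data": service_data}
    return {"type": "retry", "reference_image": expected}
```

A terminal reporting a position within the assumed tolerance receives a capture instruction, and a matching image then unlocks the AR service data.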
In an optional implementation, when sending the AR service data to the mobile terminal if the image data matches the image data corresponding to the AR service location information, the program 810 is further configured to enable the processor 802 to perform image description processing on the image data to obtain a text description of the image data; and if the text description matches the image description corresponding to the AR service location information, to send the AR service data to the mobile terminal.
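The optional text-description step may be sketched as below. The captioning model is stood in by a lookup table and the overlap threshold is an assumption; the patent does not prescribe a particular image-description method or similarity metric.

```python
# Sketch of the optional text-description matching step: the server derives a
# text description of the returned image and compares it with the stored image
# description. All names and the matching rule are illustrative assumptions.

SAMPLE_CAPTIONS = {"img_001": "front desk with a company logo and two plants"}


def describe_image(image_id):
    """Stand-in for image description processing (e.g. an image-captioning model)."""
    return SAMPLE_CAPTIONS.get(image_id, "")


def descriptions_match(text_description, stored_description, threshold=0.5):
    """Compare descriptions by word overlap (Jaccard) -- an assumed, simplistic rule."""
    a = set(text_description.lower().split())
    b = set(stored_description.lower().split())
    if not a or not b:
        return False
    return len(a & b) / len(a | b) >= threshold
```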
In an optional embodiment, the AR service data includes: AR service characteristic data, the AR service characteristic data including at least one of: an AR virtual object, an AR virtual item, operational data for the AR virtual object.
In an alternative embodiment, the program 810 is further configured to enable the processor 802 to receive an operation performed on the AR virtual object by the mobile terminal through the AR virtual item, and update the operational data of the AR virtual object according to the operation; and synchronizing the updated operational data to the mobile terminal.
In an alternative embodiment, the program 810 is further operative to cause the processor 802 to update the AR virtual object and/or the AR virtual prop according to the updated operational data; and to synchronize the updated AR virtual object and/or the updated AR virtual item to the mobile terminal while synchronizing the updated operational data to the mobile terminal.
In an alternative embodiment, there are a plurality of mobile terminals. When receiving an operation performed by a mobile terminal on the AR virtual object through the AR virtual item and updating the operational data of the AR virtual object according to the operation, the program 810 is further configured to enable the processor 802 to receive at least one operation performed by at least one of the plurality of mobile terminals on the AR virtual object through the AR virtual item, and to update the corresponding operational data of the AR virtual object according to the at least one operation; the program 810 is also configured to cause the processor 802 to synchronize the updated AR virtual object and/or the updated AR virtual item to the plurality of mobile terminals while synchronizing the updated operational data to the plurality of mobile terminals.
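The multi-terminal update path — operations from several terminals updating the object's operational data, after which the server derives an updated virtual object and synchronizes both to every terminal — may be sketched as follows. The health/damage model and the "destroyed" threshold are illustrative assumptions, not part of the patent.

```python
# Sketch of the multi-terminal update path. The damage model, field names,
# and state rule are assumptions chosen to make the flow concrete.

class ARVirtualObject:
    def __init__(self, health=100):
        self.operational_data = {"health": health}
        self.state = "intact"

    def apply_operation(self, operation):
        """Update the operational data according to one terminal's operation."""
        self.operational_data["health"] -= operation.get("damage", 0)
        # Derive the updated virtual object from the updated operational data.
        self.state = "destroyed" if self.operational_data["health"] <= 0 else "intact"


def process_operations(obj, operations):
    """Apply operations from multiple terminals, then build one sync message
    carrying both the updated operational data and the updated virtual object."""
    for op in operations:
        obj.apply_operation(op)
    return {"operational_data": dict(obj.operational_data),
            "virtual_object_state": obj.state}
```

The returned message would be broadcast to all participating terminals in one synchronization step.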
In an optional implementation, the program 810 is further configured to enable the processor 802 to send the image data corresponding to the AR service location information to the mobile terminal if the returned image data does not match the image data corresponding to the AR service location information, and prompt the mobile terminal to perform image acquisition according to the sent image data.
In an alternative embodiment, the program 810 is further configured to enable the processor 802 to search, if it is determined that AR service location information matching the geographic location information is not locally stored, for AR service location information closest to the geographic location information from the stored AR service location information; and sending prompt information comprising the closest AR service position information to the mobile terminal.
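The fallback just described — finding the stored AR service position closest to the reported position when no exact match exists — may be sketched as below. Great-circle (haversine) distance is an assumed choice of metric; the patent does not specify one.

```python
import math

# Sketch of the nearest-location fallback. The haversine metric and the
# coordinate format (lat, lng in degrees) are assumptions for illustration.

def haversine_km(a, b):
    """Great-circle distance in kilometres between two (lat, lng) points."""
    lat1, lng1, lat2, lng2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lng2 - lng1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))


def closest_service_location(geo, stored_locations):
    """Return the stored AR service position nearest to the terminal's position,
    to be included in the prompt information sent back to the terminal."""
    return min(stored_locations, key=lambda loc: haversine_km(geo, loc))
```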
For specific implementation of each step in the program 810, reference may be made to corresponding steps and corresponding descriptions in units in the above embodiment of the server-side AR service processing method, which are not described herein again. It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described devices and modules may refer to the corresponding process descriptions in the foregoing method embodiments, and are not described herein again.
With this embodiment, implementing the AR service requires combining the geographic location information with the image data, where the image data should be an image of the location indicated by the geographic location information, such as an image of the front desk in an office; the image describes the implementation scene of the AR service. If the geographic position indicated by the geographic location information is not a position where the AR service can be set, or if it is such a position but there is no implementation scene of the AR service (no matching image data), the corresponding AR service cannot be implemented. Therefore, the embodiment of the invention associates the geographic location, the AR service scene image, and the AR service, so that the AR technology can be effectively applied to location-related game applications, improving the user experience.
Example nine
Referring to fig. 9, a schematic structural diagram of a mobile terminal according to a ninth embodiment of the present invention is shown; the specific implementation of the mobile terminal is not limited by this embodiment.
As shown in fig. 9, the mobile terminal may include: a processor 902, a communication interface 904, a memory 906, and a communication bus 908.
Wherein:
the processor 902, communication interface 904, and memory 906 communicate with one another via a communication bus 908.
A communication interface 904 for communicating with other mobile terminals or servers.
The processor 902 is configured to execute the program 910, and may specifically execute relevant steps in the above embodiment of the AR service processing method at the mobile terminal side.
In particular, the program 910 may include program code that includes computer operating instructions.
The processor 902 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement an embodiment of the invention. The one or more processors included in the mobile terminal may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs and one or more ASICs.
A memory 906 for storing a program 910. The memory 906 may comprise high-speed RAM memory and may also include non-volatile memory, such as at least one disk memory.
The program 910 may specifically be configured to cause the processor 902 to perform the following operations: acquiring geographic location information of the mobile terminal, and sending the geographic location information to the server; receiving an image acquisition instruction sent by the server after the geographic location information is confirmed; acquiring image data according to the image acquisition instruction, and sending the image data to the server; receiving AR service data sent by the server after the image data is confirmed; and performing AR service display according to the AR service data.
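The mobile-terminal flow just listed may be sketched against a hypothetical server interface as follows. The message types and method names are assumptions for illustration, not a protocol defined by the patent.

```python
# Sketch of the mobile-terminal flow (program 910). The server interface,
# message types ("capture_image", "ar_service_data"), and callback names
# are all illustrative assumptions.

def run_ar_client(server, get_location, capture_image, display):
    """Report location, capture an image when instructed, then display the AR service."""
    reply = server.report_location(get_location())      # step 1: send geo info
    if reply.get("type") != "capture_image":
        return None                                     # location not confirmed
    reply = server.send_image(capture_image())          # step 2: send image data
    if reply.get("type") != "ar_service_data":
        return None                                     # image not confirmed
    display(reply["data"])                              # step 3: AR service display
    return reply["data"]


class FakeServer:
    """Minimal stand-in server that confirms both the location and the image."""

    def report_location(self, geo):
        return {"type": "capture_image"}

    def send_image(self, image):
        return {"type": "ar_service_data", "data": {"virtual_object": "treasure_chest"}}
```

In a real terminal, `get_location`, `capture_image`, and `display` would wrap the positioning module, camera, and AR renderer respectively.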
In an optional embodiment, the AR service data includes: AR service characteristic data, the AR service characteristic data including at least one of: an AR virtual object, an AR virtual item, operational data for the AR virtual object.
In an optional implementation manner, the program 910 is further configured to enable the processor 902, after performing AR service display according to the AR service data, to operate the AR virtual object according to the operational data by using the AR virtual item, and send operation information of the operation to the server.
In an optional implementation manner, the program 910 is further configured to enable the processor 902, after sending the operation information of the operation to the server, to receive update data sent by the server, where the update data includes, in addition to updated operational data, an updated AR virtual object and/or an updated AR virtual item, and the update data is generated by the server according to operation information of a plurality of mobile terminals, including the current mobile terminal, that use the AR virtual item to operate the AR virtual object; and to perform AR service display according to the update data.
In an alternative embodiment, the program 910 is further configured to enable the processor 902 to display, on the mobile terminal, operation information of operations performed on the AR virtual object by the plurality of mobile terminals and user information of the mobile terminal corresponding to the operations.
For specific implementation of each step in the program 910, reference may be made to corresponding steps and corresponding descriptions in units in the above embodiment of the AR service processing method at the mobile terminal side, which are not described herein again. It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described devices and modules may refer to the corresponding process descriptions in the foregoing method embodiments, and are not described herein again.
With this embodiment, implementing the AR service requires combining the geographic location information with the image data, where the image data should be an image of the location indicated by the geographic location information, such as an image of the front desk in an office; the image describes the implementation scene of the AR service. If the geographic position indicated by the geographic location information is not a position where the AR service can be set, or if it is such a position but there is no implementation scene of the AR service (no matching image data), the corresponding AR service cannot be implemented. Therefore, the embodiment of the invention associates the geographic location, the AR service scene image, and the AR service, so that the AR technology can be effectively applied to location-related game applications, improving the user experience.
It should be noted that, according to the implementation requirement, each component/step described in the embodiment of the present invention may be divided into more components/steps, and two or more components/steps or partial operations of the components/steps may also be combined into a new component/step to achieve the purpose of the embodiment of the present invention.
The above-described method according to an embodiment of the present invention may be implemented in hardware or firmware, or as software or computer code that can be stored in a recording medium (such as a CD-ROM, RAM, floppy disk, hard disk, or magneto-optical disk), or as computer code originally stored in a remote recording medium or a non-transitory machine-readable medium, downloaded over a network, and stored in a local recording medium, so that the method described herein can be processed by such software stored on a recording medium using a general-purpose computer, a dedicated processor, or programmable or dedicated hardware such as an ASIC or FPGA. It is understood that the computer, processor, microprocessor controller, or programmable hardware includes a storage component (e.g., RAM, ROM, flash memory, etc.) that can store or receive software or computer code that, when accessed and executed by the computer, processor, or hardware, implements the AR service processing methods described herein. Further, when a general-purpose computer accesses code for implementing the AR service processing method shown herein, execution of the code converts the general-purpose computer into a special-purpose computer for executing that method.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present embodiments.
The above embodiments are only for illustrating the embodiments of the present invention and not for limiting the embodiments of the present invention, and those skilled in the art can make various changes and modifications without departing from the spirit and scope of the embodiments of the present invention, so that all equivalent technical solutions also belong to the scope of the embodiments of the present invention, and the scope of patent protection of the embodiments of the present invention should be defined by the claims.

Claims (28)

1. An Augmented Reality (AR) service processing method, comprising:
receiving geographic position information of a mobile terminal, and if determining that AR service position information matched with the geographic position information is locally stored, sending an image acquisition instruction to the mobile terminal;
receiving image data returned by the mobile terminal according to the image acquisition instruction;
and if the returned image data is matched with the image data corresponding to the AR service position information, sending the AR service data to the mobile terminal.
2. The method of claim 1, wherein if the image data matches the image data corresponding to the AR service location information, sending the AR service data to the mobile terminal comprises:
performing image description processing on the image data to acquire text description of the image data;
and if the text description is matched with the image description corresponding to the AR service position information, sending AR service data to the mobile terminal.
3. The method of claim 1 or 2, wherein the AR service data comprises: AR service characteristic data, the AR service characteristic data comprising at least one of: an AR virtual object, an AR virtual item, operational data for the AR virtual object.
4. The method of claim 3, wherein after sending the AR service data to the mobile terminal, the method further comprises:
receiving the operation of the mobile terminal on the AR virtual object through the AR virtual prop, and updating the operable data of the AR virtual object according to the operation;
synchronizing the updated operational data to the mobile terminal.
5. The method of claim 4, wherein the method further comprises:
updating the AR virtual object and/or the AR virtual prop according to the updated operable data;
and synchronizing the updated AR virtual object and/or the updated AR virtual item to the mobile terminal while synchronizing the updated operable data to the mobile terminal.
6. The method of claim 5, wherein there are a plurality of mobile terminals;
the receiving the operation of the mobile terminal on the AR virtual object through the AR virtual item, and updating the operational data of the AR virtual object according to the operation includes: receiving at least one item of operation of at least one mobile terminal in a plurality of mobile terminals on the AR virtual object through the AR virtual prop, and updating corresponding operable data of the AR virtual object according to the at least one item of operation;
the synchronizing the updated AR virtual object and/or the updated AR virtual item to the mobile terminal while synchronizing the updated operational data to the mobile terminal includes: and synchronizing the updated AR virtual object and/or the updated AR virtual item to the plurality of mobile terminals while synchronizing the updated operational data to the plurality of mobile terminals.
7. The method of claim 1, wherein the method further comprises:
and if the returned image data are not matched with the image data corresponding to the AR service position information, sending the image data corresponding to the AR service position information to the mobile terminal, and prompting the mobile terminal to carry out image acquisition according to the sent image data.
8. The method of claim 1, wherein the method further comprises:
if the fact that the AR service position information matched with the geographic position information is not stored locally is determined, the AR service position information closest to the geographic position information is searched from the stored AR service position information;
and sending prompt information comprising the closest AR service position information to the mobile terminal.
9. An Augmented Reality (AR) service processing method, comprising:
acquiring geographical position information of a mobile terminal, and sending the geographical position information to a server;
receiving an image acquisition instruction sent by the server after the geographic position information is confirmed;
acquiring image data according to the image acquisition instruction, and sending the image data to the server;
receiving AR service data sent by the server after the image data is confirmed;
and performing AR service display according to the AR service data.
10. The method of claim 9, wherein the AR service data comprises: AR service characteristic data, the AR service characteristic data comprising at least one of: an AR virtual object, an AR virtual item, operational data for the AR virtual object.
11. The method of claim 10, wherein after the AR service exposure according to the AR service data, the method further comprises:
and operating the AR virtual object by using the AR virtual prop according to the operable data, and sending the operating information of the operation to the server.
12. The method of claim 10, wherein after the sending operation information for the operation to the server, the method further comprises:
receiving update data sent by the server, wherein the update data comprises an updated AR virtual object and/or an updated AR virtual item besides updated operable data, and the update data is generated by the server according to operation information of a plurality of mobile terminals including the mobile terminal, which use the AR virtual item to operate the AR virtual object;
and performing AR service display according to the updating data.
13. The method of claim 12, wherein the method further comprises:
and displaying operation information of the operation performed by the plurality of mobile terminals on the AR virtual object and user information of the mobile terminal corresponding to the operation on the mobile terminal.
14. An Augmented Reality (AR) service processing apparatus, comprising:
a first receiving module, configured to receive geographic position information of a mobile terminal, and if it is determined that AR service position information matching the geographic position information is locally stored, send an image acquisition instruction to the mobile terminal;
the second receiving module is used for receiving image data returned by the mobile terminal according to the image acquisition instruction;
and the first sending module is used for sending the AR service data to the mobile terminal if the returned image data is matched with the image data corresponding to the AR service position information.
15. The apparatus according to claim 14, wherein the first sending module is configured to perform image description processing on the image data to obtain a text description of the image data; and if the text description is matched with the image description corresponding to the AR service position information, sending AR service data to the mobile terminal.
16. The apparatus of claim 14 or 15, wherein the AR service data comprises: AR service characteristic data, the AR service characteristic data comprising at least one of: an AR virtual object, an AR virtual item, operational data for the AR virtual object.
17. The apparatus of claim 16, wherein the apparatus further comprises:
a first updating module, configured to receive, after the first sending module sends the AR service data to the mobile terminal, an operation performed by the mobile terminal on the AR virtual object through the AR virtual item, and update, according to the operation, operational data of the AR virtual object;
and the synchronization module is used for synchronizing the updated operable data to the mobile terminal.
18. The apparatus of claim 17, wherein,
the first updating module is further used for updating the AR virtual object and/or the AR virtual prop according to the updated operable data;
the synchronization module is further configured to synchronize the updated AR virtual object and/or the updated AR virtual item to the mobile terminal while synchronizing the updated operational data to the mobile terminal.
19. The apparatus of claim 18, wherein there are a plurality of mobile terminals;
the first updating module is configured to receive at least one item of operation performed on the AR virtual object by at least one mobile terminal of the plurality of mobile terminals through the AR virtual item, and update the corresponding operational data of the AR virtual object according to the at least one item of operation;
the synchronization module is configured to synchronize the updated AR virtual object and/or the updated AR virtual item to the plurality of mobile terminals while synchronizing the updated operational data to the plurality of mobile terminals.
20. The apparatus of claim 14, wherein the first sending module is further configured to send the image data corresponding to the AR service location information to the mobile terminal if the returned image data is not matched with the image data corresponding to the AR service location information, and prompt the mobile terminal to perform image acquisition according to the sent image data.
21. The apparatus of claim 14, wherein the first sending module is further configured to, if it is determined that AR service location information matching the geographic location information is not locally stored, find, from the stored AR service location information, AR service location information closest to the geographic location information; and sending prompt information comprising the closest AR service position information to the mobile terminal.
22. An Augmented Reality (AR) service processing apparatus, comprising:
a first obtaining module, configured to obtain geographic position information of a mobile terminal, and send the geographic position information to a server;
the third receiving module is used for receiving an image acquisition instruction sent by the server after the geographic position information is confirmed;
the second sending module is used for collecting image data according to the image collecting instruction and sending the image data to the server;
a fourth receiving module, configured to receive the AR service data sent by the server after the image data is confirmed;
and the display module is used for displaying the AR service according to the AR service data.
23. The apparatus of claim 22, wherein the AR service data comprises: AR service characteristic data, the AR service characteristic data comprising at least one of: an AR virtual object, an AR virtual item, operational data for the AR virtual object.
24. The apparatus of claim 23, wherein the apparatus further comprises:
and the third sending module is used for operating the AR virtual object by using the AR virtual prop according to the operable data after the display module displays the AR service according to the AR service data, and sending the operating information of the operation to the server.
25. The apparatus of claim 23, wherein the apparatus further comprises:
a fifth receiving module, configured to receive, after the third sending module sends the operation information of the operation to the server, update data sent by the server, where the update data includes, in addition to the updated operable data, an updated AR virtual object and/or an updated AR virtual item, and the update data is generated by the server according to operation information of a plurality of mobile terminals, including the mobile terminal, that use the AR virtual item to operate the AR virtual object;
and the second updating module is used for displaying the AR service according to the updating data.
26. The apparatus of claim 25, wherein the presentation module is further configured to display, on the mobile terminal, operation information of the operation performed on the AR virtual object by the plurality of mobile terminals and user information of a mobile terminal corresponding to the operation.
27. A server, comprising: the system comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete mutual communication through the communication bus;
the memory is used for storing at least one executable instruction, and the executable instruction causes the processor to execute the operation corresponding to the AR service processing method of any one of claims 1-8.
28. A mobile terminal, comprising: the system comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete mutual communication through the communication bus;
the memory is used for storing at least one executable instruction, and the executable instruction causes the processor to execute the operation corresponding to the AR service processing method according to any one of claims 9-13.
CN201711261596.0A 2017-12-04 2017-12-04 AR service processing method and device, server and mobile terminal Active CN108114471B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201711261596.0A CN108114471B (en) 2017-12-04 2017-12-04 AR service processing method and device, server and mobile terminal
PCT/CN2018/117486 WO2019109828A1 (en) 2017-12-04 2018-11-26 Ar service processing method, device, server, mobile terminal, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711261596.0A CN108114471B (en) 2017-12-04 2017-12-04 AR service processing method and device, server and mobile terminal

Publications (2)

Publication Number Publication Date
CN108114471A CN108114471A (en) 2018-06-05
CN108114471B true CN108114471B (en) 2020-05-22

Family

ID=62228892

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711261596.0A Active CN108114471B (en) 2017-12-04 2017-12-04 AR service processing method and device, server and mobile terminal

Country Status (2)

Country Link
CN (1) CN108114471B (en)
WO (1) WO2019109828A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108114471B (en) * 2017-12-04 2020-05-22 广州市动景计算机科技有限公司 AR service processing method and device, server and mobile terminal
CN109003317A (en) * 2018-07-04 2018-12-14 百度在线网络技术(北京)有限公司 Virtual information processing method, device, equipment and storage medium
CN108905199A (en) * 2018-07-24 2018-11-30 合肥爱玩动漫有限公司 A kind of game skill acquisition and game skill upgrade method based on AR
US10854004B2 (en) * 2018-08-24 2020-12-01 Facebook, Inc. Multi-device mapping and collaboration in augmented-reality environments
CN111953849A (en) * 2020-08-28 2020-11-17 深圳市慧鲤科技有限公司 Method and device for displaying message board, electronic equipment and storage medium
CN112422680A (en) * 2020-11-18 2021-02-26 中国联合网络通信集团有限公司 Communication method and device
CN113398577B (en) * 2021-05-13 2024-04-09 杭州易现先进科技有限公司 Multi-person AR interaction method and system for offline space

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102891915A (en) * 2011-07-18 2013-01-23 上海盛畅网络科技有限公司 Actual and virtual interactive entertainment system and method for mobile terminal
CN106582016A (en) * 2016-12-05 2017-04-26 湖南简成信息技术有限公司 Augmented reality-based motion game control method and control apparatus
CN106730814A (en) * 2016-11-22 2017-05-31 深圳维京人网络科技有限公司 Marine fishing class game based on AR and face recognition technology

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2501929B (en) * 2012-05-11 2015-06-24 Sony Comp Entertainment Europe Apparatus and method for augmented reality
WO2015041697A1 (en) * 2013-09-23 2015-03-26 Empire Technology Development Llc Location graph adapted video games
CN108114471B (en) * 2017-12-04 2020-05-22 广州市动景计算机科技有限公司 AR service processing method and device, server and mobile terminal

Also Published As

Publication number Publication date
WO2019109828A1 (en) 2019-06-13
CN108114471A (en) 2018-06-05

Similar Documents

Publication Publication Date Title
CN108114471B (en) AR service processing method and device, server and mobile terminal
CN108446310B (en) Virtual street view map generation method and device and client device
US9525964B2 (en) Methods, apparatuses, and computer-readable storage media for providing interactive navigational assistance using movable guidance markers
US10241565B2 (en) Apparatus, system, and method of controlling display, and recording medium
CN106683195B (en) AR scene rendering method based on indoor positioning
CN104571532A (en) Method and device for realizing augmented reality or virtual reality
CN106157354B (en) Three-dimensional scene switching method and system
JP6348741B2 (en) Information processing system, information processing apparatus, information processing program, and information processing method
KR20190127865A (en) Virtual prop allocation method, server, client, and storage medium
TW201229962A (en) Augmenting image data based on related 3D point cloud data
JP2012048597A (en) Mixed reality display system, image providing server, display device and display program
KR101181967B1 (en) 3D street view system using identification information.
US11375559B2 (en) Communication connection method, terminal device and wireless communication system
CN108134945B (en) AR service processing method, AR service processing device and terminal
CN108352086A (en) Determining and presenting solar flux information
JP2011233005A (en) Object displaying device, system, and method
CN109788359B (en) Video data processing method and related device
WO2011096343A1 (en) Photographic location recommendation system, photographic location recommendation device, photographic location recommendation method, and program for photographic location recommendation
CN105320958B (en) Image recognition method and system based on location information
CN111190485A (en) Information display method, information display device, electronic equipment and computer readable storage medium
CN114827647B (en) Live broadcast data generation method, device, equipment, medium and program product
WO2022176450A1 (en) Information processing device, information processing method, and program
JP2016133701A (en) Information providing system and information providing method
TWI611307B (en) Method for establishing location-based space object, method for displaying space object, and application system thereof
JP2011041020A (en) Mobile terminal, display method, and display system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20200526

Address after: 310051 room 508, floor 5, building 4, No. 699, Wangshang Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province

Patentee after: Alibaba (China) Co.,Ltd.

Address before: 510627 14th floor, Tower B, Guangdian Pingyun Plaza, No. 163 Pingyun Road, Huangpu Avenue West, Tianhe District, Guangzhou City, Guangdong Province

Patentee before: GUANGZHOU UCWEB COMPUTER TECHNOLOGY Co.,Ltd.