CN114791766A - AR device-based operation method, device, medium and device - Google Patents

AR device-based operation method, device, medium and device

Info

Publication number
CN114791766A
Authority
CN
China
Prior art keywords
virtual reality
reality scene
electronic equipment
current electronic
displayed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210529050.3A
Other languages
Chinese (zh)
Inventor
张磊 (Zhang Lei)
贺宗艳 (He Zongyan)
房晓辉 (Fang Xiaohui)
张曦月 (Zhang Xiyue)
张贺然 (Zhang Heran)
刘楠 (Liu Nan)
张敏明 (Zhang Minming)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Neighborhood Information Technology Co ltd
Original Assignee
Shanghai Neighborhood Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Neighborhood Information Technology Co ltd filed Critical Shanghai Neighborhood Information Technology Co ltd
Priority to CN202210529050.3A
Publication of CN114791766A
Legal status: Pending (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01 Indexing scheme relating to G06F3/01
    • G06F 2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an AR device-based operation method, apparatus, medium, and device. The method includes: acquiring a virtual reality scene displayed by the current electronic device, wherein the virtual reality scene is displayed on a graphic of a door; acquiring an operation instruction of a user in the virtual reality scene; and executing the operation corresponding to the operation instruction and displaying the operation result on the current electronic device. By acquiring the virtual reality scene displayed by the current electronic device and the user's operation instruction in the virtual reality scene, and obtaining the corresponding operation result according to the operation instruction, the user is enabled to perform corresponding operations based on the AR device.

Description

AR device-based operation method, device, medium and device
Technical Field
The present application relates to the technical field of operation methods for AR devices, and in particular, to an operation method, apparatus, medium, and device based on an AR device.
Background
At present, with the progress and development of technology, intelligent devices have gradually entered ordinary households. Augmented Reality (AR), a technology that fuses virtual information with the real world, has been developed in China, together with corresponding AR equipment through which users can experience various real scenes first-hand. However, even with such AR devices, the user can only view content and cannot perform corresponding operations on it.
Disclosure of Invention
The present application is proposed to solve the above technical problem. Embodiments of the present application provide an operation method, apparatus, medium, and device based on an AR device, which address the problem that a user cannot perform corresponding operations through an AR device.
According to an aspect of the present application, there is provided an operation method based on an AR device, including: acquiring a virtual reality scene displayed by current electronic equipment; wherein the virtual reality scene is displayed on a graphic of a door; acquiring an operation instruction of a user in the virtual reality scene; and executing the operation corresponding to the operation instruction and displaying an operation result on the current electronic equipment.
In an embodiment, the acquiring the virtual reality scene currently displayed by the electronic device includes: acquiring a virtual reality scene displayed by current electronic equipment; wherein the virtual reality scene comprises a panoramic stereoscopic model.
In an embodiment, the executing the operation corresponding to the operation instruction includes: and if the obtained continuous operation time at one position of the current electronic equipment is longer than a first preset time, entering the inside of the panoramic stereo model from the outside.
In an embodiment, the executing the operation corresponding to the operation instruction includes: if it is acquired that a first position and a second position on the current electronic device are continuously operated simultaneously for less than a second preset time, the first position moves to a first target position in a target direction, and the second position moves to a second target position in the target direction, controlling the panoramic stereoscopic model to rotate in the target direction; wherein the first target position is different from the second target position.
In an embodiment, the acquiring the virtual reality scene currently displayed by the electronic device includes: acquiring a virtual reality scene displayed by current electronic equipment; the virtual reality scene comprises a plurality of three-dimensional models of preset commodities.
In an embodiment, the executing the operation corresponding to the operation instruction includes: and if an operation instruction for moving the target model stored in the three-dimensional models of the preset commodities from the current position to a third target position on the current electronic equipment is obtained, controlling the target model to move from the current position to the third target position.
In an embodiment, the acquiring the virtual reality scene currently displayed by the electronic device includes: acquiring a virtual reality scene displayed by the current electronic equipment; the virtual reality scene comprises an image of a real scene and the stereoscopic models of the preset commodities are placed in the image of the real scene.
According to an aspect of the present application, there is provided an AR device-based operating apparatus, including: the scene acquisition module is used for acquiring a virtual reality scene displayed by the current electronic equipment; wherein the virtual reality scene is displayed on a graphic of a door; the instruction acquisition module is used for acquiring an operation instruction of a user in the virtual reality scene; and the execution module is used for executing the operation corresponding to the operation instruction and displaying an operation result on the current electronic equipment.
According to another aspect of the present application, there is provided a computer-readable storage medium storing a computer program for executing the AR device-based operation method described in any one of the above.
According to another aspect of the present application, there is provided an electronic apparatus including: a processor; a memory for storing the processor-executable instructions; the processor is configured to execute any one of the above operation methods based on the AR device.
The application provides an AR device-based operation method, apparatus, medium, and device. The method includes: acquiring a virtual reality scene displayed by the current electronic device, wherein the virtual reality scene is displayed on a graphic of a door; acquiring an operation instruction of a user in the virtual reality scene; and executing the operation corresponding to the operation instruction and displaying the operation result on the current electronic device. By acquiring the virtual reality scene displayed by the current electronic device and the user's operation instruction in the virtual reality scene, and obtaining the corresponding operation result according to the operation instruction, the user is enabled to perform corresponding operations based on the AR device.
Drawings
The above and other objects, features and advantages of the present application will become more apparent by describing in more detail embodiments of the present application with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. In the drawings, like reference numbers generally represent like parts or steps.
Fig. 1 is a flowchart illustrating an AR device-based operation method according to an exemplary embodiment of the present application.
Fig. 2 is a flowchart illustrating an AR device-based operation method according to another exemplary embodiment of the present application.
Fig. 3 is a flowchart illustrating an AR device-based operation method according to another exemplary embodiment of the present application.
Fig. 4 is a flowchart illustrating an AR device-based operation method according to another exemplary embodiment of the present application.
Fig. 5 is a flowchart illustrating an AR device-based operation method according to another exemplary embodiment of the present application.
Fig. 6 is a flowchart illustrating an AR device-based operation method according to another exemplary embodiment of the present application.
Fig. 7 is a flowchart illustrating an AR device-based operation method according to another exemplary embodiment of the present application.
Fig. 8 is a schematic structural diagram of an operation apparatus based on an AR device according to an exemplary embodiment of the present application.
Fig. 9 is a schematic structural diagram of an operating apparatus based on an AR device according to another exemplary embodiment of the present application.
Fig. 10 is a block diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
Hereinafter, example embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be understood that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and that the present application is not limited by the example embodiments described herein.
Fig. 1 is a flowchart illustrating an AR device-based operation method according to an exemplary embodiment of the present application. As shown in fig. 1, the AR device-based operation method includes:
Step 110: acquiring a virtual reality scene displayed by the current electronic device, wherein the virtual reality scene is displayed on a graphic of a door.
To acquire the virtual reality scene displayed by the current electronic device, the electronic device first captures an image, and image processing is performed on the captured image to extract a plurality of feature points. If the graphic formed by the feature points matches a preset graphic, an advertisement corresponding to the preset graphic is obtained, an opening instruction is sent to the access controller, and the advertisement is overlaid on the graphic. The advertisement carries a virtual reality scene marker, and after the advertisement is acquired, the corresponding virtual reality scene is generated and displayed on the graphic of the door.
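For illustration only, this recognition-and-overlay flow could be organised along the following lines. The Kotlin sketch below is a minimal assumption-laden outline: the types FeatureExtractor, GraphicMatcher, AdService, DoorController, and Renderer are hypothetical placeholders, not part of the patent or of any particular SDK.

```kotlin
// Hypothetical types standing in for the steps described above; none of these names
// come from the patent or from a specific SDK.
data class FeaturePoint(val x: Float, val y: Float)
data class Advertisement(val id: String, val vrSceneMarker: String)

interface FeatureExtractor { fun extract(frame: ByteArray): List<FeaturePoint> }
interface GraphicMatcher { fun matchesPresetGraphic(points: List<FeaturePoint>): Boolean }
interface AdService { fun advertisementForPresetGraphic(): Advertisement }
interface DoorController { fun sendOpenInstruction() }
interface Renderer {
    fun overlayOnDoorGraphic(ad: Advertisement)
    fun showVirtualRealityScene(marker: String)
}

class DoorSceneCoordinator(
    private val extractor: FeatureExtractor,
    private val matcher: GraphicMatcher,
    private val ads: AdService,
    private val door: DoorController,
    private val renderer: Renderer
) {
    // Feature points -> match preset graphic -> fetch ad -> open door -> overlay -> VR scene.
    fun onFrameCaptured(frame: ByteArray) {
        val points = extractor.extract(frame)
        if (!matcher.matchesPresetGraphic(points)) return
        val ad = ads.advertisementForPresetGraphic()
        door.sendOpenInstruction()
        renderer.overlayOnDoorGraphic(ad)
        renderer.showVirtualRealityScene(ad.vrSceneMarker)
    }
}
```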
Step 120: and acquiring an operation instruction of a user in the virtual reality scene.
The user may operate in the virtual reality scene directly on the electronic device or while wearing a VR device. The operation instruction generated when the user operates on the electronic device is then acquired.
Step 130: and executing the operation corresponding to the operation instruction and displaying the operation result on the current electronic equipment.
The operation corresponding to the operation instruction is executed and the operation result is displayed on the current electronic device, so that the user can see the corresponding action while operating.
The application provides an AR device-based operation method, apparatus, medium, and device. The method includes: acquiring a virtual reality scene displayed by the current electronic device, wherein the virtual reality scene is displayed on a graphic of a door; acquiring an operation instruction of a user in the virtual reality scene; and executing the operation corresponding to the operation instruction and displaying the operation result on the current electronic device. By acquiring the virtual reality scene displayed by the current electronic device and the user's operation instruction in the virtual reality scene, and obtaining the corresponding operation result according to the operation instruction, the user is enabled to perform corresponding operations based on the AR device.
Fig. 2 is a flowchart illustrating an AR device-based operation method according to another exemplary embodiment of the present application. As shown in fig. 2, step 110 may include:
Step 111: acquiring a virtual reality scene displayed by the current electronic device, wherein the virtual reality scene comprises a panoramic stereoscopic model.
The panoramic stereoscopic model is displayed in an open area of the image of the real scene.
Fig. 3 is a flowchart illustrating an AR device-based operation method according to another exemplary embodiment of the present application. As shown in fig. 3, step 130 may include:
Step 131: if the acquired continuous operation time at one position of the current electronic device is longer than a first preset time, entering the inside of the panoramic stereoscopic model from the outside.
If the user long-presses one position on the electronic device and the continuous operation time is longer than the first preset time, the view enters the inside of the panoramic stereoscopic model from the outside. For example, if the panoramic stereoscopic model is an automobile and the first preset time is 3 seconds, pressing the car window for 3 seconds takes the user into the interior of the automobile, from which the scenery outside can be viewed. Alternatively, the current electronic device displays a simulated object; when it is acquired that the user's continuous operation time at the position of the simulated object is longer than a third preset time (for example, 1 second) and the touch then moves from the current position to another position, the simulated object can be moved, but if no operation instruction is received during the movement, the simulated object stops moving immediately.
Alternatively, if the number of touches at one position of the current electronic device is greater than or equal to a preset number threshold, the view enters the panoramic stereoscopic model from the outside.
If the number of times the user touches one position of the current electronic device is greater than or equal to a preset number threshold (for example, 2), the view enters the inside of the panoramic stereoscopic model from the outside. For example, if the panoramic stereoscopic model is a simulated vehicle and the user touches one position on the current electronic device twice, the view enters the interior of the simulated vehicle from its exterior.
After the view has entered the interior of the panoramic stereoscopic model, when the duration of the operation on a target object on the current electronic device is greater than or equal to a fourth preset duration threshold (for example, 1 second), the target object is selected and can be controlled to move; when no further operation on the target object is acquired, the movement stops. If it is acquired that the continuous operation time at a first sub-position on the current electronic device is less than a fifth preset time and the first sub-position moves to a first sub-target position in a target direction, the current viewing angle of the electronic device is controlled to rotate in that direction. If it is acquired that a third sub-position approaches a fourth sub-position in a third direction and a fifth sub-position approaches a sixth sub-position in a fourth direction on the current electronic device, where the third and fourth directions are opposite, the view of the current electronic device is enlarged. If, within the enlarged view, the continuous operation time at a second sub-position is less than a sixth preset time and the second sub-position moves to a second sub-target position in the target direction, the view moves in that direction within the enlarged image. When the number of touches on a target object on the current electronic device is greater than a preset number threshold (for example, 2), the view enters the target object.
If the continuous operation time of the user at two positions of the current electronic device is longer than a preset exit time (for example, 3 seconds), the view returns from the inside of the panoramic stereoscopic model to the outside. For example, if the panoramic stereoscopic model is a simulated vehicle, a long press with two fingers for 3 seconds exits the internal scene of the panoramic stereoscopic model.
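As a minimal sketch of the entry and exit gestures described above, the logic could be tracked as follows. The class PanoramaEntryController and its callbacks are hypothetical, and the 3-second press and 2-tap values are used only because they are the example values given in the text.

```kotlin
class PanoramaEntryController(
    private val enterPressMillis: Long = 3_000,   // "first preset time" (3 s in the example)
    private val enterTapThreshold: Int = 2,       // "preset number threshold" (2 taps)
    private val exitPressMillis: Long = 3_000     // two-finger press time for exiting
) {
    var insideModel = false
        private set

    // Single finger held at one position for durationMillis -> enter the model's interior.
    fun onSingleFingerHold(durationMillis: Long) {
        if (!insideModel && durationMillis > enterPressMillis) insideModel = true
    }

    // Repeated taps at one position -> enter the model's interior.
    fun onTapCount(count: Int) {
        if (!insideModel && count >= enterTapThreshold) insideModel = true
    }

    // Two fingers held down simultaneously -> leave the interior and return outside.
    fun onTwoFingerHold(durationMillis: Long) {
        if (insideModel && durationMillis > exitPressMillis) insideModel = false
    }
}

fun main() {
    val gestures = PanoramaEntryController()
    gestures.onSingleFingerHold(3_200)   // e.g. pressing the car window for a bit over 3 s
    println(gestures.insideModel)        // true: the view is now inside the model
    gestures.onTwoFingerHold(3_100)      // two-finger long press exits the interior scene
    println(gestures.insideModel)        // false
}
```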
Fig. 4 is a flowchart illustrating an AR device-based operation method according to another exemplary embodiment of the present application. As shown in fig. 4, step 130 may include:
Step 132: if it is acquired that a first position and a second position on the current electronic device are continuously operated simultaneously for less than a second preset time, the first position moves to a first target position in a target direction, and the second position moves to a second target position in the target direction, controlling the panoramic stereoscopic model to rotate in the target direction, wherein the first target position is different from the second target position.
If it is acquired that a first position and a second position on the current electronic device are continuously operated simultaneously for less than a second preset time, the first position moves to a first target position in a target direction, and the second position moves to a second target position in that direction, the panoramic stereoscopic model rotates to the left or to the right, that is, clockwise or counterclockwise. It can be understood that, when a panoramic stereoscopic model (AR object) is present, the user presses the object with two fingers and slides left or right to rotate it clockwise or counterclockwise. When several panoramic stereoscopic models exist and an operation instruction touching a target panoramic stereoscopic model is acquired, that target model is selected; for example, if panoramic stereoscopic models A and B exist and an operation instruction touching model A is acquired, model A is selected. The movement of the first position and the second position may be upward, downward, leftward, rightward, or in any other direction and at any other angle.
In an embodiment, if it is acquired that a third position on the current electronic device approaches a fourth position in a first direction and a fifth position approaches a sixth position in a second direction, the panoramic stereoscopic model is enlarged, where the first direction and the second direction are opposite directions.
If it is acquired that, on the current electronic device, a third position approaches a fourth position in a first direction and a fifth position approaches a sixth position in the opposite direction, that is, the user's two fingers spread outward, the panoramic stereoscopic model is enlarged. This operation applies to a single panoramic stereoscopic model as well as to several.
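A minimal sketch of how the two-finger rotation and the pinch-to-enlarge gestures described above might be interpreted is shown below; the Touch type, the ModelGestureController class, and the fixed 15-degree rotation step are illustrative assumptions rather than values taken from the patent.

```kotlin
import kotlin.math.hypot

// Screen position of one touch point; purely illustrative.
data class Touch(val x: Float, val y: Float)

class ModelGestureController(var rotationDegrees: Float = 0f, var scale: Float = 1f) {

    // Both fingers move in the same horizontal direction -> rotate the panoramic model.
    fun onTwoFingerDrag(start1: Touch, end1: Touch, start2: Touch, end2: Touch) {
        val dx1 = end1.x - start1.x
        val dx2 = end2.x - start2.x
        when {
            dx1 > 0 && dx2 > 0 -> rotationDegrees += 15f   // drag right: clockwise step
            dx1 < 0 && dx2 < 0 -> rotationDegrees -= 15f   // drag left: counterclockwise step
        }
    }

    // Fingers spread apart (distance between them grows) -> enlarge the model proportionally.
    fun onPinch(start1: Touch, end1: Touch, start2: Touch, end2: Touch) {
        val before = hypot(start1.x - start2.x, start1.y - start2.y)
        val after = hypot(end1.x - end2.x, end1.y - end2.y)
        if (after > before && before > 0f) scale *= after / before
    }
}
```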
In an embodiment, if a click on the simulation operation button is received, the current picture of the electronic device is switched to full screen.
When the user clicks the simulation operation button, the picture of the electronic device is displayed in full screen and the user can start the simulation operation on it. During the simulation operation, the cloud can render in real time and deliver the resulting data stream.
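A possible sketch of this simulation path, under the assumption of hypothetical CloudRenderer and FullScreenView interfaces, is given below; the patent only states that clicking the button switches to full screen and that the cloud renders in real time and issues a data stream.

```kotlin
// Illustrative interfaces; not an actual cloud-rendering API.
interface CloudRenderer { fun openFrameStream(onFrame: (ByteArray) -> Unit) }
interface FullScreenView {
    fun enterFullScreen()
    fun showFrame(frame: ByteArray)
}

class SimulationLauncher(private val cloud: CloudRenderer, private val view: FullScreenView) {
    // Click on the simulation operation button: go full screen, then display cloud-rendered frames.
    fun onSimulationButtonClicked() {
        view.enterFullScreen()
        cloud.openFrameStream { frame -> view.showFrame(frame) }
    }
}
```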
Fig. 5 is a flowchart illustrating an AR device-based operation method according to another exemplary embodiment of the present application. As shown in fig. 5, step 110 may include:
Step 112: acquiring a virtual reality scene displayed by the current electronic device, wherein the virtual reality scene comprises stereoscopic models of a plurality of preset commodities.
The virtual reality scene comprises stereoscopic models of a plurality of preset commodities, and these models are stored in the form of a list. The user may click on a commodity to cache it. When wearing the AR device, the user can view the cached stereoscopic models of the commodities through the AR device.
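The list-plus-cache behaviour described above could be sketched as follows; CommodityModelStore and its fields are illustrative assumptions, not structures defined by the patent.

```kotlin
// Hypothetical commodity catalogue and cache; the patent only states that the stereoscopic
// models are stored as a list, cached when clicked, and viewable through the AR device.
data class CommodityModel(val id: String, val name: String, val meshUri: String)

class CommodityModelStore(private val catalog: List<CommodityModel>) {
    private val cached = mutableListOf<CommodityModel>()

    // The user clicks a commodity in the list; its stereoscopic model is cached.
    fun onCommodityClicked(id: String) {
        catalog.find { it.id == id }?.let { model ->
            if (model !in cached) cached += model
        }
    }

    // The models shown when the user views the scene through the AR device.
    fun cachedModels(): List<CommodityModel> = cached.toList()
}
```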
Fig. 6 is a flowchart illustrating an AR device-based operation method according to another exemplary embodiment of the present application. As shown in fig. 6, step 130 may include:
Step 133: if an operation instruction for moving a target model, stored among the stereoscopic models of the plurality of preset commodities, from its current position to a third target position on the current electronic device is acquired, controlling the target model to move from the current position to the third target position.
When an operation instruction for moving the target model from its current position to the third target position is acquired, the target model is controlled to move from the current position to the third target position. For example, when the user touches the target model on the electronic device with one or two fingers and drags it, the target model follows the user's drag path, and this movement instruction controls the target model to move from the current position to the third target position.
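For illustration, the drag-to-move behaviour could be expressed along these lines; ModelMover and ScenePosition are hypothetical names introduced only for this sketch.

```kotlin
// Illustrative drag handling: the selected model follows the sampled touch path and ends
// up at the "third target position" when the drag is released.
data class ScenePosition(val x: Float, val y: Float)

class ModelMover(var modelPosition: ScenePosition) {
    // Called for each sampled point of the user's one- or two-finger drag.
    fun onDragSample(point: ScenePosition) {
        modelPosition = point
    }

    // Drag released: place the model at the third target position.
    fun onDragEnd(thirdTargetPosition: ScenePosition) {
        modelPosition = thirdTargetPosition
    }
}
```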
In an embodiment, if it is acquired that the user clicks a camera icon button, a picture of the real scene currently being shot is captured.
Fig. 7 is a flowchart illustrating an AR device-based operation method according to another exemplary embodiment of the present application. As shown in fig. 7, step 110 may include:
Step 113: acquiring a virtual reality scene displayed by the current electronic device, wherein the virtual reality scene comprises an image of a real scene and a plurality of stereoscopic models of preset commodities placed in the image of the real scene.
That is, the acquired virtual reality scene is composed of an image of the real scene in which the stereoscopic models of the plurality of preset commodities have been placed.
In an embodiment, if the AR device displays an image of the real scene with a plurality of stereoscopic models of preset commodities placed in it, an operation flow is acquired from the cloud database and displayed on the display screen of the AR device.
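A minimal sketch of fetching and displaying the operation flow, assuming hypothetical CloudDatabase and ArDisplay interfaces, might look like this:

```kotlin
// Illustrative interfaces; the patent only states that an operation flow is acquired from
// the cloud database and shown on the AR device's display screen.
interface CloudDatabase { fun fetchOperationFlow(commodityId: String): List<String> }
interface ArDisplay { fun showSteps(steps: List<String>) }

class OperationFlowPresenter(private val db: CloudDatabase, private val display: ArDisplay) {
    // Called once the commodity models have been placed in the real-scene image.
    fun onModelsPlacedInRealScene(commodityId: String) {
        display.showSteps(db.fetchOperationFlow(commodityId))
    }
}
```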
In one embodiment, before step 110, the AR device-based operation method may further include: if the current electronic device displays a video, overlaying the video on the graphic of the door.
In one embodiment, before step 130, the AR device-based operation method may further include: receiving an instruction indicating that the electronic device moves in the target direction, and controlling the video to move together with the graphic of the door.
Fig. 8 is a schematic structural diagram of an AR device-based operating apparatus according to an exemplary embodiment of the present application. As shown in fig. 8, the AR device-based operating apparatus 20 includes a scene acquisition module 201, an instruction acquisition module 202, and an execution module 203. The scene acquisition module 201 is used for acquiring a virtual reality scene displayed by the current electronic device, wherein the virtual reality scene is displayed on a graphic of a door; the instruction acquisition module 202 is used for acquiring an operation instruction of a user in the virtual reality scene; and the execution module 203 is used for executing the operation corresponding to the operation instruction and displaying the operation result on the current electronic device.
The application thus provides an AR device-based operating apparatus in which the scene acquisition module 201 acquires the virtual reality scene displayed by the current electronic device, wherein the virtual reality scene is displayed on a graphic of a door, the instruction acquisition module 202 acquires an operation instruction of the user in the virtual reality scene, and the execution module 203 executes the operation corresponding to the operation instruction and displays the operation result on the current electronic device. By acquiring the virtual reality scene displayed by the current electronic device and the user's operation instruction in the virtual reality scene, and obtaining the corresponding operation result according to the operation instruction, the user is enabled to perform corresponding operations based on the AR device.
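Purely as an illustration of this module split, apparatus 20 could be outlined as follows; the interface and class names are hypothetical and chosen only to mirror the scene acquisition, instruction acquisition, and execution roles described above.

```kotlin
// Illustrative outline of apparatus 20 as three collaborating modules.
data class VirtualRealityScene(val displayedOnDoorGraphic: Boolean)
data class OperationInstruction(val kind: String)
data class OperationResult(val description: String)

interface SceneAcquisitionModule { fun acquireScene(): VirtualRealityScene }                                         // module 201
interface InstructionAcquisitionModule { fun acquireInstruction(scene: VirtualRealityScene): OperationInstruction } // module 202
interface ExecutionModule { fun execute(instruction: OperationInstruction): OperationResult }                       // module 203

class ArOperationApparatus(
    private val sceneModule: SceneAcquisitionModule,
    private val instructionModule: InstructionAcquisitionModule,
    private val executionModule: ExecutionModule
) {
    // One pass through the pipeline: scene -> user instruction -> executed result.
    fun run(): OperationResult {
        val scene = sceneModule.acquireScene()
        val instruction = instructionModule.acquireInstruction(scene)
        return executionModule.execute(instruction)
    }
}
```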
Fig. 9 is a schematic structural diagram of an operating apparatus based on an AR device according to another exemplary embodiment of the present application. As shown in fig. 9, the scene acquisition module 201 may include: the first scene acquiring subunit 2011 is configured to acquire a virtual reality scene displayed by the current electronic device, where the virtual reality scene includes a panoramic stereo model.
In one embodiment, as shown in fig. 9, the executing module 203 may include: a first executing subunit 2031, configured to enter the inside of the panoramic stereo model from the outside if it is obtained that the duration of the operation on one position of the current electronic device is longer than a first preset time.
In one embodiment, as shown in fig. 9, the executing module 203 may include: a second executing subunit 2032, configured to control the panoramic stereo model to rotate in the target direction if it is obtained that the first position and the second position on the current electronic device have the same continuous operation time that is less than a second preset time, the first position moves to a first target position in the target direction, and the second position moves to a second target position in the target direction, where the first target position is different from the second target position.
In an embodiment, as shown in fig. 9, the scene acquisition module 201 may include: a second scene acquiring subunit 2012, configured to acquire a virtual reality scene displayed by the current electronic device, where the virtual reality scene includes stereoscopic models of a plurality of preset commodities.
In one embodiment, as shown in fig. 9, the executing module 203 may include: a third executing subunit 2033, configured to, if an operation instruction for moving a target model stored in the multiple preset product stereoscopic models from the current position to a third target position on the current electronic device is obtained, control the target model to move from the current position to the third target position.
In an embodiment, as shown in fig. 9, the scene acquisition module 201 may include: a third scene obtaining subunit 2013, configured to acquire a virtual reality scene displayed by the current electronic device, where the virtual reality scene includes an image of a real scene and the stereoscopic models of the preset commodities placed in the image of the real scene.
FIG. 10 illustrates a block diagram of an electronic device in accordance with an embodiment of the present application.
As shown in fig. 10, the electronic device 10 includes one or more processors 11 and memory 12.
The processor 11 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 10 to perform desired functions.
Memory 12 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM), cache memory (cache), and/or the like. The non-volatile memory may include, for example, Read Only Memory (ROM), hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer-readable storage medium and executed by processor 11 to implement the AR device-based methods of operation of the various embodiments of the present application described above and/or other desired functionality. Various contents such as an input signal, a signal component, a noise component, etc. may also be stored in the computer-readable storage medium.
In one example, the electronic device 10 may further include: an input device 13 and an output device 14, which are interconnected by a bus system and/or other form of connection mechanism (not shown).
When the electronic device is a stand-alone device, the input device 13 may be a communication network connector for receiving the acquired input signals from the first device and the second device.
The input device 13 may also include, for example, a keyboard, a mouse, and the like.
The output device 14 may output various information including the determined distance information, direction information, and the like to the outside. The output devices 14 may include, for example, a display, speakers, a printer, and a communication network and its connected remote output devices, among others.
Of course, for simplicity, only some of the components of the electronic device 10 relevant to the present application are shown in fig. 10, and components such as buses, input/output interfaces, and the like are omitted. In addition, the electronic device 10 may include any other suitable components depending on the particular application.
The computer program product may be written with program code for performing the operations of embodiments of the present application in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
The computer readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the application to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.

Claims (10)

1. An operation method based on an AR device, comprising:
acquiring a virtual reality scene displayed by current electronic equipment; wherein the virtual reality scene is displayed on a graphic of a door;
acquiring an operation instruction of a user in the virtual reality scene; and
and executing the operation corresponding to the operation instruction and displaying an operation result on the current electronic equipment.
2. The method of claim 1, wherein the obtaining the virtual reality scene currently displayed by the electronic device comprises:
acquiring a virtual reality scene displayed by current electronic equipment; wherein the virtual reality scene comprises a panoramic stereoscopic model.
3. The AR device-based operation method of claim 2, wherein the performing the operation corresponding to the operation instruction comprises:
and if the obtained continuous operation time at one position of the current electronic equipment is longer than a first preset time, entering the inside of the panoramic stereo model from the outside.
4. The AR device-based operation method according to claim 2, wherein the executing the operation corresponding to the operation instruction comprises:
if it is acquired that a first position and a second position on the current electronic device are continuously operated simultaneously for less than a second preset time, the first position moves to a first target position in a target direction, and the second position moves to a second target position in the target direction, controlling the panoramic stereoscopic model to rotate in the target direction; wherein the first target position is different from the second target position.
5. The method of claim 1, wherein the obtaining the virtual reality scene currently displayed by the electronic device comprises:
acquiring a virtual reality scene displayed by current electronic equipment; the virtual reality scene comprises a plurality of three-dimensional models of preset commodities.
6. The AR device-based operation method according to claim 5, wherein the executing the operation corresponding to the operation instruction comprises:
and if an operation instruction for moving the target model stored in the three-dimensional models of the preset commodities from the current position to a third target position on the current electronic equipment is obtained, controlling the target model to move from the current position to the third target position.
7. The method of claim 5, wherein the obtaining the virtual reality scene currently displayed by the electronic device comprises:
acquiring a virtual reality scene displayed by the current electronic equipment; wherein the virtual reality scene comprises an image of a real scene and the stereoscopic models of the preset commodities are placed in the image of the real scene.
8. An operating apparatus based on an AR device, comprising:
the scene acquisition module is used for acquiring a virtual reality scene displayed by the current electronic equipment; wherein the virtual reality scene is displayed on a graphic of a door;
the instruction acquisition module is used for acquiring an operation instruction of a user in the virtual reality scene; and
and the execution module is used for executing the operation corresponding to the operation instruction and displaying the operation result on the current electronic equipment.
9. A computer-readable storage medium storing a computer program for executing the AR device-based operation method according to any one of claims 1 to 7.
10. An electronic device, the electronic device comprising:
a processor;
a memory for storing the processor-executable instructions;
the processor being configured to perform the AR device-based operation method according to any one of claims 1 to 7.
CN202210529050.3A 2022-05-16 2022-05-16 AR device-based operation method, device, medium and device Pending CN114791766A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210529050.3A CN114791766A (en) 2022-05-16 2022-05-16 AR device-based operation method, device, medium and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210529050.3A CN114791766A (en) 2022-05-16 2022-05-16 AR device-based operation method, device, medium and device

Publications (1)

Publication Number Publication Date
CN114791766A true CN114791766A (en) 2022-07-26

Family

ID=82463248

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210529050.3A Pending CN114791766A (en) 2022-05-16 2022-05-16 AR device-based operation method, device, medium and device

Country Status (1)

Country Link
CN (1) CN114791766A (en)


Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination