CN118057466A - Control method and device based on augmented reality, electronic equipment and storage medium - Google Patents

Info

Publication number: CN118057466A
Application number: CN202211449018.0A
Authority: CN (China)
Legal status: Pending
Prior art keywords: hand, virtual, target, determining, augmented reality
Other languages: Chinese (zh)
Inventor: 饶小林
Current Assignee: Beijing Zitiao Network Technology Co Ltd
Original Assignee: Beijing Zitiao Network Technology Co Ltd
Application filed by Beijing Zitiao Network Technology Co Ltd
Priority to CN202211449018.0A
Classification (Landscapes): Processing Or Creating Images

Abstract

The disclosure provides an augmented reality-based control method and device, an electronic device, and a storage medium. The control method comprises the following steps: acquiring, in real time and from at least n different angles, n hand images of a target hand in the real world, wherein n is not less than 2; determining a virtual hand model according to the hand images acquired at the n different angles; and displaying the target hand and the virtual hand model in an augmented reality world, wherein the morphology of the virtual hand model is the same as the morphology of the target hand. The method thereby achieves adaptive matching of the virtual hand model to the user's hand.

Description

Control method and device based on augmented reality, electronic equipment and storage medium
Technical Field
The disclosure relates to the field of computer technology, and in particular to an augmented reality-based control method and device, an electronic device, and a storage medium.
Background
Augmented reality here refers to combining the real and the virtual through a computer to create a virtual environment that supports human-computer interaction, and may encompass several technologies such as virtual reality, augmented reality, and mixed reality. In augmented reality, when a hand model of the user's hand is displayed, the model is often a fixed one whose morphological features, such as size and proportions, are inconsistent with the user's real hand and cannot be adaptively matched to it.
Disclosure of Invention
The disclosure provides an augmented reality-based control method and device, an electronic device, and a storage medium.
The present disclosure adopts the following technical solutions.
In some embodiments, the present disclosure provides an augmented reality-based control method, comprising:
Acquiring n hand images of a target hand in the real world in real time from at least n different angles, wherein n is not less than 2;
Determining a virtual hand model according to the hand images acquired at n different angles;
Displaying the target hand and the virtual hand model in an augmented reality world;
wherein the morphology of the virtual hand model is the same as the morphology of the target hand.
In some embodiments, the present disclosure provides an augmented reality-based control device, comprising:
The acquisition unit is used for acquiring n hand images of a target hand in the real world in real time from at least n different angles, wherein n is not less than 2;
the control unit is used for determining a virtual hand model according to the hand images acquired by the n different angles;
A display unit for displaying the target hand and the virtual hand model in an augmented reality world; wherein the morphology of the virtual hand model is the same as the morphology of the target hand.
In some embodiments, the present disclosure provides an electronic device comprising: at least one memory and at least one processor;
The memory is used for storing program codes, and the processor is used for calling the program codes stored in the memory to execute the method.
In some embodiments, the present disclosure provides a computer readable storage medium for storing program code which, when executed by a processor, causes the processor to perform the above-described method.
In some embodiments, the present disclosure provides a computer program product comprising instructions that, when executed by a computer device, cause the computer device to perform the method of any of the present disclosure.
According to the augmented reality-based control method of the disclosure, images of the real target hand in the real world are acquired in real time from multiple angles, and the virtual hand model is adjusted according to these images so that its morphology is identical to that of the target hand, thereby achieving adaptive matching of the virtual hand model.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
Fig. 1 is a schematic diagram of the use of an augmented reality device according to an embodiment of the present disclosure.
Fig. 2 is a schematic diagram of an augmented reality-based control method according to an embodiment of the present disclosure.
Fig. 3 is a schematic layout diagram of a virtual hand model according to an embodiment of the present disclosure.
Fig. 4 is a schematic diagram of a display in an augmented reality world according to an embodiment of the present disclosure.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in different orders and/or in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a" and "an" in this disclosure are intended to be illustrative rather than limiting; those of ordinary skill in the art will appreciate that they should be construed as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
The following describes in detail the schemes provided by the embodiments of the present disclosure with reference to the accompanying drawings.
The augmented reality may be at least one of virtual reality, augmented reality, or mixed reality. Taking virtual reality as an example, as shown in fig. 1, a user may enter a virtual reality space through an intelligent terminal device such as head-mounted VR glasses, and control his or her virtual character (Avatar) in the virtual reality space to engage in social interaction, entertainment, learning, remote work, etc. with virtual characters controlled by other users.
The virtual reality space may be a simulation of the real world, a semi-simulated and semi-fictional virtual scene, or a purely fictional virtual scene. The virtual scene may be two-dimensional, 2.5-dimensional, or three-dimensional, and its dimensionality is not limited by the embodiments of the present disclosure. For example, a virtual scene may include sky, land, and sea; the land may include environmental elements such as deserts and cities; and the user may control a virtual object to move within the scene.
In one embodiment, the user may perform interactive operations in the virtual reality space through an operation device, which may be a handle; for example, the user performs operation control through the keys of the handle. Of course, in other embodiments, the target object in the virtual reality device may instead be controlled using gestures, voice, or a multi-modal control mode, without using a controller.
In some embodiments of the present disclosure, the proposed control method may be used with a virtual reality device, i.e., a terminal that achieves a virtual reality effect. It may generally be provided in the form of glasses, a head-mounted display (Head Mount Display, HMD), or contact lenses for visual perception and other forms of perception, although the form of the virtual reality device is not limited to these and may be further miniaturized or enlarged as needed.
The virtual reality devices described in embodiments of the present disclosure may include, but are not limited to, the following types:
A computer-side virtual reality (PCVR) device uses the PC to perform the computation and data output related to the virtual reality functions; the external PCVR device uses the data output by the PC to achieve the virtual reality effect.
A mobile virtual reality device supports mounting a mobile terminal (such as a smartphone) in various ways (such as a head-mounted display with a dedicated card slot). Connected to the device by wire or wirelessly, the mobile terminal performs the computation related to the virtual reality functions and outputs the data to the mobile virtual reality device, for example to watch a virtual reality video through an APP on the mobile terminal.
An integrated virtual reality device has its own processor for performing the computation related to virtual reality functions, and thus has independent virtual reality input and output capabilities; it needs no connection to a PC or mobile terminal and offers a high degree of freedom in use.
In the method, a virtual reality device can present a virtual reality image in a virtual reality space. The bottom layer of the virtual reality device often runs an operating system such as Android or iOS, and the displayed virtual reality image cannot be touched directly, so an operation event performed through the virtual reality image cannot be executed directly by the underlying system.
The augmented reality in some embodiments of the present disclosure may be AR (Augmented Reality). An AR set refers to a simulated set with at least one virtual object superimposed over a physical set or a representation thereof. For example, an electronic system may have an opaque display and at least one imaging sensor for capturing images or videos of the physical set, which are representations of the physical set. The system combines the image or video with the virtual object and displays the combination on the opaque display. An individual uses the system to view the physical set indirectly via its image or video and to observe a virtual object superimposed over the physical set. When the system captures images of a physical set using one or more image sensors and presents the AR set on an opaque display using those images, the displayed images are referred to as video pass-through. Alternatively, the electronic system for displaying the AR set may have a transparent or translucent display through which the individual may directly view the physical set; the system may display the virtual object on this display so that the individual sees the virtual object superimposed over the physical set. As another example, the system may include a projection system that projects the virtual object into the physical set, for example onto a physical surface or as a hologram, so that the individual sees the virtual object superimposed over the physical scene. Concretely, AR involves calculating, in real time, the camera pose parameters of a camera in the real world (or three-dimensional world) while the camera acquires images, and adding virtual elements to the acquired images according to those pose parameters.
Virtual elements include, but are not limited to, images, videos, and three-dimensional models. The goal of AR technology is to superimpose the virtual world over the real world on the screen and enable interaction between them.
The augmented reality in some embodiments of the present disclosure may be MR (Mixed Reality): by presenting virtual scene information in a real scene, an interactive feedback loop is established among the real world, the virtual world, and the user, enhancing the realism of the user experience. For example, computer-created sensory input (e.g., virtual objects) is integrated with sensory input from a physical set or its representation in a simulated set; in some MR sets, the computer-created sensory input may adapt to changes in the sensory input from the physical set. In addition, some electronic systems for rendering MR sets may monitor orientation and/or position relative to the physical set so that virtual objects can interact with real objects (i.e., physical elements from the physical set or representations thereof). For example, the system may monitor movement so that a virtual plant appears stationary relative to a physical building.
As shown in fig. 2, fig. 2 is a flowchart of an augmented reality-based control method according to an embodiment of the present disclosure, including:
s11: n hand images of the target hand in the real world are acquired in real time from at least n different angles, n being not less than 2.
In some embodiments, the method proposed in the embodiments of the present disclosure may be used with an augmented reality device. In some embodiments, a plurality of cameras may be configured to acquire hand images of a target hand in the real world from different angles, each hand image corresponding to a different angle. The target hand may be, for example, the hand with which the user interacts with the augmented reality world: if the user uses the right hand to control the virtual image in the augmented reality world, the target hand is the user's right hand. In some embodiments, the number of hand images may be 2 or more, for example 3 or more, and hand images may be acquired from all directions and multiple angles.
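As a sketch of this multi-angle acquisition step, the function below polls a set of camera sources once per frame and enforces the n ≥ 2 constraint. The function name, the callable-based camera interface, and the per-frame polling are illustrative assumptions, not details from the patent.

```python
from typing import Callable, List, Sequence


def capture_hand_images(cameras: Sequence[Callable[[], object]],
                        n: int = 2) -> List[object]:
    """Acquire one hand image from each of at least n differently angled cameras."""
    if n < 2:
        # The method requires images from at least two angles (n >= 2).
        raise ValueError("at least two viewing angles are required (n >= 2)")
    if len(cameras) < n:
        raise ValueError("need %d cameras, got %d" % (n, len(cameras)))
    # Poll each camera once; a real device would do this continuously
    # so the virtual hand model can be updated in real time.
    return [capture() for capture in cameras[:n]]
```

In a real headset each callable would wrap a physical camera; here any frame-producing callable serves the purpose.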
S12: and determining a virtual hand model according to the hand images acquired at n different angles.
In some embodiments, a base model may be a pre-established virtual hand model, and the base model may be adjusted in real time according to the hand images acquired at different angles. In some embodiments, the base model may instead be the virtual hand model of the previous moment; that is, the virtual hand model displayed at the previous moment is adjusted to generate the virtual hand model for the current moment. The virtual hand model may thus be displayed in real time throughout the user's use of the augmented reality device.
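The per-frame adjustment described above, in which the previous moment's model serves as the base for the current moment, could be sketched as a blend of the previous model toward newly measured morphology values. The dictionary representation of morphology and the blending rate are illustrative assumptions; the patent does not prescribe an update rule.

```python
def update_hand_model(previous_model: dict, measurements: dict,
                      rate: float = 0.5) -> dict:
    """Use the previous moment's model as the base and blend it toward the
    morphology measured from the current frame's multi-angle images."""
    updated = {}
    for feature, measured in measurements.items():
        # A feature never seen before adopts the measurement directly.
        base = previous_model.get(feature, measured)
        updated[feature] = base + rate * (measured - base)
    return updated
```

Called once per frame, this keeps the displayed model converging toward the target hand's measured morphology.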
S13: the target hand and the virtual hand model are shown in the augmented reality world.
In some embodiments, the morphology of the virtual hand model is the same as the morphology of the target hand. In some embodiments, the user may perform bare-hand interaction while using the augmented reality device, that is, interaction without any external device; to implement bare-hand interaction, the user interacts with virtual components in the augmented reality world through the virtual hand model. During use of the augmented reality device, the virtual hand model is therefore displayed in real time according to the user's target hand, and the model is the same as the user's target hand in the real world, so the user can interact with the augmented reality world very naturally through the virtual hand model. In some embodiments, the morphology of the target hand includes one or more of hand shape, hand size, hand thickness, and hand skin tone, and the morphology of the virtual hand model corresponds to the morphology of the target hand.
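The morphological features listed above could be grouped, purely as an illustrative assumption, into a small record type whose equality check expresses the requirement that model and hand share the same morphology; the field names and units are not from the patent.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class HandMorphology:
    """Morphological features mirrored from the target hand; the field names
    and units are illustrative, not specified by the patent."""
    shape: str
    size_cm: float
    thickness_cm: float
    skin_tone: str

    def matches(self, other: "HandMorphology") -> bool:
        # The virtual hand model's morphology must equal the target hand's.
        return self == other
```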
In some embodiments of the present disclosure, the form of the virtual hand model displayed in the augmented reality world is not fixed: morphological features such as the size, thickness, and skin color of the virtual hand model are adjusted in real time according to the real morphological features of the user's target hand, that is, matched adaptively to the target hand. Multi-angle image acquisition collects morphological data of the target hand that is as full and comprehensive as possible, so that the form of the virtual hand model can be adapted to that of the target hand.
In some embodiments of the present disclosure, determining a virtual hand model from the hand images acquired at n different angles includes: and determining the structural characteristics of different areas of the target hand according to the hand image, and determining the wiring density of the corresponding area of the virtual hand model according to the structural characteristics of the different areas of the target hand.
In some embodiments, the virtual hand model carries wiring (mesh topology lines), and this wiring is not uniform; it is related to the structural features of the target hand, which include, for example, shape and surface undulation. A region of one part of the target hand corresponds to the region of the same part of the virtual hand, and if the structural features of different regions of the target hand differ, the wiring densities of the corresponding regions of the virtual hand differ as well. In the embodiments of the disclosure, the wiring of the model is distributed reasonably according to the structural features, achieving an optimal solution for multi-angle modeling.
In some embodiments, determining the wiring density of the region corresponding to the virtual hand model according to the structural features of the different regions of the target hand includes: determining corresponding flatness according to the structural characteristics of different areas of the target hand; determining the wiring density of the corresponding area of the virtual hand model according to the flatness of the different areas of the target hand; wherein the lower the flatness is, the higher the wiring density is.
In some embodiments, the wiring of the virtual hand model is adjusted according to the flatness of the different regions. In a flat region with high flatness, the shape varies little, so the wiring of the virtual hand model can be relatively sparse; in a region with low flatness (a rugged region), the shape varies greatly and must be described more finely, so denser wiring is needed for the virtual hand model to take the same shape as the real target hand. Reasonable wiring thus achieves optimal modeling. The left and right sides of fig. 3 show the wiring of the virtual hand model before and after applying the method of this embodiment: before, the wiring of the finger portion is relatively uniform; after, it is non-uniform.
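The inverse relation between flatness and wiring density could be sketched as follows. The reciprocal rule and the base density value are assumptions chosen only to exhibit "lower flatness, higher density"; the patent specifies no formula.

```python
def wiring_density(flatness: float, base_density: float = 10.0) -> float:
    """Map a region's flatness in (0, 1] to a mesh wiring density:
    the lower the flatness (the more rugged the region), the higher
    the density. The reciprocal mapping is an illustrative assumption."""
    if not 0.0 < flatness <= 1.0:
        raise ValueError("flatness expected in (0, 1]")
    return base_density / flatness
```

Under this rule a knuckle region with flatness 0.25 receives four times the wiring of a fully flat palm region.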
In some embodiments of the present disclosure, determining a virtual hand model from the hand images acquired at n different angles includes: analyzing the hand images acquired by the n different angles to determine the morphology of different areas of the target hand; and determining the morphology of the region corresponding to the virtual hand model according to the morphology of the different regions of the target hand.
In some embodiments, the hand images that can be captured from a single angle are limited, so images of every region of the target hand cannot be obtained from one angle alone; that is, the form of every region of the hand cannot be determined. With n hand images from different angles, hand images of the different regions can be captured, so that each region of the virtual hand model takes the same form as the corresponding region of the target hand.
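Combining per-angle observations into one full per-region description might, under the assumption that regions are keyed measurements and overlapping views are averaged, look like the following; both assumptions are illustrative.

```python
def merge_region_morphology(per_angle_regions):
    """Combine per-region measurements observed from different angles into a
    single morphology map; a region seen from several angles keeps the mean
    of its measurements (averaging is an illustrative choice)."""
    totals, counts = {}, {}
    for regions in per_angle_regions:
        for region, value in regions.items():
            totals[region] = totals.get(region, 0.0) + value
            counts[region] = counts.get(region, 0) + 1
    return {region: totals[region] / counts[region] for region in totals}
```

Regions visible from only one angle are carried through unchanged, so the merged map covers the union of everything the n cameras saw.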
In some embodiments of the present disclosure, before the target hand and the virtual hand model are presented in the augmented reality world, further comprising: acquiring the real position of the target hand in the real world; presenting the target hand and the virtual hand model in an augmented reality world, comprising: and determining the display position of the virtual hand model according to the real position.
In some embodiments, in the augmented reality world, the display position of the virtual hand model may be related to the real-time position of the target hand in the real world. For example, the virtual hand model may be displayed at the real-time position of the target hand or at its peripheral side. When the display position lies at the peripheral side of the target hand, a preset offset may be added to the position of the target hand to obtain the display position of the virtual hand model. The preset offset may be related to the size of the target hand; for example, the larger the target hand, the larger the preset offset, so that the target hand and the image of the virtual hand model do not interfere with each other. Alternatively, in the augmented reality world, the virtual hand model may be overlaid directly on the user's target hand. Fig. 4 schematically illustrates the display effect of the virtual hand model and the target hand in the augmented reality world in this embodiment: the shape of the virtual hand model is consistent with that of the real target hand and lies adjacent to the target hand, and the user controls the virtual hand model by controlling his or her own target hand.
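The size-dependent preset offset could be sketched as a linear rule. The scale factor k and the choice to offset along a single axis are illustrative assumptions; the patent only states that the offset grows with hand size.

```python
def display_position(real_position, hand_size_cm, k=0.1):
    """Place the virtual hand model beside the target hand: add a preset
    offset that grows with hand size so the two images do not interfere.
    k and the single-axis offset are illustrative assumptions."""
    x, y, z = real_position
    offset = k * hand_size_cm
    return (x + offset, y, z)
```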
In some embodiments of the present disclosure, the method further comprises: determining whether the target hand and the virtual hand model are displayed in the augmented reality world to be synchronous, if the target hand and the virtual hand model are not synchronous, acquiring a hand image of the target hand, and adjusting the virtual hand model according to the acquired hand image.
In some embodiments, in the augmented reality world, the image of the virtual hand model is compared with that of the real target hand. If the two displayed forms are inconsistent, that is, they fall out of synchronization, acquisition of a hand image of the target hand is actively triggered and the virtual hand model is adjusted accordingly, ensuring that the two stay synchronized.
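The synchronization check and re-acquisition trigger might be sketched as follows; the drift tolerance and the dictionary-based morphology comparison are assumptions for illustration only.

```python
def ensure_synchronized(target_morphology, model_morphology, reacquire,
                        tolerance=1e-3):
    """Compare the displayed model with the real target hand; if any feature
    has drifted beyond the tolerance, trigger a fresh capture (reacquire)
    and adopt the re-measured morphology."""
    drifted = any(abs(model_morphology.get(feature, 0.0) - value) > tolerance
                  for feature, value in target_morphology.items())
    return reacquire() if drifted else model_morphology
```

Here `reacquire` stands in for the camera pipeline that re-images the hand; when model and hand agree within tolerance, no capture is triggered.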
In some embodiments of the present disclosure, multi-angle image acquisition is analyzed and the form of the target hand is finally mapped onto the virtual hand model; real-time modeling and real-time position capture are performed. During real-time modeling, multi-angle real-time image acquisition enables reasonable model wiring, and through real-time image acquisition and real-time position acquisition, the form and position of the virtual model are synchronized in real time. This achieves real-time form and position adaptation of the virtual hand model and improves the rationality of the modeling.
The present disclosure also proposes a control device based on augmented reality, comprising:
The acquisition unit is used for acquiring n hand images of a target hand in the real world in real time from at least n different angles, wherein n is not less than 2;
the control unit is used for determining a virtual hand model according to the hand images acquired by the n different angles;
A display unit for displaying the target hand and the virtual hand model in an augmented reality world; wherein the morphology of the virtual hand model is the same as the morphology of the target hand.
In some embodiments, determining a virtual hand model from the hand images acquired at n different angles comprises: and determining the structural characteristics of different areas of the target hand according to the hand image, and determining the wiring density of the corresponding area of the virtual hand model according to the structural characteristics of the different areas of the target hand.
In some embodiments, determining the wiring density of the region corresponding to the virtual hand model according to the structural features of the different regions of the target hand includes:
determining corresponding flatness according to the structural characteristics of different areas of the target hand;
Determining the wiring density of the corresponding area of the virtual hand model according to the flatness of the different areas of the target hand; wherein the lower the flatness is, the higher the wiring density is.
In some embodiments, determining a virtual hand model from the hand images acquired at n different angles comprises: analyzing the hand images acquired by the n different angles to determine the morphology of different areas of the target hand; and determining the morphology of the region corresponding to the virtual hand model according to the morphology of the different regions of the target hand.
In some embodiments, the morphology of the target hand comprises: one or more of hand shape, hand size, hand thickness, hand skin tone.
In some embodiments, the obtaining unit is further configured to obtain a real position of the target hand in the real world; presenting the target hand and the virtual hand model in an augmented reality world, comprising: and determining the display position of the virtual hand model according to the real position.
In some embodiments, the control unit is further configured to determine whether the target hand and the virtual hand model are displayed in the augmented reality world in synchronization, and if the target hand and the virtual hand model are not in synchronization, acquire a hand image of the target hand and adjust the virtual hand model according to the acquired hand image.
For embodiments of the device, reference is made to the description of method embodiments for the relevant points, since they essentially correspond to the method embodiments. The apparatus embodiments described above are merely illustrative, wherein the modules illustrated as separate modules may or may not be separate. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
The method and apparatus of the present disclosure are described above based on the embodiments and applications. In addition, the present disclosure also provides an electronic device and a computer-readable storage medium, which are described below.
Referring now to fig. 5, a schematic diagram of an electronic device (e.g., a terminal device or server) 800 suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in the drawings is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
The electronic device 800 may include a processing means (e.g., a central processor, a graphics processor, etc.) 801 that may perform various appropriate actions and processes in accordance with programs stored in a Read Only Memory (ROM) 802 or loaded from a storage 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data required for the operation of the electronic device 800 are also stored. The processing device 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
In general, the following devices may be connected to the I/O interface 805: input devices 806 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, and the like; output devices 807 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, etc.; storage 808 including, for example, magnetic tape, hard disk, etc.; and communication means 809, which may allow the electronic device 800 to communicate wirelessly or by wire with other devices to exchange data. While an electronic device 800 having various means is shown, it is to be understood that not all illustrated means are required to be implemented or provided; more or fewer means may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via communication device 809, or installed from storage device 808, or installed from ROM 802. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 801.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed networks.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the methods of the present disclosure described above.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. The names of the units do not, in some cases, constitute a limitation of the units themselves.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, there is provided an augmented reality-based control method including:
Acquiring n hand images of a target hand in the real world in real time from at least n different angles, wherein n is not less than 2;
Determining a virtual hand model according to the hand images acquired at n different angles;
Displaying the target hand and the virtual hand model in an augmented reality world;
wherein the morphology of the virtual hand model is the same as the morphology of the target hand.
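The three steps above can be sketched as a minimal pipeline. The function and attribute names below (`capture_views`, `reconstruct_hand`, `HandModel`) are illustrative stand-ins, not APIs from the disclosure; the reconstruction logic is a placeholder under the assumption that fusing n ≥ 2 views yields a model whose morphology matches the target hand.

```python
# Hypothetical sketch of the claimed pipeline: capture n >= 2 views of the
# real-world target hand, reconstruct a matching virtual hand model, and
# hold both ready for display in the augmented reality world.
from dataclasses import dataclass


@dataclass
class HandModel:
    shape: str
    size_mm: float


def capture_views(n: int) -> list[str]:
    """Acquire n hand images from n different camera angles (n >= 2)."""
    if n < 2:
        raise ValueError("the method requires at least 2 viewing angles")
    return [f"view_{i}" for i in range(n)]


def reconstruct_hand(views: list[str]) -> HandModel:
    """Fuse the multi-angle views into a virtual hand model whose
    morphology matches the target hand (placeholder logic)."""
    return HandModel(shape="matched-to-target", size_mm=180.0)


views = capture_views(3)
model = reconstruct_hand(views)
```

Requiring at least two angles is what lets the reconstruction recover depth and thickness rather than a flat silhouette.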
According to one or more embodiments of the present disclosure, there is provided an augmented reality-based control method for determining a virtual hand model from the hand images acquired at n different angles, including:
determining the structural characteristics of different areas of the target hand according to the hand images, and determining the wiring density of the corresponding areas of the virtual hand model according to the structural characteristics of the different areas of the target hand.
According to one or more embodiments of the present disclosure, there is provided an augmented reality-based control method for determining a wiring density of a region corresponding to the virtual hand model according to structural features of different regions of the target hand, including:
determining corresponding flatness according to the structural characteristics of different areas of the target hand;
determining the wiring density of the corresponding area of the virtual hand model according to the flatness of the different areas of the target hand; wherein the lower the flatness, the higher the wiring density.
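The inverse flatness-to-density relationship can be illustrated with a small sketch. The reciprocal formula and the flatness values per region are assumptions for illustration only; the disclosure specifies only that lower flatness yields higher wiring (mesh) density.

```python
# Minimal sketch of the claimed inverse mapping: areas with lower flatness
# (more curvature and detail, e.g. knuckles) receive a higher mesh
# ("wiring") density than flat areas such as the back of the hand.
def wiring_density(flatness: float, base: float = 100.0) -> float:
    """Return a mesh density that decreases as flatness increases.

    flatness: value in (0, 1], where 1.0 means a perfectly flat area.
    """
    if not 0.0 < flatness <= 1.0:
        raise ValueError("flatness must be in (0, 1]")
    return base / flatness  # lower flatness -> higher density


# Hypothetical per-area flatness values for a target hand.
areas = {"knuckle": 0.25, "palm": 0.8, "back_of_hand": 1.0}
densities = {name: wiring_density(f) for name, f in areas.items()}
# The knuckle area ends up denser than the flat back of the hand.
```

Concentrating mesh resolution where the surface is least flat keeps the model faithful at joints without wasting triangles on flat regions.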
According to one or more embodiments of the present disclosure, there is provided an augmented reality-based control method for determining a virtual hand model from the hand images acquired at n different angles, including:
analyzing the hand images acquired at the n different angles to determine the morphology of different areas of the target hand;
and determining the morphology of the region corresponding to the virtual hand model according to the morphology of the different regions of the target hand.
According to one or more embodiments of the present disclosure, there is provided an augmented reality-based control method, the morphology of the target hand including: one or more of hand shape, hand size, hand thickness, and hand skin tone.
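Copying per-area morphology from the target hand onto the corresponding areas of the virtual model can be sketched as follows. The area names and attribute keys (`size_mm`, `thickness_mm`, `skin_tone`) are hypothetical; the disclosure only enumerates the kinds of morphology involved.

```python
# Illustrative sketch: morphology attributes recovered per area from the
# multi-angle images are transferred to the matching areas of the virtual
# hand model, so the two share hand shape, size, thickness and skin tone.
target_morphology = {
    "palm":   {"size_mm": 95, "thickness_mm": 28, "skin_tone": "#c68863"},
    "finger": {"size_mm": 70, "thickness_mm": 16, "skin_tone": "#c68863"},
}


def apply_morphology(target: dict) -> dict:
    """Build the virtual model's areas as independent copies of the
    target hand's areas, satisfying the same-morphology requirement."""
    return {area: dict(attrs) for area, attrs in target.items()}


virtual_morphology = apply_morphology(target_morphology)
```

The copies are independent dictionaries, so later adjustments to the model do not mutate the stored target measurements.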
According to one or more embodiments of the present disclosure, there is provided an augmented reality-based control method, before the target hand and the virtual hand model are displayed in the augmented reality world, further comprising:
Acquiring the real position of the target hand in the real world;
presenting the target hand and the virtual hand model in an augmented reality world, comprising: and determining the display position of the virtual hand model according to the real position.
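Deriving the display position from the real-world position can be sketched as a coordinate-frame transform. The simple translation by a world origin below is an assumption for illustration; a real AR runtime would apply its own world-to-display transform.

```python
# Hedged sketch: the target hand's real-world position is mapped into the
# augmented reality world's coordinate frame, and the virtual hand model
# is displayed at the resulting position.
def to_display_position(real_pos: tuple[float, float, float],
                        world_origin: tuple[float, float, float]) -> tuple:
    """Map a real-world position into the AR world's coordinate frame
    (illustrative translation-only transform)."""
    return tuple(r - o for r, o in zip(real_pos, world_origin))


real_hand_pos = (1.20, 0.95, 0.40)   # metres, hypothetical tracker output
display_pos = to_display_position(real_hand_pos, world_origin=(1.0, 0.5, 0.0))
```

Anchoring the model to the tracked real position is what keeps the virtual hand overlaid on the user's actual hand as it moves.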
According to one or more embodiments of the present disclosure, there is provided an augmented reality-based control method, further comprising: determining whether the display of the target hand and the virtual hand model in the augmented reality world is synchronized and, if they are not synchronized, acquiring a hand image of the target hand and adjusting the virtual hand model according to the acquired hand image.
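The synchronization check and corrective re-acquisition can be sketched as below. Comparing poses joint-by-joint against a tolerance is an assumed criterion; the disclosure states only that a mismatch triggers re-acquisition and adjustment.

```python
# Sketch of the claimed resynchronization step: if the displayed virtual
# hand no longer matches the target hand's pose, a fresh hand image is
# "acquired" and the model is adjusted to match it.
def is_synchronized(target_pose: list[float], model_pose: list[float],
                    tol: float = 0.05) -> bool:
    """Consider the hands synchronized when every joint angle differs by
    less than `tol` radians (illustrative threshold)."""
    return all(abs(t - m) < tol for t, m in zip(target_pose, model_pose))


def resync(target_pose: list[float], model_pose: list[float]) -> list[float]:
    """Return the (possibly adjusted) model pose: unchanged when in sync,
    otherwise replaced by the pose from the newly acquired hand image."""
    if is_synchronized(target_pose, model_pose):
        return model_pose
    return list(target_pose)


model_pose = resync([0.10, 0.50, 1.20], [0.30, 0.50, 1.20])
```

In this sketch the first joint differs by 0.20 rad, exceeding the tolerance, so the model pose is snapped back to the target pose.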
According to one or more embodiments of the present disclosure, there is provided an augmented reality-based control apparatus including: the acquisition unit is used for acquiring n hand images of a target hand in the real world in real time from at least n different angles, wherein n is not less than 2;
the control unit is used for determining a virtual hand model according to the hand images acquired at the n different angles;
A display unit for displaying the target hand and the virtual hand model in an augmented reality world; wherein the morphology of the virtual hand model is the same as the morphology of the target hand.
According to one or more embodiments of the present disclosure, there is provided an electronic device including: at least one memory and at least one processor;
Wherein the at least one memory is configured to store program code, and the at least one processor is configured to invoke the program code stored by the at least one memory to perform any of the methods described above.
According to one or more embodiments of the present disclosure, a computer-readable storage medium is provided for storing program code which, when executed by a processor, causes the processor to perform the above-described method.
According to one or more embodiments of the present disclosure, there is provided a computer program product comprising instructions which, when executed by a computer device, cause the computer device to perform the method of any of the embodiments of the present disclosure.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to the specific combinations of the features described above, but also covers other embodiments formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, embodiments formed by substituting the above features with (but not limited to) technical features having similar functions disclosed in the present disclosure.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (11)

1. An augmented reality-based control method, comprising:
Acquiring n hand images of a target hand in the real world in real time from at least n different angles, wherein n is not less than 2;
Determining a virtual hand model according to the hand images acquired at n different angles;
Displaying the target hand and the virtual hand model in an augmented reality world;
wherein the morphology of the virtual hand model is the same as the morphology of the target hand.
2. The method of claim 1, wherein determining a virtual hand model from the hand images acquired at n different angles comprises:
determining the structural characteristics of different areas of the target hand according to the hand images, and determining the wiring density of the corresponding areas of the virtual hand model according to the structural characteristics of the different areas of the target hand.
3. The method of claim 2, wherein determining the routing density of the region corresponding to the virtual hand model based on the structural features of the different regions of the target hand comprises:
determining corresponding flatness according to the structural characteristics of different areas of the target hand;
determining the wiring density of the corresponding area of the virtual hand model according to the flatness of the different areas of the target hand; wherein the lower the flatness is, the higher the wiring density is.
4. The method of claim 1, wherein determining a virtual hand model from the hand images acquired at n different angles comprises:
analyzing the hand images acquired at the n different angles to determine the morphology of different areas of the target hand;
and determining the morphology of the region corresponding to the virtual hand model according to the morphology of the different regions of the target hand.
5. The method of claim 1, wherein the morphology of the target hand includes: one or more of hand shape, hand size, hand thickness, and hand skin tone.
6. The method of claim 1, further comprising, prior to exposing the target hand and the virtual hand model in the augmented reality world:
Acquiring the real position of the target hand in the real world;
presenting the target hand and the virtual hand model in an augmented reality world, comprising: and determining the display position of the virtual hand model according to the real position.
7. The method as recited in claim 1, further comprising:
Determining whether the display of the target hand and the virtual hand model in the augmented reality world is synchronized and, if they are not synchronized, acquiring a hand image of the target hand and adjusting the virtual hand model according to the acquired hand image.
8. An augmented reality-based control device, comprising:
The acquisition unit is used for acquiring n hand images of a target hand in the real world in real time from at least n different angles, wherein n is not less than 2;
the control unit is used for determining a virtual hand model according to the hand images acquired at the n different angles;
A display unit for displaying the target hand and the virtual hand model in an augmented reality world; wherein the morphology of the virtual hand model is the same as the morphology of the target hand.
9. An electronic device, comprising:
at least one memory and at least one processor;
wherein the at least one memory is configured to store program code, and the at least one processor is configured to invoke the program code stored by the at least one memory to perform the method of any of claims 1 to 7.
10. A computer readable storage medium for storing program code which, when executed by a processor, causes the processor to perform the method of any one of claims 1 to 7.
11. A computer program product, characterized in that it comprises instructions which, when executed by a computer device, cause the computer device to perform the method according to any of claims 1 to 7.
CN202211449018.0A 2022-11-18 2022-11-18 Control method and device based on augmented reality, electronic equipment and storage medium Pending CN118057466A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211449018.0A CN118057466A (en) 2022-11-18 2022-11-18 Control method and device based on augmented reality, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211449018.0A CN118057466A (en) 2022-11-18 2022-11-18 Control method and device based on augmented reality, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN118057466A true CN118057466A (en) 2024-05-21

Family

ID=91068868

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211449018.0A Pending CN118057466A (en) 2022-11-18 2022-11-18 Control method and device based on augmented reality, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN118057466A (en)

Similar Documents

Publication Publication Date Title
CN112933599A (en) Three-dimensional model rendering method, device, equipment and storage medium
US20230405475A1 (en) Shooting method, apparatus, device and medium based on virtual reality space
CN116563740A (en) Control method and device based on augmented reality, electronic equipment and storage medium
CN117319725A (en) Subtitle display method, device, equipment and medium
CN118057466A (en) Control method and device based on augmented reality, electronic equipment and storage medium
WO2024016880A1 (en) Information interaction method and apparatus, and electronic device and storage medium
US20240078734A1 (en) Information interaction method and apparatus, electronic device and storage medium
CN118038772A (en) Display control method, device, terminal and storage medium based on augmented reality
US20230377248A1 (en) Display control method and apparatus, terminal, and storage medium
CN117745981A (en) Image generation method, device, electronic equipment and storage medium
CN117519457A (en) Information interaction method, device, electronic equipment and storage medium
CN118343924A (en) Virtual object motion processing method, device, equipment and medium
CN117788755A (en) Control method and device based on augmented reality, electronic equipment and storage medium
CN117765207A (en) Virtual interface display method, device, equipment and medium
CN117899456A (en) Display processing method, device, equipment and medium of two-dimensional assembly
CN117435041A (en) Information interaction method, device, electronic equipment and storage medium
CN117519456A (en) Information interaction method, device, electronic equipment and storage medium
CN117641040A (en) Video processing method, device, electronic equipment and storage medium
CN117991889A (en) Information interaction method, device, electronic equipment and storage medium
CN117749964A (en) Image processing method, device, electronic equipment and storage medium
CN117376591A (en) Scene switching processing method, device, equipment and medium based on virtual reality
CN117640919A (en) Picture display method, device, equipment and medium based on virtual reality space
CN117641026A (en) Model display method, device, equipment and medium based on virtual reality space
CN117687499A (en) Virtual object interaction processing method, device, equipment and medium
CN118135090A (en) Grid alignment method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination