CN112365607A - Augmented reality AR interaction method, device, equipment and storage medium

Info

Publication number: CN112365607A
Application number: CN202011230667.2A
Authority: CN (China)
Prior art keywords: component, target, attribute information, installation area, target installation
Legal status: Withdrawn
Other languages: Chinese (zh)
Inventors: 侯欣如, 李园园
Current Assignee: Beijing Sensetime Technology Development Co Ltd
Original Assignee: Beijing Sensetime Technology Development Co Ltd
Priority date: 2020-11-06
Filing date: 2020-11-06
Publication date: 2021-02-12
Application filed by Beijing Sensetime Technology Development Co Ltd
Priority to CN202011230667.2A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present disclosure provides a method, an apparatus, a device and a storage medium for Augmented Reality (AR) interaction, wherein the method comprises: acquiring a target image of a working site acquired by AR equipment; identifying attribute information of a component to be mounted in the target image; determining target installation equipment matched with the attribute information of the component to be installed and a target installation area on the target installation equipment; and displaying, through the AR equipment, AR prompt content for indicating the target installation area corresponding to the component to be installed.

Description

Augmented reality AR interaction method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of Augmented Reality (AR) technologies, and in particular, to a method, an apparatus, a device, and a storage medium for AR interaction.
Background
AR technology superimposes corresponding images, videos and 3-Dimensional (3D) models onto a camera image according to the camera's position and angle, calculated in real time, so as to fuse the virtual world with the real world. It provides users with a new interactive experience and is therefore widely applied in technical fields such as consumer, medical and industrial applications.
Taking a factory work scene as an example, when AR technology is applied to this scene, the AR interaction experience is poor because of the lack of processing tailored to the specific work scene.
Disclosure of Invention
The embodiment of the disclosure at least provides a method, a device, equipment and a storage medium for augmented reality AR interaction.
In a first aspect, an embodiment of the present disclosure provides a method for augmented reality AR interaction, including:
acquiring a target image of a working site acquired by AR equipment;
identifying attribute information of a component to be mounted in the target image;
determining target installation equipment matched with the attribute information of the component to be installed and a target installation area on the target installation equipment;
and displaying AR prompt content for indicating the target installation area corresponding to the component to be installed through the AR equipment.
With the above augmented reality AR interaction method, when the attribute information of the component to be installed is identified from the target image, the target installation area can be determined from the target installation equipment based on that attribute information. Guided by the AR prompt content, the user can then install the component to be installed in the determined target installation area, which improves the interactive experience between the user and the AR equipment on the premise of fusing the AR virtual environment with the real scene. Moreover, in an actual industrial scene, superimposing the AR prompt content of the target installation area corresponding to the component to be installed on the acquired target image effectively guides the industrial operation process, such as the component installation process.
In one possible embodiment, the attribute information of the component to be mounted includes at least one of: category, size, model, name.
In a possible embodiment, the identifying the attribute information of the component to be mounted in the target image includes:
extracting feature information of the component to be mounted from the target image;
searching a standard mounting component matched with the component to be mounted from the database based on the extracted feature information and feature information of each standard mounting component stored in the database;
and determining the searched attribute information of the standard installation component as the attribute information of the component to be installed.
In the embodiments of the present disclosure, the database may store the feature information of the standard installation components in advance. Therefore, once the feature information of the component to be installed is extracted from the target image, the associated attribute information can be determined by feature comparison; that is, the attribute information of the component to be installed can be determined rapidly by using the comparison capability of the database. The target installation area can thus be determined quickly and accurately, and the AR prompt content can be displayed promptly for effective guidance, which further improves the interactive experience between the user and the AR device.
In a possible embodiment, the identifying the attribute information of the component to be mounted in the target image includes:
identifying attribute information of the component to be installed in the target image based on a pre-trained neural network for identifying the component to be installed; the neural network is obtained by training an image sample based on labeled component identifications, and each component identification has corresponding attribute information.
The attribute information of the component to be installed in the target image can be determined by the neural network. Because the neural network is trained in advance, the attribute information can be obtained simply by inputting the target image into the network, which keeps the operation simple, saves time and improves efficiency.
In a possible embodiment, the identifying the attribute information of the component to be mounted in the target image includes:
carrying out contour detection on the component to be installed in the target image to obtain a contour detection result;
determining that the attribute information of the component to be mounted includes the contour detection result.
Here, considering the influence of contour features on the attributes of the component to be mounted, contour detection may be performed on the component first, and the attribute information determined from the contour detection result; for example, the size and type of the component may be determined. This enriches the content of the attribute information and provides data support for determining the target mounting area more accurately later.
In a possible embodiment, the component to be mounted is a plurality of components, and the method further includes:
after displaying AR prompt content for indicating the target installation area corresponding to one of the components to be installed through the AR equipment, determining target installation equipment matched with attribute information of the next component to be installed and the target installation area on the target installation equipment based on installation flow information among a plurality of components to be installed under the condition that the installation of one of the components to be installed is detected to be completed based on a currently acquired target image;
and displaying AR prompt content for indicating the target installation area corresponding to the next component to be installed through the AR equipment.
Here, for a specific scenario there may be a specific installation order among different components to be installed; for example, for three components A, B and C, the installation order may be A -> B -> C. In this case, the embodiments of the present disclosure may also combine the installation flow information of the multiple components to be installed and display the AR prompt content corresponding to each component in operation order, thereby effectively guiding the user through the installation flow.
In a possible implementation manner, the presenting, by the AR device, AR prompt content for indicating the target installation area corresponding to the component to be installed includes:
determining image position information of the target installation area in the target image according to map position information of the target installation area in a three-dimensional map corresponding to the operation site;
displaying, by the AR device, AR prompt content indicating the target installation area corresponding to the component to be installed based on the determined image position information.
Here, the display position of the AR prompt content pointing to the target installation area on the display screen of the AR device may be determined based on a conversion relationship between a world coordinate system in which the target installation area is located and an image coordinate system in which the real scene image is located, so that the displayed AR special effect is directed to the component to be installed, thereby effectively guiding the user to complete the component installation process.
In a possible implementation manner, the presenting, by the AR device, AR prompt content for indicating the target installation area corresponding to the component to be installed includes:
displaying, by the AR device, the AR special effect marked with the target installation area; and/or,
displaying, by the AR device, an AR special effect that includes a location pointing to the target installation area.
In the embodiments of the present disclosure, the AR prompt content related to the target installation area can be displayed through multiple AR special-effect display modes. An AR special effect marking the target installation area can highlight that area, while an AR special effect pointing to the position of the target installation area can highlight the component to be installed, so as to provide effective guidance and further improve the AR interaction experience.
In a second aspect, an embodiment of the present disclosure further provides an apparatus for augmented reality AR interaction, including:
the acquisition module is used for acquiring a target image of a working site acquired by the AR equipment;
the identification module is used for identifying the attribute information of the component to be installed in the target image;
the determining module is used for determining target installation equipment matched with the attribute information of the component to be installed and a target installation area on the target installation equipment;
and the display module is used for displaying the AR prompt content used for indicating the target installation area corresponding to the component to be installed through the AR equipment.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor being configured to execute the machine-readable instructions stored in the memory, the processor and the memory communicating via the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the steps of the method of augmented reality AR interaction according to the first aspect and any of its various embodiments.
In a fourth aspect, this disclosure also provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by an electronic device, the electronic device performs the steps of the method for augmented reality AR interaction according to the first aspect and any one of the various implementation manners thereof.
For the description of the effects of the above augmented reality AR interaction apparatus, device, and computer-readable storage medium, reference is made to the description of the above augmented reality AR interaction method, which is not repeated herein.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required in the embodiments are briefly described below. The drawings here are incorporated in and form a part of the specification; they illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the technical solutions of the present disclosure. It should be appreciated that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those skilled in the art can derive additional related drawings from them without creative effort.
Fig. 1 shows a flowchart of a method for augmented reality AR interaction provided by an embodiment of the present disclosure;
fig. 2 shows a flowchart of another method for augmented reality AR interaction provided by an embodiment of the present disclosure;
fig. 3 shows a flowchart of another method for augmented reality AR interaction provided by an embodiment of the present disclosure;
FIG. 4 shows a scene schematic diagram of a method for augmented reality AR interaction provided by an embodiment of the present disclosure;
fig. 5 shows a schematic diagram of an apparatus for augmented reality AR interaction provided by an embodiment of the present disclosure;
fig. 6 shows a schematic diagram of an electronic device provided by an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more clear, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only a part of the embodiments of the present disclosure, not all of the embodiments. The components of embodiments of the present disclosure, as generally described and illustrated herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure is not intended to limit the scope of the disclosure, as claimed, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the disclosure without making creative efforts, shall fall within the protection scope of the disclosure.
Research shows that when the AR technology is applied to a factory work scene, the AR interaction experience is low due to the lack of relevant processing aiming at a specific work scene.
Based on the research, the present disclosure provides a method, an apparatus, a device and a storage medium for augmented reality AR interaction, which improve AR interaction experience.
The above-mentioned drawbacks were identified by the inventor through practice and careful study; therefore, the discovery of these problems and the solutions proposed by the present disclosure for them should be regarded as the inventor's contribution in the course of the present disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
To facilitate understanding of the present embodiments, a method of AR interaction disclosed in the embodiments of the present disclosure is first described in detail. The execution subject of the AR interaction method provided in the embodiments of the present disclosure is generally an electronic device with a certain computing capability, such as a user terminal, a server or another processing device. The server may, for example, be a server connected to the user terminal; the user terminal may be a tablet computer, a smart phone, a smart wearable device, an AR device (e.g., AR glasses or an AR helmet) or another device having a display function and data processing capability, and may be connected to the server through an application program. In some possible implementations, the method of AR interaction may be implemented by a processor invoking computer-readable instructions stored in a memory.
The following describes a method of AR interaction provided by the embodiments of the present disclosure.
Referring to fig. 1, a flowchart of a method for AR interaction provided in an embodiment of the present disclosure is shown, where the method includes steps S101 to S104, where:
s101, acquiring a target image of a working site acquired by AR equipment;
s102, identifying attribute information of a component to be installed in a target image;
s103, determining target installation equipment matched with the attribute information of the component to be installed and a target installation area on the target installation equipment;
and S104, displaying AR prompt contents used for indicating a target installation area corresponding to the component to be installed through the AR equipment.
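As an illustration of the flow of steps S101 to S104, the following is a minimal Python sketch. Every helper function used here (capture_target_image, identify_attributes, lookup_target_area, render_ar_prompt) is a hypothetical placeholder introduced for illustration only and is not an API defined by the disclosure.

```python
# Minimal sketch of the S101-S104 flow; all helpers are hypothetical placeholders.

def ar_installation_guidance(ar_device, database):
    # S101: acquire a target image of the working site from the AR device
    target_image = capture_target_image(ar_device)

    # S102: identify attribute information of the component to be installed,
    # e.g. {"category": "screw", "model": "M6x20"}
    attributes = identify_attributes(target_image)

    # S103: match the attributes against pre-stored correspondences to obtain
    # the target installation equipment and the target installation area on it
    device_id, area = lookup_target_area(attributes, database)

    # S104: overlay AR prompt content (e.g. a highlighted region or a 3D
    # virtual arrow) indicating the target installation area
    render_ar_prompt(ar_device, target_image, device_id, area)
```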
The AR interaction method provided by the embodiments of the present disclosure is mainly applicable to interaction scenarios in factory operations. For example, when an operator holds a component to be installed at a working site, the method guides the operator to install the component at a specific position on a specific device, giving a good AR interaction experience. In an actual industrial scene, superimposing the AR prompt content of the target installation area corresponding to the component to be installed on the collected target image effectively guides the industrial operation process, such as the component installation process.
In order to facilitate guiding an operator to mount a component to be mounted to a specific position on a specific device, the AR interaction method provided by the embodiment of the disclosure may be implemented by using AR prompt content displayed by the AR device and capable of indicating a target mounting area corresponding to the component to be mounted.
Before displaying the AR prompt content, the embodiments of the present disclosure may determine the target installation area corresponding to the component to be installed. In implementations of the present disclosure, the attribute information of the component to be installed in the target image may be identified first, and the target installation device and the target installation area on that device may then be determined from the installation device and installation area that are bound to the attribute information of each installation component.
The correspondence between installation components and their installation devices and installation areas may be stored in advance; for example, the installation device and installation area of each installation component matching the current work node may be determined from the relevant operations of a standard work flow. In other words, the components to be installed and the respective installation devices may be identified from the target image, and the target installation device and the corresponding target installation area matching each component to be installed may then be determined based on the pre-stored correspondence.
In the embodiments of the present disclosure, the attribute information of the component to be installed may be one or more of the component category, size, model, name, and the like. In some embodiments, the target installation device and the matching installation area can be determined from a single piece of attribute information; for example, once the name of a component to be installed is determined, the corresponding target installation area can be found from the matching relationship between the current component and each installation device. In other embodiments, several kinds of attribute information need to be combined: the target installation device is roughly matched based on the category, and the target installation area is then precisely located based on the size, model and other information. For example, when the component to be installed is determined to be a screw of a specific type, the target installation area may be an installation area containing a nut that fits the screw.
In addition, other information capable of indicating the component attribute may also be adopted as the attribute information in the embodiments of the present disclosure, and is not particularly limited herein.
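As a concrete but invented illustration of such a pre-stored correspondence, the sketch below realizes the lookup_target_area placeholder used earlier as a two-stage lookup: candidates are first narrowed by category and then pinned down by model. All table entries are hypothetical examples, not data from the disclosure.

```python
# Hypothetical correspondence table: (category, model) -> (installation
# equipment, installation area). The entries are invented for illustration.
INSTALL_MAP = {
    ("screw", "M6x20"): ("assembly_station_3", "nut_slot_12"),
    ("screw", "M8x25"): ("assembly_station_3", "nut_slot_07"),
    ("bracket", "BR-104"): ("frame_unit_1", "left_rail_mount"),
}

def lookup_target_area(attributes, install_map=INSTALL_MAP):
    # Coarse match by category, then precise match by model within that category.
    candidates = {k: v for k, v in install_map.items() if k[0] == attributes["category"]}
    key = (attributes["category"], attributes["model"])
    if key not in candidates:
        raise KeyError(f"no installation area registered for {key}")
    return candidates[key]  # (target installation equipment, target installation area)
```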
In view of the key role that identifying the attribute information of the component to be mounted plays in determining the target mounting device and the target mounting area, the process of identifying this attribute information is described in detail next.
In the embodiment of the disclosure, the attribute information of the component to be mounted can be determined not only by means of characteristic information comparison, but also by means of a pre-trained neural network, and also by means of contour detection. Specifically, the following three aspects can be respectively explained:
in a first aspect: the embodiment of the disclosure may determine the attribute information of the component to be mounted based on a characteristic information comparison manner, and specifically includes the following steps:
s201, extracting characteristic information of a component to be installed from a target image;
s202, searching for a standard installation component matched with the component to be installed from a database based on the extracted characteristic information and the characteristic information of each standard installation component stored in the database;
and S203, determining the searched attribute information of the standard installation component as the attribute information of the component to be installed.
Here, first, feature information of the component to be mounted may be extracted from the target image, and then, based on a comparison result between the extracted feature information and feature information of each standard mounting component stored in the database, a standard mounting component having the highest feature similarity with the component to be mounted may be searched for from the database, and further, attribute information of the component to be mounted may be determined based on attribute information on the searched standard mounting component stored in the database in advance.
Searching the database for the matched standard installation component is a feature-matching process. When the extracted feature information and the feature information stored in the database are represented as vectors, feature matching can be performed by calculating the cosine similarity of the vectors: the higher the cosine similarity between the two feature vectors, the more similar the two pieces of feature information and the better the two installation components match.
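The following is a minimal sketch of this database comparison (steps S201 to S203), assuming the feature vectors have already been extracted and the database is simply an in-memory list of records; the record layout is an assumption made for illustration.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity of two feature vectors; the small constant avoids
    # division by zero for degenerate vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def match_standard_component(query_feature, database):
    # database: list of {"feature": np.ndarray, "attributes": dict} records for
    # the standard installation components (layout assumed for this sketch).
    best = max(database, key=lambda rec: cosine_similarity(query_feature, rec["feature"]))
    # The matched standard component's attributes are adopted as the attribute
    # information of the component to be installed.
    return best["attributes"]
```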
In a second aspect: in the embodiment of the present disclosure, attribute information of a component to be mounted in a target image may be identified based on a neural network trained in advance for identifying the component to be mounted.
To train the neural network, a number of image samples can be collected in advance and then labeled, in full or in part, with component identifiers, each of which has corresponding attribute information. The labeled image samples are input into the neural network to be trained, and the network parameters are learned from the correspondence between the input image samples and the output labeling results.
Once the network parameters are obtained through training, they can be used to identify the component to be installed in the target image and thereby obtain the corresponding attribute information.
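A minimal inference-side sketch is given below, assuming component_net stands for the pre-trained recognition network and ATTRIBUTE_TABLE maps each labeled component identifier to its attribute information; both, and the example attribute values, are assumptions made for illustration.

```python
import torch

# Hypothetical mapping from labeled component identifiers to attribute
# information; the entries are invented examples.
ATTRIBUTE_TABLE = {
    0: {"name": "hex screw", "category": "screw", "model": "M6x20"},
    1: {"name": "mounting bracket", "category": "bracket", "model": "BR-104"},
}

def identify_attributes(component_net, image_tensor):
    # image_tensor: preprocessed crop of the target image, shape (1, 3, H, W)
    with torch.no_grad():
        logits = component_net(image_tensor)
    component_id = int(logits.argmax(dim=1).item())
    # Look up the attribute information associated with the predicted identifier.
    return ATTRIBUTE_TABLE.get(component_id)
```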
In a third aspect: in the embodiment of the present disclosure, the attribute information of the component to be mounted may be determined by performing contour detection on the component to be mounted in the target image.
Here, the target image may first be converted to grayscale so that the component to be mounted stands out from the target image. Polygon fitting can then be performed on the grayscale image, that is, the edge lines that may form the contour of the component to be mounted are determined, and the image position information of the component in the target image is identified; based on this image position information, the attribute information related to the component to be mounted can be determined.
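A rough OpenCV sketch of this contour route is shown below (assuming OpenCV 4); the Otsu threshold choice and the decision to keep only the largest contour are illustrative assumptions.

```python
import cv2

def detect_component_contour(target_image_bgr):
    # Grayscale conversion, then Otsu thresholding to separate the component.
    gray = cv2.cvtColor(target_image_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    # Polygon fitting of the edge lines that may form the component's outline.
    polygon = cv2.approxPolyDP(largest, 0.01 * cv2.arcLength(largest, True), True)
    x, y, w, h = cv2.boundingRect(largest)
    # The fitted polygon (shape) and bounding box (rough size and image position)
    # together serve as a contour detection result usable as attribute information.
    return {"polygon": polygon, "bbox": (x, y, w, h)}
```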
In the AR interaction method provided by the embodiments of the present disclosure, the number of identified components to be installed may be one or more. When there is a single component to be installed, once its attribute information is determined according to the above steps, the AR device displays AR prompt content indicating the target installation area corresponding to that component. When there are multiple components to be installed, the AR interaction method may further provide installation guidance by combining the AR prompt contents corresponding to the respective components.
In some embodiments, considering that a plurality of components to be mounted may or may not have a sequence of operations, the embodiments of the present disclosure may adaptively display AR prompt contents by using a plurality of installation guidance manners.
For example, when multiple components to be mounted have no required operation order, the target mounting device matching the attribute information of each component and the target mounting area on that device may be determined separately once the attribute information of the components is identified, and the components can then be mounted accordingly following the AR device's display of the multiple target mounting areas.
In addition, when the multiple components to be mounted do have a required operation sequence, after the AR device displays the AR prompt content indicating the target mounting area corresponding to one of the components, the currently acquired target image is used to detect whether that component has been mounted. If so, the target mounting device matching the attribute information of the next component to be mounted and the target mounting area on that device are determined, and the AR prompt content for the target mounting area of the next component is displayed, so as to guide the mounting operations in sequence.
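The sequential case can be sketched as a simple loop over the components in their prescribed order; show_ar_prompt and installation_completed are hypothetical helpers standing in for the display and detection steps described above.

```python
import time

def guide_in_sequence(ar_device, components_in_order):
    # components_in_order: components to be mounted, already sorted according
    # to the installation flow information (e.g. A -> B -> C).
    for component in components_in_order:
        device_id, area = lookup_target_area(component["attributes"])
        show_ar_prompt(ar_device, device_id, area)  # hypothetical display call
        # Poll newly acquired target images until this component is detected as
        # installed, then move on to the next component in the flow.
        while not installation_completed(ar_device.capture(), component, area):
            time.sleep(0.2)
```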
In the embodiment of the present disclosure, as shown in fig. 3, AR prompt content may be presented according to the following steps:
s301, determining image position information of the target installation area in the target image according to the map position information of the target installation area in the three-dimensional map corresponding to the operation site;
s302, displaying AR prompt content used for indicating a target installation area corresponding to the component to be installed through the AR equipment based on the determined image position information.
Here, the map position information of the target installation area in the three-dimensional map corresponding to the work site is first converted into image position information of the target installation area in the target image, based on the conversion relationship between the world coordinate system and the image coordinate system; the AR prompt content is then presented at the image position indicated by the converted image position information.
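A minimal sketch of this coordinate conversion is given below, assuming a pinhole camera model with known intrinsics K and AR-device pose (R, t); these matrices are assumptions for illustration, not values prescribed by the disclosure.

```python
import numpy as np

def world_to_image(point_world, K, R, t):
    # point_world: (3,) position of the target installation area in the
    # three-dimensional map (world coordinate system).
    p_cam = R @ np.asarray(point_world, dtype=float) + t  # world -> camera coordinates
    uvw = K @ p_cam                                       # camera -> homogeneous image coordinates
    u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]               # perspective division
    # (u, v) is the pixel position at which the AR prompt content is anchored.
    return u, v
```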
The content of the AR prompt in the embodiment of the present disclosure may be an AR special effect showing and marking the target installation area, for example, a highlight special effect prompt related to the target installation area, a text special effect prompt, or an AR special effect including a three-dimensional virtual arrow pointing to the target installation area.
In order to further understand the method of AR interaction provided by the embodiments of the present disclosure, the process of AR presentation described above may be described in detail with reference to fig. 4.
The AR device in this embodiment of the present application may acquire a target image including a component to be mounted, where the component to be mounted may be held in a hand of an operator, and in some embodiments, the held component may be determined to be the component to be mounted based on gesture recognition.
When the attribute information of the handheld component to be installed is identified according to the above AR interaction method, the matching target installation device and the corresponding target installation area can be determined, and the AR prompt content for the target installation area matching the component to be installed can be displayed through the AR device. As shown in FIG. 4, the embodiments of the present disclosure may present a prompt related to the outline of the target installation area (shown as a circular area) and display a three-dimensional virtual arrow pointing to the target installation area. By superimposing the AR prompt content of the target installation area corresponding to the component to be installed on the collected target image, the component installation flow can be effectively guided, so that the operator can install the component according to the prompts. This improves the AR interaction experience and removes the need to visually search for the area where the component is to be installed, which greatly improves working efficiency.
It will be understood by those skilled in the art that, in the method of the present disclosure, the order in which the steps are written does not imply a strict order of execution or impose any limitation on the implementation; the specific order of execution of the steps should be determined by their function and possible inherent logic.
Based on the same inventive concept, an AR interaction apparatus corresponding to the AR interaction method is also provided in the embodiments of the present disclosure, and since the principle of the apparatus in the embodiments of the present disclosure for solving the problem is similar to the above-mentioned AR interaction method in the embodiments of the present disclosure, the implementation of the apparatus may refer to the implementation of the method, and repeated details are not described again.
Referring to fig. 5, which is a schematic diagram of an AR interaction apparatus provided in an embodiment of the present disclosure, the apparatus includes: an acquisition module 501, an identification module 502, a determination module 503 and a display module 504; wherein the content of the first and second substances,
an obtaining module 501, configured to obtain a target image of an operation site collected by an AR device;
an identification module 502 for identifying attribute information of a component to be mounted in a target image;
a determining module 503, configured to determine a target mounting apparatus that matches the attribute information of the component to be mounted, and a target mounting area on the target mounting apparatus;
a display module 504, configured to display, by an AR device, AR prompt content for indicating a target installation area corresponding to a component to be installed.
With the above AR interaction apparatus, when the attribute information of the component to be installed is identified from the target image, the target installation area can be determined from the target installation equipment based on that attribute information. Guided by the AR prompt content, the user can install the component to be installed in the determined target installation area, which improves the interactive experience between the user and the AR equipment on the premise of fusing the virtual environment with the real scene. Moreover, in an actual industrial scene, superimposing the AR prompt content of the target installation area corresponding to the component to be installed on the acquired target image effectively guides the industrial operation process, such as the component installation process.
In one possible embodiment, the attribute information of the component to be mounted includes at least one of: category, size, model, name.
In one possible implementation, the identifying module 502 is configured to identify the attribute information of the component to be mounted in the target image according to the following steps:
extracting characteristic information of a component to be mounted from the target image;
searching a standard installation component matched with the component to be installed from the database based on the extracted characteristic information and the characteristic information of each standard installation component stored in the database;
and determining the searched attribute information of the standard installation component as the attribute information of the component to be installed.
In one possible implementation, the identifying module 502 is configured to identify the attribute information of the component to be mounted in the target image according to the following steps:
identifying attribute information of the component to be installed in the target image based on a pre-trained neural network for identifying the component to be installed; the neural network is obtained by training the image sample based on the labeled component identifications, and each component identification has corresponding attribute information.
In one possible implementation, the identifying module 502 is configured to identify the attribute information of the component to be mounted in the target image according to the following steps:
carrying out contour detection on a part to be installed in a target image to obtain a contour detection result;
determining that the attribute information of the component to be mounted includes the contour detection result.
In a possible embodiment, there are multiple components to be mounted, and the display module 504 is specifically configured to:
after displaying AR prompt content for indicating the target installation area corresponding to one of the components to be installed through the AR equipment, determining target installation equipment matched with attribute information of the next component to be installed and the target installation area on the target installation equipment based on installation process information among a plurality of components to be installed under the condition that the installation of one of the components to be installed is detected to be completed based on a currently acquired target image;
and displaying AR prompt content for indicating a target installation area corresponding to the next component to be installed through the AR equipment.
In a possible implementation manner, the presentation module 504 is configured to present, by the AR device, AR prompt content for indicating a target installation area corresponding to the component to be installed, according to the following steps:
determining image position information of the target installation area in the target image according to map position information of the target installation area in a three-dimensional map corresponding to the operation site;
and displaying AR prompt content used for indicating a target installation area corresponding to the component to be installed through the AR equipment based on the determined image position information.
In a possible implementation manner, the presentation module 504 is configured to present, by the AR device, AR prompt content for indicating a target installation area corresponding to the component to be installed, according to the following steps:
displaying, through the AR equipment, the AR special effect marked with the target installation area; and/or,
and displaying the AR special effect containing the position pointing to the target installation area through the AR equipment.
The description of the processing flow of each module in the device and the interaction flow between the modules may refer to the related description in the above method embodiments, and will not be described in detail here.
An embodiment of the present disclosure further provides an electronic device. As shown in fig. 6, which is a schematic structural diagram of the electronic device provided in the embodiment of the present disclosure, the electronic device includes: a processor 601, a memory 602, and a bus 603. The memory 602 stores machine-readable instructions executable by the processor 601 (for example, execution instructions corresponding to the obtaining module 501, the identifying module 502, the determining module 503, and the presenting module 504 in the AR interaction apparatus in fig. 5). When the electronic device is running, the processor 601 communicates with the memory 602 through the bus 603, and when the machine-readable instructions are executed, the processor 601 performs the following processing:
acquiring a target image of a working site acquired by AR equipment;
identifying attribute information of a component to be mounted in a target image;
determining target installation equipment matched with the attribute information of the component to be installed and a target installation area on the target installation equipment;
and displaying AR prompt content for indicating a target installation area corresponding to the component to be installed through the AR equipment.
For the specific execution process of the instruction, reference may be made to the steps of the AR interaction method described in the embodiments of the present disclosure, and details are not described here.
Embodiments of the present disclosure also provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, performs the steps of the method for AR interaction described in the above method embodiments. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The embodiments of the present disclosure also provide a computer program product, where the computer program product carries a program code, and instructions included in the program code may be used to execute the steps of the AR interaction method described in the foregoing method embodiments, which may be referred to specifically in the foregoing method embodiments, and are not described herein again.
The computer program product may be implemented by hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium; in another alternative embodiment, the computer program product is embodied as a software product, such as a Software Development Kit (SDK).
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing an electronic device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. The aforementioned storage medium includes various media capable of storing program codes, such as a USB disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that: the above-mentioned embodiments are merely specific embodiments of the present disclosure, which are used for illustrating the technical solutions of the present disclosure and not for limiting the same, and the scope of the present disclosure is not limited thereto, and although the present disclosure is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that: any person skilled in the art can modify or easily conceive of the technical solutions described in the foregoing embodiments or equivalent technical features thereof within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present disclosure, and should be construed as being included therein. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (11)

1. A method of Augmented Reality (AR) interaction, comprising:
acquiring a target image of a working site acquired by AR equipment;
identifying attribute information of a component to be mounted in the target image;
determining target installation equipment matched with the attribute information of the component to be installed and a target installation area on the target installation equipment;
and displaying AR prompt content for indicating the target installation area corresponding to the component to be installed through the AR equipment.
2. The method according to claim 1, wherein the attribute information of the component to be mounted includes at least one of: category, size, model, name.
3. The method according to claim 1 or 2, wherein the identifying of the attribute information of the component to be mounted in the target image comprises:
extracting feature information of the component to be mounted from the target image;
searching a standard mounting component matched with the component to be mounted from the database based on the extracted feature information and feature information of each standard mounting component stored in the database;
and determining the searched attribute information of the standard installation component as the attribute information of the component to be installed.
4. The method according to claim 1 or 2, wherein the identifying of the attribute information of the component to be mounted in the target image comprises:
identifying attribute information of the component to be installed in the target image based on a pre-trained neural network for identifying the component to be installed; the neural network is obtained by training an image sample based on labeled component identifications, and each component identification has corresponding attribute information.
5. The method according to claim 1 or 2, wherein the identifying of the attribute information of the component to be mounted in the target image comprises:
carrying out contour detection on the component to be installed in the target image to obtain a contour detection result;
determining that the attribute information of the component to be mounted includes the contour detection result.
6. The method according to any one of claims 1 to 5, wherein the member to be mounted is plural, the method further comprising:
after displaying AR prompt content for indicating the target installation area corresponding to one of the components to be installed through the AR equipment, determining target installation equipment matched with attribute information of the next component to be installed and the target installation area on the target installation equipment based on installation flow information among a plurality of components to be installed under the condition that the installation of one of the components to be installed is detected to be completed based on a currently acquired target image;
and displaying AR prompt content for indicating the target installation area corresponding to the next component to be installed through the AR equipment.
7. The method according to any one of claims 1 to 6, wherein the displaying, by the AR device, AR prompt content for indicating the target installation area corresponding to the component to be installed comprises:
determining image position information of the target installation area in the target image according to map position information of the target installation area in a three-dimensional map corresponding to the operation site;
displaying, by the AR device, AR prompt content indicating the target installation area corresponding to the component to be installed based on the determined image position information.
8. The method according to any one of claims 1 to 7, wherein the displaying, by the AR device, AR prompt content for indicating the target installation area corresponding to the component to be installed comprises:
displaying, by the AR device, the AR special effect marked with the target installation area; and/or,
displaying, by the AR device, an AR special effect that includes a location pointing to the target installation area.
9. An apparatus for Augmented Reality (AR) interaction, comprising:
the acquisition module is used for acquiring a target image of a working site acquired by the AR equipment;
the identification module is used for identifying the attribute information of the component to be installed in the target image;
the determining module is used for determining target installation equipment matched with the attribute information of the component to be installed and a target installation area on the target installation equipment;
and the display module is used for displaying the AR prompt content used for indicating the target installation area corresponding to the component to be installed through the AR equipment.
10. An electronic device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor for executing the machine-readable instructions stored in the memory, the processor and the memory communicating over the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the steps of the method of augmented reality AR interaction of any of claims 1 to 8.
11. A computer-readable storage medium, having stored thereon a computer program, which, when executed by an electronic device, causes the electronic device to perform the steps of the method of augmented reality, AR, interaction of any one of claims 1 to 8.
CN202011230667.2A 2020-11-06 2020-11-06 Augmented reality AR interaction method, device, equipment and storage medium Withdrawn CN112365607A (en)

Priority Applications (1)

Application Number: CN202011230667.2A
Priority Date: 2020-11-06
Filing Date: 2020-11-06
Title: Augmented reality AR interaction method, device, equipment and storage medium


Publications (1)

Publication Number: CN112365607A
Publication Date: 2021-02-12

Family

ID=74509124

Family Applications (1)

Application Number: CN202011230667.2A
Publication: CN112365607A (en)
Priority Date: 2020-11-06
Filing Date: 2020-11-06
Title: Augmented reality AR interaction method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112365607A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112991514A (en) * 2021-02-26 2021-06-18 北京市商汤科技开发有限公司 AR data display method and device, electronic equipment and storage medium
CN113392268A (en) * 2021-03-31 2021-09-14 百果园技术(新加坡)有限公司 Special effect text rendering method and device, electronic equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106920071A (en) * 2017-02-23 2017-07-04 广东电网有限责任公司教育培训评价中心 Substation field operation householder method and system
CN109032348A (en) * 2018-06-26 2018-12-18 亮风台(上海)信息科技有限公司 Intelligence manufacture method and apparatus based on augmented reality
CN111651047A (en) * 2020-06-05 2020-09-11 浙江商汤科技开发有限公司 Virtual object display method and device, electronic equipment and storage medium



Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
WW01: Invention patent application withdrawn after publication (application publication date: 2021-02-12)