WO2022267626A1 - Method and apparatus for presenting augmented reality data, and device, medium and program


Info

Publication number
WO2022267626A1
Authority
WO
WIPO (PCT)
Prior art keywords
target scene
data
scene area
area
positioning information
Prior art date
Application number
PCT/CN2022/085935
Other languages
English (en)
Chinese (zh)
Inventor
田真
李斌
欧华富
Original Assignee
上海商汤智能科技有限公司
Priority date
Filing date
Publication date
Application filed by 上海商汤智能科技有限公司 filed Critical 上海商汤智能科技有限公司
Publication of WO2022267626A1 publication Critical patent/WO2022267626A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 Geographic models
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Definitions

  • Embodiments of the present disclosure relate to the field of augmented reality technology, and in particular to a method, apparatus, device, medium, and program for displaying augmented reality data.
  • Augmented reality (AR) technology seamlessly integrates real-world information with virtual-world information. Virtual information such as images, sounds, and touch feedback is simulated by computer technology and then superimposed onto the real world, where it can be perceived by human senses, achieving a sensory experience beyond reality in which the real environment and virtual objects coexist in the same frame or space in real time. With the development of augmented reality technology, it is increasingly used in cultural and tourism scenarios.
  • To this end, the present disclosure provides at least a method, apparatus, device, medium, and program for displaying augmented reality data.
  • In a first aspect, the present disclosure provides a method for displaying augmented reality data, the method being executed by an electronic device and including: acquiring first positioning information of an augmented reality (AR) device; and, upon detecting that the AR device is located outside a target scene area, acquiring navigation map information for instructing the AR device to reach the target scene area from the position indicated by the first positioning information.
  • In another aspect, the present disclosure provides an augmented reality data display apparatus, including:
  • a first obtaining module configured to obtain first positioning information of an augmented reality (AR) device;
  • a second obtaining module configured to, upon detecting that the AR device is located outside the target scene area, obtain navigation map information for instructing the AR device to reach the target scene area from the position indicated by the first positioning information;
  • a third obtaining module configured to, upon detecting that the AR device is located in the target scene area, obtain second positioning information of the AR device within the target scene area;
  • a first presentation module configured to, in response to the position indicated by the second positioning information being located in the explanation area corresponding to any preset knowledge point, play audio commentary data matching that preset knowledge point through the AR device and display AR special-effect data matching the audio commentary data.
  • In another aspect, the present disclosure provides an electronic device including a processor, a memory, and a bus. The memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate through the bus, and when the machine-readable instructions are executed by the processor, the method for displaying augmented reality data described in the first aspect or any implementation thereof is performed.
  • In another aspect, the present disclosure provides a computer-readable storage medium on which a computer program is stored; when the computer program is run by a processor, the augmented reality data display method described in the first aspect or any implementation thereof is performed.
  • An embodiment of the present disclosure also provides a computer program including computer-readable code; when the computer-readable code runs on an electronic device, a processor of the electronic device executes the augmented reality data display method described in any of the above embodiments.
  • Embodiments of the present disclosure at least provide a method, device, device, medium, and program for displaying augmented reality data.
  • The indication of the navigation map information enables the AR device to move from the position indicated by the first positioning information to the target scene area; instructing the movement of the AR device through the navigation map information improves the efficiency with which the AR device reaches the target scene area.
  • As for the second positioning information within the area: if the position indicated by the second positioning information is located in the explanation area corresponding to any preset knowledge point, audio commentary data matching that preset knowledge point is played through the AR device, and AR special-effect data matching the audio commentary data is displayed, making the explanation of the preset knowledge points clearer and more intuitive and improving the explanation effect.
  • Since the navigation map information is used to navigate the AR device to the target scene area, there is no need to model the scene outside the target scene area, which reduces the resource waste caused by scene modeling.
  • Fig. 1a is a schematic diagram of a system architecture for an augmented reality data display method provided by an embodiment of the present disclosure;
  • Fig. 1b shows a schematic flowchart of a method for displaying augmented reality data provided by an embodiment of the present disclosure;
  • Fig. 1c shows another schematic flowchart of a method for displaying augmented reality data provided by an embodiment of the present disclosure;
  • Fig. 2a shows a schematic interface diagram of an AR device provided by an embodiment of the present disclosure;
  • Fig. 2b shows a schematic interface diagram of an AR device provided by an embodiment of the present disclosure;
  • Fig. 3a shows a schematic interface diagram of an AR device provided by an embodiment of the present disclosure;
  • Fig. 3b shows a schematic interface diagram of an AR device provided by an embodiment of the present disclosure;
  • Fig. 3c shows a schematic interface diagram of an AR device provided by an embodiment of the present disclosure;
  • Fig. 4 shows a schematic interface diagram of an AR device provided by an embodiment of the present disclosure;
  • Fig. 5a shows a schematic interface diagram of an AR device provided by an embodiment of the present disclosure;
  • Fig. 5b shows a schematic interface diagram of an AR device provided by an embodiment of the present disclosure;
  • Fig. 6 shows a schematic flowchart of another augmented reality data display method provided by an embodiment of the present disclosure;
  • Fig. 7 shows a schematic structural diagram of an augmented reality data display apparatus provided by an embodiment of the present disclosure;
  • Fig. 8 shows a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure.
  • Augmented reality (AR) technology seamlessly integrates real-world information with virtual-world information. Virtual information such as images, sounds, and touch feedback is simulated by computer technology and then superimposed onto the real world, where it can be perceived by human senses, achieving a sensory experience beyond reality in which the real environment and virtual objects coexist in the same frame or space in real time. With the development of augmented reality technology, it is increasingly used in cultural and tourism scenarios. Based on this, embodiments of the present disclosure provide an augmented reality data display method, apparatus, device, medium, and program.
  • the execution subject of the augmented reality data display method provided by the embodiments of the present disclosure may be an AR device, and the AR device is a smart device capable of supporting AR functions.
  • the AR device includes but is not limited to a mobile phone, a tablet, and AR glasses.
  • Fig. 1a is a schematic diagram of a system architecture for an augmented reality data display method provided by an embodiment of the present disclosure.
  • the system architecture includes: an AR device 11 , a network 12 and a control terminal 13 .
  • the AR device 11 and the control terminal 13 establish a communication connection through the network 12.
  • The AR device 11 reports its first positioning information to the control terminal 13 through the network 12, and the control terminal 13 detects the location of the AR device 11: if the device is outside the target scene area, navigation map information is obtained; if it is located in the target scene area, second positioning information of the AR device 11 is obtained. Then, based on the second positioning information, it is detected whether the device is in the explanation area corresponding to any preset knowledge point; finally, if it is in the explanation area, audio commentary data matching that preset knowledge point is played through the AR device 11 and AR special-effect data matching the audio commentary data is displayed.
  • control terminal 13 may include a positioning terminal, and the AR device 11 may include a visual processing device capable of processing visual information.
  • The network 12 may use a wired or wireless connection.
  • The control terminal 13 may communicate with the visual processing device through a wired connection, for example performing data communication over a bus.
  • the AR device 11 may be a vision processing device with a video capture module, or a host with a camera.
  • the augmented reality data presentation method of the embodiment of the present disclosure may be executed by the AR device 11 , and the above-mentioned system architecture may not include the network 12 and the control terminal 13 .
  • FIG. 1b is a schematic flowchart of the augmented reality data display method provided by the embodiment of the present disclosure.
  • the method includes S101 to S104, wherein:
  • When it is detected that the AR device is located outside the target scene area, the navigation map information can be obtained, and its indication can be used to move the AR device from the position indicated by the first positioning information to the target scene area.
  • Instructing the movement of the AR device through the navigation map information can improve the efficiency with which the AR device reaches the target scene area.
  • By using the navigation map information to navigate the AR device to the target scene area, there is no need to model the scene outside the target scene area, which reduces the resource waste caused by scene modeling.
  • After it is detected that the AR device is located in the target scene area, second positioning information of the AR device in the target scene area may be obtained. When it is determined from the second positioning information that the AR device is located in the explanation area corresponding to any preset knowledge point within the target scene area, audio commentary data matching that preset knowledge point is played through the AR device and AR special-effect data matching the audio commentary data is displayed; the AR special-effect data assists the audio commentary data, so that the explanation of the preset knowledge points is clearer and more intuitive and the explanation effect is better.
  • Since the navigation map information is used to navigate the AR device to the target scene area, there is no need to model the scene outside the target scene area, which can also improve the efficiency of map creation.
  • the first positioning information of the AR device may be determined according to sensors set on the AR device, where the first positioning information may include position information of the AR device in a real scene.
  • The sensor may be a Global Positioning System (GPS) receiver, an inertial measurement unit (IMU), or the like.
  • the target scene area may be a location area designated by the user, for example, an area arbitrarily designated in the real scene where the AR device is located; it may also be an area of a scenic spot, for example, an area where any scenic spot such as a park or a museum is located.
  • In some embodiments, the navigation map information is generated by calling a navigation application installed on the AR device. That is, in S102, acquiring the navigation map information for instructing the AR device to reach the target scene area from the position indicated by the first positioning information can be realized through the steps shown in Fig. 1c:
  • S1021: displaying prompt information for opening a navigation application, so as to prompt calling the navigation application installed on the AR device to generate navigation map information for reaching the target scene area from the position indicated by the first positioning information;
  • The AR device can display the prompt information for opening the navigation application; through a trigger operation on this prompt information, the navigation application installed on the AR device is called, and the navigation application generates navigation map information for reaching the target scene area from the current position indicated by the first positioning information of the AR device.
  • the navigation application installed in the AR device may be a map application based on GPS positioning.
  • the prompt information for opening the navigation application is displayed on the display interface of the AR device, so as to prompt to call the navigation application installed in the AR device to generate the navigation map information.
  • the navigation application installed in the AR device can also be an AR navigation application for visual positioning such as a high-precision map or simultaneous localization and mapping (Simultaneous Localization And Mapping, SLAM).
  • When the AR device displays the prompt information for opening the navigation application, prompt information for opening an AR navigation application may be displayed on the display interface of the AR device, so as to prompt calling the AR navigation application to generate the navigation map information. In this way, the navigation map information generated by the AR navigation application is AR navigation map information displayed on an AR screen, providing users with a navigation map that offers better guidance and is easier to understand.
  • With a high-precision map or SLAM, the AR navigation map information can be obtained simply and quickly without separately modeling the scene area outside the target scene area.
  • In some embodiments, the navigation map information may be acquired in response to a trigger operation on the prompt information for opening the navigation application, and the navigation map information may be loaded into the AR screen currently displayed by the AR device.
  • the navigation map information includes at least one of a navigation position and a navigation direction.
  • Fig. 2a includes prompt information for opening a navigation application; for example, the prompt information may include "call navigation software, query navigation route 21", "first navigation software 22", and "second navigation software 23".
  • In response to a trigger operation on "first navigation software 22", the first navigation software can be invoked to generate the navigation map information; after the navigation map information is obtained, it can be displayed as shown in Fig. 2b, including the navigation route to the target scene area 25.
  • Similarly, in response to a trigger operation on "second navigation software 23", the second navigation software is invoked to generate navigation map information, and after it is acquired, the navigation map information can be displayed.
  • The first navigation software and the second navigation software may both be navigation applets or clients.
  • In this way, the prompt information for opening the navigation application can be displayed on the AR device to prompt the navigation application installed on the AR device to generate navigation map information from the position indicated by the first positioning information to the target scene area; the navigation map information is then obtained from the navigation application and loaded into the AR screen currently displayed by the AR device, so that the AR device can move according to the displayed navigation map information, improving navigation efficiency.
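As noted above, the navigation map information includes at least one of a navigation position and a navigation direction. A minimal sketch of assembling such information, assuming a flat x/y coordinate frame and a hypothetical `navigation_info` helper (a real navigation applet would return an actual route):

```python
import math

def navigation_info(current, entrance):
    """Build minimal navigation map information: a route polyline plus a
    compass bearing (0 deg = north) from the current position, as given by
    the first positioning information, to the target scene area entrance."""
    dx = entrance[0] - current[0]  # east offset
    dy = entrance[1] - current[1]  # north offset
    bearing_deg = math.degrees(math.atan2(dx, dy)) % 360
    return {"route": [current, entrance], "bearing_deg": round(bearing_deg, 1)}

print(navigation_info((0.0, 0.0), (100.0, 100.0)))  # bearing 45.0 (north-east)
```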
  • In some embodiments, the method further includes the following two cases:
  • Case 1 includes at least one of the following: when it is detected that the distance between the AR device and the target scene area is greater than a set distance threshold, controlling the AR device to display a virtual model corresponding to the target scene area, based on the determined display position of the virtual model;
  • when it is detected that the distance between the AR device and the target scene area is greater than the set distance threshold, controlling the AR device to display direction guidance information, where the direction guidance information is used to guide the AR device to the target scene area.
  • Case 2 includes at least one of the following: when it is detected that the distance between the AR device and the target scene area is less than or equal to the set distance threshold, controlling the AR device to play audio guide data;
  • when it is detected that the distance between the AR device and the target scene area is less than or equal to the set distance threshold, controlling the AR device to display AR guide data, based on the display position of the AR guide data corresponding to the target scene area.
  • In implementation, the distance between the AR device and the target scene area can be determined; when the determined distance is greater than the set distance threshold, the steps corresponding to Case 1 can be executed, and when it is less than or equal to the set distance threshold, the steps corresponding to Case 2 can be executed.
  • In this way, when the distance between the AR device and the target scene area is greater than the set distance threshold, the AR device can be controlled to display at least one of the virtual model and the direction guidance information, intuitively showing the position of the target scene area and the moving direction of the AR device; when the distance is less than or equal to the threshold, the AR device can be controlled to play or display at least one of the audio guide data and the AR guide data, improving the navigation effect for the target scene area.
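The two threshold cases can be sketched as a simple dispatch; the threshold value and the action names below are assumptions for illustration, since the patent only speaks of a "set distance threshold":

```python
DIST_THRESHOLD_M = 60.0  # assumed value; the text only says "set distance threshold"

def guidance_for_distance(distance_m: float):
    """Case 1 (far): virtual model + direction guidance.
    Case 2 (near): audio guide data + AR guide data."""
    if distance_m > DIST_THRESHOLD_M:
        return ["display_virtual_model", "display_direction_guidance"]
    return ["play_audio_guide", "display_ar_guide_data"]

print(guidance_for_distance(120.0))  # Case 1 actions
print(guidance_for_distance(10.0))   # Case 2 actions
```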
  • In implementation, the driving distance from the position indicated by the first positioning information of the AR device to the navigation starting point of the target scene area can be determined and used as the distance between the AR device and the target scene area.
  • Alternatively, the straight-line distance between the position indicated by the first positioning information of the AR device and the navigation starting point of the target scene area may be determined and used as the distance between the AR device and the target scene area.
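When the first positioning information is a GPS fix, the straight-line distance mentioned above can be computed with the haversine formula; this is a standard geodesic calculation, not a formula taken from the patent:

```python
import math

def straight_line_distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between the AR device's GPS fix and the
    navigation starting point of the target scene area (haversine formula)."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# One degree of latitude is roughly 111 km:
print(round(straight_line_distance_m(0.0, 0.0, 1.0, 0.0)))  # ~111195
```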
  • When the distance is greater than the set distance threshold, the AR device can be controlled to display the virtual model corresponding to the target scene area, such as displaying the virtual model in the center of the screen of the AR device.
  • Alternatively, the display position of the virtual model can be determined first, and the AR device controlled to display the virtual model at that display position.
  • For example, the display position of the virtual model can be determined according to the relative position between the first positioning information of the AR device and the target scene area, and the AR device controlled to present the virtual model at that display position. Referring to Fig. 3a, the figure includes a virtual model 31 corresponding to the target scene area and direction guidance information 32; when the target scene area is located directly in front of the AR device, the virtual model can be displayed directly ahead on the AR device.
  • Similarly, when the target scene area is located to the left of the AR device, the virtual model can be displayed on the left of the AR device's screen.
  • the virtual model corresponding to the target scene area can be set according to the actual situation, for example, a virtual model of a landmark building in the target scene area can be used as the virtual model corresponding to the target scene area.
  • the virtual model may be a three-dimensional model or a two-dimensional model of the target scene area.
  • In some embodiments, a prompt message for starting the navigation application may also be displayed after the virtual model and direction guidance information are displayed; the navigation map information can then be obtained and displayed, as shown in Fig. 3c. After the AR device moves according to the navigation map information, when it is detected that the distance between the AR device and the target scene area is less than or equal to the set distance threshold, the AR device can be controlled to play the audio guide data corresponding to the target scene area, and/or, based on the determined display position of the AR guide data corresponding to the target scene area, the AR device can be controlled to display the AR guide data.
  • For example, the display position of the AR guide data can be determined in advance, and the AR device controlled to display the AR guide data at the determined display position.
  • the audio guide data and the AR guide data may be determined according to the situation of the target scene area.
  • For example, when the target scene area is a painting and calligraphy exhibition hall, the audio guide data can be audio data introducing the paintings and calligraphy collected in the hall, and/or audio data introducing the hall's founding history and opening information;
  • the AR guide data can be AR data matching the audio guide data, for example AR data including a guide video introducing the collection of the painting and calligraphy exhibition hall.
  • second positioning information of the AR device in the target scene area may be acquired.
  • In some embodiments, the AR device is positioned by using a constructed three-dimensional scene model together with scene information collected by the AR device, so as to determine the second positioning information. That is, after it is detected that the AR device is located in the target scene area, acquiring the second positioning information of the AR device in the target scene area may include: acquiring a scene image collected by the AR device, and determining the second positioning information of the AR device based on the scene image and the constructed 3D scene model.
  • the second positioning information of the AR device can be relatively accurately determined by using the scene image and the constructed three-dimensional scene model.
  • In some embodiments, when it is detected that the AR device is located outside the target scene area but within the adjacent area corresponding to the target scene area, the scene image collected by the AR device is acquired.
  • The adjacent area corresponding to the target scene area may be an area whose distance from the target scene area is less than a preset threshold; for example, the area within 2 meters of the target scene area can be taken as the adjacent area. If the AR device is outside the target scene area but within the adjacent area, the AR device is near the target scene area, so the AR device is used to capture the target scene area and obtain the scene image.
  • In implementation, the scene image collected by the AR device can be obtained, feature points extracted from the scene image, and the feature points matched against the feature point cloud included in the 3D scene model to determine the second positioning information at the time the AR device collected the scene image.
  • the second positioning information may include position information and/or orientation information, for example, the position information may be the coordinate information of the AR device in the coordinate system corresponding to the 3D scene model; the orientation information may be the Euler angle corresponding to the AR device.
  • The scene image is an image collected by the AR device in the target scene area; the feature points in the scene image can identify target objects in the image, and by matching them with the feature point cloud in the 3D scene model, the position information and/or orientation information of the AR device at the time the scene image was collected can be determined.
  • The three-dimensional scene model can be constructed as follows: collect multiple frames of scene images at different positions, angles, and times in the target scene area; extract feature points from each frame of scene image to obtain the point cloud set corresponding to that frame; then use the point cloud sets of the multiple frames to obtain the feature point cloud corresponding to the target scene area, which constitutes the 3D scene model.
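The matching step, in which feature points extracted from the scene image are matched against the model's feature point cloud, can be sketched with a brute-force descriptor match plus a Lowe-style ratio test (an assumed matching strategy; the patent does not specify one). The resulting 2D-3D correspondences would then feed a pose solver to yield the second positioning information.

```python
import numpy as np

def match_features(query_desc, model_desc, ratio=0.8):
    """Brute-force L2 matching of scene-image descriptors against the 3D
    scene model's feature point cloud, with a Lowe-style ratio test."""
    matches = []
    for i, q in enumerate(query_desc):
        d = np.linalg.norm(model_desc - q, axis=1)  # distance to every model point
        order = np.argsort(d)
        best, second = order[0], order[1]
        if d[best] < ratio * d[second]:             # keep unambiguous matches only
            matches.append((i, int(best)))
    return matches

model = np.array([[0.0, 0.0], [10.0, 10.0], [5.0, 0.0]])  # toy 2-D descriptors
query = np.array([[0.1, 0.0]])
print(match_features(query, model))  # -> [(0, 0)]
```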
  • the second positioning information of the AR device may also be determined through a synchronous positioning and mapping technology.
  • the preset knowledge point can be any explainable knowledge point set in the target scene area.
  • For example, when the target scene area is a painting and calligraphy exhibition hall, a preset knowledge point can be any painting or calligraphy work in the exhibition; when the target scene area is an architectural exhibition, a preset knowledge point can be any building in the exhibition, and so on.
  • In implementation, the spatial position of each preset knowledge point and the explanation area corresponding to each preset knowledge point can be determined in the constructed three-dimensional scene model; for example, the position of a real object in the target scene area can be determined as the spatial position of the preset knowledge point corresponding to that object, and the area surrounding the real object in the target scene area can be set as the explanation area of the preset knowledge point corresponding to that object.
  • In implementation, the second positioning information of the AR device can be obtained, and based on it, whether the AR device is located in the explanation area corresponding to any preset knowledge point within the target scene area can be judged; if so, the audio commentary data matching that preset knowledge point is played through the AR device and the AR special-effect data matching the audio commentary data is displayed. Alternatively, based on the second positioning information, the distance between the AR device and the spatial position of each preset knowledge point can be determined; when the distance to a preset knowledge point is within a set range, the audio commentary data matching that knowledge point is played and the AR special-effect data matching the audio commentary data is displayed.
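Modelling each explanation area as a circle around the knowledge point's spatial position (an assumption; the patent leaves the area's shape open), the distance-based check can be sketched as:

```python
import math

def knowledge_point_at(pos, knowledge_points, radius_m=5.0):
    """Return the preset knowledge point whose explanation area, modelled as
    a circle of radius_m around its spatial position, contains pos."""
    for name, (kx, ky) in knowledge_points.items():
        if math.hypot(pos[0] - kx, pos[1] - ky) <= radius_m:
            return name
    return None

points = {"bronze_vessel": (12.0, 30.0), "silk_painting": (40.0, 8.0)}
print(knowledge_point_at((13.0, 31.0), points))  # -> bronze_vessel
print(knowledge_point_at((70.0, 70.0), points))  # -> None
```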
  • In some embodiments, the method further includes at least one of the following: when the playing of the audio commentary data corresponding to any preset knowledge point ends, and/or when the display of the AR special-effect data ends, controlling the AR device to display AR guidance data for instructing the AR device to move to the explanation area corresponding to the next preset knowledge point;
  • the AR guidance data may be "please go straight for 50 meters and then turn right".
  • the audio guide data may be the audio content of "please go straight for 50 meters and then turn right”.
  • In implementation, the AR guidance data corresponding to the moving distance and/or moving direction is generated by analyzing the moving distance and/or moving direction required for the AR device to reach the explanation area of the next preset knowledge point. That is, controlling the AR device to display AR guidance data for instructing the AR device to move to the explanation area corresponding to the next preset knowledge point includes steps A1 to A3, wherein:
  • Step A1 determining the moving distance and/or moving direction of the AR device to the explaining area corresponding to the next preset knowledge point
  • Step A2 generating and displaying the AR guidance data based on the determined moving distance and/or moving direction;
  • Step A3 updating the AR guidance data according to the change of the moving distance and/or the moving direction; and controlling the AR device to display the updated AR guidance data.
  • In this way, the AR guidance data can be generated according to the determined moving distance and/or moving direction and updated as they change, making the display of the AR guidance data more flexible.
  • the AR device can clearly display the change of the moving distance and/or moving direction.
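Steps A1 to A3 can be sketched as a small update function that recomputes distance and direction to the next explanation area and emits new guidance only when a value has changed; the dictionary format is hypothetical:

```python
import math

def guidance_update(prev, pos, next_area_center):
    """Steps A1-A3: recompute distance and direction to the next explanation
    area; return updated guidance only when either value has changed."""
    dx = next_area_center[0] - pos[0]
    dy = next_area_center[1] - pos[1]
    current = {
        "distance_m": round(math.hypot(dx, dy), 1),
        "direction_deg": round(math.degrees(math.atan2(dx, dy)) % 360, 1),
    }
    return current if current != prev else None  # None -> keep current display

g = guidance_update(None, (0.0, 0.0), (0.0, 50.0))
print(g)                                            # 50 m due north
print(guidance_update(g, (0.0, 0.0), (0.0, 50.0)))  # -> None (no change)
```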
  • In some embodiments, the AR guidance data includes a guidance identifier for guiding the movement of the AR device, and updating the AR guidance data according to the change of the moving distance and/or moving direction includes at least one of the following: updating the size of the guidance identifier; updating the color of the guidance identifier; updating the shape of the guidance identifier; updating the flashing effect of the guidance identifier.
  • the size, color, shape, flashing effect, etc. of the guidance logo can be updated, and the updated content is relatively rich and diverse.
  • The guidance identifier included in the AR guidance data may include: a position identifier of the AR device, a moving direction identifier, a position identifier of the target explanation position, and the like.
  • For example, a camera pattern may represent the position identifier of the AR device, an arrow may represent the moving direction identifier, and a flag pattern may represent the position identifier of the target explanation position.
  • The size, color, shape, or flashing effect of the guidance identifier corresponding to each distance range can be determined.
  • For example, the size of the guidance identifier corresponding to the first distance range may be a first size, its color green, its shape a hexagon, with a flashing effect at a first flashing frequency.
  • The second distance range is 60 meters to 20 meters (including 20 meters, excluding 60 meters); the size of the guidance identifier corresponding to the second distance range may be a second size, its color yellow, its shape a pentagon, with a flashing effect at a second flashing frequency.
  • The third distance range is 20 meters to 0 meters (including 0 meters, excluding 20 meters); the size of the guidance identifier corresponding to the third distance range may be a third size, its color red, its shape a quadrangle, with a flashing effect at a third flashing frequency.
  • For example, when the moving distance of the AR device changes from the first distance range to the second distance range, the guidance identifier of the AR guidance data is updated from the first size to the second size, and updated AR guidance data is generated.
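The range-based styling described above amounts to a lookup table. The sketch below follows the bounds given in the text; the first range's bounds are not stated, so distances of 60 meters and above are assumed to fall into it:

```python
def guide_mark_style(distance_m):
    """Map the moving distance to guidance-identifier attributes.

    Ranges follow the text: third range [0, 20), second range [20, 60),
    and (by assumption) the first range 60 m and above.
    """
    if distance_m < 0:
        raise ValueError("distance must be non-negative")
    if distance_m < 20:   # third range: includes 0 m, excludes 20 m
        return {"size": "third", "color": "red", "shape": "quadrangle", "flash": "third"}
    if distance_m < 60:   # second range: includes 20 m, excludes 60 m
        return {"size": "second", "color": "yellow", "shape": "pentagon", "flash": "second"}
    return {"size": "first", "color": "green", "shape": "hexagon", "flash": "first"}
```

Crossing a range boundary (for example, from 21 m to 19 m) changes every attribute at once, which is the "update the guidance identifier" behavior the text describes.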
  • Alternatively, a linear relationship between the moving distance and the size or the flashing frequency may be predetermined. According to this linear relationship and the determined moving distance, the current size or current flashing frequency of the guidance identifier corresponding to the AR device is determined, and AR guidance data is then generated according to the current size and current flashing frequency. The AR guidance data can be updated as the moving distance changes, and the updated AR guidance data includes a guidance identifier matching the changed moving distance.
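The alternative linear mapping could look like the following. The endpoint values (maximum size and fastest flashing at 0 m, a 100 m reference distance) are illustrative assumptions, since the disclosure only states that the relationship is linear:

```python
def lerp_style(distance_m, max_distance_m=100.0,
               size_px=(24.0, 64.0), flash_hz=(0.5, 3.0)):
    """Linearly map the moving distance to a guidance-identifier size
    and flashing frequency: closer to the target -> larger and faster."""
    # t goes from 0 (at or beyond max_distance_m) to 1 (at the target)
    t = 1.0 - min(max(distance_m / max_distance_m, 0.0), 1.0)
    size = size_px[0] + t * (size_px[1] - size_px[0])
    freq = flash_hz[0] + t * (flash_hz[1] - flash_hz[0])
    return round(size, 1), round(freq, 2)
```

Regenerating the guidance data with the current distance each frame gives a smooth update instead of the stepped range-based one.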
  • The display direction of the moving direction identifier in the guidance identifier can be determined according to the moving direction, and AR guidance data including the moving direction identifier in that display direction can then be generated.
  • Fig. 5a shows the AR guidance data generated when the AR device is located at the first position indicated by the second positioning information.
  • The AR guidance data includes the position identifier 51 of the AR device, the moving direction identifier 52, the position identifier 53 of the target explanation position, a navigation route for moving from the first position to the explanation area corresponding to the next preset knowledge point, and the text guidance data "go straight for 50 meters, then turn right".
  • The AR guidance data can be updated according to changes in the moving distance and/or moving direction, and the resulting updated AR guidance data is shown in Fig. 5b.
  • Specifically, the position identifier 51 of the AR device in Fig. 5a is updated to the position identifier 51 in Fig. 5b, the moving direction identifier 52 in Fig. 5a is updated to the moving direction identifier 52 in Fig. 5b, and the position identifier 53 of the target explanation position in Fig. 5a is updated to the position identifier 53 in Fig. 5b.
  • The method further includes: during the process of playing the audio explanation data corresponding to any preset knowledge point and displaying the AR special effect data matching the audio explanation data, in response to determining, according to the second positioning information, that the AR device does not meet the audio playback condition corresponding to the audio explanation data and/or does not meet the special effect display condition corresponding to the AR special effect data, controlling the AR device to perform at least one of the following actions.
  • During this process, the second positioning information of the AR device may change. Accordingly, the AR device can be controlled to play first prompt information reminding the user that the explanation of the preset knowledge point currently being played has not finished, or to play second prompt information prompting the AR device to adjust its second positioning information.
  • The first prompt information is used to remind the user of the AR device that the explanation of the preset knowledge point has not finished.
  • For example, the first prompt information may be "knowledge point A has not been explained completely; please continue to view".
  • The second prompt information may include prompt information instructing the AR device to move toward a direction that satisfies the explanation condition corresponding to the preset knowledge point, so that after the second positioning information of the AR device is adjusted according to the prompt information, the AR device can continue to play the audio explanation data corresponding to that preset knowledge point and display the corresponding AR special effect data.
  • Alternatively, the second prompt information may include prompt information instructing the AR device to go to the explanation area corresponding to the next preset knowledge point, so that after the second positioning information is adjusted according to the prompt information, the AR device can play the audio explanation data corresponding to the next preset knowledge point and display the corresponding AR special effect data.
  • In implementation, the AR device can first be controlled to display prompt information instructing it to move toward a position satisfying the explanation condition corresponding to the current preset knowledge point. If the AR device does not move according to the prompt information after it is displayed, and/or the second positioning information of the AR device fails to meet the explanation condition for longer than a set duration threshold, the AR device is then controlled to display prompt information instructing it to go to the explanation area corresponding to the next preset knowledge point.
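The escalation between the two kinds of prompt information could be sketched as follows; the threshold value and all names are assumptions for illustration:

```python
def choose_prompt(meets_condition, seconds_unmet, moved_after_prompt,
                  duration_threshold_s=30):
    """Pick which prompt the AR device should display.

    First remind the user to move toward the current knowledge point;
    escalate to the next knowledge point's explanation area when the
    user does not move and/or the condition stays unmet too long.
    """
    if meets_condition:
        return None  # keep playing audio explanation and AR special effects
    if not moved_after_prompt or seconds_unmet > duration_threshold_s:
        return "go to the explanation area of the next preset knowledge point"
    return "move toward a position satisfying the current explanation condition"
```

The "and/or" in the text is modeled here as an inclusive `or`: either staying put after the prompt or exceeding the duration threshold triggers the escalated prompt.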
  • the method is applied to a client application platform, and the client application platform is a web page application platform or an applet application platform.
  • The method can be applied to a client application platform, which may be a web page application platform on the AR device or an applet application platform on the AR device.
  • The client application platform may also be an AR navigation application program on the AR device.
  • the method for displaying augmented reality data may include:
  • S602: Determine whether the position indicated by the first positioning information of the AR device is outside the target scene area and whether the AR device is located near the target scene area.
  • An adjacent area corresponding to the target scene area may be set, and when the AR device is located in the adjacent area, it is determined that the AR device is located near the target scene area. If the position indicated by the first positioning information of the AR device is outside the target scene area, and the AR device is located near the target scene area, go to step S603.
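With axis-aligned rectangular regions as a simplifying assumption (the disclosure does not specify the region shapes), the check in S602 could look like:

```python
def in_rect(pos, rect):
    """rect = (min_x, min_y, max_x, max_y); pos = (x, y)."""
    x, y = pos
    return rect[0] <= x <= rect[2] and rect[1] <= y <= rect[3]

def near_target_scene(pos, target_rect, adjacent_rect):
    """True when the device is outside the target scene area but inside
    the adjacent area set around it (the condition that leads to S603)."""
    return (not in_rect(pos, target_rect)) and in_rect(pos, adjacent_rect)
```

The adjacent area is assumed to enclose the target scene area, so a position inside the target area returns False and only the surrounding ring triggers the next step.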
  • If the positioning information of the AR device can be determined according to the scene image and the three-dimensional scene model, it is determined that the visual positioning is successful.
  • Control the AR device to play the audio guide data corresponding to the target scene area, and/or, based on the display position of the AR navigation data corresponding to the target scene area, control the AR device to display the AR navigation data.
  • The navigation application opening prompt information is used to prompt the user to invoke the navigation application installed on the AR device to generate navigation map information for reaching the target scene area from the position indicated by the first positioning information of the AR device.
  • the scene image collected by the AR device is acquired; based on the scene image and the constructed 3D scene model, the second positioning information of the AR device is determined.
  • Then the moving distance of the AR device to the explanation area corresponding to the next preset knowledge point can be determined; based on the determined moving distance, AR guidance data can be generated and displayed; the AR guidance data can be updated according to changes in the moving distance and/or moving direction; and the AR device can be controlled to display the updated AR guidance data.
  • When updating the AR guidance data, the size, color, shape, and flashing effect of the guidance identifiers in the AR guidance data may be updated.
  • The writing order of the steps does not imply a strict execution order or constitute any limitation on the implementation process; the specific execution order of each step should be determined by its function and possible internal logic.
  • An embodiment of the present disclosure also provides an augmented reality data display apparatus, as shown in FIG. 7, which includes:
  • a first obtaining module 701 configured to obtain first positioning information of the augmented reality (AR) device;
  • a second obtaining module 702 configured to, upon detecting that the AR device is located outside the target scene area, obtain navigation map information for instructing the AR device to reach the target scene area from the position indicated by the first positioning information;
  • a third obtaining module 703 configured to, upon detecting that the AR device is located in the target scene area, obtain second positioning information of the AR device in the target scene area; and
  • a first presentation module 704 configured to, in response to the position indicated by the second positioning information being within the explanation area corresponding to any preset knowledge point, play, through the AR device, audio explanation data matching that preset knowledge point and display AR special effect data matching the audio explanation data.
  • The apparatus further includes a second presentation module 705 configured to perform at least one of the following:
  • when it is detected that the distance between the AR device and the target scene area is greater than a set distance threshold, control the AR device to display the virtual model based on the determined display position of the virtual model corresponding to the target scene area;
  • when it is detected that the distance between the AR device and the target scene area is greater than the set distance threshold, control the AR device to display direction guide information, the direction guide information being used to guide the AR device to the target scene area.
  • The apparatus further includes a third presentation module 706 configured to perform at least one of the following:
  • when it is detected that the distance between the AR device and the target scene area is less than or equal to the set distance threshold, control the AR device to play the audio guide data corresponding to the target scene area; and/or, when it is detected that the distance between the AR device and the target scene area is less than or equal to the set distance threshold, control the AR device to display the AR navigation data based on the display position of the AR navigation data corresponding to the target scene area.
  • When obtaining the navigation map information for instructing the AR device to reach the target scene area from the position indicated by the first positioning information, the second obtaining module 702 is configured to:
  • The third obtaining module 703 is configured to: determine the second positioning information of the AR device.
  • The third obtaining module 703 is further configured to: when it is detected that the AR device is located outside the target scene area and within the adjacent area corresponding to the target scene area, acquire the scene image collected by the AR device.
  • The apparatus further includes a fourth presentation module 707 configured to perform at least one of the following:
  • When controlling the AR device to display the AR guidance data for instructing the AR device to move to the explanation area corresponding to the next preset knowledge point, the fourth presentation module 707 is configured to:
  • The AR guidance data includes a guidance identifier for guiding the movement of the AR device; when updating the AR guidance data according to changes in the moving distance and/or moving direction, the fourth presentation module 707 is configured to perform at least one of the following:
  • The apparatus further includes a fifth presentation module 708 configured to:
  • during the process of playing the audio explanation data corresponding to any preset knowledge point and displaying AR special effect data matching the audio explanation data, in response to determining, based on the second positioning information, that the AR device does not meet the audio playback condition corresponding to the audio explanation data and/or the special effect display condition corresponding to the AR special effect data, control the AR device to perform the corresponding action.
  • the method is applied to a client application platform, and the client application platform is a web page application platform or an applet application platform.
  • The functions of the apparatus provided by the embodiments of the present disclosure, or the modules it includes, can be used to execute the methods described in the above method embodiments; for implementation details, refer to the descriptions of the above method embodiments, which are not repeated here for brevity.
  • an embodiment of the present disclosure also provides an electronic device.
  • FIG. 8 it is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure, including a processor 801 , a memory 802 , and a bus 803 .
  • The memory 802 is configured to store execution instructions and includes an internal memory 8021 and an external memory 8022. The internal memory 8021, also called main memory, temporarily stores operation data of the processor 801 and data exchanged with the external memory 8022, such as a hard disk; the processor 801 exchanges data with the external memory 8022 through the internal memory 8021.
  • the processor 801 communicates with the memory 802 through the bus 803, so that the processor 801 executes the following instructions:
  • upon detecting that the AR device is located outside the target scene area, acquiring navigation map information for instructing the AR device to reach the target scene area from the position indicated by the first positioning information;
  • an embodiment of the present disclosure also provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is run by a processor, the augmented reality data display method described in the above-mentioned method embodiments is executed.
  • the storage medium may be a volatile or non-volatile computer-readable storage medium.
  • An embodiment of the present disclosure also provides a computer program that includes computer-readable code; when the computer-readable code runs in an electronic device, the processor of the electronic device executes the augmented reality data display method described in any of the above embodiments.
  • Embodiments of the present disclosure also provide another computer program product; the computer program product carries program code, and the instructions included in the program code can be used to execute the steps of the method for displaying augmented reality data described in the above method embodiments. For details, refer to the foregoing method embodiments, which are not repeated here.
  • the above-mentioned computer program product may be specifically implemented by means of hardware, software or a combination thereof.
  • the computer program product may be embodied as a computer storage medium, and in other embodiments, the computer program product may be embodied as a software product, such as a software development kit (Software Development Kit, SDK) and the like.
  • the device involved in the embodiments of the present disclosure may be at least one of a system, a method, and a computer program product.
  • a computer program product may include a computer readable storage medium having computer readable program instructions thereon for causing a processor to implement various aspects of the present disclosure.
  • a computer readable storage medium may be a tangible device that can retain and store instructions for use by an instruction execution device.
  • a computer readable storage medium may be, for example, but is not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • The computer-readable program instructions described herein can be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device over at least one of a network such as the Internet, a local area network, a wide area network, and a wireless network.
  • A network adapter card or a network interface in each computing/processing device receives computer-readable program instructions from the network and forwards them for storage in a computer-readable storage medium in that computing/processing device.
  • Computer program instructions for performing the operations of the present disclosure may be assembly instructions, Industry Standard Architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages. The computer-readable program instructions can be executed entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
  • Electronic circuits, such as programmable logic circuits, field-programmable gate arrays (FPGAs), or programmable logic arrays (PLAs), can be customized using state information of the computer-readable program instructions, and these electronic circuits can execute the computer-readable program instructions, thereby implementing various aspects of the present disclosure.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some communication interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present disclosure may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • If the functions are implemented in the form of software functional units and sold or used as independent products, they can be stored in a non-volatile computer-readable storage medium executable by a processor.
  • The technical solution of the present disclosure, in essence, or the part contributing to the prior art, or a part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the various embodiments of the present disclosure.
  • The aforementioned storage media include: a USB flash drive, a mobile hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and other media that can store program code.
  • Embodiments of the present disclosure provide a method, apparatus, device, medium, and program for displaying augmented reality data, wherein the method is executed by an electronic device and includes: acquiring first positioning information of an augmented reality (AR) device; upon detecting that the AR device is located outside a target scene area, acquiring navigation map information for instructing the AR device to reach the target scene area from the position indicated by the first positioning information; upon detecting that the AR device is located in the target scene area, acquiring second positioning information of the AR device in the target scene area; and, in response to the position indicated by the second positioning information being within the explanation area corresponding to any preset knowledge point, playing, through the AR device, audio explanation data matching that preset knowledge point and displaying AR special effect data matching the audio explanation data.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Geometry (AREA)
  • Architecture (AREA)
  • Remote Sensing (AREA)
  • Navigation (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to embodiments, the present disclosure relates to a method and apparatus for presenting augmented reality data, and a device, medium, and program. The method comprises: acquiring first positioning information of an augmented reality (AR) device; upon detecting that the AR device is located outside a target scene area, acquiring navigation map information used to indicate that the AR device reaches the target scene area from the position indicated by the first positioning information; upon detecting that the AR device is located within the target scene area, acquiring second positioning information of the AR device in the target scene area; and, in response to the position indicated by the second positioning information being within an explanation area corresponding to any preset knowledge point, playing, by means of the AR device, audio explanation data corresponding to that preset knowledge point, and presenting AR special effect data corresponding to the audio explanation data.
PCT/CN2022/085935 2021-06-25 2022-04-08 Procédé et appareil de présentation de données de réalité augmentée, et dispositif, support et programme WO2022267626A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110711276.0A CN113345108B (zh) 2021-06-25 2021-06-25 增强现实数据展示方法、装置、电子设备及存储介质
CN202110711276.0 2021-06-25

Publications (1)

Publication Number Publication Date
WO2022267626A1 true WO2022267626A1 (fr) 2022-12-29

Family

ID=77478866

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/085935 WO2022267626A1 (fr) 2021-06-25 2022-04-08 Procédé et appareil de présentation de données de réalité augmentée, et dispositif, support et programme

Country Status (2)

Country Link
CN (1) CN113345108B (fr)
WO (1) WO2022267626A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117473592A (zh) * 2023-12-27 2024-01-30 青岛创新奇智科技集团股份有限公司 一种基于工业大模型的数据展示方法及装置
CN117911655A (zh) * 2024-03-19 2024-04-19 山东省国土测绘院 基于在实景三维地图上增强现实的方法及***

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113345108B (zh) * 2021-06-25 2023-10-20 北京市商汤科技开发有限公司 增强现实数据展示方法、装置、电子设备及存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110779520A (zh) * 2019-10-21 2020-02-11 腾讯科技(深圳)有限公司 导航方法及装置、电子设备和计算机可读存储介质
US10871377B1 (en) * 2019-08-08 2020-12-22 Phiar Technologies, Inc. Computer-vision based positioning for augmented reality navigation
CN112179331A (zh) * 2020-09-23 2021-01-05 北京市商汤科技开发有限公司 Ar导航的方法、装置、电子设备及存储介质
CN112684894A (zh) * 2020-12-31 2021-04-20 北京市商汤科技开发有限公司 增强现实场景的交互方法、装置、电子设备及存储介质
CN113345107A (zh) * 2021-06-25 2021-09-03 北京市商汤科技开发有限公司 增强现实数据展示方法、装置、电子设备及存储介质
CN113345108A (zh) * 2021-06-25 2021-09-03 北京市商汤科技开发有限公司 增强现实数据展示方法、装置、电子设备及存储介质

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10573037B2 (en) * 2012-12-20 2020-02-25 Sri International Method and apparatus for mentoring via an augmented reality assistant
CN110703922A (zh) * 2019-10-22 2020-01-17 成都中科大旗软件股份有限公司 一种旅游景区专用电子地图导游方法
CN111638796A (zh) * 2020-06-05 2020-09-08 浙江商汤科技开发有限公司 虚拟对象的展示方法、装置、计算机设备及存储介质
CN111640171B (zh) * 2020-06-10 2023-09-01 浙江商汤科技开发有限公司 一种历史场景的讲解方法、装置、电子设备及存储介质
CN112348969B (zh) * 2020-11-06 2023-04-25 北京市商汤科技开发有限公司 增强现实场景下的展示方法、装置、电子设备及存储介质
CN112950790A (zh) * 2021-02-05 2021-06-11 深圳市慧鲤科技有限公司 路线导航方法、装置、电子设备及存储介质

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10871377B1 (en) * 2019-08-08 2020-12-22 Phiar Technologies, Inc. Computer-vision based positioning for augmented reality navigation
CN110779520A (zh) * 2019-10-21 2020-02-11 腾讯科技(深圳)有限公司 导航方法及装置、电子设备和计算机可读存储介质
CN112179331A (zh) * 2020-09-23 2021-01-05 北京市商汤科技开发有限公司 Ar导航的方法、装置、电子设备及存储介质
CN112684894A (zh) * 2020-12-31 2021-04-20 北京市商汤科技开发有限公司 增强现实场景的交互方法、装置、电子设备及存储介质
CN113345107A (zh) * 2021-06-25 2021-09-03 北京市商汤科技开发有限公司 增强现实数据展示方法、装置、电子设备及存储介质
CN113345108A (zh) * 2021-06-25 2021-09-03 北京市商汤科技开发有限公司 增强现实数据展示方法、装置、电子设备及存储介质

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117473592A (zh) * 2023-12-27 2024-01-30 青岛创新奇智科技集团股份有限公司 一种基于工业大模型的数据展示方法及装置
CN117473592B (zh) * 2023-12-27 2024-05-14 青岛创新奇智科技集团股份有限公司 一种基于工业大模型的数据展示方法及装置
CN117911655A (zh) * 2024-03-19 2024-04-19 山东省国土测绘院 基于在实景三维地图上增强现实的方法及***
CN117911655B (zh) * 2024-03-19 2024-05-28 山东省国土测绘院 基于在实景三维地图上增强现实的方法及***

Also Published As

Publication number Publication date
CN113345108B (zh) 2023-10-20
CN113345108A (zh) 2021-09-03

Similar Documents

Publication Publication Date Title
WO2022267626A1 (fr) Procédé et appareil de présentation de données de réalité augmentée, et dispositif, support et programme
US10499002B2 (en) Information processing apparatus and information processing method
US11163997B2 (en) Methods and apparatus for venue based augmented reality
CN111065891B (zh) 基于增强现实的室内导航***
US10677596B2 (en) Image processing device, image processing method, and program
US9324298B2 (en) Image processing system, image processing apparatus, storage medium having stored therein image processing program, and image processing method
WO2016017253A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
US10733798B2 (en) In situ creation of planar natural feature targets
TWI783472B (zh) Ar場景內容的生成方法、展示方法、電子設備及電腦可讀儲存介質
JP7150894B2 (ja) Arシーン画像処理方法及び装置、電子機器並びに記憶媒体
WO2020114214A1 (fr) Procédé et appareil de guidage de non-voyant, support d'informations et dispositif électronique
WO2022252688A1 (fr) Appareil et procédé de présentation de donnés de réalité augmentée, dispositif électronique et support d'enregistrement
CN113345107A (zh) 增强现实数据展示方法、装置、电子设备及存储介质
TW201126451A (en) Augmented-reality system having initial orientation in space and time and method
KR101914660B1 (ko) 자이로 센서를 기반으로 증강현실 컨텐츠의 표시를 제어하는 방법 및 그 장치
KR102443049B1 (ko) 전자 장치 및 그 동작 방법
Hew et al. Markerless Augmented Reality for iOS Platform: A University Navigational System
KR20180055764A (ko) 지형정보 인식을 기반으로 증강현실 오브젝트를 표시하는 방법 및 그 장치
JP2015121892A (ja) 画像処理装置、画像処理方法
KR102421370B1 (ko) 위치 기반 ar 서비스에서 지도와 ar 상의 poi 하이라이팅을 연동하는 방법 및 시스템
US20240087157A1 (en) Image processing method, recording medium, image processing apparatus, and image processing system
Giannakidis et al. Hacking Visual Positioning Systems to Scale the Software Development of Augmented Reality Applications for Urban Settings
KR20190006584A (ko) 지형정보 인식을 기반으로 증강현실 오브젝트를 표시하는 방법 및 그 장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22827129

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22827129

Country of ref document: EP

Kind code of ref document: A1