WO2023124376A1 - AR-based wireless network simulation method, system, terminal and storage medium - Google Patents

AR-based wireless network simulation method, system, terminal and storage medium

Info

Publication number
WO2023124376A1
WO2023124376A1 · PCT/CN2022/124687 · CN2022124687W
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
simulation
wireless network
data
grid
Prior art date
Application number
PCT/CN2022/124687
Other languages
English (en)
French (fr)
Inventor
刘海洋 (Liu Haiyang)
Original Assignee
中兴通讯股份有限公司 (ZTE Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 (ZTE Corporation)
Publication of WO2023124376A1

Links

Images

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 16/00: Network planning, e.g. coverage or traffic planning tools; network deployment, e.g. resource partitioning or cell structures
    • H04W 16/18: Network planning tools
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 24/00: Supervisory, monitoring or testing arrangements
    • H04W 24/06: Testing, supervising or monitoring using simulated traffic
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00: Reducing energy consumption in communication networks
    • Y02D 30/70: Reducing energy consumption in communication networks in wireless communication networks

Definitions

  • the present disclosure relates to the technical field of computer communication, and in particular to an AR-based wireless network simulation method, system, terminal and storage medium.
  • Simulation technology is very important in the design of modern communication networks. At present, in the process of wireless network planning, it is usually necessary to use the simulation system to simulate the overall performance of the wireless network, and plan the layout of the wireless network according to the simulation results.
  • the main purpose of the embodiments of the present disclosure is to provide an AR-based wireless network simulation method, system, terminal and storage medium.
  • an embodiment of the present disclosure provides an AR-based wireless network simulation method, including: acquiring a simulation result, and analyzing the simulation result to obtain raster data; and superimposing the raster data on the terminal in the captured image for display.
  • an embodiment of the present disclosure further provides a terminal, the terminal including a processor, a memory, a computer program stored on the memory and executable by the processor, and a data bus for connecting and communicating between the processor and the memory, wherein when the computer program is executed by the processor, the steps of any AR-based wireless network simulation method provided in the present disclosure are implemented.
  • an embodiment of the present disclosure further provides an AR-based wireless network simulation system, including: a terminal, configured to send a preset simulation radius and simulation parameters to a server; and the server, configured to perform simulation calculation, based on the simulation parameters, on the base stations within the preset simulation radius to obtain a simulation result, and to feed the simulation result back to the terminal; wherein the terminal is further configured to perform, based on the received simulation result, the steps of any AR-based wireless network simulation method provided in the present disclosure.
  • an embodiment of the present disclosure further provides a storage medium for computer-readable storage, the storage medium storing one or more programs, the one or more programs being executable by one or more processors so as to implement the steps of any AR-based wireless network simulation method provided in the present disclosure.
  • FIG. 1 is a schematic flowchart of an AR-based wireless network simulation method provided by an embodiment of the present disclosure
  • FIG. 2 is a schematic flow chart of the sub-steps of the AR-based wireless network simulation method in FIG. 1;
  • FIG. 3 is a schematic plan view of screening data based on a viewing frustum model provided by an embodiment of the present disclosure;
  • FIG. 4 is a stereoscopic schematic diagram of filtering data based on a frustum model provided by an embodiment of the present disclosure
  • FIG. 5 is a schematic diagram of screening data based on a frustum overlapping algorithm provided by an embodiment of the present disclosure
  • FIG. 6 is a schematic diagram of converting raster data into screen color blocks provided by an embodiment of the present disclosure
  • FIG. 7 is a schematic structural block diagram of an AR-based wireless network simulation system provided by an embodiment of the present disclosure.
  • FIG. 8 is a schematic structural block diagram of a terminal provided by an embodiment of the present disclosure.
  • Wireless network emulation helps to fine-tune the deployment of wireless networks, reduce network construction costs, and provide customers with high-quality solutions.
  • however, related wireless network simulation solutions are all large-scale software running on the computer side, and the simulation results can only be viewed on an electronic map, which is not intuitive enough.
  • Embodiments of the present disclosure provide an AR-based wireless network simulation method, system, terminal, and storage medium.
  • the AR-based wireless network simulation method can be applied to mobile terminals, and the mobile terminals can be electronic devices such as mobile phones, tablet computers, notebook computers, desktop computers, personal digital assistants and wearable devices.
  • FIG. 1 is a schematic flowchart of an AR-based wireless network simulation method provided by an embodiment of the present disclosure.
  • the AR-based wireless network simulation method includes steps S101 to S102.
  • Step S101 obtaining a simulation result, and analyzing the simulation result to obtain raster data.
  • Before the terminal obtains the simulation result, it needs to send the preset simulation radius and simulation parameters to the server; the server then performs simulation calculation, based on the simulation parameters, on the base stations within the preset simulation radius to obtain the simulation result.
  • the simulation calculation of the present disclosure adopts a scheme combining the front and back ends: the terminal performs the simulation configuration, the server side performs the simulation calculation, and the terminal and the server communicate data through a wireless network.
  • Based on the terminal's GPS, the latitude, longitude and altitude of the current location are obtained, the simulation radius and other simulation parameters are set, and the simulation parameters are transmitted to the server through the wireless communication unit. If the simulation parameters are wrong, the server prompts the terminal to make corrections; if the simulation parameters are correct, the server searches for base stations within the simulation radius and performs simulation calculation in combination with the high-precision map of the corresponding range to obtain the simulation result.
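The server-side validation and base-station search just described can be sketched as follows. This is an illustrative Python sketch, not code from the patent: the function name, the dictionary-based response, and the equirectangular distance approximation are all assumptions.

```python
import math

def handle_simulation_request(lat, lon, alt, radius_m, base_stations):
    """Server-side sketch: validate the simulation parameters, then pick
    the base stations within the requested simulation radius of the
    terminal's reported position."""
    if not (-90 <= lat <= 90 and -180 <= lon <= 180) or radius_m <= 0:
        # Mirrors the patent's "prompt the terminal to make corrections".
        return {"status": "error", "message": "invalid simulation parameters"}

    def ground_distance_m(lat1, lon1, lat2, lon2):
        # Equirectangular approximation; adequate for radii of a few km.
        r = 6_371_000.0
        x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
        y = math.radians(lat2 - lat1)
        return r * math.hypot(x, y)

    selected = [bs for bs in base_stations
                if ground_distance_m(lat, lon, bs["lat"], bs["lon"]) <= radius_m]
    return {"status": "ok", "base_stations": selected}
```

In the patent's flow, the server would then run the simulation over `selected` together with the high-precision map of the corresponding range.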
  • each raster data contains information such as longitude, latitude, altitude, network strength, and grid size.
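A raster record carrying the fields listed above might be modelled as a small data class; the field names and units below are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class RasterCell:
    longitude: float         # degrees
    latitude: float          # degrees
    altitude: float          # metres above sea level
    network_strength: float  # e.g. RSRP in dBm (assumed metric)
    grid_size: float         # edge length of the grid cell in metres
```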
  • Step S102 superimposing the grid data on the captured image of the terminal for display.
  • After receiving the raster data of the simulation result, the terminal analyzes the raster data and calls the AR function to superimpose the raster data on the captured image of the terminal for display.
  • step S102 includes: sub-step S1021 to sub-step S1022.
  • Sub-step S1021 filtering the raster data to obtain filtered raster data.
  • Before the raster data is superimposed on the captured image of the terminal for display, the raster data may also be screened according to the position parameter of the terminal and/or the field-of-view parameter of the camera of the terminal, for example by filtering the raster data according to a first viewing frustum model.
  • a first position parameter of the terminal and a first field of view parameter of the camera of the terminal are acquired, and a first view frustum model is established based on the first position parameter and the first field of view parameter.
  • the first location parameter of the terminal includes: latitude and longitude where the terminal is located, altitude where the terminal is located, azimuth where the terminal is located, and an inclination angle of the terminal.
  • the terminal acquires the latitude and longitude by calling GPS data, the altitude data by calling barometer data, the orientation by calling magnetometer data, and the inclination by calling gyroscope data.
  • the first field of view parameter of the camera of the terminal includes: the field of view of the camera.
  • the terminal acquires the FOV of the camera by reading the configuration file.
  • A viewing frustum model is established with horizontal and vertical field-of-view angles of FOV + 2×A, a near-plane distance of 0, and a far-plane distance of L; it is recorded as the first viewing frustum model (frustum1).
  • The raster data is screened according to the first viewing frustum model: the raster data inside the first viewing frustum model can be searched for based on a preset frustum culling algorithm to obtain the screened raster data.
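A minimal plan-view sketch of such a frustum test (in the spirit of the schematic plan view of FIG. 3) might look like the following. The 2-D simplification, the local east/north coordinates in metres, and the function names are assumptions, not the patent's actual algorithm.

```python
import math

def inside_frustum(point_xy, cam_xy, azimuth_deg, fov_deg, a_deg, far_m):
    """2-D plan-view sketch of the frustum test: keep a point when it lies
    within distance far_m of the camera and within half of the widened
    field of view (FOV + 2*A) of the viewing azimuth."""
    dx = point_xy[0] - cam_xy[0]
    dy = point_xy[1] - cam_xy[1]
    if math.hypot(dx, dy) > far_m:
        return False  # beyond the far plane L
    bearing = math.degrees(math.atan2(dx, dy)) % 360   # 0 deg = north (+y)
    diff = (bearing - azimuth_deg + 180) % 360 - 180   # signed angle difference
    return abs(diff) <= (fov_deg + 2 * a_deg) / 2

def cull(cells, cam_xy, azimuth_deg, fov_deg, a_deg, far_m):
    # Keep only the raster cells inside the widened frustum (frustum1).
    return [c for c in cells
            if inside_frustum(c, cam_xy, azimuth_deg, fov_deg, a_deg, far_m)]
```

A production implementation would test against all six planes of a 3-D frustum built from the full camera pose (azimuth plus inclination); the sketch keeps only the horizontal cross-section for clarity.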
  • FIG. 5 is a schematic diagram of the overlapping portion of the first viewing frustum model and the second viewing frustum model.
  • A new viewing frustum model is generated based on the current position and the pose of the camera; it is denoted as the second viewing frustum model (frustum2), and the raster data can be filtered according to the second viewing frustum model to obtain the filtered raster data.
  • A is a preset angle
  • D is a preset distance
  • the parameters of A and D can be set based on the scene, which is not limited in the present disclosure.
  • the simulation results can be screened again by using the frustum overlapping algorithm.
  • First, based on a preset frustum culling algorithm, the raster data overlapping both the first viewing frustum model and the second viewing frustum model is screened out to obtain first raster data; then, based on the preset frustum culling algorithm, the raster data outside the overlapping range of the first and second viewing frustum models but inside the second viewing frustum model is screened out to obtain second raster data; finally, the filtered raster data is obtained from the first raster data and the second raster data.
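The two-pass overlap screening can be expressed with simple set logic. In this sketch, `in_f1` and `in_f2` stand for hypothetical membership tests against frustum1 and frustum2; the function name and return shape are assumptions.

```python
def rescreen(cells, in_f1, in_f2):
    """Sketch of the frustum-overlap re-screening: cells in both frustums
    (and their colour blocks) are reused; cells only in the new frustum are
    newly rendered; cells only in the old frustum are dropped."""
    keep   = [c for c in cells if in_f1(c) and in_f2(c)]      # "first raster data"
    add    = [c for c in cells if in_f2(c) and not in_f1(c)]  # "second raster data"
    remove = [c for c in cells if in_f1(c) and not in_f2(c)]  # colour blocks to delete
    return keep + add, remove
```

Reusing `keep` is what saves work: only `add` needs fresh rendering and only `remove` needs deletion, rather than redrawing every cell in frustum2.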
  • Sub-step S1022 superimposing the screened raster data on the captured image of the terminal for display.
  • Each grid in the screened grid data is rendered as a color block on the display screen of the terminal, and the color block is superimposed on the captured image of the terminal for display.
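A toy sketch of this grid-to-color-block conversion might look like the following. The dBm range, the red-to-green gradient, and the flat pixels-per-metre projection are illustrative assumptions; a real AR renderer would use the full camera projection.

```python
def strength_to_color(strength_dbm, lo=-120.0, hi=-60.0):
    """Map network strength onto a red-to-green gradient (RGB tuple).
    The dBm range is an illustrative assumption, not from the patent."""
    t = max(0.0, min(1.0, (strength_dbm - lo) / (hi - lo)))
    return (int(255 * (1 - t)), int(255 * t), 0)

def cell_to_screen(world_xy, grid_size_m, cam_xy, px_per_m):
    """Toy projection: world offsets from the camera become screen pixels,
    and the physical grid size becomes a square pixel size."""
    sx = int((world_xy[0] - cam_xy[0]) * px_per_m)
    sy = int((world_xy[1] - cam_xy[1]) * px_per_m)
    return (sx, sy), int(grid_size_m * px_per_m)
```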
  • FIG. 6 is a schematic diagram of converting raster data into an image on the screen provided by an embodiment of the present disclosure.
  • When the terminal pose changes, a new viewing frustum model needs to be established based on the current position and camera pose, so the raster data needs to be re-screened.
  • The overlapping data can be screened based on the previous viewing frustum model and the current viewing frustum model: the grid data of the overlapping part and its corresponding color blocks are retained; the data outside the overlap that belongs only to the first viewing frustum, together with the corresponding color blocks in AR, is deleted; and the raster data of the non-overlapping part of the second viewing frustum model is added and processed into color blocks in AR. In this way, when the position of the terminal is displaced or the angle of the camera changes, the color blocks in the overlapping part of the first and second viewing frustum models can be reused, improving the efficiency of the AR display.
  • the terminal in the embodiments of the present disclosure may or may not have the AR capability.
  • the display of the simulation result is completed through the AR capability of the terminal itself.
  • the terminal connects to and invokes other devices with AR capability to complete the display of the simulation results.
  • Other devices with AR capabilities include, but are not limited to: other smartphones with AR capabilities, AR glasses, AR helmets, etc.
  • the AR-based wireless network simulation method provided in the above embodiments displays the simulation results based on the AR technology, which greatly improves the user experience.
  • The frustum culling algorithm is used to filter the simulation results based on the pose of the terminal, which reduces the performance pressure on the terminal; when the terminal pose changes, the frustum overlapping algorithm is used to screen the simulation results again, which greatly reduces the amount of calculation for screening data. Furthermore, by adopting a scheme combining the front and back ends during the simulation, with simulation configuration on the terminal and simulation calculation on the server side, the simulation speed is greatly improved.
  • the terminal is a smart phone based on iOS or Android system, and the smart phone has AR capability and a camera for AR display.
  • the server is a server with simulation computing capabilities, and communicates with the smart phone through a wireless network.
  • Step 1 Obtain the latitude, longitude and altitude of the current location through the GPS on the mobile phone, and set the simulation radius and other simulation parameters.
  • Step 2 The mobile terminal transmits the simulation parameters to the server. If the simulation parameters are wrong, the user is prompted to correct them. If the simulation parameters are correct, the server searches for base stations within the simulation range, and performs simulation calculations in combination with the high-precision map within the corresponding range.
  • Step 3 After the simulation calculation is completed, the server sends the simulation results to the client in text form.
  • the text content is raster data, and each raster contains information such as longitude, latitude, altitude, network strength, and raster size.
  • Step 4 The mobile terminal downloads the simulation results, and updates the download progress in the user interface in real time. After the download is completed, the simulation results are analyzed and stored in the database.
  • Step 5 The mobile phone calls GPS data to obtain the latitude and longitude, barometer data to obtain the altitude, magnetometer data to obtain the orientation, and gyroscope data to obtain the inclination, and reads the configuration file to obtain the FOV of the camera. Based on the position parameters and field-of-view parameters, the frustum model frustum1 is established with horizontal and vertical field-of-view angles of FOV + 2×A, a near-plane distance of 0, and a far-plane distance of L, and the frustum culling algorithm is used to find the raster data inside the frustum model.
  • A is a preset angle
  • L is a preset distance.
  • For example, the value may be 100 meters or 200 meters; L can be determined according to how far the simulation results can be seen clearly through the camera lens.
  • the values of A and L are determined according to practical applications, which are not limited in the present disclosure.
  • Step 6 The mobile terminal calls the AR function to display the screened raster data in AR: it converts the physical world coordinates composed of the latitude, longitude and altitude of each grid into screen coordinates, converts the grid size into a pixel size on the screen, and converts the network strength into different gradient colors; each grid is rendered as a color block on the screen, and the color blocks are superimposed on the real-time picture captured by the mobile phone camera for AR display.
  • Step 7 Monitor the rotation angle of the camera on the mobile phone and the position of the mobile phone.
  • When the rotation angle of the camera is greater than or equal to A/2, or the displacement of the mobile phone exceeds D, a new frustum model frustum2 is generated based on the current position and the pose of the camera; the overlapping part of frustum1 and frustum2 is calculated, the raster data of the overlapping part is retained, the data outside the overlap of frustum1 and the corresponding color blocks in AR are deleted, and the data outside the overlap of frustum2 is added and processed into color blocks in AR.
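The trigger condition monitored in Step 7 reduces to a small predicate; the parameter names below are illustrative.

```python
def needs_new_frustum(rot_delta_deg, move_delta_m, a_deg, d_m):
    """Rebuild-trigger sketch for Step 7: a new frustum is generated when
    the camera has rotated by at least A/2 or the phone has moved beyond
    the preset distance D."""
    return rot_delta_deg >= a_deg / 2 or move_delta_m > d_m
```

Using A/2 rather than A as the rotation threshold means the frustum is rebuilt before cells near the widened edge (FOV + 2×A) can leave the view, which is presumably why the field of view is padded in the first place.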
  • the terminal is a smart phone based on iOS or Android system, and the smart phone does not have AR capabilities.
  • the smart phone is connected to an AR helmet and can call the AR helmet for AR display.
  • Step 1 Obtain the latitude, longitude and altitude of the current location through the GPS on the mobile phone, and set the simulation radius and other simulation parameters.
  • Step 2 The mobile terminal transmits the simulation parameters to the server. If the simulation parameters are wrong, the user is prompted to correct them. If the simulation parameters are correct, the server searches for base stations within the simulation range, and performs simulation calculations in combination with the high-precision map within the corresponding range.
  • Step 3 After the simulation calculation is completed, the server sends the simulation results to the client in text form.
  • the text content is raster data, and each raster contains information such as longitude, latitude, altitude, network strength, and raster size.
  • Step 4 The mobile terminal downloads the simulation results, and updates the download progress in the user interface in real time. After the download is completed, the simulation results are analyzed and stored in the database.
  • Step 5 The mobile phone calls GPS data to obtain the latitude and longitude, barometer data to obtain the altitude, magnetometer data to obtain the orientation, and gyroscope data to obtain the inclination, and obtains the field of view (FOV) of the AR helmet's camera. Based on the position parameters and field-of-view parameters, the frustum model frustum1 is established with horizontal and vertical field-of-view angles of FOV + 2×A, a near-plane distance of 0, and a far-plane distance of L, and the frustum culling algorithm is used to find the raster data inside the frustum model.
  • Step 6 The mobile terminal calls the AR function to display the screened raster data in AR: it converts the physical world coordinates composed of the latitude, longitude and altitude of each grid into screen coordinates, converts the grid size into a pixel size on the screen, and converts the network strength into different gradient colors; each grid is rendered as a color block on the screen, and the color blocks are superimposed on the real-time picture captured by the camera of the AR helmet for AR display.
  • Step 7 The mobile phone monitors the rotation angle of the camera of the AR helmet and the position of the mobile phone.
  • When the rotation angle of the camera is greater than or equal to A/2, or the displacement of the mobile phone exceeds D, a new frustum model frustum2 is generated based on the current position of the mobile phone and the camera pose of the AR helmet; the overlapping part of frustum1 and frustum2 is calculated, the raster data of the overlapping part is retained, the data outside the overlap of frustum1 and the corresponding color blocks in AR are deleted, and the data outside the overlap of frustum2 is added and processed into color blocks in AR.
  • FIG. 7 is a schematic structural block diagram of an AR-based wireless network simulation system provided by an embodiment of the present disclosure.
  • the system 300 includes: a terminal 301 and a server 302 .
  • the terminal 301 is configured to send a preset simulation radius and simulation parameters to the server.
  • the server 302 is configured to perform simulation calculation on the base stations within the preset simulation radius range based on the simulation parameters to obtain a simulation result, and feed back the simulation result to the terminal.
  • the terminal 301 is further configured to execute the steps of any AR-based wireless network simulation method provided by the embodiments of the present disclosure based on the received simulation result.
  • the AR-based wireless network simulation system 300 further includes: a wireless communication unit 303 , through which the terminal 301 and the server 302 perform data communication.
  • The terminal 301 can be divided into: a simulation result analysis unit 3011, configured to analyze the raster data in the text and store it in the database; a simulation result screening unit 3012, configured to filter out the raster data to be displayed; and an AR display unit 3013, configured to display the simulation results based on AR.
  • FIG. 8 is a schematic structural block diagram of a terminal provided by an embodiment of the present disclosure.
  • the terminal 400 includes a processor 401 and a memory 402, and the processor 401 and the memory 402 are connected through a bus 403, such as an I2C (Inter-integrated Circuit) bus.
  • the processor 401 is used to provide computing and control capabilities to support the operation of the entire terminal device.
  • The processor 401 can be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
  • the general-purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the memory 402 can be a Flash chip, a read-only memory (ROM), a magnetic disk, an optical disk, a USB flash drive, or a removable hard disk.
  • FIG. 8 is only a block diagram of a partial structure related to the solution of the embodiment of the present disclosure, and does not constitute a limitation on the terminal device to which the solution of the embodiment of the disclosure is applied.
  • The terminal device may include more or fewer components than shown in the figure, or combine certain components, or have a different arrangement of components.
  • the processor is configured to run a computer program stored in the memory, and implement any one of the AR-based wireless network simulation methods provided in the embodiments of the present disclosure when the computer program is executed.
  • the processor is configured to run a computer program stored in a memory, and implement the following steps when executing the computer program: obtain a simulation result, and analyze the simulation result to obtain raster data; and superimpose the raster data on the captured image of the terminal for display.
  • Before superimposing the raster data on the captured image of the terminal for display, the processor is configured to: filter the raster data according to the position parameter of the terminal and/or the field-of-view parameter of the camera of the terminal.
  • When filtering the raster data according to the position parameter of the terminal and/or the field-of-view parameter of the camera of the terminal, the processor is configured to: acquire a first position parameter of the terminal and a first field-of-view parameter of the camera of the terminal, establish a first viewing frustum model based on the first position parameter and the first field-of-view parameter, and filter the raster data according to the first viewing frustum model.
  • When filtering the raster data according to the first viewing frustum model, the processor is configured to: search, based on a preset frustum culling algorithm, for the raster data inside the first viewing frustum model to obtain the filtered raster data.
  • When implementing the AR-based wireless network simulation method, the processor is configured to: when the rotation angle of the terminal is greater than or equal to a preset angle, and/or when the distance moved by the terminal is greater than or equal to a preset distance, acquire a second position parameter of the terminal and a second field-of-view parameter of the camera, establish a second viewing frustum model based on the second position parameter and the second field-of-view parameter, and filter the raster data according to the second viewing frustum model.
  • When implementing the AR-based wireless network simulation method, the processor is configured to: screen out, based on the preset frustum culling algorithm, the raster data overlapping both the first viewing frustum model and the second viewing frustum model to obtain first raster data; screen out, based on the preset frustum culling algorithm, the raster data outside the overlapping range of the first and second viewing frustum models but inside the second viewing frustum model to obtain second raster data; and obtain the filtered raster data from the first raster data and the second raster data.
  • When implementing the AR-based wireless network simulation method, the processor is configured such that the grid data includes one or more of: longitude and latitude, altitude, network strength, and grid size.
  • When superimposing the grid data on the captured image of the terminal for display, the processor is configured to: render each grid in the screened grid data as a color block on the display screen of the terminal, and superimpose the color block on the captured image of the terminal for display.
  • When rendering each grid in the screened grid data into a color block on the display screen of the terminal, the processor is configured to: convert the physical coordinates composed of the longitude, latitude, and altitude corresponding to each grid in the screened grid data into screen coordinates; convert the grid size corresponding to each grid into a screen pixel size; and convert the network strengths into colors of different gradients.
  • Before the step of obtaining the simulation result, the processor is configured to: send a preset simulation radius and simulation parameters to the server, so that the server performs simulation calculation, based on the simulation parameters, on the base stations within the preset simulation radius to obtain the simulation result.
  • An embodiment of the present disclosure also provides a storage medium for computer-readable storage, the storage medium storing one or more programs, the one or more programs being executable by one or more processors to implement the steps of any AR-based wireless network simulation method provided in the embodiments of the present disclosure.
  • the storage medium may be an internal storage unit of the terminal described in the foregoing embodiments, such as a hard disk or a memory of the terminal.
  • the storage medium may also be an external storage device of the terminal, such as a plug-in hard disk equipped on the terminal, a smart memory card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, a flash memory card (Flash Card) etc.
  • Embodiments of the present disclosure provide an AR-based wireless network simulation method, system, terminal, and storage medium.
  • the display of simulation results based on AR technology greatly improves the user experience.
  • The frustum culling algorithm is used to screen the simulation results based on the terminal's pose, which reduces the performance pressure on the terminal.
  • When the terminal pose changes, the frustum overlapping algorithm screens the simulation results again, which greatly reduces the amount of calculation for screening data.
  • the simulation configuration is performed on the terminal, and the simulation calculation is performed on the server side, which greatly improves the simulation speed.
  • The functional modules/units in the systems and devices described above can be implemented as software, firmware, hardware, or an appropriate combination thereof.
  • the division between functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be executed by several physical components in cooperation.
  • Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, or as hardware, or as an integrated circuit, such as an application-specific integrated circuit.
  • Such software may be distributed on computer readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media).
  • computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for the storage of information, such as computer-readable instructions, data structures, program modules, or other data.
  • Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer.
  • communication media typically embodies computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any information delivery media.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Mobile Radio Communication Systems (AREA)
  • Computer And Data Communications (AREA)

Abstract

Embodiments of the present disclosure provide an AR-based wireless network simulation method, system, terminal and storage medium, belonging to the technical field of computers and communications. The method is applied to a terminal and includes: obtaining a simulation result, and analyzing the simulation result to obtain raster data; and superimposing the raster data on a captured image of the terminal for display.

Description

AR-based wireless network simulation method, system, terminal and storage medium
Cross-reference to related applications
The present disclosure claims priority to Chinese patent application CN202111677646.X, entitled "AR-based wireless network simulation method, system, terminal and storage medium" and filed on December 31, 2021, the entire contents of which are incorporated into the present disclosure by reference.
Technical field
The present disclosure relates to the technical field of computer communication, and in particular to an AR-based wireless network simulation method, system, terminal and storage medium.
Background
Simulation technology is very important in the design of modern communication networks. At present, in the process of wireless network planning, a simulation system is usually used to simulate the overall performance of the wireless network, and the layout of the wireless network is planned according to the simulation results.
Summary
The main purpose of the embodiments of the present disclosure is to provide an AR-based wireless network simulation method, system, terminal and storage medium.
In a first aspect, an embodiment of the present disclosure provides an AR-based wireless network simulation method, including: obtaining a simulation result, and analyzing the simulation result to obtain raster data; and superimposing the raster data on a captured image of the terminal for display.
In a second aspect, an embodiment of the present disclosure further provides a terminal, the terminal including a processor, a memory, a computer program stored on the memory and executable by the processor, and a data bus for connecting and communicating between the processor and the memory, wherein when the computer program is executed by the processor, the steps of any AR-based wireless network simulation method provided in the present disclosure are implemented.
In a third aspect, an embodiment of the present disclosure further provides an AR-based wireless network simulation system, including: a terminal, configured to send a preset simulation radius and simulation parameters to a server; and the server, configured to perform simulation calculation, based on the simulation parameters, on the base stations within the preset simulation radius to obtain a simulation result, and to feed the simulation result back to the terminal; wherein the terminal is further configured to perform, based on the received simulation result, the steps of any AR-based wireless network simulation method provided in the present disclosure.
In a fourth aspect, an embodiment of the present disclosure further provides a storage medium for computer-readable storage, the storage medium storing one or more programs, the one or more programs being executable by one or more processors to implement the steps of any AR-based wireless network simulation method provided in the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
To describe the technical solutions of the embodiments of the present disclosure more clearly, the accompanying drawings required in the description of the embodiments are briefly introduced below. Apparently, the drawings described below show some embodiments of the present disclosure, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of an AR-based wireless network simulation method according to an embodiment of the present disclosure;
Fig. 2 is a schematic flowchart of the sub-steps of the AR-based wireless network simulation method in Fig. 1;
Fig. 3 is a schematic plan view of filtering data based on a view frustum model according to an embodiment of the present disclosure;
Fig. 4 is a schematic perspective view of filtering data based on a view frustum model according to an embodiment of the present disclosure;
Fig. 5 is a schematic diagram of filtering data based on a frustum overlap algorithm according to an embodiment of the present disclosure;
Fig. 6 is a schematic diagram of converting grid data into on-screen color blocks according to an embodiment of the present disclosure; and
Fig. 7 is a schematic structural block diagram of an AR-based wireless network simulation system according to an embodiment of the present disclosure; and
Fig. 8 is a schematic structural block diagram of a terminal according to an embodiment of the present disclosure.
DETAILED DESCRIPTION
The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings. Apparently, the described embodiments are some, rather than all, of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present disclosure without creative effort shall fall within the protection scope of the present disclosure.
The flowcharts shown in the drawings are merely illustrative; they need not include all the content and operations/steps, nor be executed in the order described. For example, some operations/steps may be decomposed, combined, or partially merged, so the actual execution order may change according to the actual situation.
It should be understood that the terms used in this specification are for the purpose of describing particular embodiments only and are not intended to limit the present disclosure. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Wireless network simulation helps deploy wireless networks precisely, reduces network construction costs, and enables high-quality solutions for customers. Related wireless network simulation schemes are all large software packages running on desktop computers, and their simulation results can only be viewed on an electronic map, which is not intuitive.
Embodiments of the present disclosure provide an AR-based wireless network simulation method, system, terminal, and storage medium. The AR-based wireless network simulation method may be applied to a mobile terminal, which may be an electronic device such as a mobile phone, a tablet computer, a laptop computer, a desktop computer, a personal digital assistant, or a wearable device.
Some embodiments of the present disclosure are described in detail below with reference to the drawings. Where no conflict arises, the following embodiments and the features in the embodiments may be combined with one another.
Referring to Fig. 1, Fig. 1 is a schematic flowchart of an AR-based wireless network simulation method according to an embodiment of the present disclosure.
As shown in Fig. 1, the AR-based wireless network simulation method includes steps S101 to S102.
Step S101: obtain a simulation result, and parse the simulation result to obtain grid data.
First, before the terminal obtains the simulation result, a preset simulation radius and simulation parameters need to be sent to a server, and the server performs simulation calculation, based on the simulation parameters, for the base stations within the preset simulation radius to obtain the simulation result.
The simulation calculation of the present disclosure adopts a front-end/back-end combined scheme: the terminal performs the simulation configuration, the server performs the simulation calculation, and the terminal and the server communicate over a wireless network.
The terminal obtains the longitude, latitude, and altitude of its current position via GPS, sets the simulation radius and other simulation parameters, and transmits the simulation parameters to the server through a wireless communication unit. If the simulation parameters are invalid, the server prompts the terminal to correct them; if they are valid, the server finds the base stations within the simulation radius and performs simulation calculation in combination with the high-precision map of the corresponding area to obtain the simulation result.
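The exchange just described, in which the terminal submits a simulation radius plus parameters and the server either rejects them with a correction prompt or proceeds to simulate, can be sketched as a minimal parameter check. The JSON shape, field names, and validity ranges below are illustrative assumptions, not the actual protocol of the disclosure:

```python
import json

def validate_params(params):
    """Reject obviously invalid simulation parameters (hypothetical checks)."""
    required = {"lon", "lat", "alt", "radius_m"}
    missing = required - params.keys()
    if missing:
        return False, f"missing parameters: {sorted(missing)}"
    if not (-180.0 <= params["lon"] <= 180.0 and -90.0 <= params["lat"] <= 90.0):
        return False, "longitude/latitude out of range"
    if params["radius_m"] <= 0:
        return False, "simulation radius must be positive"
    return True, "ok"

# The terminal would serialize its GPS fix plus the chosen radius and send it:
request = json.dumps({"lon": 116.40, "lat": 39.91, "alt": 52.0, "radius_m": 500})
ok, msg = validate_params(json.loads(request))
```

On the server side, a failed check would be returned to the terminal as the correction prompt; only a passing check triggers the base-station lookup and simulation.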
To meet the needs of AR-based display, the simulation result is output to text in the form of grid data, where each grid record contains information such as longitude, latitude, altitude, network strength, and grid size.
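A grid record of this shape can be parsed from the text output roughly as follows. The disclosure specifies only the fields, not their encoding, so the one-record-per-line, comma-separated layout here is a hypothetical serialization:

```python
from dataclasses import dataclass

@dataclass
class GridCell:
    lon: float       # longitude, degrees
    lat: float       # latitude, degrees
    alt: float       # altitude, meters
    strength: float  # network strength, e.g. RSRP in dBm (assumed unit)
    size: float      # grid edge length, meters

def parse_grid_text(text):
    """Parse one grid record per line: lon,lat,alt,strength,size."""
    cells = []
    for line in text.strip().splitlines():
        lon, lat, alt, strength, size = map(float, line.split(","))
        cells.append(GridCell(lon, lat, alt, strength, size))
    return cells

cells = parse_grid_text("116.40,39.91,52.0,-85.0,5.0\n116.41,39.91,52.0,-97.5,5.0")
```

After parsing, the terminal would store these records in its local database before filtering and display.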
Step S102: overlay the grid data onto an image captured by the terminal for display.
After receiving the grid data of the simulation result, the terminal parses the grid data and invokes the AR function to overlay the grid data onto the image captured by the terminal for display.
In an embodiment, referring to Fig. 2, step S102 includes sub-steps S1021 to S1022.
Sub-step S1021: filter the grid data to obtain filtered grid data.
Before the grid data is overlaid onto the image captured by the terminal for display, the grid data may further be filtered according to position parameters of the terminal and/or field-of-view parameters of the terminal's camera.
Specifically, first position parameters of the terminal and first field-of-view parameters of the terminal's camera are obtained, and a first view frustum model is built based on the first position parameters and the first field-of-view parameters; the grid data is then filtered according to the first view frustum model.
The first position parameters of the terminal include: the longitude and latitude of the terminal, the altitude of the terminal, the heading of the terminal, and the tilt angle of the terminal.
In some embodiments, the terminal obtains the longitude and latitude from GPS data, the altitude from barometer data, the heading from magnetometer data, and the tilt angle from gyroscope data.
The first field-of-view parameters of the terminal's camera include the camera's field of view. In some embodiments, the terminal reads a configuration file to obtain the camera's field of view FOV.
As shown in Figs. 3 and 4, a view frustum model with horizontal and vertical fields of view of FOV + 2×A, a near-plane distance of 0, and a far-plane distance of L is built based on the first position parameters and is denoted as the first view frustum model (frustum1). When filtering the grid data according to the first view frustum model, a preset frustum culling algorithm may be used to find the grid data inside the first view frustum model, yielding the filtered grid data.
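As a sketch of this culling step, the horizontal cut of such a frustum (near plane at 0, far plane at L, half-angle (FOV + 2×A)/2 around the camera bearing, as in Fig. 3) reduces to an angle-and-distance test; a full 3D test against the six frustum planes works analogously. Coordinates are assumed here to be local meters east/north of the camera, which is not specified by the disclosure:

```python
import math

def in_frustum(cell_xy, cam_xy, bearing_deg, fov_deg, a_deg, far_m):
    """2D frustum test: within the widened FOV and inside the far plane?"""
    dx = cell_xy[0] - cam_xy[0]          # east offset, meters
    dy = cell_xy[1] - cam_xy[1]          # north offset, meters
    dist = math.hypot(dx, dy)
    if dist > far_m:                     # beyond the far plane L
        return False
    half_angle = (fov_deg + 2 * a_deg) / 2.0          # widened FOV: FOV + 2*A
    angle = math.degrees(math.atan2(dx, dy))          # bearing of the point, 0 deg = north
    diff = (angle - bearing_deg + 180.0) % 360.0 - 180.0  # signed angular difference
    return abs(diff) <= half_angle

# Camera at origin facing north, FOV 60 deg, A = 10 deg, L = 200 m:
visible = [c for c in [(0.0, 100.0), (0.0, 300.0), (100.0, 0.0)]
           if in_frustum(c, (0.0, 0.0), 0.0, 60.0, 10.0, 200.0)]
```

The widened half-angle (FOV + 2×A)/2 keeps a margin of grid cells just outside the visible cone, which is what makes the later overlap reuse worthwhile when the camera turns by less than A/2.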
When the rotation angle of the terminal is greater than or equal to a preset angle, and/or when the displacement of the terminal is greater than or equal to a preset distance, second position parameters of the terminal and second field-of-view parameters of the camera are obtained, and a second view frustum model is built based on the second position parameters and the second field-of-view parameters; the grid data is then filtered according to the second view frustum model.
Fig. 5 is a schematic diagram of the overlapping part of the first view frustum model and the second view frustum model.
Illustratively, the rotation angle of the camera and the position of the terminal are monitored. When the rotation angle of the camera is greater than or equal to A/2, or the displacement of the terminal exceeds D, a new view frustum model, denoted as the second view frustum model (frustum2), is generated based on the current position and camera pose, and the grid data can then be filtered according to the second view frustum model to obtain the filtered grid data. Here, A is the preset angle and D is the preset distance; A and D can be set according to the scene, and the present disclosure does not limit them.
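The rebuild trigger described here, regenerating the frustum only when the camera has turned at least A/2 or the terminal has moved at least D, might be tracked as below. The class name and the flat east/north position representation are illustrative choices:

```python
import math

class PoseMonitor:
    """Tracks the pose used for the last frustum and decides when to rebuild.

    A (degrees) and D (meters) are scene-dependent presets, per the disclosure.
    """
    def __init__(self, a_deg, d_m):
        self.a_deg = a_deg
        self.d_m = d_m
        self.last_bearing = None
        self.last_pos = None

    def update(self, bearing_deg, pos_xy):
        """Return True when a new frustum should be built for this pose."""
        if self.last_bearing is None:
            self.last_bearing, self.last_pos = bearing_deg, pos_xy
            return True  # the very first pose always builds a frustum
        rot = abs((bearing_deg - self.last_bearing + 180.0) % 360.0 - 180.0)
        move = math.hypot(pos_xy[0] - self.last_pos[0], pos_xy[1] - self.last_pos[1])
        if rot >= self.a_deg / 2 or move >= self.d_m:
            self.last_bearing, self.last_pos = bearing_deg, pos_xy
            return True
        return False
```

Note that the stored reference pose is only advanced when a rebuild fires, so small movements do not accumulate into silent drift away from the last frustum.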
In some embodiments, to reduce the amount of computation spent on filtering, a frustum overlap algorithm may be used to filter the simulation result again. First, the preset frustum culling algorithm filters out the grid data where the first view frustum model and the second view frustum model overlap, yielding first grid data; then the preset frustum culling algorithm filters out the grid data that is outside the overlap of the first and second view frustum models but inside the second view frustum model, yielding second grid data; the filtered grid data is obtained from the first grid data and the second grid data.
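Treating each grid cell as an identifier, the overlap bookkeeping between two frustum passes amounts to three set operations. This is a sketch assuming cells carry hashable ids; any unique key per cell would do:

```python
def diff_visible_sets(prev_visible_ids, new_visible_ids):
    """Classify grid cells between two frustum passes.

    keep: overlap of frustum1 and frustum2 -> reuse existing color blocks
    drop: only in frustum1               -> delete blocks from the AR scene
    add:  only in frustum2               -> render new blocks
    """
    keep = prev_visible_ids & new_visible_ids
    drop = prev_visible_ids - new_visible_ids
    add = new_visible_ids - prev_visible_ids
    return keep, drop, add

keep, drop, add = diff_visible_sets({1, 2, 3}, {2, 3, 4})
```

Only the `add` set needs fresh coordinate conversion and rendering, which is where the computational saving over re-filtering everything comes from.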
Sub-step S1022: overlay the filtered grid data onto the image captured by the terminal for display.
After the filtered grid data is obtained, each grid in the filtered grid data is rendered as a color block on the display screen of the terminal, and the color blocks are overlaid onto the image captured by the terminal for display.
As shown in Fig. 6, Fig. 6 is a schematic diagram of converting grid data into an on-screen image according to an embodiment of the present disclosure.
The physical coordinates formed by the longitude, latitude, and altitude of each grid in the filtered grid data are converted into screen coordinates; the grid size of each grid is converted into an on-screen pixel size; and the network strength is converted into colors of different gradients.
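Two of these conversions, grid size to pixel size and strength to a color gradient, could be sketched as below (the full geo-to-screen projection additionally needs the camera intrinsics and pose, which an AR framework would supply). The pinhole focal length and the RSRP range of -110 to -70 dBm are illustrative assumptions; the disclosure does not fix these values:

```python
def strength_to_color(strength_dbm, lo=-110.0, hi=-70.0):
    """Map network strength to a red-to-green gradient over [lo, hi]."""
    t = max(0.0, min(1.0, (strength_dbm - lo) / (hi - lo)))  # clamp to [0, 1]
    return (int(255 * (1 - t)), int(255 * t), 0)             # (R, G, B)

def meters_to_pixels(grid_size_m, dist_m, focal_px):
    """Pinhole projection: on-screen size of a grid edge at distance dist_m."""
    return max(1, int(focal_px * grid_size_m / dist_m))
```

A 5 m grid 100 m away with a 1000 px focal length thus covers about 50 px, and a cell at -70 dBm or better renders fully green.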
When the terminal moves or the camera angle changes, a new view frustum model has to be built from the current position and camera pose and the grid data re-filtered. To improve the efficiency of the AR display, the overlap between the previous view frustum model and the current one can be used for filtering: the grid data of the overlapping part and its corresponding color blocks are retained; the data outside the overlap of the first view frustum and the corresponding color blocks in AR are deleted; and the grid data of the part of the second view frustum model that does not overlap the first is added and processed into color blocks in AR. In this way, when the terminal moves or the camera angle changes, some of the color blocks in the overlap of the first and second view frustum models can be reused, improving the efficiency of the AR display.
It should be noted that the terminal in the embodiments of the present disclosure may or may not have AR capability itself. When the terminal has AR capability, the simulation result is displayed through the terminal's own AR capability. When the terminal does not, it connects to and invokes another AR-capable device to display the simulation result. Other AR-capable devices include, but are not limited to, other AR-capable smartphones, AR glasses, AR helmets, and the like.
The AR-based wireless network simulation method provided by the above embodiments displays the simulation result using AR technology, greatly improving the user experience. A frustum culling algorithm filters the simulation result based on the terminal's pose, relieving performance pressure on the terminal; when the terminal's pose changes, a frustum overlap algorithm filters the simulation result again, greatly reducing the amount of filtering computation. Further, by adopting a front-end/back-end combined scheme in which the terminal performs the simulation configuration and the server performs the simulation calculation, the simulation speed is greatly improved.
To better illustrate the AR-based wireless network simulation method of the present disclosure, the following examples are provided in combination with application scenarios.
Example 1
In this example, the terminal is an iOS- or Android-based smartphone that has AR capability and a camera for AR display. The server side is a server with simulation computing capability that communicates with the smartphone over a wireless network.
Step 1: obtain the longitude, latitude, and altitude of the current position through the phone's GPS, and set the simulation radius and other simulation parameters.
Step 2: the phone transmits the simulation parameters to the server. If the simulation parameters are invalid, the user is prompted to correct them; if they are valid, the server finds the base stations within the simulation range and performs simulation calculation in combination with the high-precision map of the corresponding area.
Step 3: when the simulation calculation finishes, the server sends the simulation result to the client as text; the text content is grid data, and each grid contains information such as longitude, latitude, altitude, network strength, and grid size.
Step 4: the phone downloads the simulation result, updating the download progress in the user interface in real time; after the download completes, it parses the simulation result and stores it in a database.
Step 5: the phone obtains the longitude and latitude from GPS data, the altitude from barometer data, the heading from magnetometer data, and the tilt angle from gyroscope data, and reads the configuration file to obtain the camera's field of view FOV. Based on the position parameters and the field-of-view parameters, a view frustum model frustum1 with horizontal and vertical fields of view of FOV + 2×A, a near-plane distance of 0, and a far-plane distance of L is built, and the frustum culling algorithm finds the grid data inside this model. Here, A is the preset angle and L is the preset distance; illustratively, L may be 100 or 200 meters, determined by how far the simulation result can be seen clearly through the camera lens. The values of A and L are determined by the actual application, and the present disclosure does not limit them.
Step 6: the phone invokes the AR function to display the filtered grid data: the physical-world coordinates formed by each grid's longitude, latitude, and altitude are converted into screen coordinates, the grid size is converted into an on-screen pixel size, and the network strength is converted into colors of different gradients, so that each grid is rendered as a color block on the screen; the color blocks are overlaid onto the live image captured by the phone's camera for AR display.
Step 7: the phone monitors the rotation angle of the camera and the position of the phone. When the rotation angle of the camera is greater than or equal to A/2, or the displacement of the phone exceeds D, a new view frustum model frustum2 is generated based on the current position and camera pose; the overlap of frustum1 and frustum2 is computed, the grid data of the overlapping part is retained, the data outside the overlap in frustum1 and the corresponding color blocks in AR are deleted, the data outside the overlap in frustum2 is added, and the corresponding data is processed into color blocks in AR.
Example 2
In this example, the terminal is an iOS- or Android-based smartphone without AR capability; the smartphone is connected to an AR helmet and can invoke the AR helmet for AR display. The server side is a server with simulation computing capability that communicates with the smartphone over a wireless network.
Step 1: obtain the longitude, latitude, and altitude of the current position through the phone's GPS, and set the simulation radius and other simulation parameters.
Step 2: the phone transmits the simulation parameters to the server. If the simulation parameters are invalid, the user is prompted to correct them; if they are valid, the server finds the base stations within the simulation range and performs simulation calculation in combination with the high-precision map of the corresponding area.
Step 3: when the simulation calculation finishes, the server sends the simulation result to the client as text; the text content is grid data, and each grid contains information such as longitude, latitude, altitude, network strength, and grid size.
Step 4: the phone downloads the simulation result, updating the download progress in the user interface in real time; after the download completes, it parses the simulation result and stores it in a database.
Step 5: the phone obtains the longitude and latitude from GPS data, the altitude from barometer data, the heading from magnetometer data, and the tilt angle from gyroscope data, and obtains the field of view FOV of the AR helmet's camera. Based on the position parameters and the field-of-view parameters, a view frustum model frustum1 with horizontal and vertical fields of view of FOV + 2×A, a near-plane distance of 0, and a far-plane distance of L is built, and the frustum culling algorithm finds the grid data inside this model.
Step 6: the phone invokes the AR function to display the filtered grid data: the physical-world coordinates formed by each grid's longitude, latitude, and altitude are converted into screen coordinates, the grid size is converted into an on-screen pixel size, and the network strength is converted into colors of different gradients, so that each grid is rendered as a color block on the screen; the color blocks are overlaid onto the live image captured by the AR helmet's camera for AR display.
Step 7: the phone monitors the rotation angle of the AR helmet's camera and the position of the phone. When the rotation angle of the camera is greater than or equal to A/2, or the displacement of the phone exceeds D, a new view frustum model frustum2 is generated based on the current position of the phone and the camera pose of the AR helmet; the overlap of frustum1 and frustum2 is computed, the grid data of the overlapping part is retained, the data outside the overlap in frustum1 and the corresponding color blocks in AR are deleted, the data outside the overlap in frustum2 is added, and the corresponding data is processed into color blocks in AR.
Fig. 7 is a schematic structural block diagram of an AR-based wireless network simulation system according to an embodiment of the present disclosure.
The system 300 includes a terminal 301 and a server 302.
The terminal 301 is configured to send a preset simulation radius and simulation parameters to the server.
The server 302 is configured to perform simulation calculation, based on the simulation parameters, for the base stations within the preset simulation radius to obtain a simulation result, and to feed the simulation result back to the terminal.
The terminal 301 is further configured to perform, based on the received simulation result, the steps of any AR-based wireless network simulation method provided by the embodiments of the present disclosure.
Further, the AR-based wireless network simulation system 300 also includes a wireless communication unit 303, through which the terminal 301 and the server 302 communicate.
Further, the terminal 301 may be divided into a simulation result parsing unit 3011, a simulation result filtering unit 3012, and an AR display unit 3013, where the simulation result parsing unit 3011 is configured to parse the grid data out of the text and store it in a database; the simulation result filtering unit 3012 is configured to filter out the grid data that needs to be displayed; and the AR display unit 3013 is configured to display the simulation result based on AR.
Referring to Fig. 8, Fig. 8 is a schematic structural block diagram of a terminal according to an embodiment of the present disclosure.
As shown in Fig. 8, the terminal 400 includes a processor 401 and a memory 402, connected by a bus 403, for example an I2C (Inter-Integrated Circuit) bus.
The processor 401 provides computing and control capability and supports the operation of the entire terminal device. The processor 401 may be a central processing unit (CPU) or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor or any conventional processor.
The memory 402 may be a flash chip, a read-only memory (ROM) disk, an optical disk, a USB flash drive, a removable hard disk, or the like.
A person skilled in the art can understand that the structure shown in Fig. 8 is merely a block diagram of the part of the structure relevant to the solution of the embodiments of the present disclosure and does not limit the terminal device to which the solution is applied; the terminal device may include more or fewer components than shown, or combine certain components, or have a different arrangement of components.
The processor is configured to run a computer program stored in the memory and, when executing the computer program, to implement any of the AR-based wireless network simulation methods provided by the embodiments of the present disclosure.
In an embodiment, the processor is configured to run a computer program stored in the memory and, when executing the computer program, to implement the following steps: obtain a simulation result and parse the simulation result to obtain grid data; and overlay the grid data onto an image captured by the terminal for display.
In an embodiment, before overlaying the grid data onto the image captured by the terminal for display, the processor is configured to: filter the grid data according to position parameters of the terminal and/or field-of-view parameters of the terminal's camera.
In an embodiment, when filtering the grid data according to the position parameters of the terminal and/or the field-of-view parameters of the terminal's camera, the processor is configured to: obtain first position parameters of the terminal and first field-of-view parameters of the terminal's camera, and build a first view frustum model based on the first position parameters and the first field-of-view parameters; and filter the grid data according to the first view frustum model.
In an embodiment, when filtering the grid data according to the first view frustum model, the processor is configured to: filter the grid data inside the first view frustum model based on a preset frustum culling algorithm to obtain filtered grid data.
In an embodiment, when implementing the AR-based wireless network simulation method, the processor is configured to: when the rotation angle of the terminal is greater than or equal to a preset angle, and/or when the displacement of the terminal is greater than or equal to a preset distance, obtain second position parameters of the terminal and second field-of-view parameters of the camera, and build a second view frustum model based on the second position parameters and the second field-of-view parameters; and filter the grid data according to the second view frustum model.
In an embodiment, when implementing the AR-based wireless network simulation method, the processor is configured to: filter out, based on the preset frustum culling algorithm, the grid data where the first view frustum model and the second view frustum model overlap, to obtain first grid data; filter out, based on the preset frustum culling algorithm, the grid data outside the overlap of the first and second view frustum models and inside the second view frustum model, to obtain second grid data; and obtain the filtered grid data from the first grid data and the second grid data.
In an embodiment, when implementing the AR-based wireless network simulation method, the grid data includes one or more of: longitude and latitude, altitude, network strength, and grid size.
In an embodiment, when overlaying the grid data onto the image captured by the terminal for display, the processor is configured to: render each grid in the filtered grid data as a color block on the display screen of the terminal, and overlay the color blocks onto the image captured by the terminal for display.
In an embodiment, when rendering each grid in the filtered grid data as a color block on the display screen of the terminal, the processor is configured to: convert the physical coordinates formed by the longitude, latitude, and altitude corresponding to each grid in the filtered grid data into screen coordinates; convert the grid size corresponding to each grid into an on-screen pixel size; and convert the network strength into colors of different gradients.
In an embodiment, before the step of obtaining the simulation result, the processor is configured to: send a preset simulation radius and simulation parameters to a server, so that the server performs simulation calculation, based on the simulation parameters, for the base stations within the preset simulation radius to obtain the simulation result.
It should be noted that a person skilled in the art can clearly understand that, for convenience and brevity of description, for the working process of the terminal described above, reference may be made to the corresponding processes in the foregoing embodiments of the AR-based wireless network simulation method, which are not repeated here.
An embodiment of the present disclosure further provides a storage medium for computer-readable storage, the storage medium storing one or more programs, the one or more programs being executable by one or more processors to implement the steps of any AR-based wireless network simulation method provided in the embodiments of this specification.
The storage medium may be an internal storage unit of the terminal of the foregoing embodiments, for example a hard disk or memory of the terminal. The storage medium may also be an external storage device of the terminal, for example a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card equipped on the terminal.
Embodiments of the present disclosure provide an AR-based wireless network simulation method, system, terminal, and storage medium. The simulation result is displayed using AR technology, greatly improving the user experience. A frustum culling algorithm filters the simulation result based on the terminal's pose, relieving performance pressure on the terminal; when the terminal's pose changes, a frustum overlap algorithm filters the simulation result again, greatly reducing the amount of filtering computation. Further, by adopting a front-end/back-end combined scheme in which the terminal performs the simulation configuration and the server performs the simulation calculation, the simulation speed is greatly improved.
A person of ordinary skill in the art can understand that all or some of the steps of the methods disclosed above, and the functional modules/units in the systems and apparatuses, may be implemented as software, firmware, hardware, or an appropriate combination thereof. In a hardware embodiment, the division between the functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be executed cooperatively by several physical components. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, a digital signal processor, or a microprocessor, or as hardware, or as an integrated circuit such as an application-specific integrated circuit. Such software may be distributed on computer-readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). As is well known to a person of ordinary skill in the art, the term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for the storage of information, such as computer-readable instructions, data structures, program modules, or other data. Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer. In addition, as is well known to a person of ordinary skill in the art, communication media typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any information delivery media.
It should be understood that the term "and/or" used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes these combinations. It should be noted that, herein, the terms "comprise", "include", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or system that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or system. Without further limitation, an element defined by the phrase "including a ..." does not exclude the presence of additional identical elements in the process, method, article, or system that includes the element.
The serial numbers of the above embodiments of the present disclosure are for description only and do not represent the superiority or inferiority of the embodiments. The above are merely specific embodiments of the present disclosure, but the protection scope of the present disclosure is not limited thereto. Any person skilled in the art can easily conceive of various equivalent modifications or substitutions within the technical scope disclosed by the present disclosure, and these modifications or substitutions shall all fall within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (13)

  1. An AR-based wireless network simulation method, applied to a terminal, comprising:
    obtaining a simulation result, and parsing the simulation result to obtain grid data; and
    overlaying the grid data onto an image captured by the terminal for display.
  2. The AR-based wireless network simulation method according to claim 1, wherein before overlaying the grid data onto the image captured by the terminal for display, the method further comprises:
    filtering the grid data according to position parameters of the terminal and/or field-of-view parameters of a camera of the terminal.
  3. The AR-based wireless network simulation method according to claim 2, wherein the filtering the grid data according to the position parameters of the terminal and/or the field-of-view parameters of the camera of the terminal comprises:
    obtaining first position parameters of the terminal and first field-of-view parameters of the camera of the terminal, and building a first view frustum model based on the first position parameters and the first field-of-view parameters; and
    filtering the grid data according to the first view frustum model.
  4. The AR-based wireless network simulation method according to claim 3, wherein the filtering the grid data according to the first view frustum model comprises:
    filtering the grid data inside the first view frustum model based on a preset frustum culling algorithm, to obtain filtered grid data.
  5. The AR-based wireless network simulation method according to claim 3, further comprising:
    in a case where a rotation angle of the terminal is greater than or equal to a preset angle, and/or in a case where a displacement of the terminal is greater than or equal to a preset distance, obtaining second position parameters of the terminal and second field-of-view parameters of the camera, and building a second view frustum model based on the second position parameters and the second field-of-view parameters; and
    filtering the grid data according to the second view frustum model.
  6. The AR-based wireless network simulation method according to claim 5, further comprising:
    filtering out, based on the preset frustum culling algorithm, the grid data where the first view frustum model and the second view frustum model overlap, to obtain first grid data;
    filtering out, based on the preset frustum culling algorithm, the grid data outside the overlap of the first view frustum model and the second view frustum model and inside the second view frustum model, to obtain second grid data; and
    obtaining the filtered grid data from the first grid data and the second grid data.
  7. The AR-based wireless network simulation method according to any one of claims 2-6, wherein the grid data comprises one or more of: longitude and latitude, altitude, network strength, and grid size.
  8. The AR-based wireless network simulation method according to claim 7, wherein overlaying the grid data onto the image captured by the terminal for display comprises:
    rendering each grid in the filtered grid data as a color block on a display screen of the terminal, and overlaying the color blocks onto the image captured by the terminal for display.
  9. The AR-based wireless network simulation method according to claim 8, wherein the rendering each grid in the filtered grid data as a color block on the display screen of the terminal comprises:
    converting the physical coordinates formed by the longitude, latitude, and altitude corresponding to each grid in the filtered grid data into screen coordinates;
    converting the grid size corresponding to each grid into an on-screen pixel size; and
    converting the network strength into colors of different gradients.
  10. The AR-based wireless network simulation method according to claim 1, wherein before the step of obtaining the simulation result, the method further comprises:
    sending a preset simulation radius and simulation parameters to a server, so that the server performs simulation calculation, based on the simulation parameters, for base stations within the preset simulation radius to obtain the simulation result.
  11. A terminal, comprising a processor, a memory, a computer program stored on the memory and executable by the processor, and a data bus for implementing connection and communication between the processor and the memory, wherein the computer program, when executed by the processor, implements the steps of the wireless network simulation method according to any one of claims 1 to 10.
  12. An AR-based wireless network simulation system, comprising:
    a terminal configured to send a preset simulation radius and simulation parameters to the server; and
    a server configured to perform simulation calculation, based on the simulation parameters, for base stations within the preset simulation radius to obtain a simulation result, and to feed the simulation result back to the terminal;
    wherein the terminal is further configured to perform, based on the received simulation result, the steps of the AR-based wireless network simulation method according to any one of claims 1 to 10.
  13. A storage medium for computer-readable storage, the storage medium storing one or more programs, the one or more programs being executable by one or more processors to implement the steps of the AR-based wireless network simulation method according to any one of claims 1 to 10.
PCT/CN2022/124687 2021-12-31 2022-10-11 AR-based wireless network simulation method, system, terminal, and storage medium WO2023124376A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111677646.XA CN116419296A (zh) 2021-12-31 2021-12-31 AR-based wireless network simulation method, system, terminal, and storage medium
CN202111677646.X 2021-12-31

Publications (1)

Publication Number Publication Date
WO2023124376A1 true WO2023124376A1 (zh) 2023-07-06

Family

ID=86997466

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/124687 WO2023124376A1 (zh) 2021-12-31 2022-10-11 AR-based wireless network simulation method, system, terminal, and storage medium

Country Status (2)

Country Link
CN (1) CN116419296A (zh)
WO (1) WO2023124376A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013006731A1 (en) * 2011-07-05 2013-01-10 3-D Virtual Lens Technologies, Llc Three-dimensional image display using a dynamically variable grid
CN105163337A (zh) * 2015-08-21 2015-12-16 北京拓明科技有限公司 一种基于覆盖预测仿真的移动网络数据地理映射的方法
CN109743743A (zh) * 2018-11-28 2019-05-10 中通服建设有限公司 网络性能的栅格化监测方法、电子设备、存储介质及***
EP3767987A1 (de) * 2019-07-19 2021-01-20 Siemens Aktiengesellschaft Verfahren zum optimieren eines funk-feldes mittels simulation
CN113766518A (zh) * 2020-06-02 2021-12-07 ***通信集团设计院有限公司 基站天线选择和广播波束规划方法及装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
NING, FANGXIN ET AL.: "Rapid Updating Method of Local Texture in 3D Terrain Scene", GEOSPATIAL INFORMATION, no. 06, 28 December 2013 (2013-12-28), pages 87 - 89, XP009547385 *

Also Published As

Publication number Publication date
CN116419296A (zh) 2023-07-11

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22913640

Country of ref document: EP

Kind code of ref document: A1